‘Maybe it’s just better not to speak at all’: Ex-Google employee spills the beans on ‘echo chamber’

In the wake of the disastrous launch of Google’s demonstrably woke Gemini AI chatbot, a former Google software engineer warns that the “echo chamber problem” at the Big Tech behemoth has only been “amplified” by artificial intelligence.

As BizPac Review reported, when asked to generate images of America’s Founding Fathers, Gemini reimagined them as people of color, and its equivocating answers to questions about pedophilia left users furious.

Mike Wacker worked at Google from 2014 to 2019.

The company fired him, he believes, because he voiced his conservative political views. According to FOX Business, he “ran a Republican newsletter that became the target of several HR complaints during his time at the company.”

Google’s flaws, Wacker said, go a lot deeper than the algorithm.

“The AI is more humanlike,” he explained. “But, on the flip side, it is more susceptible to the biases of the humans who are training these systems.”


“The echo chamber problem has always been there for a long time, but the introduction of AI has really amplified it,” he said.

If your values don’t align with the progressive agenda, Wacker said, Google’s workplace environment is hostile and you risk being punished. The mindset, according to the engineer, bled into Google’s products.

“If you are a conservative or you’re Christian, you must very carefully measure your words,” he said. “If you went into the wrong topic, maybe it’s just better not to speak at all.”

“That,” he added, “definitely has an impact when they’re getting feedback internally — which voices do you hear, and which voices don’t you hear?”


In a statement following Gemini’s spectacular fail, Google pulled the plug on the service, admitting the AI image generator “missed the mark.” CEO Sundar Pichai sent a company-wide memo in which he called Gemini’s performance “unacceptable.”


But, according to Wacker, Google’s search team also manually manipulated the results of abortion-related search queries.

“There was a list of search query terms,” Wacker recalled. “An alternative algorithm would be used for those search terms. I found the exact change which added abortion and abortions to the special list.”

“This happened weeks after Pichai told Congress in sworn testimony, ‘We don’t manually intervene in any particular search result,'” he noted.

On X, Wacker continues to voice his concerns, calling out Google for its woefully woke agenda.


“Google employees are afraid to say the emperor has no clothes,” he wrote on Wednesday. “Gemini, Google’s new AI tool, said there was no definitive proof Hamas committed rape in Israel.”

“As an ex-Google engineer,” Wacker stated, “I know the Gemini disaster is a symptom of Google’s DEI orthodoxy.”

Melissa Fine
