The gaffes and biases of Google Gemini
What a week Google’s artificial intelligence tool Gemini has had. First, the Gemini image generator was shut down after it produced images of Nazi soldiers that were bafflingly, ahistorically diverse, as if black and Asian people had been part of the Wehrmacht. Gemini’s intent may have been admirable — to counteract the biases typical in large language models that rely on data sets and so can reproduce stereotypes — but its execution was dumb, even offensive.
And then its text-based counterpart outraged U.S. conservatives, many of whom accused it of treating Republican politicians and even right-leaning journalists more negatively than their Democratic counterparts. Peter J. Hasson — a Fox News editor who wrote a book in 2020 about Big Tech’s political bias — claimed that Gemini had even actively manipulated information, citing fake reviews and making up quotes, to denigrate his book.
And what of the rest of the world? Last week, for instance, when asked by a popular Indian writer and columnist if Narendra Modi was a fascist, Gemini responded that the Indian prime minister had “been accused of implementing policies that some experts have characterized as fascist.” This led another Indian editor to claim that Gemini was “not just woke” but “downright malicious” and to call on the government to respond, which Rajeev Chandrasekhar, a minister in the Modi government, duly did, warning Google that its AI tool had violated “several provisions of the Criminal Code.”
Google, perhaps fearing the wrath of the Indian government, said in a statement that Gemini might “not always be reliable, especially when it comes to responding to some prompts about current events, political topics, or evolving news.” Why did Google back down so quickly? Gemini’s answer to the question was reasonable and measured. Modi, after all, can be, and has been, described by some standards as an autocrat. Under his watch, the press is less free, the political opposition is often criminalized and religious minorities are suppressed.
Made aware of the howls of outrage emanating from Delhi, Gemini now bats away most questions about Modi. Ask it, as I did, if Modi has ever answered a question in a press conference in India since becoming prime minister, and it refuses to play ball. “I’m still learning how to answer this question,” it says, as if the answer weren’t readily available — he has not.
But Gemini is not consistent in its treatment of people or issues. It now sidesteps my question about whether Modi shows authoritarian tendencies with its customary disclaimer that it is “still learning.” Yet Gemini feels no compunction about claiming that Turkish President Recep Tayyip Erdoğan does exhibit “strong authoritarian tendencies” and even offers me “a breakdown of the reasons why.” While Modi and Erdoğan are different, as are the countries that they lead, there are plenty of similarities. Gemini doesn’t want to go there, though, having bowed to political pressure.
“AI doesn’t have a point of view, it doesn’t have a perspective, it doesn’t think,” says Christopher Wylie, a data consultant and writer who became known around the world as the whistleblower in the Facebook-Cambridge Analytica scandal in 2018 when the data of millions of users was harvested and used for political advertising. “It is what’s often called the stochastic parrot, providing an output based on statistical inference.”
This means the tech is only as good as the data it’s fed. “You can never create a neutral tool because there’s no such thing as a neutral data set on the nature of evil, say, or which political philosophy is more correct or less correct,” Wylie said. The problem, he added, “that a lot of these public-facing tools have is that people expect some sense of neutrality without realizing that there’s no such thing as neutrality in totally subjective questions and subject matter. You can’t have an objective truth on a subjective question.”
In 2020, more than 86% of donations from Alphabet, the parent company of Google, went to Democrats, compared to less than 7% to Republicans. Could conservatives in the U.S. be right then that Gemini betrays a Democratic bias? But, Wylie warned, bias extends beyond the concerns of parochial U.S. politics. “What we’ll start to see more of is American values and American political perspectives being integrated into these types of tools in ways that might not fit for other parts of the world. Are we creating tools that implicitly will be colonial?”
Can LLMs, in other words, resist their own training and pay heed to the world beyond the United States? Vast swathes of the globe are currently given short shrift in Gemini’s context-free and generally shallow answers. And in countries that represent strong commercial interests, such as India and China, the governments’ narratives are treated with outsize respect and caution. It’s as if the tool were made to spread disinformation and rewrite history.
GLOBAL NEWS
Speaking of the ubiquity and banality of AI, last week, a young man died in a skiing accident at the Stowe Mountain Resort in Vermont. It’s the kind of story that local news outlets report sensitively and effectively. But in our current world of click-farming “journalism,” thousands read about it on BNN Breaking, a site based in Hong Kong that likely used large language model technology to generate its stilted, yet oddly florid prose. It’s alarming that the priorities of Big Tech platforms mean such mediocre but persistent aggregation can result in the layoffs of hundreds of local journalists and the shuttering of local newsrooms that do the job better.
From the banality of AI to the banality of evil in Putin’s Russia, where absurd legalistic processes take place in arid courtrooms. Yesterday, Oleg Orlov, a prominent human rights campaigner and co-chair of the Nobel Prize-winning organization Memorial, was sentenced to two and a half years in prison. In December 2022, Orlov wrote an article that described Russia as a fascist state. Last year, he was fined for that “crime,” a verdict so lenient that prosecutors argued he be tried again. And so Orlov, 70, was hauled once more into court, a process he mocked by reading Franz Kafka’s “The Trial” as the lawyers made their arguments. In response to his sentencing, Orlov said Russia was “sinking ever more deeply into darkness.” Putin may be tightening his grip on power, but his fear of dissent has never been more stark.
If the criminalization of dissent in Russia is tragic, the parody of dissent offered by the likes of British member of parliament Lee Anderson is a farce. “When you think you are right,” Anderson said, after the Conservative Party suspended him, “you should never apologize because to do so would be a sign of weakness.” He was defending his right to link London Mayor Sadiq Khan to Islamists purely on the basis of his race and religion. “He’s actually given our capital city to his mates,” Anderson said on the hard-right channel GB News. But Anderson was only following the example set by his party. Earlier this month, the Conservative Party posted an edited video of Khan on X in which he said he was “proud to be both anti-racist and antisemitic.” Khan immediately clarified that he meant “tackling antisemitism.” Still, the Conservatives tweeted: “Sadiq Khan says the quiet part out loud.” No one apologized for passing blatant disinformation off as political commentary then, so why expect Anderson to do any differently?
WHAT WE’RE READING
- “Now that generative AI has dropped the cost of producing bullshit to near zero,” writes the neuroscientist and author Erik Hoel, “we see clearly the future of the internet: a garbage dump.” The depressing truth about AI is that it’s just a cheap way to generate clicks and eyeballs, the currency of the internet economy. Quality (and humans) be damned.
- Despite the pro-Ukraine position expressed by Italian Prime Minister Giorgia Meloni, her neo-fascist coalition partner Matteo Salvini — the deputy prime minister — remains a Putin acolyte. In the Financial Times, Amy Kazmin and Giuliana Ricozzi report on a fresh surge of Russian propaganda in Italy.