
Facebook’s fatal failures in Ethiopia, terrorism on YouTube and the not-so-new problem of toxic Twitter

Meareg Amare, an Ethiopian chemistry professor, was assassinated last year after people on Facebook called for his killing. War has raged in Africa’s second-most populous country for more than two years, taking the lives of over half a million people, and Facebook has been a key organizing space for opposing factions and paramilitary groups in the conflict. This fact is central to a landmark legal action against Meta, Facebook’s parent company, that was filed before Kenya’s High Court last week. A group of tech-savvy legal advocates from Kenya and the U.K. are representing the plaintiffs, who include Amare’s family, and will argue that the company systematically underserves users in the Global South by allowing hate speech and threats of violence to spread unchecked. The petition calls on Meta to start “demoting violent incitement” regardless of geography — and notes how quickly the company has responded when chaos ensues at home, as after the U.S. Capitol riots in January 2021. It also proposes that Meta pay steep fines whenever it fails to remove posts that could bring harm to people, a measure that has succeeded in some jurisdictions — see Germany’s NetzDG, a.k.a. “Lex Facebook.” This and similar legal actions (I’m thinking specifically of lawsuits concerning Facebook’s role in the genocide of Rohingya Muslims in Myanmar) may mark a new chapter in public efforts to hold Big Tech companies accountable for the real-world impacts of their products.

On a related note, here’s a tough question: Should YouTube be held responsible for recommending videos that “radicalize” terrorists? The U.S. Supreme Court is expected to hear arguments in Gonzalez v. Google early in 2023, alongside Twitter v. Taamneh, Twitter’s appeal of a case with similar contours. Reynaldo Gonzalez filed suit against Google (the owner of YouTube) after his daughter was killed in the 2015 terror attacks in Paris, for which ISIS claimed responsibility. Gonzalez says his daughter’s killers were radicalized on YouTube and that the platform’s recommendation algorithm, which constantly suggests new videos for users to watch, played a key role in how this happened.

“Sorry, but that’s not our fault,” says Google. So far, the $1.16 trillion company has successfully argued that Section 230 of the Communications Decency Act — also known as the “26 words that created the internet” — protects it against liability for whatever videos people see when they’re on the site. But what if YouTube’s algorithm suggests that you watch a certain video, and you then decide to kill other people as a result? When the law was inked back in 1996, I’m pretty sure no one was thinking about this possibility. So the court has agreed to hear the case, at a moment when many politicians in the U.S. (and at least one Supreme Court justice, Clarence Thomas) have their sights set on tightening the scope of the law. A smattering of amicus briefs from tech law scholars, engineers and industry experts has been filed in recent weeks, promising a stimulating debate around whatever the court decides. We’ll have more on this in the new year.

A ‘CRUEL TWIST OF FATE’ FOR TWITTER ELITES

There seem to be new catastrophes at Twitter every few minutes — hate speech on steroids, media censorship, security vulnerabilities and regulatory doom. Elon now says he wants someone else to take over, but this probably won’t cause any of the big problems at hand to evaporate. After several prominent U.S. journalists had their accounts suspended last week, presumably over their coverage of the @ElonJet tracking account, a few colleagues told me they wanted to quit the platform altogether. With so many people getting censored or going silent, that’s fair enough.