As far-right violence and anti-Semitism rear their heads in Germany, lawmakers believe that holding digital platforms to account could help bring far-right extremists to justice.
Ministers in Chancellor Angela Merkel’s cabinet recently approved a bill to toughen existing online speech laws and force social networks such as Facebook and Twitter to report criminal posts to the police. Under the planned law, social media platforms will not only have to delete certain kinds of hate speech but also flag the content to the Federal Criminal Police Office (BKA).
Companies will be required to report posts which include information about preparations for terrorist attacks and the “formation of criminal and terrorist groups.” The new law extends to multiple forms of hate speech, including racial incitement and the distribution of child pornography. The networks would also have to deliver “the last IP address and port number most recently assigned to the user profile” — which could be passed over to prosecutors.
The new measures, which require the approval of Germany’s Parliament, were first proposed in the aftermath of a terror attack in the east German city of Halle in October, when a gunman targeted a synagogue, killing two people on Yom Kippur, the holiest day in the Jewish calendar.
Germany’s most recent law against hate speech and disinformation, the Network Enforcement Act (NetzDG), was passed in 2017. The legislation, which covers social media networks and media sites with more than two million registered users, allows users to report potentially illegal content, such as incitement to violence or threats, and obliges tech companies to delete such posts within 24 hours or face hefty fines.
German officials say the rules have improved how digital platforms respond to illegal hate speech.
While the new law covers YouTube, Facebook and Twitter, messaging services like Telegram and Discord, as well as video and gaming platforms like BitChute and Steam, are not included.
The proposal comes as Germany continues to reckon with the first murder of a politician by the far right since reunification in 1990: Walter Lübcke, the president of the regional government in Kassel, was shot dead last June. A far-right extremist, Stephan Ernst, 45, has confessed to murdering Lübcke.
Critics of the new law say it could give tech companies the responsibility of deciding what deserves prosecutors’ attention in the first place.
“Tech companies will need to establish whether a certain content is punishable. This exacerbates the issue of private companies having a say on legal matters that should be a prerogative of the judicial system,” said Simone Rafael, editor-in-chief of Belltower News, a Berlin-based digital publication covering extremism and internet culture.
Switching platforms
One of the central assumptions of the law is that online hate can have dangerous consequences in real life if not curbed by robust legislation. But critics say the methods used by far-right groups to communicate on the web might get in the way of the law’s effectiveness.
One recent report by the Amadeu Antonio Foundation, a Berlin-based think tank that studies right-wing extremism, racism and anti-Semitism, shows that right-wing web figures have sought to bypass existing regulation and platforms’ own community standards for hate speech.
The report shows how far-right influencers have toned down their video content on platforms like YouTube, while using divisive or hateful language on other video platforms like BitChute.
“One of the big problems is when you don’t consider what people do on other platforms, so it can clearly be that a person is a Nazi and argues for violent ethno-genocide, but if he has a YouTube channel and doesn’t do it there, then it’s not a problem,” said Miro Dittrich, a project leader at the Amadeu Antonio Foundation.
The foundation’s report singled out 11 far-right YouTube channels with followings ranging from 43,000 to about 200,000. Eight of them are managed by independent right-wing influencers, two by the right-wing Alternative for Germany party, and one by the alt-right German news outlet Compact TV. All of the organizations also operate channels on BitChute.
BitChute did not respond to questions submitted by Coda Story for this article. Research from the Amadeu Antonio Foundation indicates the video hosting service had 26,000 users in 2017.
One of the biggest YouTube channels is run by German far-right influencer Tim Kellner — a former policeman who was once investigated for the promotion of prostitution, attempted extortion and assault — and has more than 200,000 subscribers. But while his content is regularly deleted under YouTube’s hate speech guidelines, the same doesn’t happen on BitChute, where he has 2,300 subscribers.
Kellner encourages his subscribers to leave YouTube in favor of BitChute and other platforms.
A German neo-Nazi called Dennis-Ingo Schulz, known online as “The True Association,” was banned from YouTube in 2016, but his videos are available on BitChute, where he has 1,379 followers and hosts, among other things, revisionist discussions on the Holocaust with pseudo-historians.
While the new law aims to target exactly this type of content, digital experts are skeptical about its effectiveness as it applies to only the most popular social media platforms.
According to Matthias Kettemann, a lawyer and senior researcher in digital communication regulation at the Leibniz Institute for Media Research in Hamburg, “de-platforming” hate groups or individuals from leading digital platforms is sometimes counterproductive. “No doubt that extremist content is still much present on platforms, but de-platforming those sites isn’t the solution because [content] will migrate to more obscure ones,” he said, in a phone interview.
Dittrich’s most recent research indicates the crackdown on hate speech from media companies like Facebook, YouTube and Twitter has led to an exodus of extremist accounts to other platforms.
“Telegram for example is becoming more and more like a social network than a messaging platform,” said Dittrich.
Last month, German police arrested twelve men suspected of planning attacks against mosques, migrant reception centers and politicians after a nationwide investigation into an extreme-right group. The arrests followed raids in 13 locations in six German states. According to the national news agency dpa, the founders of the cell first met on a Telegram channel.
Telegram, which has two million users in Germany, has proven to be a fertile ground for right-wing extremism. In September 2018, far-right demonstrators who gathered in Chemnitz used Telegram channels to network and organize marches and meetings.
Telegram did not respond to questions submitted by Coda Story for this article.
Dittrich’s monitoring project currently counts more than 300 German-speaking far-right Telegram channels, many of them containing calls for a violent takeover of the state to safeguard whites, as well as conspiracy theories about Jews and disinformation about race and Islam. Some channels are used to exchange racist memes and pictures of Nazi memorabilia.
One of the most vocal promoters of a switch from YouTube to Telegram is Martin Sellner, the de facto leader of the far-right European Identitarians. The group was banned from Facebook in 2018 for violating the platform’s policies. Sellner, a 31-year-old Austrian citizen with past links to neo-Nazi movements, leveraged his popularity on YouTube last summer to call for his followers to move to Telegram and “free themselves from Silicon Valley overlords.”
The new law has attracted criticism from press freedom organizations such as Article 19, which has raised concerns that it could undermine free speech. “We believe that addressing hate speech and underlying root problems and prejudices that hate speech is symptomatic of, requires more than knee-jerk responses,” wrote Barbora Bukovska, the organization’s senior director for law and policy, in an email.
The threat from far-right extremism resurfaced last month when a deadly shooting rampage at two shisha bars in the west German city of Hanau left nine people dead. While the suspect, Tobias R., 43, doesn’t appear to have had a presence on Telegram, his name was celebrated on the platform after the attack.