Donald Trump’s second impeachment trial began with a haunting video. Representative Jamie Raskin, the lead impeachment manager, screened a compilation of clips from the January riots in Washington, D.C., tracing the mob’s route down Pennsylvania Avenue and documenting how its members smashed their way into the Capitol building.
The footage was filled with violence, obscenities and disturbing images of an enraged crowd. The video was compiled from a variety of sources, including clips posted to social media by the rioters themselves. It played to a hushed chamber. When it had finished, Raskin said: “You ask what a high crime and misdemeanor is under our constitution? That’s a high crime and misdemeanor.”
Although Trump was acquitted by the Senate, evidence gathered from social media played a central role in the case for impeachment. “There were countless social media posts, news stories, and, most importantly, credible reports from the FBI and Capitol Police that the thousands gathering for the president’s Save America March were violent, organized with weapons and were targeting the Capitol,” said Raskin on Wednesday. “As they would later scream in these halls and as they posted on forums before the attack, they were sent here by the president.”
The storming of the U.S. Capitol was one of the most extensively documented events in modern history. Around the world, people watched the violence unfold in real time via livestreams, selfies and social media posts. Rioters uploaded videos and commentary of themselves entering the building. But as quickly as the evidence was shared, it began to disappear. Within hours, a call went out from the online investigation platform Bellingcat: Save everything you can, while you can.
“There’s a very short lifespan for these videos,” said Aric Toler, who leads the organization’s research projects. Rioters quickly understood that their content could be investigated by law enforcement and began to delete it. “As soon as they’re finished livestreaming — maybe 15 minutes later — they realize, ‘I probably shouldn’t have done that,’ and take it down.”
Platforms also wiped large amounts of data. YouTube flagged and removed footage filmed inside the Capitol that showed people carrying firearms or encouraging violence, and the gaming platform Twitch took similar steps. Meanwhile, Facebook blocked any posts with the hashtag #StormtheCapitol and removed those celebrating the insurrection. Before the sun had set over Washington, the digital evidence of that darkly historic day had begun to disappear.
Enormous public interest helped Bellingcat’s crowdsourced effort; an attack on the heart of American democracy was guaranteed global attention. Within minutes of the riots starting, the team began to build its own archive of the insurrection, drawing on contributions from hundreds of volunteers around the world, who submitted livestreams and videos of the day’s events that they had scraped and saved before it was too late. Without their efforts, some of the evidence might never have been salvaged.
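Nothing about that workflow requires exotic tooling. As a rough illustration of what an individual volunteer might do (a minimal sketch, not Bellingcat’s actual pipeline; the target URL is hypothetical, and the open-source downloader yt-dlp is assumed), one could save a public video together with its metadata and a cryptographic fingerprint, so later copies can be checked against the original:

```python
# Illustrative sketch only: archive a public video and fingerprint it.
# Assumes the open-source yt-dlp library (pip install yt-dlp).
import hashlib
import json
import pathlib

from yt_dlp import YoutubeDL

def archive(url: str, out_dir: str = "archive") -> None:
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    opts = {
        "outtmpl": f"{out_dir}/%(id)s.%(ext)s",  # save under the video's own ID
        "writeinfojson": True,                   # keep platform metadata with the file
    }
    with YoutubeDL(opts) as ydl:
        info = ydl.extract_info(url, download=True)

    video_path = out / f"{info['id']}.{info['ext']}"
    digest = hashlib.sha256(video_path.read_bytes()).hexdigest()

    # The SHA-256 digest lets anyone verify later that a copy is unaltered.
    manifest = {"source_url": url, "file": video_path.name, "sha256": digest}
    (out / f"{info['id']}.manifest.json").write_text(json.dumps(manifest, indent=2))

archive("https://example.com/watch?v=XXXX")  # hypothetical URL
```

The fingerprint matters as much as the file itself: if footage is ever offered as evidence, a hash recorded at capture time is one way to show it has not been altered since.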
Platforms face a number of conflicting demands. They are under legal and regulatory pressure to remove content that incites violence. But by deleting images and videos without preserving them in a systematic way for researchers, journalists and historians to consult, they are wiping away chunks of potentially vital evidence, leaving blank spaces in what amounts to the digital record of a crime scene.
Facebook and Twitter said they were complying with law enforcement requests to preserve content in the wake of the Capitol riots. But collecting online evidence remains a strikingly haphazard process for law enforcement agencies: the day after the insurrection, the FBI asked the public to submit their own footage and images of the riots.
The Capitol riot was not the first time that history was recorded, and then erased, in real time online. When Syria began to slide into civil war in 2011, the value of social media as a documentary tool came into sharp focus. As President Bashar al-Assad’s forces shelled the city of Homs, residents and citizen journalists uploaded hundreds of hours of footage to YouTube. Then the platform started to delete some of that content, wiping away valuable evidence of war crimes committed by the regime.
Shiraz Maher, a historian at King’s College London who specializes in political extremism, said that until that point he had not considered systematically backing up the videos. On YouTube’s side, he explained, “there were pretty crude efforts to remove anything that was seen as gory war kind of stuff.”
In response, projects like the Syrian Archive were established to scrape, save and document as much of the existing footage of the conflict as possible. Maher said the perpetrators of Syria’s war crimes may escape justice in the short term, but that, decades from now, “the arc of justice may well catch up with them. They may well find themselves in The Hague or elsewhere, having to account for what they’ve done.” An archive of videos that would otherwise have been expunged from social media platforms could hold crucial evidence.
Much of what we experience, from conflicts to natural disasters and protests, is now uploaded to the internet, but it is not necessarily stored. Platforms are continually removing extremist or criminal content. People, like those at the Capitol, self-censor, deleting their own videos and photographs as soon as they realize the legal risk of such posts. Then there is what Toler calls “natural digital rot,” where content disappears because platforms and hosting services cease to exist. Most concerning of all, the battle social media platforms are waging against online disinformation is leading them to inadvertently rewrite modern history.
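That rot is at least measurable. As a minimal sketch (assuming the Internet Archive’s public Wayback Machine “availability” endpoint; the URL checked is hypothetical), one can test whether a link still resolves and whether any snapshot of it survives:

```python
# Minimal sketch: test whether a URL is still live, and whether the
# Internet Archive's Wayback Machine holds a snapshot of it.
import requests

def check(url: str) -> dict:
    try:
        live = requests.head(url, timeout=10, allow_redirects=True).status_code < 400
    except requests.RequestException:
        live = False  # dead host, timeout, expired domain: "natural digital rot"

    # Public availability endpoint; returns the closest snapshot, if any.
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url}, timeout=10)
    snapshot = resp.json().get("archived_snapshots", {}).get("closest")
    return {"url": url, "live": live,
            "snapshot": snapshot["url"] if snapshot else None}

print(check("https://example.com/deleted-livestream"))  # hypothetical URL
```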
Maher described how, during the early days of his academic career, he spent his time in the hushed reading rooms of the British Library in London, leafing through old Foreign Office documents and letters. As a director at the International Centre for the Study of Radicalization and Political Violence, he now has to race to preserve online content before it is deleted.
“I look at my daughter, and I think ‘Where will she be, if she’s a historian in 20, 30 years’ time?’” he said.
The impermanence of the internet was made particularly clear when a gunman opened fire on worshippers outside a synagogue in the German city of Halle in October 2019. The attacker livestreamed the rampage on Twitch. Maher’s team started to download the file, but before they could finish, the video disappeared.
“We had only got half of it, but one of my colleagues still had it open on her screen,” Maher explained. In the end, she salvaged the rest by filming the computer screen with a mobile phone.
In June, YouTube said 80% of videos flagged by its artificial intelligence software in the second quarter of 2019 were deleted before anyone had seen them.
Elsewhere, social media companies have seen the consequences of violent content on their platforms. Facebook admitted in 2018 that it had been used to “foment division and incite offline violence” in Myanmar after users were given almost free rein to post content targeting Rohingya Muslims.
Myanmar’s military campaign against the Rohingya, described by the United Nations as a “textbook example of ethnic cleansing,” has forced more than 740,000 people to flee to Bangladesh. Human rights groups say Facebook was used by the military to spread genocidal commentary and incite violence.
In early 2015, the company had only two Burmese speakers moderating content for the entire country. In an apparent effort to make up for this shortfall and eradicate anti-Rohingya posts, Facebook then removed hundreds of pages, accounts and groups, which collectively formed a vast body of evidence of human rights violations. When Gambia brought a genocide case against Myanmar before the United Nations’ International Court of Justice in early 2020, Facebook refused to comply with requests for access to that deleted material.
“We support action against international crimes and are working with the appropriate authorities as they investigate these issues,” said a Facebook spokesperson at the time.
There is no consistent process for recovering deleted social media content, and outcomes vary by platform and by country. “It’s so inconsistent,” said Gabrielle Lim, a technology researcher at Harvard Kennedy School’s Shorenstein Center. “If you suspect you’ve been deplatformed and you happen to be a journalist or an activist or American, you probably have a better chance of getting some sort of remedy or answer to why this happened.”
Though some countries, including the U.K., Germany and the U.S., have laws requiring platforms to retain deleted content, campaigners have lobbied for more transparent and consistent access.
Lim and her colleagues have been calling for a centralized “human rights locker,” where deleted data can be kept safe and accessed by journalists, researchers, law enforcement and historians.
“It’s a conversation our team has, like, every week: ‘What will historians write in 50 years’ time?’” Lim said. “If so much of our everyday life is now online, I think that historians also should have access to that data, so they can piece it together.”
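In engineering terms, what Lim describes is an append-only, content-addressed store. A toy sketch of the core idea (entirely hypothetical; no such locker exists today) might key each deposited item by the hash of its bytes, so identical submissions collapse into one record and any tampering is detectable:

```python
# Toy sketch of a content-addressed "locker": each deposit is stored under
# the SHA-256 of its bytes, with provenance kept in an append-only log.
# Entirely hypothetical; illustrates the idea, not any real system.
import hashlib
import json
import pathlib
import time

LOCKER = pathlib.Path("locker")

def deposit(data: bytes, source: str, depositor: str) -> str:
    LOCKER.mkdir(exist_ok=True)
    digest = hashlib.sha256(data).hexdigest()
    blob = LOCKER / digest
    if not blob.exists():               # identical content is stored only once
        blob.write_bytes(data)
    record = {
        "sha256": digest,
        "source": source,               # e.g., the original post's URL
        "depositor": depositor,
        "received_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    # Provenance is appended, never overwritten or deleted.
    with open(LOCKER / "provenance.jsonl", "a") as log:
        log.write(json.dumps(record) + "\n")
    return digest

ref = deposit(b"...video bytes...", "https://example.com/post/123", "volunteer-42")
print(ref)
```

Who gets a key to such a locker (journalists, courts, historians) is the hard governance question; the storage itself is the easy part.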
When Twitter permanently suspended then-President Donald Trump last month, his page went blank. His posts have been preserved on thetrumparchive.com and by the National Archives, but they are no longer available via Twitter. In January, the platform announced that it was opening its entire archive to researchers and historians, but that it would not provide access to tweets from banned accounts, including Trump’s.
In the days and weeks following the Capitol riots, Bellingcat’s loyal volunteers have continued to submit evidence of the violence. Toler believes the organization will end up with around 500 videos, “which is not everything, but is a pretty big chunk of what was happening.” The footage Raskin showed during the impeachment may change history. Whatever is still missing, however, may now have been ripped from its pages.