Authoritarian tech, even in its computerized form, is not as new as you might think. This month, the journal Security Dialogue published an article that makes the point. It’s called “Sensing, territory, population,” and it studies the deployment of something called the Hamlet Evaluation System (HES) in the Vietnam War. The HES was a kind of proto-big-data program that let the U.S. military collect and aggregate data on small settlements (hamlets) across Vietnam so that it could keep track of the war’s progress. The paper’s author wants us to think about what it meant for a complex, controversial war to be transformed into a series of data points:

“I argue that acts of translating the rich texture of hamlet and village life into an objectified information format constituted a unique form of ‘epistemic violence,’ rooted…in the pure abstraction of life into a digitally stored data trace.”

So, in 1967, the U.S. military was already doing something that we can now see as the “abstraction of life into a…data trace.” 

If this isn’t authoritarian tech, what is? That’s not a rhetorical question. We at Coda chose to create a channel called “Authoritarian Tech” because we think that something interesting is happening at the intersection of politics and technology. But what is that something, and is it a coherent phenomenon we can name? More and more, I think the “abstraction of life into a digitally stored data trace” is at the center of the answer.

I’m reading a book called New Dark Age: Technology and the End of the Future, by James Bridle. It argues that digital technology has, among other things, reduced our ability to know and act on the present. This is a paradox: The proliferation of data, the universal availability of more and more information, and the increased sophistication of computer models all suggest the opposite. But, to Bridle, when life gets digitized, we lose sight of it. Crucially, this distortion is political. When we pass the world through a technological lens, we get a version that reflects the political views of those who created the technology. Twitter, say, reflects a certain view of what a public sphere means, how political discourse happens, and what a user is interested in seeing. 

“Technology is not mere tool making…it is the making of metaphors,” Bridle writes. “In making a tool, we instantiate a certain understanding of the world that…is capable of achieving certain effects in that world.”

So here is how I see this connecting to authoritarian tech: Technology is a tool, but it is a tool shaped heavily by corporations and governments with specific political agendas. The technology we use, in turn, shapes our view of the world and our ability to make decisions, and it circumscribes the realm of political possibility.

This is a political power grab, away from the democratic subject and toward unaccountable product designers, government censors, and technocrats. “Computation, at every scale, is a cognitive hack,” Bridle writes, “offloading both the decision process and the responsibility onto the machine.”

Further reading:

  • Web browsers are uniting to stop Kazakhstan’s plans to monitor citizens’ internet use. (Axios)
  • Climate change endangers the internet. This article about it is interesting, but it focuses entirely on the literal physical effects of a warming planet on internet infrastructure. There’s surely a political-sociological side to this story that is just as interesting, if more speculative. (Gizmodo)
  • How Spain’s far-right Vox party is mastering social media to reach young voters. (Open Democracy)
  • AI is being pitched as a way to reduce hate speech on social media. There is one problem: *drum roll* the AI might itself be racist. (Vox)
  • Is technology actually addictive? Skip the tepid debates and read this in-depth psychology paper trying to answer the question in good faith. (Journal of Public Policy & Marketing)