The Guardian recently published a story about how the Pentagon is piloting swarms of surveillance balloons that will watch over several U.S. states in order to support criminal investigations. The balloons would essentially track every vehicle below.

These kinds of law enforcement projects have become unremarkable. After all, what can we do to halt the march of technology? New products come to market, and their most invasive applications seem to follow soon after. It is likely a short step from this pilot program to nationwide deployment.

A side note: For the past few weeks, I’ve been talking to historians of technology for another Coda project, and a few of them have voiced frustration with how conversations about technology are framed. Our debates about technology, they say, are too much about the technology itself: Is AI dangerous? Is facial recognition racist? Is social media bad for our politics?

Back to the Guardian article on surveillance balloons, where there’s an interesting quote in the piece: “[I]f they decide that it’s usable domestically, there’s going to be enormous pressure to deploy it,” said an ACLU representative.

Notice the word “pressure.” What is this “pressure”? Where is it coming from?

The premise behind these questions is that technology is the actor: once you have the technology, it changes society in a certain way. Certainly, people talk about regulation, but it is usually conceived of as a Sisyphean task, perpetually a step behind the tech. What these scholars have suggested to me is that we’re missing the human element: the ways we design, deploy, and talk about technology are all human constructs.

So let’s revisit this balloon program: The U.S. is testing a balloon mass-surveillance system. If it works, it may be deployed across the country. There would be “pressure” for this to happen. But that pressure is neither neutral nor inherent in the technology. According to Allied Market Research, the global video surveillance market will be worth more than $80 billion by 2025, while U.S. defense companies spend over $100 million on lobbying members of Congress every year. Sierra Nevada, the company behind the balloons, spent over $1 million last year alone. These processes, not the balloons themselves, are the cause of proliferating mass surveillance.

A similar “just because we can” attitude in tech circles came up in a recent New York Times op-ed from, of all people, Palantir founder Peter Thiel.
Thiel is upset that Google opened an artificial intelligence lab in China, which could easily expose its AI advances to the Chinese military. “A.I. is a military technology,” writes Thiel. He quotes Google saying that “A.I. and its benefits have no borders.” Here, Thiel criticizes a similar mentality — a pressure, if you will — to advance and spread innovation: the idea that more technology, in more places and more deployments, is always good.

A similar attitude is the subject of an article Charles Rollet reported for us last week, on the role of Western academic institutions in developing “ethnicity detection” technology in China. There, it seems, the idea that scientific knowledge is inherently good and should be shared freely became an alibi for developing racist technology.

In a newsletter from three months ago, I mentioned a researcher who had suggested that China’s social credit system might become a kind of trade barrier by discouraging Chinese consumers from purchasing foreign products. Something not entirely dissimilar is beginning to happen, but instead of consumers, the social credit system is targeting foreign companies directly, according to Axios.

An interesting example: Remember when China told U.S. airlines to stop listing Taiwan as an independent country? Apparently, one potential punishment for non-compliance was to have “your company’s serious dishonesty” recorded in the social credit system.

OTHER NEWS:

  • “Gamification” has been a buzzword for years, fueled by excitement about using technology to make boring tasks more exciting. But a Logic Magazine essay, built around a Lyft driver’s personal experience, persuasively argues that gamification makes it easier to exploit workers. Rather than recognizing that they are being mistreated, workers focus on beating “the game.” The illusion of possible victory makes them both more productive and more docile. Of course, governments are happily applying these lessons to ideology — recall China’s now-notorious Xi Jinping education app that gamified party indoctrination. (Logic Magazine)
  • China’s state-sponsored hackers are likely hacking video games on the side — yet another snapshot into the links between government and criminal hacking. (MIT Technology Review)
  • In April, we reported on two Saudi sisters who escaped their country by breaking into an app that regulated their travel. Media organizations are now reporting that those travel restrictions have been relaxed. (Channel 4)
  • New U.S. government rules will make it more difficult to legally go after algorithmic discrimination. (Vice)
  • A new report analyzes the possibility of “rogue” countries like North Korea setting up an alternative financial system based on cryptocurrency. This would help them evade US sanctions. (Foundation for Defense of Democracies)