Local governments are weakening facial recognition bans
If you follow news about Clearview AI, you probably know that the controversial facial recognition firm was dealt a substantial blow last week after it settled a lawsuit brought by the American Civil Liberties Union under Illinois’ Biometric Information Privacy Act (BIPA), which regulates how private companies use biometrics.
There were some big wins for privacy advocates. Under the settlement, Clearview can’t sell its tech in Illinois for five years, people in the state will be able to opt out of their facial scans appearing in Clearview’s database search results, and Clearview is prohibited from selling access to its facial recognition database to most private companies.
Is this a landslide victory for privacy? Regular readers of this newsletter may know I am fascinated with BIPA and the unique role the Illinois law plays in giving people in the U.S. a pathway to push back against companies using their biometrics. But the Clearview settlement also showed some of the limitations of BIPA, at a time when other privacy regulations are under attack. This has not gone unnoticed.
“One of the most important protections that we would hope for is not in that settlement, which is Clearview having to get opt-in consent from an Illinois person before taking their face print,” said Adam Schwartz, a senior attorney at the Electronic Frontier Foundation (EFF), a digital rights group that filed an amicus brief supporting the ACLU’s case.
In Illinois, private companies need your permission to collect your biometric data. But Clearview, which has built its business on partnering with law enforcement, gets around this requirement because the law makes an exception for government contractors.
“This is a reminder that we don’t just need BIPA nationwide. We also need it to be stronger. We need to get rid of that exception for government contractors. And we also need a ban on the government using face surveillance, including through contractors,” said Schwartz.
EFF has been pushing for a similar law in California that, notably, doesn’t have that government contractor exception.
The Clearview settlement comes at a time when local governments have been trying to chip away at privacy laws across the U.S.
There has been a wave of cities and states cracking down on government use of facial recognition over the past few years. EFF now counts 17 cities and counties with bans in place. Other jurisdictions, like the state of Massachusetts, have imposed strict limits on how police can use the technology.
But the mood is starting to change. With local politicians sounding the alarm about crime rates, hard-won privacy legislation is facing new pushback in some places.
“Any time that privacy and racial justice advocates try to limit a surveillance technology that invades privacy and civil rights, there’s a fight because police want these tools, and corporations want to sell these tools,” said Schwartz.
The fight is ongoing in New Orleans. In February, Mayor LaToya Cantrell called on the city council to scale back the blanket ban on facial recognition that the city enacted in December 2020. The mayor wants to introduce exceptions to that ban, allowing facial recognition to be used in investigations into violent crimes, sex crimes and crimes against juveniles. The proposal would also allow police to use information collected by facial recognition systems run by other agencies. The city council is expected to take up the issue in the coming weeks.
Other jurisdictions are further ahead. The state of Virginia will repeal its ban on local police using facial recognition on July 1, exactly a year after that law went into effect. Police will be able to use facial recognition if they have “reasonable suspicion” a suspect committed a crime or even to identify victims or witnesses. Can you guess which facial recognition company hired lobbyists to push for the repeal? Clearview AI.
Virginia will now require any facial recognition tool police use to be at least 98% accurate, per evaluations conducted by the National Institute of Standards and Technology.
This is important, since research by academics and federal institutions alike has shown that leading facial recognition tools tend to misidentify people of color, and women of color most of all.
But to Schwartz at EFF, replacing a ban with an accuracy test is a step backward.
“If there was a bill that was nothing except let’s have accuracy testing, to us, that would be a garbage bill. It does not go nearly far enough,” he said.
In the U.S., the overall trend is still moving towards establishing privacy legislation and restrictions on facial recognition. But Virginia and New Orleans show how the pendulum does swing back and forth. To privacy campaigners like Schwartz, it underlines the need for constant vigilance.
“The lesson from this campaign to push back is to stay permanently mobilized to protect data privacy. And that means permanently trying to pass new bills and permanently trying to protect the old laws and just never stop fighting for privacy.”
IN OTHER GLOBAL NEWS:
The internet went dark in Iran’s Khuzestan Province amid protests that broke out on May 6 in response to rising food prices. Human rights activists are warning that the disruptions, which have spread to other parts of the country, might be an ominous signal of a violent crackdown on protesters. Internet shutdowns are nothing new in Iran. In 2021, authorities cut off mobile internet service in select provinces due to violent confrontations between fuel traders and authorities. The government instituted a nationwide shutdown during protests over fuel prices in 2019, when security forces killed over 300 people, according to this report by Amnesty International, the Hertie School and the Internet Outage Detection and Analysis project.
RuTube, Russia’s attempt at a localized YouTube, was knocked offline for three days last week by a crippling cyberattack. Hackers from Anonymous claimed responsibility for the attack. Russia has shut down access to most foreign social media and invested in home-grown platforms like RuTube, but this incident shows that local companies are vulnerable to hacking. A Twitter account tied to Anonymous tweeted: “#RuTube is probably GONE FOREVER,” claiming that the amorphous online collective had destroyed RuTube’s backups as well. That turned out not to be the case: RuTube is back online.
Belgium’s federal government approved plans to create a biometric database that will monitor migrants and people from non-EU countries traveling within the bloc. The digital tracking system will store people’s biometric data and travel documents, as well as information about where and when they entered the EU. Proponents say it will streamline the process of tracking down people who exceed their stay or move between EU countries without the proper documentation. The database, which would be accessible to all member states, is part of the EU’s Entry/Exit System, a plan for registering travelers from outside the bloc that is to be fully in place later this year.
WHAT WE’RE READING:
- Decisions made in Silicon Valley impact people way beyond that wealthy, image-obsessed tech bubble. But it would be a mistake to think that major tech decisions and innovations are only coming from there. Our colleagues at Rest of World have identified the most influential people in Big Tech outside the West. Check out their inaugural RoW100: Global Tech Changemakers list.
- Writing for FiveThirtyEight, Julia Craven offers a personal account about the dark side of fitness trackers and the overwhelming amount of health data they provide. She writes: “I spent many nights mining the depths of my iPhone’s Health app, wondering how I could use this data to be perfect. Whatever ‘perfect’ meant.”