At a supermarket in the British seaside city of Portsmouth, on a road lined with cafes, Indian takeouts and novelty shops, customers race down aisles grabbing last-minute items before Christmas Day. Attached to the ceiling above the shiny gray floor, watching as people enter the store, is a camera. The device scans faces, matching them against a database of suspicious, potentially criminal shoppers who have been placed on a watchlist.

This store on Copnor Road is part of the Southern Co-op chain, which has become embroiled in a battle with privacy rights campaigners over its use of real-time facial recognition technology. In July, civil liberties group Big Brother Watch filed a complaint to the U.K.’s Information Commissioner’s Office against Southern Co-op and Facewatch — the company providing the surveillance system.

Joshua Shadbolt, a duty manager at the Copnor Road supermarket, told me that high levels of theft have forced him and his colleagues to hide items such as cleaning products behind the till. Without the technology, he fears, customers would be given free rein to steal. Since Covid restrictions were lifted in the U.K. in early 2021 following a third national lockdown, shoplifting has been on the rise, likely compounded by the cost-of-living crisis. Still, even if theft has not reached pre-pandemic levels, for Shadbolt the biometric camera has been an effective and necessary tool in tackling crime.

For Big Brother Watch, the camera is a breach of data rights and individual privacy. Every time a customer walks into a shop or business that uses Facewatch’s system, a biometric profile is created. If staff have reasonable grounds to suspect a customer of committing a crime, whether it’s shoplifting or disorderly conduct, they can add the customer to a Facewatch list of “subjects of interest.” Facewatch’s policy notice says that the police also have the power to upload images and data to Facewatch’s system.

Anyone uploading the data, which includes a picture of the suspected person’s face, their name and a short summary of what happened, must confirm that they either witnessed the incident or have CCTV footage of it. But the policy does not indicate where the bar for “reasonably suspecting” someone lies.

When a subject of interest is reported to the Facewatch system, it automatically shares that person’s data with any client within an eight-mile radius in London, a 15-mile radius in other cities and a 43-mile radius in very rural areas. This means that a person banned from one store in West London could walk into a store owned by an entirely separate company in East London and be refused entry. Every month, Facewatch also adds to its watchlist subjects of interest posted on police websites and on the website of Crimestoppers, a crime prevention charity.

The data of subjects of interest can be stored for up to two years, or longer if the police ask Facewatch to keep it in the system; everyone else’s data is held for three days.

Big Brother Watch’s complaint alleges that the Southern Co-op chain and Facewatch lack transparency about how they process people’s data and argues that they process more data than is necessary for generating and storing watchlist entries.

“It’s really hard with private surveillance systems like this for citizens to really know what’s going on, how their data is being processed, who goes on the watchlist and who doesn’t,” said Silkie Carlo, the director of Big Brother Watch under whose name the complaint was made. “I feel very, very confident that this is not only unlawful,” she added, “but a significant breach of people’s privacy rights and data protection rights and that this precedent setting is actually really, really important.”

The former Information Commissioner, Elizabeth Denham, raised concerns in June 2021 about the use of live facial recognition technology in public places, stating that “there is often a lack of awareness, choice or control for the individual in this process.”

The U.K. government has been slow to issue meaningful guidance on the use of live facial recognition technology, while the European Union has moved faster. One Dutch supermarket was forced to stop using facial recognition in 2019 after pressure from the country’s data protection authority.

The EU is in the process of drafting new regulations on the use of artificial intelligence, including the use of facial recognition technology. But the AI Act has been criticized by consumer groups for failing to address the use of facial recognition technology by companies in public areas.

As customers filtered out of the Southern Co-op into an overcast afternoon in Portsmouth, most seemed unaware of the biometric camera, and those who knew of it appeared unconcerned. Abbie Grove, a middle-aged woman clad all in black, told me: “I couldn’t give less of a shit, unless I was a shoplifter.”

A survey commissioned by the Information Commissioner in January 2019 found that only 38% of the public supported the use of live facial recognition technology by retailers. But when it came to policing, over 80% of respondents said that it was acceptable for law enforcement to use the technology.

Despite the survey showing that most people don’t support private businesses using facial recognition technology, much of the debate so far has focused on its use by law enforcement. In August 2020, the U.K. Court of Appeal found that South Wales Police’s use of facial recognition technology breached privacy, data protection laws and equality rights. But South Wales Police have since continued using it with some tweaks, and last year the Metropolitan Police, who cover most of London, ramped up their use of the technology.

One of the complaints made in the South Wales case was that facial recognition systems risk subjecting people to racial bias. Studies have shown that the technology is less accurate at identifying people of color than it is at identifying white men. In one recent case, a Black man in Georgia, U.S., was incorrectly matched with a suspect in a robbery and jailed for a week.

For Carlo, Big Brother Watch’s complaint is a landmark. “If it were lawful for private companies to create watchlists…of people that they don’t want in their shops, without a criminal threshold,” she said, “especially in the moment of technological advance that we’re living in, it would really open the floodgates.” While businesses argue that facial recognition technology is an essential aid to ensuring the safety of both customers and employees, there is mounting evidence in the U.S. that the tech is often used punitively and opaquely, and is frequently inaccurate.

As Carlo put it, the use of such technology by corporations is “privatized policing with the backing of extreme biometric surveillance.”

If Facewatch, whose systems are expanding into other stores, is absolved by the Information Commissioner of any wrongdoing, it will be a win for supporters of the further digitization of security. For privacy rights campaigners, the commissioner’s decision is a first line of defense in a long battle to protect people’s privacy and their right to be free of near-constant surveillance.