If the police wouldn’t help, then Simon Gordon would. For eight years, he’d been the proprietor of London’s oldest wine bar, his family business Gordon’s, where some fifteen crimes happened every month. It was mostly pickpocketing: two of his friends even had their wallets stolen whilst visiting him at the bar. But since the culprits of petty crimes, such as pickpocketing and “low value shoplifting” (anything under £200), were rarely prosecuted, the police would never help. So Gordon took matters into his own hands: in 2010, he set up Facewatch, a centralised CCTV system, to protect businesses like his from thieves.
Some 15 years on, Facewatch surveils most British high streets. Instead of mere CCTV, it offers live facial recognition (LFR) technology, which connects businesses to a private watchlist of “subjects of interest” (SOIs): individuals suspected of theft or antisocial behaviour. If an SOI enters a subscribing store, an alert is triggered. Facewatch’s signs are posted on shopfronts and windows nationwide, informing shoppers how their data will be used and turning every visit to a surveilled site into an unwitting act of consent. Given that shoplifting rates are currently at a record high, it’s obvious why shopkeepers want the technology.
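Facewatch does not publish the internals of its matching pipeline, but the general shape of any LFR watchlist check can be sketched. The following is a minimal, hypothetical illustration: the embedding function, the 0.6 threshold and every name in it are assumptions made for the sake of the example, not Facewatch’s actual design.

```python
# A minimal, hypothetical sketch of an LFR watchlist check.
# Facewatch's real pipeline is not public; every name and number here
# (embed, match_watchlist, the 0.6 threshold) is an illustrative assumption.
import numpy as np

def embed(face_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model: maps an image to a
    unit vector so that similar faces land close together."""
    v = face_pixels.astype(float).ravel()
    v -= v.mean()                       # centre so unrelated faces score near zero
    return v / np.linalg.norm(v)

def match_watchlist(visitor: np.ndarray, watchlist: dict, threshold: float = 0.6):
    """Compare one scanned face against every 'subject of interest' (a 1:N search).
    Returns the best-scoring watchlist entry above the threshold, or None."""
    best_id, best_score = None, threshold
    for soi_id, soi_vec in watchlist.items():
        score = float(visitor @ soi_vec)  # cosine similarity of unit vectors
        if score > best_score:
            best_id, best_score = soi_id, score
    return best_id

# Toy usage: random images stand in for camera frames.
rng = np.random.default_rng(0)
soi_image = rng.random((32, 32))
watchlist = {"SOI-001": embed(soi_image)}
print(match_watchlist(embed(soi_image), watchlist))             # "SOI-001": alert sounds
print(match_watchlist(embed(rng.random((32, 32))), watchlist))  # None: face discarded
```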
LFR is lawful, says the Information Commissioner’s Office (ICO), if it is “necessary and proportionate to [the user’s] objectives”, and the ICO has concluded that Facewatch’s use is legitimate for the purpose of preventing retail crime. Facewatch insists that its work complies with the Data Protection Act, the UK GDPR and the Surveillance Camera Code of Practice. SOIs may only be added to the watchlist if there is CCTV or witness evidence that a crime has been committed. Faces that are scanned but do not match anyone on the watchlist are not added to the database; they are deleted promptly, well within the retention limits that apply to ordinary CCTV.
LFR has crept unnoticed into common use this past decade, much as CCTV did in the Nineties. CCTV, too, started out as a useful crime-fighting technology that soon overreached. Suddenly, it was everywhere: policing bus lanes, issuing speeding tickets, all whilst transforming Britain into one of the most watched countries per capita, with nearly one million CCTV cameras on the streets of London today.
But LFR transforms those ubiquitous cameras into an automated identification system. Facewatch, like all civilian users of facial recognition technology, exists in the interstices of overlapping legal and regulatory frameworks designed for other purposes. Surveillance cameras did not, originally, analyse what they were filming in real time. Biometric technology for unlocking smartphones is used knowingly by the person whose face is being recognised, in their own interests and with their consent. LFR is different because it is indiscriminate: it is directed at everyone in a particular area, rather than at specific individuals.
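The distinction can be made precise. A phone performs one-to-one verification: one probe face compared with one template its owner chose to enrol. LFR performs one-to-many identification: every face that passes the camera compared with every entry on a watchlist. A schematic sketch follows, with hypothetical function names and a toy similarity measure rather than any vendor’s actual API.

```python
# Schematic contrast between 1:1 verification and 1:N identification.
# Hypothetical code: not any vendor's actual API.

def similarity(a: tuple, b: tuple) -> float:
    """Toy similarity: fraction of matching features."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def verify(probe, enrolled_template, threshold=0.9) -> bool:
    """1:1 (phone unlock): the user consented and enrolled their own template."""
    return similarity(probe, enrolled_template) >= threshold

def identify(probe, watchlist, threshold=0.9) -> list:
    """1:N (LFR): everyone in range is compared to every watchlist entry,
    whether or not they consented or are suspected of anything."""
    return [pid for pid, template in watchlist.items()
            if similarity(probe, template) >= threshold]

enrolled = (1, 0, 1, 1, 0, 1)
print(verify((1, 0, 1, 1, 0, 1), enrolled))                # True: the owner unlocks their phone
print(identify((1, 0, 1, 1, 0, 1), {"SOI-42": enrolled}))  # ['SOI-42']: a passer-by is flagged
```

The asymmetry is the point: in verification the subject initiates the check; in identification the system selects the subject.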
Defenders of the technology note that its accuracy is improving all the time: its disproportionate misidentification of non-white and female faces, for example, has diminished (though it remains a problem). More powerful technology is not necessarily less liable to discriminate against minority populations, however. “Special category” personal data revealing an individual’s ethnicity, sex or religious beliefs can now be inferred through facial recognition software, which is why its collection and processing are specifically governed by data-protection and equality laws.
Other countries take LFR more seriously. The Swedish Data Protection Authority fined a school €20,000 for using facial recognition to track attendance, whilst in 2020 its Dutch equivalent issued a formal warning to a supermarket which intended to use the technology. “Facial recognition makes us all walking bar codes,” said Monique Verdier, the Dutch authority’s deputy chair. “If we have cameras with facial recognition technology everywhere, everything all of us do can be continuously monitored.”
That is what the UK campaign group Big Brother Watch fears will happen here. Private systems operate outside the judicial system, which means that individuals can be “blacklisted from their high street… at the discretion of a security guard”, without police involvement, without a fair trial, and without even being told that they are on a watchlist.
Earlier this year, Facewatch falsely accused a Mancunian woman of stealing £10 worth of toilet roll from Home Bargains. She was unaware of this until she returned to the store, where the alarm went off and security guards pounced on her and told her to leave. When she asked Home Bargains why this had happened, she was simply told to contact Facewatch, which for a long time did not reply to her emails. Until it did, she remained on the watchlist: unable to enter any business Facewatch protects without setting off a siren. “I have never stolen in my life and so I was confused, upset and humiliated to be labelled as a criminal in front of a whole shop of people,” she said.
But still, the technology continues its creep into public life. The UK is not alone in introducing LFR to streamline border controls. The Home Office is trialling its use to identify returning deportees. And of course police forces across the UK are gradually adding it to their armoury.
This year alone, Metropolitan Police use of LFR has led to more than 962 arrests. One van recently scanned over 50,000 faces in a single day. In September, Sir Mark Rowley, the Metropolitan Police Commissioner, described the technology as a “gamechanger”. The Met started using LFR in 2016, and since 2020 its watchlist has ballooned from some 6,000-7,000 names to over 16,000. Grounds for inclusion include a court order forbidding you from travelling to certain places (sometimes applied to people arrested at protests); a civil order, made without court involvement, “to protect a person or persons from criminality”; or being a vulnerable person or a victim of serious crime.
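Those volumes put the accuracy debate in context. The back-of-envelope sketch below is purely illustrative: the false-match and true-match rates are assumed figures chosen to show the base-rate effect, not published statistics for the Met, Facewatch or any real system. Even a matcher that wrongly flags just one face in a thousand will, at 50,000 scans a day, produce dozens of false alerts for every genuine one.

```python
# Hypothetical base-rate arithmetic; the rates below are assumptions for
# illustration, not published figures for any real deployment.
faces_scanned_per_day = 50_000    # the article cites one van scanning this many
false_match_rate = 0.001          # assume 99.9% specificity (0.1% false matches)
watchlist_members_passing = 5     # assume few genuine 'subjects of interest' pass by
true_match_rate = 0.9             # assume 90% of them are correctly flagged

false_alerts = (faces_scanned_per_day - watchlist_members_passing) * false_match_rate
true_alerts = watchlist_members_passing * true_match_rate
precision = true_alerts / (true_alerts + false_alerts)

print(f"False alerts per day: {false_alerts:.0f}")          # ~50 innocent people flagged
print(f"Chance a given alert is correct: {precision:.1%}")  # ~8%
```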
In August, then-home secretary Yvette Cooper announced plans to introduce 10 LFR vans across England to help catch “sex offenders or people wanted for the most serious crimes”. Chris Philp, the shadow home secretary, agreed: “I strongly support the national rollout of this technology which will take dangerous criminals off the street.” This is one area on which both Tories and Labour agree. Only last weekend, LFR vans were rolled out in Greater Manchester, Surrey and Sussex.
Public opinion generally supports police use of the technology. One recent survey found that over half of UK adults are “comfortable” with it. Catching sex offenders is just one of the reasons for its popularity. People also accept it as part of enhanced security at embassies, concerts and sports stadiums to prevent terrorist attacks. In such settings, with bag searches, ID checks, metal detectors and other intrusive measures already in place, LFR’s introduction isn’t a radical development.
However, police use has gone far beyond these situations. In 2017 South Wales Police were already using it to target ticket touts, and in 2020 that same force lost a court case after using the technology to monitor a peaceful protest outside a defence exhibition.
After this, the College of Policing issued guidance requiring all LFR deployments to be “targeted, intelligence-led, and geographically limited”; forces that want to use LFR must explain why it is necessary. In 2020, the London Policing Ethics Panel, an independent body with no statutory powers of enforcement, recommended general principles including “necessity and proportionality” and called for “robust voluntary self-regulation with independent oversight”. There is, however, no external monitoring of the Met’s policy. Supposedly “intelligence-based, targeted and proportionate” deployments often, in practice, scan passers-by in public spaces against thousands of watchlist faces, under the vague criterion of being near a crime hotspot.
Again, the UK stands out in its enthusiasm for LFR. The EU prohibits its use in law enforcement except in exhaustively listed and narrowly defined situations, where the use is strictly necessary to achieve a substantial public interest that outweighs the risks: searching for certain victims of crime, including missing persons; preventing threats to life; or responding to terrorist attacks. Mere shoplifting would fail that test.
The UK’s Equality and Human Rights Commission regards the Metropolitan Police’s policy as incompatible with the human rights to privacy, freedom of expression, and freedom of assembly and association. Its use at demonstrations, in particular, could have a chilling effect on the exercise of those rights. Almost half of UK adults would be unlikely to attend a protest or stand on a picket line if live facial recognition were in use, according to a September 2025 survey.
This lies at the heart of the problem with LFR in a democratic society. If your face is scanned several times a day, and automatically compared to a digital watchlist, that means the end of privacy in the public space. If you can at any point be asked to prove that you are not the “subject of interest”, then you have lost the freedom to be anonymous. Everything you do — visiting a friend, going to the supermarket, joining a protest — will be monitored. First by a machine but then, if your face resembles somebody else’s, by a police officer asking you to prove your innocence by verifying your identity.
Fear of crime has allowed the widespread use of novel technology to run ahead of public awareness, let alone discussion of its pitfalls. Even now, we are expected to trade our freedom to go unnoticed for protection from harms which may be quite disproportionate to what we are losing. A degree of privacy is essential to our freedom to participate in public life, and such privacy will not be assured until private and police use of the technology is regulated. The freedom to be anonymous is intrinsic to our political freedom.