Are You Ready for Live Facial Recognition?
Surrey and Sussex police forces are the latest to roll out live facial recognition. In our latest blog we look at what LFR is and what it can do, and consider people’s concerns about this new technology.
Facial Recognition in the UK
Facial recognition technology is rapidly becoming an essential frontline tool in both policing and retail security. The rollout of this new technology has been criticised by some, who describe it as “incredibly intrusive”, but police forces have described it as a precise and targeted tool that will free up officers to spend more time dealing with emergencies and investigating crime.
In November 2025, Surrey and Sussex Police officially launched their first live facial recognition (LFR) vans, marking a significant moment in the national debate over surveillance, privacy, and public security. But what exactly is LFR, how is it used, and why is it so controversial? Read on to learn how LFR works, examine its use in policing and retail, and explore the key ethical, legal, and social concerns.
What Is Live Facial Recognition and How Does It Work?
Live facial recognition (LFR) is a technology-based system that captures real-time images of people from a video camera feed and automatically compares them against a predetermined “watchlist” of individuals of interest. In the case of Surrey and Sussex, this watchlist includes people suspected or convicted of serious crimes such as sexual offences or domestic abuse.
Van-mounted cameras continuously scan the faces of anybody who walks past the van. A face-matching algorithm compares the captured images with those on the predetermined watchlist. If a match is detected, police officers are alerted; they then carry out their own comparison between the captured facial image and the face on the watchlist before taking any action.
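To make the matching step concrete, here is a minimal illustrative sketch in Python. It assumes each face has already been converted into a numerical “embedding” vector by a face recognition model, and that a match is declared when the similarity between a captured face and a watchlist image exceeds a set threshold. The function names, the similarity measure and the threshold value are all assumptions for illustration; the actual algorithms used by police systems are not public.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(captured_face, watchlist, threshold=0.75):
    """Compare one captured face embedding against every watchlist embedding.

    Returns the ID of the strongest match above the threshold, or None,
    in which case the captured image would simply be deleted.
    """
    best_id, best_score = None, threshold
    for person_id, reference in watchlist.items():
        score = cosine_similarity(captured_face, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id  # a non-None result alerts an officer for manual review
```

In the real deployment an alert is only a prompt: as described above, an officer manually compares the two images before taking any action.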
Importantly, privacy protection is built into the police system. Images of people who aren’t on the watchlist are immediately and permanently deleted, and watchlist images are deleted within 24 hours of each day of use.
There are currently no specific “facial recognition laws” in the UK; the use of LFR falls under data protection law (UK GDPR), human rights law and policing legislation. Both Surrey and Sussex police forces have said that whenever LFR is deployed it will be publicly announced and authorised by a police superintendent.
Surrey & Sussex’s New Deployment
As noted, in November 2025, Surrey and Sussex Police launched a number of Home Office-funded LFR vans. Their stated purpose was to detect and deter serious offenders, especially those on a high-harm watchlist (for example sex offenders and domestic-abuse perpetrators).
The forces emphasise the privacy safeguards being followed. The vans will only be deployed where necessary, they will be clearly marked, and deployment locations will be publicised in advance. Before any arrest or engagement, a police officer will always manually confirm the match between the person and the image on the watchlist.
Surrey Police have highlighted how the facial recognition algorithm they use has been tested by the National Physical Laboratory (NPL) and, according to their statements, shows no statistical bias related to race or gender.
Expansion Across the UK
This latest LFR rollout by Surrey and Sussex police forces is part of a wider national expansion in the use of facial recognition. The government has announced plans to deploy more LFR-equipped vans across various other police forces including Greater Manchester, West Yorkshire, Thames Valley, and Hampshire.
Those who support this new technology describe it as a powerful intelligence-based tool that’s focused on the identification of high risk offenders, protecting communities and making policing more efficient. But increased use of LFR technology is also raising many questions and concerns regarding privacy, oversight, transparency and civil liberties.
Facial Recognition in UK Retail Security
Facial recognition technology is not only being used by the police; it’s also being trialled by retailers who are struggling with increased shoplifting and violence toward shop workers.
Back in September, Sainsbury’s announced an eight-week trial of facial recognition at stores in London and Bath. The supermarket chain stated that the new technology was being introduced as part of its efforts to identify shoplifters and curb the sharp increase in retail crime in recent years. But privacy campaigners condemned the plans as “chilling”.
The technology provider ‘Facewatch’ has been working with the retailer. Its system works by matching faces against a database of known offenders reported by multiple retailers. When a match is detected by the system, staff are automatically alerted, enabling them to take appropriate action.
The primary aim of facial recognition in retail environments is, as noted, to deter and prevent crime. Retail crime has been rocketing, with a 20% increase in shoplifting offences from 2023 to 2024. Violence, threats and abuse toward retail workers have also increased massively, with around 2,000 recorded incidents every day. USDAW (Union of Shop, Distributive and Allied Workers) has reported that 77% of shop workers experienced verbal abuse in the past year, 53% were threatened by a customer and 10% were physically assaulted.
Retailers argue that facial recognition technology helps deter crime, reduce losses and, importantly, protect staff.
Key Concerns About Facial Recognition Technology
As LFR and biometric surveillance become more common, a number of critical worries have been raised. Here are the main ones.
Accuracy & Bias
Even though the face-matching algorithm used by Surrey and Sussex police forces is claimed to have “no statistical bias”, facial recognition systems have historically struggled with racial and gender bias. False positives are a concern, particularly for minority groups, who may be wrongly flagged. A false positive occurs when an innocent person, whose image is not on the watchlist, is wrongly identified; this can happen when somebody has similar facial features to an image on the database.
Facial recognition systems use a confidence score to determine whether two faces match. If the threshold is set too low (to avoid missing real matches), the system becomes more likely to generate false positives; set it too high, and genuine matches will be missed.
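As a purely hypothetical illustration of that trade-off (the similarity scores below are invented, not drawn from any real system), this short Python snippet shows how moving the threshold changes the number of innocent passers-by who trigger an alert:

```python
# Invented similarity scores between innocent passers-by and watchlist images.
# None of these people are on the watchlist, so every alert is a false positive.
innocent_scores = [0.42, 0.58, 0.61, 0.73, 0.79]

for threshold in (0.60, 0.70, 0.80, 0.90):
    false_alerts = sum(score >= threshold for score in innocent_scores)
    print(f"threshold {threshold:.2f}: {false_alerts} false alert(s)")

# A lower threshold (0.60) wrongly flags three people here; a higher one
# (0.80+) flags none, but would also miss more genuine matches.
```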
Privacy & Mass Surveillance
One of the most commonly cited concerns about LFR, especially when deployed in public areas, is that it treats everyone as a potential suspect. Cameras capture images of all faces within range, regardless of who they are and whether they are persons of interest. This mass scanning can have a chilling effect on behaviour, making people feel less able to assemble, protest or simply go about their day-to-day business.
Data Retention & Use
Although Surrey and Sussex police forces have stated that non-watchlist facial images are automatically and immediately deleted and watchlist data is also deleted within 24 hours of use, questions still remain regarding who has access to this personal data, how secure it is and whether any of this data is shared or reused beyond the initial operation.
In retail-based facial recognition, how data is shared, secured and accessed is not openly disclosed. The Facewatch system, for example, is based on a shared database of offenders’ images derived from participating stores. This means that if a person’s face is added by one store it can influence how they are treated in others, and the targeted individual may have no knowledge of what has been recorded and shared.
Legal & Regulatory Gaps
As noted, here in the UK the use of facial recognition technology is regulated under existing data protection and human rights laws. Many campaigners are arguing for the introduction of dedicated facial recognition legislation along with an independent regulator.
Both Surrey and Sussex police forces have emphasised their strong governance and transparency in the use of facial recognition. In retail security, facial recognition data counts as biometric data, which is classed as a special category of personal data. This means retailers must meet strict legal tests including:
- Having a lawful basis for processing (usually “legitimate interests”).
- Meeting a special category condition (usually “substantial public interest” for crime prevention).
- Conducting a Data Protection Impact Assessment (DPIA) before deploying facial recognition.
- Demonstrating that the system is necessary and proportionate to the problem (e.g., shoplifting, violence toward staff).
- Applying data minimisation, meaning only essential data is captured, stored, and shared.
In practice, this means retailers require a documented justification for using facial recognition and must show they have considered less intrusive retail security measures first.
Ethical & Social Implications
As well as the legal considerations, there are also significant ethical concerns. Some people think that LFR undermines the presumption of innocence, the core principle that everyone is innocent until proven guilty. Traditionally, police require ‘reasonable suspicion’ before searching or checking someone; LFR, by contrast, checks every face captured by the camera against the watchlist, effectively treating each passer-by as a potential suspect.
Another significant concern is the risk of disproportionate deployment. If LFR systems are deployed mainly in specific neighbourhoods, perhaps areas with higher crime rates or lower incomes, it could exacerbate social inequalities and lead to accusations of discriminatory policing.
There is also concern that, as LFR systems and infrastructure proliferate, the technology could be misused in ways that go beyond its originally stated purpose.
Why Authorities Are Supporting LFR
Despite the concerns, police forces, local authorities, central government and retailers see LFR as a valuable tool for the following reasons:
- Crime Prevention & Public Safety: Police argue that LFR helps them quickly identify and arrest individuals wanted for serious crimes, keeping communities safer. Deployment in Surrey and Sussex has already resulted in the apprehension of wanted individuals.
- Efficiency: Rather than stopping people randomly or relying on manual checks, LFR can focus efforts on high-priority suspects, freeing up valuable police resources.
- Transparency and Safeguards: Surrey and Sussex police forces emphasise that their deployment will be open, with clear signage, public notification, and senior-officer approval.
- Independent Testing: The face-matching algorithm used by police force systems has been vetted by the National Physical Laboratory and, according to police statements, is reliable and shows no statistical bias related to race or gender.
- Protects Staff and Reduces Theft: Retailers highlight how LFR technology protects staff, reduces losses and is focused on known offenders. Regular customers are unaffected, and reducing theft means costs are not passed on to consumers.
What Needs to Happen
Facial recognition has the potential to provide many benefits that are good for communities, consumers, retailers and policing. But there are clearly significant risks that need to be recognised and addressed. Here’s a summary of what’s needed:
- Stronger Regulation: Many are calling for the introduction of a clear, dedicated legal framework for facial recognition that addresses surveillance, data storage, sharing, and redress.
- Independent Oversight: Another cited requirement is to establish or empower a body (or use the Information Commissioner’s Office - ICO) to audit LFR deployments, test for bias, and enforce accountability.
- Community Engagement: Police forces need to involve local communities in decisions about where and how LFR is used, and explain its purpose clearly.
- Transparency: Full transparency should include public reporting of where LFR-equipped vans are deployed, how many matches are made, how many false alerts occur, and what happens to data.
- Continuous Testing & Bias Audits: It is essential to regularly reassess the LFR technology for bias (racial, gender, age) and refine the system accordingly.
- Redress Mechanisms: Individuals who believe they’ve been wrongly flagged need clearly defined, accessible ways to challenge and correct their data.
Facial Recognition is Here and Evolving
The arrival of LFR-equipped police vans in Surrey and Sussex represents a pivotal moment in the UK’s roll-out of biometric surveillance. For police, it’s a powerful new tool that could help crack down on high-harm offenders more efficiently. For retailers, live facial recognition offers a potentially powerful way to deter repeat shoplifters and protect staff from violence and abuse. But alongside those potential benefits lie serious risks to privacy, civil liberties, and social justice.
If this technology is going to become more widespread, it’s vital that its growth is accompanied by strong legal frameworks, transparent practices, and meaningful public debate. As citizens, we should stay engaged, ask questions, and demand that our rights are protected as policing and retail security tactics evolve.
If you have any questions about retail security, or if you have any special requirements, remember we are here to help. Give us a call on 01273 092921 and we’ll provide you with free, expert advice.
For more information on live facial recognition, talk to Insight Security.