Police in Essex, in eastern England, have paused the use of live facial recognition cameras after a study suggested the system identifies Black individuals more reliably than people from other ethnic groups.
The cameras are designed to detect wanted suspects. One possible explanation for the findings, reported by The Guardian, is that the algorithm may have been over-trained on images of Black faces. Police said the software will now be reviewed in cooperation with its provider.
The study was commissioned by Essex Police and carried out by the University of Cambridge. A total of 188 participants walked past facial recognition cameras mounted on police vehicles in Chelmsford. The results, published last week, showed that around half of those placed on a watchlist were correctly identified, while false matches were rare.
Men were more likely to be correctly identified than women, and it was “statistically significantly more likely that Black participants were correctly identified than participants from other ethnic groups,” according to the study.
The findings raise “questions about fairness that require ongoing monitoring,” the report noted, recommending further analysis across demographic groups to ensure equitable use. One of the authors, criminologist Matt Bland, said that individuals passing such cameras would be more likely to be identified as a person of interest if they were Black.
Live facial recognition is currently used by multiple police forces across the United Kingdom, with cameras deployed either at fixed locations or mounted on vehicles for mobile use. Earlier this year, Home Secretary Shabana Mahmood announced plans to expand the rollout, including a significant increase in the number of operational units.
In London alone, the technology contributed to around 1,300 arrests between January 2024 and September 2025, involving suspects accused of offences including rape, serious assault, and domestic violence.