The technology, which can be mounted on vans or installed at fixed locations, was intended to help police track wanted criminals. In January, the home secretary announced an expansion of the programme, increasing the number of LFR vans fivefold so that every force could access them.
Essex police commissioned University of Cambridge academics to test the system, having 188 actors walk past active cameras in Chelmsford. About half of the people on a police watchlist were correctly identified, and false positives were rare. But the study found the system identified men more reliably than women, and was “statistically significantly more likely to correctly identify” black participants than people from other ethnic groups.
Dr Matt Bland, a criminologist involved in the study, told the Guardian and Liberty Investigates: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation.”
The bias differs from the more familiar concern about LFR wrongly identifying innocent people. Last month, police mistakenly arrested a man for a burglary in a city he had never visited, confusing him with someone of South Asian descent.
Experts say overtraining the AI on black faces may explain the disparity, and that adjustments to the system could correct it. A separate study by the National Physical Laboratory found black men were the most likely to be correctly matched, though that result was not statistically significant.
The Home Office said that between January 2024 and September 2025, LFR cameras in London contributed to more than 1,300 arrests, including for rape, burglary, domestic abuse and grievous bodily harm. Critics argue the latest findings confirm long-standing warnings about bias.
“Police across the country must take note of this fiasco,” said Jake Hurfurt, head of research at Big Brother Watch. “AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”
Essex police said they paused deployments “while we worked with the algorithm software provider to review the results and seek to update the software. We then sought further academic assessment. As a result of this work we have revised our policies and procedures and are now confident that we can start deploying this important technology… We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”