
Expert Questions Met Police’s Claims on Facial Recognition Bias

Editorial


The Metropolitan Police’s assertion that its use of live facial recognition (LFR) technology is free from bias has come under scrutiny from a leading expert. Prof. Pete Fussey, who conducted the only independent academic review of police LFR usage, stated that the report the police reference does not support their claims. The police plan to deploy LFR at the upcoming Notting Hill Carnival in west London, marking their most significant use of the technology to date.

Despite concerns raised by the Equality and Human Rights Commission about the legality of LFR, the Metropolitan Police is pressing ahead with its deployment. LFR cameras will be operational at two locations in the lead-up to the carnival, with police emphasizing the technology’s role as a deterrent against crime.

Prof. Fussey, who reviewed the Met’s LFR trials between 2018 and 2019, expressed strong reservations about the force’s interpretation of a study conducted by the National Physical Laboratory (NPL). The Met claims that reforms based on this study have rendered its LFR system effectively bias-free. Fussey countered: “The claims the Met are making about the absence of bias from the NPL report are not substantiated by the facts in that report.”

The NPL study assessed the LFR technology’s performance using 178,000 images and was conducted in ideal conditions over a total of 34.5 hours. While the report concluded that there was no statistically significant bias at a sensitivity setting of 0.6 or higher, it also indicated bias at a setting of 0.56. Notably, at the 0.6 setting, the study identified seven cases where individuals from ethnic minorities were incorrectly flagged as wanted, raising concerns about the technology’s reliability.

Fussey criticized the Met’s reliance on a small sample size to support broad claims about the technology’s fairness. He noted that the decisive conclusions drawn by the police were based on only seven false matches out of millions of faces analyzed. “It is a weak statistical basis to make universal claims from such a small sample of false matches,” Fussey added, emphasizing the need for larger and more comprehensive testing.

The current sensitivity setting employed by the Metropolitan Police is 0.64, which the NPL study indicated produced no false matches. However, Fussey pointed out that the Met’s assertions about the absence of bias at this setting lack adequate testing and robust evidence. “Few, if any, in the scientific community would say the evidence is sufficient to support these claims extrapolated from such a small sample,” he stated.

In response to Fussey’s criticisms, Lindsey Chiswick, the Met’s director of intelligence, defended the force’s approach. She described the NPL report as a factual document from a reputable organization and said the Met operates above the threshold at which the study identified bias. “When we use LFR at the setting of 0.64 – which is what we now use – there is no statistically significant bias,” Chiswick said.

As the Notting Hill Carnival approaches, signs will be prominently displayed to inform attendees that LFR technology is in use. In pre-carnival operations, police have already arrested 100 individuals, recalled 21 of them to prison, and banned 266 people from attending; they also report seizing 11 firearms and more than 40 knives.

While many support the use of technology for public safety, experts like Fussey emphasize the importance of accountability and adherence to human rights standards. The Metropolitan Police claims that since 2024, LFR’s false positive rate has been one in every 33,000 cases, though they declined to specify the total number of faces scanned. In 2024, there were 26 false matches, and eight so far in 2025. The Met clarified that no individuals flagged by the system were detained without an officer’s assessment of the situation.

As public scrutiny continues, the debate around the implications of facial recognition technology in law enforcement remains active, highlighting the need for careful consideration of both efficacy and ethical standards.

