Met Police’s facial-recognition software wrong for 98% of matches, report says

You might not have realised it, but facial-recognition technology is already in use at large-scale events, from concerts to sports matches. It was deployed at the Notting Hill Carnival and the Anthony Joshua boxing match, and has even been used at public appearances by Meghan Markle. But as it becomes more widely adopted by police forces, can we trust it?

According to data obtained by The Independent, the answer is no. The report suggests that the facial-recognition tech used by the Metropolitan Police is wildly inaccurate, flagging up false positives in 98% of cases. Out of the 104 matches the software has so far produced, just two have been accurate.
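The headline figure is straightforward arithmetic on the numbers in the report. A quick sketch, using the Met's reported totals:

```python
# Figures reported by The Independent for the Met Police trial.
total_matches = 104
accurate_matches = 2

false_positives = total_matches - accurate_matches
false_positive_rate = false_positives / total_matches  # 102 / 104

print(f"{false_positive_rate:.0%}")  # → 98%
```

In other words, the "98% wrong" claim counts every alert the system raised that did not correspond to the person it named.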

Similarly, the facial-recognition software being trialled by the South Wales Police is highly error-prone. Across the 15 deployments made by the police force since June 2017, there have been 2,400 false positives; only 234 of its matches, roughly 9%, have been accurate.

The technology works by analysing a photo of a person’s face and pinpointing certain biometric identifiers, such as the distance between the eyes or the length of the person’s nose. These results are then checked for positive matches in the police forces’ databases. 
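The matching step described above can be sketched in simplified form. This is a hypothetical illustration only: real systems use learned face embeddings rather than a handful of hand-picked measurements, and the names, values, and threshold below are all invented for the example.

```python
import math

def face_signature(eye_distance_mm, nose_length_mm, jaw_width_mm):
    """Bundle a few biometric measurements into a feature vector."""
    return (eye_distance_mm, nose_length_mm, jaw_width_mm)

# An illustrative "watchlist" database of known signatures.
watchlist = {
    "suspect_a": face_signature(62.0, 51.5, 138.0),
    "suspect_b": face_signature(58.5, 48.0, 131.0),
}

def check_against_watchlist(candidate, threshold=3.0):
    """Flag watchlist entries whose signature is within the threshold.

    The threshold is the crux: set it too loose and the system raises
    the kind of false positives the report describes; too strict and
    it misses genuine matches.
    """
    return [name for name, reference in watchlist.items()
            if math.dist(candidate, reference) <= threshold]

probe = face_signature(62.3, 51.0, 137.5)  # camera capture (invented)
print(check_against_watchlist(probe))  # → ['suspect_a']
```

A probe close to a stored signature is flagged; everything hinges on how "close" is defined, which is why accuracy figures vary so much between deployments.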

“In terms of governance, technical development and deployment is running ahead of legislation and these new biometrics urgently need a legislative framework, as already exists for DNA and fingerprints,” the UK’s independent biometrics commissioner, Paul Wiles, told The Independent.

According to the report, one person was erroneously detained at the Notting Hill Carnival after the software flagged a match. The Met Police told The Independent, however, that the individual was not arrested and was released once officers realised they had already been spoken to over a public order offence.

With results such as these, it’ll be difficult to gain public acceptance of a technology that is, currently, highly unreliable. Silkie Carlo, director of the Big Brother Watch pressure group, believes that the technology isn’t fit for purpose, and “police must immediately stop using real-time facial recognition if they are to stop misidentifying thousands of innocent citizens as criminals.”

Notably, such software is being trialled in numerous other countries, with China's police force most recently rolling out its face-recognising sunglasses after a successful trial.
