UK police facial-recognition database has gone “far beyond custody purposes”
A watchdog has warned that a facial-recognition database held by UK police has exceeded 19 million images, and that the situation could lead to false intelligence and wrongful allegations against innocent people.
Biometrics commissioner Paul Wiles said in his annual report that the Police National Database (PND) has expanded far beyond its original purpose, and that it is enabling police to use facial-recognition technology to profile people in public spaces.
Wiles points in particular to this year’s Notting Hill carnival, where police used facial imaging to check members of the public against a force watch list.
“The use of facial images by the police has gone far beyond using them for custody purposes,” he writes. “In July 2016 there were 19 million facial images on the Police National Database (PND), 16,644,143 of which had been enrolled in the facial image recognition gallery and were (and remain) searchable using facial recognition software, although it is not clear how many of these are duplicate images or which relate to unconvicted persons.
“In addition, not all forces are uploading images to PND, including the [Metropolitan Police Service] MPS who hold their own extensive collection, so 19 million is an underestimate.”
A High Court ruling in 2012 made it unlawful for police to retain images of individuals they had arrested or questioned but had not charged with or convicted of an offence. In his report, Wiles laments that five years after that ruling, “we still do not have a clear policy in operation to correct that situation”.
He told BBC Radio 4’s Today programme: “I think it’s very worrying because if we’re not careful the public will lose confidence in the police.”
The report acknowledges that facial images have been used throughout the history of policing, but the ability to store digital images on a searchable database, and the scope to roll out facial-recognition technology in public places, are new concerns. Wiles notes that, as well as facial recognition, a new wave of biometrics also includes voice recognition and iris, gait and vein analysis.
Privacy advocacy group Big Brother Watch said it welcomed the biometrics commissioner’s warnings and concerns:
“It is of very serious concern that the Home Office appear to be so unwaveringly set on embedding facial biometric recognition technology into policing without debate, regulation, legislation or independent scrutiny,” said Renate Samson, the group’s chief executive. “Rather than throwing millions of pounds at the building of such intrusive capabilities, the Home Office should be investing in updating police IT systems to ensure that the hundreds of thousands of innocent people’s custody images and facial biometrics are deleted automatically as soon as they are released without charge, bringing them into line with DNA and fingerprints.”
Facial recognition also made the headlines this week with the launch of Apple’s iPhone X. The new handset has a facial-imaging feature called Face ID, which allows the user to unlock their device or make payments via facial identification. The company has been keen to stress the security of the feature, claiming Face ID’s error rate is only one in a million.
It has nevertheless raised questions about privacy and security, with US senator Al Franken pressing Apple for more details about its safeguards. Franken is the ranking member of the Senate Judiciary Subcommittee on Privacy, Technology and the Law.
“Apple itself could use the data to benefit other sectors of its business, sell it to third parties for surveillance purposes, or receive law enforcement requests to access [its] facial recognition system – eventual uses that may not be contemplated by Apple customers,” Franken wrote, setting out ten questions for the company, including what measures it has taken to prevent racial or gender bias.