The UK police’s porn-hunting AI can’t yet tell the difference between deserts and nudes

London’s Metropolitan Police is tasking AI systems with the unenviable job of hunting the internet for images of child pornography – something it believes the technology will be capable of doing adeptly in “two to three years”.


In the short term, however, the police’s image-recognition tools are finding it difficult to tell the difference between a naked body and a photo of a desert. While the police are already using machine-learning techniques to scan seized devices for images of guns, drugs or money, the toolset struggles when it comes to nudity.

“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” the Met’s head of digital and electronics forensics, Mark Stokes, told The Telegraph. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour.”
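To see why sand and skin are so easy to conflate, consider a toy colour-based heuristic. This is not the Met’s system, and the thresholds below are purely illustrative, but they show how a classifier leaning on colour statistics can score a sun-lit dune much like bare skin.

```python
# Toy illustration only (not the Met's classifier): a naive "skin pixel"
# heuristic shows how sand-coloured desert photos overlap with skin tones.
import numpy as np
from PIL import Image  # assumes Pillow is installed


def skin_pixel_fraction(path: str) -> float:
    """Return the fraction of pixels falling in a crude 'skin tone' colour box.

    The RGB thresholds are a rough rule of thumb, not a real detector;
    they simply pick out warm, red-dominant pixels.
    """
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)
    return float(mask.mean())


# A desert screensaver can easily score as "skin-heavy" as a photo of a person,
# which is exactly the kind of false positive Stokes describes.
# print(skin_pixel_fraction("desert_screensaver.jpg"))  # hypothetical file
```

Modern systems learn features rather than hand-coded colour rules, but the underlying point stands: without enough varied training data, the statistical cues that separate skin from sand are thin.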

Once the image-recognition system is refined, the idea is that it will allow human officers to offload the burden of identifying disturbing pictures onto machines. The Telegraph reports that the force is drawing up plans to move sensitive data to “Silicon Valley providers”, allowing it to leverage the large computing power of tech firms and help train its AI to scan for images of child abuse.

That extra computing power would come with added security risks. Shifting from a locally based data centre to a hub owned by Amazon, Google or Microsoft would place responsibility on those companies, which attract a great deal of attention from hackers, to protect the sensitive data they host. There are added legal issues, too: the providers would need consent from the courts to store criminal images.

There’s also the issue that even the machine-learning systems of big tech companies aren’t immune to mistakes, from Twitter’s “technical issues” around the blocking of searches related to the term “bisexual”, to gender and racial biases embedded in the training sets for neural networks. As Alphr reported in our look at the subject, western norms can often skew the cultural perspectives built into emerging technologies.

In the case of the mislabelled sand dunes, this looks to be a basic image-recognition error. Things are likely to become far more complicated once a machine has to weigh nuanced questions of context and intent.

The Met’s AI isn’t the first to look at nature and see nudity. Last year, a neural network project created its own “pornographic” images, but based on scenes of deserts, beaches and volcanoes.
