UK Police turn to AI to predict crimes before they happen

Nine police forces across the UK are developing an AI system to identify potential criminals and victims of crime before any crime has actually been committed. Using archival statistics and machine learning, UK police forces are seemingly building something straight out of Minority Report.

The project, revealed in a New Scientist report, is led by West Midlands police, with eight other forces involved in its development. Its aim is to help police forces around the country cope with funding cuts by automating many of their processes.

Researchers used data from past criminal proceedings to identify around 1,400 potential indicators of crime, 30 of which were deemed particularly significant. They then trained the National Data Analytics Solution (NDAS) system on these indicators so it could flag people at risk of committing, or falling victim to, a crime. It’s essentially an AI-powered “precog” straight from Philip K. Dick’s The Minority Report.
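The report doesn’t detail how NDAS is actually built, but the approach it describes, whittling roughly 1,400 historical indicators down to the most predictive few and training a model to flag at-risk individuals, maps onto a fairly standard supervised-learning pipeline. Here is a minimal sketch of that shape, assuming synthetic stand-in data and scikit-learn as the toolkit; none of this is actual NDAS code:

```python
# Hypothetical sketch of an indicator-based risk-flagging pipeline.
# NDAS's real implementation is not public; the data, feature counts,
# and choice of scikit-learn here are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)

# Stand-in for historical records: 10,000 people, 1,400 candidate
# indicators (the number cited in the report), and a binary label
# marking whether a crime-related event was later recorded.
X = rng.normal(size=(10_000, 1_400))
y = rng.integers(0, 2, size=10_000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Narrow the 1,400 candidate indicators down to the 30 most
# informative, mirroring the "30 particularly significant" indicators,
# then fit a classifier on those alone.
model = Pipeline([
    ("select", SelectKBest(f_classif, k=30)),
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])
model.fit(X_train, y_train)

# Flag individuals whose predicted risk exceeds a chosen threshold.
risk = model.predict_proba(X_test)[:, 1]
flagged = np.flatnonzero(risk > 0.7)
print(f"{len(flagged)} of {len(X_test)} records flagged for review")
```

With random stand-in labels the model can’t learn anything real; the point is only the pipeline shape: select the strongest indicators, fit a classifier, and flag records above a risk threshold.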

Those flagged by the system won’t be arrested, so don’t expect a UK Precrime division just yet; instead, they will be offered counselling or support, depending on the circumstances. Their details could also be passed on to social services or other bodies. If successful, the system would prevent crimes from occurring in the first place by helping potential offenders, and potential victims, find alternative options.

West Midlands police has until March 2019 to develop a working prototype of the system, although it hopes to have it in use by the start of that year.

News of the NDAS comes as Kent police revealed it has scrapped a similar project, which cost the taxpayer around £100,000 per year. The software was used to predict crime in Kent so that officers could intervene; however, it only flagged crimes already occurring and therefore had no preventative effect on crime rates. The same could be said of London’s Met police, which earlier this year scrapped an AI facial-recognition project because it had no impact on crime rates in the capital.

As can be expected of any AI that purports to accurately predict human behaviour, many people have reservations about the system. While West Midlands police is working with the Information Commissioner’s Office (ICO) to ensure the system meets privacy regulations, there are still several potential pitfalls.

For example, the use of historical data means any biases or prejudices embedded in past policing will be carried forward, which could create an AI that unfairly targets certain ethnic groups. Although the system seems inspired by Minority Report, at least that fictional system wasn’t used to disproportionately report minorities.
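To make that concern concrete, here is a toy demonstration (entirely synthetic, and again not NDAS code) of how skew in historical records propagates into a model: two groups with identical underlying behaviour, one of which was recorded at twice the rate, produce predictions that assign that group roughly double the risk.

```python
# Toy demonstration of historical bias carrying forward into a model.
# All data is fabricated; group labels and rates are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000
group = rng.integers(0, 2, size=n)   # 0 = group A, 1 = group B
behaviour = rng.normal(size=n)       # identical across both groups

# Historical "offence" labels: same true behaviour, but group B was
# recorded at twice the rate, a stand-in for biased past policing.
base = 1.0 / (1.0 + np.exp(-behaviour))
recorded = rng.random(n) < base * np.where(group == 1, 0.4, 0.2)

X = np.column_stack([behaviour, group])
model = LogisticRegression().fit(X, recorded)

# The model assigns group B roughly double the predicted risk purely
# because the historical records were skewed.
risk = model.predict_proba(X)[:, 1]
for g in (0, 1):
    print(f"group {g}: mean predicted risk {risk[group == g].mean():.3f}")
```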
