377,000 people’s data used to predict child abuse

Data on hundreds of thousands of UK residents is being amassed and used in a scheme to predict child abuse. "Predictive analytics" models are being built using at least 377,000 people's data, in a bid to crack down on abuse before it occurs.

The scheme, uncovered by the Guardian, aims to predict when child abuse will take place so that authorities can intervene before it happens. The algorithmic profiling is being billed as a means of assisting social workers.

Despite good intentions, the data-sharing endeavour is likely to stir up more than a little controversy, given its capacity to violate individual privacy. What's more, critics have highlighted the scheme's potential to further entrench institutionalised racism: "There is also the risk of accidentally incorporating and perpetuating discrimination against minorities," write Niamh McIntyre and David Pegg.

The scheme is still nascent, although it has been adopted by at least five local authorities, the Guardian reports. Each has developed or implemented a data-fuelled predictive analytics system to protect youngsters from abuse. The total number of people whose data has been used comes to at least 377,000.

As for the kind of data sought, it's both eclectic and expansive. Councils have obtained information on everything from police records on antisocial behaviour and domestic violence, to housing association repairs and arrears, to school attendance and exclusion. Some datasets, however, were excluded from the final algorithmic profiling models.
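The Guardian's reporting doesn't describe how the councils' models work under the hood, but as a rough illustration of the general technique, here is a minimal, hypothetical sketch in Python: disparate records are joined on a shared identifier and fed into a simple classifier that outputs a risk score. Every dataset, field name and value below is invented for illustration; real deployments would involve far larger datasets and, as noted above, carry serious privacy and bias risks.

```python
# Hypothetical sketch of predictive risk scoring from joined council datasets.
# Not the councils' actual system; all names and values are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for the kinds of records the Guardian lists.
police = pd.DataFrame({"household_id": [1, 2, 3, 4],
                       "antisocial_reports": [0, 3, 1, 5],
                       "domestic_violence_calls": [0, 1, 0, 2]})
housing = pd.DataFrame({"household_id": [1, 2, 3, 4],
                        "rent_arrears_gbp": [0, 450, 120, 900],
                        "outstanding_repairs": [1, 4, 0, 6]})
schools = pd.DataFrame({"household_id": [1, 2, 3, 4],
                        "attendance_pct": [97, 81, 93, 72],
                        "exclusions": [0, 2, 0, 3]})
# Known past outcomes used for training (1 = a prior safeguarding referral).
labels = pd.Series([0, 1, 0, 1], name="referral")

# Join the datasets on a shared identifier and fit a simple classifier.
features = (police.merge(housing, on="household_id")
                  .merge(schools, on="household_id"))
X = features.drop(columns="household_id")
model = LogisticRegression().fit(X, labels)

# The model's output probability is the kind of "risk flag" a social
# worker might be shown for each household.
features["risk_score"] = model.predict_proba(X)[:, 1]
print(features[["household_id", "risk_score"]])
```

Note that in a sketch like this, any bias in the historical referral labels is learned and reproduced by the model, which is precisely the discrimination risk the critics quoted above are warning about.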

When it comes to the legality of the endeavour, it is being overseen by the Information Commissioner's Office (ICO), which regulates the use of individuals' personal data by third parties, both public and private. Speaking to the Guardian, an ICO spokesperson said the organisation would perform the requisite checks to ensure local councils complied with data protection law while running their predictive analytics schemes.

Meanwhile, councils already using algorithmic profiling are reporting results: Hackney council recently revealed that its system had flagged 350 families potentially at risk and in need of support, while Thurrock council reported around 300 similar cases.

As the scheme grows, it's sure to generate controversy along the way. And while it's not without sizeable caveats, in an age when data is routinely, even chronically, used to serve commercial interests in phone cases, synthetic clothing and discount vouchers, using data to identify children at risk of abuse and act accordingly perhaps doesn't sound quite so monstrous after all.