AI to help police catch criminals before they strike

says a press release from the UK Government last week.

This does not fill me with confidence. We’re heading towards the era of an actual Minority Report, it would seem - possibly another example of the torment nexus humanity constantly fails to avoid creating.

Firstly, to make an obvious technical point, someone isn’t a criminal before they commit a crime. They are in fact an innocent person. Nonetheless, preventing crime is obviously better than letting it happen - which is one reason why we already have plenty of laws around “conspiracy to commit” crime, laws which, if you break them, make you a criminal, with a potentially lengthy prison sentence. These are examples of “inchoate” offences:

An inchoate offence is one that is incomplete.

Anyway, what is the actual government plan? Basically it’s to have someone make a crime hotspot map that has to somehow involve using the black box magic of feeding sensitive personal data into “advanced AI”.

Innovators have been tasked with developing a detailed real time and interactive crime map that spans England and Wales and can detect, track and predict where devastating knife crime is likely to occur or spot early warning signs of anti-social behaviour before it spirals out of control

which:

…will be rooted in advanced AI that will examine how to bring together data shared between police, councils and social services, including criminal records, previous incident locations and behavioural patterns of known offenders.
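For what it’s worth, the non-AI core of a “crime hotspot map” is conceptually simple. Here’s a minimal sketch in Python, assuming incident records can be reduced to coordinates - everything here is invented for illustration, since the government hasn’t published anything this concrete:

```python
# Toy grid-based "hotspot" map: bucket incidents into cells and rank the
# densest cells. Real systems use far richer (and far more sensitive)
# data, which is exactly the concern in this post. All data invented.
from collections import Counter

def hotspot_grid(incidents, cell_size=1.0):
    """Count incidents per grid cell; the densest cells are 'hotspots'."""
    counts = Counter()
    for x, y in incidents:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

# Five made-up incident locations: three cluster near the origin.
incidents = [(0.2, 0.3), (0.7, 0.1), (0.4, 0.9), (3.5, 3.6), (3.1, 3.9)]
grid = hotspot_grid(incidents, cell_size=1.0)
top_cell, top_count = grid.most_common(1)[0]
```

None of this needs “advanced AI” - which is rather the point: the AI part of the announcement is the underspecified black box layered on top.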

Why do I feel like it’ll involve dumping a load of personal data into some privately-controlled-by-a-weird-billionaire version of ChatGPT and asking it to make a map?

Outside of the potential use of ChatGPT (which, to be fair, is something I just assumed - it might well have nothing to do with it), it’s not like this type of thing hasn’t been done before. Witness the rise of “predictive policing” using now-less-fashionable forms of AI.

The typical implementation did not go well.

To quote a headline from the MIT Technology Review:

Predictive policing algorithms are racist. They need to be dismantled.

Why? At least partially from the nature of the data they tend to be fed:

Yet increasing evidence suggests that human prejudices have been baked into these tools because the machine-learning models are trained on biased police data. Far from avoiding racism, they may simply be better at hiding it.
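The mechanism the researchers describe is a feedback loop, and it’s easy to see in a toy simulation (all numbers here are invented for illustration): if patrols follow recorded crime, and recording depends on patrol presence, a small initial skew in the records amplifies itself even when the underlying crime rates are identical:

```python
# Toy feedback-loop simulation: two areas with identical true crime
# rates, but skewed historical records. Patrols go where recorded crime
# is highest, and patrolled areas get more crime *recorded*, so the
# skew compounds. All parameters are invented for illustration.

def step(recorded, true_rate, detect_patrolled=0.9, detect_other=0.3):
    """One round: patrol the area with more recorded crime, then record."""
    hot = 0 if recorded[0] >= recorded[1] else 1
    new = list(recorded)
    for area in (0, 1):
        detect = detect_patrolled if area == hot else detect_other
        new[area] += true_rate[area] * detect
    return new

true_rate = [10.0, 10.0]   # both areas have identical underlying crime
recorded = [12.0, 8.0]     # but the historical records are skewed
for _ in range(10):
    recorded = step(recorded, true_rate)

gap = recorded[0] - recorded[1]   # grows from 4 to 64 over ten rounds
```

The model never “sees” the equal true rates - only the records - so it can launder the original skew into apparently objective output. That is what “better at hiding it” means in practice.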

The UK Government’s own “Centre for Data Ethics and Innovation” wrote in 2019 that:

The evidence suggests that there is an absence of consistent guidelines for the use of automation and algorithms, which may be leading to discrimination in police work. … Multiple types of potential bias can occur. These include discrimination on the grounds of protected characteristics; real or apparent skewing of the decision-making process; and outcomes and processes which are systematically less fair to individuals within a particular group.

Now, some of these efforts were from a few years ago. Unquestionably, the nature and abilities of AI tools have radically changed since then. It does seem plausible to me that there is a potentially good use for these types of tools in the field of crime prevention - even if we don’t know what it is yet. Plus, more people have now spent more time thinking about the perils embedded in these systems, and how we might avoid them.

So maybe it’s just a bad headline and this effort will prove to be something that materially helps society. But I feel it’s far more likely to be the government getting on the generative AI hype train with some underspecified request - one that ends up offloading the responsibilities of the state, as well as a big bucketload of public money, to some black-box software under the exclusive control of an agenda-ridden US tech billionaire, software that ends up doing nothing more than further embedding inequality and prejudice into the criminal justice system.

I hope I’m wrong.

Also, in extreme irony:

This announcement is the second challenge to be announced as part of the Programme, building on our Clean Energy challenge aiming to deliver cheaper bills for households across the UK by shifting electricity demand during evenings and weekends by two gigawatts by 2030 – the equivalent of 1.5 million homes.

Feeding tons of data into some unspecified “advanced AI” is unlikely to help any clean energy challenge.