Durham Constabulary has developed an artificial intelligence (AI) computer system that takes just 11 seconds to tell police officers what should be done with a suspect.
The Harm Assessment Risk Tool (HART) automatically categorises suspects as low, medium or high risk.
Durham Constabulary will start using the system during the next few months and – if it proves successful – it could be taken up by police forces across the country.
According to HART's definitions, a high-risk suspect is likely to commit a serious offence within the next two years, a medium-risk suspect is likely to commit a non-serious offence in that period, and a low-risk suspect is unlikely to commit any offence at all.
The Harm Assessment Risk Tool – developed by Durham Constabulary and Cambridge University – crunches data about the suspect to arrive at its judgement in 11 seconds.
There are 34 different categories of data that the HART system processes, including the suspect's postcode, age, gender and the number of police intelligence reports held about them.
But, most importantly, the system reviews the suspect's criminal history: HART has been trained on five years' worth of Durham Constabulary records.
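The article does not disclose how HART weighs its 34 inputs, so the sketch below is purely illustrative: a hand-rolled three-band classifier in which every field name, weight and threshold is invented, with criminal history deliberately dominating the score as the article suggests.

```python
# Hypothetical sketch of a three-band risk classifier of the kind the
# article describes. The real HART model, its predictors and thresholds
# are not public here; every field name, weight and cut-off below is
# invented for illustration only.

def risk_band(suspect: dict) -> str:
    """Map a suspect's record to 'low', 'medium' or 'high' risk."""
    score = 0
    # Criminal history dominates, as the article notes.
    score += 3 * suspect.get("prior_serious_offences", 0)
    score += 1 * suspect.get("prior_minor_offences", 0)
    score += 1 * suspect.get("intelligence_reports", 0)
    # Many actuarial tools weight youth slightly upwards.
    if suspect.get("age", 99) < 21:
        score += 2
    if score >= 8:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_band({"prior_serious_offences": 2,
                 "prior_minor_offences": 1,
                 "age": 19}))  # prints "high"
```

A real system would learn these weights from historical records rather than fix them by hand, which is why the training data mentioned above matters so much to the verdicts the tool produces.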
The verdicts the HART system arrives at could help police officers make decisions such as whether to detain a suspect for further questioning or release them, and whether to remand charged suspects in custody or grant them bail.
The system could also help decide whether low-risk offenders qualify for the Checkpoint programme, an alternative to prosecution that tries to address the reasons a person commits crime, such as homelessness and unemployment.
Durham Constabulary stresses that the HART system won’t be the only resource police officers use when making their decisions. The AI system’s judgements will be just one of a number of factors that will help officers decide what to do with suspects.
The HART system was trialled by Durham Constabulary from 2013 to 2015. The results showed the system was accurate with regard to low-risk offenders: only 2% of those categorised as low risk went on to reoffend.
More worryingly, however, 12% of suspects categorised as high risk actually turned out to be low risk.
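The two trial figures are per-band error rates: the share of each predicted band whose actual outcome differed from the prediction. The group sizes below are invented; only the definitions of the rates mirror the article.

```python
# Illustrative calculation of the two error rates the trial reported.
# The counts are made up; only the rate definitions (share of a
# predicted band whose real-world outcome contradicted it) are real.

predicted_low_total = 1000
predicted_low_reoffended = 20        # would yield the 2% figure
predicted_high_total = 500
predicted_high_actually_low = 60     # would yield the 12% figure

low_error = predicted_low_reoffended / predicted_low_total
high_error = predicted_high_actually_low / predicted_high_total

print(f"{low_error:.0%} of low-risk suspects reoffended")
print(f"{high_error:.0%} of high-risk suspects were actually low risk")
```

The asymmetry matters: mislabelling a low-risk person as high risk (the 12% case) can mean unnecessary detention, which is why the second figure is the more worrying of the two.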
Artificial intelligence – in the form of robots and algorithms – is becoming increasingly common in a large number of workplaces, leading north-east MPs – among others – to express concerns about robots replacing human employees.
(Featured image courtesy of Graham Mitchell, from Flickr Creative Commons)