(ORDO NEWS) — An artificial intelligence that studies crime data can predict where crimes will occur in a city in the coming week with up to 90 per cent accuracy, but there are concerns that systems like this could help perpetuate bias.
Similar systems have been shown to perpetuate racist bias in policing, and the same could be true here, but the researchers behind this AI say it can also be used to expose those biases.
Ishanu Chattopadhyay of the University of Chicago and colleagues created an AI model that analyzed historical crime data in Chicago, Illinois from 2014 to the end of 2016 and then predicted the crime rate for the weeks following that training period.
The model predicted the probability of certain crimes occurring throughout the city, which was divided into squares about 300 meters across, for a week ahead with 90 percent accuracy.
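The study itself does not publish its code in this article, but the grid set-up it describes is easy to picture: the city is tiled into cells roughly 300 metres across, and crime events are aggregated per cell per week before any forecasting is done. The following is an illustrative sketch of that discretisation step only, using made-up coordinates on a local metre-based projection, not the researchers' actual pipeline.

```python
from collections import defaultdict

CELL_METERS = 300  # grid resolution described in the study


def cell_for(x_m, y_m, size=CELL_METERS):
    """Map a point (in metres on a local projection) to a grid-cell index."""
    return (int(x_m // size), int(y_m // size))


def weekly_counts(events, size=CELL_METERS):
    """Aggregate (x_m, y_m, week) crime events into per-cell weekly counts,
    the kind of event series a forecasting model would be trained on."""
    counts = defaultdict(int)
    for x_m, y_m, week in events:
        counts[(cell_for(x_m, y_m, size), week)] += 1
    return dict(counts)


# Hypothetical events: two fall in the same 300 m cell in week 1
events = [(120.0, 450.0, 1), (250.0, 500.0, 1), (900.0, 100.0, 2)]
print(weekly_counts(events))  # {((0, 1), 1): 2, ((3, 0), 2): 1}
```

A real implementation would then fit a time-series model to each cell's weekly counts; this sketch only shows how a continuous map becomes the discrete grid the article describes.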
The model was also trained and tested on data from seven other major US cities and showed a similar level of performance.
Previous attempts to use AI to predict crime have been controversial because they can perpetuate racial prejudice.
In recent years, the Chicago Police Department has piloted an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or perpetrator.
The details of the algorithm and the list were initially kept secret, but when the list was finally made public, it turned out that 56 percent of the city’s black men aged 20 to 29 were on it.
Chattopadhyay acknowledges that the data used by his model will also be biased, but says that efforts have been made to reduce the impact of bias and the AI does not identify suspects, only potential crime scenes. “This is not a Minority Report,” he says.
“Law enforcement resources are not unlimited. So you want to make the best use of them. It would be great if you could know where the killings are going to happen,” he says.
Chattopadhyay says AI predictions could be more confidently used to inform high-level policy rather than directly allocate police resources. He published the data and the algorithm used in the study so that other researchers could study the results.
The researchers also used the data to look for areas where human bias is affecting police performance. They analyzed the number of arrests after crimes in Chicago neighborhoods with different socioeconomic levels.
This showed that crime in wealthier areas resulted in more arrests than in poorer areas, suggesting a biased police response.
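The comparison behind that finding reduces to a simple ratio: the fraction of recorded crimes in an area that led to an arrest. The numbers below are invented purely to illustrate the shape of the analysis; they are not the study's data.

```python
def arrest_rate(crimes, arrests):
    """Fraction of recorded crimes in an area that resulted in an arrest."""
    return arrests / crimes if crimes else 0.0


# Hypothetical counts for two neighbourhood groups
wealthy = arrest_rate(crimes=200, arrests=60)  # 0.30
poorer = arrest_rate(crimes=400, arrests=80)   # 0.20

# A persistent gap like this, across comparable crime types, is the kind of
# signal the researchers read as evidence of a biased police response.
print(wealthy > poorer)  # True
```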
Lawrence Sherman of the Cambridge Centre for Evidence-Based Policing, UK, says he is concerned that the model includes data from both reactive and proactive policing: crimes that are recorded because people report them, and crimes that are recorded because police go out looking for them.
The latter data type is highly susceptible to bias, he says. “They may reflect intentional discrimination by the police in certain areas,” he says.