Can Artificial Intelligence Predict Crime?

Hello Data Lovers 👋

In this article, we talk about artificial intelligence and crime prediction.

Are you ready? Let's go! 🚀


Minority Report Artificial Intelligence System

An artificial intelligence system designed to predict gun and knife violence in the UK before it happens was so seriously flawed that it was unusable, police have admitted. The error led to large drops in accuracy, and the system was ultimately rejected on ethical grounds by all of the experts reviewing it.

The prediction system, known as Most Serious Violence (MSV), is part of the UK’s National Data Analytics Solution (NDAS) project.

The Home Office has funded NDAS with at least £10 million ($13 million) over the past two years, with the aim of creating machine learning systems that can be used across England and Wales.

As a result of the failure of MSV, police have stopped developing the prediction system in its current form. It has never been used for policing operations and never reached a stage where it could be. Questions have also been raised about the violence tool’s potential to be biased against minority groups and whether it would ever be useful for policing.

The MSV tool was designed to predict whether people would commit their first violent offense with a gun or knife within the next two years. People who had already come into contact with the two police forces involved in developing the tool, West Midlands Police and West Yorkshire Police, were given risk scores. The higher the score, the more likely the person was judged to be to commit one of these crimes.
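
To make the idea of a risk score concrete, here is a minimal sketch of the general recipe such tools follow: train a classifier on historical records and read its predicted probability as a risk score. Everything below (the synthetic data, the features, the model choice) is my own illustrative assumption, not the actual NDAS pipeline.

```python
# Illustrative only: a generic risk-scoring setup, NOT the NDAS pipeline.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for person-level features (e.g., age, prior contacts).
X = rng.normal(size=(1000, 5))
# Synthetic labels: 1 = committed a serious violent offense within two years.
y = (X[:, 0] + rng.normal(scale=2.0, size=1000) > 2).astype(int)

model = GradientBoostingClassifier().fit(X, y)

# The "risk score" is the predicted probability of the positive class;
# a higher score means the model deems the outcome more likely.
risk_scores = model.predict_proba(X)[:, 1]
print(risk_scores[:5])
```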


Data, Accuracy & Problems

Historical data on 2.4 million people from the West Midlands database and 1.1 million from West Yorkshire was used to develop the system, with data pulled from crime and custody records, intelligence reports, and the Police National Computer database.

But as NDAS was starting to “operationalize” the system earlier this year, problems struck. Documents published by the West Midlands’ Police Ethics Committee, which is responsible for scrutinizing NDAS work as well as the force’s own technical developments, reveal that the system contained a coding “flaw” that made it incapable of accurately predicting violence.

“A coding error was found in the definition of the training data set which has rendered the current problem statement of MSV unviable,” an NDAS briefing published in March says.

A spokesperson for NDAS says the error was a data ingestion problem that was discovered during the development process.
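
NDAS has not said exactly what the bug was beyond calling it a data ingestion problem, so any reconstruction is guesswork. As one hedged illustration of how a training-set definition error can quietly inflate results, the sketch below shows temporal leakage: if records dated after the prediction cut-off slip through at ingestion, the “features” already encode the outcome the model is supposed to predict. All names, dates, and the nature of the bug here are invented.

```python
import pandas as pd

# Toy event log; in a real pipeline these would be crime/custody records.
events = pd.DataFrame({
    "person_id": [1, 1, 2, 2, 3],
    "event_date": pd.to_datetime(
        ["2018-05-01", "2019-03-10", "2018-11-20", "2019-06-02", "2018-07-15"]
    ),
})

CUTOFF = pd.Timestamp("2019-01-01")  # features before, outcomes after

# Correct: only events strictly before the cut-off feed the features.
features_ok = events[events["event_date"] < CUTOFF]

# Buggy: no cut-off applied at ingestion, so post-cutoff events leak in
# and the training set effectively contains the labels being predicted.
features_leaky = events

print(len(features_ok), "rows vs", len(features_leaky), "rows ingested")
```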

Before the error was found, NDAS claimed its system had accuracy, or precision, levels of up to 75 percent. In other words, of 100 people the system flagged as at high risk of committing serious violence with a gun or knife in the West Midlands, 54 were expected to actually carry out one of these crimes; for West Yorkshire, the figure was 74 out of 100. “We now know the actual level of precision is significantly lower,” NDAS said in July.
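
To unpack what “precision” means in these figures: of the people a system flags as high risk, precision is the share who actually go on to offend. A quick Python check against the numbers reported above (the function name is mine):

```python
def precision(true_positives: int, flagged: int) -> float:
    """Share of people flagged as high risk who actually went on
    to commit one of the target offenses."""
    return true_positives / flagged

# Figures NDAS reported before the coding error was found.
print(f"West Midlands:  {precision(54, 100):.0%}")   # 54%
print(f"West Yorkshire: {precision(74, 100):.0%}")   # 74%
```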

“Rare events are much harder to predict than common events,” says Melissa Hamilton, a reader in law and criminal justice at the University of Surrey whose research focuses on police use of risk-prediction tools. Hamilton was not surprised there were accuracy issues. “While we know that risk tools don’t perform the same in different jurisdictions, I’ve never seen that big of a margin of difference, particularly when you talk about the same country,” Hamilton says, adding that the original estimates appeared too high based on other systems she had seen.

As a result of the flaw, NDAS reworked its violence prediction system, and the results showed a significant drop in accuracy. For serious violence with a gun or knife, accuracy dropped to between 14 and 19 percent for West Midlands Police and between nine and 18 percent for West Yorkshire. These rates were similar whether the person had committed serious violence before or whether it would have been their first offense.

NDAS found its reworked system to be most accurate when all of the criteria it had originally defined for the system (first-time offense, weapon type, and weapon use) were removed. In short, the original performance had been overstated. In the best-case scenario, the limited system could be accurate 25 to 38 percent of the time for West Midlands Police and 36 to 51 percent of the time for West Yorkshire Police.

“The core problem with the program goes past any issues of accuracy,” says Nuno Guerreiro de Sousa, a technologist at Privacy International. “Basing our arguments on inaccuracy is problematic because the tech deficiencies are solvable through time. Even if the algorithm was set to be 100 percent accurate, there would still be bias in this system.”

The violence-prediction system identified “more than 20” indicators that were believed to be useful in assessing how risky a person’s future behavior could be. These include age, days since their first crime, connections to other people in the data used, how severe these crimes were, and the maximum number of mentions of “knife” in intelligence reports linked to them; location and ethnicity data were not included. Many of these factors, the presentation says, were weighted to give more prevalence to the newest data.
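
The presentation does not say how that recency weighting was implemented. A common, generic way to give “more prevalence to the newest data” is an exponential decay on record age, sketched below; the half-life parameter is an arbitrary choice of mine, not a documented NDAS value.

```python
import numpy as np

def recency_weight(age_in_days, half_life_days=365.0):
    """Exponential-decay weights: a record half_life_days old counts
    half as much as one from today."""
    return 0.5 ** (np.asarray(age_in_days) / half_life_days)

ages = np.array([0, 180, 365, 730])        # record age in days
print(np.round(recency_weight(ages), 2))   # [1.   0.71 0.5  0.25]
```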

“There are a lot of categories which have been proven in other areas of data analysis in the criminal justice system to lead to unequal outcomes,” says Rashida Richardson, a visiting scholar at Rutgers Law School who has studied data problems in predictive policing. “When you use age, that often skews most predictions or outcomes in a system where you’re more likely to include a cohort of people who are younger as a result of age just being one of the indicators used.” Hamilton agrees. She explains that criminal history factors are often biased themselves, meaning any algorithms that are trained upon them will contain the same issues if a human does not intervene in the development.
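
Richardson’s and Hamilton’s point, that a model can look accurate overall while treating groups unequally, is straightforward to check in code. Below is a minimal sketch of a group-wise false-positive-rate audit on entirely synthetic data; the group labels and flag rates are invented for illustration and bear no relation to the NDAS system.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10_000
group = rng.choice(["A", "B"], size=n)     # synthetic demographic label
y_true = rng.binomial(1, 0.05, size=n)     # rare outcome, ~5% base rate

# A toy model that over-flags group B, mimicking bias inherited from data.
y_pred = rng.binomial(1, np.where(group == "B", 0.20, 0.10))

for g in ("A", "B"):
    mask = (group == g) & (y_true == 0)    # people who did NOT offend
    print(f"group {g}: false-positive rate = {y_pred[mask].mean():.1%}")
```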

Conclusion

The current thinking of NDAS is that the predictive violence tool could be used to “augment” existing decision-making processes used by police officers when investigating people who are likely to commit serious violence. The violence prediction tool is just one that is being worked on by NDAS. It is also using machine learning to detect modern slavery, the movement of firearms, and types of organized crime. Cressida Dick, the head of London’s Metropolitan Police, has previously said police should look to use “augmented intelligence” rather than relying on AI systems entirely.



Thanks for reading! If it was useful to you, please Like/Share so that it reaches others as well.

📧 To get e-mail notifications about my latest posts, please subscribe to my blog by hitting the Subscribe button at the top of the page. 📧

Stay Tuned.