UK Police Train AI System To Identify 'Future Criminals'
Anil
Instead of trying to put bad guys behind bars, police want to offer them supportive interventions.
Police departments in the UK are piloting an AI-powered system that analyses data on suspects to identify likely future offenders. If the pilot is successful, the UK government plans to roll the system out across the nation.
The system reportedly uses a machine-learning algorithm to predict “high harm” crimes, initially drawing its insights from a database of around 200,000 criminal records.
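The article does not describe the model’s internals, so the sketch below is only a rough illustration of how a generic risk-scoring classifier of this kind is typically built: train on historical records with a binary “high harm” label, then treat the predicted probability as a risk score. All column names, thresholds, and data here are hypothetical and are not taken from the actual police system.

```python
# Illustrative sketch only: a generic risk-scoring classifier, NOT the
# actual West Midlands Police / NDAS model. All features and data are
# hypothetical and synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical historical records: one row per person, with a binary
# label indicating whether a "high harm" offence followed.
rng = np.random.default_rng(0)
n = 5000
records = pd.DataFrame({
    "prior_offences": rng.poisson(2, n),
    "age_at_first_contact": rng.integers(12, 60, n),
    "months_since_last_contact": rng.integers(0, 120, n),
})
# Synthetic label purely so the example runs end to end.
logits = 0.4 * records["prior_offences"] - 0.02 * records["months_since_last_contact"]
records["high_harm"] = (rng.random(n) < 1 / (1 + np.exp(1 - logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    records.drop(columns="high_harm"), records["high_harm"],
    test_size=0.2, random_state=0)

# Train a classifier and use its predicted probability as a "risk score".
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
risk_scores = model.predict_proba(X_test)[:, 1]  # probability of the positive class
print("AUC:", round(roc_auc_score(y_test, risk_scores), 3))

# People above a chosen threshold could be flagged for supportive
# interventions rather than enforcement, as the article describes.
flagged = X_test[risk_scores > 0.7]
print(f"{len(flagged)} of {len(X_test)} test records exceed the 0.7 threshold")
```

Whatever the real system looks like, the ethical questions raised later in this article apply to exactly this kind of pipeline: the score is only as fair as the historical data it is trained on.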
Police forces have also used risk scoring operationally to estimate the probability that individuals will re-offend. At a meeting of the Crime Commissioner’s Ethics Committee and West Midlands Police last year, the new system was said to address these challenges “in a far more rigorous and reliable way.”
As designed, the model would help the government offer supportive interventions to anyone who needs them, reducing the likelihood of crimes that might otherwise happen in the future. These interventions include alcohol-misuse treatment, mental health support, and job training.
As for the budget consumed by the AI system, West Midlands Police said more than $12 had been spent on its development. Once testing is complete, the system is expected to be used across the country.
However, whether such a system could be misused remains an open question. A recent report from the Centre for Data Ethics reportedly raises major legal and ethical concerns about rolling the model out at scale. As police ramp up their use of data, strict guidance on the use of such algorithms is urgently needed.