A UK police force has handed unsolved cases over to artificial intelligence. The system, called Soze and developed in Australia, can reportedly analyze decades of investigative material in a matter of hours.
It is not yet known how accurate Soze is, which is a serious concern: artificial intelligence models are known to produce incorrect and fabricated answers.
Soze is being tested at Avon and Somerset Police, which covers parts of South West England. The system scans and analyses emails, social media accounts, videos, financial statements and other documents.
The AI was reportedly able to scan evidence in 27 “complex” cases in about 30 hours, a task that would have taken a human 81 years to complete.
Another AI project under way in the United Kingdom is the creation of a database of knives and swords, weapons that many suspects have used to commit crimes against the country's residents.
Artificial intelligence is notorious for being prone to errors and false positives, especially when applied to law enforcement.
One model used to predict the likelihood that a suspect will reoffend has been found to be inaccurate and biased against Black people.
AI-based facial recognition can likewise lead to the arrest of people who have committed no crime.
The US Commission on Civil Rights recently criticized the use of artificial intelligence in policing.
There is a belief that if machines do the analysis, the results will be objective and accurate. But these systems are built on data collected by humans, who make mistakes and carry biases, so the models inherit those same flaws from the ground up.