
This article was published on August 10, 2021

Using AI to analyze prison phone calls could amplify racial biases in policing

Plans to expand use of the tech have provoked alarm


Image credit: RODNAE Productions from Pexels

Prisoners across the US could soon be subjected to a high-risk application of automated surveillance.

A congressional committee is pressing the Department of Justice to explore the federal use of AI to analyze inmates’ phone calls.

The panel has called for further research into the tech’s potential to prevent suicide and violent crime, Reuters reports.

The system transcribes phone conversations, analyzes the tone of voice, and detects certain words or phrases that are pre-programmed by officials.
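To make the flagging step concrete, here is a minimal sketch in Python of how keyword detection on a call transcript could work. It assumes a hypothetical list of watched phrases and a transcript already produced by a speech-to-text system; it is illustrative only and not the vendor’s actual pipeline.

```python
# Illustrative sketch of keyword flagging on a call transcript.
# WATCHED_PHRASES and flag_transcript are hypothetical names, not the real system.
import re

WATCHED_PHRASES = ["cold case", "weapon"]  # hypothetical phrases set by officials

def flag_transcript(transcript: str, phrases=WATCHED_PHRASES):
    """Return the watched phrases that appear in a call transcript."""
    hits = []
    for phrase in phrases:
        # Word-boundary match so "weapon" doesn't match inside another word.
        if re.search(r"\b" + re.escape(phrase) + r"\b", transcript, re.IGNORECASE):
            hits.append(phrase)
    return hits

if __name__ == "__main__":
    example_call = "They never solved that cold case from 2009."
    print(flag_transcript(example_call))  # ['cold case']
```

Note that any such matching depends entirely on the accuracy of the upstream transcription, which is where the bias concerns discussed below come in.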


Bill Partridge, a police chief in Alabama, where the system is already used, said officers have solved cold case homicides after the AI flagged prisoners discussing the crimes.

Proponents argue that the tech can protect inmates and help police investigations, but critics have expressed alarm about the plans.

They fear that using AI to interpret conversations will lead to life-changing mistakes, misunderstandings, and racial biases.

AI biases

Speech recognition software is notorious for misinterpreting Black speakers.

A 2020 Stanford University study found that speech-to-text systems used by Amazon, Google, Apple, IBM, and Microsoft had error rates that were almost twice as high for Black people as they were for white ones.

Such errors could reinforce existing racial disparities in the criminal justice system. Research shows that Black men in the US are six times as likely to be incarcerated as white men.

Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project, told Reuters that the tech could “automate racial profiling” and violate privacy rights:

This Congress should be outlawing racist policing tech — it shouldn’t be funding it. People who have been caught up in the criminal justice system are always turned into the subjects of experimentation for new technology systems.

Prisons aren’t exactly designed to protect the privacy of inmates, but the AI system is a worrying addition to the surveillance regime.


