AI experts say research into algorithms that claim to predict criminality must end

  • Thread starter: The Verge RSS

Author: James Vincent

Illustration by Alex Castro / The Verge
A coalition of AI researchers, data scientists, and sociologists has called on the academic world to stop publishing studies that claim to predict an individual’s criminality using algorithms trained on data like facial scans and criminal statistics.

Such work is not only scientifically illiterate, says the Coalition for Critical Technology; it also perpetuates a cycle of prejudice against Black people and people of color. Numerous studies show that the justice system treats these groups more harshly than white people, so any software trained on this data simply amplifies and entrenches societal bias and racism.

Algorithms trained on racist data produce racist results

“Let’s be clear: there is no way to develop a system that can predict or...

Continue reading…