Author: James Vincent
Illustration by Alex Castro / The Verge
A coalition of AI researchers, data scientists, and sociologists has called on the academic world to stop publishing studies that claim to predict an individual’s criminality using algorithms trained on data like facial scans and criminal statistics.
Such work is not only scientifically illiterate, says the Coalition for Critical Technology, but perpetuates a cycle of prejudice against Black people and people of color. Numerous studies show that the justice system treats these groups more harshly than white people, so any software trained on this data simply amplifies and entrenches societal bias and racism.
Algorithms trained on racist data produce racist results
“Let’s be clear: there is no way to develop a system that can predict or...