ProPublica’s Misleading Machine Bias
A May 2016 report by ProPublica revealed a stark finding: a statistical tool widely used by criminal justice professionals to predict recidivism and inform sentencing decisions was, the report claimed, racially biased. The program, COMPAS, supposedly skewed its predictions to assume that African-Americans were more likely to commit another crime than Caucasians. Not just slightly, but substantially so. ProPublica's analysis accused the software of labeling black defendants as future criminals at twice the rate of white defendants, while white defendants were more often labeled low risk than black defendants.