March 6, 2018

Artificial Intelligence Is Now Used to Predict Crime. But Is It Biased?

What is fair? It seems a simple question, but it’s one without simple answers. That’s particularly true in the arcane world of artificial intelligence (AI), where the notion of smart, emotionless machines making decisions wonderfully free of bias is fading fast.

Perhaps the most public taint of that perception came with a 2016 ProPublica investigation, which concluded that the data driving an AI system used by judges to determine whether a convicted criminal is likely to commit more crimes appeared to be biased against minorities. Northpointe, the company that created the algorithm, known as COMPAS, disputed ProPublica’s interpretation of the results, but the clash has sparked both debate and analysis about how much even the smartest machines should be trusted.