Friday, February 16, 2018

Computers no better than humans at predicting who should go to jail.

I want to point to two pieces that offer commentary on our trust in algorithms to predict human behavior rather than old-fashioned human judgment. In the U.S., computers help decide who goes to jail, on the basis of predicted recidivism. Matacic discusses studies showing that their judgment may be no better than ours. She points, for example, to the work of Dressel and Farid. Their abstract:
Algorithms for predicting recidivism are commonly used to assess a criminal defendant’s likelihood of committing a crime. These predictions are used in pretrial, parole, and sentencing decisions. Proponents of these systems argue that big data and advanced machine learning make these analyses more accurate and less biased than humans. We show, however, that the widely used commercial risk assessment software COMPAS is no more accurate or fair than predictions made by people with little or no criminal justice expertise. We further show that a simple linear predictor provided with only two features is nearly equivalent to COMPAS with its 137 features.
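
The striking part of that last sentence is how little the simple model needs. As a rough illustration of what such a two-feature linear predictor looks like, here is a minimal sketch: a logistic regression on a defendant's age and number of prior convictions, which is what Dressel and Farid reportedly used. The dataset file and column names below are assumptions based on the public ProPublica COMPAS release, not the authors' actual code.

```python
# Minimal sketch of a two-feature linear recidivism predictor.
# Assumes the public ProPublica COMPAS file; column names are assumptions
# drawn from that release, not from Dressel and Farid's own pipeline.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("compas-scores-two-years.csv")  # assumed local copy

X = df[["age", "priors_count"]]   # the two features: age, prior convictions
y = df["two_year_recid"]          # 1 = re-offended within two years

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The point of the paper is not this particular sketch but the comparison: a classifier this simple lands in roughly the same accuracy range as COMPAS with its 137 features, which is what raises the question of whether the commercial tool adds anything beyond what untrained humans, or a back-of-the-envelope model, already provide.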
