About a decade ago, a growing recognition that the United States has been locking up far too many people — and many for far too long — led to a shift in thinking, and to new laws. For example, in 2010, Congress passed the Fair Sentencing Act, intended to narrow the dramatic gap in sentencing between crack cocaine and powder cocaine — two forms of the same drug — the latter popular among white people and the former among black people.
At about the same time, the chair of the Pennsylvania state Senate Judiciary Committee, Stewart Greenleaf, drafted legislation instructing the state’s Commission on Sentencing to develop a tool that would help judges identify low-risk offenders who could then be diverted away from incarceration.
On Thursday, nine years later, the commission is set to vote on the risk assessment tool it developed. The algorithm-based tool scores risk based on age, gender, and previous criminal justice involvement. But many experts and activists are raising a red flag: the tool is racist.
They are right.
The idea behind algorithms is that they can remove human bias. While a judge might be more punitive toward a black defendant, for example, an algorithm doesn’t see race. But that assumption is deeply flawed.
The reason that risk assessment algorithms can be so problematic is simple: garbage in, garbage out.
Algorithms are trained on data from the real world. If the data is based on outcomes from our broken criminal justice system, the algorithm will simply entrench historical disparities under the guise of unbiased math. Including criminal history in the tool might seem reasonable, but it ignores the fact that racial biases, not necessarily behavior, often determine whether someone gets a criminal record.
Researchers at Carnegie Mellon University tested the tool and found that its accuracy in predicting high risk was 52% — about a coin toss.
As a secondary step, when the score indicates very high or very low risk, the commission proposes that judges use whatever tool the county’s probation services use to assess risk. In Philadelphia, that tool has long been called a “black box” for its opacity.
Greenleaf, who retired last year, wanted a tool that would help reduce the number of people incarcerated. Instead, the proposed tool could increase it. In a letter to the commission, Greenleaf expressed his concerns and asked the commission to reject the tool. The American Civil Liberties Union of Pennsylvania, experts, state lawmakers, and activists joined Greenleaf’s call. State Rep. Todd Stephens, vice chairman of the commission, says that input from multiple stakeholders was taken into account and that, after close to a decade, it is time to implement the tool.
At a time when the criminal justice system is rightfully being pushed to be more nuanced and to recognize that each defendant comes to a courtroom with a unique story, the risk assessment tool that the commission proposes takes us in the opposite direction. The commission should pull the tool and continue working until it gets it right — and not risk further entrenching racism in our criminal justice system.