Inquirer editorial: Numbers may lie when setting bail

The House of Correction in Northeast Philadelphia.

About 60 percent of Philadelphia's prison inmates are awaiting trial, but in trying to reduce that population, officials should be careful not to put too much emphasis on an algorithm designed to help judges determine which defendants should be granted bail.

The algorithm, based on the past behavior of previous inmates with similar characteristics, supposedly can calculate the likelihood that a person will commit a crime if he is released before trial. The judge can then use that calculation in deciding whether to set bail.
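For readers curious about the mechanics, a risk tool of this kind typically boils down to a statistical score: historical records of released defendants are used to fit a model, and a new defendant's attributes are run through it to produce a probability. The sketch below is purely illustrative, assuming a simple logistic-regression-style score; the feature names and weights are invented for this example and are not the tool Philadelphia is considering.

```python
import math

# Purely illustrative: hypothetical weights such a tool might learn
# from the outcomes of previously released defendants. These numbers
# and feature names are invented for this sketch.
WEIGHTS = {
    "prior_arrests": 0.45,
    "prior_failures_to_appear": 0.60,
    "age_at_first_arrest": -0.03,
}
INTERCEPT = -2.0

def pretrial_risk_score(defendant: dict) -> float:
    """Return an estimated probability (0 to 1) of pretrial rearrest,
    computed as a logistic function of the weighted attributes."""
    z = INTERCEPT + sum(WEIGHTS[k] * defendant.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Example: two prior arrests, no failures to appear, first arrested at 24.
print(pretrial_risk_score({
    "prior_arrests": 2,
    "prior_failures_to_appear": 0,
    "age_at_first_arrest": 24,
}))
```

The judge sees only the resulting score, so everything hinges on which attributes go into the weighted sum and how the weights were fit, which is precisely the ground the tool's critics contest.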

Research suggests algorithms can be more accurate than judges in predicting behavior. Experts say the tool helps avoid unnecessarily harsh punishments for low- and medium-risk offenders. But lawyers and others have challenged the moral and legal validity of such algorithms.

One glaring problem with some of these tools is their reliance on variables more strongly correlated with a person's race or income than with their criminal background, such as a defendant's zip code, education level, and leisure activities. Defendants who took IQ and reading tests during previous prison stays might see those results used against them.

Some states use algorithms more effectively post-trial. The Pennsylvania prison system uses similar assessment tools when an inmate first arrives to establish custody levels and recommend housing, work detail, treatment, and program assignments, and later to help determine the conditions of release.

Philadelphia's Adult Probation and Parole Department uses an assessment tool developed by University of Pennsylvania criminologist Richard Berk, who is leading the effort to develop an algorithm for pretrial decisions. Asked about the possibility that an algorithm could lead to discriminatory treatment of black and poor defendants, Berk said, "People who stress that point forget about the victims. How many deaths are you willing to trade for something that's race-neutral?"

Other criminologists largely agree with Berk, saying their job is only to create the most accurate predictions of criminal behavior. They also argue that algorithms are more transparent and consistent than judges, who, as humans, have biases.

Perhaps, but using an algorithm to determine a defendant's status prior to trial may be more detrimental than using a risk-assessment tool after a conviction. Studies show people detained before trial are more likely to be convicted and more likely to receive longer sentences.

Where you live and how much you make aren't always predictive of criminal behavior. There are plenty of defendants from less-than-perfect neighborhoods who live up to the trust put in them when given the chance. Good judges, most of the time, can tell who deserves that chance, with or without an algorithm to help them.