Nearly a decade after receiving the assignment, a state commission is still grappling with a mandate to create a risk assessment algorithm for Pennsylvania judges to use during criminal sentencing.
The algorithm is intended to make the criminal justice system fairer and cut the state prison population by making the sentencing process more objective — smoothing out disparities in punishments ordered by different judges.
But even after the Pennsylvania Commission on Sentencing tweaked the algorithm to appease opponents, the newest version of the proposed tool is getting poor marks from advocates.
During more than two hours of public testimony in Philadelphia on Wednesday — the third of five hearings scheduled across the state in December — some members of the 11-member commission were present to hear wall-to-wall opposition, which included several calls for lawmakers to scrap the legislation that kickstarted the body’s work.
“We need to treat everyone in this system as a person, not a number,” said Lorraine Haw, a member of the Coalition to Abolish Death by Incarceration.
Dozens of activists gathered at a hearing in Pittsburgh the next day. About 30 speakers at that hearing voiced opposition to the proposed algorithm, designed to predict recidivism, which the commission defines as the likelihood someone will be arrested and convicted within three years of being released from prison or completing probation.
That risk — characterized as either low, medium or high — is calculated based on a person’s age, gender, prior convictions and other pieces of criminal history.
In Pennsylvania, where the incarceration rate for black people is nine times the rate for white people, witness after witness told the commission the risk assessment tool would reinforce racial disparities.
One speaker in Pittsburgh, Michael Skirpan, said he studied algorithmic modeling, machine learning, and ethics as a doctoral student. Through his research, he said, he “became very aware that [risk assessment] tools in particular, these analytics, were not going to solve our deepest social issues, and that they would in fact likely reinforce them.”
“We are falling victim to ... believing that data and science and technical tools are going to somehow solve the problem by simply making them more efficient and objective in how we evaluate these problems,” continued Skirpan, who now serves as executive director of Community Forge, a community center in Wilkinsburg.
“I’m coming here as a mother of sons. My 20-year-old has been fitting the description [of a high-risk crime threat] since he was 13 years old,” Carol Speaks, who is black, added. “I don’t want my sons’, my grandsons’, my great-grandsons’ lives in the hands of a computer.”
Allegheny County Common Pleas Judge Jill Rangos serves as vice chair of the sentencing commission and presided over the Pittsburgh hearing. She advised Speaks that judges would use risk assessment metrics as just one factor in determining sentences.
“Under the law of the Commonwealth of Pennsylvania, sentencing is left to the sound discretion of the trial judge,” Rangos said, “and that would remain the case if a risk assessment tool were adopted – that is the law.”
Some speakers in Pittsburgh recommended scrapping the tool and using the saved funds for drug and alcohol treatment, mental health care, and job training programs. Such investment, the activists said, would help to reduce incarceration.
Data-driven risk assessment first gained traction in criminal justice circles because, to many, it promised to limit the influence of judges’ individual biases on the cases that come before them.
In Pittsburgh, however, some activists said they trust elected judges to be fair.
“I would really like for you to have that power and do what’s best, not an algorithm or a computer scientist,” said Terrell Thomas, senior field organizer for the American Civil Liberties Union’s Smart Justice campaign.
At the hearing in Philadelphia, Nyssa Taylor, criminal justice policy counsel with the ACLU of Pennsylvania, called the tool unconstitutional because the algorithm is weighted differently for men and women.
“All men receive one point in the risk assessment score and women receive zero. Our state and federal courts have said that that is explicit gender discrimination. Gender discrimination is prohibited at sentencing,” said Taylor in an interview after her testimony.
Critics also emphasized that the commission’s own analysis shows the proposed algorithm’s accuracy varies sharply depending on the risk level being predicted.
The commission found the current version of their risk assessment tool is 85 percent accurate when it comes to identifying low-risk defendants.
That figure dips to 52 percent when it comes to identifying high-risk defendants.
“That’s a coin flip,” said Rev. Gregory Holston, executive director of POWER, an interfaith organization that fights for social justice.
The commission will hear testimony Friday in Warren County.
The next formal meeting for the body is in March. Executive Director Mark Bergstrom said he’s committed to fulfilling the commission’s mandate, but acknowledges the possibility of an impasse.
“Our system right now is not great and we need to take steps to improve it, but we have to make sure we’re not causing harm,” said Bergstrom.
After comparing the algorithm to the sometimes errant auto-correct function on smartphones, Rep. Stephen Kinsey, D-Philadelphia, said he’d support repealing the commission’s mandate.
State Sen. Sharif Street, D-Philadelphia, said he thinks there’s “substantial” support in the Senate to change the mandate.
*This story was updated at 6:02 p.m. on Thursday, Dec. 13, 2018, to include coverage from Pittsburgh.