
Residents Raise Concerns About Bias In County's Automated Decision-Making Tools

An-Li Herring / 90.5 WESA

Ryan Scott (right), director of Carlow University's Social Justice Institutes, leads a discussion on predictive policing at a community meeting at the Homewood-Brushton YMCA on Tuesday, March 10, 2020.

Local residents voiced trepidation at a meeting in Homewood Tuesday about the use of algorithms to guide criminal justice, law enforcement, and child welfare decisions.

Allegheny County employs data-driven algorithms for a number of purposes: to determine whether an allegation of child maltreatment should be investigated, for example, or to decide whether to confine defendants ahead of trial. At Tuesday’s meeting, hosted by the Pittsburgh Task Force on Public Algorithms, attendees said they worried the formulas could reinforce existing biases in policing and incarceration.

Critics said algorithms that draw on crime data or individuals’ arrest records are inherently biased, because black communities historically have been over-policed.

When “white kids in suburban communities … go and do something wrong, frequently police bring them back to their parents’ house,” said Tim Stevens, chairman and CEO of the Black Political Empowerment Project. “Urban black kids … police just take them down to the jail.”

“Race and class are two big deals that are not going away in this country anytime soon,” Stevens said. “So any [algorithm] system … is going to be built on prejudices and biases, and they will continue to screw the people who are already at the bottom of the system.”

Nathaniel Carter, director of the community services organization This Generation Connect, worried that judges would use data-based risk assessment tools to deflect blame for racial disparities in setting bail.

“Now we can say that it wasn’t this particular judge … It was this algorithm,” Carter said.

But left to their own devices, judges could be more arbitrary, countered Eric Hulsey, a professor of public health at the University of Pittsburgh.

“So on the flip side, you could use [algorithms] to take away that power from them and say, ‘No, you don’t get full discretion,’” Hulsey said.

The University of Pittsburgh’s Institute for Cyber Law, Policy, and Security announced the task force in January. Its 22 members include professors in fields ranging from law to statistics, as well as civic and foundation leaders. Seven city and county government officials make up a separate advisory panel.

The task force's mission is to examine local governments’ use of algorithms to measure risk in a range of contexts. Beyond criminal justice and human services, municipalities use the tools to manage challenges such as fire risk and traffic patterns, according to Pitt Cyber Policy Director Chris Deluzio.

Deluzio said a chief goal of the task force is to reduce potential biases in the region’s decision-making tools. While algorithms can make decision-making more systematic and efficient, such tools have come under fire nationally and internationally for being opaque and for threatening to exacerbate existing biases.

“If these types of systems aren’t implemented and overseen with some sort of oversight, with some sort of controls,” he said, “there’s a risk they could lock in bias that’s reflected in their data.”

But some forum attendees on Tuesday night saw an opportunity to use risk-based models to combat racial and class disparities.

Shimira Williams, a resident of Lincoln-Lemington-Belmar, suggested that officials use predictive policing tools to determine which neighborhoods need more resources to combat poverty and crime.

“So instead of it being used to disseminate people in a reactive [way], make it more of a proactive approach,” she said. If “we know it’s a hot spot, let’s partner with local community organizations to see how we can become part of that community.”

Bonnie Fan, an organizer for the Coalition Against Predictive Policing, said a diverse set of experts should be involved in developing algorithms. She said such an approach could prompt governments to rethink traditional assumptions. For example, she said, they might ask, “Is the answer to implement a risk assessment score so that the judge doesn’t have to set bail? Or is the answer [never] to require bail?”

Debate over algorithms is widespread. In Pennsylvania, for example, controversy arose when the state sentencing commission developed a formula to guide judges in assessing how likely defendants were to commit new violent offenses.

Meanwhile, a task force charged with overseeing the use of algorithms in New York City recently faltered amid complaints that the process was not transparent and did too little to engage the public. A task force member complained that officials did not furnish a single instance of an automated system for the panel to study.

Tuesday’s meeting was the first of two community gatherings the algorithms task force plans to host this month. The second is scheduled to take place in Beltzhoover next Thursday.

The task force plans to publish a full report, including best practices and guidelines for using municipal algorithms, in the summer of 2021. The recommendations will not be binding.