Pitt task force calls for transparency, public input when government agencies use algorithms

Government agencies that use algorithms should be more transparent about how those systems work and allow the public to have a say in how they’re used, according to a Pittsburgh task force on public algorithms.

For the last two years, a task force of academic experts from the University of Pittsburgh and government advisors from Pittsburgh and Allegheny County has been studying how the region’s agencies use automated decision-making tools. The Pittsburgh Task Force on Public Algorithms published its first report Wednesday, along with recommendations for addressing biases in those systems.

“There is a growing realization that algorithms can — and in some cases do — have extraordinary power to shape our lives, in both the private and public sectors. We believe that this is a critical moment for our region’s governments to act,” the report reads.

Supporters of public algorithmic systems argue they make data processing faster and more efficient by removing the human variable. But the task force found algorithms can also reflect existing human biases and even amplify them.

The study looked at systems used by Allegheny County’s child-welfare department and courts and a pilot project by Pittsburgh Police. These algorithms have been used to make decisions about whether allegations of child mistreatment should be investigated; whether to confine a defendant ahead of trial; and where police should patrol.

The task force found that transparency about how these systems were developed and how they’re used varied greatly from agency to agency. A system used to screen allegations of child mistreatment in Allegheny County was found to have more public input and be evaluated more frequently than a program used by the Pittsburgh Bureau of Police to predict hot spots for crime.

“Such opacity seems to be far too common across the country, especially in policing applications of algorithmic systems,” the report argued.

The task force studied a pilot program launched by Pittsburgh Police and Carnegie Mellon University in 2017 that was designed to make predictions about where crime might occur in the future. Police would then target patrols in so-called "hot spots."

The system relied on reports of crimes and 911 calls from 2011 to 2016 to make predictions, which, according to the report, could reflect long-standing racial and economic disparities in Pittsburgh.
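The bias concern is easy to see in miniature. The toy sketch below is a hypothetical stand-in, not the actual Pittsburgh-CMU model: it simply flags the areas with the most past reports as future “hot spots.” If over-policed neighborhoods generate more reports, they dominate the historical data, draw more patrols, and generate still more reports.

```python
# Illustrative sketch only: a count-based "hot spot" predictor.
# The neighborhoods, records, and top_k value are hypothetical.
from collections import Counter

# Hypothetical historical incident records: (neighborhood, year)
incidents = [
    ("Neighborhood A", 2011), ("Neighborhood A", 2012),
    ("Neighborhood A", 2014), ("Neighborhood B", 2013),
    ("Neighborhood C", 2016),
]

def predict_hot_spots(records, top_k=1):
    """Rank neighborhoods by historical report counts and flag the top ones.

    Because the input reflects where incidents were *reported*, areas that
    were already heavily patrolled produce more records, so the output
    inherits that skew: more patrols -> more reports -> more patrols.
    """
    counts = Counter(place for place, _year in records)
    return [place for place, _count in counts.most_common(top_k)]

print(predict_hot_spots(incidents))  # ['Neighborhood A']
```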

“We actually can exacerbate inequality and that’s the problem,” said David Hickton, founding director of Pitt's Institute for Cyber Law, Policy and Security.

The predictive policing system was suspended in 2020 amid concerns about racial bias inherent in the system. Similar concerns surround facial recognition software, which has been found to be less accurate on people with darker skin; its use led to the wrongful arrest of a Black man in Michigan in 2020.

“If we’re trying to correct over-policing, the use of these algorithms would defeat that,” Hickton said.

Pittsburgh City Council passed a measure in 2020 to guide how police may use facial recognition in the future. But the task force recommends against using the technology at all for the foreseeable future, citing profound racial and gender disparities associated with biometric systems.

The task force also raised privacy concerns about facial recognition software drawing on Allegheny County’s network of surveillance cameras.

“Even if accuracy issues eventually improve, [the systems] could result in invasive surveillance that would undermine privacy,” the report notes.

The task force makes seven recommendations and suggests several best practices for agencies using algorithms to make decisions. Among the recommendations are more public input and education about how algorithms are used to deliver government services and make criminal justice decisions.

A spokesperson for Allegheny County's Department of Human Services said Wednesday that developing algorithms in the public eye has been a priority for the department.

"DHS Director Erin Dalton is pleased to see the task force emphasize the need for transparency and community engagement in these processes," the spokesperson said. "We have long recognized that the public should understand how their government makes critical decisions and have a track record for ensuring that we develop these tools in public view."

While the report recommends best practices and improvements, the task force concedes that no single fix is a silver bullet for systemic biases.

“We should not expect perfection from our government algorithms,” the report argues. “But we should expect that agencies are able to demonstrate that algorithmic systems produce equal or better outcomes than human processes, and there must be a way for the public to interrogate and challenge such systems.”

The report calls for external reviews of systems used in high-risk situations like predictive policing and court sentencing; for agencies to publish information about their algorithmic systems; for public announcements about new contracts that might bring in algorithmic systems; and for public involvement when systems are developed or changed.

“We do not have to accept the false choice between technological advancement and civil and constitutional rights,” said Hickton. “People of goodwill can find ways to balance liberty and security in the digital age, leveraging tech innovation fairly and with transparency.”

The Task Force on Public Algorithms is organizing community meetings to discuss the report’s findings. More information about those meetings and the full report is available from the task force.

Kiley Koscinski covers city government, policy and how Pittsburghers engage with city services. She also works as a fill-in host for All Things Considered. Kiley has previously served as a producer on The Confluence and Morning Edition.