How The Personal Biases Of Computer Scientists Can Seep Into The Programs They Create

Aug 27, 2019

It's a truth we can't escape: human beings have biases, and those biases can seep into computer programs and algorithms based on who is creating them.

That can be particularly harmful when it comes to government programs and services, said Annie Sullivan, a Google engineer who recently did a stint with the federal government. Sullivan spoke about her experience at last week's Abstractions Conference, a multi-disciplinary software conference held in Pittsburgh.

Sullivan said humans learn by generalizing, which can make us biased in ways we never intended.

"It's really easy to take a machine-learning algorithm and accidentally encode that bias in a way that's very opaque, or hard to understand," she said.

For example, a machine-learning algorithm could be built to sort resumes, putting some in a "good" pile and others in a "bad" pile. An engineer's bias against community colleges could be encoded into that algorithm without anyone intending it.
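One way this can happen, sketched below with entirely hypothetical resume data: a human reviewer who undervalues community-college degrees labels the training examples, and a simple word-frequency model then absorbs that bias, even though no rule about schools was ever written down. The resumes, labels, and scoring scheme here are illustrative assumptions, not anything from Sullivan's talk.

```python
from collections import Counter

# Hypothetical training resumes labeled by a biased human reviewer.
# Note the last two candidates have the same skills as the first two,
# but were labeled "bad" anyway.
training = [
    ("python java state university", "good"),
    ("python sql state university", "good"),
    ("python java community college", "bad"),
    ("sql java community college", "bad"),
]

# Count how often each word appears under each label (a naive
# Bayes-style frequency model).
counts = {"good": Counter(), "bad": Counter()}
for text, label in training:
    counts[label].update(text.split())

def score(resume):
    """Score a resume: positive leans 'good', negative leans 'bad'."""
    total = 0
    for word in resume.split():
        total += counts["good"][word] - counts["bad"][word]
    return total

# Two candidates with identical skills but different schools get
# different scores -- the bias came from the labels, not from any
# explicit rule in the code.
print(score("python java state university"))   # positive
print(score("python java community college"))  # negative
```

Nothing in the model mentions community colleges explicitly, which is exactly the opacity Sullivan describes: the bias lives in the training labels and is hard to spot by reading the code.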

This kind of programmed bias is especially harmful when it's built into government processes, said Sullivan. 

"We have a really big responsibility in government to represent all the people and to serve all the people," she said. "Not just people that happen to be software engineers."

In the long term, the real fix is improving diversity in the tech sector, said Sullivan. According to data from the U.S. Census Bureau, just over one fifth of computer programmers in the country are women, and fewer than five percent are black.