Police Face Recognition Software Still Isn't Great With Women & POCs, And Your Face Is Already In It

Gerry Bloome
/
AP
Stephen Lamm, a supervisor with the ID Fraud Unit of the North Carolina Department of Motor Vehicles, looks through photos in the facial recognition system on Thursday, Sept. 24, 2009, in Raleigh, N.C.

Facial recognition systems look fast and effective in the movies and on television crime shows, but a new report shows that these identification tools suffer from some of the same biases that we’ve heard about when humans try to identify an alleged criminal.

On this week's episode of the Criminal Injustice podcast, University of Pittsburgh law professor and host David Harris talked to Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law School and author of the report The Perpetual Lineup.

Their conversation has been edited for length and clarity. Listen to the full episode above, or find more at criminalinjusticepodcast.com.

=============

DAVID HARRIS: Facial recognition starts with a picture taken by a police officer or a surveillance camera; a computer then measures the face and compares it to millions of stored faces. Now, it's still limited, but it's come a long way. What does that do for everyday law enforcement in the technical sense?

ALVARO BEDOYA: It generally is very good at comparing two photos taken under similarly good lighting conditions, where the subject is facing the camera head on, with high resolution. So, for example, if someone has been arrested and they refuse to identify themselves or they're providing an alias, you can take a mug shot of them, run it against your face recognition database of mug shots and identify that person in a fairly accurate way.
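The comparison step Bedoya describes can be sketched in miniature. Real systems extract hundreds of measurements from a face with a neural network or landmark model; the three-number vectors, record names and 0.95 threshold below are made-up values used purely to illustrate ranking stored faces by similarity to a probe photo:

```python
import math

def cosine_similarity(a, b):
    """Compare two face feature vectors; values near 1.0 mean very similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_matches(probe, gallery, threshold=0.95):
    """Rank gallery faces by similarity to the probe; keep those above threshold."""
    scored = [(name, cosine_similarity(probe, vec)) for name, vec in gallery]
    scored.sort(key=lambda item: item[1], reverse=True)
    return [(name, score) for name, score in scored if score >= threshold]

# Hypothetical feature vectors for a probe photo and two stored mug shots.
probe = [0.61, 0.34, 0.72]
gallery = [
    ("record_1041", [0.60, 0.35, 0.71]),  # photo taken under similar conditions
    ("record_2977", [0.10, 0.90, 0.20]),  # a very different face
]
print(best_matches(probe, gallery))  # only record_1041 clears the threshold
```

Note the design choice the threshold represents: a system that drops the `score >= threshold` filter and always returns the top of the ranked list is one that, as Bedoya puts it later in the interview, "is not designed to give no for an answer."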

HARRIS: How many Americans have their images in these facial recognition networks?

BEDOYA: We estimate that, at a minimum, one out of every two American adults is in a criminal face recognition network. We identified 26 states at the time of publication, and actually now it's 28 states*, where we have found that driver's licenses are shared in some way with law enforcement. Either law enforcement is able to run face recognition searches, or law enforcement actually obtains all those driver's license photos. This has really never happened before. By tapping into all of these driver's license photos, what the FBI is doing right now is creating a national biometric network that is primarily made up of law-abiding Americans. That's a fundamental shift, and we think a problematic one.

HARRIS: Why is it a problem, just having your face in a database? How does it hurt anyone?

BEDOYA: Let me give you an example. A couple of years ago, the FBI tested its own system and found that one out of every seven searches of that system returned a list of totally innocent people.

HARRIS: And your report says that not everyone has the same odds of being wrongly identified.

BEDOYA: Again, the little research that has been done on bias in these systems suggests that they actually have a harder time finding women, African-Americans and young people, and that finding comes from FBI co-authored research from 2012. Now, for some systems, this might mean that the person just isn't caught, and that you're slightly more likely to catch a white suspect than an African-American or female suspect. But other systems, like for example the FBI system, are not designed to give no for an answer. The FBI system is not designed to return zero potential suspects; no matter what, it will return a list of faces. And so, in these systems that are designed to not tell you no for an answer, when they miss the right suspect, they're still going to give you a list of potential suspects that look like the candidate image. This is why we're particularly worried that innocent people are going to be caught up in criminal investigations as a result of face recognition. And those innocent people will predominantly be African-Americans, women and young people.

HARRIS: The biases that are in the system already get baked into the new system.

BEDOYA: That's exactly right. There is a feedback loop of bias. The bias that occurs on the street, when a police officer decides to arrest a black person where he or she would have let a white person keep on walking, is translated into a digital bias in the face recognition search results that an investigating officer gets when he or she runs a search for someone's face. That's exactly what's happening here.

=============

*The face recognition system used in Pennsylvania is managed by the Pennsylvania Justice Network and owned by the Pennsylvania Chiefs of Police Association. In 2013, the system gained access to the state's driver's license photo database, which includes 34 million driver's license and ID photos, in addition to over four million mug shots.