Friday, October 21, 2016

"Study Finds Racial Bias In Facial Recognition Software Used By Police"

Earlier this week, the Center on Privacy & Technology at Georgetown Law released a study examining the largely unregulated facial recognition technology used by police departments across the country. While the report has a total of 11 “key findings,” one in particular is getting the most attention: “Police face recognition will disproportionately affect African Americans.” While this doesn’t necessarily suggest a conscious racial bias on the part of the software’s developers, there are a number of factors at work that may lead to facial recognition programs being less accurate at identifying black men and women.

Studies have long found that “cross-racial identification” is less reliable than identifying facial features among members of one’s own ethnic group. Facial recognition systems are still designed by humans, after all, so the “training” of those systems can be flawed: if the images used to train the software underrepresent a particular group, the software will tend to perform worse on faces from that group. Much of the time, after the software turns up possible matches, the final determination is left to a human operator, so even if the software is working the way it should, there’s plenty of room for human bias. Two different facial recognition companies interviewed for the study admitted that their software had not been tested for racial bias.
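To make that pipeline concrete, here is a minimal sketch (in Python, with entirely hypothetical names and data, not any vendor's actual code) of the two steps described above: the software ranks enrolled faces by similarity to a probe image and hands the top candidates to a human operator, and a separate check measures whether the false-match rate differs by demographic group, the kind of test the interviewed companies said they had not run.

```python
import numpy as np

def top_k_candidates(probe, gallery, k=5):
    """Rank enrolled face embeddings by cosine similarity to the probe."""
    # Normalize rows so plain dot products equal cosine similarities.
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    p = probe / np.linalg.norm(probe)
    scores = g @ p
    order = np.argsort(scores)[::-1][:k]
    return order, scores[order]  # candidate indices + scores for the human reviewer

def false_match_rate_by_group(scores, same_person, group, threshold=0.6):
    """Share of different-person comparisons scoring above threshold, per group."""
    rates = {}
    for name in np.unique(group):
        mask = (group == name) & ~same_person
        rates[name] = float(np.mean(scores[mask] >= threshold))
    return rates
```

If different-person pairs from one group clear the match threshold more often than pairs from another, innocent people in that group will land on candidate lists more often, which is the disparity behind the study's key finding.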

5 comments:

Amartel said...

Could the "bias" be due to the fact that, statistically, there are more AA faces in the system due to higher government employment and (?) incarceration rates so the system is more likely to recognize an individual face? If so, that's not bias, at least not on the part of the software.

bagoh20 said...

The machine claimed: "You humans all look alike to me, and the smell, Oy vey!"

edutcher said...

I wonder if it takes into consideration more black yoots are shot by black cops.

Or is that racist?

Synova said...

There was a thing where a computer was programmed to recognize beauty and it chose light skin over dark. Outrage ensued. I don't know if anyone asked if a different skin tone made a difference to the light sensors. It's reasonable to expect that it would.

GOODSTUFF said...

Other factors cited as making facial recognition more difficult include variations in makeup on women, and darker skin tones not working well with programs that use color contrast to help read facial features (see the sketch below).

BTW - Most Facial Recognition Software is AI (machine learning)... no bias
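A toy illustration of the contrast point above (a sketch with synthetic data, not how any actual vendor's software works): feature detectors that key on local contrast find fewer usable edges when the same facial pattern spans a narrower brightness range, as can happen with darker skin under uneven lighting.

```python
import numpy as np

def edge_fraction(img, thresh=0.1):
    """Fraction of pixels whose gradient magnitude clears an edge threshold."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy) >= thresh))

rng = np.random.default_rng(0)
texture = rng.random((64, 64))    # stand-in for facial detail
wide = 0.2 + 0.60 * texture       # high-contrast rendering of the pattern
narrow = 0.2 + 0.15 * texture     # same pattern, compressed brightness range
print(edge_fraction(wide), edge_fraction(narrow))  # far fewer edges found in the narrow case
```

Real systems use far more robust features than raw gradients, but the underlying issue is the same: less usable contrast means less signal for the feature detectors, whatever they are.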