ACLU finds Amazon’s facial recognition AI is racially biased

Ryan is a senior editor at TechForge Media with over a decade of experience covering the latest technology and interviewing leading industry figures.


A test of Amazon’s facial recognition technology by the ACLU has found that it erroneously matched people with darker skin colours to criminal mugshots more often.

When used by law enforcement, bias in AI technology raises concerns about infringing on civil rights through automated racial profiling.

A 2010 study by researchers at NIST and the University of Texas at Dallas found that algorithms designed and tested in East Asia are better at recognising East Asians, while those designed in Western countries are more accurate at detecting Caucasians.

The ACLU (American Civil Liberties Union) ran a test of Amazon’s facial recognition technology on members of Congress to see whether they would be matched against a database of criminal mugshots.

Amazon’s Rekognition tool was used to compare pictures of every member of the House and Senate against 25,000 arrest photos. The false matches disproportionately affected members of the Congressional Black Caucus.

In a blog post, the ACLU said:

“The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country.

These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.”

AWS (Amazon Web Services) disputed the methodology used by the ACLU.

The company says the test left Rekognition’s default confidence threshold of 80 percent in place. For law enforcement, Amazon says it suggests only registering matches with a confidence of 95 percent or above.
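The effect of that threshold can be illustrated with a short sketch. The example below is hypothetical: it filters a list of candidate matches shaped like the `FaceMatches` entries returned by Rekognition’s search APIs, keeping only those at or above a given similarity threshold. The sample scores and image IDs are invented for illustration, not taken from the ACLU’s test.

```python
# Hypothetical sketch of how raising the confidence threshold from the default
# 80 percent to Amazon's recommended 95 percent changes which candidate
# matches are reported. The match data below is invented; in practice the
# scores would come from the Rekognition API's FaceMatches response field.

def filter_matches(face_matches, threshold):
    """Keep only candidate matches whose Similarity meets the threshold."""
    return [m for m in face_matches if m["Similarity"] >= threshold]

candidates = [
    {"Face": {"ExternalImageId": "mugshot_0001"}, "Similarity": 97.2},
    {"Face": {"ExternalImageId": "mugshot_0042"}, "Similarity": 88.5},
    {"Face": {"ExternalImageId": "mugshot_0777"}, "Similarity": 81.3},
]

print(len(filter_matches(candidates, 80)))  # default threshold reports 3 matches
print(len(filter_matches(candidates, 95)))  # stricter threshold reports only 1
```

At the default setting, all three lower-confidence candidates would be surfaced as matches; at 95 percent, only the strongest one survives, which is the crux of Amazon’s objection to the ACLU’s methodology.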

In total, 28 members of Congress were incorrectly matched against the criminal mugshots, and nearly 40 percent of those false matches were legislators with darker skin colours. What puts that figure into perspective is that only 20 percent of Congress are people of colour.
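The disparity can be made concrete with a quick calculation. The count of 11 below is an approximation derived from the reported "nearly 40 percent" of 28 false matches, not a figure stated in this article:

```python
# Rough arithmetic behind the disparity claim. The figure of 11 false matches
# involving people of colour is approximated from the reported percentages.
false_matches = 28
poc_false_matches = 11        # assumed: ~40% of 28, per the reported share
poc_share_of_congress = 0.20  # baseline: 20% of Congress are people of colour

rate_in_false_matches = poc_false_matches / false_matches
print(round(rate_in_false_matches, 2))  # roughly double the 0.20 baseline
```

In other words, people of colour appear among the false matches at roughly twice the rate they appear in Congress as a whole.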

“Our test reinforces that face surveillance is not safe for government use,” said Jacob Snow, Technology and Civil Liberties Attorney at the ACLU Foundation of Northern California. “Face surveillance will be used to power discriminatory surveillance and policing that targets communities of color, immigrants, and activists. Once unleashed, that damage can’t be undone.”

Amazon is actively marketing its facial recognition technology to law enforcement agencies, including police in Washington County, Oregon, and Orlando, Florida. The company promotes it as a way to identify people in real-time from both surveillance footage and officers’ body cameras.

What are your thoughts on the ACLU’s findings? Let us know in the comments.


