
A display shows a facial recognition system for law enforcement during the NVIDIA GPU Technology Conference, which showcases artificial intelligence, deep learning, virtual reality, and autonomous machines, in Washington, D.C., November 1, 2017. (Photo: Saul Loeb/AFP via Getty Images)

Federal Study on Racial Biases in Facial Recognition Technology Confirms Warnings of Civil Liberties Groups

African American and Asian American men were misidentified 100 times as often as white men.

Julia Conley

The U.S. government's first major federal study of facial recognition surveillance, released Thursday, shows the technology's extreme racial and gender biases, confirming what privacy and civil rights groups have warned about for years.

In a study of 189 algorithms used by law enforcement agencies to match facial recognition images with names in state and federal databases, the National Institute of Standards and Technology (NIST) found that Asian American and African American men were misidentified 100 times as often as white men.

The algorithms disproportionately favored white middle-aged men overall. Compared with young people, the elderly, and women of all ages, middle-aged white males were identified accurately most frequently, while Native American people were most frequently misidentified.

Such misidentifications can lead to false arrests as well as the inability to secure employment, housing, or credit, the MIT Media Lab found in a 2018 study.

"Criminal courts are using algorithms for sentencing, mirroring past racial biases into the future," tweeted Brianna Wu, a U.S. House candidate in Massachusetts. "Tech is a new, terrifying frontier for civil rights."

The NIST study echoed the results of the MIT Media Lab study, entitled "Gender Shades," in which researchers found that algorithms developed by three different companies most often misidentified women of color.

NIST's report is "a sobering reminder that facial recognition technology has consequential technical limitations alongside posing threats to civil rights and liberties," Joy Buolamwini, lead author of the Gender Shades report, told the Washington Post.

Digital rights group Fight for the Future wrote on social media that the study demonstrated "why dozens of groups and tens of thousands of people are calling on Congress to ban facial recognition."

Fight for the Future launched its #BanFacialRecognition campaign in July, calling on local, state, and federal governments to ban the use of the technology by law enforcement and other public agencies—instead of just regulating its use.

"This surveillance technology poses such a profound threat to the future of human society and basic liberty that its dangers far outweigh any potential benefits," Fight for the Future said when it launched the campaign.

This week, lawmakers in Alameda, Calif., became the latest local officials to vote for a ban.

Despite warnings from Fight for the Future and other groups including the ACLU, which sued the federal government in October over its use of the technology, the FBI has run nearly 400,000 searches of local and federal databases using facial recognition since 2011.

The algorithms studied by NIST were developed by companies including Microsoft, Intel, and Panasonic. Amazon, which developed facial recognition software called Rekognition, did not provide its algorithm for the study.

"Amazon is deeply cowardly when it comes to getting their facial recognition algorithm audited," tweeted Cathy O'Neil, an algorithm auditor.

Jay Stanley, a senior policy analyst at the ACLU, told the Post that inaccuracies in algorithms are "only one concern" that civil liberties groups have about the surveillance programs that the federal government is now studying.

"Face recognition technology—accurate or not—can enable undetectable, persistent, and suspicionless surveillance on an unprecedented scale," Stanley said.

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.



Common Dreams, Inc. Founded 1997. Registered 501(c3) Non-Profit | Privacy Policy