
Amazon claimed this week that the company's facial recognition technology can recognize "fear" in the faces of people it encounters. (Image: iStock/Getty Images)

Rights Group Warns Against Amazon Marketing to Law Enforcement 'Dangerous and Invasive' Facial Recognition System It Claims Can Detect People's Fear

"The dystopian surveillance state of our nightmares is being built in plain sight."

Jessica Corbett

Privacy advocates are responding with alarm to Amazon's claim this week that the controversial cloud-based facial recognition system the company markets to law enforcement agencies can now detect "fear" in the people it targets.

"Amazon is going to get someone killed by recklessly marketing this dangerous and invasive surveillance technology to governments," warned Evan Greer, deputy director of the digital rights group Fight for the Future, in a statement Wednesday.

Amazon Web Services detailed new updates to its system—called Rekognition—in an announcement Monday:

With this release, we have further improved the accuracy of gender identification. In addition, we have improved accuracy for emotion detection (for all 7 emotions: 'Happy', 'Sad', 'Angry', 'Surprised', 'Disgusted', 'Calm', and 'Confused') and added a new emotion: 'Fear'. Lastly, we have improved age range estimation accuracy; you also get narrower age ranges across most age groups.
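For readers curious about how these claims surface to developers, the sketch below shows roughly how Rekognition's face-detection call returns per-face emotion labels (now including "FEAR") and an estimated age range. It assumes the boto3 Python SDK; the region, file name, and printed output are illustrative, not taken from Amazon's announcement.

```python
import boto3

# Rekognition client (region is an illustrative choice)
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local image file supplied by the caller
with open("face.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks the service for the full set of face attributes,
# including the emotion confidence scores and age-range estimate
# described in Amazon's announcement.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Estimated age range: {age['Low']}-{age['High']}")
    for emotion in face["Emotions"]:
        print(f"  {emotion['Type']}: {emotion['Confidence']:.1f}%")
```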

Pointing to research on the technology conducted by the ACLU and others, Fight for the Future's Greer said that "facial recognition already automates and exacerbates police abuse, profiling, and discrimination."

"Now Amazon is setting us on a path where armed government agents could make split second judgements based on a flawed algorithm's cold testimony. Innocent people could be detained, deported, or falsely imprisoned because a computer decided they looked afraid when being questioned by authorities," she warned. "The dystopian surveillance state of our nightmares is being built in plain sight—by a profit-hungry corporation eager to cozy up to governments around the world."

VICE reported that "despite Amazon's bold claims, the efficacy of emotion recognition is in dispute. A recent study reviewing over 1,000 academic papers on emotion recognition found that the technique is deeply flawed—there just isn't a strong enough correlation between facial expressions and actual human emotions, and common methods for training algorithms to spot emotions present a host of other problems."

Amid mounting concerns over how police and other agencies may use and abuse facial recognition tools, Fight for the Future launched a national #BanFacialRecognition campaign last month. Highlighting that there are currently no nationwide standards for how agencies and officials can use the emerging technology, the group calls on federal lawmakers to ban the government from using it at all.

Fight for the Future reiterated its demand Wednesday in response to Amazon's latest claims. Although there are not yet any federal regulations for the technology, city councils—from San Francisco to Somerville, Massachusetts—have recently taken steps to outlaw government use of such systems.

Activists are especially concerned about the technology in the hands of federal agencies such as U.S. Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP), whose implementation of the Trump administration's immigration policies has spurred condemnation from human rights advocates the world over.

Civil and human rights advocates have strongly urged Amazon—as well as other developers including Google and Microsoft—to refuse to sell facial recognition technology to governments in the United States and around the world, emphasizing concerns about safety, civil liberties, and public trust.

However, documents obtained last year by the Project on Government Oversight revealed that in the summer of 2018, Amazon pitched its Rekognition system to the Department of Homeland Security—which oversees ICE and CBP—over the objection of Amazon employees. More recently, the corporation has been targeted by protesters of the Trump administration's immigration agenda over Amazon Web Services' cloud contracts with ICE.

In a July report on Amazon's role in the administration's immigration policies, Al Jazeera explained that "U.S. authorities manage their immigration caseload with Palantir software that facilitates tracking down would-be deportees. Amazon Web Services hosts these databases, while Palantir provides the computer program to organize the data."

"Amazon provides the technological backbone for the brutal deportation and detention machine that is already terrorizing immigrant communities," Audrey Sasson, executive director of Jews For Racial and Economic Justice, told VICE Tuesday. "[A]nd now Amazon is giving ICE tools to use the terror the agency already inflicts to help agents round people up and put them in concentration camps."

"Just as IBM collaborated with the Nazis, Amazon and Palantir are collaborating with ICE today," added Sasson. "They've chosen which side of history they want to be on."


Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
