Ocasio-Cortez Ties Concerns Over Unregulated Facial Recognition Technology to 'Global Rise in Authoritarianism and Fascism'

Rep. Alexandria Ocasio-Cortez participated in a House Oversight Committee hearing about facial recognition technology Wednesday. (Photo: C-SPAN/screenshot)

House Democrats discussed the need for federal rules amid concerns about privacy rights and surveillance tools used to impose social control

Rep. Alexandria Ocasio-Cortez on Wednesday tied "the political reality that there is a global rise in authoritarianism and fascism" to concerns about the threats that facial recognition technology poses to Americans' civil rights and liberties.

"I don't want to see an authoritarian surveillance state," the New York Democrat told reporters on Capitol Hill, "whether it's run by a government or whether it's run by five corporations."

BuzzFeed's Davey Alba, who captured the congresswoman's comments on video, reported that she was referencing U.S. tech giants Amazon, Apple, Google, Facebook, and Microsoft.

Ocasio-Cortez took questions from reporters after she participated in a House Oversight Committee hearing about "the use of facial recognition technology by government and commercial entities and the need for oversight on how this technology is used on civilians." Earlier this month, San Francisco passed the nation's first ban on the technology, barring city police and agencies--though not corporations or private citizens--from using it. However, there are no such federal rules on the technology.

The committee hearing featured testimonies from advocates and experts, including ACLU senior legislative counsel Neema Singh Guliani; Clare Garvie, a senior associate at Georgetown Law's Center on Privacy & Technology; and MIT researcher Joy Buolamwini, who founded the Algorithmic Justice League and has researched race and gender bias in facial recognition algorithms.

In an exchange circulated on social media, Ocasio-Cortez asked Buolamwini whether the algorithms are mostly designed by white men, are most effective at identifying white men, and have trouble correctly identifying people of color and people of different gender expressions--all of which the expert confirmed.

Ocasio-Cortez then asked, "Do you think this could exacerbate the already egregious inequalities in our criminal justice system?"

Buolamwini responded, "It already is."

During the hearing, Ocasio-Cortez also touched on how facial recognition technology relates to constitutionally protected privacy rights.

Despite concerns about privacy rights, the consequences of misidentification, and the potential for misuse of collected data, facial recognition technology has been used by airlines, government agencies, police, retailers, and schools across the country--all in the absence of nationwide regulations.

Some efforts at major U.S. tech companies to self-regulate have failed. Fast Company noted in a report Wednesday:

The facial recognition hearing happened on the same day that Amazon's investors voted down a measure seeking to stop the sale of [the company's face software] Rekognition to government agencies. Shareholders also voted down a measure directing Amazon to perform a review of the civil liberties implications of its face recognition technology. The measures were proposed by activist investors and employees, partly based on Buolamwini's research, but were vehemently opposed by Amazon, which argued that fears around bias and misuse of face recognition were "insignificant" to the company.

Civil and human rights advocates have pressured corporate giants in the field to refuse to sell facial recognition technology to governments across the globe, citing concerns about abuse. In January, 90 groups sent letters to executives at Amazon, Google, and Microsoft with that demand, warning that surveillance technology can endanger people--particularly marginalized communities--and undermine public trust.

As Democrats and Republicans in Congress consider how to regulate facial recognition tools in the United States, debates about the technology are raging worldwide. Privacy and data protection rights are at the center of an ongoing court case in the United Kingdom about police use of facial recognition cameras.

Exemplifying concerns over how the technology is being deployed in the U.K., a video recently went viral online after Met Police stopped and ticketed a man for refusing to have his face scanned as he walked past surveillance cameras in London.

Defense One reported last week that sometime this year, "Moscow will join a growing number of global cities whose populations are monitored by AI-enabled facial recognition programs."

Meanwhile in China, as the New York Times reported last month, authorities are "using a vast, secret system of advanced facial recognition technology to track and control the Uighurs, a largely Muslim minority," in what experts described as "the first known example of a government intentionally using artificial intelligence for racial profiling."

Asked by a reporter on Wednesday about the potential for the United States--particularly under the Trump administration--to follow in the footsteps of China in using real-time surveillance as a form of social control, Ocasio-Cortez said, "I absolutely am concerned about that."
