A Project NOLA security camera keeps watch over the corner of Conti Street and Burgundy Street in New Orleans on May 8, 2025.
"This is the facial recognition technology nightmare scenario that we have been worried about," said one civil liberties campaigner.
Amid a Washington Post investigation and pushback from civil liberties defenders, New Orleans police recently paused their sweeping, and apparently unlawful, use of a private network of more than 200 surveillance cameras and facial recognition technology, deployed without public oversight, to track and arrest criminal suspects.
On Monday, the Post published an exposé detailing how the New Orleans Police Department (NOPD) relied on real-time facial recognition technology provided by Project NOLA, a nonprofit organization operating out of the University of New Orleans, to locate and apprehend suspects.
"Facial recognition technology poses a direct threat to the fundamental rights of every individual and has no place in our cities."
Project NOLA's website says the group "operates the largest, most cost-efficient, and successful networked [high definition] crime camera program in America, which was created in 2009 by criminologist Bryan Lagarde to help reduce crime by dramatically increasing police efficiency and citizen awareness."
The Post's Douglas MacMillan and Aaron Schaffer described the practice as "a surveillance method without a known precedent in any major American city that may violate municipal guardrails around use of the technology."
As MacMillan and Schaffer reported:
Police increasingly use facial recognition software to identify unknown culprits from still images, usually taken by surveillance cameras at or near the scene of a crime. New Orleans police took this technology a step further, utilizing a private network of more than 200 facial recognition cameras to watch over the streets, constantly monitoring for wanted suspects and automatically pinging officers' mobile phones through an app to convey the names and current locations of possible matches.
This, despite a 2022 municipal law limiting police use of facial recognition. That ordinance reversed the city's earlier outright ban on the technology and was criticized by civil liberties advocates for dropping a provision that required permission from a judge or magistrate commissioner prior to use.
"This is the facial recognition technology nightmare scenario that we have been worried about," Nathan Freed Wessler, deputy director with the ACLU's Speech, Privacy, and Technology Project, told the Post. "This is the government giving itself the power to track anyone—for that matter, everyone—as we go about our lives walking around in public."
Since 2023, Project NOLA, whose automated alerts to police were paused last month amid the Post's investigation, has contributed to dozens of arrests. Proponents, including NOPD and city officials, credit the collaboration for a decrease in crime in a city that had the nation's highest homicide rate as recently as 2022. Project NOLA has even been featured in the true crime series "Real Time Crime."
New Orleans Police Commissioner Anne Kirkpatrick told Project NOLA last month that its automated alerts must be shut off until she is "sure that the use of the app meets all the requirements of the law and policies."
Critics point to racial bias in facial recognition algorithms, which disproportionately misidentify racial minorities, as a particular cause for concern. According to one landmark federal study published in 2019, Black, Asian, and Native American people were up to 100 times likelier to be misidentified by facial recognition algorithms than white people.
The ACLU said in a statement that Project NOLA "supercharges the risks":
Consider Randal Reid, for example. He was wrongfully arrested based on faulty Louisiana facial recognition technology, despite never having set foot in the state. The false match cost him his freedom, his dignity, and thousands of dollars in legal fees. That misidentification happened based on a still image run through a facial recognition search in an investigation.
"We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies," ACLU of Louisiana executive director Alanah Odoms said. "These individuals could be added to Project NOLA's watchlist without the public's knowledge, and with no accountability or transparency on the part of the police departments."
"Facial recognition technology poses a direct threat to the fundamental rights of every individual and has no place in our cities," Odoms asserted. "We call on the New Orleans Police Department and the city of New Orleans to halt this program indefinitely and terminate all use of live-feed facial recognition technology."