May 30, 2018
EFF has joined the ACLU and a coalition of civil liberties organizations demanding that Amazon stop powering a government surveillance infrastructure. Last week, we signed onto a letter to Amazon condemning the company for developing a new face recognition product that enables real-time government surveillance through police body cameras and the smart cameras blanketing many cities. Amazon has been heavily marketing this tool, called "Rekognition," to law enforcement, and it's already being used by agencies in Florida and Oregon. This system affords the government vast and dangerous surveillance powers, and it poses a threat to the privacy and freedom of communities across the country. That includes many of Amazon's own customers, who represent more than 75 percent of U.S. online consumers.
As the joint letter to Amazon CEO Jeff Bezos explains, Amazon's face recognition technology is "readily available to violate rights and target communities of color." And as we've discussed extensively before, face recognition technology like this allows the government to amp up surveillance in already over-policed communities of color, continuously track immigrants, and identify and arrest protesters and activists. This technology will not only invade our privacy and unfairly burden minority and immigrant communities, but it will also chill our free speech.
Amazon should stand up for civil liberties, including those of its own customers, and get out of the surveillance business.
Since the ACLU sounded the alarm, others have started to push back on Amazon. The Congressional Black Caucus wrote a separate letter to Bezos last week, stating, "We are troubled by the profound negative unintended consequences this form of artificial intelligence could have for African Americans, undocumented immigrants, and protesters." The CBC pointed out the "race-based 'blind spots' in artificial intelligence" that result in higher numbers of misidentifications for African Americans and women than for whites and men, and called on Amazon to hire more lawyers, engineers, and data scientists of color. Two other members of Congress followed up with another letter on Friday.
Amazon's partnership with law enforcement isn't new. Amazon already works with agencies across the country, offering cloud storage services through Amazon Web Services (AWS) that allow agencies to store the extremely large video files generated by body and other surveillance cameras. Rekognition is an inexpensive add-on to AWS, costing agencies approximately $6-$12 per month.
Rekognition doesn't just identify faces. It can also track people through a scene, even when their faces aren't visible. It can identify and catalog a person's gender, what they're doing, what they're wearing, and whether they appear happy or sad. It can identify other things in a scene, like dogs, cars, or trees, and can recognize text, including street signs and license plates. It also offers to flag content it considers "unsafe" or "inappropriate."
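Each capability above corresponds to a named Rekognition API operation: DetectFaces, DetectLabels, DetectText, and DetectModerationLabels in AWS's boto3 SDK. As a rough sketch of what an agency's software would send (building the payloads requires no AWS account; actually submitting them via `boto3.client("rekognition")` needs credentials and incurs per-image charges):

```python
# Illustrative request payloads for the Rekognition operations described above.
# Only the payloads are built here; no call to AWS is made.

def detect_faces_request(image_bytes: bytes) -> dict:
    # DetectFaces: face bounding boxes plus attributes such as estimated
    # gender and emotions ("HAPPY", "SAD", ...) when Attributes=["ALL"].
    return {"Image": {"Bytes": image_bytes}, "Attributes": ["ALL"]}

def detect_labels_request(image_bytes: bytes) -> dict:
    # DetectLabels: objects and scenes in the frame ("Dog", "Car", "Tree", ...).
    return {"Image": {"Bytes": image_bytes}, "MaxLabels": 50, "MinConfidence": 70}

def detect_text_request(image_bytes: bytes) -> dict:
    # DetectText: printed text such as street signs and license plates.
    return {"Image": {"Bytes": image_bytes}}

def detect_moderation_request(image_bytes: bytes) -> dict:
    # DetectModerationLabels: the "unsafe"/"inappropriate" content flags.
    return {"Image": {"Bytes": image_bytes}, "MinConfidence": 60}
```

The point of the sketch is how little is required: each analysis is a single request with an image and a couple of thresholds.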
And the technology is powerful, if Amazon's marketing materials are accurate. According to the company, Rekognition can identify people in real time by instantaneously searching databases containing tens of millions of faces, detect up to 100 people in "challenging crowded" images, and track people through video (within a single shot and across multiple shots, even when the camera is in motion), which makes "investigation and monitoring of individuals easy and accurate" for "security and surveillance applications." Amazon has even advertised Rekognition for use on police officer "bodycams." (The company took mention of bodycams off its website after the ACLU voiced concern, but "[t]hat appears to be the extent of its response[.]")
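In the API's own terms, the real-time identification described above is a face collection plus the SearchFacesByImage operation: known faces are indexed into a collection once (CreateCollection, IndexFaces), then every incoming frame is matched against the entire collection. A minimal sketch, with assumptions labeled: the collection ID is hypothetical, and the client is passed in as a parameter so the logic can be shown without AWS credentials (in production it would be `boto3.client("rekognition")`):

```python
# Sketch of the search loop behind "identify people in real time": match each
# video frame against a pre-built collection of indexed faces. The collection
# ID below is a hypothetical placeholder, not a real deployment.

def identify_in_frame(client, frame_bytes, collection_id="hypothetical-watchlist"):
    resp = client.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": frame_bytes},  # one video frame as JPEG/PNG bytes
        FaceMatchThreshold=80,         # minimum similarity percentage
        MaxFaces=1,                    # matches the largest face in the frame
    )
    # Each match carries the identifier supplied when the face was indexed.
    return [m["Face"]["ExternalImageId"] for m in resp["FaceMatches"]]
```

Running this once per frame of a bodycam feed is what turns a camera into a continuous identification tool, which is precisely the concern the letter raises.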
This is an example of what can go wrong when police departments unilaterally decide what privacy invasions are in the public interest, without any public oversight or input. That's why EFF supports Community Control Over Police Surveillance (CCOPS) measures, which ensure that local police can't do deals with surveillance technology companies without going through local city councils and the public. People deserve a say in what types of surveillance technology police use in their communities, and in what policies and safeguards the police follow. What's more, governments make more balanced, accountable decisions about surveillance when communities and elected officials are involved in the decision-making process.
Amazon responded to the uproar surrounding the announcement of its government surveillance work by defending the usefulness of the program, noting that it has been used to find lost children in amusement parks and to identify faces in the crowd at the royal wedding. But it failed to grapple with the bigger issue: as one journalist put it, "Nobody is forcing these companies to supply more sensitive image-recognition technology to those who might use it in violation of human or civil rights."
Amazon should stand up for civil liberties, including those of its own customers, and get out of the surveillance business. It should cut law enforcement off from using its face recognition technology, not help usher in a surveillance state. And communities across the country should demand baseline measures to stop law enforcement from acquiring and using powerful new surveillance systems without any public oversight or accountability in the future.
Jennifer Lynch
Jennifer Lynch is a staff attorney with the Electronic Frontier Foundation and works on EFF's FOIA Litigation for Accountable Government (FLAG) Project.
Jamie Williams
Jamie is a staff attorney at the Electronic Frontier Foundation, where she is part of the civil liberties team. Jamie focuses on the First and Fourth Amendment implications of new technologies. Jamie joined EFF in 2014 as a Frank Stanton Legal Fellow. Prior to joining EFF, Jamie clerked for Judge Saundra Brown Armstrong in the Northern District of California. Before her clerkship, she was a litigation associate at Paul Hastings LLP and an attorney law clerk at the Alameda County Public Defender. Jamie has a J.D. from the University of California, Berkeley School of Law (Boalt Hall) and a B.A. in journalism from the University of Wisconsin, Madison. In her free time, she enjoys being outdoors and collecting cool rocks.