"Nobody is forcing these companies to supply more sensitive image-recognition technology to those who might use it in violation of human or civil rights." (Photo: Amazon)
EFF has joined the ACLU and a coalition of civil liberties organizations demanding that Amazon stop powering a government surveillance infrastructure. Last week, we signed onto a letter to Amazon condemning the company for developing a new face recognition product that enables real-time government surveillance through police body cameras and the smart cameras blanketing many cities. Amazon has been heavily marketing this tool--called "Rekognition"--to law enforcement, and it's already being used by agencies in Florida and Oregon. This system affords the government vast and dangerous surveillance powers, and it poses a threat to the privacy and freedom of communities across the country. That includes many of Amazon's own customers, who represent more than 75 percent of U.S. online consumers.
As the joint letter to Amazon CEO Jeff Bezos explains, Amazon's face recognition technology is "readily available to violate rights and target communities of color." And as we've discussed extensively before, face recognition technology like this allows the government to amp up surveillance in already over-policed communities of color, continuously track immigrants, and identify and arrest protesters and activists. This technology will not only invade our privacy and unfairly burden minority and immigrant communities, but it will also chill our free speech.
Amazon should stand up for civil liberties, including those of its own customers, and get out of the surveillance business.
Since the ACLU sounded the alarm, others have started to push back on Amazon. The Congressional Black Caucus wrote a separate letter to Bezos last week, stating, "We are troubled by the profound negative unintended consequences this form of artificial intelligence could have for African Americans, undocumented immigrants, and protesters." The CBC pointed out the "race-based 'blind spots' in artificial intelligence" that result in higher numbers of misidentifications for African Americans and women than for whites and men, and called on Amazon to hire more lawyers, engineers, and data scientists of color. Two other members of Congress followed up with another letter on Friday.
Amazon's partnership with law enforcement isn't new. Amazon already works with agencies across the country, offering cloud storage services through Amazon Web Services (AWS) that allow agencies to store the extremely large video files generated by body and other surveillance cameras. Rekognition is an inexpensive add-on to AWS, costing agencies approximately $6-$12 per month.
Rekognition doesn't just identify faces. It can also track people through a scene, even if their faces aren't visible. It can identify and catalog a person's gender, what they're doing, what they're wearing, and whether they appear happy or sad. It can identify other things in a scene, like dogs, cars, or trees, and can recognize text, including street signs and license plates. It also offers to flag things it considers "unsafe" or "inappropriate."
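To make concrete how little engineering any of this requires, here is a minimal sketch of what such an analysis could look like against Amazon's publicly documented Rekognition API, using the boto3 Python SDK. The image file name is hypothetical, and this is an illustration of the advertised capabilities, not code from any agency:

```python
# Hypothetical sketch: querying Amazon Rekognition's documented analysis APIs
# with boto3. "street_scene.jpg" is a made-up example image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("street_scene.jpg", "rb") as f:
    image = {"Bytes": f.read()}

# Face analysis: estimated gender, age range, and emotion for each face found.
faces = rekognition.detect_faces(Image=image, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    print(face["Gender"]["Value"], face["AgeRange"], face["Emotions"][0]["Type"])

# Object and scene labels: dogs, cars, trees, and so on.
labels = rekognition.detect_labels(Image=image, MaxLabels=10)
print([label["Name"] for label in labels["Labels"]])

# Text in the scene, such as street signs or license plates.
text = rekognition.detect_text(Image=image)
print([t["DetectedText"] for t in text["TextDetections"] if t["Type"] == "LINE"])

# Content the service flags as "unsafe" or "inappropriate."
moderation = rekognition.detect_moderation_labels(Image=image)
print([m["Name"] for m in moderation["ModerationLabels"]])
```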
And the technology is powerful, if Amazon's marketing materials are accurate. According to the company, Rekognition can identify people in real-time by instantaneously searching databases containing tens of millions of faces, detect up to 100 people in "challenging crowded" images, and track people through video--within a single shot and across multiple shots, and even when the camera is in motion--which makes "investigation and monitoring of individuals easy and accurate" for "security and surveillance applications." Amazon has even advertised Rekognition for use on police officer "bodycams." (The company took mention of bodycams off its website after the ACLU voiced concern, but "[t]hat appears to be the extent of its response[.]")
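The real-time identification Amazon markets is built on face "collections" that an agency populates itself. A hedged sketch of that workflow, again using only documented boto3 calls (the collection name, S3 bucket, file names, and match threshold below are hypothetical), might look like this:

```python
# Hypothetical sketch of the search workflow Amazon markets: build a face
# collection, index known faces into it, then match faces from new footage.
# The collection ID, bucket, keys, and threshold are illustrative only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# One-time setup: a collection the agency controls.
rekognition.create_collection(CollectionId="watchlist-demo")

# Index a known face (e.g., a booking photo stored in S3) into the collection.
rekognition.index_faces(
    CollectionId="watchlist-demo",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "booking/person-001.jpg"}},
    ExternalImageId="person-001",
)

# Later: check a frame pulled from camera footage against the collection.
with open("camera_frame.jpg", "rb") as f:
    result = rekognition.search_faces_by_image(
        CollectionId="watchlist-demo",
        Image={"Bytes": f.read()},
        FaceMatchThreshold=80,
        MaxFaces=5,
    )

for match in result["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], round(match["Similarity"], 1))
```

Whether an agency searches stored frames like this or uses the streaming-video variant of the same service, the mechanics are the same: a database of faces the agency assembles, matched against whatever its cameras see.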
This is an example of what can go wrong when police departments unilaterally decide what privacy invasions are in the public interest, without any public oversight or input. That's why EFF supports Community Control Over Police Surveillance (CCOPS) measures, which ensure that local police can't make deals with surveillance technology companies without going through local city councils and the public. People deserve a say in what types of surveillance technology police use in their communities, and what policies and safeguards the police follow. Further, governments make more balanced, accountable decisions about surveillance when communities and elected officials are involved in the decision-making process.
Amazon responded to the uproar surrounding the announcement of its government surveillance work by defending the usefulness of the program, noting that it has been used to find lost children in amusement parks and to identify faces in the crowd at the royal wedding. But it failed to grapple with the bigger issue: as one journalist put it, "Nobody is forcing these companies to supply more sensitive image-recognition technology to those who might use it in violation of human or civil rights."
Amazon should stand up for civil liberties, including those of its own customers, and get out of the surveillance business. It should cut law enforcement off from using its face recognition technology, not help usher in a surveillance state. And communities across the country should demand baseline measures to stop law enforcement from acquiring and using powerful new surveillance systems without any public oversight or accountability in the future.