Jan 23, 2020
Privacy advocates in the United Kingdom responded with alarm Friday to an announcement that the Metropolitan Police plans to use live facial recognition cameras at specific London locations "to try to locate and arrest wanted people."
"Rolling out a mass surveillance tool that has been rejected by democracies and embraced by oppressive regimes is a dangerous and sinister step."
--Clare Collier, Liberty
"This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the U.K.," Silkie Carlo, director of the London-based privacy campaign group Big Brother Watch, declared in a statement.
"This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary," Carlo added. "This move instantly stains the new government's human rights record and we urge an immediate reconsideration."
"We condemn the decision of @metpoliceuk to roll out Facial Recognition technology in London. Considering how the Prevent programme has been used to target activists and minorities, we have little confidence that this won't be used in the same way. https://t.co/nQapSSjV22" —YouthStrike4Climate (@YouthStrike4Climate), January 24, 2020
The rights advocacy group Liberty, based in London, issued a similarly scathing statement slamming the Met's decision.
"This is a dangerous, oppressive, and completely unjustified move by the Met," said Clare Collier, Liberty's advocacy director. "Facial recognition technology gives the state unprecedented power to track and monitor any one of us, destroying our privacy and our free expression."
"Rolling out a mass surveillance tool that has been rejected by democracies and embraced by oppressive regimes is a dangerous and sinister step," she warned. "It pushes us towards a surveillance state in which our freedom to live our lives free from state interference no longer exists."
Liberty, SC Media UK noted, "represented activist Ed Bridges in his failed attempt to stop South Wales Police (SWP) from using automatic facial recognition (AFR)." The outlet explained:
Ed Bridges last year lost his case against South Wales Police over the use of AFR. His crowdfunded appeal was the world's first legal challenge over police use of facial recognition technology.
The court accepted the police's argument that because the AFR apparatus is placed in public, its use is not a form of covert surveillance that would contravene the Regulation of Investigatory Powers Act 2000, which states that "surveillance is covert if, and only if, it is carried out in a manner that is calculated to ensure that persons who are subject to the surveillance are unaware that it is or may be taking place."
The Met's announcement presents plans to use live facial recognition (LFR) technology as in line with the court's ruling, saying that "the Met will begin operationally deploying LFR at locations where intelligence suggests we are most likely to locate serious offenders. Each deployment will have a bespoke 'watch list,' made up of images of wanted individuals, predominantly those wanted for serious and violent offenses."
"At a deployment, cameras will be focused on a small, targeted area to scan passers-by," the statement continued. "The cameras will be clearly signposted and officers deployed to the operation will hand out leaflets about the activity. The technology, which is a standalone system, is not linked to any other imaging system, such as CCTV, body worn video, or ANPR."
The Met's assistant commissioner, Nick Ephgrave, framed the move as "an important development for the Met and one which is vital in assisting us in bearing down on violence." Ephgrave said that "we are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point."
Critics, however, pushed back against the suggestion that LFR technology is reliable. BBC News reported:
Trials of the cameras have already taken place on 10 occasions in locations such as Stratford's Westfield shopping center and the West End of London.
The Met said that in these trials, 70% of wanted suspects in the system who walked past the cameras were identified, while only one in 1,000 people generated a false alert.
But an independent review of six of these deployments found that only eight out of 42 matches were "verifiably correct."
Referencing those findings, Big Brother Watch's Carlo said Friday that moving forward with the use of this technology "flies in the face of the independent review showing the Met's use of facial recognition was likely unlawful, risked harming public rights, and was 81% inaccurate."
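The 81% figure Carlo cites follows directly from the independent review's numbers: of 42 matches flagged by the system, only eight were verifiably correct, meaning the remaining 34 were erroneous. A quick sketch of that arithmetic:

```python
# Figures from the independent review of six Met trial deployments:
# 42 total matches flagged, 8 verifiably correct.
total_matches = 42
correct_matches = 8

# Share of flagged matches that were NOT verifiably correct.
error_rate = (total_matches - correct_matches) / total_matches
print(f"{error_rate:.0%}")  # → 81%
```

Note that this is the error rate among matches the system flagged, a different measure from the "one in 1,000" false-alert rate the Met cites, which is computed against everyone who walked past the cameras.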
Sharing the Met's announcement on Twitter, Big Brother Watch said, "See you in court."
"See you in court." —Big Brother Watch (@BigBrotherWatch), January 24, 2020
The Met's move, the New York Times noted, "comes amid a worldwide debate about the use of facial recognition systems. Police departments contend that the software gives them a technological edge to catch criminals. Critics say the technology is an invasion of privacy and is being rolled out without adequate public debate."
Critics often highlight the technology's racial and gender biases. As Common Dreams reported last month, the U.S. government's first major federal study of facial recognition surveillance confirmed civil liberties groups' warnings that the technology is most accurate for middle-aged white men and frequently misidentifies people in other demographic groups, particularly people of color.
In the absence of federal regulations on facial recognition technology in the United States, some cities have started banning local agencies from using it. The digital rights group Fight for the Future, which launched the Ban Facial Recognition campaign last year, maintains an interactive map of the U.S. that shows "where facial recognition surveillance is happening, where it's spreading to next, and where there are local and state efforts to rein it in."
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.