The Federal Bureau of Investigation announced on Monday that its facial recognition software, Next Generation Identification (NGI), is "fully operational," cementing the launch of a program that civil rights groups warn could risk turning innocent civilians into criminal suspects.
"The IPS facial recognition service will provide the nation’s law enforcement community with an investigative tool that provides an image-searching capability of photographs associated with criminal identities," the bureau said in a press release. "This effort is a significant step forward for the criminal justice community in utilizing biometrics as an investigative enabler."
NGI was initially developed to expand the FBI's biometric identification capabilities and will eventually replace the bureau's current fingerprinting system. The program's database holds more than 100 million individual records that link fingerprints, iris scans, and facial-recognition data with personal information, legal status, home addresses, and other private details, and is projected to contain 52 million facial-recognition images by 2015. A single individual may be linked to multiple images, including photos drawn from employment records, DMV files, and background-check databases.
Civil rights and privacy watchdogs have criticized the program as both invasive and inaccurate. The system, a billion-dollar investment developed with Lockheed Martin over three years, was found by the Electronic Frontier Foundation to identify the wrong individual 20 percent of the time, a rate expected to rise as the database expands. A separate report obtained by the Electronic Privacy Information Center (EPIC) found that the system achieved only an 85 percent success rate even when searches were run against clear, front-facing images with no obstructions.
By compiling mugshots and DMV photos in the same database, the bureau risks identifying citizens with no records as potential criminal suspects, EFF said, adding, "This is not how our system of justice was designed and should not be a system that Americans tacitly consent to move towards."
Sen. Al Franken (D-Minnesota), former chairman of the Senate Subcommittee on Privacy, Technology, and the Law, said at a July 2012 congressional hearing on the facial recognition program that it "could be abused to not only identify protesters at political events and rallies, but to target them for selective jailing and prosecution."
"Data accumulation and sharing can be good for solving crimes across jurisdictions or borders, but can also perpetuate racial and ethnic profiling, social stigma, and inaccuracies throughout all systems and can allow for government tracking and surveillance on a level not before possible," EFF said in a letter (pdf) to Attorney General Eric Holder.
The FBI defended the program by stating that NGI, which is used by law enforcement agencies throughout the country, was never intended to deliver a single accurate match, but rather to return a list of possible "candidates."
"The candidate list is an investigative lead not an identification," the bureau told EFF, claiming that because the system does not single anyone out, "there is no false positive rate."
Jeramie Scott, National Security Counsel at EPIC, told Common Dreams that the FBI was "again moving forward with invasive surveillance technology without first fully addressing the privacy risks."
"Facial recognition, which enables the identification of individuals without consent, has far-reaching implications for personal privacy and presents a real risk of mass surveillance," Scott said. "The E-Government Act of 2002 requires privacy assessments for technology that collects personal information, and over two years ago the FBI committed to a privacy assessment of its use of facial recognition technology. The FBI should suspend the use of facial recognition until the agency follows through with its commitment to assess and mitigate the privacy risks associated with facial recognition."
Privacy watchdogs also expressed concern over individual features of the NGI program, such as the Rap Back service, which EPIC said "equates to an ongoing, continual background check."
"It is not only used to monitor whether people under correctional supervision are arrested again, but to constantly monitor civilians in various trusted positions (e.g. teachers or bank tellers)," EPIC wrote in a blog post. "Civilians under Rap Back monitoring must submit their fingerprints and potentially photos too, as NGI now allows photo submissions for civilian entries."
There are currently no federal laws limiting the use of facial-recognition software.