The FBI is planning to build out a facial recognition system that can query a huge database of photos to identify someone based on his or her appearance regardless of criminal history, reports the Electronic Frontier Foundation.
This is part of the FBI’s NGI system — Next Generation Identification — which could hold data on as many as one in three Americans. Privacy advocates are up in arms, of course. You only need to consider the overwhelming backlash we saw when facial recognition apps became available for Google Glass — they were summarily banned.
This new arm of the NGI database will build on the FBI’s already impressive fingerprint collection of approximately 100 million records, some of which include retina scans and palm prints. Facial data will now join that trove, linked to personal information such as name, address, age, and race.
By 2015, the system will be querying up to 52 million photos in order to identify people of interest. Of those, 46 million will come from criminal images such as mugshots. Another 4.3 million are “civil images” from other sources, and 215,000 come from RISC, the Repository for Individuals of Special Concern. But the FBI doesn’t say where the last million photos or so come from: 750,000 fall under a “Special Population Cognisant” category and 215,000 under “New Repositories,” neither of which it defines.
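As a quick sanity check on those figures (the counts are from the EFF’s breakdown; the arithmetic below is ours), the named categories account for roughly 51.5 million of the 52 million photos, with nearly a million sitting in the two undefined buckets:

```python
# Photo counts from the EFF's breakdown of the FBI's 52 million target.
criminal = 46_000_000   # mugshots and other criminal images
civil = 4_300_000       # "civil images" from other sources
risc = 215_000          # Repository for Individuals of Special Concern
spc = 750_000           # "Special Population Cognisant" (undefined)
new_repos = 215_000     # "New Repositories" (undefined)

total_named = criminal + civil + risc + spc + new_repos
undefined = spc + new_repos

print(f"Named categories total: {total_named:,}")  # 51,480,000
print(f"Undefined share: {undefined:,}")           # 965,000
```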
Writes the Electronic Frontier Foundation: “[T]he FBI does not define either the ‘Special Population Cognisant’ database or the ‘new repositories’ category. This is a problem because we do not know what rules govern these categories, where the data comes from, how the images are gathered, who has access to them, and whose privacy is impacted.”
This plan is already underway in a few states with many more right on their heels. The EFF provides this map to help you determine if your home state gives a rip about your privacy.
Personally, we’re most bothered by the FBI’s assertion that the system will not actually “make positive identifications,” but will instead generate an “investigative lead.” It reads like a semantic dodge that lets the Bureau claim its facial recognition software can never make mistakes: “Therefore, there is no false positive [identification] rate.”
The EFF explains it this way:
[T]he FBI only ensures that “the candidate will be returned in the top 50 candidates” 85 per cent of the time “when the true candidate exists in the gallery.”
It is unclear what happens when the “true candidate” does not exist in the gallery — does NGI still return possible matches? Could those people then be subject to criminal investigation for no other reason than that a computer thought their face was mathematically similar to a suspect’s? This doesn’t seem to matter much to the FBI — the Bureau notes that because “this is an investigative search and caveats will be prevalent on the return detailing that the [non-FBI] agency is responsible for determining the identity of the subject, there should be NO legal issues.”
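To make the scale of that concrete, here is a rough back-of-envelope calculation (our own, not the FBI’s or the EFF’s), assuming each search returns a list of exactly 50 candidates and that the 85 per cent figure is the chance the true match appears in that list when it exists in the gallery at all. At most one returned face can be the suspect, so each search surfaces at least 49 people who are not:

```python
# Back-of-envelope: how many non-matches does each NGI-style search surface?
# Assumptions (ours, for illustration): every search returns exactly 50
# candidates, and 85% is the chance the true match appears in that list
# when it exists in the gallery.
LIST_SIZE = 50
RECALL_AT_50 = 0.85

def expected_innocents_per_search(p_true_in_gallery: float) -> float:
    """Expected number of returned candidates who are NOT the suspect."""
    # With probability p_true_in_gallery * RECALL_AT_50 the list contains
    # the true match (49 innocents); otherwise all 50 are innocents.
    p_hit = p_true_in_gallery * RECALL_AT_50
    return p_hit * (LIST_SIZE - 1) + (1 - p_hit) * LIST_SIZE

# Even in the best case (true match always present in the gallery):
print(expected_innocents_per_search(1.0))  # 49.15
# Searching for someone who isn't in the gallery at all:
print(expected_innocents_per_search(0.0))  # 50.0
```

In other words, under these assumptions the list is almost entirely non-matches even when the system works as advertised, which is why the “no false positive rate” framing matters.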