08:32 GMT +3, 20 September 2019

    New Rules Exempt FBI From Disclosing Contents of Biometric Database to Citizens


    In a move long-feared by privacy advocates, new rules have been adopted exempting the Next Generation Identification (NGI) system, the Federal Bureau of Investigation's database of biometric information on millions of American citizens, from Privacy Act safeguards.

    The database includes iris scans, photos, fingerprints and other information on individuals, much of which stems from sources entirely unrelated to law enforcement. Even individuals who have never been arrested or had any run-in with authorities could have biometric data stored in the NGI system, which hoovers up data from background checks conducted on job applicants, welfare recipients, and state-licensed teachers, realtors, and dentists.

    In all, the database holds around 52 million photographs, searchable through facial recognition, and accessible by 20,000 foreign, federal, state, and municipal-level law enforcement agencies.

    There are also few restrictions on what types of data can be submitted to the system, who can access the data, and how the data can be used. For example, while the FBI has promised it will not allow images from social networking sites to be saved to the system, no legal or codified restrictions of any kind are in place to prevent exactly that.

    Privacy Act rules state any agencies with access to the database are legally required to inform individuals their data is stored on the system, but the FBI has sought an exception for some time, claiming that acknowledging it retains biometric records of individuals could affect investigations.

    "The NGI system also contains latent fingerprints, as well as other biometrics, and associated personal information that may be law enforcement or national security sensitive. Compliance could alert the subject of an authorized law enforcement activity about that particular activity and the interest of the FBI and/or other law enforcement agencies," the FBI said in a statement.

    The move has been long-feared by privacy advocacy and campaign groups. In July 2016, the Electronic Frontier Foundation (EFF), which filed an FOIA lawsuit in 2013 to obtain documents regarding the database, expressed its concern over the prospect.

    The EFF noted the FBI had amassed the database in the first place with little congressional or public oversight, and for years failed to provide the basic information about NGI required by law, including a detailed description of the records and its policies for maintaining them.

    "The FBI has sidestepped the Privacy Act as it has expanded NGI, essentially saying 'just trust us' with highly personal and private data — but the FBI hasn't proved itself worthy of the public's trust. Exemption will eliminate our rights to access our own records and take action against the government when it makes mistakes with that data. The Privacy Act is only the barest of protection for Americans, but the FBI wants to escape from even that basic responsibility," said EFF Senior Staff Attorney Jennifer Lynch at the time.

    Moreover, the EFF stated the FBI has refused to acknowledge the inaccuracy of its facial recognition technology, or to publish any data on the system's accuracy rates. Given that the included images are typically "well below" the recommended resolution of 0.75 megapixels necessary for accurate identification (newer iPhone cameras offer at least 8-megapixel resolution), it's perhaps unsurprising that research has shown the technology frequently misidentifies African Americans, ethnic minorities, women, and young people at higher rates than white citizens and men.

    As a result, errors within NGI will result in greater targeting of non-white US citizens — particularly given FBI databases include a disproportionate number of such individuals. The larger a facial recognition data set, the larger the scope for error — at 52 million images, the NGI effectively ensures many mistakes will be made.

    In 2014, the Electronic Privacy Information Center filed an FOIA lawsuit and obtained records that revealed the database had an error rate of up to 20 percent on facial recognition searches. 
