02:44 GMT+3, 22 May 2019

    Misidentification Problem: FBI's Facial Recognition Program Has Racial, Sex Bias

    © Photo: Pixabay

    The facial recognition database used by the FBI is dangerously inaccurate, the House Committee on Oversight and Government Reform revealed last week. Radio Sputnik discussed the issue with Dr. James Wayman, Director of the Biometric Identification Research Program at San Jose State University in California.

    According to the committee’s report, around half of all adult Americans, even those who have never committed any crime, have their pictures stored in databases that the Bureau uses to look for criminal suspects.

    Notably, around 80 percent of these photos are taken from sources like driver’s licenses and passports, and have been placed into these databases without the individuals’ consent.

    “The accuracy of the data in the database is really important. Studies done by the US government show that only one in a thousand biometric images in government databases is accurately labeled,” Dr. Wayman said.

    He also underscored the importance of accurate facial recognition algorithms that are used to compare photographs in the database.

    “The FBI doesn’t believe that these algorithms are accurate. Even if two faces are declared a match, that doesn’t mean they belong to one and the same person. Useful as it is in court, the assessment of facial algorithms is not as accurate as fingerprints,” he noted.

    The software program used by the FBI also appears to incorrectly identify black people more frequently than it does white people.

    When asked why facial recognition algorithms more often mismatch the faces of women and African-Americans, James Wayman said that the phenomenon is known as the “other race effect.”

    “It means that a person raised in a mono-ethnic community finds it hard to recognize the faces of people of other races. A system of facial recognition trained on, say, Caucasians, works well with Caucasians. One trained on Asians does a very good job recognizing Asian faces,” he observed.

    The problem, however, is that these systems are often not trained to recognize the faces of women and African-Americans.
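The effect Wayman describes can be illustrated with a toy sketch. The code below is purely illustrative and is not the FBI’s system or any real recognition algorithm: identities are random feature vectors, and a group “enrolled” from many example images is identified more reliably than a group enrolled from a single image, mimicking how under-representation in training data raises misidentification rates.

```python
import random

def simulate(train_per_id, seed, n_ids=30, test_per_id=40, dim=8, noise=0.6):
    """Toy identification experiment (hypothetical, for illustration only).

    Each identity is a 'true' feature vector; an enrolled template is the
    average of `train_per_id` noisy example images. Held-out images are
    identified by nearest template. Fewer enrolment images per identity
    means noisier templates and more misidentifications.
    """
    rng = random.Random(seed)
    centres = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_ids)]

    def noisy(c):
        # a noisy 'image' (feature vector) of identity c
        return [x + rng.gauss(0, noise) for x in c]

    # enrolment: template = average of the training images per identity
    templates = []
    for c in centres:
        imgs = [noisy(c) for _ in range(train_per_id)]
        templates.append([sum(v) / train_per_id for v in zip(*imgs)])

    def nearest(x):
        # identify by nearest enrolled template (squared Euclidean distance)
        return min(range(n_ids),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(x, templates[i])))

    correct = 0
    for i, c in enumerate(centres):
        for _ in range(test_per_id):
            if nearest(noisy(c)) == i:
                correct += 1
    return correct / (n_ids * test_per_id)

# Group modelled with many enrolment images vs. a group modelled with one,
# averaged over a few random seeds to smooth out noise
acc_well = sum(simulate(10, s) for s in range(3)) / 3
acc_poor = sum(simulate(1, s) for s in range(3)) / 3
print(round(acc_well, 2), round(acc_poor, 2))
```

Under this (assumed) model, the thinly enrolled group is misidentified more often even though the matching rule is identical for both, which is the point Wayman makes: the bias comes from the data the system is built on, not from the matching arithmetic itself.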

    Speaking about law enforcement’s use of biometric tools like facial recognition to solve crimes, he said that police should be allowed to use investigative tools such as face, fingerprint and voice recognition.

    “I still don’t think that someone should be put in jail because of the findings of a computer,” Dr. Wayman noted.

    Some say that the algorithms used by commercial companies are at least five years ahead of what the FBI has and are much more accurate too.

    Wayman agreed, saying that it takes the government some five years to train people and introduce new technology.

    “Commercial companies move much more rapidly in terms of acquiring technology and know-how, that’s why I’m not surprised that the FBI is lagging behind,” he said.

    Last week’s House Oversight Committee hearing revealed that “about 80 percent of photos in the FBI's network are non-criminal entries, including pictures from driver's licenses and passports.

    The algorithms used to identify matches are inaccurate about 15 percent of the time, and are more likely to misidentify black people than white people."

    Moreover, the algorithms used by the FBI are more likely to misidentify African-Americans as crime suspects.
