South Wales Police didn't provide much detail about the arrest, although it follows the force's announcement in April that it intended to scan the faces of people at strategic locations in and around Cardiff city center ahead of the UEFA Champions League final, played at the Millennium Stadium on June 3.
The arrest is understood to be unconnected to the Champions League and instead related to an outstanding warrant; the man's face was likely included in the South Wales Police database of custody images, which contains 500,000 mugshots, and was spotted by an AFR-equipped van.
As in other UK policing jurisdictions, South Wales uses hardware and software provided by NEC, the Japanese IT firm leading development of real-time facial recognition systems, which has been the technology partner for other UK police AFR trials.
In some UK police districts, facial recognition software has been integrated with closed-circuit cameras. The UK is already one of the most heavily surveilled nations in the world, with 5.9 million CCTV devices in the country, approximately one for every 11 citizens. Evidently, British criminals on the run are running out of places to hide.
However, it's not merely fugitives that South Wales Police (and presumably other forces in the country and the wider world) are hoping to catch. The force's top brass have previously indicated that NEC's AFR technology would support their emphasis on "early intervention" by allowing officers to identify vulnerability, challenge perpetrators, and prevent offenses before they happen.
While it's unclear how AFR technology could assist in the detection of crimes before they happen, or indeed what powers police have to reprimand or arrest individuals before they've committed a crime, such capabilities will be familiar to fans of the 2002 dystopian science fiction blockbuster Minority Report — and raise a host of ethical and philosophical issues.
Almost every technology can be used for good or ill, and AFR has many potentially beneficial applications, from assisting disaster relief to identifying missing people, but it also carries sinister implications. For the technology to be effective, AFR vans must scan the faces of everyone who passes to check for "matches," which raises an obvious question: will the images they capture also be stored?
Advocates may argue AFR is the 21st century's answer to fingerprinting, but unlike that previous forensic breakthrough, facial recognition does not require active individual participation: facial data can be gathered without a subject even knowing about it.
This could deal a further hammer blow to anonymity the world over. Making it easier to scan crowds, and to scan the faces of the public in perpetuity, would allow authorities to match facial images to an individual's wider digital footprint: Facebook accounts, work profiles, and the like. It would also make it far easier to monitor groups that behave legally but are nonetheless subject to law enforcement surveillance, such as protesters.
Moreover, as with all technology, laws and regulations lag behind AFR development. There's currently little legislation covering its deployment by law enforcement anywhere in the world, creating a grey area ripe for abuse: the questions of what rights individuals have regarding facial recognition, and how the technology can be used by police, have so far gone unanswered. Do individuals erroneously suspected of a crime have a right to have their faces forgotten after acquittal?
However, it may be of some relief to privacy campaigners that facial recognition databases in the US have been found to be dangerously inaccurate — albeit no relief at all to individuals wrongfully arrested as a result.
According to testimony before the House Committee on Oversight and Government Reform, around half of all adult Americans, even those who have never committed or been suspected of any crime, have their pictures stored in databases that authorities use to look for criminal suspects.
Around 80 percent of these photos are taken from sources such as driver's licenses and passports, and have been placed into these databases without the individual's consent. Only one in a thousand images in these databases is accurately labeled, and the Federal Bureau of Investigation itself doesn't believe official recognition algorithms to be accurate: in essence, even if two faces match, it doesn't mean they belong to the same person.
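A simple base-rate calculation shows why a "match" against a large database is weak evidence on its own. The sketch below uses the 500,000-mugshot figure cited above; the 0.1 percent false-match rate is a purely illustrative assumption, not a published figure for any police system.

```python
# Illustrative only: the false-match rate below is an assumption,
# not a published figure for any real AFR deployment.
database_size = 500_000    # custody images, per the figure cited in the article
false_match_rate = 0.001   # assumed: 0.1% chance of wrongly matching any one image

# Expected number of innocent people flagged when a single scanned face
# is compared against every image in the database.
expected_false_matches = database_size * false_match_rate
print(expected_false_matches)  # → 500.0
```

Even a matcher that is wrong only once per thousand comparisons would, under these assumptions, flag hundreds of innocent people for every face scanned, which is why a database hit alone cannot establish identity.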
The software has also been found to misidentify black people more frequently than white people.