UK Home Office Backs Police Trials of Controversial, Human Rights-Violating Facial Recognition Tech

© AP Photo / Eric Risberg. In this photo taken Tuesday, May 7, 2019, is a security camera in the Financial District of San Francisco. San Francisco is on track to become the first U.S. city to ban the use of facial recognition by police and other city agencies as the technology creeps increasingly into daily life.
The system, currently in use by the Metropolitan and South Wales police forces, is supplied by Japanese company NEC - other customers include retailers, stadiums and casinos, which use it to scour crowds for “potential troublemakers”.

Home Secretary and failed Conservative leadership candidate Sajid Javid has backed police trials of NeoFace, the controversial facial recognition technology.

The Home Office claims there is an adequate legal framework to support and facilitate its use, but says it is reviewing ways to simplify and extend governance and oversight of biometrics. Moreover, the tool is said to be potentially “game-changing” in the fight against online child abuse, speeding up investigations and limiting the number of indecent images officers have to review.

The £1.76-million technology aims to improve the capability of the Child Abuse Image Database, which holds millions of images.

Pushing Ahead

British authorities have long been determined to push ahead with implementing facial recognition technology, despite significant criticism from public bodies, NGOs and campaigners.

For instance, earlier in July, University of Essex researchers were given access to six live trials run by the Metropolitan police - they found matches were correct in only a fifth of cases, and that the system was likely to breach human rights law.

“The legal basis for the trials was unclear and is unlikely to satisfy the ‘in accordance with the law’ test established by human rights law. It does not appear an effective effort was made to identify human rights harms or to establish the necessity of LFR. Ultimately, the impression is human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process,” report author Dr Daragh Murray said.

While shocking, the findings were hardly novel - when facial recognition systems were deployed at the 2017 Notting Hill Carnival, the technology produced just one accurate identification and 95 ‘false positives’, leading to several wrongful arrests.

The university’s findings also echoed the long-standing position of civil rights campaign groups such as Liberty, which describes facial recognition as “dangerously intrusive and discriminatory technology that destroys our privacy rights and forces people to change their behaviour”.

Likewise, campaign group Big Brother Watch has documented how South Wales Police stored for 12 months the biometric photos of 2,451 innocent people wrongly identified by its system, a policy that’s potentially unlawful. Facial recognition technology in any event operates in a legal grey area, not subject to regulation or statutory oversight, and Big Brother Watch, among others, strongly argues its use contravenes existing legislation.

"It's highly questionable whether the use of automated facial recognition is compatible with fundamental human rights — in particular, the rights to a private life and freedom of expression. The necessity of such biometric surveillance is highly questionable, and inherently indiscriminate scanning appears to be plainly disproportionate. As it stands, the risk automated facial recognition is fundamentally incompatible with people's rights under the Human Rights Act 1998 is yet to be considered," the rights group has written.

Tony Porter, the surveillance camera commissioner, went as far as forcing Greater Manchester Police to end their use of facial recognition systems at the Trafford Centre shopping complex in October 2018.

The force had captured as many as 15 million faces over a six-month period, analysing and storing the images of potentially millions of people without any official announcement, or consent from those surveilled.
