
AI watchdog slams government over facial recognition rollout


The UK government has been criticised over its rollout of facial recognition technology.

An artificial intelligence (AI) research body has criticised the UK government for accelerating the deployment of facial recognition technology without first establishing a comprehensive legal framework.

The Ada Lovelace Institute has warned the increasing use of live facial recognition (LFR) by police and retailers across the UK is happening in a “legislative void”, raising urgent concerns about privacy, transparency and accountability.

The warning comes as the government pushes ahead with installing permanent LFR cameras in Croydon, south London, as part of a long-term policing trial this summer.

Fragmented oversight

Since 2020, nearly 800,000 faces have been scanned by the Metropolitan Police and over £10m has been spent on facial recognition-equipped vehicles, according to the Home Office.

But critics have said the legal basis for these operations remains shaky. The only significant legal ruling so far, the 2020 Bridges v South Wales Police case, found the use of LFR unlawful due to “fundamental deficiencies” in existing laws.

Michael Birtwistle, associate director at the Ada Lovelace Institute, described the regulatory situation as “doubly alarming”.

“The lack of adequate governance framework for police use of facial recognition – one of its most visible and high stakes applications – not only puts the legitimacy of police deployments into question, but also exposes how unprepared our broader regulatory regime is”, he said.

The institute’s latest report highlighted how fragmented UK biometric laws have failed to keep pace with rapid developments in AI-powered surveillance.

It also pointed to the dangers posed by newer technologies like “emotion recognition”, which attempt to interpret mental states in real time.

Nuala Polo, UK policy lead at the institute, said that while police often assert their use of the technology is lawful under current human rights and data protection laws, “these claims are almost impossible to assess outside of retrospective court cases”.

She also said “it is not credible to say that there is a sufficient legal framework in place”.

Privacy campaigners have also echoed the call for reform. Sarah Simms of Privacy International said the lack of specific legislation has made the UK a global “outlier”.

Expanding use of facial recognition cameras

The rapid expansion of the technology was recently revealed in a joint investigation by The Guardian and Liberty Investigates, which found that nearly five million faces were scanned by police across the UK last year, resulting in over 600 arrests.

The technology is now also being trialled in retail and sports settings, with retailers like Asda, Budgens, and Sports Direct adopting facial recognition systems to deter theft.

However, civil liberties organisations have said the practice risks misidentification, particularly of ethnic minorities, and could deter lawful public protest.

“We’re in a situation where we’ve got analogue laws in a digital age”, said Charlie Welton of Liberty, who also warned the UK was lagging behind Europe and the US, where several jurisdictions have banned or limited LFR.

Despite the mounting criticism, the Home Office has defended the technology as “an important tool in modern policing”.

Policing minister Dame Diana Johnson recently acknowledged in Parliament that “very legitimate concerns” exist and accepted that the government may need to consider a “bespoke legislative framework” for the use of LFR.

No concrete proposals have yet been announced.




