The use of live, or automated, facial recognition (LFR) software by law enforcement in public places is increasing rapidly in the UK, enabling the surveillance of thousands of people at speed.
Whilst the use of LFR software to locate serious offenders has obvious benefits to law enforcement authorities, concerns have been raised by civil liberties groups and the Information Commissioner’s Office (ICO) about infringement of privacy and inherent bias in the technology.
LFR software analyses a person’s facial features and then compares its findings against faces stored in databases known as “watchlists” to try to find a match. A watchlist may include suspects, missing people and persons of interest.
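As a rough illustrative sketch only (not a description of any real police or vendor system), the matching step can be pictured as reducing each face to a numeric feature vector (an “embedding”) and comparing it against pre-computed embeddings for each watchlist entry; the names, vectors and threshold below are entirely hypothetical:

```python
from math import sqrt

def cosine_similarity(a, b):
    # Compare two embeddings by the angle between them (1.0 = identical direction)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_match(probe, watchlist, threshold=0.9):
    """Return the watchlist entry most similar to the scanned face,
    but only if the similarity clears the alert threshold."""
    best_name, best_score = None, threshold
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means no alert is raised

# Hypothetical watchlist of pre-computed embeddings
watchlist = {
    "suspect_A": [0.9, 0.1, 0.3],
    "missing_person_B": [0.2, 0.8, 0.5],
}

print(find_match([0.88, 0.12, 0.31], watchlist))  # similar to suspect_A
print(find_match([0.0, 0.0, 1.0], watchlist))     # no match above threshold
```

The alert threshold is the key policy lever: set too low, it produces false matches of innocent passers-by; set too high, it misses genuine matches — which is why the accuracy and bias concerns discussed below matter.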
In August 2016 the Metropolitan Police Service deployed LFR technology at the Notting Hill Carnival. Since June 2017 South Wales Police has deployed LFR on a number of occasions at sports matches and other large events. More recently, in February 2020 the Metropolitan Police started using LFR cameras on London streets.
In September 2019 the High Court gave its judgment on a judicial review claim concerning the use of LFR technology by South Wales Police.
This was a significant decision and the first time the use of LFR technology has been scrutinised by our courts.
The face of a member of the public, Edward Bridges, was scanned by South Wales Police’s LFR technology – known as AFR Locate – on two occasions during a pilot project: while he was Christmas shopping in 2017 and while he attended a peaceful anti-arms protest in 2018.
Mr Bridges claimed that the processing of his image violated his privacy and data protection rights and caused him distress.
The main issue was whether existing rules and regulations were sufficient to ensure that the use of LFR is non-arbitrary and suitable for its purpose.
In its decision, the High Court found that South Wales Police’s use of LFR technology was lawful, and the claim for judicial review was dismissed on all grounds.
The layers of rules and regulations (primary legislation, codes of practice and local policies) that govern South Wales Police’s broad common law powers, and accordingly their use of this technology, were a key factor in the finding that the current legal framework adequately governs the proper use of LFR.
The case was decided on its facts and should not be seen as a general permission to use LFR technology. An appeal against the decision is pending, which we will follow with interest.
This decision has highlighted the difficult balance between individuals’ rights to privacy and the need for police powers to keep us safe.
Since this decision the ICO has issued an opinion on the use of this technology by law enforcement in public places and has published a report on its investigation into such use. The ICO’s opinion makes the following points:
- Data protection law applies to use of LFR because it involves the processing of personal data
- The use of LFR for law enforcement purposes is sensitive processing because it involves processing the biometric data of members of the public for the purpose of uniquely identifying an individual
- Controllers must identify a lawful basis for LFR and have an appropriate policy document in place at the time of processing
- In the absence of individual consent, the processing must be “strictly necessary” for the law enforcement purposes.
The ICO has raised significant concerns about watchlists compiled using custody images that should have been deleted from police systems, or from images of uncertain provenance (for example, sourced from social media) where accuracy may be an issue.
The ICO also raised concerns about the potential for inherent technical bias (for example gender or ethnicity bias) in the way the LFR technology works.
In the not-too-distant future, we can expect to see a binding code of practice issued for police and other law enforcement agencies on the use of LFR and similar biometric technologies.
LFR technology can enable the surveillance of individuals on a massive scale. It has the potential to be a force for good – maintaining public security by identifying and aiding the apprehension of criminals.
However, adequate safeguards must be put in place for this technology; otherwise it could be used intrusively or inconsistently by law enforcement organisations, which could undermine public confidence in its use.