Facial recognition technology – protecting privacy in a world of surveillance

The use of live or automated facial recognition (LFR) software by law enforcement in public places is increasing rapidly in the UK. It enables the rapid surveillance of thousands of people. However, the increased use of LFR also highlights the tricky balance between an individual’s right to privacy and the need for police powers to keep us safe.

Whilst the use of LFR software to locate serious offenders has obvious benefits to law enforcement authorities, concerns have been raised by civil liberties groups and the Information Commissioner’s Office (ICO) about infringement of privacy and inherent bias in the technology.

LFR software analyses a person’s facial features and then compares its findings against the faces stored in a database (known as a “watchlist”) to try to find a match. The database may include suspects, missing people and persons of interest.
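At its simplest, that matching step can be pictured as comparing a numeric “embedding” of the scanned face against the embeddings of the faces on the watchlist, and raising an alert only if the similarity clears a threshold set by the operator. The sketch below is purely illustrative and not a description of any particular force’s system; it assumes the face images have already been converted into embedding vectors by some unspecified model, and the 0.6 threshold is an arbitrary placeholder.

```python
import numpy as np

# Purely illustrative sketch of watchlist matching. The embeddings and the
# 0.6 threshold are hypothetical stand-ins, not any real deployment's values.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (closer to 1.0 = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> str | None:
    """Return the watchlist identity most similar to the scanned face,
    but only if the similarity clears the alert threshold."""
    best_id, best_score = None, threshold
    for identity, stored in watchlist.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id  # None means no alert is raised to a human operator
```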

In August 2016 the Metropolitan Police Service deployed LFR technology at the Notting Hill Carnival. Since June 2017 South Wales Police has deployed LFR on a number of occasions at sports matches and other large events. More recently, in February 2020, the Metropolitan Police started using LFR cameras on London streets.

The face of a member of the public, Edward Bridges, was scanned by South Wales Police’s LFR technology, known as AFR Locate, during a pilot project: first while he was shopping for Christmas in 2017 and again while he attended a peaceful anti-arms protest in 2018.

Mr Bridges claimed that the processing of his image violated his privacy and data protection rights and caused him distress, and he brought a claim for judicial review against South Wales Police.

The resulting case was significant: it was the first time the use of LFR technology had been scrutinised by our courts.

In its decision, the High Court found that South Wales Police’s use of LFR technology was lawful. However, that decision was appealed and in August 2020 the Court of Appeal overturned parts of the High Court’s decision, holding that South Wales Police’s use of AFR technology was not in accordance with the law for the purposes of Article 8 of the European Convention on Human Rights and was in breach of the Equality Act 2010 and the Data Protection Act 2018.

Amongst other things, the Court of Appeal held that:

  • the data protection impact assessment conducted by South Wales Police failed to properly assess the rights and freedoms of data subjects and failed to address the measures envisaged to mitigate the risks arising from the use of the LFR; and
  • South Wales Police had not done all it reasonably could to fulfil its public sector equality duty: the force had never investigated whether AFR technology had an unacceptable bias on grounds of race or gender, although there was no clear evidence that it was in fact biased on these grounds.

In its statement on the Court of Appeal judgment, an ICO spokesperson said:

“We welcome the Court of Appeal’s judgment that provides clarification on the police use of live facial recognition technology in public places.

“Facial recognition relies on sensitive personal data and balancing people’s right to privacy with the surveillance technology the police need to carry out their role effectively is challenging. But for the public to have trust and confidence in the police and their actions there needs to be a clear legal framework. Today’s judgment is a useful step towards providing that.”

This decision has highlighted the difficult balance between individuals’ rights to privacy and the need for police powers to keep us safe.

Earlier this year the ICO issued an opinion on the use of this technology by law enforcement in public places. The ICO’s opinion makes the following points:

  • Data protection law applies to the use of LFR because it involves the processing of personal data;
  • the use of LFR for law enforcement purposes is sensitive processing because it involves processing the biometric data of members of the public for the purpose of uniquely identifying an individual;
  • controllers must identify a lawful basis for LFR and have an appropriate policy document in place at the time of processing; and
  • in the absence of individual consent, the processing must be “strictly necessary” for law enforcement purposes.

The ICO has raised significant concerns about the creation of watchlists compiled from custody images that should have been deleted from police systems, or from images of uncertain provenance whose accuracy may be in doubt (for example, images sourced from social media).

The ICO also raised concerns about the potential for inherent technical bias (for example gender or ethnicity bias) in the way the LFR technology works.
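One way such bias can be examined (by way of illustration only, not any force’s actual methodology) is to compare false-match rates across demographic groups at the same alert threshold, as in the sketch below; the group labels and scores shown are hypothetical.

```python
import numpy as np

# Illustrative only: compare false-match rates across demographic groups.
# The scores below are hypothetical similarity scores from comparisons of
# *different* people - any score at or above the threshold is a false match.

def false_match_rate(non_mated_scores: np.ndarray, threshold: float) -> float:
    """Fraction of different-person comparisons wrongly flagged as matches."""
    return float(np.mean(non_mated_scores >= threshold))

threshold = 0.6
scores_by_group = {                      # hypothetical evaluation data
    "group_a": np.array([0.21, 0.35, 0.62, 0.44, 0.58]),
    "group_b": np.array([0.71, 0.39, 0.65, 0.52, 0.47]),
}
for group, scores in scores_by_group.items():
    print(f"{group}: false-match rate = {false_match_rate(scores, threshold):.2f}")
```

A material gap in these rates between groups, at the threshold actually used in deployment, would be the kind of inherent technical bias the ICO has warned about.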

In the not-too-distant future, we can expect to see a binding code of practice issued for police and other law enforcement agencies on the use of LFR and similar biometric technologies.

LFR technology can enable the surveillance of individuals on a massive scale. It has the potential to be a force for good – maintaining public security by identifying criminals and aiding their apprehension.

However, adequate safeguards must be put in place for this technology; otherwise it could be used intrusively or inconsistently by law enforcement organisations, which could undermine public confidence in its use.

Find out more about our commercial solicitors.

