The future of facial recognition technology

Published: 27th August 2019
Area: Corporate & Commercial
Author(s): Andrew Hartshorn

In the first major UK legal challenge to police use of automated facial recognition (AFR) surveillance, the Divisional Court ruled that the use of such technology was lawful.

However, a judgment in Sweden last month, together with recent investigations announced by the Information Commissioner, means that the use of AFR remains problematic.

Background

In the UK, and across much of Europe, the legal framework governing facial recognition is based on the GDPR, which permits the use of AFR only where one of a number of lawful bases is satisfied.

However, in the US, some cities, such as San Francisco, have banned facial recognition technology for law enforcement purposes. Politicians such as Bernie Sanders wish to ban its use nationally for everything other than private purposes.

United Kingdom

The civil rights group Liberty claimed that the use of a camera van by South Wales Police (SWP) was similar to the unregulated capture of DNA or fingerprints. SWP used the camera van to capture dozens of digital images of members of the public every second and cross-reference them against a database of wanted persons. SWP argued that the scheme was necessary for the safety and security of the public at large.

The Court concluded that SWP’s use of AFR was in accordance with the law, under its powers to prevent and detect crime, as it was deployed in a transparent and limited way. Perhaps most importantly, SWP had an “appropriate policy document” in place that (only just) complied with section 42 of the Data Protection Act 2018. The Court also took care not to evaluate the quality of the data protection impact assessment (DPIA) carried out by SWP.

Sweden

However, in Sweden, a far more restrictive approach was taken to the use of AFR. There, the Swedish data protection authority (DPA) fined a school approximately £17,000 for a pilot scheme that used AFR technology to assist the student registration process over a few weeks.

Biometric data in the form of the students’ facial images, together with their names, was captured and stored on a local computer (without an internet connection) that was kept in a sealed cabinet. Parents and guardians had provided explicit consent for their children to be part of the scheme.

Nevertheless, the Swedish DPA found that this use of AFR was disproportionate: the students’ privacy was significantly infringed, and the process of registering students could be carried out in less intrusive ways. Furthermore, despite parental consent having been obtained, it held that the consent was not freely given because of the power imbalance between the students and the school board. Finally, although the school had carried out a DPIA, the Swedish DPA considered that it was not adequate for the purpose.

Recent developments

Last month the Information Commissioner launched an investigation following concerns that live AFR was being used in King’s Cross and Manchester Piccadilly railway stations, as well as in parts of Birmingham. The ICO restated that “any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified”.

Going forward

AFR looks set to become part of the landscape for UK law enforcement agencies in much the same way as CCTV. For private businesses, the use of AFR is more problematic, given that the imbalance of power between employer and employee makes consent unlikely to be an effective lawful basis for processing.

Learn more about our commercial team.
