Lawfulness of automated facial recognition
R (Edward Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)
The question
Is the use of automated facial recognition (AFR) technology by law enforcement lawful under the Data Protection Act 1998 (DPA 1998), the Data Protection Act 2018 (DPA 2018), the Equality Act 2010 and Article 8 of the European Convention on Human Rights (ECHR)?
The key takeaway
Rights under Article 8 of the ECHR are engaged by the use of AFR, but (in this case) its use by law enforcement struck a fair balance between the rights of the individual and those of the community.
AFR can help to assess whether two facial images depict the same person. A digital photograph of a person’s face is taken and processed to extract measurements of facial features. That data is then compared with similar data from images contained in a database.
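The matching step described above can be sketched in code. This is a minimal illustrative example only, not SWP's actual system: the feature vectors, the cosine-similarity metric, the threshold value, and the names used here are all assumptions made for the sake of the illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return the watch-list entry most similar to the probe image's
    feature vector, provided the similarity exceeds the threshold;
    otherwise return None (in which case, as in AFR Locate, the
    biometric data would be discarded)."""
    best_id, best_score = None, threshold
    for person_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Made-up, low-dimensional vectors for illustration; real systems use
# high-dimensional embeddings produced by a trained model.
watchlist = {"suspect-1": [0.9, 0.1, 0.4], "suspect-2": [0.2, 0.8, 0.5]}
probe = [0.88, 0.12, 0.41]
print(match_against_watchlist(probe, watchlist))  # prints "suspect-1"
```

The threshold is the operational trade-off the judgment touches on: set it lower and more candidate matches (including false positives) are flagged for human review; set it higher and genuine matches may be missed.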
This case resulted from the use of security cameras by South Wales Police (SWP) to take digital images of the public and match them against images of individuals on SWP’s watch lists as part of a pilot project named “AFR Locate”. If no match was found, the relevant individuals’ biometric data was not stored (although the underlying CCTV footage was kept for a period of time). If a match was made, the police decided how to respond. The technology was used on around fifty occasions at a variety of large public events (for example, the 2017 UEFA Champions League Final).
Mr Bridges, the Claimant, is a former Liberal Democrat local politician. He said that the SWP captured and processed his image on two occasions in the course of AFR Locate. As Mr Bridges was not on a watch list, his image was deleted shortly after it was taken. Supported by Liberty, a human rights organisation, Mr Bridges brought an application for judicial review, alleging that the SWP’s conduct was unlawful.
He contended that the use of AFR was unlawful for the following three reasons:
- the use of AFR was an interference with his rights under Article 8(1) of the ECHR, which provides that everyone “has the right to respect for his private and family life, his home and his correspondence”. The use of AFR was neither “in accordance with the law” nor “necessary” nor “proportionate” as required by Article 8(2);
- the use of AFR was contrary to s4(4) DPA 1998 (that personal data may only be processed fairly and lawfully) and s35 DPA 2018 (the processing of personal data for any law enforcement purposes must be lawful and fair). Additionally, that the use of AFR falls within s64(1) DPA 2018 (as this type of processing is likely to result in a high risk to the rights and freedoms of individuals) and therefore a data protection impact assessment must be carried out;
- under s149(1) Equality Act 2010 (where public authorities must, in the exercise of their functions, have due regard to, inter alia, the need to eliminate discrimination and the need to foster good relations between different people) the SWP failed to take into account the fact that the use of AFR would result in a disproportionately higher rate of false-positive matches for women and minority ethnic groups. Therefore, the use of the program would indirectly discriminate. Accordingly, the SWP failed to take into account the relevant considerations from s149(1)(a)-(c) of the Act.

SWP argued that the facial recognition cameras helped safeguard the public and prevent crime, but did not infringe the privacy of members of the public whose images were scanned.
The Court held that Mr Bridges’ Article 8 rights were engaged, even though the surveillance took place in public spaces and Mr Bridges’ image was automatically deleted immediately following the matching exercise. However, the High Court decided that SWP’s use of AFR technology, as part of AFR Locate, was lawful because its common law powers to keep the peace and prevent crime gave it the power to deploy AFR, and because there is legislation (notably the Data Protection Act 2018), practice codes (such as the Surveillance Camera Code of Practice), and policy documents which provide standards against which the lawfulness of SWP’s use of AFR can be assessed.
Additionally, the High Court held that no less intrusive measure than AFR was reasonably available to the SWP and that the SWP’s use of AFR struck a fair balance between the rights of the individual and those of the community.
The High Court also held that there was no evidence that AFR did in fact produce results which were discriminatory in the way alluded to by Mr Bridges, and dismissed the Equality Act 2010 claim.
Why is this important?
The case is important because the Court acknowledged that SWP’s use of live facial recognition technology did involve the processing of sensitive personal data of members of the public. However, the ruling indicates an element of deference to the police and to the overarching objective of keeping the peace and preventing crime (the purpose of the AFR Locate project).
It’s also important to note that a factor weighing in favour of the High Court’s conclusion that SWP’s use of AFR was lawful was that the software’s decisions as to identification were always reviewed by a human police officer (“In our view, the fact that a human eye is used to ensure that an intervention is justified, is an important safeguard”) (para 33).
Any practical tips?
The police and private organisations should consider existing data protection law and guidance when using live facial recognition technology.
This level of technology is new and intrusive and, if used without appropriate privacy safeguards, could potentially undermine instead of enhance public confidence in the police.