On August 11, 2020, the England and Wales Court of Appeal held that the South Wales Police Force violated Article 8 of the European Convention on Human Rights and the UK Equality Act 2010 by using facial recognition software on two occasions. The finding is narrow, though, and leaves facial recognition technology open to police use.
The police piloted facial recognition technology on two occasions. The pilots were governed by the Data Protection Act 2018, a surveillance “code of practice” issued under the Protection of Freedoms Act 2012, and written local police policy. The police also conducted a data protection impact assessment and a (somewhat limited) equality impact assessment.
The police conducted overt facial recognition surveillance under this framework, giving pre-deployment notice in part via advertising and via notices posted on the police cars equipped with facial recognition cameras. On two occasions the police collected images for an entire day and matched them against “watch lists” composed of persons wanted on warrants, persons identified as suspects, and other persons of interest. The police used human validation to screen matches, which led them to make two arrests on one occasion and none on the other. Significantly, the police immediately disposed of the images of all persons who did not match.
The Court found the deployments to have been unlawful based on two problems, both of process rather than substance.
First, the Court held that the deployments were not sufficiently prescribed by law to justify an infringement of Article 8 (which protects the right to privacy). More specifically, it held that the legal framework for the deployments left too much discretion to the police as to who may be placed on a watch list, in particular for intelligence-gathering purposes. The police’s failure to reckon with this aspect of the technology and surveillance program also led the Court to conclude that their data protection impact assessment was inadequate.
Second, the Court held that the police did not conduct an adequate equality impact assessment, which it held requires “the taking of reasonable steps to make enquiries about what may not yet be known to a public authority about the potential impact of a proposed decision or policy on people with the relevant characteristics, in particular for present purposes race and sex.” The police, the Court said, ought to have assessed the facial recognition software to determine whether it resulted in “unacceptable bias,” even though human validation was to be a feature of the matching process.
Notably, the Court held (in obiter) that the police infringement of Article 8 rights was justifiable when weighing the benefits of the surveillance scheme against its consequences, calling the impact on Article 8 rights “negligible.”
As noted, this leaves facial recognition technology open to police use in the UK, though use for intelligence-gathering purposes may be more questionable than use for investigatory purposes.