UK Court of Appeal causes re-set for facial recognition surveillance

On August 11th, the England and Wales Court of Appeal held that the South Wales Police Force violated Article 8 of the European Convention on Human Rights and the UK Equality Act 2010 by using facial recognition software on two occasions. The finding is narrow, though, and leaves facial recognition technology open to police use.

The police piloted facial recognition technology on two occasions. They were governed by the Data Protection Act 2018, a surveillance “code of practice” issued under the Protection of Freedoms Act 2012 and written local police policy. The police also conducted a data protection impact assessment and a (somewhat limited) equality impact assessment.

The police conducted overt facial recognition surveillance under this framework based on pre-deployment notice made, in part, via advertising and via notices posted on the police cars equipped with facial recognition cameras. On two occasions the police collected images for an entire day and matched the images against images in “watch lists” comprising persons wanted on warrants, persons identified as suspects and other persons of interest. The police used human validation to screen matches, which led them to make two arrests on one occasion and no arrests on the other. Significantly, the police immediately disposed of images of all persons who did not match.

The Court found the deployment to have been unlawful based on two problems, both of process rather than substance.

First, the Court held that the deployments were not sufficiently prescribed by law to justify an infringement of Article 8 (which protects the right to privacy). More specifically, it held that the legal framework for the deployments left too much discretion to the police as to who may be placed on a watch list, in particular for intelligence gathering purposes. The police's failure to reckon with this aspect of the technology and surveillance program also led the Court to conclude that their data protection impact assessment was inadequate.

Second, the Court held that the police did not conduct an adequate equality impact assessment, which it held requires “the taking of reasonable steps to make enquiries about what may not yet be known to a public authority about the potential impact of a proposed decision or policy on people with the relevant characteristics, in particular for present purposes race and sex.” The police ought to have, the Court said, assessed the facial recognition software to determine if it resulted in “unacceptable bias,” even if human validation was to be a feature of the matching process.

Notably, the Court held (in obiter) that the police infringement of Article 8 rights was justifiable having regard to the relative consequences and benefits of the surveillance scheme, calling the impact on Article 8 rights “negligible.”

As noted, this leaves facial recognition technology open to police use in the UK. Use for intelligence gathering purposes may be more questionable than use for investigatory purposes.

Bridges, R (On the Application Of) v South Wales Police [2020] EWCA Civ 1058 (11 August 2020).

IPC wades into shadow IT mess, may never again

The Information and Privacy Commissioner/Ontario issued a decision about a security incident on July 9th in which it made clear, after participating in a health information custodian's efforts to recover lost data, that this burden falls on custodians alone.

The incident involved a clinician at an unnamed rehabilitation clinic and her estranged spouse, who reported to the clinic that he possessed 164 unique files containing the personal health information of 46 clinic clients on two computers that belonged to the clinician. The clinician explained that the files were an inadvertent by-product of secure remote access, though the files appear to have been purposely moved from temporary storage to a Google Drive at some point, possibly by the spouse.

The spouse was not particularly cooperative. This led the IPC, whom the clinic had notified, to engage with the spouse together with the clinic over a several-month period. The IPC took the (questionable) position that the spouse was in breach of duties under section 49(1) of PHIPA.

In the course of these dealings the spouse reported he had also received e-mails with attached assessment reports from the clinician for printing purposes. The clinician said she believed she had adequately de-identified the reports, though one included a full patient name and others (as the IPC held) contained ample data to render patients identifiable.

All of the detritus was eventually deleted to the satisfaction of the clinic and IPC. The clinic reconfigured its means of providing secure remote access to address the risk of local storage and beefed up its administrative policies and training. There is no mention of implementing a data loss prevention solution.

The IPC decision is notable for two points.

First, the IPC made clear that custodians should not rely on the IPC to help with data recovery (which can be very expensive):

It is clear that interactions between the Clinic and the Spouse had been very challenging, chiefly due to the Spouse’s changing positions throughout this investigation. However, the obligations on a health information custodian to contain the breach remain, even in the face of challenging circumstances.  The Privacy Breach Guidelines are clear that there is an obligation on the health information custodian to retrieve any copies of personal health information that have been disclosed and ensure that no copies of personal health information have been made or retained by anyone who was not authorized to receive the information.  Nothing in the legislation or these guidelines transfers this obligation to the IPC.

Second, the clinic was less skeptical of the clinician than it might otherwise have been, and did not issue discipline. The IPC accepted this, and re-stated its deferential position on employee discipline as follows:

With respect to the Clinic’s decision, I am satisfied that it was reasonable in the circumstances. This office has stated that its role is not to judge the severity or appropriateness of sanctions taken by a custodian against its agents (see PHIPA Decision 74).  However, the IPC can taken into account a custodian’s disciplinary response as part of its assessment of whether the custodian has taken reasonable steps to protect personal health information against unauthorized access.

A Rehabilitation Clinic (Re), 2020 CanLII 45770 (ON IPC).

BC OIPC dismisses privacy complaint about conduct of tribunal litigation

On May 1st the British Columbia Office of the Information and Privacy Commissioner dismissed a complaint that alleged a law firm and its client violated BC PIPA by serving a seven-part application for non-party production on seven non-parties to a Human Rights Code proceeding (thereby disclosing more personal information than would have been disclosed in seven separate applications).

Most significantly, the OIPC held that the PIPA provision that states it does not “limit the information available by law to a party to a proceeding” does not limit the OIPC’s jurisdiction and, rather, “merely provides reassurance that PIPA does not restrict the availability of information to a party to a proceeding where that information is available by law.” The OIPC therefore needed to dismiss the complaint on other grounds – in this case based on a finding of deemed implied consent and a finding that the disclosure was “required or authorized by law.”

The OIPC did come back to the “party to a proceeding” provision – section 3(4) – in dismissing the complainant’s proportionality argument. It said:

[77]        As I see it, the actions of parties in a court or tribunal proceeding – and whether those actions were necessary or appropriate in light of that forum’s governing law and procedures – is a matter best judged by that court or tribunal. I find support for this approach in s. 3(4) of PIPA. Section 3(4) states that PIPA does not limit the information available by law to a party to a proceeding. This provision ensures that PIPA does not interfere with, or override, statutory or common law processes or rules that make information available to a party to a proceeding.

[78]        Section 3(4) of PIPA requires that I interpret and apply PIPA in a way that does not limit the information available to PLG as a party to the legal proceedings before the Tribunal. In essence, the complainant is calling upon PIPA to censure, regulate and/or impose restrictions on what a party to a Tribunal proceeding can do to obtain information or evidence under the Tribunal’s Rules. I believe that a decision on my part prohibiting a party to a Tribunal proceeding from disclosing personal information in an application made pursuant to Rule 23(2) would, effectively, limit the information available by law to that party and run contrary to s. 3(4).

[79]        Thus, the issue of whether in this particular Tribunal proceeding the respondents complied with the Rules regarding applications for non-party disclosure is a matter that should be left to the Tribunal to decide. The Tribunal is an administrative tribunal empowered by statute to create the Rules that govern its proceedings and to enforce compliance with those Rules. Given it is the adjudicative forum where the complainant pursued her human rights complaint, it is best placed to understand the full context of what took place during its proceedings and to referee the parties’ behaviour.

This text is helpful, though the OIPC could have left litigants wider berth by reading section 3(4) as creating a form of privilege.

[Note that the HRTO did sanction the client (respondent) for serving its seven-part application by awarding the complainant $5,000 in costs.]

Mary-Helen Wright Law Corporation (Pacific Law Group) (Re), 2020 BCIPC 21 (CanLII).

Privacy claim against documentary makers dismissed

On April 23rd, the Ontario Superior Court of Justice dismissed two privacy claims brought against the makers of a documentary – one based on the misappropriation of personality tort and the other based on the intrusion upon seclusion tort.

Wiseau (and others) brought the claims against the makers of a movie called Room Full of Spoons – a documentary about Wiseau and his own infamous movie, The Room. The Room has become notorious as one of the worst movies ever made. Room Full of Spoons disclosed Wiseau’s birthdate, birth name and place of birth, facts available to the public but not widely known, in part because of Wiseau’s cultivation of mystery about his background.

Wiseau aggressively objected to the release of Room Full of Spoons, according to the Court, in part because he held a financial interest in a competing film. He obtained an injunction in 2017 that was later held to have been improperly granted, leaving Wiseau on the hook for $750,000 in damages.

In addition to making this damages order, Justice Schabas wrote a lengthy judgement that addresses fair dealing and related copyright issues, a passing off claim and various pre-trial and trial procedure issues. I’ll just address his disposition of the two privacy claims.

Justice Schabas dismissed the misappropriation of personality claim because Wiseau was a public figure who cultivated interest (and mystery) in his personality. The defendants’ use of Wiseau’s image to promote Room Full of Spoons (which was limited) was therefore not actionable. Justice Schabas followed Gould Estate, and held that use of Wiseau’s image served the purpose of contributing accurate information “to the public debate of political or social issues or of providing the free expression of creative talent” and was not primarily a means of “commercial exploitation.”

Justice Schabas dismissed the intrusion upon seclusion claim for reasons unrelated to the defendants’ right of expression, finding no “highly offensive” intrusion at all:

Wiseau has failed to make out the elements of the tort in this case.  No personal details of the kind referred to in Jones v. Tsige were disclosed by the defendants. Rather, what was disclosed was Wiseau’s birthplace, his birthdate, and the name he was given at birth and had as a child in Poland. This information was available from public sources, which is how the defendants obtained and confirmed it. Wiseau may be sensitive about this information because he has cultivated an aura of mystery around it, but disclosure of these facts is not, objectively speaking, something which can be described as “highly offensive.”

The idea that Wiseau’s privacy claim could not be sustained because his information was publicly available is significant, though consistent with traditional notions of privacy and confidentiality.

Wiseau Studio, LLC et al. v. Harper et al., 2020 ONSC 2504 (CanLII).

No privacy breach for publishing information about a provincial offences conviction

Late last year the Ontario Superior Court of Justice issued judgement in a hard-fought dispute between residential neighbours. After an 11-day small claims court trial (!) the Court allowed one neighbour’s privacy breach claim and dismissed the other’s.

The Court allowed a claim against the defendants for directing surveillance cameras and motion-activated floodlights at the plaintiffs’ property as part of a deliberate campaign of harassment. It awarded each plaintiff $8,000, noting evidence of “significant stress and irritation.” The Court also awarded each plaintiff $500 on account of their exposure to “obstructive parking.”

The successful plaintiffs tended to the high road, at one point returning the defendants’ stray dog in an act of neighbourliness. They did, however, publicly post a document that detailed a Provincial Offences Act conviction of one of the defendants. (They said they did so to give their prying-eyed neighbours “something to look at.”) The Court dismissed a counterclaim based on this publication, explaining:

Convictions and sentences imposed by courts of law are events which occur in public and are publicly-available information.  The fact that some third party has posted such facts on the internet makes them all the more public.  I am unable to accept the defence submission, unsupported by authority, that for Mr. Cecchin to find and post this information constitutes an actionable invasion of privacy.  Such a conclusion would be inconsistent with the definition pronounced by Sharpe J.A. in Jones v. Tsige (2012), 2012 ONCA 32 (CanLII), 108 O.R. (3d) 241 (C.A.), at para. 70.  The conviction and sentence cannot be viewed as Mr. Bradbury’s “private affairs or concerns”.  Nor would a reasonable person regard the search for or publication of the outcome of legal proceedings as “highly offensive.”

In a similar vein, the Court dismissed a counterclaim that alleged the plaintiffs committed a privacy violation by writing a letter to other neighbours drawing their attention to the defendants’ non-compliance with municipal bylaws. It said that the claim was untenable as one that attacked “an exercise of free speech, of local political action and participation in the municipal legal process.”

Cecchin v Lander, 2019 CanLII 131883 (ON SCSM).

UKSC decides data thief was on a “frolic of his own”

The Supreme Court of the United Kingdom has decided an important vicarious liability case in favour of a company whose rogue employee stole payroll information and posted it online.

The company entrusted the employee with payroll data pertaining to over 120,000 of its employees to facilitate an audit. The employee – who was still aggrieved about some discipline the company had earlier imposed – passed the data to the auditor as instructed, but kept a copy and posted it online as part of a personal vendetta.

As in Canadian law, United Kingdom law deems employers to be responsible for the wrongful acts of their employees that are not authorized if there is a “sufficient connection” between the wrongful act and the work that is authorized. The creation of “opportunity” to commit the wrong is a factor, and the analysis is to be conducted with a view to policy implications, leading some to argue that data security concerns justify broadly imposed vicarious liability.

Nonetheless, the Court held that causation (or the creation of opportunity) was not enough to warrant this employer’s liability for its employee’s data theft. That is, the company’s provision of data to the employee enabled the theft (and the public disclosure), but the employee was motivated to harm the employer and was “on a frolic of his own” that did not warrant employer liability.

WM Morrisons Supermarkets plc (Appellant) v Various Claimants (Respondent), [2020] UKSC 12.

NSCA says no expectation of privacy in address information

On January 28th the Nova Scotia Court of Appeal dismissed a privacy breach allegation that was based on a municipality’s admitted disclosure of address information to a related service commission so the service commission could bill for certain statutorily mandated charges. The Court held there was no reasonable expectation of privacy in the information disclosed, reasoning as follows:

Mr. Banfield’s information was not confidential, secret or anonymous. Neither did it offer a glimpse into Mr. Banfield’s intimate, personal or sensitive activities. Nor did it involve the investigation of a potential offence. Rather, it enabled a regulated public utility to invoice Mr. Banfield with rates approved under statutory authority for a legally authorized service that, in fact, Mr. Banfield received.  

Banfield v. Nova Scotia (Utility and Review Board), 2020 NSCA 6 (CanLII).

Saskatchewan Commissioner recommends clean desk policy for lawyers

On November 27th, the Saskatchewan Information and Privacy Commissioner faulted the Saskatchewan Legal Aid Commission for failing to have and maintain a clean desk policy – i.e., a policy requiring files to be put away and locked overnight – given cleaning staff had unsupervised after-hours access to its office. The IPC relied on the Commission’s own policy, which encouraged but did not mandate clean desks. The matter came to the IPC’s attention after cleaning staff left two layers of doors open one night.

Saskatchewan Legal Aid Commission (Re), 2019 CanLII 113284 (SK IPC).

Automobile accident litigation, privacy and bad tactics

Lee Akazaki of Gilbertson Davis has written a fascinating article in the most recent Advocates’ Quarterly (vol 50) about a privacy-related development that he argues is interfering with the just and expeditious handling of no-fault motor vehicle claims in Ontario.

Mr. Akazaki explains that plaintiff counsel are using privacy claims to frustrate the mandatory independent medical examinations required by section 44 of the Statutory Accident Benefit Schedule:

In about 2016, assessors began to receive cryptic letters from the clients of lawyers retained to advance injury claims. The letters invariably followed a template containing a client’s name, the lawyer’s office address, and the client’s signature. Written in “legalese,” the letters demanded that s. 44 assessors account directly to the claimant for various documents and procedures in the IME process. The letters stated the recipients were not to disclose the requests to insurers or other parties, thus leaving assessors worried that reprisals would follow if they alerted insures or their agents. They also demanded responses within 30 days, failing which the assessor was liable to face a complaint to the Office of the Privacy Commissioner (OPC) under the Canadian federal Personal Information Protection and Electronic Documents Act (PIPEDA). The letters exploited ostensible conflicts among health care standards, privacy litigation and SABS.

This is shocking, especially given PIPEDA does not apply. On this point I agree with Mr. Akazaki, though my reasoning differs. The scenario is entirely analogous to the State Farm case, in which the Federal Court held that the gathering of evidence to resolve a civil dispute in a province was not subject to PIPEDA because it is not, in its essence, “commercial activity.” It is unlike Wyndowe, an earlier case in which the Federal Court held that an independent medical examination under a disability insurance contract was subject to PIPEDA.

Mr. Akazaki argues that Wyndowe is distinct because the SABS regime (unlike typical contractual examination rights under disability insurance contracts) treats the (provisional) denial of benefits as a condition precedent to an examination. Okay, but I think the point of distinction is likely more fundamental. It strikes me – and I am not an insurance lawyer! – that the SABS regime is a public regime for resolving civil disputes in the province and is, in part at least, a means of keeping litigation out of court. It’s the public character of the SABS dispute resolution regime that ought to provide it with a form of immunization from PIPEDA. It’s what makes the “primary characterization of the activity” (to use the State Farm test words) something other than commercial.

And a word to the wise, never assume that a privacy statute applies. Application issues abound, and litigating them is where all the fun is at!

Canadian data commissioners have their say on political ad targeting… and more

On November 26th, the Office of the Privacy Commissioner of Canada (the OPC) and the Office of the Information and Privacy Commissioner of British Columbia (the OIPC) jointly held that online marketing company AggregateIQ Data Services breached Canadian privacy law in providing services that supported online targeted political advertising for two campaigns said to be instrumental to the 2016 “Brexit” referendum.

AggregateIQ is a small BC company that provided services to SCL Elections, the UK-domiciled parent of Cambridge Analytica – infamous for harvesting data from millions of Facebook users without consent. The joint report is quite vague about whether AggregateIQ used the Cambridge Analytica contraband, and AggregateIQ denies it.

The central finding in the joint report is that information about one’s “political leanings or affiliations” is “sensitive” personal information. Accordingly, the commissioners said, authorization given by an individual to communicate via e-mail was not sufficient to authorize micro-targeting via Facebook (and its “custom audiences” service). They explained:

AIQ asserted that Facebook is simply another way of communicating with individuals, not materially different from email. We respectfully disagree. In our view, an individual who had initially provided their email address for purposes of being kept “up to date” or providing “opportunities to engage” with a campaign may expect to be contacted via email. They would not expect their email address to be used and disclosed to a social media company for advertising on their platform or any other unknown purposes… Accordingly, AIQ had the responsibility, under PIPA and PIPEDA, to ensure that it was relying on express consent for the work it was performing on behalf of Vote Leave.

This finding is about political ad targeting, a problem of great societal importance but of minimal relevance to most organizations. The report also includes two findings of broad relevance.

First, the fact that the commissioners took jurisdiction over a matter involving services provided to a foreign political party is important. The commissioners explained:

To be clear, we are not finding, in this section or below, that AIQ’s foreign clients were required to comply with Canadian and BC privacy laws. The practices of those political organizations would generally fall outside the scope of PIPA and PIPEDA, and in any event, were not the subject of this investigation. That said, to the extent that AIQ wished to rely on the consent obtained by those foreign clients for its own collection, use, and disclosure of personal information on their behalf, it would need to ensure that such consent was sufficient, under Canadian or BC law as the case may be, for its purposes.

This quote speaks to an age-old question about whether the use of a commercial service provider can cause activity that would otherwise not be regulated to become subject to Canadian privacy law. This question is of particular concern to provincially regulated employers, who are not subject to federal privacy legislation in respect of employment but often use service providers to help with employment administration.

The Federal Court’s decision in State Farm suggests that the mere engagement of a service provider is not enough to give rise to application (and enforcement jurisdiction). When State Farm was released in 2010 I asked a contact from the OPC how it would interpret State Farm. The answer was, “narrowly,” exactly as the above quote would suggest!

Second, the commissioners make a significant finding about the use of identifying information with Facebook’s “lookalike audience” service – a popular and powerful ad targeting service that involves uploading identifying information about a population (typically of customers) so Facebook can identify and target a lookalike population (of potential customers). The commissioners held that authorization to use personal information for “engagement” purposes was not authorization for “data analytics” purposes, stating:

It is also the case that the privacy notice largely speaks to collecting and using personal information for the purpose of engaging supporters in the campaign and performing services on their behalf. The disclosure of individuals’ personal information to Facebook for data analytics, via its “lookalike audience” feature, does not achieve or relate to either of those objectives. Instead, this disclosure is made to allow Facebook to link supporters to their Facebook profiles and analyze those profiles in order to identify, target, and persuade other similar or like-minded individuals. This is for Vote Leave’s benefit and can certainly not be viewed, in any way, as performing a service on behalf of the voter whose information was processed.

Organizations using the Facebook lookalike service or similar services should pay heed and revisit their source of authorization. More broadly, this quote speaks to the HUGE question, “Is using data to generate insights about a population a use of personal information at all?” The quote suggests the commissioners say “yes,” though there is a reference to data matching done by Facebook that may render this scenario unique. (I like to believe the answer is “no.”)

Joint investigation of AggregateIQ Data Services Ltd. by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia, 2019 CanLII 111872 (PCC).