Perspectives on anonymization report released

On December 18, Khaled El Emam, Anita Fineberg, Elizabeth Jonker and Lisa Pilgram published Perspectives of Canadian privacy regulators on anonymization practices and anonymization information: a qualitative study. It is based on input from all but one Canadian privacy regulator, and includes a great discussion of one of the most important policy issues in Canadian privacy law – What do we do about anonymization given the massive demand for artificial intelligence training data?

The authors stress a lack of precision and consistency in Canadian law. It is true that the fine parameters of Canadian privacy law are yet to be articulated, but the broad parameters of our policy are presently clear:

  • First, there must be authorization to de-identify personal information. The Canadian regulators who the authors spoke with were mostly aligned against a consent requirement, though not without qualification. If there’s no express authorization to de-identify without consent (as in Ontario PHIPA), one gets the impression that a regulator will not imply consent to de-identify data for all purposes and all manner of de-identification.
  • Second, custodians of personal information must be transparent. One regulator said, “I have no sympathy for the point of view that it’s better not to tell people so as not to create any noise. I do not believe that that’s an acceptable public policy stance.” So, if you’re going to sell a patient’s health data to a commercial entity, okay, but you better let patients know.
  • Third, the information must be de-identified in a manner that renders the re-identification risk very low in the context. Lots can be said about the risk threshold and the manner of de-identification, and lots will be said over the next while. The authors recommend that legislators adopt a “code of practice” model for establishing specific requirements for de-identification.

The above requirements can all be derived from existing legislation, as is illustrated well by PHIPA Decision 175 in Ontario, about a custodian’s sale of anonymized personal health information. Notably, the IPC imposed a requirement on the disclosing custodian to govern the recipient entity by way of the data sale agreement, rooting its jurisdiction in the provision that requires safeguarding of personal health information in a custodian’s control. One can question this root, though it is tied to re-identification risk and within jurisdiction in my view.

What’s not in current Canadian privacy legislation is any restriction on the purpose of de-identification, the identity of recipients, or the nature of the recipient’s secondary use. This is a BIG issue that is tied to data ethics. Should a health care provider ever be able to sell its data to an entity for commercial use? Should custodians be responsible for determining whether the secondary use is likely to harm individuals or groups – e.g., based on the application of algorithmic bias?

Bill C-27 (the PIPEDA replacement bill) permits the non-consensual disclosure of de-identified personal information to specific entities for a “socially beneficial purpose” – “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” Given C-27 looks fated to die, Alberta’s Bill 33 may lead the way, and if passed will restrict Alberta public bodies from disclosing “non-personal information” outside of government for any purpose other than “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” (leaving AI model developers wondering how far they can stretch the concept of “research”).

Both C-27 and Bill 33 impose a contracting requirement akin to that imposed by the IPC in Decision 175. Bill 33, for example, only permits disclosure outside of government if:

(ii) the head of the public body has approved conditions relating to the following: (A) security and confidentiality; (B) the prohibition of any actual or attempted re-identification of the non-personal data; (C) the prohibition of any subsequent use or disclosure of the non-personal data without the express authorization of the public body; (D) the destruction of the non-personal data at the earliest reasonable time after it has served its purpose under subclause (i), unless the public body has given the express authorization referred to in paragraph (C),

and

(iii) the person has signed an agreement to comply with the approved conditions, this Act, the regulations and any of the public body’s policies and procedures relating to non-personal data.

Far be it from me to solve this complex policy problem, but here are my thoughts:

  • Let’s aim for express authorization to de-identify rather than continuing to rely on a warped concept of implied consent. Express authorization best promotes transparency and predictability.
  • I’m quite comfortable with a generally stated re-identification risk threshold, and wary of binding organizations to a detailed and inaccessible code of practice.
  • Any foray into establishing ethical or other requirements for “research” should respect academic freedom, and have an appropriate exclusion.
  • We need to eliminate downstream accountability for de-identified data of the kind that is invited by the Bill 33 provision quoted above. Custodians don’t have the practical ability to enforce these agreements, and the agreements will therefore invite huge potential liability. Statutes should bind recipients and immunize organizations who disclose de-identified information for a valid purpose from downstream liability.

Do have a read of the report, and keep thinking and talking about these important issues.

Why “Border Security” was shut down

The media has reported that a Report of Findings recently issued by the Privacy Commissioner of Canada (OPC) led to the cancellation of the television show “Border Security” – a privately produced documentary that covered the operations of the Canada Border Services Agency (CBSA).

How is it that the CBSA was made liable for a breach of the federal Privacy Act for intrusive action taken by an arm’s-length producer?

In its 26-page report the OPC probes the degree of control the CBSA exercised over the producer’s activity but ultimately declines to find that the producer’s collection of personal information was also the CBSA’s collection of personal information. The OPC explained:

However, the question of whether the CBSA can be said to be participating in the collection of personal information for the purpose of the Program is not determinative of our finding in this case. In our view, the CBSA is first collecting personal information in the context of its enforcement activities and thereby has a responsibility under the Act for any subsequent disclosure of the information that is collected for, or generated by, such activities.

Following our investigation, we are of the view that there is a real-time disclosure of personal information by the CBSA to Force Four [the producer] for the purpose of Filming the TV Program. Under section 8 of the Act, unless the individual otherwise provided consent, this personal information collected by the CBSA may only be disclosed for the purpose(s) for which it was obtained, for a consistent use with that purpose, or for one of the enumerated circumstances under section 8(2).

By this reasoning the OPC distinguishes the information flow under assessment from one in which CBSA is simply being observed while conducting its operations. The OPC finding seems to rest on the CBSA’s purposeful provision of access to personal information that would have otherwise been inaccessible – access that invites a “real-time” disclosure of personal information. The OPC applies a novel, expansive conception of a “disclosure.”

From time to time organizations are faced with a concern about the potentially invasive activities of others on their property or otherwise within their domain. Most often, they can take comfort in the availability of an “it’s not my collection and not my doing” defence. This OPC finding illustrates when such a defence might not be available.

Report of Findings dated 6 June 2016 (PA-031594).

Alberta CA comments on meaning of “personal information”

Whether information is “personal information” – information about an identifiable individual – depends on the context. The Court of Appeal of Alberta issued an illustrative judgment on April 14th. It held that a request for information about a person’s property was, in the context, a request for personal information. The Court explained:

In general terms, there is some universality to the conclusion in Leon’s Furniture that personal information has to be essentially “about a person”, and not “about an object”, even though most objects or properties have some relationship with persons. As the adjudicator recognized, this concept underlies the definitions in both the FOIPP Act and the Personal Information Protection Act. It was, however, reasonable for the adjudicator to observe that the line between the two is imprecise. Where the information related to property, but also had a “personal dimension”, it might sometimes properly be characterized as “personal information”. In this case, the essence of the request was for complaints and opinions expressed about Ms. McCloskey. The adjudicator’s conclusion (at paras. 49-51) that this type of request was “personal”, relating directly as it did to the conduct of the citizen, was one that was available on the facts and the law.

The requester wanted information about her property because she was looking for complaints related to her actions. The request was therefore for the requester’s personal information. Note the Court’s use of the word “sometimes”: context matters.

Edmonton (City) v Alberta (Information and Privacy Commissioner), 2016 ABCA 110 (CanLII).

Reasonable necessity not enough to justify collection under Ontario’s public sector statutes

Section 38(2) is an important provision of Ontario’s provincial public sector privacy statute. It requires institutions to satisfy a necessity standard in collecting personal information. Ontario’s municipal public sector privacy statute contains the same provision.

On May 4th, the Divisional Court dismissed a Liquor Control Board of Ontario argument that the Information and Privacy Commissioner/Ontario had erred by applying a higher standard than “reasonable necessity” in resolving a section 38(2) issue. The Divisional Court held that the Court of Appeal for Ontario’s Cash Converters case establishes just such a standard:

The LCBO relies upon Cash Converters to support its submission that the IPC erred in not interpreting “necessary” as meaning “reasonably necessary.” However, Cash Converters does not interpret “necessary” in this way. In fact, it suggests the opposite. Arguably, something that is “helpful” to an activity could be “reasonably necessary” to that activity. Yet, the Court of Appeal makes it clear that “helpful” is not sufficient.

It’s hard to fathom a legislative intent to prohibit a practice that is, by definition, “reasonable.” If the LCBO seeks and is granted leave to appeal, this could lead to an important clarification from the Court of Appeal on a strict interpretation of section 38(2) that has stood for some time. The LCBO practice at issue – which involves collecting the non-sensitive information of wine club members to control against the illegal stockpiling and reselling of alcohol – is a good one for testing the line.

Liquor Control Board of Ontario v Vin De Garde Wine Club, 2025 ONSC 2537.