Perspectives on anonymization report released

On December 18, Khaled El Emam, Anita Fineberg, Elizabeth Jonker and Lisa Pilgram published Perspectives of Canadian privacy regulators on anonymization practices and anonymization information: a qualitative study. It is based on input from all but one Canadian privacy regulator, and includes a great discussion of one of the most important policy issues in Canadian privacy law: what do we do about anonymization given the massive demand for artificial intelligence training data?

The authors stress a lack of precision and consistency in Canadian law. It is true that the fine parameters of Canadian privacy law are yet to be articulated, but the broad parameters of our policy are already clear:

  • First, there must be authorization to de-identify personal information. The Canadian regulators with whom the authors spoke were mostly aligned against a consent requirement, though not without qualification. Where there is no express authorization to de-identify without consent (as there is in Ontario’s PHIPA), one gets the impression that a regulator will not imply consent to de-identify data for all purposes and all manner of de-identification.
  • Second, custodians of personal information must be transparent. One regulator said, “I have no sympathy for the point of view that it’s better not to tell people so as not to create any noise. I do not believe that that’s an acceptable public policy stance.” So, if you’re going to sell a patient’s health data to a commercial entity, okay, but you’d better let patients know.
  • Third, the information must be de-identified in a manner that renders the re-identification risk very low in the context. Much can be said about the risk threshold and the manner of de-identification, and much more will be said in the months ahead. The authors recommend that legislators adopt a “code of practice” model for establishing specific requirements for de-identification.

The above requirements can all be derived from existing legislation, as is illustrated well by PHIPA Decision 175 in Ontario, about a custodian’s sale of anonymized personal health information. Notably, the IPC imposed a requirement on the disclosing custodian to govern the recipient entity by way of the data sale agreement, rooting its jurisdiction in the provision that requires safeguarding of personal health information in a custodian’s control. One can question this footing, though it is tied to re-identification risk and, in my view, within jurisdiction.

What’s not in current Canadian privacy legislation is any restriction on the purpose of de-identification, the identity of recipients, or the nature of the recipient’s secondary use. This is a BIG issue that is tied to data ethics. Should a health care provider ever be able to sell its data to an entity for commercial use? Should custodians be responsible for determining whether the secondary use is likely to harm individuals or groups – e.g., based on the application of algorithmic bias?

Bill C-27 (the PIPEDA replacement bill) permits the non-consensual disclosure of de-identified personal information to specific entities for a “socially beneficial purpose” – “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” Given C-27 looks fated to die, Alberta’s Bill 33 may lead the way, and if passed will restrict Alberta public bodies from disclosing “non-personal information” outside of government for any purpose other than “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” (leaving AI model developers wondering how far they can stretch the concept of “research”).

Both C-27 and Bill 33 impose a contracting requirement akin to that imposed by the IPC in Decision 175. Bill 33, for example, only permits disclosure outside of government if:

(ii) the head of the public body has approved conditions relating to the following: (A) security and confidentiality; (B) the prohibition of any actual or attempted re-identification of the non-personal data; (C) the prohibition of any subsequent use or disclosure of the non-personal data without the express authorization of the public body; (D) the destruction of the non-personal data at the earliest reasonable time after it has served its purpose under subclause (i), unless the public body has given the express authorization referred to in paragraph (C),

and

(iii) the person has signed an agreement to comply with the approved conditions, this Act, the regulations and any of the public body’s policies and procedures
relating to non-personal data.

Far be it from me to solve this complex policy problem, but here are my thoughts:

  • Let’s aim for express authorization to de-identify rather than continuing to rely on a warped concept of implied consent. Express authorization best promotes transparency and predictability.
  • I’m quite comfortable with a generally stated re-identification risk threshold, and wary of binding organizations to a detailed and inaccessible code of practice.
  • Any foray into establishing ethical or other requirements for “research” should respect academic freedom, and have an appropriate exclusion.
  • We need to eliminate downstream accountability for de-identified data of the kind that is invited by the Bill 33 provision quoted above. Custodians don’t have the practical ability to enforce these agreements, and the agreements will therefore invite huge potential liability. Statutes should bind recipients and immunize organizations who disclose de-identified information for a valid purpose from downstream liability.

Do have a read of the report, and keep thinking and talking about these important issues.

OPC gives guidance, argues for more enforcement power

It’s hard being the Office of the Privacy Commissioner of Canada. The OPC is responsible for making sure all is right in commercial-sector and federal government-sector privacy. It has a pretty small operating budget, yet the issues in these sectors are meaty and novel – I dare say harder to deal with than the privacy issues raised in the health and provincial public sectors. More than anything, meeting the OPC mandate is particularly challenging because the mandate is to enforce a principled statute that affords a “right to privacy” that lacks a well-understood meaning.

It is in this context that the OPC issued its 2016-2017 Annual Report to Parliament. The report includes a 24-page “year in review” on PIPEDA that follows the OPC’s public consultation on informed consent and some polling work showing that 90% of Canadians are concerned about their privacy. The OPC concludes that the PIPEDA commercial sector regime is at a crossroads – making some suggestions about new directions, giving some practical guidance and arguing for more enforcement power.

This post is to highlight the most significant new directions and practical guidance and to provide a short comment on the argument for more enforcement power.

The most significant new directions and practical guidance:

  • The OPC will expect organizations to address four elements in obtaining informed consent: what personal information is being collected; who it is being shared with (including an enumeration of third parties); for what purposes information is collected, used or shared (including an explanation of purposes that are not integral to the service); and what the risk of harm to the individual is, if any.
  • The OPC will draft and consult on new guidance that will explicitly describe those instances of collection, use or disclosure of personal information that it believes would be considered inappropriate from the reasonable person standpoint under subsection 5(3) of PIPEDA (no-go zones).
  • The OPC says that “in all but exceptional cases, consent for the collection, use and disclosure of personal information of children under the age of 13, must be obtained from their parents or guardians” and “As for youth aged 13 to 18, their consent can only be considered meaningful if organizations have taken into account their level of maturity in developing their consent processes and adapted them accordingly.”
  • The OPC will encourage industry to develop codes of practice and fund research for the purpose of developing codes of practice to address more particular, sector-specific challenges – presumably a mechanism by which organizations will be able to seek safe harbour.
  • The OPC will make greater use of its power to initiate investigations “where [it sees] specific issues or chronic problems that are not being adequately addressed.”

Then, there’s the OPC’s argument for more enforcement powers. Specifically, the OPC wants Parliament to drop the “reasonable grounds” restriction from its audit power so it can engage in truly proactive audits, it wants the power to levy fines and it wants PIPEDA to feature a private right of action – all of which would invite a departure from the ombudsman model the OPC has operated under since PIPEDA came into force in 2004.

I personally dislike the ombudsman model of enforcement because it doesn’t come with the procedural safeguards associated with more formal enforcement models and can therefore give the ombudsman a frightening degree of “soft” power. That said, the prospect of big fines and lawsuits based on substantive rules that are poorly defined and understood is even more frightening to those in the business of privacy compliance and defence. This is the irony of the OPC report: at the same time as the OPC admits that the substance of PIPEDA is, at the very least, “challenged,” it asks to enforce it with a new hammer. Having now been through an admittedly bad experience with CASL – legislation that the OPC would argue is much more “ineffective” than PIPEDA (see p. 34) – we can readily foresee the wasted compliance costs that the proposed change to PIPEDA could invite. Even if business is indeed responsible for the great concern about privacy that the OPC’s polling reveals, wariness of a new enforcement hammer is nonetheless a valid position for business to take going forward.