Notes on Nova Scotia’s FOIPOP Reform Bill

On Friday, the Nova Scotia legislature introduced Bill 150, a new statute that consolidates the province’s public sector access and privacy laws and introduces key modernization reforms. Below are some quick highlights from the bill.

Class-based exemption for security control information. I just posted last week about withholding information that could jeopardize network security. Nova Scotia’s proposed legislation includes a novel class-based exemption that permits a head to withhold “information the disclosure of which could reasonably be expected to reveal, or lead to the revealing of, measures put in place to protect the security of information stored in electronic form.” Having previously negotiated with regulators to exclude control-related details from investigation reports, I view this language as both protective and positive.

New privacy impact assessment requirement. Under Bill 150, public bodies will be required to conduct a privacy impact assessment (PIA) before initiating any “project, program, system, or other activity” that involves the collection, use, or disclosure of personal information. The PIA must also be updated if there is a substantial change to the activity. A key question is whether the term “other activity” is broad enough to include non-routine or minimal data collections—which public bodies may prefer not to assess.

Power to collect for threat assessment purposes. This touches on an issue I’ve followed for years: behavioral threat assessment and the conduct of so-called “threat inquiries.” Conducting a threat inquiry in response to concerning behavior to properly assess a human threat is a best practice that arose out of a 2004 United States school shooting report. However, the legality of such inquiries has been questioned when they are conducted by institutions without a law enforcement mandate. Nova Scotia’s proposed legislation includes a new authorization to collect personal information—either directly or indirectly—for the purpose of reducing the risk that an individual will be the victim of intimate partner violence or human trafficking. This is a positive step, but it raises a key question: What about other forms of physical violence? The statute’s narrow focus may leave gaps in protection where threat assessments could be equally justified.

New offshoring rules. The new statute, if passed, will repeal the Personal Information International Disclosure Protection Act (PIIDPA)—Nova Scotia’s statute that prohibits public bodies and municipalities from storing, accessing, or disclosing personal information outside of Canada unless an exception applies. However, it will replace PIIDPA with a new provision that could be used to continue a similar prohibition. The new provision prohibits disclosing and storing personal information outside of Canada (as well as permitting personal information to be accessed from outside of Canada) except in accordance with regulations. It does not contemplate regulation of service providers and their employees, which is a feature of PIIDPA.

New breach notification. The new statute, if passed, will include privacy breach notification and reporting, triggered when “it is reasonable to believe that an affected individual could experience significant harm as a result of the privacy breach.” This is equivalent to the “real risk of significant harm” standard, in my view.

Supreme Court power to remedy breaches. The new statute, if passed, will give the Nova Scotia Supreme Court the power to issue orders when “personal information has been stolen or has been collected by or disclosed to a third party other than as authorized by this Act.” British Columbia has a more elaborate version of such a provision, which can help public bodies respond to breaches given ongoing legal uncertainty around the status of personal information as property.

Hat tip to David Fraser.

Ontario (M)FIPPA institutions, file encryption, and breach notification – a hint

As most of you know, the Ontario IPC released four decisions in the summer relating to breach reporting and notification obligations under PHIPA and the CYSFA. One controversial finding (which is subject to a judicial review application) is that the encryption of files by ransomware actors triggers an unauthorized use and a loss of personal and personal health information. Given there is no risk-based threshold for reporting and notification in PHIPA, custodians and service providers must report and notify in respect of this particular kind of breach, even if the threat actors have not stolen or laid eyes on information.

Leaving legal analysis aside, I’ll say that this is odd policy that has led to odd questions about who is affected by file encryption. Do we really care? Does this have any meaning to “affected” individuals?

The negative impact is that it threatens the clarity of communications about matters that institutions need to communicate clearly: “Yes, there’s been a privacy breach, but the threat actor(s) didn’t steal or view your information. And information has been ‘lost,’ but not lost as in ‘stolen.’” 🤦🏽‍♂️

One can honestly question whether there is any public good in this garble. The IPC has lobbied for cyber incident reporting, which this interpretation of PHIPA and the CYFSA effectively achieves. Cyber incident reporting should be brought in properly, through legislation, and without a notification obligation.

But how far does the finding extend?

The four decisions released in the summer left a question about how the encryption finding would apply to MFIPPA and FIPPA institutions, who are encouraged (but not yet legally required) to report and notify based on the “real risk of significant harm” standard. This standard will become a legal imperative when the provisions of Bill 194 come into force.

On December 10, the IPC issued a privacy complaint report that addressed file encryption at an MFIPPA institution and (in qualified terms) held that notification was not required. Mr. Gayle explained:

As the affected personal information remains encrypted and the police’s investigation found no evidence of exfiltration, it is not clear whether the breach “poses a real risk of significant harm to [these individuals], taking into consideration the sensitivity of the information and whether it is likely to be misused”. As such, it is not clear whether the police should have given direct notice of the breach to affected individuals in accordance with the IPC’s Privacy Breach Guidelines.

However, I am mindful of the fact that the police provided some notice to the public about the extent of the ransomware attack, and of the investigative and remedial steps they took to address it. I am also mindful of the fact that the breach occurred more than three years ago.

For these reasons, I find that it would serve no useful purpose in recommending that the police renotify affected individuals of the breach in accordance with the IPC’s Privacy Breach Guidelines and, as a result, do not need to decide whether the breach in this case met the threshold of “real risk of significant harm to the individual”.

This is helpful guidance, and should allow MFIPPA and FIPPA institutions to respond to matters with the clearest possible communication.

Sault Ste. Marie Police Services Board (Re), 2024 CanLII 124986 (ON IPC).

Perspectives on anonymization report released

On December 18, Khaled El Emam, Anita Fineberg, Elizabeth Jonker and Lisa Pilgram published Perspectives of Canadian privacy regulators on anonymization practices and anonymization information: a qualitative study. It is based on input from all but one Canadian privacy regulator, and includes a great discussion of one of the most important policy issues in Canadian privacy law – What do we do about anonymization given the massive demand for artificial intelligence training data?

The authors stress a lack of precision and consistency in Canadian law. It is true that the fine parameters of Canadian privacy law are yet to be articulated, but the broad parameters of our policy are presently clear:

  • First, there must be authorization to de-identify personal information. The Canadian regulators who the authors spoke with were mostly aligned against a consent requirement, though not without qualification. If there’s no express authorization to de-identify without consent (as in Ontario PHIPA), one gets the impression that a regulator will not imply consent to de-identify data for all purposes and all manner of de-identification.
  • Second, custodians of personal information must be transparent. One regulator said, “I have no sympathy for the point of view that it’s better not to tell people so as not to create any noise. I do not believe that that’s an acceptable public policy stance.” So, if you’re going to sell a patient’s health data to a commercial entity, okay, but you better let patients know.
  • Third, the information must be de-identified in a manner that renders the re-identification risk very low in the context. Lots can be said about the risk threshold and the manner of de-identification, and lots will be said over the next while. The authors recommend that legislators adopt a “code of practice” model for establishing specific requirements for de-identification.

The above requirements can all be derived from existing legislation, as is illustrated well by PHIPA Decision 175 in Ontario, about a custodian’s sale of anonymized personal health information. Notably, the IPC imposed a requirement on the disclosing custodian to govern the recipient entity by way of the data sale agreement, rooting its jurisdiction in the provision that requires safeguarding of personal health information in a custodian’s control. One can question this root, though it is tied to re-identification risk and within jurisdiction in my view.

What’s not in current Canadian privacy legislation is any restriction on the purpose of de-identification, the identity of recipients, or the nature of the recipient’s secondary use. This is a BIG issue that is tied to data ethics. Should a health care provider ever be able to sell its data to an entity for commercial use? Should custodians be responsible for determining whether the secondary use is likely to harm individuals or groups – e.g., based on the application of algorithmic bias?

Bill C-27 (the PIPEDA replacement bill) permits the non-consensual disclosure of de-identified personal information to specific entities for a “socially beneficial purpose” – “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” Given C-27 looks fated to die, Alberta’s Bill 33 may lead the way, and if passed will restrict Alberta public bodies from disclosing “non-personal information” outside of government for any purpose other than “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” (leaving AI model developers wondering how far they can stretch the concept of “research”).

Both C-27 and Bill 33 impose a contracting requirement akin to that imposed by the IPC in Decision 175. Bill 33, for example, only permits disclosure outside of government if:

(ii) the head of the public body has approved conditions relating to the following: (A) security and confidentiality; (B) the prohibition of any actual or attempted re-identification of the non-personal data; (C) the prohibition of any subsequent use or disclosure of the non-personal data without the express authorization of the public body; (D) the destruction of the non-personal data at the earliest reasonable time after it has served its purpose under subclause (i), unless the public body has given the express authorization referred to in paragraph (C),

and

(iii) the person has signed an agreement to comply with the approved conditions, this Act, the regulations and any of the public body’s policies and procedures
relating to non-personal data.

Far be it from me to solve this complex policy problem, but here are my thoughts:

  • Let’s aim for express authorization to de-identify rather than continuing to rely on a warped concept of implied consent. Express authorization best promotes transparency and predictability.
  • I’m quite comfortable with a generally stated re-identification risk threshold, and wary of binding organizations to a detailed and inaccessible code of practice.
  • Any foray into establishing ethical or other requirements for “research” should respect academic freedom, and have an appropriate exclusion.
  • We need to eliminate downstream accountability for de-identified data of the kind that is invited by the Bill 33 provision quoted above. Custodians don’t have the practical ability to enforce these agreements, and the agreements will therefore invite huge potential liability. Statutes should bind recipients and immunize organizations who disclose de-identified information for a valid purpose from downstream liability.

Do have a read of the report, and keep thinking and talking about these important issues.

Notable features of the Alberta public sector privacy bill

Alberta has recently introduced Bill 33 – a public sector privacy “modernization” bill. Alberta has put significantly more thought into its modernization bill than Ontario, which introduced modest FIPPA reforms in a splashier and less substantive reform bill earlier this year. Bill 33 is significant because it is leading the way. Might it set the new public sector norm?

Here are Bill 33’s notable features:

  • Bill 33 will require public bodies to give pre-collection notice of an intent to input personal information into an “automated system to generate content or make decisions, recommendations or predictions.” Automated system is not defined, and it is unclear if this is meant to foster decision-making transparency or transparency about downstream data use.
  • Bill 33 will require breach notification and reporting based on the “real risk of significant harm” standard. Reports to the OIPC and the Minister responsible for the Act will be required. Requiring reports to the regulator and government is novel.
  • Bill 33 will prohibit the sale of personal information “in any circumstances or for any purpose.” Sale is not defined.
  • Bill 33 has an allowance for disclosing personal information if the disclosure would not constitute an unjustified invasion of personal privacy. This flexible allowance – which contemplates balancing interests – does not typically apply outside of the access request context.
  • Bill 33 has a prohibition on data matching to produce derived personal information about an identifiable individual. This matching will only be permitted for “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” unless additional allowances are implemented by regulation. The Alberta OIPC has said that “research and analysis” should be defined, and that there should be transparency requirements for data matching.
  • Bill 33 will establish rules regarding de-identified or “non-personal data.” The rules will permit disclosure of non-personal data to another public body without restriction, but disclosures of non-personal data to others will be limited to specified purposes and subject to requirements that render downstream users accountable to the disclosing public body. Public bodies will also have a duty to secure non-personal data.
  • Bill 33 will require public bodies to establish and implement privacy management programs consisting of documented policies and procedures. It will also mandate privacy impact assessments in circumstances that will be prescribed, with submission to the OIPC also to be prescribed in some circumstances.

There is a long list of exceptions to the indirect collection prohibition in the Bill, but no exceptions that permit the collection of personal information for threat assessment purposes. Violence threat risk assessments have become a standard means by which educational institutions discharge their safety-related duties. “VTRAs” rest on an indirect collection of personal information that should be expressly authorized in any modernized public sector privacy statutes.

Saskatchewan IPC issues report on Edge imaging incident

I’m working through a reading pile today, and will note briefly that the Saskatchewan IPC has issued a report about the Edge Imaging cyber incident from earlier this year, which affected a number of Ontario school boards.

It was an atypical incident. Edge Imaging used a subcontractor called Entourage Yearbooks to store and process school yearbook photos. A threat actor accessed an Entourage AWS server, downloaded and deleted photos and held them for ransom. Edge ultimately reported to its school board/division clients that Entourage, “reported that they secured the return of all the Canadian photo files from the threat actors, along with their commitment that the photo files have been deleted, and were not distributed.”

The Saskatchewan IPC report deals with whether the photos contained personal information, whether the affected school divisions met their duty to notify, whether the service providers investigated reasonably, and whether the affected school divisions took appropriate protective steps in light of the incident. It is very cursory. The matter is simply a reminder about outsourcing risks, which school boards need to manage. The Ontario IPC updated its guidance earlier this year – see Privacy and Access in Public Sector Contracting with Third Party Service Providers.

Edge Imaging (Re), 2024 CanLII 90510 (SK IPC).

BCCA sends notice issue back to BC OIPC

On September 25th, the Court of Appeal for British Columbia partially upheld Airbnb’s successful judicial review of a British Columbia OIPC decision that required the City of Vancouver to disclose short term rental addresses along with related information, but vacated the application judge’s order to notify over 20,000 affected individuals.

Background

The City licenses short term rentals. It publicly discloses license information, presumably to enable renter inquiries. However, the City stopped publishing host names and rental addresses with license information in 2018 based on credible reports of safety risks. Evidence of the safety risks was on the record before the OIPC – general evidence about “concerned vigilante activity” and harassment, evidence about a particular stalking episode in 2019, and evidence that raised a concern about enabling criminals to determine when renters were likely to be out of the country.

The OIPC nonetheless ordered the City to disclose:

  • License numbers of individuals;
  • Home addresses of all hosts (also principal residences given licensing requirements); and
  • License numbers associated with the home addresses.

It was common ground that the above information could be readily linked to hosts by using publicly available information, rendering the order upsetting to Airbnb’s means of protecting its hosts. Airbnb only discloses the general area of rentals on its platform, which allows hosts to screen renters before disclosing their address.

Supreme Court Decision

The application judge affirmed the OIPC’s dismissal of the City’s safety concern as a reasonable application of the Merck test, but held that the OIPC erred on two other grounds.

First, the Court held that the OIPC unreasonably held that home address information was contact information rather than personal information. It failed to consider the context in making a simplistic finding that home address information was “contact information” because the home address was used as a place of business. The disclosure of the home address information, in the context, had a significant privacy impact that the OIPC ought to have considered.

Second, the Court held that the OIPC erred in not giving notice to the affected hosts – who numbered at least 20,000 – and in not providing reasons for its failure. The Court said this was a breach of procedural fairness, a breach punctuated by the evidence of a stalking and harassment risk that the OIPC acknowledged but held did not meet the Merck threshold.

Appeal Court Decision

The Court of Appeal affirmed the lower court’s contact information finding. It also held that the matter of notice to third parties ought to have been raised before the OIPC in the first instance, and that the application judge ought not to have ordered notice to be given. It stressed the OIPC’s discretion, and said:

Relevant facts that may inform the analysis include the nature of the records in issue, the number of potentially affected third parties, the practical logistics of providing notice, whether there are alternative means of doing so, and potential institutional resource issues.

Analysis

Giving notice and an opportunity to make submissions to 20,000 affected individuals is no small matter. In this case, valid electronic contact information was likely available. However, even a 2% response rate would generate 400 submissions, each deserving of due consideration.

Many institutions, thinking practically, would simply deny access as a means of avoiding this burden and respecting affected party rights, bearing in mind that the Supreme Court of Canada cautioned in Merck that notice should be given prior to disclosure in all but “clear cases.” When an institution denies access to avoid a massive notification burden, that burden transfers to the relevant commissioner/adjudicator, and even recognizing “practical logistics” and “institutional resource issues,” I see no reason why the “clear cases” rule from Merck should not be the governing test.

The Office of the Information and Privacy Commissioner for British Columbia v. Airbnb Ireland UC, 2024 BCCA 333.

Online proctoring report a must read for Ontario institutions

Online proctoring software was critical to higher education institutions during the heart of the pandemic. Though less significant today, the report of findings issued by the Information and Privacy Commissioner/Ontario last week about McMaster University’s use of online proctoring is an important read for Ontario public sector institutions – with relevant guidance on IT contracting, the use of generative AI tools and even the public sector necessity test itself.

The necessity test

To be lawful, the collection of personal information by Ontario public sector institutions must be “necessary to the proper administration of a lawfully authorized activity.” The Court of Appeal for Ontario adopted the IPC’s interpretation of the test in Cash Converters in 2007. It is strict, requiring justification to collect each data element, and the necessity standard requires an institution to establish that a collection is more than “merely helpful.”

The strictness of the test leaves one to wonder whether institutions’ business judgment carries any weight. This is a particular concern for universities, whose judgment in academic matters has been given special deference by courts and administrative decision-makers and is protected by a FIPPA exclusion that carves out teaching and research records from the scope of the Act. It does not appear that McMaster argued that the teaching and research records exclusion limited the IPC’s jurisdiction to scrutinize its use of online proctoring, but McMaster did argue that it, “retains complete autonomy, authority, and discretion to employ proctored online exams, prioritizing administrative efficiency and commercial viability, irrespective of necessity.”

The IPC rejected this argument, but applied a form of deference nonetheless. Specifically, the IPC did not question whether the University’s use of online proctoring was necessary. It held that the University’s decision to employ online proctoring was lawfully authorized, and only considered whether the University’s online proctoring tool collected personal information that was necessary for the University to employ online proctoring.

This deferential approach to the Ontario necessity test is not self-evident, though it is the same point that the University of Western Ontario prevailed on in 2022 in successfully defeating a challenge to its vaccination policy. In Hawke v Western University, the Court declined to scrutinize the necessity of the University’s vaccination policy itself; the only questions invited by FIPPA were (a) whether the University’s chosen policy was a lawful exercise of its authority, and (b) whether the collection of vaccination status information to enforce the chosen and lawful policy was necessary.

To summarize, the authority now makes clear that Ontario institutions get to set their own “policy” within the scope of their legal mandates, even if the policy invites the collection of personal information. The necessity of the collection is then measured against the purposes of the chosen lawful policy.

IT contracting

It is common for IT service providers to reserve a right to use the information they process in providing services to institutions. Institutions should appreciate whether the right reserved is a right to use aggregate or de-identified information, or a right to use personal information.

The relevant term of use in McMaster’s case was as follows:

Random samples of video and/or audio recordings may be collected via Respondus Monitor and used by Respondus to improve the Respondus Monitor capabilities for institutions and students. The recordings may be shared with researchers under contract with Respondus to assist in such research. The researchers are consultants or contractors to Respondus and are under written obligation to maintain the video and/or audio recordings in confidence and under terms at least as strict as these Terms. The written agreements with the researchers also expressly limit their access and use of the data to work being done for Respondus and the researchers do not have the right to use the data for any other purposes. No personally identifiable information for students is provided with the video and/or audio recordings to researchers, such as the student’s name, course name, institution, grades, or student identification photos submitted as part of the Respondus Monitor exam session.

Despite the (dubious) last sentence of this text, the IPC held that it contemplated a use of test taker personal information for a secondary purpose that was not a “consistent purpose.” It was therefore not authorized by FIPPA.

In recommending that the University secure a written undertaking from the service provider that it would cease to use student personal information for system improvement purposes without consent, the IPC carefully noted that the service provider had published information that indicated it refrains from this use in certain jurisdictions.

In addition to this finding and a number of related findings about the use of test taker personal information for the vendor’s secondary purposes, the IPC held:

  • the vendor contract was deficient because it did not require the vendor to notify the University in the event that it is required to disclose a test taker’s personal data to authorities; and
  • that the University should contractually require the vendor to delete audio and video recordings from its servers on, at minimum, an annual basis and that the vendor provide confirmation of this deletion.

The McMaster case adds to the body of IPC guidance on data protection terms. The IPC appears to be accepting of vendor de-identification rights, but not of vendor rights to use personal information.

Generative AI

While the IPC recognized that Ontario does not have law or binding policy specifically governing the use of artificial intelligence in the public sector, it nonetheless recommended that the University build in “guardrails” to protect its students from the risks of AI-enabled proctoring software. Specifically, the IPC recommended that the University:

  • conduct an algorithmic impact assessment and scrutinize the source or provenance of the data used to train the vendor’s algorithms;
  • engage and consult with affected parties (including those from vulnerable or historically marginalized groups) and those with relevant expertise;
  • provide an opt out as a matter of accommodating students with disabilities and “students having serious apprehensions about the AI-enabled software and the significant impacts it can have on them and their personal information”;
  • reinforce human oversight of outcomes by formalizing and communicating about an informal process for challenging outcomes (separate and apart from formal academic appeal processes);
  • conduct greater scrutiny over how the vendor’s software was developed to ensure that any source data used to train its algorithms was obtained in compliance with Canadian laws and in keeping with Ontarians’ reasonable expectations; and
  • specifically prohibit the vendor from using students’ personal information for algorithmic training purposes without their consent.

The IPC’s approach suggests that it expects institutions to employ a higher level of due diligence in approaching AI-enabled tools given their inherent risks.

Privacy Complaint Report PI21-00001.

Newfoundland court recognizes intrusion upon seclusion tort

In somewhat strange circumstances, the Supreme Court of Newfoundland and Labrador has recognized the intrusion upon seclusion privacy tort.

The Court made its recognition in deciding a procedural motion in a Municipal Elections Act appeal by two City of Mount Pearl councillors who were sanctioned for not disclosing a conflict of interest. The alleged conflict arose out of their discussions with the City’s former CAO while he was on administrative leave and the subject of a harassment investigation.

The City had discovered the conflict after it seized the CAO’s work iPad, which was still sending snippets of messages from the CAO’s personal Facebook Messenger account to the iPad’s home screen. Staff from IT saw the troubling messages, gave the iPad to the Clerk who saw more troubling messages, and the City eventually downloaded the messages for its use as evidence. At some point later, the messages were leaked to the CBC.

Whether the common law right of action for intrusion upon seclusion exists in Newfoundland had not yet been determined but was certified as a common issue in Hynes v. Western Regional Integrated Health Authority, 2014 NLTD(G) 137. Here, the Court held that the province has “a common law tort for intrusion upon seclusion” and that it “coexists with rights created under the [Newfoundland and Labrador] Privacy Act.”

Not surprisingly, in light of the Supreme Court of Canada decision in R v Cole, the Court found a privacy expectation that warranted protection, though its analysis on this point bleeds into its finding that the City’s actions were “highly offensive.” It went on to exclude the messages from the appeal record on the basis of its procedural power.

I might have thought this was a closer case than the outcome suggests, but privacy is such a subjective concept that it’s hard to predict how a judge will view a matter. It’s also another case about using a work computer to access content in a private cloud account, which apparently touches a judicial nerve.

Hindsight is 20/20, but as the judge said, the City could have stopped once it viewed the snippets and used the observations made by IT and the Clerk to request access from the CAO (who was presumably still employed, subject to a duty to cooperate, and facing a possible adverse inference). I would be concerned about the potential destruction of evidence – all stored in the CAO-controlled account – but (unfortunately) the Court did not consider this factor.

Power v. Mount Pearl (City), 2022 NLSC 129 (CanLII).

Intrusion upon seclusion is an intentional tort – Ont CA

The Court of Appeal for Ontario has addressed an important point about the intentionality element in the intrusion upon seclusion tort.

The Court dismissed an appeal by a nurse who claimed her employer’s liability insurer had a duty to defend her from claims that arose out of her unauthorized access to patient information. The issue was whether policy language limiting coverage for “expected” or “intended” injury applied, which required the Court to analyze whether an allegation that one has committed the intrusion tort is an allegation of intentional conduct.

The Court said “yes,” and made clear that recklessness is a form of intentional conduct:

Although the Jones decision does not contain a definition of “reckless,” it places reckless conduct side-by-side with intentional or deliberate conduct. Jones adopted the Restatement’s formulation of the tort as involving an intentional intrusion. As well, the decision limited claims for intrusion upon seclusion only to “deliberate and significant intrusions of personal privacy”: Jones, at para. 72. One cannot tease from the discussion in Jones any support for the proposition advanced by Ms. Demme that Jones’ inclusion of a reckless act within the tort of intrusion upon seclusion could involve unintentional conduct.

The Court also articulated the precise state of mind that meets the intentionality element:

For that tort, the relevant intention is the defendant’s intention to access private patient records. If that is demonstrated, the nature of the tort is such that the intention to access the records amounts to an intention to cause injury. 

The appellant had argued that she lacked the intent to cause injury and therefore ought to have been covered.

Demme v. Healthcare Insurance Reciprocal of Canada, 2022 ONCA 503 (CanLII).

IPC upholds university vaccination policy

On April 5th, the Information and Privacy Commissioner/Ontario affirmed a University of Guelph requirement that students in residence for the 2021/2022 academic year be fully vaccinated.

The IPC has jurisdiction to consider whether a public body’s collection of personal information is “necessary” to a lawfully authorized activity under the Freedom of Information and Protection of Privacy Act. The necessity test has been endorsed by the Court of Appeal for Ontario as strict. Where personal information would merely be helpful to the activity, it is not “necessary” within the meaning of FIPPA. Similarly, where the purpose can be accomplished another way, a public body is obliged to choose the other route.

The IPC’s affirmation of the University’s policy (and its collection of personal information) rested heavily on a letter the University had received from the Wellington-Dufferin-Guelph Health Unit in July 2021. It said:

I am writing to recommend in the strongest possible terms that the University of Guelph require a full (two-dose) course of COVID-19 vaccines for all students living in residence during the 2021-22 school year. Additionally, the University should continue to recommend strongly that all other students, faculty and staff receive both doses of the vaccine.

Students beginning or returning to their studies this fall are looking forward to a safe and relational post-secondary experience. Adding this significant layer of protection will help create a more normal fall on campus. Strong vaccination rates across the University are an important part of student physical and mental well-being, and should contribute peace of mind to all Gryphons.

The IPC affirmation is significant not only because it supports a vaccine mandate based on the strict FIPPA necessity standard, but also because of its adoption of this letter and its reasoning. While mandates must certainly be based on science that establishes that vaccination reduces the risk of exposure, the privacy commissioners, labour arbitrators and judges who will continue to be called upon to evaluate mandates must recognize that they are also based on a need for stability and mental well-being.

We thought we were through the pandemic, and are now in Wave Six. Will there be a Wave Seven? And although the province is trying to give us the stability we all crave by committing to laissez-faire policy, why should our public bodies be precluded from adopting stable, medium-term policy that prioritizes safety?

University of Guelph (Re), 2022 CanLII 25559 (ON IPC).