Notes on Nova Scotia’s FOIPOP Reform Bill

On Friday, the Nova Scotia legislature introduced Bill 150, a new statute that would consolidate the province’s public sector access and privacy laws and introduce key modernization reforms. Below are some quick highlights from the bill.

Class-based exemption for security control information. I just posted last week about withholding information that could jeopardize network security. Nova Scotia’s proposed legislation includes a novel class-based exemption that permits a head to withhold “information the disclosure of which could reasonably be expected to reveal, or lead to the revealing of, measures put in place to protect the security of information stored in electronic form.” Having previously negotiated with regulators to exclude control-related details from investigation reports, I view this language as both protective and positive.

New privacy impact assessment requirement. Under Bill 150, public bodies will be required to conduct a privacy impact assessment (PIA) before initiating any “project, program, system, or other activity” that involves the collection, use, or disclosure of personal information. The PIA must also be updated if there is a substantial change to the activity. A key question is whether the term “other activity” is broad enough to include non-routine or minimal data collections—which public bodies may prefer not to assess.

Power to collect for threat assessment purposes. This touches on an issue I’ve followed for years: behavioral threat assessment and the conduct of so-called “threat inquiries.” Conducting a threat inquiry in response to concerning behavior, so that a potential human threat can be properly assessed, is a best practice that arose out of a 2004 United States school shooting report. The legality of such inquiries has been questioned, however, when they are conducted by institutions without a law enforcement mandate. Nova Scotia’s proposed legislation includes a new authorization to collect personal information, either directly or indirectly, for the purpose of reducing the risk that an individual will be the victim of intimate partner violence or human trafficking. This is a positive step, but it raises a key question: what about other forms of physical violence? The statute’s narrow focus may leave gaps in protection where threat assessments could be equally justified.

New offshoring rules. The new statute, if passed, will repeal the Personal Information International Disclosure Protection Act (PIIDPA), Nova Scotia’s statute that prohibits public bodies and municipalities from storing, accessing, or disclosing personal information outside of Canada unless an exception applies. It will, however, replace PIIDPA with a new provision that could be used to continue a similar prohibition. The new provision prohibits disclosing or storing personal information outside of Canada, and permitting personal information to be accessed from outside of Canada, except in accordance with regulations. Unlike PIIDPA, it does not contemplate regulation of service providers and their employees.

New breach notification requirement. The new statute, if passed, will include privacy breach notification and reporting requirements, triggered when “it is reasonable to believe that an affected individual could experience significant harm as a result of the privacy breach.” In my view, this is equivalent to the “real risk of significant harm” standard.

Supreme Court power to remedy breaches. The new statute, if passed, will give the Nova Scotia Supreme Court the power to issue orders when “personal information has been stolen or has been collected by or disclosed to a third party other than as authorized by this Act.” British Columbia has a more elaborate version of such a provision, which can help public bodies respond to breaches given ongoing legal uncertainty around the status of personal information as property.

Hat tip to David Fraser.

Notable features of the Alberta public sector privacy bill

Alberta has recently introduced Bill 33, a public sector privacy “modernization” bill. Alberta has put significantly more thought into its modernization bill than Ontario, which introduced modest FIPPA reforms in a splashier and less substantive reform bill earlier this year. This makes Bill 33 significant: it is leading rather than following. Might it set the new public sector norm?

Here are Bill 33’s notable features:

  • Bill 33 will require public bodies to give pre-collection notice of an intent to input personal information into an “automated system to generate content or make decisions, recommendations or predictions.” “Automated system” is not defined, and it is unclear whether this requirement is meant to foster decision-making transparency or transparency about downstream data use.
  • Bill 33 will require breach notification and reporting based on the “real risk of significant harm” standard. Reports to the OIPC and to the Minister responsible for the Act will be required. Requiring reports to government as well as to the regulator is novel.
  • Bill 33 will prohibit the sale of personal information “in any circumstances or for any purpose.” Sale is not defined.
  • Bill 33 has an allowance for disclosing personal information if the disclosure would not constitute an unjustified invasion of personal privacy. This flexible allowance – which contemplates balancing interests – does not typically apply outside of the access request context.
  • Bill 33 has a prohibition on data matching to produce derived personal information about an identifiable individual. Such matching will only be permitted for “research and analysis” and for “planning, administering, delivering, managing, monitoring or evaluating a program or service” unless additional allowances are implemented by regulation. The Alberta OIPC has said that “research and analysis” should be defined and that there should be transparency requirements for data matching.
  • Bill 33 will establish rules regarding de-identified or “non-personal data.” The rules will permit disclosure of non-personal data to another public body without restriction, but disclosures of non-personal data to others will be limited to specified purposes and subject to requirements that render downstream users accountable to the disclosing public body. Public bodies will also have a duty to secure non-personal data.
  • Bill 33 will require public bodies to establish and implement privacy management programs consisting of documented policies and procedures. It will also mandate privacy impact assessments in circumstances that will be prescribed, with submission to the OIPC also to be prescribed in some circumstances.

There is a long list of exceptions to the indirect collection prohibition in the Bill, but no exception that permits the collection of personal information for threat assessment purposes. Violence threat risk assessments have become a standard means by which educational institutions discharge their safety-related duties. “VTRAs” rest on an indirect collection of personal information that should be expressly authorized in any modernized public sector privacy statute.

Online proctoring report a must read for Ontario institutions

Online proctoring software was critical to higher education institutions during the heart of the pandemic. Though less significant today, the report of findings issued by the Information and Privacy Commissioner/Ontario last week about McMaster University’s use of online proctoring is an important read for Ontario public sector institutions, with relevant guidance on IT contracting, the use of generative AI tools and even the public sector necessity test itself.

The necessity test

To be lawful, the collection of personal information by Ontario public sector institutions must be “necessary to the proper administration of a lawfully authorized activity.” The Court of Appeal for Ontario adopted the IPC’s interpretation of this test in Cash Converters in 2007. The test is strict: it requires an institution to justify the collection of each data element and to establish that each collection is more than “merely helpful.”

The strictness of the test leaves one to wonder whether institutions’ business judgment carries any weight. This is a particular concern for universities, whose judgment in academic matters has been given special deference by courts and administrative decision-makers and is protected by a FIPPA exclusion that carves teaching and research records out of the scope of the Act. It does not appear that McMaster argued that the teaching and research records exclusion limited the IPC’s jurisdiction to scrutinize its use of online proctoring, but McMaster did argue that it “retains complete autonomy, authority, and discretion to employ proctored online exams, prioritizing administrative efficiency and commercial viability, irrespective of necessity.”

The IPC rejected this argument but applied a form of deference nonetheless. Specifically, the IPC did not question whether the University’s use of online proctoring was itself necessary. It held that the University’s decision to employ online proctoring was lawfully authorized, and considered only whether the University’s online proctoring tool collected personal information that was necessary to employ online proctoring.

This deferential approach to the Ontario necessity test is not self-evident, though it is the same point on which the University of Western Ontario prevailed in 2022 in successfully defeating a challenge to its vaccination policy. In Hawke v Western University, the Court declined to scrutinize the necessity of the University’s vaccination policy itself; the only questions invited by FIPPA were (a) whether the University’s chosen policy was a lawful exercise of its authority, and (b) whether the collection of vaccination status information to enforce the chosen and lawful policy was necessary.

To summarize, the authority now makes clear that Ontario institutions get to set their own “policy” within the scope of their legal mandates, even if the policy invites the collection of personal information. The necessity of the collection is then measured against the purposes of the chosen lawful policy.

IT contracting

It is common for IT service providers to reserve a right to use the information they process in providing services to institutions. Institutions should appreciate whether the right reserved is a right to use aggregate or de-identified information, or a right to use personal information.

The relevant term of use in McMaster’s case was as follows:

Random samples of video and/or audio recordings may be collected via Respondus Monitor and used by Respondus to improve the Respondus Monitor capabilities for institutions and students. The recordings may be shared with researchers under contract with Respondus to assist in such research. The researchers are consultants or contractors to Respondus and are under written obligation to maintain the video and/or audio recordings in confidence and under terms at least as strict as these Terms. The written agreements with the researchers also expressly limit their access and use of the data to work being done for Respondus and the researchers do not have the right to use the data for any other purposes. No personally identifiable information for students is provided with the video and/or audio recordings to researchers, such as the student’s name, course name, institution, grades, or student identification photos submitted as part of the Respondus Monitor exam session.

Despite the (dubious) last sentence of this text, the IPC held that this contemplated use of test taker personal information was for a secondary purpose that was not a “consistent purpose.” It was therefore not authorized by FIPPA.

In recommending that the University secure a written undertaking from the service provider that it would cease to use student personal information for system improvement purposes without consent, the IPC carefully noted that the service provider had published information indicating that it refrains from this use in certain jurisdictions.

In addition to this finding and a number of related findings about the use of test taker personal information for the vendor’s secondary purposes, the IPC held:

  • that the vendor contract was deficient because it did not require the vendor to notify the University in the event that it is required to disclose a test taker’s personal data to authorities; and
  • that the University should contractually require the vendor to delete audio and video recordings from its servers on, at minimum, an annual basis and to provide confirmation of this deletion.

The McMaster case adds to the body of IPC guidance on data protection terms. The IPC appears to be accepting of vendor de-identification rights, but not of vendor rights to use personal information.

Generative AI

While the IPC recognized that Ontario does not have law or binding policy specifically governing the use of artificial intelligence in the public sector, it nonetheless recommended that the University build in “guardrails” to protect its students from the risks of AI-enabled proctoring software. Specifically, the IPC recommended that the University:

  • conduct an algorithmic impact assessment and scrutinize the source or provenance of the data used to train the vendor’s algorithms;
  • engage and consult with affected parties (including those from vulnerable or historically marginalized groups) and those with relevant expertise;
  • provide an opt out as a matter of accommodating students with disabilities and “students having serious apprehensions about the AI-enabled software and the significant impacts it can have on them and their personal information”;
  • reinforce human oversight of outcomes by formalizing and communicating about an informal process for challenging outcomes (separate and apart from formal academic appeal processes);
  • conduct greater scrutiny over how the vendor’s software was developed to ensure that any source data used to train its algorithms was obtained in compliance with Canadian laws and in keeping with Ontarians’ reasonable expectations; and
  • specifically prohibit the vendor from using students’ personal information for algorithmic training purposes without their consent.

The IPC’s approach suggests that it expects institutions to employ a higher level of due diligence in approaching AI-enabled tools given their inherent risks.

Privacy Complaint Report PI21-00001.