Online proctoring report a must read for Ontario institutions

Online proctoring software was critical to higher education institutions during the heart of the pandemic. Though less significant today, the report of findings issued by the Information and Privacy Commissioner/Ontario last week about McMaster University’s use of online proctoring is an important read for Ontario public sector institutions – with relevant guidance on IT contracting, the use of generative AI tools and even the public sector necessity test itself.

The necessity test

To be lawful, the collection of personal information by Ontario public sector institutions must be “necessary to the proper administration of a lawfully authorized activity.” The Court of Appeal for Ontario adopted the IPC’s interpretation of this test in Cash Converters in 2007. The test is strict: an institution must justify the collection of each data element and establish that the collection is more than “merely helpful.”

The strictness of the test leaves one to wonder whether institutions’ business judgment carries any weight. This is a particular concern for universities, whose judgment in academic matters has been given special deference by courts and administrative decision-makers and is protected by a FIPPA exclusion that carves out teaching and research records from the scope of the Act. It does not appear that McMaster argued that the teaching and research records exclusion limited the IPC’s jurisdiction to scrutinize its use of online proctoring, but McMaster did argue that it “retains complete autonomy, authority, and discretion to employ proctored online exams, prioritizing administrative efficiency and commercial viability, irrespective of necessity.”

The IPC rejected this argument, but applied a form of deference nonetheless. Specifically, the IPC did not question whether the University’s use of online proctoring was necessary. It held that the University’s decision to employ online proctoring was lawfully authorized, and only considered whether the University’s online proctoring tool collected personal information that was necessary for the University to employ online proctoring.

This deferential approach to the Ontario necessity test is not self-evident, though it is the same point on which the University of Western Ontario prevailed in 2022 in successfully defeating a challenge to its vaccination policy. In Hawke v Western University, the Court declined to scrutinize the necessity of the University’s vaccination policy itself; the only questions invited by FIPPA were (a) whether the University’s chosen policy was a lawful exercise of its authority, and (b) whether the collection of vaccination status information to enforce the chosen and lawful policy was necessary.

To summarize, the authority now makes clear that Ontario institutions get to set their own “policy” within the scope of their legal mandates, even if the policy invites the collection of personal information. The necessity of the collection is then measured against the purposes of the chosen lawful policy.

IT contracting

It is common for IT service providers to reserve a right to use the information they process in providing services to institutions. Institutions should appreciate whether the right reserved is a right to use aggregate or de-identified information, or a right to use personal information.

The relevant term of use in McMaster’s case was as follows:

Random samples of video and/or audio recordings may be collected via Respondus Monitor and used by Respondus to improve the Respondus Monitor capabilities for institutions and students. The recordings may be shared with researchers under contract with Respondus to assist in such research. The researchers are consultants or contractors to Respondus and are under written obligation to maintain the video and/or audio recordings in confidence and under terms at least as strict as these Terms. The written agreements with the researchers also expressly limit their access and use of the data to work being done for Respondus and the researchers do not have the right to use the data for any other purposes. No personally identifiable information for students is provided with the video and/or audio recordings to researchers, such as the student’s name, course name, institution, grades, or student identification photos submitted as part of the Respondus Monitor exam session.

Despite the (dubious) last sentence of this text, the IPC held that this term contemplated a use of test taker personal information for a secondary purpose that was not a “consistent purpose.” It was therefore not authorized by FIPPA.

In recommending that the University secure a written undertaking from the service provider that it would cease to use student personal information for system improvement purposes without consent, the IPC carefully noted that the service provider had published information that indicated it refrains from this use in certain jurisdictions.

In addition to this finding and a number of related findings about the use of test taker personal information for the vendor’s secondary purposes, the IPC held:

  • the vendor contract was deficient because it did not require the vendor to notify the University in the event that it is required to disclose a test taker’s personal data to authorities; and
  • that the University should contractually require the vendor to delete audio and video recordings from its servers on, at minimum, an annual basis and that the vendor provide confirmation of this deletion.

The McMaster case adds to the body of IPC guidance on data protection terms. The IPC appears to be accepting of vendor de-identification rights, but not of vendor rights to use personal information.

Generative AI

While the IPC recognized that Ontario does not have law or binding policy specifically governing the use of artificial intelligence in the public sector, it nonetheless recommended that the University build in “guardrails” to protect its students from the risks of AI-enabled proctoring software. Specifically, the IPC recommended that the University:

  • conduct an algorithmic impact assessment and scrutinize the source or provenance of the data used to train the vendor’s algorithms;
  • engage and consult with affected parties (including those from vulnerable or historically marginalized groups) and those with relevant expertise;
  • provide an opt out as a matter of accommodating students with disabilities and “students having serious apprehensions about the AI-enabled software and the significant impacts it can have on them and their personal information”;
  • reinforce human oversight of outcomes by formalizing and communicating about an informal process for challenging outcomes (separate and apart from formal academic appeal processes);
  • conduct greater scrutiny over how the vendor’s software was developed to ensure that any source data used to train its algorithms was obtained in compliance with Canadian laws and in keeping with Ontarians’ reasonable expectations; and
  • specifically prohibit the vendor from using students’ personal information for algorithmic training purposes without their consent.

The IPC’s approach suggests that it expects institutions to employ a higher level of due diligence in approaching AI-enabled tools given their inherent risks.

Privacy Complaint Report PI21-00001.