Notes on Nova Scotia’s FOIPOP Reform Bill

On Friday, the Nova Scotia legislature introduced Bill 150, a bill that would consolidate the province’s public sector access and privacy laws and introduce key modernization reforms. Below are some quick highlights from the bill.

Class-based exemption for security control information. I just posted last week about withholding information that could jeopardize network security. Nova Scotia’s proposed legislation includes a novel class-based exemption that permits a head to withhold “information the disclosure of which could reasonably be expected to reveal, or lead to the revealing of, measures put in place to protect the security of information stored in electronic form.” Having previously negotiated with regulators to exclude control-related details from investigation reports, I view this language as both protective and positive.

New privacy impact assessment requirement. Under Bill 150, public bodies will be required to conduct a privacy impact assessment (PIA) before initiating any “project, program, system, or other activity” that involves the collection, use, or disclosure of personal information. The PIA must also be updated if there is a substantial change to the activity. A key question is whether the term “other activity” is broad enough to include non-routine or minimal data collections—which public bodies may prefer not to assess.

Power to collect for threat assessment purposes. This touches on an issue I’ve followed for years: behavioral threat assessment and the conduct of so-called “threat inquiries.” Conducting a threat inquiry in response to concerning behavior, in order to properly assess a human threat, is a best practice that arose out of a 2004 United States school shooting report. The legality of such inquiries has been questioned, however, when they are conducted by institutions without a law enforcement mandate. Nova Scotia’s proposed legislation includes a new authorization to collect personal information—either directly or indirectly—for the purpose of reducing the risk that an individual will be the victim of intimate partner violence or human trafficking. This is a positive step, but it raises a key question: What about other forms of physical violence? The statute’s narrow focus may leave gaps in protection where threat assessments could be equally justified.

New offshoring rules. The new statute, if passed, will repeal the Personal Information International Disclosure Protection Act (PIIDPA), Nova Scotia’s statute that prohibits public bodies and municipalities from storing, accessing, or disclosing personal information outside of Canada unless an exception applies. It will replace PIIDPA with a new provision that could be used to continue a similar prohibition: the new provision prohibits disclosing and storing personal information outside of Canada (and prohibits permitting personal information to be accessed from outside of Canada) except in accordance with regulations. Unlike PIIDPA, however, the new provision does not contemplate regulation of service providers and their employees.

New breach notification. The new statute, if passed, will include privacy breach notification and reporting obligations, triggered when “it is reasonable to believe that an affected individual could experience significant harm as a result of the privacy breach.” In my view, this is equivalent to the “real risk of significant harm” standard.

Supreme Court power to remedy breaches. The new statute, if passed, will give the Nova Scotia Supreme Court the power to issue orders when “personal information has been stolen or has been collected by or disclosed to a third party other than as authorized by this Act.” British Columbia has a more elaborate version of such a provision, which can help public bodies respond to breaches given ongoing legal uncertainty around the status of personal information as property.

Hat tip to David Fraser.

IPC decision highlights issues about threat assessment and PHIPA application

On January 31, 2024, the IPC/Ontario ordered the Ontario Medical Association’s Physician Health Program to provide a complainant with access to a draft assessment report, though it permitted the OMA to withhold behavioral information collected in preparing the report.

Many institutions have processes that support behavioral threat assessment – a process by which multi-disciplinary teams (often including medical clinicians) conduct a threat inquiry to gather behavioral information (usually indirectly), assess behaviors and determine whether someone poses a threat to themselves and/or others. The assessment can lead to interventions, medical and otherwise, that are of benefit to the person being assessed.

The OMA’s Physician Health Program appears to be a threat assessment program, though its mandate is vague and involves “education; support and referral; assessment; and monitoring and advocacy.” And in responding to an IPC complaint about its access request denial, the OMA argued that it was a health information custodian engaged primarily in the provision of health care. The IPC re-articulated the position as follows:

[16]      The OMA PHP describes its monitoring function as “first and foremost a clinical service provided to an individual physician or learner to assist in the maintenance of their health in the context of recovery from a mental health or substance use disorder.” This may involve collecting clinical information, providing clinical opinions, and reviewing urine, hair, blood, or other toxicological tests.

[17]      Overall, the OMA PHP states that its employees provide services “to maintain an individual’s mental condition, … to promote health, and in the case of clients already diagnosed, to prevent disease in the form of recurrence, all of which it states fall under the definition of “health care.”

This position drove the outcome given PHIPA has a very broad right of access to personal health information. The OMA was left with no valid basis to shield its draft report, even though the IPC has held that assessment is different than providing health care. The IPC did find that the (critical and sensitive) behavioral reports made to the OMA could be withheld on the basis of section 52(3), which applies to records “not… dedicated primarily to personal health information about the individual requesting access” and permits reasonable severance.

Threat assessment can and should be framed as beneficial to the person being assessed, which is important because it aligns threat assessment with the duty not to discriminate against individuals with disabilities. In other words, threat assessment is an aspect of accommodating disability and meeting institutional health and safety duties. Threat assessment is both a lawful and critical process.

This framing does not make threat assessment health care, nor should it ever be treated as health care in my view. The interventions that threat assessment invites are meant to help in the long and medium term, but in the short term they are about the restriction of privileges (e.g., of practicing, working, or attending school) based on the assessed risk. There is therefore a conflict in striving to be both a health care provider and a threat assessor, and individuals under assessment must know the true nature of the process with which they are engaged. Are you my doctor? Or are you working for the institution? If threat assessment is framed as assessment, even if it is conducted by medical clinicians, PHIPA will not apply.

Ontario Medical Association Physician Health Program (Re), 2025 CanLII 9695 (ON IPC), <https://canlii.ca/t/k9ftg>, retrieved on 2025-07-17.

Perspectives on anonymization report released

On December 18, Khaled El Emam, Anita Fineberg, Elizabeth Jonker and Lisa Pilgram published Perspectives of Canadian privacy regulators on anonymization practices and anonymization information: a qualitative study. It is based on input from all but one Canadian privacy regulator, and includes a great discussion of one of the most important policy issues in Canadian privacy law – What do we do about anonymization given the massive demand for artificial intelligence training data?

The authors stress a lack of precision and consistency in Canadian law. True that the fine parameters of Canadian privacy law are yet to be articulated, but the broad parameters of our policy are presently clear:

  • First, there must be authorization to de-identify personal information. The Canadian regulators whom the authors spoke with were mostly aligned against a consent requirement, though not without qualification. If there is no express authorization to de-identify without consent (as there is in Ontario’s PHIPA), one gets the impression that a regulator will not imply consent to de-identify data for all purposes and all manner of de-identification.
  • Second, custodians of personal information must be transparent. One regulator said, “I have no sympathy for the point of view that it’s better not to tell people so as not to create any noise. I do not believe that that’s an acceptable public policy stance.” So, if you’re going to sell a patient’s health data to a commercial entity, okay, but you better let patients know.
  • Third, the information must be de-identified in a manner that renders the re-identification risk very low in the context. Lots can be said about the risk threshold and the manner of de-identification, and lots will be said over the next while (a minimal sketch of one common risk measure appears below). The authors recommend that legislators adopt a “code of practice” model for establishing specific requirements for de-identification.
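
To make the third point a little more concrete, here is a minimal sketch of one common way analysts measure re-identification risk: checking k-anonymity over a set of quasi-identifiers. The field names, dataset and threshold are my own hypothetical illustrations, not anything drawn from the study or from regulator guidance.

    # Minimal k-anonymity check: a dataset is k-anonymous over its quasi-identifiers
    # if every combination of quasi-identifier values appears at least k times.
    # Field names and the sample records below are hypothetical illustrations.
    from collections import Counter

    def k_anonymity(records, quasi_identifiers):
        """Return the smallest equivalence-class size over the quasi-identifiers."""
        groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
        return min(groups.values())

    records = [
        {"year_of_birth": 1980, "fsa": "M5V", "sex": "F", "diagnosis": "A"},
        {"year_of_birth": 1980, "fsa": "M5V", "sex": "F", "diagnosis": "B"},
        {"year_of_birth": 1975, "fsa": "B3H", "sex": "M", "diagnosis": "A"},
    ]

    k = k_anonymity(records, ["year_of_birth", "fsa", "sex"])
    print(k)  # 1 here: the 1975/B3H/M record is unique, so re-identification risk is high

Real-world risk measurement is considerably more involved (attack modelling, context, motivated-intruder testing), which is part of why a “code of practice” model has appeal, even if, as I say below, I am wary of binding organizations to inaccessible detail.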

The above requirements can all be derived from existing legislation, as is illustrated well by PHIPA Decision 175 in Ontario, about a custodian’s sale of anonymized personal health information. Notably, the IPC imposed a requirement on the disclosing custodian to govern the recipient entity by way of the data sale agreement, rooting its jurisdiction in the provision that requires safeguarding of personal health information in a custodian’s control. One can question this jurisdictional basis, though in my view it is tied to re-identification risk and falls within jurisdiction.

What’s not in current Canadian privacy legislation is any restriction on the purpose of de-identification, the identity of recipients, or the nature of the recipient’s secondary use. This is a BIG issue that is tied to data ethics. Should a health care provider ever be able to sell its data to an entity for commercial use? Should custodians be responsible for determining whether the secondary use is likely to harm individuals or groups – e.g., based on the application of algorithmic bias?

Bill C-27 (the PIPEDA replacement bill) permits the non-consensual disclosure of de-identified personal information to specific entities for a “socially beneficial purpose” – “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” Given C-27 looks fated to die, Alberta’s Bill 33 may lead the way, and if passed will restrict Alberta public bodies from disclosing “non-personal information” outside of government for any purpose other than “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” (leaving AI model developers wondering how far they can stretch the concept of “research”).

Both C-27 and Bill 33 impose a contracting requirement akin to that imposed by the IPC in Decision 175. Bill 33, for example, only permits disclosure outside of government if:

(ii) the head of the public body has approved conditions relating to the following: (A) security and confidentiality; (B) the prohibition of any actual or attempted re-identification of the non-personal data; (C) the prohibition of any subsequent use or disclosure of the non-personal data without the express authorization of the public body; (D) the destruction of the non-personal data at the earliest reasonable time after it has served its purpose under subclause (i), unless the public body has given the express authorization referred to in paragraph (C),

and

(iii) the person has signed an agreement to comply with the approved conditions, this Act, the regulations and any of the public body’s policies and procedures
relating to non-personal data.

Far be it from me to solve this complex policy problem, but here are my thoughts:

  • Let’s aim for express authorization to de-identify rather than continuing to rely on a warped concept of implied consent. Express authorization best promotes transparency and predictability.
  • I’m quite comfortable with a generally stated re-identification risk threshold, and wary of binding organizations to a detailed and inaccessible code of practice.
  • Any foray into establishing ethical or other requirements for “research” should respect academic freedom, and have an appropriate exclusion.
  • We need to eliminate downstream accountability for de-identified data of the kind that is invited by the Bill 33 provision quoted above. Custodians don’t have the practical ability to enforce these agreements, and the agreements will therefore invite huge potential liability. Statutes should bind recipients and immunize organizations that disclose de-identified information for a valid purpose from downstream liability.

Do have a read of the report, and keep thinking and talking about these important issues.

Notable features of the Alberta public sector privacy bill

Alberta has recently introduced Bill 33 – a public sector privacy “modernization” bill. Alberta has put significantly more thought into its modernization bill than Ontario, which introduced modest FIPPA reforms in a splashier and less substantive reform bill earlier this year. Bill 33 is significant because it leads. Might it set the new public sector norm?

Here are Bill 33’s notable features:

  • Bill 33 will require public bodies to give pre-collection notice of an intent to input personal information into an “automated system to generate content or make decisions, recommendations or predictions.” Automated system is not defined, and it is unclear if this is meant to foster decision-making transparency or transparency about downstream data use.
  • Bill 33 will require breach notification and reporting based on the “real risk of significant harm” standard. Reports to the OIPC and the Minister responsible for the Act will be required. Requiring reports to the regulator and government is novel.
  • Bill 33 will prohibit the sale of personal information “in any circumstances or for any purpose.” Sale is not defined.
  • Bill 33 has an allowance for disclosing personal information if the disclosure would not constitute an unjustified invasion of personal privacy. This flexible allowance – which contemplates balancing interests – does not typically apply outside of the access request context.
  • Bill 33 has a prohibition on data matching to produce derived personal information about an identifiable individual. This matching will only be permitted for “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” unless additional allowances are implemented by regulation. The Alberta OIPC has said that “research and analysis” should be defined, and that there should be transparency requirements for data matching.
  • Bill 33 will establish rules regarding de-identified or “non-personal data.” The rules will permit disclosure of non-personal data to another public body without restriction, but disclosures of non-personal data to others will be limited to specified purposes and subject to requirements that render downstream users accountable to the disclosing public body. Public bodies will also have a duty to secure non-personal data.
  • Bill 33 will require public bodies to establish and implement privacy management programs consisting of documented policies and procedures. It will also mandate privacy impact assessments in circumstances that will be prescribed, with submission to the OIPC also to be prescribed in some circumstances.

There is a long list of exceptions to the indirect collection prohibition in the Bill, but no exceptions that permit the collection of personal information for threat assessment purposes. Violence threat risk assessments have become a standard means by which educational institutions discharge their safety-related duties. “VTRAs” rest on an indirect collection of personal information that should be expressly authorized in any modernized public sector privacy statute.

NSCA outlines the “law of redaction”

Exactly when should an entire document be withheld because redaction is not reasonable?

Freedom of information adjudicators have used the concept of “disconnected snippets” to delineate the boundary: if redaction would leave a reader with meaningless “disconnected snippets,” entire records can rightly be withheld.

The Nova Scotia Court of Appeal, on August 7th, applied similar logic in determining that a set of affidavits “could not be redacted without sacrificing their intelligibility and therefore the utility of public access.” It therefore held that the affidavits could be sealed in whole in compliance with the necessity component of the test from Sherman Estate.

Notably, the Court reviewed cases that establish a second basis for full record withholding – cost. In Patient X v College of Physicians and Surgeons of Nova Scotia, the Nova Scotia Supreme Court held that redacting a 120-page record would be too “painstaking and prone to error” given that it included a significant number of handwritten notes. And in Khan v College of Physicians and Surgeons of Ontario, the Ontario Superior Court of Justice reached a similar finding given that the record requiring redaction was almost 4,500 pages in length, requiring an error-prone hunt for (sensitive) patient information.

Back to freedom of information, where costs are passed through to requesters. In Ontario, the norm is to charge for two minutes per page of redaction. Should a premium be chargeable for handwritten records or records that contain very sensitive information?
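
For a rough sense of the dollars involved, here is a back-of-the-envelope calculation. It assumes the $7.50-per-15-minutes preparation fee commonly cited under Ontario’s FIPPA fee regulation together with the two-minutes-per-page severance norm; treat both figures as illustrative assumptions.

    # Back-of-the-envelope redaction cost under the two-minutes-per-page convention.
    # The $7.50 per 15 minutes preparation fee and the per-page norm are assumptions
    # used here for illustration only.
    MINUTES_PER_PAGE = 2
    FEE_PER_15_MINUTES = 7.50

    def redaction_estimate(pages):
        """Return (hours of severance work, preparation fee in dollars)."""
        minutes = pages * MINUTES_PER_PAGE
        return minutes / 60, (minutes / 15) * FEE_PER_15_MINUTES

    hours, fee = redaction_estimate(4500)   # roughly the record size in Khan
    print(f"{hours:.0f} hours, ${fee:,.2f}")  # 150 hours, $4,500.00

On those assumptions, a record the size of the one in Khan would attract roughly 150 hours of severance work and about $4,500 in preparation fees, which puts the question of a premium for handwritten or highly sensitive records in context.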

Dempsey v. Pagefreezer Software Inc., 2024 NSCA 76 (CanLII).

Online proctoring report a must read for Ontario institutions

Online proctoring software was critical to higher education institutions during the heart of the pandemic. Though less significant today, the report of findings issued by the Information and Privacy Commissioner/Ontario last week about McMaster University’s use of online proctoring is an important read for Ontario public sector institutions – with relevant guidance on IT contracting, the use of generative AI tools and even the public sector necessity test itself.

The necessity test

To be lawful, the collection of personal information by Ontario public sector institutions must be “necessary to the proper administration of a lawfully authorized activity.” The Court of Appeal for Ontario adopted the IPC’s interpretation of the test in Cash Converters in 2007. The test is strict: it requires justification for the collection of each data element, and it requires an institution to establish that a collection is more than “merely helpful.”

The strictness of the test leaves one to wonder whether institutions’ business judgment carries any weight. This is a particular concern for universities, whose judgment in academic matters has been given special deference by courts and administrative decision-makers and is protected by a FIPPA exclusion that carves teaching and research records out of the scope of the Act. It does not appear that McMaster argued that the teaching and research records exclusion limited the IPC’s jurisdiction to scrutinize its use of online proctoring, but McMaster did argue that it “retains complete autonomy, authority, and discretion to employ proctored online exams, prioritizing administrative efficiency and commercial viability, irrespective of necessity.”

The IPC rejected this argument, but applied a form of deference nonetheless. Specifically, the IPC did not question whether the University’s use of online proctoring was necessary. It held that the University’s decision to employ online proctoring was lawfully authorized, and only considered whether the University’s online proctoring tool collected personal information that was necessary for the University to employ online proctoring.

This deferential approach to the Ontario necessity test is not self-evident, though it is the same point that the University of Western Ontario prevailed on in 2022 in successfully defeating a challenge to its vaccination policy. In Hawke v Western University, the Court declined to scrutinize the necessity of the University’s vaccination policy itself; the only questions invited by FIPPA were (a) whether the University’s chosen policy was a lawful exercise of its authority, and (b) whether the collection of vaccination status information to enforce the chosen and lawful policy was necessary.

To summarize, the authority now makes clear that Ontario institutions get to set their own “policy” within the scope of their legal mandates, even if the policy invites the collection of personal information. The necessity of the collection is then measured against the purposes of the chosen lawful policy.

IT contracting

It is common for IT service providers to reserve a right to use the information they process in providing services to institutions. Institutions should appreciate whether the right reserved is a right to use aggregate or de-identified information, or a right to use personal information.

The relevant term of use in McMaster’s case was as follows:

Random samples of video and/or audio recordings may be collected via Respondus Monitor and used by Respondus to improve the Respondus Monitor capabilities for institutions and students. The recordings may be shared with researchers under contract with Respondus to assist in such research. The researchers are consultants or contractors to Respondus and are under written obligation to maintain the video and/or audio recordings in confidence and under terms at least as strict as these Terms. The written agreements with the researchers also expressly limit their access and use of the data to work being done for Respondus and the researchers do not have the right to use the data for any other purposes. No personally identifiable information for students is provided with the video and/or audio recordings to researchers, such as the student’s name, course name, institution, grades, or student identification photos submitted as part of the Respondus Monitor exam session.

Despite the (dubious) last sentence of this text, the IPC held that this contemplated use of test taker personal information was for a secondary purpose that was not a “consistent purpose.” It was therefore not authorized by FIPPA.

In recommending that the University secure a written undertaking from the service provider that it would cease to use student personal information for system improvement purposes without consent, the IPC carefully noted that the service provider had published information that indicated it refrains from this use in certain jurisdictions.

In addition to this finding and a number of related findings about the use of test taker personal information for the vendor’s secondary purposes, the IPC held:

  • the vendor contract was deficient because it did not require the vendor to notify the University in the event that it is required to disclose a test taker’s personal data to authorities; and
  • the University should contractually require the vendor to delete audio and video recordings from its servers on, at minimum, an annual basis and to provide confirmation of this deletion.

The McMaster case adds to the body of IPC guidance on data protection terms. The IPC appears to be accepting of vendor de-identification rights, but not of vendor rights to use personal information.

Generative AI

While the IPC recognized that Ontario does not have law or binding policy specifically governing the use of artificial intelligence in the public sector, it nonetheless recommended that the University build in “guardrails” to protect its students from the risks of AI-enabled proctoring software. Specifically, the IPC recommended that the University:

  • conduct an algorithmic impact assessment and scrutinize the source or provenance of the data used to train the vendor’s algorithms;
  • engage and consult with affected parties (including those from vulnerable or historically marginalized groups) and those with relevant expertise;
  • provide an opt out as a matter of accommodating students with disabilities and “students having serious apprehensions about the AI-enabled software and the significant impacts it can have on them and their personal information”;
  • reinforce human oversight of outcomes by formalizing and communicating about an informal process for challenging outcomes (separate and apart from formal academic appeal processes);
  • conduct greater scrutiny over how the vendor’s software was developed to ensure that any source data used to train its algorithms was obtained in compliance with Canadian laws and in keeping with Ontarians’ reasonable expectations; and
  • specifically prohibit the vendor from using students’ personal information for algorithmic training purposes without their consent.

The IPC’s approach suggests that it expects institutions to employ a higher level of due diligence in approaching AI-enabled tools given their inherent risks.

Privacy Complaint Report PI21-00001.

No Charter-protected expectation of privacy in vehicle operation data

On July 20th, the Court of Appeal for Saskatchewan held that an accused person who drove his pickup truck through a highway intersection and struck a semi-truck did not have a reasonable expectation of privacy that precluded the police from seizing a control module and its data from his vehicle before it was towed away.

The accident was horrible. There were six people in the truck with the accused, three of whom died, two of whom were children. The police charged the accused with dangerous driving and criminal negligence, and the prosecution relied on evidence retrieved from the wrecked pickup truck at the scene of the accident. Specifically, the police seized the truck’s Airbag Control Module (ACM) from under the driver’s seat. The ACM contained an Event Data Recorder (EDR) with data about the vehicle’s operation during the five seconds before impact, recorded in tenth-of-a-second intervals – specifically, speed, accelerator pedal position (% full), manifold pressure, service brake status (on/off), seatbelt pretensioner readings and airbag deployment readings.
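
To illustrate just how narrow this data set is, here is a rough sketch of the kind of record an EDR of this type captures in its pre-crash window. The field names and types are my own hypothetical illustration, not the actual Bosch CDR output format.

    # Illustrative model of a pre-crash EDR snapshot: roughly fifty samples covering
    # the five seconds before impact at tenth-of-a-second intervals. Field names and
    # types are hypothetical; they are not the actual Bosch CDR output format.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class EdrSample:
        t_before_impact_s: float        # e.g. -5.0 ... -0.1
        speed_kmh: float
        accelerator_pct: float          # accelerator pedal, % of full
        manifold_pressure_kpa: float
        service_brake_on: bool
        seatbelt_pretensioner_fired: bool
        airbag_deployed: bool

    @dataclass
    class PreCrashRecord:
        samples: List[EdrSample]        # no identity, location history or long-term habits

    record = PreCrashRecord(samples=[
        EdrSample(-0.2, 112.0, 85.0, 98.0, False, False, False),
        EdrSample(-0.1, 113.0, 0.0, 45.0, True, True, True),
    ])

Nothing in a record like this identifies the driver or reveals anything beyond a few seconds of vehicle operation, which is central to the informational privacy analysis discussed below.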

There are competing lines of Canadian jurisprudence regarding the warrantless seizure of on-board vehicle computers and their data. The leading Ontario case is Hamilton, an Ontario Superior Court of Justice case that recognizes a reasonable expectation of (informational) privacy. In Yogeswaran, though, the Ontario Superior Court of Justice held that the territorial privacy interest in one’s vehicle is enough to preclude police search and seizure without prior judicial authorization.

Conversely, in Fedan, the Court of Appeal for British Columbia held that one’s territorial privacy interest in their vehicle is extinguished when the vehicle is seized and that EDR data is not associated with a strong enough informational privacy interest to warrant Charter protection.

The Court of Appeal for Saskatchewan followed Fedan. It reasoned that the accused’s truck, being totally destroyed on the side of a public roadway, was in the total control of the police whether or not it had yet been formally seized under section 489(2) of the Criminal Code. It concluded:

…the claim to a territorial privacy interest by Mr. Major in that component of his vehicle is weak. While a warrant could have been obtained, that does not mean one was required. I find that the state of the vehicle, Mr. Major’s loss of control over it, the nature of the ACM as a mechanical safety component installed by the manufacturer, and the focused task by Cpl. Green in locating and removing only it, do not support the continued existence of an objectively reasonable territorial privacy interest at the point when the vehicle was entered

Regarding informational privacy, the Court made the point that not all digital evidence is equally sensitive or revealing of one’s “biographical core.” EDR data of the kind at issue is limited to data about the operation of a vehicle immediately before an accident, and provides no “longer-term information about the driving habits of the owner or operator of a vehicle.” The Court concluded:

After considering the two lines of cases regarding EDR data, I find myself in substantial agreement with the reasoning from Fedan for the characterization of the data stored in the EDR. As in Fedan, the data here “contained no intimate details of the driver’s biographical core, lifestyle or personal choices, or information that could be said to directly compromise his ‘dignity, integrity and autonomy’” (at para 82, quoting Plant at 293). It revealed no personal identifiers or details at all. It was not invasive of Mr. Major’s personal life. The anonymous driving data disclosed virtually nothing about the lifestyle or private decisions of the operator of the Dodge Ram pickup. It is hard to conceive that Mr. Major intended to keep his manner of driving private, given that the other occupants of the vehicle – which included an adult employee – and complete strangers, who were contemporaneously using the public roadways or adjacent to it, could readily observe him. His highly regulated driving behaviour was “exposed to the public” (Tessling at para 47), although not to the precise degree with which the limited EDR data, as interpreted by the Bosch CDR software, purports to do. While it is only a small point, I further observe that a police officer on traffic patrol would have been entitled to capture Mr. Major’s precise speed on their speed detection equipment without raising any privacy concerns.

R v Major, 2022 SKCA 80 (CanLII).

Recent cyber presentations

Teaching is the best way of learning for some, including me. Here are two recent cyber security presentations that may be of interest:

  • A presentation from last month on “the law of information” that I delivered to participants in the Osgoode PDP program on cyber security
  • Last week’s presentation for school boards – Critical Issues in School Board Cyber Security

If you have questions please get in touch!

When it happens, will you be ready? How to excel in handling your next cyber incident

I like speaking about incident response because there are so many important practical points to convey. Every so often I re-consolidate my thinking on the topic and do up a new slide deck. Here is one such deck from this week’s presentation at the Canadian Society of Association Executives Winter Summit. It includes an adjusted four-step description of the response process that I’m content with.

We’ve been having some team discussions over here about how incident response plans can be horribly over-built and unusable. I made the point in presenting this that one could take the four-step model set out in this deck, add a modest amount of “meat” to the process (starting with assigning responsibilities) and append some points on how specific scenarios might be handled, based on a simple discussion if not a bona fide tabletop exercise.

Preparing for a cyber incident isn’t and shouldn’t be hard, and simple guidance is often most useful for dealing with complex problems.

NSCA says no expectation of privacy in address information

On January 28th the Nova Scotia Court of Appeal dismissed a privacy breach allegation that was based on a municipality’s admitted disclosure of address information to a related service commission so the service commission could bill for certain statutorily mandated charges. The Court held there was no reasonable expectation of privacy in the information disclosed, reasoning as follows:

Mr. Banfield’s information was not confidential, secret or anonymous. Neither did it offer a glimpse into Mr. Banfield’s intimate, personal or sensitive activities. Nor did it involve the investigation of a potential offence. Rather, it enabled a regulated public utility to invoice Mr. Banfield with rates approved under statutory authority for a legally authorized service that, in fact, Mr. Banfield received.  

Banfield v. Nova Scotia (Utility and Review Board), 2020 NSCA 6 (CanLII).