Ontario (M)FIPPA institutions, file encryption, and breach notification – a hint

As most of you know, the Ontario IPC released four decisions in the summer relating to breach reporting and notification obligations under PHIPA and the CYFSA. One controversial finding (which is subject to a judicial review application) is that the encryption of files by ransomware actors constitutes both an unauthorized use and a loss of personal and personal health information. Given there is no risk-based threshold for reporting and notification in PHIPA, custodians and service providers must report and notify in respect of this particular kind of breach, even if the threat actors have not stolen or laid eyes on the information.

Leaving legal analysis aside, I’ll say that this is odd policy that has led to odd questions about who is affected by file encryption. Do we really care? Does this have any meaning to “affected” individuals?

The negative impact is that it threatens the clarity of communications about matters that institutions need to communicate clearly: "Yes, there's been a privacy breach, but the threat actor(s) didn't steal or view your information. And your information has been 'lost,' but not lost as in 'stolen.'" 🤦🏽‍♂️

One can honestly question whether there is any public good in this garble. The IPC has lobbied for cyber incident reporting, which this interpretation of PHIPA and the CYFSA effectively achieves. Cyber incident reporting should be brought in properly, through legislation, and without a notification obligation.

But how far does the finding extend?

The four decisions released in the summer left a question about how the encryption finding would apply to MFIPPA and FIPPA institutions, which are encouraged (but not yet legally required) to report and notify based on the "real risk of significant harm" standard. This standard will become a legal imperative when the relevant provisions of Bill 194 come into force.

On December 10, the IPC issued a privacy complaint report that addressed file encryption at an MFIPPA institution and (in qualified terms) held that notification was not required. Mr. Gayle explained:

As the affected personal information remains encrypted and the police’s investigation found no evidence of exfiltration, it is not clear whether the breach “poses a real risk of significant harm to [these individuals], taking into consideration the sensitivity of the information and whether it is likely to be misused”. As such, it is not clear whether the police should have given direct notice of the breach to affected individuals in accordance with the IPC’s Privacy Breach Guidelines.

However, I am mindful of the fact that the police provided some notice to the public about the extent of the ransomware attack, and of the investigative and remedial steps they took to address it. I am also mindful of the fact that the breach occurred more than three years ago.

For these reasons, I find that it would serve no useful purpose in recommending that the police renotify affected individuals of the breach in accordance with the IPC’s Privacy Breach Guidelines and, as a result, do not need to decide whether the breach in this case met the threshold of “real risk of significant harm to the individual”.

This is helpful guidance, and should allow MFIPPA and FIPPA institutions to respond to matters with the clearest possible communication.

Sault Ste. Marie Police Services Board (Re), 2024 CanLII 124986 (ON IPC).

Mandate letters decision applied to give full force to academic freedom exclusion in Alberta

The Supreme Court of Canada issued its "Mandate Letters" decision in February of this year. It was an obscure case for day-to-day freedom of information practice, addressing whether written mandates by a premier to their ministers are accessible to the public under freedom of information legislation. Mandate Letters was nonetheless significant for its re-framing of statutory purposes: access legislation does not just support transparency, but is meant to "strike a balance." In the very first line of her judgment, Justice Karakatsanis said:

Freedom of information (FOI) legislation strikes a balance between the public’s need to know and the confidentiality the executive requires to govern effectively. Both are crucial to the proper functioning of our democracy.

She then held that the Ontario IPC erred by failing to engage meaningfully with the legal and factual context underlying the cabinet confidences exemption in Ontario FIPPA.

On September 30, 2024, the Court of King’s Bench of Alberta applied Mandate Letters in finding that the Alberta OIPC erred in failing to adequately engage with the teaching and research records exclusion in Alberta FIPPA.

The request was for information pertaining to a complaint made by two University of Calgary law professors to the Canadian Judicial Council regarding Justice Robin Camp, who resigned from the bench in 2017 after the CJC recommended his removal for comments made in hearing a sexual assault case.

The OIPC construed the teaching and research records exclusion narrowly, expressly stating, "There is no indication in the Act that these categories are determined via balancing interests in disclosure versus academic freedom." One can plainly see the conflict between this statement and Mandate Letters.

Teaching records. The disputed teaching records included e-mail discussions among professors about what might be taught in a particular course. The Court held the OIPC erred in treating these records as within the Act on the basis that they do not themselves impart knowledge, skill or instruction. It said that the exclusion extends to all "materials arising from activities reasonably necessary to facilitate and/or related to the act of teaching."

Research records. The Court also held that the OIPC erred in constraining research to "systematic investigation," explaining:

Whatever the field, research is rarely a siloed activity. Breakthroughs and progress often occur in the crucible of conversation, contention and controversy. Accordingly, to encourage research and innovation, it may be necessary to protect discussions among academic colleagues. 

It further commented that the question is not about the quality or social utility of the research in question, nor does a link to "ideological precepts" diminish a claim to academic freedom – judgment on such matters being within the exclusive domain of the academy. The exclusion, however, does not extend to (pure) social activism:

Academics who personally involve themselves in social actions/causes do so with the advantage of time, resources, and status afforded to them by virtue of their affiliation with, and funding by, public institutions. It is appropriate, and in line with the fundamental purposes of freedom of information legislation, that their activities in this realm be subject to scrutiny and oversight.

These findings are at odds with the more constrained view of Ontario's teaching and research records exclusion taken by the Ontario IPC, though they are principled and therefore applicable outside of Alberta.

Note that this decision is about the substantive scope of the exclusion, and not a University’s entitlement to access teaching and research records. These are distinct issues per City of Ottawa. The Court noted, “The University of Calgary identified and categorized the records at issue as either teaching materials or research materials.”

Governors of the University of Calgary v Alberta Information and Privacy Commissioner, 2024 ABKB 522 (CanLII).

Court shields file path information from the public (and threat actors), addresses scope of solicitor-client privilege

On November 7th, the Newfoundland and Labrador Supreme Court issued an access to information decision with some notable points.

First, the Court held that a public body validly redacted file path information from a document set based on the security of a computer system exemption to the public right of access. The public body adduced good evidence that the paths could be used by threat actors to (a) randomly generate usernames amenable to brute forcing or similar attacks, (b) identify domain administrators, and (c) map the network, all creating a real and non-speculative risk of attack. The finding is based on the evidence, but there is nothing unique about the risk that the Court recognized.

Second, the Court affirmed a decision to apply the privilege exemption based on a solicitor-client privilege claim, despite a dispute between the public body and the Newfoundland Information and Privacy Commissioner about the scope of the so-called "continuum of communication." The Court held the following communications were within the protected continuum:

  • E-mail messages between non-lawyers that were subsequent to the direct giving and receiving of legal advice about "process and timing" (and further up the e-mail thread).
  • Drafts of documents known to be subject to editing by legal counsel and from which “an informed reader could readily infer what legal counsel had advised.”
  • Notes, questions and references in documents made by an individual who gave evidence that she received legal advice in relation to all the notes, questions and references.

This finding is as sound as it is protective in my view.

Newfoundland and Labrador (Treasury Board) v. Newfoundland and Labrador (Information and Privacy Commissioner), 2024 NLSC 147 (CanLII)

BC arbitrator finds privacy violation arises out of employer investigation

On October 31, British Columbia labour arbitrator Chris Sullivan awarded $30,000 to a union based on a finding that an employer unnecessarily investigated statements made by a union president in a video that the union claimed to be confidential. He based this award on breaches of the anti-union discrimination provision in the collective agreement, the union interference provision in the BC Labour Relations Code, and the BC Freedom of Information and Protection of Privacy Act.

The union posted the video on YouTube without password protection. The union president testified, “that he first attempted to use the private setting for posting videos to the website, but this proved difficult to use as he had to manually enter a great deal of information in order to utilize this setting.” He posted the video openly, but rendered it unsearchable, and posted a confidentiality warning on the YouTube account and embedded a confidentiality warning in the video. The latter warning stated, “[this] video content is considered confidential and intended solely for ATU members.”

A union member leaked the URL for the video to someone in management who did not wish to be identified, who in turn reported the video to another member of management, stating, “you should check this out, it goes against what you are trying to build at transit.” That manager used the URL to watch the video and make a copy, ultimately disciplining the president for what he said in the video (later settling for a without prejudice disciplinary withdrawal). When the union demanded the employer destroy its copy, the employer asserted that it had obtained the video from a union member and that it was searchable on YouTube, both proven to be incorrect.

The crux of Arbitrator Sullivan’s finding is that the employer had no basis for investigating. He said:

Mr. Henegar had received only the Post-it note, followed by a conversation, with a supervisor/manager of the Employer, who did not want their identity revealed. On its own terms, the Employer’s Harassment and Respectful Workplace Policy was not engaged against Mr. Neagu, as no formal complaint was ever made against him, nor was he provided with any details of a complaint including the identity of a complainant as is required by that Policy. Mr. Neagu’s comments as Local Union President in the YouTube Video did not warrant an Employer investigation on any reasonable basis.

The employer and union had agreed that the video contained the union president’s personal information, so it followed from the above finding that the employer had collected the video in breach of FIPPA given the collection was not “necessary.”

This was a debacle. If the employer had watched the video and stopped, I suspect it would have been found to be blameless. (Recall that it withdrew its disciplinary charge in a without prejudice settlement that had a plainly prejudicial impact on the outcome.) There were also too many other bad facts that bore upon the employer, including the fact that it did not (or felt it could not) disclose the identity of the management employee who raised the video as a concern, and the facts that showed its entire premise for proceeding with investigation and discipline was flawed. That is my reading of the facts, not that of Arbitrator Sullivan, who held that management's assertions were intentionally dishonest.

I don’t like this privacy finding for two reasons. First, having not seen the video, I question whether a speech from a union president to union members contains the president’s personal information. Second, Arbitrator Sullivan affirmed the president’s expectation of privacy despite the president’s election not to secure the video through the best means possible. As those who follow this blog know, I’m a fan of using the waiver/abandonment doctrine to incentivize good security practices and hold users accountable for bad security practices. That was not done in this case, though Arbitrator Sullivan’s affirmation was obiter.

The damages award is large for a privacy case, but it was driven by a finding that the employer engaged in a serious interference with union rights.

Corporation of The District of West Vancouver v Amalgamated Transit Union, Local 134, 2024 CanLII 124405 (BC LA)

Perspectives on anonymization report released

On December 18, Khaled El Emam, Anita Fineberg, Elizabeth Jonker and Lisa Pilgram published Perspectives of Canadian privacy regulators on anonymization practices and anonymization information: a qualitative study. It is based on input from all but one Canadian privacy regulator, and includes a great discussion of one of the most important policy issues in Canadian privacy law – What do we do about anonymization given the massive demand for artificial intelligence training data?

The authors stress a lack of precision and consistency in Canadian law. It is true that the fine parameters of Canadian privacy law are yet to be articulated, but the broad parameters of our policy are presently clear:

  • First, there must be authorization to de-identify personal information. The Canadian regulators the authors spoke with were mostly aligned against a consent requirement, though not without qualification. If there's no express authorization to de-identify without consent (as in Ontario PHIPA), one gets the impression that a regulator will not imply consent to de-identify data for all purposes and all manner of de-identification.
  • Second, custodians of personal information must be transparent. One regulator said, “I have no sympathy for the point of view that it’s better not to tell people so as not to create any noise. I do not believe that that’s an acceptable public policy stance.” So, if you’re going to sell a patient’s health data to a commercial entity, okay, but you better let patients know.
  • Third, the information must be de-identified in a manner that renders the re-identification risk very low in the context. Lots can be said about the risk threshold and the manner of de-identification, and lots will be said over the next while. The authors recommend that legislators adopt a "code of practice" model for establishing specific requirements for de-identification.

The above requirements can all be derived from existing legislation, as is illustrated well by PHIPA Decision 175 in Ontario, about a custodian's sale of anonymized personal health information. Notably, the IPC imposed a requirement on the disclosing custodian to govern the recipient entity by way of the data sale agreement, rooting its jurisdiction in the provision that requires safeguarding of personal health information in a custodian's control. One can question this root, though it is tied to re-identification risk and within jurisdiction in my view.

What’s not in current Canadian privacy legislation is any restriction on the purpose of de-identification, the identity of recipients, or the nature of the recipient’s secondary use. This is a BIG issue that is tied to data ethics. Should a health care provider ever be able to sell its data to an entity for commercial use? Should custodians be responsible for determining whether the secondary use is likely to harm individuals or groups – e.g., based on the application of algorithmic bias?

Bill C-27 (the PIPEDA replacement bill) permits the non-consensual disclosure of de-identified personal information to specific entities for a “socially beneficial purpose” – “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” Given C-27 looks fated to die, Alberta’s Bill 33 may lead the way, and if passed will restrict Alberta public bodies from disclosing “non-personal information” outside of government for any purpose other than “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” (leaving AI model developers wondering how far they can stretch the concept of “research”).

Both C-27 and Bill 33 impose a contracting requirement akin to that imposed by the IPC in Decision 175. Bill 33, for example, only permits disclosure outside of government if:

(ii) the head of the public body has approved conditions relating to the following: (A) security and confidentiality; (B) the prohibition of any actual or attempted re-identification of the non-personal data; (C) the prohibition of any subsequent use or disclosure of the non-personal data without the express authorization of the public body; (D) the destruction of the non-personal data at the earliest reasonable time after it has served its purpose under subclause (i), unless the public body has given the express authorization referred to in paragraph (C),

and

(iii) the person has signed an agreement to comply with the approved conditions, this Act, the regulations and any of the public body’s policies and procedures
relating to non-personal data.

Far be it from me to solve this complex policy problem, but here are my thoughts:

  • Let’s aim for express authorization to de-identify rather than continuing to rely on a warped concept of implied consent. Express authorization best promotes transparency and predictability.
  • I’m quite comfortable with a generally stated re-identification risk threshold, and wary of binding organizations to a detailed and inaccessible code of practice.
  • Any foray into establishing ethical or other requirements for “research” should respect academic freedom, and have an appropriate exclusion.
  • We need to eliminate downstream accountability for de-identified data of the kind that is invited by the Bill 33 provision quoted above. Custodians don’t have the practical ability to enforce these agreements, and the agreements will therefore invite huge potential liability. Statutes should bind recipients and immunize organizations who disclose de-identified information for a valid purpose from downstream liability.

Do have a read of the report, and keep thinking and talking about these important issues.

BC court rejects “mass surveillance” application

On December 16, the Supreme Court of British Columbia dismissed a Charter application that challenged police use of surveillance cameras to continuously record a public space in an attempt to deter further hate crimes.

Police use of surveillance cameras is attracting attention, primarily because of the ability to integrate surveillance technology with facial recognition and other similarly advanced technologies.

This case is about the use of video surveillance alone. The Vancouver Police Department parked a “public safety trailer” or “PST” on a street in Chinatown after a hate crime incident. It did so to deter further incidents and demonstrate to the community that it was taking action.

The PST had cameras mounted on a 10 metre pole. The cameras had the capacity to pan (360 degrees) and zoom (32x), but the VPD only used them to capture 40 square metres of public space outside the community centre that had been targeted with hateful graffiti. The applicant was a local resident who at first didn’t appreciate what the PST was, but then avoided walking near it to avoid “state surveillance.”

Those interested in privacy advocacy and litigation know that privacy is a concept that people value in wildly different ways. The spectre of this type of surveillance would be shocking to some. The applicant in the Ontario case R v Hoang (also unsuccessful in their challenge) described pole-mounted camera surveillance by police as follows:

A pole camera has a Big Brother undertone to it. Undertone that becomes the very melody when you consider the contemporary availability of ubiquitous wireless networks and increased availability of miniature devices at nominal costs as well as the massive digital storage media now available. All this means entire streets, neighborhoods, cities could be continuously recorded. Unlimited amounts of information about what its citizens are up to could be gathered by the state authorities. The pole camera is truly “the camel’s nose under the tent.”

What strikes me about the VPD case is how well the VPD did in mitigating the risk that the application judge would take this view – both by good advocacy and good privacy management. Here is some of the mitigating evidence that led the judge to find that the applicant had no reasonable expectation of privacy in the circumstances:

  • VPD adduced evidence of the hate crime itself, in detail. The crime involved egregiously racist anti-Asian graffiti.
  • VPD tied this evidence to the broader context, which showed “a troubling increase in the targeted crime against the Asian community.”
  • VPD adduced good evidence of privacy management, including evidence (a) that it configured PST software in consultation with the Office of the Privacy Commissioner of British Columbia, (b) that it generally minimizes the use of PSTs given their perceived privacy impact, and (c) that the entire chain of command was involved in the decision to implement the PST in Chinatown, based on a clearly articulated objective.
  • VPD adduced evidence demonstrating rationality and proportionality of its response to the hate crime – i.e. evidence of its other investigative efforts and interventions, including deploying more officers to Chinatown.

This evidence swayed the judge to view the entire endeavour favourably, even though the record was not perfectly in favour of the VPD. One PST malfunctioned for a period of time, for example, during which someone tagged the PST itself with graffiti. One could use evidence like this to cast the VPD as Keystone Cops, but the application judge found this problem of no great consequence; equipment malfunctions, and the VPD (acting rationally and aligned with its objectives), replaced malfunctioning PSTs more than once.

On all the above facts and others, the application judge found the applicant had no reasonable expectation of privacy. In my view, there were two factors that drove this outcome. First, the surveillance was conducted openly, so the applicant was able to avoid being surveilled by altering how she travelled through her neighbourhood – i.e., she continued to have control over her informational privacy. Second, the surveillance footage was never used by the VPD, or even intended to be used, given the VPD’s deterrence objective. The judge said:

Had the VPD used the PST for an investigation, it may have provided them with information with which to help identify a suspect using ordinary investigative techniques. However, there is no evidence that the VPD had any ability to identify pedestrians as they walked through the field of view of the PST. Nor was identification of law-abiding citizens what the police were “really after”

The case therefore stands for the proposition that “deterrence video surveillance” of public spaces does not invite a “search” under section 8 of the Charter. It may be alarming to some, especially given the prospect of AI-embedded facial recognition. Ironically, the alarmist picture of police surveillance trailers with powerful cameras on ten metre poles that could be connected to all sorts of matching technology supports the aim of deterrence. However, per Tessling, actual impact rather than “theoretical capabilities” determines the scope of section 8 rights.

Note that the judge also dismissed an allegation that the VPD breached section 7 of the Charter, finding that the choice between taking a “short detour” and being subject to video recording by the state does not impede a protected liberty interest.

Papenbrock-Ryan v Vancouver (City), 2024 BCSC 2288 (CanLII)

Notable features of the Alberta public sector privacy bill

Alberta has recently introduced Bill 33 – a public sector privacy “modernization” bill. Alberta has put significantly more thought into its modernization bill than Ontario, which introduced modest FIPPA reforms in a splashier and less substantive bill earlier this year. This means Bill 33 is significant because it is leading. Might it set the new public sector norm?

Here are Bill 33’s notable features:

  • Bill 33 will require public bodies to give pre-collection notice of an intent to input personal information into an “automated system to generate content or make decisions, recommendations or predictions.” “Automated system” is not defined, and it is unclear if this is meant to foster decision-making transparency or transparency about downstream data use.
  • Bill 33 will require breach notification and reporting based on the “real risk of significant harm” standard. Reports to the OIPC and the Minister responsible for the Act will be required. Requiring reports to the regulator and government is novel.
  • Bill 33 will prohibit the sale of personal information “in any circumstances or for any purpose.” “Sale” is not defined.
  • Bill 33 has an allowance for disclosing personal information if the disclosure would not constitute an unjustified invasion of personal privacy. This flexible allowance – which contemplates balancing interests – does not typically apply outside of the access request context.
  • Bill 33 has a prohibition on data matching to produce derived personal information about an identifiable individual. This matching will only be permitted for “research and analysis” and “planning, administering, delivering, managing, monitoring or evaluating a program or service” unless additional allowances are implemented by regulation. The Alberta OIPC has said that “research and analysis” should be defined, and that there should be transparency requirements for data matching.
  • Bill 33 will establish rules regarding de-identified or “non-personal data.” The rules will permit disclosure of non-personal data to another public body without restriction, but disclosures of non-personal data to others will be limited to specified purposes and subject to requirements that render downstream users accountable to the disclosing public body. Public bodies will also have a duty to secure non-personal data.
  • Bill 33 will require public bodies to establish and implement privacy management programs consisting of documented policies and procedures. It will also mandate privacy impact assessments in circumstances that will be prescribed, with submission to the OIPC also to be prescribed in some circumstances.

There is a long list of exceptions to the indirect collection prohibition in the Bill, but no exceptions that permit the collection of personal information for threat assessment purposes. Violence threat risk assessments have become a standard means by which educational institutions discharge their safety-related duties. “VTRAs” rest on an indirect collection of personal information that should be expressly authorized in any modernized public sector privacy statutes.

Saskatchewan IPC issues report on Edge Imaging incident

I’m working through a reading pile today, and will note briefly that the Saskatchewan IPC has issued a report about the Edge Imaging cyber incident from earlier this year, which affected a number of Ontario school boards.

It was an atypical incident. Edge Imaging used a subcontractor called Entourage Yearbooks to store and process school yearbook photos. A threat actor accessed an Entourage AWS server, downloaded and deleted photos, and held them for ransom. Edge ultimately reported to its school board/division clients that Entourage “reported that they secured the return of all the Canadian photo files from the threat actors, along with their commitment that the photo files have been deleted, and were not distributed.”

The Saskatchewan IPC report deals with whether the photos contained personal information, whether the affected school divisions met their duty to notify, whether the service providers investigated reasonably, and whether the affected school divisions took appropriate protective steps in light of the incident. It is very cursory. The matter is simply a reminder about outsourcing risks, which school boards need to manage. The Ontario IPC updated its guidance earlier this year – see Privacy and Access in Public Sector Contracting with Third Party Service Providers.

Edge Imaging (Re), 2024 CanLII 90510 (SK IPC).

BCCA sends notice issue back to BC OIPC

On September 25th, the Court of Appeal for British Columbia partially upheld Airbnb’s successful judicial review of a British Columbia OIPC decision that required the City of Vancouver to disclose short term rental addresses along with related information, but vacated the application judge’s order to notify over 20,000 affected individuals.

Background

The City licenses short term rentals. It publicly discloses license information, presumably to enable renter inquiries. However, the City stopped publishing host names and rental addresses with license information in 2018 based on credible reports of safety risks. Evidence of the safety risks was on the record before the OIPC – general evidence about “concerned vigilante activity” and harassment, evidence about a particular stalking episode in 2019, and evidence that raised a concern about enabling criminals to determine when renters were likely to be out of the country.

The OIPC nonetheless ordered the City to disclose:

  • License numbers of individuals;
  • Home addresses of all hosts (also principal residences given licensing requirements); and
  • License numbers associated with the home addresses.

It was common ground that the above information could be readily linked to hosts by using publicly available information, rendering the order upsetting to Airbnb’s means of protecting its hosts. Airbnb only discloses the general area of rentals on its platform, which allows hosts to screen renters before disclosing their address.

Supreme Court Decision

The application judge affirmed the OIPC’s dismissal of the City’s safety concern as a reasonable application of the Merck test, but held that the OIPC erred on two other grounds.

First, the Court held that the OIPC unreasonably held that home address information was contact information rather than personal information. It failed to consider the context in making a simplistic finding that home address information was “contact information” because the home address was used as a place of business. The disclosure of the home address information, in the context, had a significant privacy impact that the OIPC ought to have considered.

Second, the Court held that the OIPC erred in not giving notice to the affected hosts – who numbered at least 20,000 – and in not providing reasons for that failure. The Court said this was a breach of procedural fairness, a breach punctuated by the evidence of a stalking and harassment risk that the OIPC acknowledged but held did not meet the Merck threshold.

Appeal Court Decision

The Court of Appeal affirmed the lower court’s contact information finding. It also held that the matter of notice to third parties ought to have been raised before the OIPC in the first instance, and that the application judge ought not to have ordered notice to be given. It stressed the OIPC’s discretion, and said:

Relevant facts that may inform the analysis include the nature of the records in issue, the number of potentially affected third parties, the practical logistics of providing notice, whether there are alternative means of doing so, and potential institutional resource issues.

Analysis

Giving notice and an opportunity to make submissions to 20,000 affected individuals is no small matter. In this case, valid electronic contact information was likely available. However, even a 2% response rate would generate 400 submissions, each deserving of due consideration.

Many institutions, thinking practically, would simply deny access as a means of avoiding this burden and respecting affected party rights, bearing in mind that the Supreme Court of Canada cautioned in Merck that notice should be given prior to disclosure in all but “clear cases.” When an institution denies access to avoid a massive notification burden, that burden transfers to the relevant commissioner/adjudicator, and even recognizing “practical logistics” and “institutional resource issues,” I see no reason why the “clear cases” rule from Merck should not be the governing test.

The Office of the Information and Privacy Commissioner for British Columbia v. Airbnb Ireland UC, 2024 BCCA 333.

NSCA outlines the “law of redaction”

Exactly when should an entire document be withheld because redaction is not reasonable?

Freedom of information adjudicators have used the concept of “disconnected snippets” to delineate; if redaction would leave a reader with meaningless “disconnected snippets,” entire records can rightly be withheld.

The Nova Scotia Court of Appeal, on August 7th, applied similar logic in determining that a set of affidavits “could not be redacted without sacrificing their intelligibility and therefore the utility of public access.” It therefore held that the affidavits could be sealed in whole in compliance with the necessity component of the test from Sherman Estate.

Notably, the Court reviewed cases that establish a second basis for full record withholding – cost. In Patient X v College of Physicians and Surgeons of Nova Scotia, the Nova Scotia Supreme Court held that redacting a 120-page record would be too “painstaking and prone to error” given it included a significant number of handwritten notes. And in Khan v College of Physicians and Surgeons of Ontario, the Ontario Superior Court of Justice reached a similar finding given the record requiring redaction was almost 4,500 pages in length, requiring an error-prone hunt for (sensitive) patient information.

Back to freedom of information, where costs are passed through to requesters. In Ontario, the norm is to charge two minutes a page for redaction. Should a premium be chargeable for handwritten records or records that contain very sensitive information?

Dempsey v. Pagefreezer Software Inc., 2024 NSCA 76 (CanLII).