I had the honour of presenting on cybersecurity oversight today at the Association of Workers’ Compensation Boards of Canada annual Governance Summit. The theme ended up being about leadership and empowerment. I’d like board members to believe that the information security knowledge they require to meet their duties is well within their grasp and to feel a little excited about the learning process. Slides below FYI.
Manitoba Ombudsman Jill Perron has issued her report into Manitoba Families’ 2020 e-mail incident. The incident involved the inadvertent e-mailing of personal health information belonging to 8,900 children in receipt of disability services to approximately 100 external agencies and community advocates. It is such a common incident that it is worth outlining the Ombudsman’s incident response findings.
Manitoba Families meant to transfer the information to the Manitoba Advocate for Children and Youth to support a program review. It included information about services received. Some records included diagnoses.
Manitoba Families mistakenly blind copied the external agencies and advocates on an e-mail that included the information in an encrypted file and a follow-up e-mail that included the password to the file. It had made the same mistake about a week earlier. Several agencies alerted Manitoba Families to its error, and it began containment within a half hour.
The Ombudsman held that Manitoba Families’ containment effort was reasonable. She described it as follows.
Attempts at recalling the email began minutes later at 8:29 a.m. and continued at various intervals. Also, at 8:35 a.m., CDS sent an email to all unintended recipients noting in bold that they were incorrectly included on a confidential email from Children’s disAbility Services and requested immediate deletion of the email and any attachments. Follow up calls to the unintended recipients by CDS program staff began to occur that morning to request deletion of the emails and a list was created to track these calls and the outcomes. A communication outline was created for these calls which included a request to delete emails, a further request that emails be deleted from the deleted folder and that any emails that went to a junk email folder also be deleted…
In January 2021, we received additional written communication from the program stating that all agency service providers and advocates were contacted and verified deletion of the personal health information received in error. The log form created to track and monitor the name of the organization, the date and details of the contact was provided to our office.
The Ombudsman reached a similar finding regarding Manitoba Families’ notification effort, though she needed to recommend that Manitoba Families identify the agencies and advocates to affected individuals, which Manitoba Families agreed to do upon request.
What’s most significant – especially given class action proceedings have been commenced – is a point the Ombudsman made about evidence that Manitoba Families appears not to have gathered.
In addition to assuring families about the deletion of the email, additional information such as who viewed the email, if the attachment was opened and read, whether it was forwarded to anyone else or printed, whether it was stored in any other network drive or paper file or, conversely, that no records exist – can be helpful information to provide those affected by a privacy breach. It is best practice, therefore, to provide families with as much assurance as possible about the security of their child’s health information.
The question is, what is one to make of an arguable shortcoming in an incident response investigation? I say “arguable” because the probability of any of these actions occurring is very low in the unique circumstances of this incident, which involved trusted individuals receiving a password-protected and encrypted file. Manitoba Families ought to have collected this evidence: it was calling the e-mail recipients anyway, the evidence is helpful, and it was probably available for collection. If it did not do so, however, I believe it is perfectly acceptable for Manitoba Families to stand by the scope of a narrower investigation and put the plaintiff to proof.
On April 15th, the Court of Appeal for Manitoba held that an accused had no reasonable expectation of privacy in information that a rental car agency provided to the police without a warrant.
The police were investigating a fatal shooting. They knew the shooter was in a rental car that belonged to a specific agency. When the police asked, the agency identified the co-accused as the renter and the accused as an authorized driver. It also provided their cell phone numbers, driver’s licence numbers and credit card numbers.
The Supreme Court of Canada decision in Spencer dictates that the PIPEDA allowance for volunteering information to the police does not vitiate one’s expectation of privacy for the purpose of Charter analysis. The Court of Appeal acknowledged this, and as in Spencer, it also held that contract language allowing for the disclosure of personal information as “required or permitted by law” was “of no real assistance.”
However, the Court of Appeal distinguished Spencer on other grounds. Its decision turns on the following key factors:
- the rental agreement allowed the agency to share information with law enforcement “to take action regarding illegal activities or violations of terms of service”
- section 22 of the Manitoba Highway Traffic Act requires agencies to keep a registry of renters that is open to public inspection (even though the registry is to include “particulars of the [renter’s] drivers license”)
- the overall context – i.e., that driving is a highly regulated activity, with one’s identity as an operator of a vehicle being something that is widely known and ought to be widely knowable
Privacy advocates will take issue with the Court’s reliance on the rental agreement term, though the case does rest on two other significant factors, including a provision of Manitoba law that the accused did not challenge. On a quick look, I see that Saskatchewan has the same provision.
Last autumn, the Ontario government struck an expert panel of cyber advisors. Among other things, it gave the panel a mandate to “assess and identify common and sector-specific cyber security themes and challenges encountered by Broader Public Sector (BPS) agencies and service delivery partners in Ontario.”
The panel got to work quickly, and in late 2020 gathered feedback from panel members and BPS stakeholders to produce an interim report under the name of its Chair, Robert Wong. The interim report is as unsurprising as it is alarming, speaking to wide-ranging maturity levels derived from under-resourcing as well as failures of governance. It includes characterizations of well-understood governance challenges in the university, school board and health care sectors. On universities, for example, the Chair reports:
Even in institutions with relatively strong and mature corporate governance practices, there are still significant challenges to effectively manage cyber security risks that result from competing priorities and inconsistent application of oversight and policies. For example, funding in higher education comes from various sources and is allocated based on various criteria. Some university research groups that have successfully secured grants or private sponsorship dollars often have a sense of entitlement and feel that because it is their money, they get to call the shots and ignore cyber security concerns when they procure technology tools. Why don’t universities impose the same cyber security requirements on their researchers as they do on other faculty and staff?
Notably, the Chair says, “A regional-based shared-services model may be the only viable option for the smaller players to be able to afford and gain access to the limited availability of technical expertise in the marketplace.”
He also makes the following two interim recommendations, one to government and another to BPS entities themselves:
1. That the National Institute of Standards and Technology (NIST) Cybersecurity Framework be endorsed by the Government of Ontario for the Broader Public Sector’s cyber security practices. If an entity has already adopted a cyber security framework other than that of NIST, the expectation is that they map the framework they are using to the NIST framework to ensure alignment and consistency. Understanding that BPS entities vary in size and risk-profile, it is reasonable to expect that the breadth and depth to which the NIST Cybersecurity Framework is implemented will also vary accordingly, following a risk-based approach. To assist small- and medium-sized organizations in adopting and implementing the NIST framework, the Canadian Centre for Cyber Security’s “Baseline Cyber Security Controls for Small and Medium Organizations” is a useful guide that provides the fundamental requirements for an effective cyber security practice that aligns with the NIST framework.
2. That all BPS entities implement a Cyber Security Education and Awareness Training Program. The content of the training materials shall be maintained to ensure currency of information. New employees shall receive the training immediately after joining the company as part of their orientation program, and all existing employees shall receive refresher training on an annual basis, at a minimum. Information Technology and cyber security specialists shall receive regular cyber security technical training to ensure their skills are kept current. Specialized educational materials may be developed that would be appropriate for boards of directors, senior executives and any other key decision-makers. Effective management of cyber security risks requires the efforts and commitment of everyone and cannot simply be delegated to the cyber security professionals. A strong “tone-at-the-top” is a critical success factor to strengthen the cyber security resilience of BPS service delivery partners.
The panel is not a standard-setting entity, but the second recommendation does establish something toward which BPS entities now ought to strive. Of course, this raises the question of resourcing. Minister Lisa Thompson’s response to the interim report suggests that the government’s assistance will be indirect, via the Cyber Security Centre of Excellence’s learning portal.
We have just posted all the content for our BLG series “Privacy & Cyber Risks, Trends & Opportunities for Business.” See here for some very good content by our privacy and data security team.
Here is a direct link to our most recent webinar, which I delivered together with my partner Patrice Martin. It was very rewarding to work with and learn from Patrice, a very well-established technology industry and transactions lawyer.
Enjoy. Learn. Get in touch.
I like speaking about incident response because there are so many important practical points to convey. Every so often I re-consolidate my thinking on the topic and do up a new slide deck. Here is one such deck from this week’s presentation at Canadian Society of Association Executives Winter Summit. It includes an adjusted four step description of the response process that I’m content with.
We’ve been having some team discussions over here about how incident response plans can be horribly over-built and unusable. I made the point in presenting this that one could take the four step model set out in this deck, add a modest amount of “meat” to the process (starting with assigning responsibilities) and append some points on how specific scenarios might be handled based on simple discussion if not a bona fide tabletop exercise.
Preparing for a cyber incident isn’t and shouldn’t be hard, and simple guidance is often most useful for dealing with complex problems.
Here is a deck I just put together for the Osgoode Certificate in Privacy & Cybersecurity Law that gives a high-level perspective on the state of FOI, in particular given (a) the free flow of information that can eviscerate practical obscurity and (b) the serious cyber threat that’s facing our public institutions. As I said in the webinar itself, I’m so pleased that Osgoode PDP has integrated an FOI unit into its privacy and cyber program given it is such a driver of core “information law.”
For related content see this short paper, Threat Exchanges and Freedom of Information Legislation, 2019 CanLIIDocs 3716. And here’s a blog post from the archives with some good principled discussion that I refer to – Principles endorsed in Arar secrecy decision.
Hat tip to my good colleague Francois Joli-Coeur, who let our group know yesterday that the OIPC Alberta has issued a number of breach notification decisions about the Blackbaud incident, finding in each one that it gave rise to a “real risk of significant harm” that warrants notification and reporting under Alberta PIPA.
Blackbaud is a cloud service provider to organizations engaged in fundraising. It suffered a ransomware incident last spring in which hackers exfiltrated the personal information of donors and educational institution alumni. The true scope of the incident is unknown, but it is likely large, affecting millions of individuals across the globe.
Blackbaud issued notably strong communications that de-emphasized the risk of harm. Its position rested primarily on the payment of a ransom, assurances by the threat actors that they would delete all data in exchange for payment, and its ongoing dark web searches. Most affected institutions (Blackbaud clients) notified anyway.
On my count the OIPC issued seven breach notification decisions about the incident late last year, each time finding a “real risk.” In a decision involving an American college with donors or alumni in Alberta, the OIPC said:
In my view, a reasonable person would consider the likelihood of significant harm resulting from this incident is increased because the personal information was compromised due to a deliberate unauthorized intrusion by a cybercriminal. The Organization reported that the cybercriminal both accessed and stole the personal information at issue. The Organization can only assume that cybercriminal did not or will not misuse, disseminate or otherwise make available publicly the personal information at issue.
This is not surprising, but tells us how the OIPC feels about the assurance gained from paying a ransom to recover stolen data.
On December 9th, the Court of Appeal for British Columbia rejected a media challenge that alleged that the Court’s access policy violates section 2(b) of the Canadian Charter of Rights and Freedoms because it precludes wholly unfettered inspection of court records. The Court held that the Charter guarantees no such right, which would be inconsistent with a court’s responsibility for supervising the handling of its records.
The policy that the applicants challenged requires those seeking access to court records in criminal appeals to fill out a form. Based on the content of completed forms, the Registrar may refer the request to the Chief Justice, who may seek input from the parties. In practice, if parties who are consulted don’t agree that the Court should provide access, those seeking access must file a formal application for access.
The media brought its application after the Court denied administrative access to records (filed in an application for bail pending appeal) that involved the investigation of a police officer for sexual misconduct. The media argued that the policy reverses the burden of justification provided for by Dagenais/Mentuck.
Chief Justice Bauman disagreed, stating:
Unfettered public access to court records is not the promise of the open court principle. That access is subject to supervision by the court, in recognition of the need to protect social values of superordinate importance. Judges have the discretion to order restrictions on access, exercised within the boundaries set by the principles of the Charter.
There is also nothing unlawful, Chief Justice Bauman held, in requiring requesters to confront a matter of administration (which was not associated with any proven material delay) in order to relieve the parties to a proceeding from preemptively seeking a sealing order.
I’ve had a long-standing interest in threat assessment and its application by educational institutions in managing the risk of catastrophic physical violence, though it has been a good ten years since the major advances in Canadian institutional policy. Here is a pointer to a journal article about an apparent new United States trend – automated monitoring of online and social media posts for threat assessment purposes.
Author Amy B. Cyphert starts with an illustrative scenario that’s worth quoting in full:
In 2011, a seventeen-year-old named Mishka, angry that his friends had recently been jumped in a fight, penned a Facebook post full of violence, including saying that his high school was “asking for a [expletive] shooting, or something.” Friends saw the post and alerted school officials, who contacted the police. By the time psychologist Dr. John Van Dreal, who ran the Safety and Risk Management Program for Mishka’s Oregon public school system, arrived, Mishka was in handcuffs. Mishka and his classmates were lucky: their school system employed a risk management program, and Dr. Van Dreal was able to help talk with Mishka about what caused him to write the post. Realizing that Mishka had no intention of harming anyone, Dr. Van Dreal helped Mishka avoid being charged with a criminal offense. Dr. Van Dreal also arranged for him to attend a smaller school, where he found mentors, graduated on time, and is today a twenty-five-year-old working for a security firm.
Had Mishka’s story happened today, just eight short years later, it might have looked very different. First, instead of his friends noticing his troubled Facebook post and alerting his school, it might have been flagged by a machine learning algorithm developed by a software company that Mishka’s school paid tens of thousands of dollars per year. Although Mishka’s post was clearly alarming and made obvious mention of possible violence, a post flagged by the algorithm might be seemingly innocuous and yet still contain terms or features that the algorithm had determined are statistically correlated with a higher likelihood of violence. An alert would be sent to school officials, though the algorithm would not necessarily explain what features about the post triggered it. Dr. Van Dreal and the risk management program? They might have been cut in order to pay for the third-party monitoring conducted by the software company. A school official would be left to decide whether Mishka’s post warranted some form of school discipline, or even a referral to the authorities.
Cyphert raises good questions about the problem of bias associated with algorithmic identification and about the impact of monitoring and identification on student expression, privacy and equality rights.
My views are quite simple.
I set aside algorithmic bias as a fundamental concern because the baseline (traditional threat assessment) is not devoid of its own problems of bias; technology could, at least in theory, lead to more fair and accurate assessments.
My main concern, rather, is with efficacy. Nobody disputes that schools and higher education institutions should passively receive threat reports from community members. My questions: Has the accepted form of surveillance failed? What is the risk that passive surveillance will fail? How will it fail? To what degree? Does that risk call for a more aggressive, active monitoring solution? Is there an active monitoring solution that is likely to be effective, accounting for concerns about bias?
If active internet monitoring cannot be shown to be reasonably necessary, however serious the problem of catastrophic physical violence, I question whether it can be either legally justifiable or required in order to meet the standard of care. Canadian schools and institutions that adopt new threat surveillance technology because it may be of benefit, without asking the critical questions above, may invite a new standard of care with tenuous underpinnings.
Cyphert, Amy B. (2020) “Tinker-ing with Machine Learning: The Legality and Consequences of Online Surveillance of Students,” Nevada Law Journal: Vol. 20 : Iss. 2 , Article 4.
Available at: https://scholars.law.unlv.edu/nlj/vol20/iss2/4