Here’s a copy of a presentation I gave yesterday at the High Technology Crime Investigation Association virtual conference. It addresses the cyber security pressures on public bodies that arise out of access-to-information legislation, with a segment on how public sector incident response differs from incident response in the private sector.
The Supreme Court of the United Kingdom has decided an important vicarious liability case in favour of a company whose rogue employee stole payroll information and posted it online.
The company entrusted the employee with payroll data pertaining to over 120,000 of its employees to facilitate an audit. The employee – who was still aggrieved about some discipline the company had earlier imposed – passed the data to the auditor as instructed, but kept a copy and posted it online as part of a personal vendetta.
As in Canadian law, United Kingdom law deems employers to be responsible for the unauthorized wrongful acts of their employees if there is a “sufficient connection” between the wrongful act and the work that is authorized. The creation of “opportunity” to commit the wrong is a factor, and the analysis is to be conducted with a view to the policy implications, leading some to argue that data security concerns justify broadly imposed vicarious liability.
Nonetheless, the Court held that causation (or the creation of opportunity) was not enough to warrant this employer’s liability for its employee’s data theft. That is, the company’s provision of data to the employee caused the theft (and the public disclosure), but the employee was motivated by a desire to harm the employer and was “on a frolic of his own” that did not warrant employer liability.
Many of you know Dustin Rivers and Chris Lutz of the Public Service Information Community Connection, who run some of our major Canadian privacy conferences. Like the great entrepreneurs they are, Dustin and Chris have put together an online kids camp for delivery to COVID-sequestered kids from across the globe!
I volunteered as a camp instructor and just did this presentation. It was fun, and a great exercise to reduce the subject matter I deal with in a far different context to something that could be understood by six-to-ten-year-olds! Not only that, my son and I created the deck together – more learning.
Here’s the deck. Next time I’ll record!
I blogged about Arbitrator Surdykowski’s decision in Providence Health when it was released in 2011 for its ratio – employers are entitled to more than a bare medical certification when an employee is absent from work.
I had occasion to use the case in a matter I argued yesterday, and was pleasantly surprised to re-read what Arbitrator Surdykowski said about data security and the impossibility of “ensuring” data security. The union had made an argument for minimizing the collection of health information that rested on data security risk, to which Mr. Surdykowski replied:
I agree with the Union’s assertion that there is always a possibility that private and confidential medical information may be inadvertently released or used inappropriately. Try as they might, it is impossible for anyone to absolutely guarantee information security. All that anyone can do in that respect is the best they can. There is nothing before me that suggests the extent to which the inadvertent (or intentional) release or misuse of confidential information occurs, either generally or at the workplaces operated by this Employer. More specifically, there is no indication of how often it happens, if at all, or that best efforts are demonstrably “not good enough”.
In a perfect world, the security and proper use of confidential private medical (or other) information could and would be guaranteed. But to be perfect the world would have to be populated by perfect human beings.
This is a nice quote to bring forward in this blog, of course, because it’s always good to remind ourselves (and others) that the mere happening of a security incident doesn’t mean fault!
It’s a hard point to argue when hindsight bears heavily on a decision-maker, but it is indisputable. I once defended an employer on a charge that followed a rather serious industrial accident in which an employee at a truck dealership was run over by a tractor. The Court of Appeal held that the tractor wasn’t a “vehicle” for the purposes of the Occupational Health and Safety Act and entered an acquittal. In examining the context for this finding, Justice Cronk made the same point as Arbitrator Surdykowski:
That said, consideration of the protective purposes of the legislative scheme is not the only consideration when attempting to ascertain the scope of s. 56 of the Regulation. The Act seeks to achieve “a reasonable level of protection” (emphasis added) for workers in the workplace. For obvious reasons, neither the Act nor the Regulation mandate or seek to achieve the impossible — entirely risk-free work environments.
Every security incident is an opportunity to tell a story about pre-incident due diligence that highlights this basic truth. (If your defence rests on our horrendously vague privacy law you’re in trouble, I say.) It’s also reason to hold our tongues and not judge organizations who are victimized, at least before learning ALL the facts. Security incidents are complex. Data security is hard.
Hat tip to investigation firm Rubin Thomlinson for bringing an illustrative British Columbia arbitration decision to my attention. The remarkable April 2019 case involves an iPhone wiped by an employee’s wife mid-investigation!
The iPhone was owned by the employer, but was set up using the employee’s personal Apple ID. That is not uncommon, but the employer apparently did not use any mobile device management software. To enforce its rights, the employer relied solely on its mobile device (administrative) policy, which disclaimed all employee privacy rights and stipulated that all data on employer devices is employer-owned.
Problems arose after the employer received a complaint that the employee was watching his female colleagues. The complainants said the employee “might also be taking pictures” with his phone.
The employer met with the employee to investigate, and took custody of the phone. The employee gave the employer the PIN to unlock the phone, but then asked for the phone back because it contained personal information. The employer excluded the employee and proceeded to examine the phone, but did not finish its examination before the employee’s wife (whom the employee had phoned) remotely wiped the phone and refused to restore it with backup data.
The employer terminated the employee for watching the complainants (though not necessarily taking their pictures) and for insubordination.
The arbitrator held that the employer did not prove either voyeurism or insubordination. In doing so, he held that the employer had sufficient justification to search the phone but that it could not rely on its mobile device policy to justify excluding the employee from the examination process and demanding the recovery of the lost data. Somewhat charitably, the arbitrator held that the employee ought to be held “accountable for failing to make an adequate effort to encourage his wife to allow for recovery of the data” and reserved his decision on the appropriate penalty.
The employer took far too much comfort from its ownership of the device. Given the phone was enabled by the employee’s personal Apple ID, the employer was faced with all the awkwardness, compromise and risks of any BYOD arrangement. Those risks can be partially mitigated by the use of mobile device management software. Policy should also clearly authorize device searches that are to be conducted with a view to the (quite obvious) privacy interest at stake.
For Rubin Thomlinson’s more detailed summary of the case, please see here.
On November 27th, the Saskatchewan Information and Privacy Commissioner faulted the Saskatchewan Legal Aid Commission for failing to have and maintain a clean desk policy – i.e., a policy requiring files to be put away and locked overnight – given cleaning staff had unsupervised after-hours access to its office. The IPC relied on the Commission’s own policy, which encouraged but did not mandate clean desks. The matter came to the IPC’s attention after cleaning staff left two layers of doors open one night.
I finally got around to reading the @PrivacyPrivee report of findings on Loblaw’s manner of authenticating those eligible for a gift card. The most significant (or at least enlightening) thing about the report is that the OPC held that residential address, date of birth, telephone number and e-mail address were, together, “sensitive.” It did so in assessing the adequacy of the contractual measures Loblaw used in retaining a service provider for processing purposes. It said:
- The contract also provided guarantees of confidentiality and security of personal information, and included a list of specific safeguard requirements, such as: (i) implementing measures to protect against compromise of its systems, networks and data files; (ii) encryption of personal information in transit and at rest; (iii) maintaining technical safeguards through patches, etc.; (iv) logging and alerts to monitor systems access; (v) limiting access to those who need it; (vi) training and supervision of employees to ensure compliance with security requirements; (vii) detailed incident response and notification requirements; (viii) Loblaw’s pre-approval of any third parties to whom JND wishes to share personal information, as well as a requirement for JND to ensure contractual protections that are at a minimum equivalent to those provided for by its contract with Loblaw; and (ix) to submit to oversight, monitoring, and audit by Loblaw of the security measures in place.
- As outlined above, the additional ID’s requested by the Program Administrator were collected through a secure channel (if online) or by mail, verified and then destroyed.
- In our view, given the limited, albeit sensitive, information that was shared with the Program Administrator, as well as the limited purposes and duration for which that information would be used, Loblaw’s detailed contractual requirements were sufficient to ensure a level of protection that was comparable to that which would be required under the Act. Therefore, in our view, Loblaw did not contravene Principle 4.1.3 of Schedule 1 of the Act.
Residential address, date of birth, telephone number and e-mail address are a set of basic personal information. In analyzing the finding, one must recall the “contact information” that the Ontario Superior Court of Justice said was not “private” enough to found a class action claim in Broutzas.
Don’t be misled, though. The OPC made its finding because Loblaw was engaged in authentication, and collected a data set precisely geared to that purpose. The potential harm – identity theft – was therefore real, supporting a finding that the data set as a whole was sensitive. Context matters in privacy and data security. And organizations, guard carefully the data you use to identify your customers.
As imperfect a means of authentication as they are, “memorized secrets” like passwords, pass phrases and PINs are common, and indeed are the primary means of authentication for most computer systems. In June, the National Institute of Standards and Technology issued a new publication on digital identity management that, in part, recommends changes to a password policy that has become standard in many organizations – the policy requiring passwords with special characters.
Here is what the NIST says:
Memorized secrets SHALL be at least 8 characters in length if chosen by the subscriber. Memorized secrets chosen randomly by the CSP or verifier SHALL be at least 6 characters in length and may be entirely numeric. If the CSP or verifier disallows a chosen memorized secret based on its appearance on a blacklist of compromised values, the subscriber SHALL be required to choose a different memorized secret. No other complexity requirements for memorized secrets SHOULD be imposed.
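To make the quoted rules concrete, here is a minimal sketch of a validator that applies them – an illustration only, not NIST’s code, and the blacklist here is a hypothetical stand-in for a real compromised-password list:

```python
# Minimal sketch of the memorized-secret rules quoted above (NIST SP 800-63B).
# COMPROMISED is a hypothetical stand-in for a real blacklist of compromised values.

COMPROMISED = {"password", "12345678", "qwertyuiop"}  # illustrative only

def validate_memorized_secret(secret: str, chosen_by_subscriber: bool = True) -> bool:
    """Return True if the secret satisfies the quoted rules."""
    # SHALL: 8-character minimum if subscriber-chosen, 6 if randomly generated.
    min_len = 8 if chosen_by_subscriber else 6
    if len(secret) < min_len:
        return False
    # SHALL: reject secrets appearing on the compromised-value blacklist,
    # requiring the subscriber to choose a different secret.
    if secret.lower() in COMPROMISED:
        return False
    # SHOULD: no other complexity requirements (no special-character rules).
    return True
```

Note what is absent: no special-character, uppercase, or digit requirements, which is exactly the change from conventional policy the NIST recommends.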
The NIST believes that the complexity derived from special characters is of limited benefit to security, yet creates (well known) usability problems and promotes “counterproductive” user behaviour – writing passwords down or storing them electronically in plain text. It’s better, according to the NIST, to allow for long passwords (that may incorporate spaces) and use other protective measures such as password blacklists, secure hashed password storage and limits to the number of failed authentication attempts.
The NIST publication includes other related guidance, including a recommendation against routine password resetting.