Good quotes on the impossibility of “ensuring” security and achieving zero risk

I blogged about Arbitrator Surdykowski’s decision in Providence Health when it was released in 2011 for its ratio – employers are entitled to more than a bare medical certification when an employee is absent from work.

I had occasion to use the case in a matter I argued yesterday, and was pleasantly surprised to re-read what Arbitrator Surdykowski said about the impossibility of “ensuring” data security. The union had made an argument for minimizing the collection of health information that rested on data security risk, to which Mr. Surdykowski replied:

I agree with the Union’s assertion that there is always a possibility that private and confidential medical information may be inadvertently released or used inappropriately.  Try as they might, it is impossible for anyone to absolutely guarantee information security.  All that anyone can do in that respect is the best they can.  There is nothing before me that suggests the extent to which the inadvertent (or intentional) release or misuse of confidential information occurs, either generally or at the workplaces operated by this Employer.  More specifically, there is no indication of how often it happens, if at all, or that best efforts are demonstrably “not good enough”.

In a perfect world, the security and proper use of confidential private medical (or other) information could and would be guaranteed.  But to be perfect the world would have to be populated by perfect human beings.

This is a nice quote to bring forward in this blog, of course, because it’s always good to remind ourselves (and others) that the mere happening of a security incident doesn’t mean fault!

It’s a hard point to argue when hindsight bears heavily on a decision-maker, but it is indisputable. I once defended an employer in a charge that followed a rather serious industrial accident in which an employee at a truck dealership was run over by a tractor. The Court of Appeal held that the tractor wasn’t a “vehicle” for the purposes of the Occupational Health and Safety Act and entered an acquittal. In examining the context for this finding, Justice Cronk made the same point as Arbitrator Surdykowski:

That said, consideration of the protective purposes of the legislative scheme is not the only consideration when attempting to ascertain the scope of s. 56 of the Regulation. The Act seeks to achieve “a reasonable level of protection” (emphasis added) for workers in the workplace. For obvious reasons, neither the Act nor the Regulation mandate or seek to achieve the impossible — entirely risk-free work environments.

Every security incident is an opportunity to tell a story about pre-incident due diligence that highlights this basic truth. (If your defence rests on our horrendously vague privacy law you’re in trouble, I say.) It’s also reason to hold our tongues and not judge organizations who are victimized, at least before learning ALL the facts. Security incidents are complex. Data security is hard.

Organization stumbles into BYOD nightmare

Hat tip to investigation firm Rubin Thomlinson for bringing an illustrative British Columbia arbitration decision to my attention. The remarkable April 2019 case involves an iPhone wiped by an employee’s wife mid-investigation!

The iPhone was owned by the employer, but it was set up using the employee’s personal Apple ID. That is not uncommon, but the employer apparently did not use any mobile device management software. To enforce its rights, the employer relied solely on its mobile device (administrative) policy, which disclaimed all employee privacy rights and stipulated that all data on employer devices is employer-owned.

Problems arose after the employer received a complaint that the employee was watching his female colleagues. The complainants said the employee “might also be taking pictures” with his phone.

The employer met with the employee to investigate, and took custody of the phone. The employee gave the employer the PIN to unlock the phone, but then asked for the phone back because it contained personal information. The employer excluded the employee and proceeded to examine the phone, but did not finish its examination before the employee’s wife (whom the employee had phoned) remotely wiped the phone and refused to restore it with backup data.

The employer terminated the employee for watching the complainants (though not necessarily taking their pictures) and for insubordination.

The arbitrator held that the employer did not prove either voyeurism or insubordination. In doing so, he held that the employer had sufficient justification to search the phone but that it could not rely on its mobile device policy to justify excluding the employee from the examination process and demanding the recovery of the lost data. Somewhat charitably, the arbitrator held that the employee ought to be held “accountable for failing to make an adequate effort to encourage his wife to allow for recovery of the data” and reserved his decision on the appropriate penalty.

The employer took far too much comfort from its ownership of the device. Because the phone was set up with the employee’s personal Apple ID, the employer faced all the awkwardness, compromise and risk of any BYOD arrangement. Those risks can be partially mitigated by the use of mobile device management software. Policy should also clearly authorize device searches, which should be conducted with a view to the (quite obvious) privacy interest at stake.

District of Houston v Canadian Union of Public Employees, Local 2086, 2019 CanLII 104260 (BC LA).

For Rubin Thomlinson’s more detailed summary of the case, please see here.

Saskatchewan Commissioner recommends clean desk policy for lawyers

On November 27th, the Saskatchewan Information and Privacy Commissioner faulted the Saskatchewan Legal Aid Commission for failing to have and maintain a clean desk policy – i.e., a policy requiring files to be put away and locked overnight – given that cleaning staff had unsupervised after-hours access to its office. The IPC relied on the Commission’s own policy, which encouraged but did not mandate clean desks. The matter came to the IPC’s attention after cleaning staff left two layers of doors open one night.

Saskatchewan Legal Aid Commission (Re), 2019 CanLII 113284 (SK IPC).

What’s significant about the Loblaw report

I finally got around to reading the @PrivacyPrivee report of findings on Loblaw’s manner of authenticating those eligible for a gift card. The most significant (or at least enlightening) thing about the report is that the OPC held that residential address, date of birth, telephone number and e-mail address were, together, “sensitive.” It did so in assessing the adequacy of the contractual measures Loblaw used in retaining a service provider for processing purposes. It said:

  1. The contract also provided guarantees of confidentiality and security of personal information, and included a list of specific safeguard requirements, such as: (i) implementing measures to protect against compromise of its systems, networks and data files; (ii) encryption of personal information in transit and at rest; (iii) maintaining technical safeguards through patches, etc.; (iv) logging and alerts to monitor systems access; (v) limiting access to those who need it; (vi) training and supervision of employees to ensure compliance with security requirements; (vii) detailed incident response and notification requirements; (viii) Loblaw’s pre-approval of any third parties to whom JND wishes to share personal information, as well as a requirement for JND to ensure contractual protections that are at a minimum equivalent to those provided for by its contract with Loblaw; and (ix) to submit to oversight, monitoring, and audit by Loblaw of the security measures in place.
  2. As outlined above, the additional ID’s requested by the Program Administrator were collected through a secure channel (if online) or by mail, verified and then destroyed.
  3. In our view, given the limited, albeit sensitive, information that was shared with the Program Administrator, as well as the limited purposes and duration for which that information would be used, Loblaw’s detailed contractual requirements were sufficient to ensure a level of protection that was comparable to that which would be required under the Act. Therefore, in our view, Loblaw did not contravene Principle 4.1.3 of Schedule 1 of the Act.

Residential address, date of birth, telephone number and e-mail address are a set of basic personal information. In assessing the finding, recall the “contact information” that the Ontario Superior Court of Justice said, in Broutzas, was not “private” enough to found a class action claim.

Don’t be misled, though. The OPC made its finding because Loblaw was engaged in authentication, and collected a data set precisely geared to that purpose. The potential harm – identity theft – was therefore real, supporting a finding that the data set as a whole was sensitive. Context matters in privacy and data security. And organizations, guard carefully the data you use to identify your customers.

NIST’s recommended password policy evolves

As imperfect a means of authentication as they are, “memorized secrets” like passwords, pass phrases and PINs are common, and indeed are the primary means of authentication for most computer systems. In June, the National Institute of Standards and Technology issued a new publication on digital identity management that, in part, recommends changes to password policy that has become standard in many organizations – policy requiring passwords with special characters.

Here is what the NIST says:

Memorized secrets SHALL be at least 8 characters in length if chosen by the subscriber. Memorized secrets chosen randomly by the CSP or verifier SHALL be at least 6 characters in length and MAY be entirely numeric. If the CSP or verifier disallows a chosen memorized secret based on its appearance on a blacklist of compromised values, the subscriber SHALL be required to choose a different memorized secret. No other complexity requirements for memorized secrets SHOULD be imposed.

The NIST believes that the complexity derived from special characters is of limited benefit to security, yet creates (well-known) usability problems and promotes “counterproductive” user behaviour – writing passwords down or storing them electronically in plain text. It’s better, according to the NIST, to allow for long passwords (that may incorporate spaces) and use other protective measures such as password blacklists, secure hashed password storage and limits on the number of failed authentication attempts.
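
NIST does not prescribe code, but its guidance translates into a short acceptance check. Here is a minimal sketch in Python, on my own assumptions (the function name, constant and in-memory blocklist are illustrative only, not anything from the publication): enforce a length floor, reject known-compromised values, and impose no special-character rules and no bar on spaces. Hashed storage and throttling of failed attempts would live elsewhere in the system.

```python
# Minimal sketch of a memorized-secret acceptance check in the spirit of
# NIST SP 800-63B. Names and the sample blocklist are illustrative only.

MIN_SUBSCRIBER_CHOSEN_LENGTH = 8  # at least 8 characters if chosen by the subscriber


def is_acceptable(password: str, blocklist: set[str]) -> tuple[bool, str]:
    """Check length and blocklist membership; no composition rules, spaces allowed."""
    if len(password) < MIN_SUBSCRIBER_CHOSEN_LENGTH:
        return False, "Choose a password of at least 8 characters."
    if password in blocklist:
        return False, "That password appears on a list of compromised values; choose another."
    return True, "OK"


if __name__ == "__main__":
    # Stand-in for a real breached-password list loaded from a file or service.
    blocklist = {"password", "12345678", "qwertyuiop"}

    print(is_acceptable("correct horse battery staple", blocklist))  # long passphrase with spaces passes
    print(is_acceptable("12345678", blocklist))                      # meets the length floor but is blocklisted
    print(is_acceptable("P@ss1!", blocklist))                        # special characters don't rescue a short password
```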

The NIST publication includes other related guidance, including a recommendation against routine password resetting.

NIST Special Publication 800-63B – Digital Identity Guidelines (June 2017)