The Five Whys, the discomfort of root cause analysis and the discipline of incident response

Here is a non-law post to pass on some ideas about root cause analysis, The Five Whys, and incident response.

This is inspired by having finished reading The Lean Startup by Eric Ries. It’s a good book end-to-end, but Ries’ chapter on adaptive organizations and The Five Whys was most interesting to me – inspiring even!

The Five Whys is a well-known analytical tool that supports root cause analysis. Taiichi Ohno, the father of the Toyota Production System, described it as “the basis of Toyota’s scientific approach.” By asking why a problem has occurred five times – thereby probing five causes deep – Ohno says, “the nature of the problem as well as its solution becomes clear.” Pushing to the deeper causes of a failure is plainly important; if only the surface causes are addressed, the failure is nearly certain to recur.

Ries, in a book geared to startups, explains how to use The Five Whys as an “automatic speed regulator” in businesses that face failures in driving rapidly to market. The outcome of The Five Whys process, according to Ries, is a “proportional” investment in corrections at each of the five layers of the causal analysis – proportional to the significance of the problem.
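
To illustrate, here is a minimal sketch in Python (my own hypothetical example, loosely patterned on the kind Ries describes, not taken from the book) of what a Five Whys record might look like, pairing each successively deeper cause with a proportional corrective investment:

```python
# A hypothetical Five Whys record for a failed software release, pairing
# each "why" with a corrective investment that scales with the depth and
# significance of the cause.
five_whys = [
    ("Why did customers lose a feature?", "A new release broke it.",
     "Roll back the release."),
    ("Why did the release break it?", "A server was configured incorrectly.",
     "Fix the configuration."),
    ("Why was it configured incorrectly?", "An engineer misused a subsystem.",
     "Document the subsystem."),
    ("Why did the engineer misuse it?", "He was never trained on it.",
     "Train the engineer."),
    ("Why was he never trained?", "There is no onboarding program.",
     "Stand up a lightweight onboarding program."),
]

for depth, (why, cause, fix) in enumerate(five_whys, start=1):
    print(f"{depth}. {why}\n   Cause: {cause}\n   Proportional fix: {fix}")
```

Note how the fixes become more systemic – and demand more investment – as the questioning goes deeper; stopping at the first why would yield only a rollback.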

Of course, root cause analysis is part of security incident response. The National Institute of Standards and Technology suggests that taking steps to prevent recurrences is part of both the eradication and recovery phase and the post-incident phase. My own experience is that root cause analysis in incident response is often done poorly – with remedial measures almost always targeted at surface-level causes. What I did not understand until reading Ries is that conducting the kind of good root cause analysis associated with The Five Whys is HARD.

Ries explains that conducting root cause analysis without a strong culture of mutual trust can devolve into The Five Blames. He gives some good tips on how to implement The Five Whys despite this challenge: establishing norms around accepting the first mistake, starting with less than the full analytical process, and appointing a “master” from the executive ranks to sponsor root cause analysis.

From my perspective, I’ll now expect a little less insight out of clients who are in the heat of crises. It may be okay to go a couple levels deep while an incident is still live and while some process owners are not even apprised of the incident – just deep enough to find some meaningful resolutions to communicate to regulators and other stakeholders. It may be okay to tell these stakeholders “we will [also] look into our processes and make appropriate improvements to prevent a recurrence” – text frequently proposed by clients for notification letters and reports.

What clients should do, however, is commit to conducting good root cause analysis as part of the post-incident phase:

* Write The Five Whys into your incident response policy.

* Stipulate that a meeting will be held.

* Stipulate that everyone with a share of the problem will be invited.

* Commit to making a proportional investment to address each identified cause.

Ries would lead us to believe that this will be both unenjoyable and invaluable – good reason to use your incident response policy to help it become part of your organization’s discipline.

Cyber defence basics – Maritime Connections

I was pleased to do a cyber defence basics presentation to privacy professionals attending the Public Service Information Community Connection “Maritime Connections” event yesterday. The presentation (below) is based on recent publications by the New York Department of Financial Services and the Information Commissioner’s Office (UK), as well as the (significant) Coveware Q3 ransomware report.

As I said to the attendees, I am not a technical expert and no substitute for one, but those of us outside of IT and IT security who work in this space (along with the predominantly non-technical management teams we serve) must engage with the key technical concepts underpinning IT security if we are to succeed at cyber defence.

I’ll do an updated version at Saskatchewan Connections next week. Join us!

The twelve security failures underscoring the ICO’s recent £500,000 fine

On February 10th the Information Commissioner’s Office fined Cathay Pacific £500,000 for breaching the security principle established under the UK Data Protection Act. Here are the twelve security failures that were the basis of the finding (with underlined text in the ICO’s words plus my annotation):

    • The database backups were not encrypted. The ICO said this was a departure from company policy undertaken due to a data migration project, but a company approval and risk mitigation requirement was apparently not followed.
    • The internet-facing server was accessible due to a known and publicized vulnerability. The Common Vulnerabilities and Exposures website listed the vulnerability approximately seven years before it was exploited, said the ICO.
    • The administrator console was publicly accessible via the internet. This was done to facilitate vendor access, without a risk assessment according to the ICO. The ICO said the company ought to have used a VPN to enable vendor access.
    • System A was hosted on an operating system that was (and is) no longer supported. The ICO noted that the company neither replaced the system nor purchased extended support.
    • Cathay Pacific could not provide evidence of adequate server hardening.
    • Network users were permitted to authenticate past the VPN without multi-factor authentication. The ICO noted that this allowed the attackers to misuse stolen credentials (pertaining to a 41,000-user base).
    • The anti-virus protection was inadequate. This was apparently due to operating system compatibility problems (on an operating system other than the legacy system on System A).
    • Patch management was inadequate. Logs were missing on some systems, the ICO said. It also noted that one server was missing 16 updates that resolved publicly known vulnerabilities, 12 of which were described as “easily exploitable.”
    • Forensic evidence was no longer available during the Commissioner’s investigation. The ICO said that server images analyzed in the post-incident investigation were not retained and provided to the ICO.
    • Accounts were given inappropriate privileges. “Day-to-day” user accounts were given administrator privileges according to the ICO.
    • Penetration testing was inadequate. The ICO said three years without penetration testing was inadequate given the quantity and nature of the information at issue, which included passport numbers.
    • Retention periods were too long. It appears (though it is not clear) that transaction data was preserved indefinitely and that user data was purged only after seven years of inactivity.

£500,000 is the maximum fine. The ICO said it was warranted, in part, because the failures related to “fundamental principles.” The failure to retain evidence was another notable factor.

Good quotes on the impossibility of “ensuring” security and achieving zero risk

I blogged about Arbitrator Surdykowski’s decision in Providence Health when it was released in 2011 for its ratio – employers are entitled to more than a bare medical certification when an employee is absent from work.

I had occasion to use the case in a matter I argued yesterday, and was pleasantly surprised to re-read what Arbitrator Surdykowski said about the impossibility of “ensuring” data security. The union had made an argument for minimizing the collection of health information that rested on data security risk, to which Mr. Surdykowski replied:

I agree with the Union’s assertion that there is always a possibility that private and confidential medical information may be inadvertently released or used inappropriately.  Try as they might, it is impossible for anyone to absolutely guarantee information security.  All that anyone can do in that respect is the best they can.  There is nothing before me that suggests the extent to which the inadvertent (or intentional) release or misuse of confidential information occurs, either generally or at the workplaces operated by this Employer.  More specifically, there is no indication of how often it happens, if at all, or that best efforts are demonstrably “not good enough”.

In a perfect world, the security and proper use of confidential private medical (or other) information could and would be guaranteed.  But to be perfect the world would have to be populated by perfect human beings.

This is a nice quote to bring forward in this blog, of course, because it’s always good to remind ourselves (and others) that the mere happening of a security incident doesn’t mean fault!

It’s a hard point to argue when hindsight bears heavily on a decision-maker, but it is indisputable. I once defended an employer in a charge that followed a rather serious industrial accident in which an employee at a truck dealership was run over by a tractor. The Court of Appeal held that the tractor wasn’t a “vehicle” for the purposes of the Occupational Health and Safety Act and entered an acquittal. In examining the context for this finding, Justice Cronk made the same point as Arbitrator Surdykowski:

That said, consideration of the protective purposes of the legislative scheme is not the only consideration when attempting to ascertain the scope of s. 56 of the Regulation. The Act seeks to achieve “a reasonable level of protection” (emphasis added) for workers in the workplace. For obvious reasons, neither the Act nor the Regulation mandate or seek to achieve the impossible — entirely risk-free work environments.

Every security incident is an opportunity to tell a story about pre-incident due diligence that highlights this basic truth. (If your defence rests on our horrendously vague privacy law, you’re in trouble, I say.) It’s also reason to hold our tongues and not judge organizations that are victimized, at least before learning ALL the facts. Security incidents are complex. Data security is hard.

Organization stumbles into BYOD nightmare

Hat tip to investigation firm Rubin Thomlinson for bringing an illustrative British Columbia arbitration decision to my attention. The remarkable April 2019 case involves an iPhone wiped by an employee’s wife mid-investigation!

The iPhone was owned by the employer, but it was set up using the employee’s personal Apple ID. That is not uncommon, but the employer apparently did not use any mobile device management software. To enforce its rights, the employer relied solely on its mobile device (administrative) policy, which disclaimed all employee privacy rights and stipulated that all data on employer devices is employer-owned.

Problems arose after the employer received a complaint that the employee was watching his female colleagues. The complainants said the employee “might also be taking pictures” with his phone.

The employer met with the employee to investigate, and took custody of the phone. The employee gave the employer the PIN to unlock the phone, but then asked for the phone back because it contained personal information. The employer excluded the employee and proceeded to examine the phone, but did not finish its examination before the employee’s wife (whom the employee had phoned) remotely wiped the phone and refused to restore it with backup data.

The employer terminated the employee for watching the complainants (though not necessarily taking their pictures) and for insubordination.

The arbitrator held that the employer did not prove either voyeurism or insubordination. In doing so, he held that the employer had sufficient justification to search the phone but that it could not rely on its mobile device policy to justify excluding the employee from the examination process and demanding the recovery of the lost data. Somewhat charitably, the arbitrator held that the employee ought to be held “accountable for failing to make an adequate effort to encourage his wife to allow for recovery of the data” and reserved his decision on the appropriate penalty.

The employer took far too much comfort from its ownership of the device. Given the phone was enabled by the employee’s personal Apple ID, the employer was faced with all the awkwardness, compromise and risks of any BYOD arrangement. Those risks can be partially mitigated by the use of mobile device management software. Policy should also clearly authorize device searches that are to be conducted with a view to the (quite obvious) privacy interest at stake.

District of Houston v Canadian Union of Public Employees, Local 2086, 2019 CanLII 104260 (BC LA).

For Rubin Thomlinson’s more detailed summary of the case, please see here.


Saskatchewan Commissioner recommends clean desk policy for lawyers

On November 27th, the Saskatchewan Information and Privacy Commissioner faulted the Saskatchewan Legal Aid Commission for failing to have and maintain a clean desk policy – i.e., a policy requiring files to be put away and locked overnight – given cleaning staff had unsupervised after-hours access to its office. The IPC relied on the Commission’s own policy, which encouraged but did not mandate clean desks. The matter came to the IPC’s attention after cleaning staff left two layers of doors open one night.

Saskatchewan Legal Aid Commission (Re), 2019 CanLII 113284 (SK IPC).

Legal Privilege and Data Security Incident Response – Law and Practice

I’m off to a cyber conference in Montreal this week to sit on a panel about threat exchanges. My role will be to address the legal risks associated with sharing threat information and a university’s ability to effectively assert a confidentiality interest in the same information. I’m genuinely interested in the topic and have prepared not just one, but two papers!

Here is the first one – a nuts and bolts presentation on privilege and data security incident response. I hope it is useful to you. Feedback welcome through PMs.

NIST’s recommended password policy evolves

As imperfect a means of authentication as they are, “memorized secrets” like passwords, pass phrases and PINs are common, and indeed are the primary means of authentication for most computer systems. In June, the National Institute of Standards and Technology issued a new publication on digital identity management that, in part, recommends changes to password policy that has become standard in many organizations – policy requiring passwords with special characters.

Here is what the NIST says:

Memorized secrets SHALL be at least 8 characters in length if chosen by the subscriber. Memorized secrets chosen randomly by the CSP or verifier SHALL be at least 6 characters in length and MAY be entirely numeric. If the CSP or verifier disallows a chosen memorized secret based on its appearance on a blacklist of compromised values, the subscriber SHALL be required to choose a different memorized secret. No other complexity requirements for memorized secrets SHOULD be imposed.

The NIST believes that the complexity derived from special characters is of limited benefit to security, yet creates (well-known) usability problems and promotes “counterproductive” user behaviour – writing passwords down or storing them electronically in plain text. It’s better, according to the NIST, to allow for long passwords (that may incorporate spaces) and use other protective measures such as password blacklists, secure hashed password storage and limits on the number of failed authentication attempts.
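
To make the guidance concrete, here is a minimal sketch in Python (my own illustration, not code from the NIST publication) of a verifier following this approach: check length, screen against a blacklist of compromised values, impose no special-character rules, and store only a salted, slow hash:

```python
import hashlib
import os

# Hypothetical blacklist; a real verifier would screen against a large
# corpus of breached passwords.
BLACKLIST = {"password", "12345678", "qwertyuiop"}

def is_acceptable(secret: str) -> bool:
    """NIST SP 800-63B-style checks for a subscriber-chosen secret."""
    if len(secret) < 8:               # SHALL be at least 8 characters
        return False
    if secret.lower() in BLACKLIST:   # reject known-compromised values
        return False
    return True                       # no other complexity requirements

def hash_for_storage(secret: str) -> tuple[bytes, bytes]:
    """Store a salted, iterated hash – never the plain text."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 600_000)
    return salt, digest

# Spaces are allowed, so long pass phrases work:
assert is_acceptable("correct horse battery staple")
assert not is_acceptable("P@ss1")  # too short, despite special characters
```

The length and blacklist checks do the work that composition rules were supposed to do, without pushing users toward sticky notes.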

The NIST publication includes other related guidance, including a recommendation against routine password resetting.

NIST Special Publication 800-63B – Digital Identity Guidelines (June 2017)

Saskatchewan health authority criticized for slow incident response

Good incident response involves nailing your timing – not going too fast or too slow. 

On August 17th the Saskatchewan Information and Privacy Commissioner held that a health authority breached Saskatchewan’s Health Information Protection Act by failing to respond to an incident in a timely manner.

The Commissioner’s report does describe a dilatory response – with a discovery of “snooping” in mid-October 2015, an investigation that led to a paid suspension at the end of January 2016, notification to the Commissioner at the end of February 2016, notification to the Commissioner towards the end of March that the breach was bigger than first reported and eventual notification to affected individuals in July 2016.
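
Laid out as elapsed time, the delay is stark. Here is a quick Python sketch; the specific dates are my approximations of the report’s “mid” and “end of” references, not findings:

```python
from datetime import date

# Approximate milestones from the Commissioner's report (my reading of
# "mid-October", "end of January", etc. – approximations only).
milestones = [
    ("snooping discovered", date(2015, 10, 15)),
    ("paid suspension", date(2016, 1, 31)),
    ("Commissioner notified", date(2016, 2, 29)),
    ("breach scope revised upward", date(2016, 3, 25)),
    ("affected individuals notified", date(2016, 7, 15)),
]

start = milestones[0][1]
for event, when in milestones:
    print(f"day {(when - start).days:>3}: {event}")
# Notification to individuals lands roughly nine months after discovery.
```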

Think rather than react, and you can even pause momentarily to gain confidence in a next critical step, but always keep the ball moving.

Investigation Report 030-2016 (17 August 2016, Sask OIPC).  

USB key treated as a private receptacle by labour tribunal – but why?

On March 29th the Grievance Settlement Board (Ontario) held that a government employer did not breach its collective agreement or the Charter by examining a USB key that it found in the workplace.

The key belonged to an employee who used it to store over 1,000 files, some of which were work-related and allegedly confidential and sensitive. Remarkably, the employee also stored sensitive personal information on the key, including passport applications for his two children and a list of his login credentials and passwords. The key was not password protected and not marked in any way that would identify it as belonging to the employee.

The employee lost the key in the workplace. The employer found it. An HR employee inserted the key in her computer to read its contents. She identified the key as possibly belonging to the employee. She gave the key to the employee’s manager, who inserted it in his computer on several occasions. The manager identified that the key contained confidential and sensitive information belonging to the employer. The manager then ordered a forensic investigation. The investigation led to the discovery of a draft of an e-mail that disparaged the manager and had earlier been distributed from an anonymous e-mail account.

The GSB held that the employee had a reasonable expectation of privacy – though one so limited that it was not as “pronounced” as the expectation recognized in R v Cole. The GSB also held, however, that the employer acted with lawful authority and reasonably. The reasonableness analysis contains some helpful statements for employers, most notably the following statement on the examination of “mixed-use receptacles” (my words):

The Association argues that the search conducted by Mr. Tee was “speculative” and constituted “rummaging around” on the USB key. It asserts that if Mr. Tee had been interested in finding files which might contain government data, he would have or should have searched directories which appeared to be work related, such as EPS, TPAS or CR. I do not find this a persuasive argument. As noted in R. v. Vu, in discussing whether search warrants issued in relation to computers should set out detailed conditions under which the search might be carried out, such an approach does not reflect the reality of computers: see paras. 57 and 58. Given the ease with which files can be misfiled or hidden on a computer, it is difficult to predict where a file relevant to an inquiry will be found. It may be filed within a directory bearing a related name, but if the intention is in fact to hide the file it is unlikely that it will be. Further, the type of file, as identified by the filename extension, is not a guarantee of contents. A photograph, for example can be embedded in a Word document. Provided that the Employer had reasonable cause to view the contents of the USB key in the first place (as I have found there was in this case), an employee who uses the same key for both personal and work related purposes creates and thereby assumes the risk that some of their personal documents may be viewed in the course of an otherwise legitimate search by the employer for work related files or documents.

I learned about this case shortly before it was decided and remarked that it was quite bizarre. I couldn’t fathom why anyone would be so utterly irresponsible as to store such sensitive information on a USB key. This is one reason why I’m critical of this decision, which treats this employee’s careless information handling practice as something worthy of protection. The other reason I’m critical of this decision is that it suggests the expectation of privacy recognized in Cole is higher than contemplated by the Supreme Court of Canada – which remarked that Richard Cole’s expectation of privacy was not “entirely eliminated” by the operational realities of the workplace. Not all of our dealings with information demand privacy protection, and in my view we need to make the reasonable expectation of privacy threshold a real, meaningful threshold so management can exercise its rights without unwarranted scrutiny and litigation.

I should also say that it’s very bad to stick USB keys found lying around (even in the workplace) into work computers (or home computers), at least without being very careful about the malware risk. That’s another reason why USB keys are evil.

Association of Management, Administrative and Professional Crown Employees of Ontario (Bhattacharya) v Ontario (Government and Consumer Services), 2016 CanLII 17002 (ON GSB).