Should You Have a “No Blame” Culture?
Who should be to blame when something goes wrong? And what if we forgot about blame altogether – would it be a healthier and more practical approach to abandon blame assignment and just work the problem?
By Dave Cartwright, CISSP
6 February 2023
Most of us have heard of people being disciplined – or even fired – for making a cybersecurity error. And it’s an understandable reaction: attacks and data breaches can result in financial and reputational costs along with lost customers and/or revenue, so the reason behind such an event needs to be established and dealt with.
However, there's a problem, and it's illustrated very nicely by a report published in 2022 by email defense vendor Tessian, which noted that 21% of respondents to their survey said they'd lost their job as a result of committing a security faux-pas. A scary number, but it gets worse. The author goes on to note: "Perhaps it's little surprise, then, that the percentage of people who didn't report the incident increased, with 21% of employees saying they didn't tell their IT team about the mistake - up from 16% in 2020".
Cyber security specialists dislike unknown unknowns. If someone tells you they just made a mistake, you can do something about it. But if they don’t, you clearly can’t.
On one hand, you can say to your staff: if you don’t tell us, we’re bound to find out eventually and the consequences will be even worse than if you ’fess up. But this just won’t help, because there are three potential outcomes: (a) they tell you and they get disciplined; (b) they don’t tell you, you find out eventually, and they get disciplined; or (c) they don’t tell you, you don’t find out, and life carries on. The reward for admitting to a failure isn’t exactly attractive, but the outcome of keeping quiet has a chance of them “getting away with it”. Surely, then, there’s something to be said for encouraging people to come forward by telling them that they’re not going to get shot for doing so. After all, it’s the surest way of finding out about an issue promptly and being able to do something about it.
On the face of it, this sounds like an odd approach. Disciplinary policies and procedures exist for a reason, after all, and if someone commits (say) a health-and-safety offence it’s likely that HR will pull out that policy and start wielding it. There’s a catch, though: the cause of the incident seldom lies with one individual.
Yes, there are examples of cyber incidents where the user has been ridiculously negligent, or has even knowingly or deliberately caused the issue. In such cases, by all means dust off the disciplinary policy and do what you have to. Such situations are in the minority, though.
Take our health-and-safety example. What if the incident occurred because someone wasn’t adequately trained? Or because the company hadn’t maintained a piece of equipment sufficiently? Or because it was a genuine accident?
The Cultural Need to Assign Blame
In all walks of life, when something goes wrong there’s generally a desire to assign culpability. We seem to use the word “incident” rather than “accident” because of an implicit belief that if something goes wrong, someone must be to blame and so it can’t be an accident. Now, the definition of an accident is: “an unfortunate incident that happens unexpectedly and unintentionally, typically resulting in damage or injury”. But rather than investigating to find who is to blame, why don’t we look to understand what went wrong?
In most cases, even if someone did make a mistake there was something the company could have done to reduce either the chance of them doing so or the impact of the outcome. For example, a contact center operator who gave away personal data as part of a social engineering attack was able to do so only because the CRM system they were using didn’t enforce rigorous identification of callers before allowing them to access personal data. Or take the sales agent who had been corresponding with a known client for days then suddenly succumbed to a ransomware attack because the client’s PC became infected mid-week and neither party’s anti-virus software spotted the malware in an email attachment. In both cases the individual could perhaps have been a little more diligent, but similarly the company could have had better controls that would have shrunk the risk.
Understanding the reason cyber incidents happen is essential. Investigating issues to establish the root cause is the only way we can possibly make improvements or introduce controls that will reduce the likelihood or impact (or, preferably, both) of a recurrence. And if one or more individuals have clearly behaved unacceptably, then the disciplinary process can be exercised.
However, this latter option should be used sparingly. Cyber incidents can have hugely negative effects on organizations, but we must take a balanced, calm approach when they do happen. Our default position must be to encourage people who make mistakes to come forward and tell us – and to help them avoid doing so in future. Engaging with staff and encouraging a culture in which punishment is the last resort will result in honesty, openness, communication … and the opportunity to deal with incidents as soon as they happen.