Real Ransomware Risk vs. False Sense of Cybersecurity
New survey data shows a troubling disconnect between security professionals’ confidence and their track record in beating ransomware. What gives?
- Overoptimism is not a security professional’s friend.
- Yet new research finds that many may be overconfident in thinking they can beat ransomware.
- Here’s how to tamp down three self-defeating tendencies: hindsight bias, optimism bias and the Dunning-Kruger Effect.
A new survey has found that three-quarters of security professionals believe their company to be very or extremely prepared for a ransomware attack. Even more say they are somewhat or extremely likely to recover in a day or two without paying the ransom. But their own track records belie these responses.
In fact, eight in 10 of the same survey respondents also say they’ve been successfully attacked by ransomware; nearly four in 10 paid the ransom; and about the same number experienced significant downtime. What’s more, the respondents give themselves low scores on a range of security best practices that could prevent future attacks from succeeding. For instance, fewer than half say they have a disaster recovery plan in place.
The numbers all come from Mimecast’s new State of Ransomware Readiness 2021 report, conducted by Hanover Research, and they raise a troubling question: Is there a disconnect between the way security professionals perceive themselves and their actual ability to protect their company?
Security is a field that does not benefit from overoptimism, so it’s important to know why people are so likely to underestimate their own risk exposure. Three cognitive biases help to explain it: hindsight bias, optimism bias and the Dunning-Kruger Effect.
Hindsight Leads Us to Believe We Would Make Better Choices
If we already know the outcome, we are at a significant disadvantage when reading a case study or analyzing an incident after the fact. In a tendency known as hindsight bias, we judge the actions and decisions of those involved relative to the known outcome.
We also judge that outcome as more likely than it appeared to be when the incident actually occurred. In doing so, we’re exhibiting a false sense of security, thinking that we are more capable than we truly are — and concluding that we would not make the same mistakes that “they” did.
We Underestimate the Risk to Ourselves
Another bias that may hinder our ability to realistically assess risk, known as optimism bias, is our tendency to underestimate the likelihood of negative events happening to us and to overestimate the likelihood of positive ones. Hence people who smoke may not worry that they themselves will get lung cancer.
It is perhaps a dark irony that people suffering from mild depression hold more realistic perspectives on their current circumstances and future prospects. This suggests that while there may certainly be competitive business advantages to having an optimistic world view, these may come with the hidden cost of underpreparing for disasters.
The Dunning-Kruger Effect: Believing We Are More Capable Than We Are
Every Thanksgiving my family enjoys watching my out-of-shape uncle yell at the football players on TV, ranting about their mistakes and how he could have done better. An irony of expert performance is that it looks so easy that anyone could do it, but nothing could be further from the truth. This misconception gives rise to a psychological phenomenon known as the Dunning-Kruger Effect.
Dunning-Kruger may also lead us to believe that we are more capable than we truly are of facing threats such as ransomware. But the effect will quickly disappear should we face an actual ransomware attack.
Getting a Realistic Perspective
How do you fight human nature? Here’s a breakdown on each bias and how to tame its impact on your performance:
- Hindsight bias: The best way to mitigate its effects on our risk judgments is to seek out someone who has personally experienced incidents such as ransomware attacks. Discuss their experiences. Try to understand their thought process as events unfolded and they managed the situation — without foreknowledge of its outcome. In other words, resist the urge to focus on the outcome and instead focus on their process. This will help keep hindsight bias from influencing your judgments.
- Optimism bias: Calculate the probability that you will be hit with ransomware this year and understand that this is your base rate probability. Take another look at that top-line statistic from the State of Ransomware Readiness 2021 report cited above: Eight in 10 survey respondents reported being successfully attacked by ransomware. Consider this as your base rate probability of being hit within the next year and adjust your security posture according to these odds.
- Dunning-Kruger Effect: Understand that you may not perform more effectively than others who have faced a ransomware attack. Learn from their mistakes, but also try to understand why they made those mistakes, and how to avoid repeating them.
In addition to understanding how our cognitive biases can sometimes work against us, we should also consider how we think about risk. Risk is often calculated using the formula Risk = Probability × Impact. When considering risk, we often overemphasize the probability variable and underemphasize the impact variable. This can lead us to ignore the potential for existential risk.
Can your organization really survive a multimillion-dollar ransomware attack, right now, today? If not, then ransomware poses an existential risk that needs to be avoided or mitigated, even if the perceived probability is low.
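The risk formula above can be sketched in a few lines of code. This is a minimal illustration, not a full risk model; the dollar figures are hypothetical placeholders, and the 0.8 probability simply reuses the survey's "eight in 10" base rate as suggested earlier.

```python
# A minimal sketch of the Risk = Probability x Impact calculation.
# All dollar figures are hypothetical placeholders, not real loss data.

def expected_risk(probability: float, impact: float) -> float:
    """Expected annual loss: probability of an incident times its impact."""
    return probability * impact

probability = 0.8            # base rate from the survey: 8 in 10 respondents were hit
impact = 5_000_000           # hypothetical cost of a major ransomware incident ($)
survivable_loss = 2_000_000  # hypothetical maximum loss the business can absorb

risk = expected_risk(probability, impact)
print(f"Expected annual risk: ${risk:,.0f}")

# An impact the business cannot absorb is an existential risk and must be
# mitigated even if the perceived probability is low.
if impact > survivable_loss:
    print("Existential risk: mitigate or avoid, regardless of probability.")
```

Note that the existential-risk check deliberately ignores probability: a loss the organization cannot survive must be addressed no matter how unlikely it seems.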
The Bottom Line
Security professionals can benefit from a healthy dose of pessimism. Being vigilant against cognitive biases now can help avoid ransomware and its impact later.
Sources
- “Hindsight Bias,” Perspectives on Psychological Science
- “The optimism bias,” Current Biology
- “Depressive symptoms are associated with unrealistic negative predictions of future life events,” Behaviour Research and Therapy
- “Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments,” Journal of Personality and Social Psychology
- “A philosophy of security architecture design,” Wireless Personal Communications