Governance, risk and compliance is front of mind for many cybersecurity teams, as aspects of security shift from being a matter of trust to a matter of risk. A panel of experts recently convened to discuss this and other issues relating to quantifying cybersecurity risk.
At the recent ISC2 Spotlight on governance, risk and compliance (GRC), moderator Brandon Dunlap hosted a debate on quantifying risk in today’s GRC setups. Joining Dunlap were Alexander Antukh, CISO at AboitizPower; Omar Khawaja, CISO at Databricks; and John Sapp Jr, CISO of Texas Mutual Insurance Company – all three of whom are board members of the FAIR Institute, which exists to advance risk quantification and management.
Dunlap began by asking Sapp to define just what we mean by cybersecurity risk quantification. The key word was value: according to Sapp we have to “determine and understand what is the value of what's at risk,” should we have a cybersecurity or technology incident. Specifically, Sapp said it is about “being able to put it into financial terms, which is a language that every c-level [executive] in the world understands,” and that we have to be “able to put it into those terms, that common language and a nomenclature that is consistent across the board that everyone can understand”. Dunlap concurred, commenting that in a previous cybersecurity role he held in a large company: “The chief risk officer and I took probably six months to come to terms on a common language and a common way of articulating risk”.
Do You Quantify Risk?
Dunlap dipped into the numbers behind a live poll that had been running for the first few minutes, asking the audience: “Do you have a mature cybersecurity risk quantification program?” Over a third (37%) said no, but that it was on the roadmap. A further 16% claimed to have a mature program and another 21% said that they were mid-implementation. Nearly a quarter (23%) said, however, that they have neither a cybersecurity risk program nor an intention to implement one.
The host turned to Khawaja, asking how we should adapt risk quantification depending on whom we’re talking to. The answer was: by identifying the right use cases and using the right language. If the audience is focused on risk, then we need to examine the relative risk reduction of the various initiatives and controls we have. If the audience is the CFO then, in Khawaja’s words, “doing so in terms of dollars and cents and probabilities is a way more effective mechanism to do it than to do it in terms of threats and vulnerabilities and CVEs and TTPs.” The most relevant audience, Khawaja said, are the core organization teams, because: “the organization understands dollars and cents, that's the only unit of measure that is easily understood by [everyone]”.
Antukh’s view had hints of pessimism. “I like to look at [risk quantification] as measuring our uncertainty about what's going on with our risk, what's going on about our controls,” he said. The evaluation of our risk profile depends on a large number of inputs and variables, which brings a challenge when trying to quantify anything. Are tools being effectively used across the whole organization? What is the finance team telling us, or the legal team, or the supply chain management people? He identified his four key inputs: “What's going on in the world in terms of threat landscape? What is our understanding of our impacts? What is our control effectiveness? How [do] we validate this continuously?”.
A CISO Approaching Risk Quantification
Dunlap turned to Sapp to understand his experience in the insurance industry which, in theory at least, is likely to be competent at quantifying risk. How, then, does a CISO present hard-to-quantify risk to management? “You never know what you're going to wake up to any given morning as a CISO,” he noted, “so how do you make sure that there is a degree of, … rigor, discipline, veracity to the numbers that you're bringing forward?” Sapp’s response was that he tries to avoid references to “likelihood” (a term used frequently in cybersecurity) but instead uses “probability”, because while likelihood is a fairly subjective term “probability is more of a science, because it's data-based”. The numbers he uses, he said, come from the organization itself, so they are facts (and often audited ones) and can be used as the basis of understanding the value of, say, a system being unavailable following an attack.
We can overdo things, Khawaja added. “Sometimes we over-index on how perfect the numbers need to be,” he commented, “and the reality is to do risk quantification fairly well, you actually don't need your numbers to be anywhere near perfect.” He pointed viewers to the concepts of Fermi decomposition and Fermi calibration. These are techniques popularized by physicist Enrico Fermi for addressing complex problems by breaking them down into manageable elements and then validating those pieces with approximate, back-of-the-envelope estimates. Khawaja’s point was that precision is neither possible nor particularly desirable: “There's no one in the organization that's expecting you to predict the future down to plus or minus one percentage point. If you can do it down to plus or minus 20, 25, even 30 percentage points, in most cases and for most businesses, that's probably going to be okay”.
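The panel did not work through any formulas, but a Fermi-style decomposition of cyber loss exposure might look something like the sketch below. Every function name and every number here is a hypothetical illustration, not data from the discussion: the point is simply that coarse range estimates for each factor still yield a usable range overall.

```python
# Illustrative sketch of a Fermi-style decomposition (hypothetical figures).
# Annualized loss exposure is broken into two rough factors:
# event frequency (incidents per year) and impact per incident ($).

def fermi_loss_range(freq_low, freq_high, impact_low, impact_high):
    """Return a (low, high) annualized loss range.

    Each input is a deliberately coarse estimate; in the spirit of
    Fermi decomposition, the output is a range, not a prediction.
    """
    return (freq_low * impact_low, freq_high * impact_high)

# Example: 0.1-0.5 significant incidents/year, $200k-$2M per incident.
low, high = fermi_loss_range(0.1, 0.5, 200_000, 2_000_000)
print(f"Annualized loss exposure: roughly ${low:,.0f} to ${high:,.0f}")
```

A spread this wide would be useless in accounting, but as the panel notes, plus or minus tens of percentage points is often enough to rank initiatives and start a conversation with the business.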
The presenter put a question from the audience to Antukh: “What's the minimum data that I need to gather to get something like this off the ground?” The answer began with an observation that in some cases, if you are at the beginning of your security program, there may not be any data available at all and anyhow, you might not need cybersecurity risk quantification at that stage. “Maybe you just need to … do MFA and backups and maybe some logging and incident response, and be fine with that,” he said. Antukh then went on to echo earlier views about Fermi calibration – starting with some basic assumptions and then adding in elements and constantly revising. “Where else do we go?” he asked. “Maybe I know something about the averages. Maybe I know something about our state of controls. Maybe I know something else from my career or from the advice from our CFO. We bake it in and we revise it continuously”.
A question to Sapp followed, about how we deal with the volume of information we have. “With the quantification efforts, I may have too much information, or I may be having too many variables cast into this equation,” said Dunlap. “How does this methodology narrow that scope to make this approachable and believable?” Once more the answer was to not over-complicate things. “You don't have to be precise, but you want to be accurate, right?” Sapp noted, continuing: “Governance is about visibility, bringing visibility to the probability of the occurrence of these things, but being able to think about it in a way that is more methodical.” Efficiency, effectiveness and consistency were the watchwords, he said. Sapp also pointed out that the approach we take has to be appropriate for our given industry too, though: “If you are publicly traded versus privately held, whether or not you're a government entity, you look at things very differently than maybe a private organization does”.
Frameworks and Advice
Antukh referred to the FAIR cyber risk management framework, using the various techniques available to establish what he termed “return on security investment.” Khawaja squeezed in three points: a recommendation of Doug Hubbard’s book How to Measure Anything; second was building proficiency in data engineering and data analysis in order to collect and use the data available; and finally he urged the audience to use tools like the FAIR framework to “bridge between the world of technology and cybersecurity controls and the business”.
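Antukh did not spell out how he computes return on security investment, but the commonly cited ROSI formula compares the annualized loss a control is expected to avoid against the cost of that control. The sketch below uses that general formula with hypothetical numbers; it is not the panel's own method.

```python
# Hedged sketch of the commonly cited ROSI calculation (hypothetical figures).

def rosi(ale_before, ale_after, control_cost):
    """Return on security investment.

    (loss avoided - cost of the control) / cost of the control,
    where ale_* are annualized loss expectancies in dollars.
    """
    return (ale_before - ale_after - control_cost) / control_cost

# Example: a control cuts expected annual loss from $1M to $300k
# and costs $200k per year to run.
print(f"ROSI: {rosi(1_000_000, 300_000, 200_000):.0%}")  # → 250%
```

Expressing the result as a percentage keeps it in the "dollars and cents" language the panel repeatedly recommends for business audiences.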
Sapp’s final thought was that although the tools and frameworks available to us are of value, we need knowledge to use them properly. “Just remember that a fool with a tool is still a fool,” he said. “A good carpenter never blames their tools. It won't be the tool itself that is the problem – it is the individual and their approach and how they go about it. So, educate yourself on it”.
ISC2 Spotlight – Cloud Security
Cloud adoption continues to accelerate. With this growth comes increasing complexity. Many organizations now operate across multiple cloud providers, each with its own architecture, tools and shared responsibility models. At the same time, new regulations, regional compliance requirements and evolving industry standards are reshaping how cloud security is designed, implemented and maintained. Join leading experts across April 28-29, 2026 for a focused two-day ISC2 Spotlight look at how to scale and secure the cloud in today’s fast-changing landscape.


