Risk-based Vulnerability Management: A Bigger Bang for Your Buck
Leon Ward

Every five to ten years, major technology shifts change the way that vulnerability assessment and the related IT risk mitigation processes are approached or implemented. What has remained constant is the formula we use to measure risk and thus prioritize and triage vulnerabilities:
Risk = (Likelihood of event) * (Impact of consequences)
It’s an approach that intuitively makes sense, but there have been two challenges with how this formula has been applied.
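To make the arithmetic concrete, here is a minimal sketch in Python; the scales and values are assumptions chosen for illustration, not part of any standard:

```python
# Minimal sketch of the risk formula: Risk = Likelihood * Impact.
# Assumed scales for illustration: likelihood in [0, 1], impact on a
# 0-10 severity-style scale.

def risk_score(likelihood: float, impact: float) -> float:
    """Combine the likelihood of an event with the impact of its consequences."""
    return likelihood * impact

# Two vulnerabilities with identical impact diverge sharply once
# likelihood is factored in.
print(f"{risk_score(likelihood=0.90, impact=7.0):.2f}")  # 6.30 -- likely and damaging
print(f"{risk_score(likelihood=0.05, impact=7.0):.2f}")  # 0.35 -- damaging but unlikely
```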
- Overemphasis on consequences. Most companies and vendors focus far more on the consequences and severity of a vulnerability than on the likelihood that they will actually be impacted by it. Both are important, but if you focus too heavily on severity and consequence, you may not see the complete picture. CVSS scores, for example, focus mainly on severity, with global values for likelihood that are assumed valid for all organizations. This is a mistaken assumption. Yes, a vulnerability may be critical and of the highest severity, but it is more or less relevant depending on the threats that target your organization.
This is where custom likelihood comes in: understanding the likelihood specific to your own organization is critical for prioritization and triage.
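A hypothetical sketch of the difference, assuming a simple 0-1 custom likelihood applied to a CVSS-style base score (the vulnerability names and values are invented for illustration):

```python
# Hypothetical sketch: the same severity scores re-ranked with an
# organization-specific likelihood instead of an assumed global one.
# All identifiers and values below are invented for illustration.

vulns = {
    # name: (CVSS-style base severity 0-10, custom likelihood 0-1)
    "vuln-A": (9.8, 0.10),  # critical on paper, but no relevant threat actor
    "vuln-B": (7.5, 0.85),  # lower severity, actively exploited in your sector
}

ranked = sorted(vulns.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (severity, likelihood) in ranked:
    print(f"{name}: priority {severity * likelihood:.2f}")
# vuln-B (6.38) now outranks vuln-A (0.98), reversing a severity-only ordering.
```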
Historically, the largest input to the likelihood component in the risk equation has been internal:
- Is the vulnerable asset accessible to unauthenticated users?
- Is the asset directly reachable from the public internet?
- How many of the vulnerable assets are there?
The last point, the vulnerable asset count, has always been a key component of an internal vulnerability priority score, since it quantifiably affects the exposure a vulnerability creates. If 1,000 endpoints present the same vulnerability, the risk is far more likely to manifest than if only one device presents it.
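One way these internal factors might roll up into a likelihood component is sketched below; the weights and the logarithmic scaling of asset count are assumptions of this illustration, not an established scoring model:

```python
# Illustrative sketch: deriving an internal likelihood component from the
# exposure factors above. The weights and the log-based scaling of asset
# count are assumptions chosen for this example, not an established model.
import math

def internal_likelihood(unauthenticated_access: bool,
                        internet_facing: bool,
                        vulnerable_asset_count: int) -> float:
    score = 0.0
    if unauthenticated_access:
        score += 0.3
    if internet_facing:
        score += 0.3
    # Asset count contributes with diminishing returns: one asset adds
    # nothing extra, 1,000 or more add the full 0.4 allocated to this factor.
    score += 0.4 * min(1.0, math.log10(max(vulnerable_asset_count, 1)) / 3)
    return score

print(internal_likelihood(True, True, 1000))  # 1.0 -- maximal internal exposure
print(internal_likelihood(True, True, 1))     # 0.6 -- same flaw on a single asset
```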
- Internal-only view of likelihood. The other weakness with how this formula has been implemented is that there is far more to likelihood than just what you know about your own infrastructure. You also need to look outside your own organization.
Today, organizations have a new wealth of internal and external data available to make more data-informed, or data-driven, choices about which actions to take and which threats to respond to. While exposure is an important input into the risk equation, it only becomes truly relevant once you understand your threat landscape and the handful of adversaries you are actually up against. For example:
- Who are the threat actors that target your company, industry, geographical region, or that of your customers or partners? This is your threat landscape and the context you need to drill down into the vulnerabilities themselves.
- What is the cost for adversaries to develop exploitation tools for the vulnerability, or is it already available in off-the-shelf attack tool sets such as Cobalt Strike, Metasploit, Core Impact, Canvas and others? This is one of the largest factors in the likelihood of the vulnerability being exploited at scale.
- Does exploitation of the vulnerability result in a situation that fits into the threat actor’s tactics, techniques and procedures (TTPs) sweet spot, meaning it’s easy for them to execute on their objective?
- Have the threat actors in your threat landscape been observed leveraging the vulnerability?
These are elements the enterprise has absolutely no control over, but it can gain visibility into them in order to get ahead of the response process, or to halt current mitigation efforts and pivot to other issues that are more likely to impact you.
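Continuing the sketch from above, these external signals could scale an internally derived likelihood up or down; the signal names and multipliers are assumptions for illustration:

```python
# Illustrative sketch: adjusting an internally derived likelihood with the
# external threat-landscape signals described above. The signal names and
# multipliers are assumptions for this example.

def adjusted_likelihood(internal: float,
                        in_offtheshelf_toolkit: bool,
                        fits_actor_ttps: bool,
                        used_by_relevant_actors: bool) -> float:
    external = 0.2  # baseline when nothing in your landscape points at this flaw
    if in_offtheshelf_toolkit:
        external += 0.3  # cheap to exploit at scale
    if fits_actor_ttps:
        external += 0.2  # aligns with how your adversaries already operate
    if used_by_relevant_actors:
        external += 0.3  # direct evidence from your threat landscape
    return internal * external

# Identical internal exposure, very different threat landscapes.
print(f"{adjusted_likelihood(0.9, True, True, True):.2f}")     # 0.90 -- respond now
print(f"{adjusted_likelihood(0.9, False, False, False):.2f}")  # 0.18 -- deprioritize
```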
Melding internal and external threat data to enhance prioritization and triage
ThreatQuotient’s security operations platform, ThreatQ, helps organizations take a data-driven approach to vulnerability management and prioritization. ThreatQ uses its DataLinq Engine to fuse together disparate data sources and tools, helping teams prioritize, automate and collaborate on security incidents; enabling more focused decision making; and maximizing limited resources by integrating existing processes and technologies into a unified workspace. The result is reduced noise, clear priority threats and the ability to automate processes with high-fidelity data.
To learn more about how to implement a data-driven approach to risk-based vulnerability management, the tools and technologies ThreatQ integrates with to inform vulnerability management, and real-world customer use cases, download our new whitepaper today.