Oct 24, 2016

A Vulnerability Management Primer – Part 2 : Challenges

by admin


In Part 1 of this blog series, we talked about why Vulnerability Management should be an integral part of all InfoSec programs and tried to define the scope for this discussion. In Part 2 of the Vulnerability Management Primer blog series we are going to talk about the common challenges that prevent organizations from being effective in achieving their vulnerability management goals.

Data Overload

Most organizations manage, monitor and scan tens of thousands of assets. Often these assets are owned and maintained by different teams, each with its own set of tools and processes. With the exponential growth of non-traditional enterprise boundaries – cloud infrastructure, other virtual environments, mobile devices, IoT – existing lines of ownership and responsibility are constantly being blurred and redrawn. The vulnerability monitoring industry has done a great job of keeping up and extending coverage to accommodate the new normal, but to ensure coverage InfoSec teams invariably end up using more than one analysis method and tool. The reasons for doing so may be strategic – passive and active network scanning afford different opportunities for monitoring – or a practical necessity – SAST and DAST tools may overlap in results, but they serve different purposes. Whatever the reasons, it is reasonable to expect that in such highly diverse environments the vulnerability data you have to process and analyze is going to be increasingly heterogeneous. The better you do at ensuring complete monitoring coverage for your infrastructure, the more prepared you must be to parse, analyze and make decisions at the speed at which vulnerabilities are being reported.
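One common way to cope with heterogeneous scan data is to normalize every tool's output into a shared schema before analysis. The sketch below illustrates the idea; the field names, adapter functions, and the 1-to-5 SAST rating scale are all hypothetical, not the format of any particular scanner.

```python
from dataclasses import dataclass

# A minimal common schema for vulnerability findings; fields are illustrative.
@dataclass(frozen=True)
class Finding:
    asset: str
    cve: str
    severity: float  # normalized to a CVSS-style 0-10 scale

# Hypothetical per-tool adapters: each scanner reports in its own shape.
def from_network_scanner(raw: dict) -> Finding:
    return Finding(asset=raw["host"], cve=raw["cve_id"], severity=float(raw["cvss"]))

def from_sast_tool(raw: dict) -> Finding:
    # Assume this tool rates issues 1-5; map onto the shared 0-10 scale.
    return Finding(asset=raw["component"], cve=raw.get("cve", "N/A"),
                   severity=raw["rating"] * 2.0)

findings = [
    from_network_scanner({"host": "box-1", "cve_id": "CVE-2014-0160", "cvss": 7.5}),
    from_sast_tool({"component": "trading-app", "cve": "CVE-2014-6271", "rating": 5}),
]

# Deduplicate overlapping results from different tools by (asset, CVE).
unique = {(f.asset, f.cve): f for f in findings}
```

Once findings share one schema, overlapping results from different tools collapse into a single record, which is a prerequisite for any consistent prioritization downstream.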

Changing Attack Landscape

The problem of analyzing very large volumes of highly diverse data is compounded by the fact that there is a lot of invaluable metadata associated with these vulnerabilities that is constantly changing.

If a new toolkit is released that exploits a particular vulnerability to create a successful breach, should that have an impact on your priority for remediating that particular vulnerability?

The answer to this question for most organizations would be YES; however, very few programs factor this information into their vulnerability prioritization models. While vulnerability scanners are constantly improving to provide more threat metadata, this pales in comparison to the intelligence available from other, more dedicated sources.

Zero-day vulnerabilities provide another stark example of this. A zero-day vulnerability refers to a hole in software that is unknown to the vendor. These provide ripe opportunities for hackers to effect compromises before vendors, scanners and the host organization catch on. In 2014, a record 24 zero-day vulnerabilities were reported. The fallout from the most significant of these – POODLE, Shellshock, Heartbleed – highlighted some dire facts:

These vulnerabilities were successfully exploited within hours of disclosure, which tells us that potential attackers are well equipped and highly coordinated.

The nature of these vulnerabilities (Shellshock exploited a design flaw that went unnoticed for 25 years) tells us that attackers are evolving their methods and tactics. They are investing in vulnerability research to target areas and products that might not be a focus for traditional vulnerability management programs.

The top 5 zero-day vulnerabilities of 2014, combined, left organizations exposed for a total of 295 days, which tells us that vendors are not responding at the speed required to deal with emerging threats.

Lack of Business Context

Security intelligence systems provide real-time, real-world context for vulnerabilities. Just as this external context is invaluable for remediation prioritization, an understanding of how a vulnerability impacts your particular organization can provide tangible benefits to remediation effectiveness. In an ideal world, InfoSec teams would have the resources necessary to fix all critical vulnerabilities. In reality, this is hardly ever the case. InfoSec teams have to perform a balancing act between fixing critical vulnerabilities across the entire infrastructure and keeping critical infrastructure as vulnerability-free as possible.

If you are an investment bank, is a Severity 5 vulnerability on Box 1 (delivering your cafeteria menus) the same as a Severity 5 vulnerability on Box 2 (running your trading applications)? What about a Severity 4 vulnerability on Box 2?

The answer may depend on whether these devices are on segregated network segments or are interconnected. But regardless of the actual configuration, we need this information to make an informed decision. Business context provides a powerful dimension, not only for remediation prioritization but also for demonstrating that remediation efforts are being undertaken with a clear understanding of cost and impact.
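The investment-bank scenario above can be sketched as a simple weighting: scale each finding's raw severity by the business criticality of the asset it sits on. The weights, asset names, and scales below are illustrative assumptions, not a standard model.

```python
# Hypothetical business-criticality weights per asset, on a 0-1 scale.
ASSET_CRITICALITY = {
    "box-1-cafeteria": 0.2,  # delivers cafeteria menus
    "box-2-trading": 1.0,    # runs trading applications
}

def effective_priority(severity: int, asset: str) -> float:
    """Scale a raw severity rating (1-5) by the asset's business criticality.

    Unknown assets get a middle-of-the-road default weight of 0.5.
    """
    return severity * ASSET_CRITICALITY.get(asset, 0.5)

# A Severity 4 on the trading box outranks a Severity 5 on the cafeteria box.
print(effective_priority(5, "box-1-cafeteria"))  # → 1.0
print(effective_priority(4, "box-2-trading"))    # → 4.0
```

Even this crude weighting inverts the ordering that severity alone would produce, which is the whole point of adding business context to the model.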

Unintelligent Prioritization

Threat intelligence and business context can significantly improve remediation efforts by highlighting the vulnerabilities that are most at risk of being exploited, as well as those that can have the biggest impact on the organization. In spite of this, most organizations rely on basic criteria like vulnerability severity or CVSS score as the primary dimension for vulnerability prioritization. This is often due to the effort required to aggregate, consolidate and collate data from multiple sources into a prioritization model that takes all of these factors into account. Manually consolidating and analyzing threat intel and business context is an option, but one that is hardly repeatable or consistent. Despite the effort and cost involved, organizations that find the right tools or resources to build a prioritization model that factors in not only default classifications like CVSS but also threat intelligence and business context can expect a significant improvement in program effectiveness.
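A minimal sketch of such a model, assuming illustrative multipliers (the specific weights and flag names are hypothetical, not a standard formula): start from the CVSS base score, boost it when threat intel shows an active exploit kit or a zero-day, and scale by business criticality.

```python
def risk_score(cvss: float, exploit_available: bool, zero_day: bool,
               criticality: float) -> float:
    """Blend base severity with threat intel and business context.

    cvss: CVSS base score (0-10); criticality: business weight (0-1).
    The 1.5x and 1.3x multipliers are illustrative assumptions.
    """
    score = cvss
    if exploit_available:
        score *= 1.5   # an available exploit kit raises urgency
    if zero_day:
        score *= 1.3   # no vendor patch exists yet
    return round(score * criticality, 2)

# A lower-CVSS flaw with an active exploit on a critical asset outranks
# a higher-CVSS flaw on a low-value asset.
vulns = [
    ("CVE-A", risk_score(9.0, exploit_available=False, zero_day=False, criticality=0.2)),
    ("CVE-B", risk_score(6.5, exploit_available=True, zero_day=False, criticality=1.0)),
]
vulns.sort(key=lambda v: v[1], reverse=True)
```

The value of even a toy model like this is repeatability: the same inputs always produce the same ranking, which manual triage cannot guarantee.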

Manual Processes

Most vulnerability management programs employ manual processes in several stages of the remediation cycle. Analyzing vulnerabilities, collating threat intel, creating tickets, assigning ownership, setting SLAs – are often done manually. In addition to being inefficient, manual processes are also detrimental to the crucial goals of maintaining consistency and developing repeatable processes.

Gap to Remediation

A survey of vulnerability management programs reported that it often takes hundreds of days for vulnerabilities to be targeted for remediation. This statistic does not reflect the actual time it takes to remediate a vulnerability, only the time before a vulnerability is assigned to a ticket/owner for remediation. This is surely a consequence of the challenges discussed above, but it also represents a major challenge in its own right. To keep the organization safe, InfoSec teams must be able to react quickly and decisively to emerging threats. They must address all of the challenges above so that they can identify critical vulnerabilities quickly and accurately and target them for remediation. This means streamlining all the processes involved in the identification, contextualization, analysis, and remediation of vulnerabilities.

In the next post in the series we are going to discuss the crucial components that all vulnerability management programs must have in place to address these challenges effectively.

Read more about Brinqa Threat and Vulnerability Management here.
