Continuous Threat Exposure Management and what it means for enterprise security [Q&A]
This summer, Gartner introduced Continuous Threat Exposure Management (CTEM), a set of processes and capabilities that lets organizations review their exposures continuously rather than through the slower, periodic project-based approach.
With endless threats and vulnerabilities hammering today's organizations, exposure management that evaluates the accessibility, exposure and exploitability of all digital and physical assets is necessary to govern and prioritize risk reduction for enterprises.
We spoke with Haggai Polak, CPO of Skybox Security, to find out more about what this new term means for enterprise security, especially as organizations prepare to face new threats in 2023.
BN: Can you tell us more about CTEM and how it differs from traditional vulnerability management?
HP: The name itself suggests a lot. The objective of CTEM is to create an actionable security posture remediation plan that organizations can understand and security teams can implement. The goal of the process is to continuously identify and address threats that are most likely to be exploited rather than attempt to remediate every threat that is identified.
CTEM is an improvement over vulnerability management programs that have been employed in the past, mainly in the continuous aspect. Organizations need to recognize that just running a scan or bringing in an agency to do testing twice a year is not enough. Because new vulnerabilities are introduced all the time, we know that common vulnerability databases grow at the rates of tens of thousands a year, all while an organization's infrastructure is constantly changing. Today's organizations need the ability to do threat assessment and exposure management on a continuous basis. We typically recommend that customers run daily scans to see what changed, both in terms of their infrastructure and its configuration, as well as new threats or exploits that have become available to malicious actors.
Another change I would point to is that we're now using the term exposure instead of just vulnerabilities. This expands the scope of the original vulnerability management programs, which focused solely on vulnerabilities: flaws in a specific piece of software or hardware that somebody could go and exploit. Exposure is much broader than that. A misconfiguration could be an exposure, as could a control gap, for example. So, it's not necessarily that there's a software bug or something is broken. It could be as simple as a human error that allows access to an area of the network that should be protected. That's why CTEM is viewed as a more mature way to manage cyber risk across the enterprise.
BN: Can you take us through what a CTEM program looks like?
HP: When we look at the definition of exposure management programs, there are a few very distinct stages. The first one is discovery. Exposure management starts with understanding in detail what assets an organization has, who owns them, how important they are and more. The next part is detecting the exposures, and that's where it's critical to integrate with various scanners, both in the IT and OT world, to start mapping the exposures that exist on different assets.
Once we're done with this second stage, there is typically a list of tens of thousands of potential exposures in the infrastructure that need to be sorted. Companies that rely on traditional vulnerability scores typically run into the issue of having too many critical vulnerabilities and limited remediation bandwidth, often leading them to focus on the wrong things. That's where risk-based prioritization technologies can help decide what's most critical -- for example by figuring out which of these vulnerabilities are being exploited in the wild, or which assets are exposed.
Then we reach the fourth stage, which is remediation and verification. That's where security teams launch remediation of the exposures, manually or automatically. This is also where they verify that the remediation was indeed effective and measure various metrics around the program, like the ability to meet remediation SLAs.
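The risk-based prioritization step described above can be sketched as a simple scoring function. This is a hypothetical illustration only, not Skybox's actual model: the field names, weights, and sample findings are all assumptions chosen to show why an actively exploited, exposed, business-critical finding can outrank one with a higher raw severity score.

```python
# Hypothetical sketch of risk-based prioritization (stage three of CTEM).
# Field names and weights are illustrative assumptions, not a vendor's model.

def risk_score(exposure):
    """Combine base severity with exploitability and asset context."""
    score = exposure["cvss"]                   # base severity, 0-10
    if exposure["exploited_in_wild"]:          # known active exploitation
        score *= 2.0
    if exposure["asset_exposed"]:              # asset reachable from outside
        score *= 1.5
    return score * exposure["asset_importance"]  # business criticality, 0-1

exposures = [
    {"id": "CVE-A", "cvss": 9.8, "exploited_in_wild": False,
     "asset_exposed": False, "asset_importance": 0.2},
    {"id": "CVE-B", "cvss": 7.5, "exploited_in_wild": True,
     "asset_exposed": True, "asset_importance": 0.9},
]

# Sort worst-first: the lower-CVSS but actively exploited, exposed,
# business-critical finding outranks the nominally "critical" one.
ranked = sorted(exposures, key=risk_score, reverse=True)
```

In this toy example, CVE-B scores 7.5 × 2.0 × 1.5 × 0.9 = 20.25 against CVE-A's 9.8 × 0.2 = 1.96, so a team with limited remediation bandwidth would tackle CVE-B first despite its lower CVSS score.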
BN: What trends are you seeing in the vulnerability management space?
HP: There are a few trends I would point out. Cyber risk quantification is a big one. We are seeing security teams increasingly ask for our help to translate cyber risk into a language the broader business can understand. For example, if there's a big risk on a small number of servers, but those servers are extremely critical to the organization, any downtime on them translates to substantial monetary losses. Our customers want help calculating and presenting that, so they can justify prioritizing remediation and additional security spending on that small group of assets.
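One common way to put cyber risk in monetary terms is annualized loss expectancy (ALE): the expected annual rate of occurrence multiplied by the loss from a single incident. The sketch below uses this standard formula with made-up figures; the probabilities, downtime, and hourly costs are illustrative assumptions, not anything drawn from Skybox's methodology.

```python
# Sketch of cyber risk quantification via annualized loss expectancy:
# ALE = (annual probability of incident) x (single loss expectancy).
# All figures below are made up for illustration.

def annualized_loss(incident_probability_per_year, downtime_hours, cost_per_hour):
    single_loss = downtime_hours * cost_per_hour  # single loss expectancy
    return incident_probability_per_year * single_loss

# A small group of critical servers: unlikely to go down, but very costly.
critical = annualized_loss(0.05, downtime_hours=24, cost_per_hour=50_000)

# A large fleet of low-value workstations: more likely, far cheaper.
commodity = annualized_loss(0.40, downtime_hours=4, cost_per_hour=500)
```

Here the critical servers carry an expected annual loss of $60,000 versus $800 for the commodity fleet, which is the kind of monetary framing that lets a security team justify concentrating remediation spend on a small group of assets.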
The second one would be around automation. We all know about the cybersecurity personnel shortages. That is why many organizations across industries are seeking to automate their vulnerability and configuration management. In some ways, traditional automation solutions have let customers down because they lacked the context to automate safely. But security teams are not giving up; they're still looking for ways to remediate faster, with less human involvement and fewer human errors, and to respond quickly to high-severity threats in the environment.
The last one is the expansion from applying vulnerability management to traditional IT, to performing exposure and threat management on somewhat neglected parts of the infrastructure, like operational technology (OT) equipment. This is where we see security teams asking for vulnerability and threat management on their OT, cloud workloads and, in some cases, remote workers. It's no longer enough to just do vulnerability management on the servers in a data center or the devices in the building.
Image credit: alexskopje/depositphotos.com