When it comes to IT security, how do you separate marketing hype from reality? One approach is third-party testing of IT security solutions—an efficient, neutral way to validate vendor claims about effectiveness and performance. But it isn’t enough to look at a point-in-time result. What really counts is consistency year after year. Here’s why.
Buying security technology isn’t like buying a car or a bottle of wine, where the product is static. As long as you made your purchase in a year when the car received top rankings from Motor Magazine for its performance, or the bottle of wine was lauded by Gourmet Traveller Wine, you can feel confident in your choice.
Security solutions, however, aren’t frozen in time. To remain effective in today’s dynamic threat and IT environments, they need to evolve. Each time a solution evolves, you are effectively dealing with a “new” solution, and IT security teams need to be able to count on it just as they could at the time of the initial purchase. If a solution has a spotty track record on security effectiveness and performance, the potential ramifications for your buying decision and day-to-day security management must be carefully considered.
Consistency of security effectiveness and performance comes into play in two important aspects of any security solution—maintenance and upgrades.
With respect to maintenance, vendors of technologies such as anti-virus (AV), intrusion detection and prevention systems (IDS/IPS), firewalls, log management and security information and event management (SIEM) frequently issue updates to monitor and protect against the latest threats.
Vendors who don’t demonstrate security effectiveness year after year have a greater likelihood of providing inconsistent levels of protection from one update to the next. Whatever the reason—inconsistent investment in resources, a faltering commitment to quality, organisational changes, or complacency—the result is the same: missing potential new threats despite these updates. From a security management point of view, IT security teams need to remain even more vigilant to the possibility of breaches with vendors that cannot demonstrate a history of security effectiveness.
As networks expand and new applications, content and devices drive higher performance requirements, security teams considering upgrades must understand whether hardware performs “as advertised.” When making buying decisions and planning for future growth, it is important to know whether actual throughput levels are on par with claimed performance and whether hardware performance scales linearly across a product line. For example, if a 10Gbps appliance performs at 17Gbps when tested, does a 20Gbps model perform at 34Gbps? Reliable performance metrics are critical for planning.
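The linearity check described above can be sketched in a few lines. The figures and model names below are hypothetical illustrations (taken from the example in the text, not from any vendor’s test data): scaling is linear when the tested-to-rated throughput ratio is roughly the same for every model in the line.

```python
# Hypothetical test results: rated vs. tested throughput in Gbps.
# Names and numbers are illustrative only, echoing the example above.
RATED_VS_TESTED_GBPS = {
    "model-10G": (10, 17),   # rated 10Gbps, tested at 17Gbps
    "model-20G": (20, 34),   # rated 20Gbps, tested at 34Gbps
}

def scaling_ratios(results):
    """Return the tested/rated throughput ratio for each model."""
    return {name: tested / rated for name, (rated, tested) in results.items()}

ratios = scaling_ratios(RATED_VS_TESTED_GBPS)

# Linear scaling means every model shows (approximately) the same ratio.
is_linear = max(ratios.values()) - min(ratios.values()) < 0.05
```

With these illustrative numbers both models test at 1.7× their rated throughput, so scaling is linear; a planner could then extrapolate a larger model’s real-world capacity from its rated figure.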
Traditionally, organisations have had to sacrifice performance for effectiveness, but with today’s advanced technologies and engineering this is no longer true. IT security teams need to identify vendors they can trust to continue to develop new capabilities while maintaining security effectiveness and performance.
It’s not enough to look at the latest third-party test results. Understanding historical performance and comparing track records is essential to selecting a vendor you can count on to provide consistent protection now and in the future.
Chris Wood is the Australian and New Zealand regional director for Sourcefire.