Claims of an imminent 'Cybergeddon' are usually best consumed with a sizeable sack of salty cynicism. After all the information security industry is notorious for dodgy statistics and inflated claims. Scare stories sell services but they also lead to complacency. So just how alarmed do we actually need to be?
Well, the latest evidence would seem to suggest that things are getting worse with a consistent and rapid rise in attacks in 2012. It isn't just the number of attacks that's concerning but the fact that the focus of the attackers is also shifting.
"One of the largest companies in the United States told me that in April they were having about a thousand attacks in the first quarter, then five thousand in the second quarter. These are major attacks, not port scans [which merely probe for vulnerabilities]," Alan Paller told security professionals in Sydney last week.
Paller is founder and director of research for the SANS Institute, which runs the Internet Storm Centre and provides training and certification for security professionals. He's connected.
According to Paller, the bad guys have started playing rough in the last two months. It's no longer just espionage and noisy but easy-to-counter distributed denial of service attacks (DDoS), Paller says. Attacks are now causing physical damage.
Mid-August saw what is arguably the most damaging attack ever. Oil company Saudi Aramco had 30,000 computers infected and wiped. With their master boot record destroyed, every machine needed on-site attention and a complete rebuild.
"That's the same kind of problem you'd have if you hit it with a bomb. Not literally, but close enough in terms of the amount of rebuilding you have to do," Paller says.
Now add last week's alert from the US Department of Homeland Security (DHS) into the mix, and perhaps Paller has a point.
Hacktivists are getting interested in industrial control systems (ICS), the gadgets that run everything from hotel air conditioners through chocolate factories to nuclear power stations. We've known for years that ICS are vulnerable, but confirmed attacks remain rare: the widely-reported Stuxnet attack against Iran's nuclear program and, back in 2000, the Maroochy Shire raw sewage dump.
But as the DHS points out, specialist search engines are regularly lighting up the path to ICS networks, still foolishly left exposed to the internet. A newly-discovered vulnerability affecting devices from 261 different manufacturers could allow hackers to seize control of whatever they find.
"Control systems engineers are very good at keeping power systems on [but] the security aspect of it gets lost because the code was not written with that in mind," according to Canadian ICS security consultant Dave Lewis.
"I'm hoping that over time that is actually going to be corrected," he says.
But the correction could take a while to arrive given that ICS service life is measured in decades.
"The problem is pervasive throughout the industry," fellow Canadian security expert James Arlen says -- though both experts are quick to play down the risk of immediate carnage.
Stick to the basics
It seems like every security professional can point to examples of the rising danger. But when it comes to dealing with it, the industry suffers a conceptual disconnect.
Vendors are selling an ever-bigger big data approach. Log everything and pay for their secret-sauce analytics so they can dig back and tell you that, yes, your network was first penetrated on 18 December 2011 at precisely 4.28pm. But practitioners tell us it's more important to concentrate on the basics: steady improvement through "continuous monitoring" (CM) of risk rather than infrequent security audits, and "measured risk reduction".
CM is now mandated for all US federal networks following a wildly successful implementation by the State Department and Paller believes other nations will soon follow. It'll be a "huge shift" in the way information security is done and could potentially transform the industry.
Monitoring risk all day, every day
Implementing CM can be surprisingly straightforward. The trick is to use automated tools to measure daily how many computers in the organisation are vulnerable and in what ways, and report that information back to every systems administrator using a metric that places vulnerabilities on a common scale.
System administrators typically have just 20 minutes per day to spend on security, according to Paller. So identify one or two high-return security fixes that can be done in that timeframe, every day, and send the sysadmins specific how-to instructions.
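The daily routine Paller describes — scan, score on a common scale, and hand each sysadmin a short, ranked to-do list — can be sketched in a few lines. The scan data, admin names and fix labels below are invented for illustration; real deployments would feed in scanner output scored on a common scale such as CVSS.

```python
from collections import defaultdict

def daily_report(findings, top_n=2):
    """Group scanner findings by responsible admin, score each candidate
    fix by the total risk it removes, and return the top_n fixes per
    admin -- a day's worth of high-return work."""
    per_admin = defaultdict(lambda: defaultdict(float))
    for f in findings:
        # Risk a fix removes = sum of the scores of the findings it resolves.
        per_admin[f["admin"]][f["fix"]] += f["score"]
    report = {}
    for admin, fixes in per_admin.items():
        ranked = sorted(fixes.items(), key=lambda kv: kv[1], reverse=True)
        report[admin] = [fix for fix, _ in ranked[:top_n]]
    return report

# Hypothetical findings from today's automated scan (0-10 severity scale).
findings = [
    {"host": "web01", "admin": "alice", "fix": "patch-ie",            "score": 9.3},
    {"host": "web02", "admin": "alice", "fix": "patch-ie",            "score": 9.3},
    {"host": "db01",  "admin": "alice", "fix": "patch-java",          "score": 6.8},
    {"host": "app01", "admin": "bob",   "fix": "remove-admin-rights", "score": 7.2},
]
print(daily_report(findings))
```

The key design point is that the same metric is computed the same way every day, so an administrator's progress — and backsliding — is visible immediately rather than at the next annual audit.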
Using this approach, the US State Department managed to patch 90 per cent of its machines for a certain Internet Explorer vulnerability in just 11 days. By comparison, the US Department of Defense's traditional approach had patched only 65 per cent of their machines after four months.
As for knowing what to do each day, Australia's Defence Signals Directorate (DSD) has the answer: their Top 35 Mitigation Strategies.
Last year DSD's award-winning research showed that implementing just four strategies can block 85 per cent of targeted attacks. Patch applications. Patch operating systems. Limit administrator rights to those who truly need them. And implement application whitelisting -- that is, allow only approved software to run.
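Of the four, application whitelisting is the least familiar, so it's worth seeing the core idea in miniature. The toy below checks a program's cryptographic hash against a list of explicitly approved software and defaults to deny; real products enforce this at the operating-system level, and the sample "programs" here are made up.

```python
import hashlib

# Hashes of vetted, approved software. Everything else is blocked.
APPROVED = set()

def approve(program: bytes) -> None:
    """Add a vetted program's SHA-256 hash to the whitelist."""
    APPROVED.add(hashlib.sha256(program).hexdigest())

def may_run(program: bytes) -> bool:
    """Default-deny: only software whose hash is on the list may run.
    An attacker's new malware fails this check without anyone having
    ever seen it before."""
    return hashlib.sha256(program).hexdigest() in APPROVED

good = b"#!/bin/sh\necho hello"
evil = b"#!/bin/sh\nrm -rf /"
approve(good)
print(may_run(good), may_run(evil))  # True False
```

Note that even a one-byte change to an approved program produces a different hash and is blocked, which is exactly why the approach stops novel malware that signature-based tools have never seen.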
This year, DSD has moved application whitelisting to the top of the list.
According to Paller, application whitelisting has always been and will continue to be a critical measure.
"They were just afraid to put it at the top of the list at first because it scared people a little bit. But it's so much more important than the other ones that it needed to be at the top," Paller says.
DSD's updated strategies now come with the added support of defence minister Stephen Smith, who launched them last week with a poster, promotional video and catchy slogan.
"The evidence to date clearly indicates the 'Catch, Patch, Match' approach is the best way to mitigate against cyber intrusions, protect your most valuable information and enhance the resilience of your networks," Smith said.
Paller agrees. "First do the top four. When you are done doing the top four, evaluate the others," he said.
Go against the flow
Whitelisting is the exact opposite of what anti-virus and other malware-detection products try to do, which is to identify the ever-changing bad software and prevent it from running.
"[Anti-virus vendors] define new product verticals when a different kind of code that I didn't want running on my computer starts running on my computer," says Arlen.
"Is malware different from a virus? Do these things not require the same response? Don't run crap on my computer that I didn't ask to run on my computer... I look at my iPad and think 'You know what? It's kind of awesome that only a limited amount of software can run on it.'"
Allowing only specific known-good software to run and banning everything else? Parcelling out security work to systems administrators in a way that ensures it actually gets done? Five years from now we'll look back and wonder why we ever did it another way.