A lot has been written about Stuxnet; one of the big revelations was that the malware had jumped an air-gap. The on-going debate is whether air-gaps work, or whether joining the networks in a controlled way would actually REDUCE the vulnerability.
Stuxnet is a highly sophisticated computer worm that targets SCADA systems. The worm initially spreads via infected removable drives such as USB flash drives, then uses other techniques to infect and update computers inside private networks that are not directly connected to the Internet. Once inside the private network, it deploys a highly specialized malware payload against the SCADA systems that control and monitor specific industrial processes.
Security is achieved when people, product and process all operate together to create an effective mitigation. In the Stuxnet case, the process of using a USB stick to transfer data is what enabled the attack.
In a background briefing document, Air-Gaps, Firewalls and Data Diodes in Industrial Control Systems, Nexor describe how Data Diode technology can enable business data flows in a controlled way, reducing the need for data transfers via USB sticks. This alone would not have prevented Stuxnet: as the briefing document explains, data guard technology is also vital, to ensure the data flowing over the connection conforms to the schema of the expected business process.
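To make the data guard idea concrete, here is a minimal sketch (not Nexor's implementation, and not taken from the briefing document) of validating a record against the schema of the expected business process before releasing it across a one-way link. The field names, schema and UDP transport are illustrative assumptions only.

```python
# Sketch of a "data guard" check in front of a one-way (data diode) link.
# Field names, schema and destination address are hypothetical examples.
import json
import socket

# Hypothetical schema for a plant telemetry record: field name -> expected type.
EXPECTED_SCHEMA = {"sensor_id": str, "timestamp": str, "value": float}

def conforms_to_schema(record: dict) -> bool:
    """Accept only records with exactly the expected fields and types."""
    if set(record) != set(EXPECTED_SCHEMA):
        return False
    return all(isinstance(record[name], t) for name, t in EXPECTED_SCHEMA.items())

def guard_and_forward(raw: bytes, destination=("192.0.2.10", 9000)) -> bool:
    """Parse, validate, then push the record over the unidirectional link.

    UDP is used here purely because it needs no return path, matching the
    one-way nature of a diode; real deployments sit behind the diode hardware.
    """
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        return False  # not even well-formed: drop it
    if not conforms_to_schema(record):
        return False  # well-formed but not the expected business data: drop it
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(json.dumps(record).encode(), destination)
    return True
```

The point of the design is that anything which is not the expected business data simply never crosses the boundary, removing the "copy it over on a stick" workaround.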
One of the standard rules of thumb in security has long been: “if your threat actor has physical access to the systems, it’s game over”. Initiatives around disk encryption, Trusted Platform Modules and the like have tried to address this in their various ways.
With Stuxnet, my understanding is that a bit of social engineering was used to get a USB stick into physical contact with the SCADA control systems; as the duped stick-carrier was thereby acting as the threat actor, at first inspection it’s “game over”.
Things are more interesting, though. Stuxnet carried a bunch of 0-day attacks, so it was designed to pass cleanly through the kind of sheep-dip process that should be used when bringing untrusted media into an environment such as the one it was targeting. Whether it was sheep-dipped is not something I know. If it was sheep-dipped and still got through, it would also have got through a diode / proxy appliance running the same filtering as the sheep-dip machine. If, however, the social engineering managed to persuade the duped stick-carrier to *not* sheep-dip the stick, that’s the kind of human breach of policy that a permanently-connected diode is designed to prevent. Also, there’s the likelihood that a permanently-connected diode would enable a “no removable media, no arguments” policy to be implemented on site.
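For readers unfamiliar with sheep-dipping, a rough sketch of the kind of allow-list filtering a sheep-dip machine (or a diode / proxy appliance running the same checks) might apply to files arriving on removable media is below. The paths, extensions and hash list are illustrative assumptions; real sheep-dip setups also run AV engines, which is precisely the kind of check Stuxnet’s 0-days were built to slip past.

```python
# Sketch of a sheep-dip style file check. All values here are placeholders.
import hashlib
from pathlib import Path

ALLOWED_EXTENSIONS = {".csv", ".txt", ".pdf"}          # only formats the process expects
KNOWN_BAD_HASHES = {"e4d909c290d0fb1ca068ffaddf22cbd0"}  # placeholder signature list

def file_md5(path: Path) -> str:
    """Hash the file so it can be compared against known-bad signatures."""
    return hashlib.md5(path.read_bytes()).hexdigest()

def sheep_dip(mount_point: str) -> list:
    """Return files that pass the checks; everything else would be quarantined."""
    accepted = []
    for path in Path(mount_point).rglob("*"):
        if not path.is_file():
            continue
        if path.suffix.lower() not in ALLOWED_EXTENSIONS:
            continue  # unexpected file type: reject
        if file_md5(path) in KNOWN_BAD_HASHES:
            continue  # matches a known-bad signature: reject
        accepted.append(path)
    return accepted

# Example usage: clean_files = sheep_dip("/media/usb0")
```

The weakness, of course, is that a filter like this only stops what it already knows about or does not expect; a worm engineered around the expected checks passes straight through, which is why the policy question matters as much as the filtering.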
So, the argument isn’t completely cut and dried (and knowing the security community, will therefore run and run), but my view is that a diode backed by a simpler policy (which the diode enables) would make life harder for an attacker – which, ultimately, is the point.