This recurring blog covers news about ISA Automation Week: Technology and Solutions Event from the unique viewpoint of the event’s project manager, Carol Schafer. With a technical background to draw on, a penchant for humor and the inside track on conference updates, Carol informs and entertains with messages that are always illuminating and often downright funny.
If you’re threatening the security, safety or long-term viability of my workplace, you’re threatening me. And I don’t like it. I take it personally. And so should every person working in critical infrastructure facilities, because there are literally thousands of people in this world who do nothing all day, every day, but sit in a room figuring out ways to get through firewalls (easy – they do it all the time) and once they’re in, imagination is the only limit to the damage they can do.
We’re all aware of the so-called “cybersecurity threat,” and we hear a lot of casual conversation about it with comments like “Oh, a cyber-attack, yes, that sure would be terrible if it ever happened to some poor besieged company.” None of us doubt that hackers and cyber terrorists are out there working hard. After all, everyone’s heard about Stuxnet, so we all know, at least, that SCADA systems are potential targets. We just don’t believe it will happen to us. We just don’t take it personally.
But the fact is, your company’s firewall (if your facility even has one) is likely being probed while you’re reading this. At the very least, cyber attackers are banging on that firewall repeatedly like a battering ram on a castle door. With most SCADA and controls systems reachable from the Internet, sensitive information and critical processes are essentially vulnerable to having their back doors pried open by anything – or anybody – on the worldwide web.
This isn’t particularly “breaking news.” I still remember my shock –and considerable anger – when I discovered that hackers had infiltrated the extremely modern 486 computer on which we ran our manufacturer’s rep business years ago in California’s Silicon Valley. The CPU usage indicated 92 percent when the computer was idle and the hard drive light was always on – an “old school” alarm that told us something was wrong. Invading subroutines had barged in over the Internet, installed themselves inside our nice warm PC and were happily grinding away, doing massive numbers of computations for their hacker masters. Take it personally? You bet I did.
Back then, safeguarding infrastructure and critical facilities like oil platforms, nuke plants or wastewater facilities wasn’t top of mind. But today, we must face the staggering odds that a major facility will be taken off-line, have sensitive information stolen, or lose valuable intellectual property through cybersecurity breaches. The possibilities for extortion, the weakening of defense capabilities, and the crippling of communities – or even countries – so that an invading tyrant can make demands are simply overwhelming.
So if we know all this, why does it seem so difficult to take it personally – that is, to realize that individually we have a measure of the overall responsibility and take whatever actions we can? It’s certainly not for lack of caring or good intention. Here are my four favorite answers to that question:
- The consequences of a cybersecurity breach are overwhelming. Humans find it hard to accept the eventuality of potentially catastrophic events. And no one can anticipate and monitor every single threat vector within a facility. Admitting that our process controls or SCADA systems have weaknesses for which we are responsible is decidedly uncomfortable. It’s much easier just to hope it never happens. I know all about that, because that’s often what I do with regard to my home PC and laptops. I don’t want to think about what might happen if all my personal information becomes available to hackers – but I find it overwhelming deciding which of the many anti-virus programs to install (and I wonder if the anti-virus programs carry viruses), and I don’t know how often I should update to protect against the latest viruses. So I let anti-virus program updates go far too long because, after all, I’ll probably be okay, right? Risky assumption!
- Automation systems are often an integration of technologies supplied from all over the world. This creates thousands of potential vulnerabilities that are difficult to anticipate. For example, how do we know what software might be lying dormant in a NIC (network interface card) until it’s installed and activated?
- Financially, some companies handle the risk of damage or loss from cyber-attack with an insurance policy, covering themselves for “business interruption.” The safety of the plant and its employees and the welfare of the community at large are not well considered when business risks are seen as purely financial in nature. An insurance policy may be revealed as a poor strategy in the aftermath of a security breach which severely impacts profitability, threatens the viability of the enterprise and compromises human safety.
- Taking cybersecurity personally means remaining ever-vigilant. And that’s nearly impossible for humans to do. For example, if a controls operator turns off the alarms on the system because he doesn’t need them for that part of the process, and then forgets to turn them back on later when he does need them, he will not know if a parameter has been changed by a cyber-attack while the alarms were turned off. I’m not picking on my I&C pals here, but with everything a controls engineer must do at any given moment, human error is inevitable over time. The controls operator also must decide what indicators, buried within the metric ton of plant data available at any given moment, should be considered anomalies or constitute a threat. Again…overwhelming.
What can we do, now that we are all standing together and taking the cyber threat thing personally? (We are, aren’t we? Yes, I thought we were.) The answer is, we can do one thing more than we did yesterday, and every step forward reduces the chance that we’ll wake up the next day, boot up our computer, and see a message from a cyber-hacker demanding that we open our windows and shout “I'm as mad as hell, and I'm not going to take this anymore” or else they’ll make us all watch endless reruns of The Gong Show. <shiver> No, thanks.
Here are a couple of suggestions for “getting personal” with cybersecurity, outside of what you can find in the (overwhelming) amount of research and data available:
- Get some knowledge, especially that hard-to-find knowledge you can get from your peers who have “been there.” I would refer you to ISA Automation Week, a conference of your peers featuring many well-respected experts in the field of industrial infrastructure cybersecurity. The Industrial Network Security Track is chock-full of information that you will only get one-on-one and not in a textbook. And there are several high-level discussions with industry experts who don’t often appear in front of an industrial audience to share what they know about infrastructure security. Check it out.
- Employ a cybersecurity standard. The ISA99 standard on Industrial Automation and Control Systems Security is a free download for ISA members and is available to non-members as well. We’ll also have knowledgeable people at ISA Automation Week who can help answer questions about the standards and practices of ICS security.
- Stay vigilant and trust your instincts. You know what you know. If you see something…say something.
Okay, autopros*, I’m outta here. Actually, I have to go home and install a new anti-virus program on my laptop. It’s been a while. And you know how I feel – I take it personally.
*autopros = automation professionals … remember?
About the Author
Carol M. Schafer has more than 35 years of experience in the industrial automation and control field as a technical sales and marketing professional. She spent 14 years in the field as principal of a manufacturer’s representative company, selling flow and humidity products, air and gas analyzers, CEM equipment, and sampling systems. She also worked for several years as the East Coast sales manager for a leading weather instrument/systems manufacturer. Carol joined ISA in 1996, and is currently project manager for the Society’s annual conference, ISA Automation Week. She also serves as a senior consultant with the ISA Corporate Partnerships Program. She obtained a bachelor's degree in business administration from the California State University at Sacramento, and a master’s degree in business administration from San Jose State University.