
Weathering the Perfect Storm: ISA in Conversation with Newsweek [Q&A]

Written by Jennifer Halsey | Jul 24, 2020 9:15:00 AM

Earlier this year, Newsweek Vantage published an independent report on cyber risks to critical infrastructure. The International Society of Automation (ISA) served as its expert partner, helping with concept development, research, and survey creation and analysis.

Eric Cosman, a consulting engineer and the 2020 ISA president, and Steve Mustard, an independent consultant and the incoming 2021 ISA president, contributed insights to the report as subject-matter experts. They recently joined a phone call with Nigel Holloway, the director of research and editorial at Newsweek Vantage, to discuss key findings.

At the time of the report's writing, COVID-19 had not yet grown into a global crisis. The group's conversation, recorded in late June 2020, took stock of this development. They discussed the impact of the pandemic on organizational cyber threats as well as the broader industry challenges raised in the report, such as bridging the perceived IT/OT divide and building a strong organization-wide cybersecurity culture.

You are invited to listen to the audio recording below, or read the transcript that follows.


Newsweek: My name is Nigel Holloway and I'm director of research and editorial at Newsweek Vantage. I'm here with Eric Cosman, the 2020 president of the International Society of Automation, and Steve Mustard, an executive board member of ISA. Thanks very much for being here, Eric and Steve. Welcome.

We're here to talk about the findings of a report published in March 2020 by Newsweek Vantage called "Weathering the Perfect Storm: Securing the Cyber-Physical Systems of Critical Infrastructure." We defined cyber-physical systems as engineered systems that orchestrate things like industrial controls to interact with the physical world, including humans, to enable safe, secure, and resilient performance. Critical infrastructure includes the energy sector, transportation, government, and healthcare. Now, since the publication of the report, we've become preoccupied with the impact of the COVID-19 virus. So I wanted to bring our listeners up to date and ask Eric Cosman first, what impact has the pandemic had on cyber-physical security systems in critical infrastructure?


EC: Well, it's difficult to give a specific answer to that without surveying people, but based just on anecdotal evidence, people are, in some ways, combining their cyber response with their pandemic response. The pandemic has shown them that unanticipated events are something they have to be prepared for, and I think we're seeing an increased interest in business continuity planning at the operations level.


Newsweek: Steve, do you want to add to that?


SM: I think Eric makes a great point about continuity planning. One of the common things that gets fed back to us when we talk about cyber incidents is, "Something like this has never happened before, so why do we need to prepare for it?" As Eric points out, COVID-19 shows that you can't always plan for everything to happen exactly as you expect.

Cyber is like that: it's a high-impact, low-probability incident, and many organizations are not well prepared for it. The other thing I would add is that COVID-19, because of the necessity to lock down, has meant that organizations with critical infrastructure equipment have been forced to rely more on remote working. As we all know, working remotely creates new cyber vulnerabilities that perhaps didn't exist in the organization before.

Newsweek: Can you talk a bit about the impact of remote working on cybersecurity, Steve? Obviously, that's a major concern since the pandemic struck.


SM: I think many organizations have had a good cyber posture with regard to remote working: it was denied by default. You had to be physically present to access or maintain equipment.

In many cases, it was by necessity that they had to find ways to overcome that. Equipment still needs to be maintained, and there's still a need for access to commission systems on projects that are under way, and such like. So organizations have been forced to find ways to address that.

From a cybersecurity point of view, the challenge is that when you're forced to do something quickly, you don't always come up with the best solution. There might be a quick and easy fix that gives someone remote access to a system and serves the business need right now, but it doesn't necessarily provide the best, most secure solution going forward.

The second point is that if you make it difficult for people to get access to systems they need for their job, they will find a way around it. The last thing organizations want is people creating their own backdoors that the organization isn't aware of.


EC: I think there's another twist on this, too. Many companies and asset owners have had remote access for certain parts of their systems that were deemed less critical. I have no evidence of this, but I'd be pretty willing to bet that they're now extending that same access into areas of their process that had previously been walled off, like Steve said. The problem is that the remote access method may not have been designed with those extra risks in mind.


Newsweek: Is it possible to reverse engineer and redesign an existing system? It sounds harder than installing a new one that would take account of the vulnerability you're talking about.


EC: Yeah, neither of those is a trivial exercise. You have to do a fairly detailed risk assessment. My concern is that people are extrapolating risk assessments done for less critical systems and applying the results to more critical systems. The assumptions they made may not be valid.

Newsweek: Has COVID had more of an impact on the cyber-physical systems of critical infrastructure than on other sectors?


SM: It has more impact in organizations that have very high availability and production demands compared with other sectors. Again, coming back to some of the points about access and operations, it's much more important to keep those systems running. It's therefore much more important to get access to them, and that creates demands which might not be present in, say, a less critical system. I don't know how significant it is, but I think that's the biggest factor we face with critical infrastructure.


Newsweek: I remember that just before the report was published, there were news reports about a cyberattack on the Department of Health and Human Services here in the United States. It seemed to coincide with the beginning of efforts to control the pandemic. I wonder whether you have any sense of whether hackers and criminal organizations are taking advantage of COVID to focus on critical infrastructure.


EC: I would say that it's highly likely. Cyber adversaries are very much creatures of opportunity. They're very good at taking advantage of weaknesses. One of those weaknesses right now is that everyone is preoccupied with things that they've never dealt with before, and they may be shifting some of their attention away from other areas. It's almost a certainty that some of those adversaries are going to try to exploit that weakness.

We've seen some anecdotal reports in the press. In the healthcare system, for example, which may be, at this time, the most critical of critical infrastructure, adversaries are trying to penetrate for all kinds of reasons, not the least of which is to get access to some of the information being generated by research on the pandemic.


Newsweek: Do you have anything to add to that, Steve?


SM: Well, one of the things that is very common in hacking situations is phishing. Phishing attempts rely heavily on having the right context for asking somebody to provide sensitive information. Something like COVID-19 provides the perfect opportunity, for instance, for a hacker to pretend to be a customer who can't get to the office, and ask an IT company for the credentials needed to access a system, and such like. It creates plausible scenarios in which hackers can get sensitive information. As Eric says, when the organization itself is dealing with many other things, oftentimes something gets dropped. Unfortunately, cybersecurity sometimes doesn't get significant focus today if resources are limited.

Newsweek: Now, let's turn to the report itself. One of the important points was that the survey of more than 400 executives shows that employees are the biggest source of vulnerability. What, in your view, Eric, are the implications of this for cyber-physical security systems?


EC: First of all, I would agree with that finding. I think it's fairly well understood in the security community. After all, employees know their way around inside these systems. Steve referenced phishing, for example. Phishing attacks target those employees precisely so that their knowledge of those internal systems can be put to nefarious ends.

I think that as companies perhaps lessen their focus in certain areas out of necessity, because they're focusing on COVID-19 response, they cannot afford to take attention away from the internal security awareness programs they run with their employees. They have to, if anything, step it up a little and make people understand that just because we're dealing with this crisis, security is still important, and it's everybody's problem. A lot of companies do mock phishing exercises to see if people actually take the bait. I would hope that those are being maintained, if not increased a little.


SM: I would agree with that. As they say, people are the weakest link, but they're also the best line of defense for an organization. People are the weakest link because, as Eric says, they're the ones who are going to get targeted by social engineering attacks. But they're also the people who can spot when something suspicious or unusual is going on.

That awareness piece is really critical. If you see something peculiar that you're not sure about, then you should raise it. There's no more important time to do that than now, when a lot of people are being forced to work in different locations, maybe at home, maybe elsewhere, with unfamiliar setups and different security. You've got a whole host of different environmental conditions to grapple with, and at the same time, you can't forget about the importance of security.


Newsweek: Another conundrum that the report focused on: organizations need to balance efficiency by means of, say, IT/OT integration, against the need for security. That's obviously a crucial balance that they have to achieve. Can you talk a bit about that, Eric, and then if you could follow on, Steve, and give your take on it?


EC: This is an interesting one, because it comes up every time we talk about security. It's the cultural aspect of the whole problem. People tend to view security as an impediment to doing their job. In the security community, we have always been challenged, and continue to be challenged, to address that. First, determine whether it's real or a matter of perception. Sometimes it is perception, and sometimes people even use it as an excuse. But to the extent that it is real, we need to find ways to make security robust, yet almost invisible.

Now, the good thing is that technologies are coming to the fore that can help with that. The theme that runs through all of this is to integrate security into your work processes in such a way that it is not seen as something added on. In order to do that, you've got to collaborate. You engage all of the stakeholders and get their buy-in, their investment, their ideas, their perceptions, and everything else to come up with a truly robust solution. It's an evolving area of research, investigation, and engineering design. I know we're not there yet, but we're better off than we used to be.


SM: I think that's a great point. What Eric said about invisible security, or "frictionless" security as some people call it, is critical to this efficiency. Convenience is at the other end of the scale from security. At the moment, organizations make quite a binary choice: either "I'm going to have maximum security, which means, for instance, nobody has remote access," or they go all the way to the other side and say, "Everybody can do anything they want, wherever they want," with no security at all. The right answer, as Eric says, is not only somewhere in between those two; it actually changes over time as well. It has to be continually updated and adapted as the consequences change.

When we talk about situations like COVID-19, one of the challenges for organizations is that some of the supposedly temporary measures put in place to achieve or maintain efficiency may be forgotten once things return to normal, or the "new normal." Any backdoors that are left behind are things organizations really need to take care of as they move forward, to make sure they're continually addressing how secure, and how vulnerable, they are.


EC: I think Steve touched on this earlier. He said that, faced with stringent security and the necessity of getting the job done, people are very ingenious. They'll find ways around it. One of my former colleagues used to say that it's impossible to make things foolproof because fools are so ingenious. There's truth to that. If you take the absolute approach to security, using Steve's "lock it down" end of the spectrum, it's seldom effective, because people will find creative ways around it. There are countless examples of people putting sensitive information on thumb drives, for example, because they don't have a convenient and secure way to transfer it. That just makes the problem worse.

Newsweek: I think it's fascinating that, each time we talk about technological solutions, we end up talking about human behavior. That leads into the fact that the personnel in IT and OT do not collaborate particularly well. That is a problem if you're going to take a holistic approach to cyber-physical security. People have been talking about this lack of collaboration for many years. Why is this problem so difficult to solve? How important is it to solve this problem to achieve a holistic approach?


SM: I can comment on that first. I think the most obvious problem is that the cultures of the IT and OT teams are so very different. The main observation is that IT is very tolerant of rapid change and failure. In fact, agile methodologies emphasize fast failure, trying things, and continuous change. That's the very opposite of what we want in the OT environment, where we want stability, no change, and reliability.

Especially when you're talking about introducing new technology, you're naturally talking about making changes to things, the way things work, the way procedures are done. OT people, by their very nature, do not like change at all. It introduces risk, real or perceived. It's hard sometimes to square that circle and to bring them in line and say, "We understand the risks. We've assessed the risks, and we understand what changes need to be made in order to continue to maintain the right level of safety and security." So, until we can find a way to bring those cultures together, I think it is always going to be a big challenge.


EC: I think we've all heard the phrases "the IT/OT divide," "the IT/OT conflict," or "IT versus OT." My own view is that it's an oversimplification, and maybe a misleading one.

First of all, the cultures are absolutely different, but it's not as simple as IT versus OT. People who have researched and investigated the space in depth will tell you that there are organizations, teams, and people on what we might consider the IT side that are every bit as risk-averse as the engineers in operations, because they are in fact running mission-critical systems. If you're running a point-of-sale system or an enterprise resource planning system for a large corporation, there are airtight, locked-down management-of-change processes every bit as robust as those engineers use in operations. So it's not black and white; it's a spectrum. The spectrum runs from, as Steve put it, the need for continuous evolution and rapid try-and-fail methodologies, to the other end, an absolute inability to tolerate that.

We all have a phone in our pocket or on our hip, and chances are you have the auto-update feature enabled. Whether you're getting apps from the Apple App Store or Google Play or wherever, your applications will probably update whenever somebody decides to put out a new version.

Well, that is absolutely not tolerated in a highly critical application, be it in operations or in a back-end business-critical system. Those changes have to be planned and tested ahead of time. That's the spectrum you're running on. I think sometimes the use of IT/OT just clouds the issue, because it allows people to assume that if you're an IT person, you don't understand high availability. That's a false assumption.

Newsweek: So how can organizations overcome this problem? I'm thinking of IT teams and OT teams swapping for a week, or somehow enabling them to talk about their common problems and share ways to overcome them. What are your main solutions?


SM: I can give some suggestions that I've seen work well. You blend the teams. The point Eric made is quite correct: this IT/OT framing actually creates some of that division in the first place. Bringing experts together into one combined team really does help a lot in overcoming some of those challenges.

The other problem we have in big organizations is that they're often structured in a way that reinforces the division between the IT function and the OT function. IT is often some kind of central function that provides support to the business, and the OT team is often part of the engineering function, which might be regional and local. They really don't like the central team, whoever they are, IT or otherwise, interfering in their day-to-day business. So somehow bringing the skills together into one team is, I think, the best solution. It can be difficult, but that's one approach.


EC: I think that you have to go back to the fundamentals of team building when you're pulling a group of people together to achieve a result. You want to find the right people with the right skills and the right experience, who have the right understanding, irrespective of what organization they may come from.

You bring them together, and the first thing you do is establish their common vision. You tell them what you're trying to achieve. The vision naturally morphs, and if you're successful, they articulate and adopt a shared mission in order to achieve that vision.

Whether the mission is to launch a capsule to dock with the International Space Station, or whether it's to have an effective, secure, and responsive operations and control system in manufacturing, or whether it's a point-of-sale system or an enterprise resource planning system, the exercise of putting that team together is the same. I've had experience doing that in my career before I retired, and it's been very successful.

I've seen many examples of bringing people who worked in a traditional IT role into the operations side, and they get a whole new view of the world. They start to understand the imperatives, and work towards addressing those imperatives. My experience has been that it's easier to bring those people into the OT side than it is the reverse, but I'm sure there have been cases where it's done both ways.

The last thing I want to stress before I stop is that this is not an organizational problem, lest people think they can just combine two organizations under the same manager and that will solve it. It won't. As Steve pointed out, it is a cultural problem. It is a mission-critical problem.


Newsweek: Another big challenge is complacency. So often, organizations are metaphorically closing the stable door after the horse has bolted, once they've been hit by a cyberattack. How can the problem of complacency be dealt with to achieve a risk-aware culture?


SM: Let me offer thoughts based on something Eric said early on about disaster recovery and business continuity planning. Thinking about what can happen and being prepared for it is key to successful business continuity. Unfortunately, for whatever reason, we're still not at the point where organizations see cybersecurity as a cause of significant failure, whether that be a health, safety, or environmental incident or a production loss. They still believe it's a relatively minor thing that usually happens to someone else and not to them.

One of the biggest factors here, I think, is the contrast with the health-and-safety culture in organizations. If you see some bad behavior or poor safety practice, you're allowed to intervene; you're allowed to stop the job; you're allowed to raise an issue. That gets treated very, very seriously, even if it didn't actually result in an accident. We're not there yet with cybersecurity.

If you do the things that Eric mentioned earlier, like putting some sensitive information on a thumb drive where somebody could get hold of it, you don't report that as a near miss. You don't say, "We were lucky. We didn't lose that information, but we could have done."

If you look at the health-and-safety world, the safety triangle has been around for a long time. It establishes the principle that near misses become minor accidents, and minor accidents become major accidents. There's a well-understood relationship between them: the more near misses you have, the more likely you are to have serious incidents in the future.

So near misses are a leading indicator of that kind of safety behavior. If we use that same approach for cybersecurity, we can start to say, "Well, our organization is having a lot of near misses in cybersecurity, which means we're more likely to have a serious cybersecurity incident in the future."


EC: I agree with the parallel with safety. The words we need to focus on here are education, awareness, sensitivity, and understanding: understanding that it can, in fact, happen to you. The one thing that makes this more difficult in security is that in safety there's an inherent self-interest. If you say in your safety program, "Our objective is to make sure that you, individually, go home every night in the same condition you were in when you walked in in the morning," people can buy into that. They can personalize it. It's a lot harder to personalize security, because they may not feel the impact of the consequences.


Newsweek: Well, thank you very much, both of you. We've come to the end of our recording. I'd like to thank Eric Cosman and Steve Mustard for their very interesting insights. I recommend that you read the report "Weathering the Perfect Storm: Securing the Cyber-Physical Systems of Critical Infrastructure," published by Newsweek Vantage. Thank you again, gentlemen.


EC: You're very welcome. Thank you.


SM: Thank you.

This transcript has been edited for clarity.