ISA Interchange


Ask the Automation Pros: The Use of Artificial Intelligence in Process Control

The following discussion is part of an occasional series, "Ask the Automation Pros," authored by Greg McMillan, industry consultant, author of numerous process control books and 2010 ISA Life Achievement Award recipient. Program administrators will collect submitted questions and solicit responses from automation professionals. Past Q&A videos are available in a playlist on the ISA YouTube channel, and previous posts from this series are available on the ISA Interchange blog.

Looking for additional career guidance, or to offer support to those new to automation? Sign up for the ISA Mentor Program.

Russ Rhinehart and Brian Hrankowsky’s Question

What are your thoughts about the use of artificial intelligence in process control?

Greg McMillan’s Perception

There are many conceptions of what artificial intelligence (AI) is. I suggest we focus on AI that utilizes instrumentation and is combined with existing technologies in the process industry, such as proportional-integral-derivative (PID) and model predictive control (MPC) for closed loop control, and principal component analysis (PCA) and projection to latent structures (PLS) for data analytics, to deal with situations that would normally require human intervention. This includes machine learning and deep learning, which are modeled after human decision-making processes that use knowledge of technologies, processes and equipment.

The IBM site “What Is Artificial Intelligence (AI)?” offers the following explanation of the differences between machine learning and deep learning:

“Machine learning and deep learning differ in the types of neural networks they use, and the amount of human intervention involved. Classic machine learning algorithms use neural networks with an input layer, one or two ‘hidden’ layers, and an output layer. Typically, these algorithms are limited to supervised learning: the data needs to be structured or labeled by human experts to enable the algorithm to extract features from the data.

“Deep learning algorithms use deep neural networks — networks composed of an input layer, three or more (but usually hundreds) of hidden layers, and an output layer. These multiple layers enable unsupervised learning: they automate extraction of features from large, unlabeled and unstructured data sets. Because it doesn’t require human intervention, deep learning essentially enables machine learning at scale.”
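
To make the distinction concrete, here is a minimal, purely illustrative sketch contrasting a shallow network with a deeper one (assuming scikit-learn is available; the synthetic data and layer counts are hypothetical, and real deep learning work would typically use a dedicated framework and far more data):

```python
# Illustrative only: a "classic ML" shallow network vs. a deeper network,
# both trained on a small synthetic, labeled (supervised) data set.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 3))            # structured, labeled inputs
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]  # known target (supervised)

# Classic machine learning: one or two hidden layers, features prepared by people
shallow = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X, y)

# "Deep" network: many hidden layers (only five here for brevity), intended to
# learn its own feature representation from larger, less structured data sets
deep = MLPRegressor(hidden_layer_sizes=(32, 32, 32, 32, 32), max_iter=5000,
                    random_state=0).fit(X, y)

print("shallow R^2:", shallow.score(X, y), "deep R^2:", deep.score(X, y))
```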

Michel Ruel’s Response

Thanks to Russ and Brian for suggesting an interesting topic.

Initially, I contemplated contrasting various process control methodologies:

The traditional PID controller is renowned for its ease of tuning to achieve specific goals such as performance metrics, operational objectives and mimicking operator behavior.

Subsequently, I considered assessing the advantages for selecting the most suitable approach based on existing resources (assuming a thorough design review and proper equipment configuration):

  • Do we already attain satisfactory performance levels?
  • Are there models available or multiple models to work with?
  • Do we possess operational insights necessary for effective implementation (operator expertise)?
  • Is historical data accessible for AI?

I like to use this decision tree to select the right approach:
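
As a purely illustrative sketch of that kind of decision logic (a hypothetical simplification in Python; the actual tree in the figure below may weigh the questions differently):

```python
def select_control_approach(performance_ok: bool, model_available: bool,
                            operator_expertise: bool, historical_data: bool) -> str:
    """Hypothetical simplification of the selection questions listed above."""
    if performance_ok:
        return "Keep the existing strategy (e.g., well-tuned PID); focus on maintenance"
    if model_available:
        return "MPC: use the known process model(s) to optimize several variables"
    if operator_expertise:
        return "FLC: encode skilled operator behavior as fuzzy rules"
    if historical_data:
        return "AI control: learn from historical and real-time data"
    return "Improve instrumentation and gather data before adding controller complexity"

print(select_control_approach(False, True, True, False))  # -> MPC suggestion
```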

[Figure: Decision tree for selecting the right control approach]

Here is a discussion of the different techniques:

PID (proportional-integral-derivative control): PID control acts as an error regulator, focusing on driving the error to zero. It is commonly applied in systems with varying or nonlinear models, making it critical to choose tuning parameters carefully for stable performance. PID operates in a single-input, single-output (SISO) fashion, though combining multiple PID controllers can introduce complexity to the control scheme.

MPC (model predictive control): In contrast to PID, MPC leverages process models to optimize several variables concurrently toward predefined objectives. One key challenge with MPC is that it requires known process models. Unlike PID, its performance can degrade when the model does not match the process, often necessitating a matrix of models for effective control of intricate processes.

FLC (fuzzy logic controller): Alternatively, FLC steps in when dealing with varying or unknown models by emulating skilled operators. Rather than modeling the process directly (as with MPC) or focusing on error reduction (like PID), FLC mimics ideal operator behavior in different scenarios.

AI control: Utilizing historical and real-time data, AI controllers strive to achieve objectives without prior process knowledge. Unlike FLC, AI systems operate as black boxes, offering adaptability based on data without an explicit understanding of the process or operation.

Comparing each control approach: when utilizing PID, tuning involves leveraging process knowledge to set appropriate controller parameters quickly, based on the desired relationship between those parameters and the process response. For instance, flow loops typically necessitate a low proportional gain (<0.1), while level loops demand higher values depending on the application.
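
For readers who want to see the calculation whose parameters are being tuned, here is a minimal sketch of an idealized, parallel-form positional PID (the names, units and sample values are hypothetical; industrial PID forms and structures vary and include output limits and anti-reset windup):

```python
class SimplePID:
    """Idealized parallel-form positional PID: out = Kp*(e + (1/Ti)*integral(e) + Td*de/dt)."""
    def __init__(self, kp: float, ti_s: float, td_s: float, dt_s: float):
        self.kp, self.ti, self.td, self.dt = kp, ti_s, td_s, dt_s
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Real controllers also clamp the output and protect against reset windup.
        return self.kp * (error + self.integral / self.ti + self.td * derivative)

# e.g., a flow loop with a low proportional gain and no derivative (values hypothetical)
flow_pid = SimplePID(kp=0.1, ti_s=3.0, td_s=0.0, dt_s=0.5)
output = flow_pid.update(setpoint=50.0, measurement=48.0)
```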

In MPC, complex modeling replaces educated guesses, emphasizing the importance of well-defined process models.

FLC relies on understanding operational success rather than detailed process models, making it a valuable choice where processes are not well-characterized.

For AI control, substantial data and clear objectives are imperative to guide the system effectively toward its goals.

Ultimately, effective process control surpasses the sophistication of the controller alone. Just like in a car race where a skilled driver (controller) needs a high-performance vehicle (well-designed process and equipment) to succeed, achieving optimal performance requires a holistic approach beyond just employing an "intelligent controller."

If a computer program can drive a car successfully in traffic, we should be ready to see such controllers performing in process control!

Greg McMillan’s Response

I agree with the ISA August 2022 InTech feature article “Enhancing Human Effort with Intelligent Systems” that sees the major opportunities for AI to be in predictive maintenance, quality inspection and assurance, manufacturing process optimization and supply chain optimization.

Conducting design of experiments (DOE) and generating a large variety of scenarios that include instrumentation maintenance and performance and supply chain demands, using digital twin first principle dynamic simulations, could train AI neural networks. Definitive, identifiable abnormal situations could also be introduced. The source of oscillations could possibly be diagnosed based on amplitude, period and total loop dead time. For example:

  • Decaying oscillations with periods less than 4 dead times are most likely caused by a higher than desirable PID gain.
  • Periods larger than 6 dead times are most likely caused by greater than desirable integral action.
  • Periods larger than 20 times the dead time are most likely caused by the product of the PID gain and reset time being too low for a near-integrating, true integrating or runaway process.
  • If the oscillation amplitude has a constant component that does not change with PID gain but whose period changes with PID gain and reset time, the cause is most likely valve resolution (e.g., stiction).
  • If the oscillation amplitude of an integrating process has a constant component that does change with PID gain and whose period changes with PID gain and reset time, the cause is most likely lost motion (e.g., backlash).
  • Oscillations that develop for large upsets and setpoint changes are most likely caused by poor valve response time (e.g., stroking time).
  • Oscillations that develop at low production rates are most likely caused by poor measurement rangeability, worse valve resolution and lost motion near the closed position, and higher process gains and process dead times at low flows.
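
To show how such heuristics might be encoded, here is a minimal rule-of-thumb sketch in Python (a hypothetical simplification of the diagnostics above, not a complete or validated tool; the thresholds follow the text):

```python
def diagnose_oscillation(period_s: float, total_dead_time_s: float,
                         decaying: bool = False,
                         amplitude_changes_with_pid_gain=None,
                         integrating_process: bool = False) -> str:
    """Map oscillation symptoms to a likely cause using the rules of thumb above.
    Illustrative only; amplitude_changes_with_pid_gain is True/False/None (unknown)."""
    ratio = period_s / total_dead_time_s
    if amplitude_changes_with_pid_gain is False:
        return "Likely valve resolution limit (e.g., stiction)"
    if amplitude_changes_with_pid_gain is True and integrating_process:
        return "Likely lost motion (e.g., backlash)"
    if decaying and ratio < 4:
        return "Likely PID gain higher than desirable"
    if ratio > 20:
        return ("Likely product of PID gain and reset time too low for a "
                "near-integrating, true integrating or runaway process")
    if ratio > 6:
        return "Likely integral action greater than desirable"
    return "Inconclusive; also check valve response time, rangeability and operating point"

print(diagnose_oscillation(period_s=120.0, total_dead_time_s=10.0))
```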

The ability to handle process startups, shutdowns, transitions and abnormal operations could be improved by introducing scenarios into digital twin first principle dynamic simulations that have been shown to match plant data. AI can help identify causes and effects, resulting in possible solutions in terms of procedure automation and state-based control.

There are many challenges to simply using plant operation data to improve closed loop control. Foremost are the complexity and slowness of dynamic responses in process control not seen in machine control, including large dead times, negative and positive feedback time constants, unidirectional response (batch applications), integrating action and open loop gains that change with time, production rate, equipment conditions and stream compositions. There is also the interaction between loops, the dramatic effect of tuning and algorithms (e.g., PID forms and structures) and the transfer of variability from controlled variables to manipulated variables. Also, plants are increasingly unwilling to change setpoints or flows for a design of experiments (DOE).

Many millions of dollars were spent on expert systems in the 1980s and 1990s in the process manufacturing companies that I worked for, with only one success being a smart alarm that I think could have been achieved with some simple concise logic. There were so many false alarms and distracting alerts in other control rooms that operators asked for the expert systems announcements to be turned off. Eventually, the systems fell into disuse. A significant problem was the inability to drill down and see the order of execution of rules or test the value and scope of individual rules.

Neural networks have a history of performance issues I have personally experienced due to correlations between inputs, steady state closed loop data, lack of change in operating point or operating conditions, inability to drill down into hidden layers for analysis and understanding, interpolation process gain reversals, missing compensation for complex dynamics and bizarre extrapolation beyond the data training set. Principal component analysis (PCA), open loop tests and more sophisticated dynamic compensation could help address many of these issues, but there seems to be a lack of communication between the neural network and PCA and modeling experts to enable a better total solution. There are AI opportunities for optimization, diagnostics, inferential measurements and metrics and getting us all (e.g., operators, research scientists, process and mechanical engineers, automation professionals, maintenance professionals and information technology specialists) on the same page if we are realistic about what is achievable, as discussed in the Control Talk columns “Top of the Bottom Line” and “At the IIoT Crossroads.”
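
As one small example of how PCA could be combined with a neural network to mitigate correlated inputs (a sketch assuming scikit-learn; the synthetic data, component count and model choice are placeholders, not a recommended workflow):

```python
# Sketch: decorrelate inputs with PCA before fitting a neural-network model.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
base = rng.normal(size=(1000, 3))
# Fourth input is nearly a copy of the first, i.e., strongly correlated
X = np.column_stack([base, base[:, 0] + 0.01 * rng.normal(size=1000)])
y = base[:, 0] - 2.0 * base[:, 1] + 0.1 * rng.normal(size=1000)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=3),                  # drops the redundant direction
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
```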

I think there is a particular opportunity for us all to be able to ask questions and get AI responses that we can further investigate and possibly use to improve control system and plant performance.

Todd Jaco suggested: “It would be interesting to hear a discussion on the use of AI and digital twins to look at actual results versus optimal (i.e. first principle) results for a given operating period and suggest how much of the deviation is related to operational, technological and automation tuning/configuration issues.”

Mark Darby’s Response

Here are my thoughts:

First a look back. Neural nets have been successfully applied in one particular industry — polymers — going back to the mid-1990s. It is important to point out that they used a hybrid modeling approach based on a Wiener structure — linear dynamics with the nonlinear static part modeled with a neural net. These industrial applications dealt explicitly with the problem of extrapolation outside their training base.
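
For readers unfamiliar with the structure, here is a minimal sketch of a Wiener-type model (linear dynamics feeding a static nonlinearity); the first-order filter and polynomial fit are hypothetical stand-ins for the dynamics and the neural net used in those industrial applications:

```python
# Sketch of a Wiener structure: linear dynamics followed by a static nonlinearity.
import numpy as np

def first_order_filter(u, dt, tau):
    """Discrete first-order linear dynamics: tau*dx/dt + x = u."""
    x = np.zeros_like(u)
    a = dt / (tau + dt)
    for k in range(1, len(u)):
        x[k] = x[k - 1] + a * (u[k] - x[k - 1])
    return x

rng = np.random.default_rng(2)
u = rng.uniform(0.0, 1.0, size=2000)                   # input moves (e.g., valve position)
x = first_order_filter(u, dt=1.0, tau=10.0)            # linear dynamic part
y = 3.0 * x**2 - x + 0.05 * rng.normal(size=x.size)    # "measured" nonlinear output

# Fit only the static nonlinearity y = f(x); here a cubic polynomial stands in
# for the neural net used in practice.
coeffs = np.polyfit(x, y, deg=3)
y_hat = np.polyval(coeffs, x)
print("fit error:", float(np.sqrt(np.mean((y - y_hat) ** 2))))
```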

Over this same time frame, neural nets were also successfully applied to soft sensor development where nonlinearities were significant. However, the majority of soft sensor development utilized a different modeling approach, such as PCA and PLS, perhaps with nonlinear terms included in the model based on engineering insight.
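
As a minimal illustration of the PLS route to a soft sensor (assuming scikit-learn; the inputs, number of latent components and synthetic data are placeholders):

```python
# Sketch: a PLS-based soft sensor inferring a lab quality value from process inputs.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 8))                              # e.g., temperatures, flows, pressures
y = X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=300)   # e.g., lab analysis result

pls = PLSRegression(n_components=3).fit(X, y)
y_inferred = pls.predict(X)                                # inferential (soft sensor) estimate
print("R^2:", pls.score(X, y))
```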

In the 1990s, important academic contributions to the application of neural nets were made by the process systems community. These included hybrid modeling with neural nets, where unknown relationships and/or parameters were fit with a neural net model. Another notable approach incorporated PLS-type functionality into the network but allowed nonlinear terms instead of the linear ones used in PLS. Other contributions addressed detecting abnormal operation using neural nets in a classification approach (which can be thought of as a nonlinear PCA).

Subsequent development in AI and ML was mostly done by big tech, and was therefore not motivated by the applications or needs of the process industry. And so these approaches may not carry over 100% to our space. Where they do, of course, is great; image processing is one example. Newer networks now build in the provision to model dynamics, offering an improvement over the recurrent networks used in the past. An example is the transformer architecture that underlies ChatGPT, which was developed for large language models but has been shown to be equally successful in modeling time series data. I’ve seen promising results with this technology for soft sensors and hybrid modeling, but we have so far seen little industrial application.

We are still early in the journey of figuring out what the new developments in AI and ML mean for the process industry. There is a lot of hype, but I believe there is a lot of promise. I think the biggest impact will be in leveraging or combining these AI and ML tools with existing approaches, as opposed to assuming they’ll be total replacements.

David De Sousa’s Perspective

In addition to all of the preceding valuable thoughts and perspectives from Michel, Greg and Mark, I will also add the potential benefits AI could have in enhancing and supporting the daily activities of colleagues working in process control and process automation.

We are starting to see some examples coming from industry practitioners and researchers alike. Three relatively recent examples come to mind:

  • Last year, at the 2023 ETFA – IEEE 28th International Conference on Emerging Technologies and Factory Automation, the Best Paper Award went to a remarkable paper titled "ChatGPT for PLC/DCS Control Logic Generation." The authors created a curated collection of 100 natural language prompts to generate control logic in IEC 61131-3 Structured Text using large language models (LLMs). Some of the prompts are inspired by real engineering projects. The collection is structured in the following categories: Standard Algorithms, Mathematical Functions, PLC Programming Tasks, Process Control, Sequential Control, Interlocks, Diagnostics/Communication, Advanced Process Control, Various Engineering Inputs and Programmer Support. They tested the prompts by generating answers with ChatGPT using the GPT-4 LLM. It generated syntactically correct IEC 61131-3 Structured Text code in many cases and demonstrated useful reasoning skills that could boost control engineer productivity.
  • This year, the 46th ICSE, the joint IEEE/ACM International Conference on Software Engineering co-hosted the First International Workshop on Large Language Models for Code (LLM4Code 2024). In a paper titled “LLM-Based and Retrieval-Augmented Control Code Generation,” the authors used LLMs and retrieval-augmented generation to create IEC 61131-3 Structured Text control logic with proprietary function blocks. With this method, control engineers could benefit from the code generation capabilities of LLMs, re-use proprietary and well-tested function blocks and speed up typical programming tasks significantly. The ICSE24 paper evaluated the method using a prototypical implementation based on GPT-4, LangChain, OpenPLC and the open-source OSCAT function block library.
  • More recently, in a paper published in Elsevier's Journal of Systems and Software titled “Fast state transfer for updates and live migration of industrial controller runtimes in container orchestration systems,” a major industrial automation OEM demonstrated their success in updating cyclic control applications at runtime without disrupting or stopping them. They used OPC UA, as well as open62541, OpenPLC and StarlingX (all open-source) to validate the approach, in addition to their proprietary branded software.

Here are some closing thoughts from an AI event I attended a couple of weeks ago (AI Cybersecurity Forum: Insights from the Front Lines), hosted by the SANS Institute, where the impact of AI on our process control work was also discussed, particularly when it comes to the security of critical systems.

  • AI can be a threat, but it can also enhance our work on threat hunting and intelligence.
  • Our younger colleagues currently working in the ever-expanding field of industrial process automation and control will benefit from gaining literacy in AI: the fundamentals, theories and methodologies, the differences between them, and their applications.
  • As people in the industry have mentioned before, our future jobs will not be taken away by AI, but by other engineers who know how to use AI to gain a competitive advantage in our field.

Michael A. Taube’s Perspective

“AI,” machine learning (ML) and deep learning (DL) all amount to large statistical regressions. Obtaining useful models from these applications requires LOTS of “high frequency” data with lots of movement, as well as lots of excursions outside the desired performance boundaries, all of which is required so that the model “knows” the nominal location of the “edge of the cliff.” Much long-term historical data has been overly compressed in the name of saving disk space. Consequently, the expression “Garbage In, Garbage Out” very much applies.
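
To illustrate the compression point with a toy example (deadband-style compression on a synthetic signal; real historian algorithms such as swinging-door compression differ in detail):

```python
# Toy example: deadband-style compression discards samples that move less than a
# threshold, flattening exactly the small, fast movement an ML model would need.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0, 600, 1.0)                                     # 1-second samples
signal = 50.0 + 0.3 * np.sin(t / 20.0) + 0.1 * rng.normal(size=t.size)

def deadband_compress(values, deadband):
    """Keep a sample only if it differs from the last kept sample by more than the deadband."""
    kept = [0]
    for i in range(1, len(values)):
        if abs(values[i] - values[kept[-1]]) > deadband:
            kept.append(i)
    return np.asarray(kept)

kept = deadband_compress(signal, deadband=0.5)
print(f"kept {kept.size} of {signal.size} samples "
      f"({100.0 * kept.size / signal.size:.1f}%); the fine structure is lost")
```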

As with any other statistical model, ML does fairly well with interpolation, but over-fitting has the well-known effect of making extrapolation dodgy, at best. As has been pointed out, closed-loop data often skews model results in strange ways. And, as with all ML applications, the “domain expertise” is still required to ensure the model nominally reflects reality.

For process control applications, one area that I’ve yet to see effectively addressed is an understanding of the physical limitations of control valves, instrument ranges, etc. This was an issue recognized by early model predictive control (MPC) developers: the applications were built to recognize that they did not have DIRECT control of the process. Thus, understanding when a PID controller was constrained or limited in movement in one or both directions was foundational. ML applications (as far as I’ve seen, limited as that is) don’t seem to have grasped this concept.

Lastly, the use of historical data for “learning” is subject to ensuring that the underlying process and control structures are the same for the learning data as for current operations (in addition to the compression issues cited above). Thus, changing control valve capacity, heat exchangers and/or pumps, etc., might skew the model and give unreliable or unpredictable results.

Ed Farmer’s Perspective

During the 1950s, a major oil company set about adapting new technology to its process control efforts. Three super guys who had brought the refinery through WWII were promoted to the “Valve Shop,” where the increasing spectrum of valves was modified for process control applications. Their work included designing and machining “trim” so that a valve’s response matched the needs of a specific application. These valves marked a huge improvement in process control capabilities because the valve shop could “build in” special characteristics. I recall them being referred to as “smart valves.”

It was pushing 20 years later that I worked with them on enhancing process control with analog computers, e.g., PID controllers. A few were built with pneumatic components but it was clear that electronics were the future.

In the late ‘70s, I was awarded work on a refinery unit optimization. It involved an electronic loop-focused control approach, but augmented with broad information collection and some pre-processing based on signs that some parameter was changing. All these strategies were deterministic and reduced operator involvement and hence increased the amount of process an operator could effectively manage.

In the ‘80s, computers and programmable controllers came along and carried our earlier efforts to bigger and faster systems that required even more information and could manage more optimal control over larger systems. Operator duties migrated toward ensuring the system was running with confidence to the desired results.

Faster computers and networks came along facilitating even greater performance over larger process areas. My client purchased pipeline systems in five states. They moved all the regional control rooms to a single central facility in Texas from which control and optimization of the whole thing could be handled by a small staff.

As automation expanded, it became possible to track changing conditions and adjust the often-numerous control parameters toward optimization. At one point, I demonstrated optimization of a control application on the other side of the world from an iPhone. All this demonstrated the tremendous advancement in optimization and control made possible by electronic information management. Increased computer power provided incredible capability in information gathering, analysis and use.

So, nothing’s changed since my old friends first walked into their refinery’s valve shop. We see an opportunity, imagine a way to take advantage of it, adapt (or create) the necessary technology, and carefully test, evaluate and monitor the technology, the strategy and the outcomes. The application of our knowledge and the capability of the technology increase with each project, bringing those involved to wonder, “Since we can now do this, what if we could …” Each leap, though, involves a series of steps often involving experience, insight, education, perspective, vision and an idea.

Today perhaps, three guys (um… a Bill, Leon and Dick) might walk into a process situation and see, “This would work so much better if…”. Those three guys probably wouldn’t have a future in a “refinery valve shop” but fortunately for keeping our society moving along, that’s not necessary. This has all happened from quantum leaps in available equipment with applications envisioned by the leading people standing in front of the opportunity at the time it is spawned.

Motivation for this sort of thing comes from perceived value. Often, that’s economic, but may be from a desire to expand knowledge, preserve resources or enhance society.

As with all the forces that affect human civilization, the value, even the efficacy, of the motivation may not be uniform across all those affected by it. Fundamentally, there is the truth, which often comes from the basics of greater nature. “Believing” is highly personal, while societal value involves the truths of reality, regardless of individual desires. In process control, the efficacy of a particular expert system is easily measurable and quantifiable. In the greater societal scope, evaluation can be difficult, and even deliberately warped. This easily becomes a significant issue.

David De Sousa’s Updated Perspective on Recent News and Development

Normally Unattended Facilities (NUF) are those where operations are either completely automated or operated remotely, with no personnel typically onsite. There are several challenges (technological, logistical, financial and regulatory) facing a broader application of NUF approaches in the industry, as well as open questions about how and when these challenges could be addressed. There are industry-led initiatives aimed in this direction, like the IOGP's (International Association of Oil & Gas Producers) Normally Unattended Facilities (NUF) Expert Group, a collaborative industry effort addressing encumbrances in codes, standards and regulations, as well as encouraging technology development initiatives that could enable this new operating philosophy and, ultimately, position NUF as a safe, cost-effective and widely accepted design and operation method for oil and gas facilities.

AI combined with advanced model predictive control and advanced regulatory control strategies may help in reaching this target.

On May 24, two leading Japanese companies in the oil and chemical sector announced that they had started continuous autonomous operation of an atmospheric distillation unit for processing crude oil back in January 2024.

The atmospheric distillation unit, located in a refinery in Kawasaki, is currently operated autonomously by an AI system.

With 24 key operational factors to control and as many as 930 sensors to monitor, the atmospheric distillation unit requires an especially high level of skill and experience. The AI system simultaneously adjusts 13 final control elements to stabilize fluctuations resulting from crude oil switching as well as changes in crude oil throughput.

The AI system has demonstrated higher stability and efficiency compared with previous human-based operations, successfully maintaining stability even under external disturbances by keeping the key operational values close to their targets.

[Figure: Manual operations vs. AI autonomous control]

This is the first example in the world of reinforcement learning AI being formally adopted for the direct control of a plant. In a previous pilot test in 2022, which extended over a consecutive 35-day (840-hour) period, the two companies had already confirmed that the AI solution could control distillation operations that were beyond the capabilities of their existing control methods (based on PID control plus advanced process control/model predictive control).

The two companies noted similar results with the continuous autonomous operation of a butadiene extraction unit, announced in 2023.

[Figure: Crude oil switching]

How much of this experience, which combines AI and advanced control strategies to achieve autonomous and normally unattended operations, can be reproduced by other operators and in different process units is still something the industry needs to assess, also from a safety and reliability perspective.

Brian Hrankowsky’s Follow-Up

I thought since I instigated this that I should provide a response. Before doing so, I read the content from Michel, Greg, Mark, David, Michael and Ed. I have no new insights to add to what they have contributed. What resonated with me was the basic concept that great engineers constantly seek to understand where there are business/process optimization opportunities and evaluate adding new tools to their toolbox with the goal of being able to realize value. They are less concerned with the name, category, source, original intent, etc. of the tool and more concerned with selecting the right tool for the specific problem.

Greg McMillan
Greg McMillan has more than 50 years of experience in industrial process automation, with an emphasis on the synergy of dynamic modeling and process control. He retired as a Senior Fellow from Solutia and a senior principal software engineer from Emerson Process Systems and Solutions. He was also an adjunct professor in the Washington University Saint Louis Chemical Engineering department from 2001 to 2004. Greg is the author of numerous ISA books and columns on process control, and he has been the monthly Control Talk columnist for Control magazine since 2002. He is the leader of the monthly ISA “Ask the Automation Pros” Q&A posts that began as a series of Mentor Program Q&A posts in 2014. He started and guided the ISA Standards and Practices committee on ISA-TR5.9-2023, PID Algorithms and Performance Technical Report, and he wrote “Annex A - Valve Response and Control Loop Performance, Sources, Consequences, Fixes, and Specifications” in ISA-TR75.25.02-2000 (R2023), Control Valve Response Measurement from Step Inputs. Greg’s achievements include the ISA Kermit Fischer Environmental Award for pH control in 1991, appointment to ISA Fellow in 1991, the Control magazine Engineer of the Year Award for the Process Industry in 1994, induction into the Control magazine Process Automation Hall of Fame in 2001, selection as one of InTech magazine’s 50 Most Influential Innovators in 2003, several ISA Raymond D. Molloy awards for bestselling books of the year, the ISA Life Achievement Award in 2010, the ISA Mentoring Excellence award in 2020, and the ISA Standards Achievement Award in 2023. He has a BS in engineering physics from Kansas University and an MS in control theory from Missouri University of Science and Technology, both with emphasis on industrial processes.

Books:

Advances in Reactor Measurement and Control
Good Tuning: A Pocket Guide, Fourth Edition
New Directions in Bioprocess Modeling and Control: Maximizing Process Analytical Technology Benefits, Second Edition
Essentials of Modern Measurements and Final Elements in the Process Industry: A Guide to Design, Configuration, Installation, and Maintenance
101 Tips for a Successful Automation Career
Advanced pH Measurement and Control: Digital Twin Synergy and Advances in Technology, Fourth Edition
The Funnier Side of Retirement for Engineers and People of the Technical Persuasion
The Life and Times of an Automation Professional - An Illustrated Guide
Advanced Temperature Measurement and Control, Second Edition
Models Unleashed: Virtual Plant and Model Predictive Control Applications
