The following discussion is part of an occasional series, "Ask the Automation Pros," authored by Greg McMillan, industry consultant, author of numerous process control books, and 2010 ISA Life Achievement Award recipient. Program administrators will collect submitted questions and solicit responses from automation professionals. Past Q&A videos are available on the ISA YouTube channel. View the playlist here. You can read all posts from this series here.
Looking for additional career guidance, or to offer support to those new to automation? Sign up for the ISA Mentor Program.
Ed Farmer's Perspective and Questions:
A few years ago, I was drawn into a discussion with friends from the old days, as well as some younger, fresher folks, in which the “Oppenheimer moment” concept came up. Oppenheimer’s moment grew from the collective insight of people like Planck, Feynman, Schrödinger, and Einstein, who discovered and developed concepts, and their connected realities, that strayed far beyond the reality everyone thought they understood. Their “new” went more than “significantly” beyond what was understood as “reality.” Is AI about to do that?
In my experience, there are three levels of inquiry. First, there are questions whose answers a practitioner should know, or be able to discern from the usual resources (e.g., textbooks, handbooks, scientific dictionaries, conference papers, company standards).
Second, there are questions involving use of the available information and technology. The underlying question is how to visualize and create a solution for a particular situation. The real need, though, is a deeper understanding of the needs-assessment and solution/design process. This level is probably the most crucial for career development.
Third, there are the “limit of knowledge” inquiries. These benefit most from the help of true experts on the subject and can spawn a lot of thinking and great, fresh ideas. The answers are often much deeper and broader than just the current motivating situation. It is easy for such things to move past what we think of as engineering—and on into science. It is sometimes easy for a blog response to move toward a PhD thesis.
We always need to size up the nature and purpose of the question as well as satisfy the true needs of the inquirer. This can often involve a chain of questions, each increasing focus on the core issue, and usually involves a few exchanges—or at least some hypotheses. This situation is easy in person, but can be challenging with a web in the way.
Approaching a problem is facilitated with some "W’s". Remember the 5 W’s: Who, What, When, Where, and Why?
The “start point” is usually “What” do we need to accomplish? In engineering this often presents with a spectrum of options and opinions. Exactly what is needed must be understood with logic and clarity: a lot of “Why.” I have also respected a suggestion from an engineering economy book:
Why do this:
At all? Now? This way?
“Where” depends on the application and its purpose.
“When” often depends on changes stimulated by present or evolving needs and opportunities. We often confront issues like computing power, communication capability, update intervals, and data security.
“How” creeps in with concerns about data precision or validity, noise, timeliness, and quantity. An idea is one thing, but the implementation of it can be quite another.
Another helpful paradigm came from some Army tactics training a long, long time ago. It emphasizes perspective:
Where are we now? Where should we be?
Where are we going? Is that where we should be going? When will we get there?
Why are we going there and should we be?
What happens when we get to where we are going, or should be going?
What happens if we do not get to where we are (or should) be going?
It's easy to apply these concepts to project and management work we encounter. They quickly and simply help organize logical thinking. Sometimes, “When” involves change—the understandable conditions of a “present” evolve with natural and human-motivated intervention. The Industrial Revolution, for example, affected many things in ways that dramatically produced a different-than-anticipated future. Industrial automation produced huge economic and manufacturing changes in a spectrum of ways.
Is AI another revolution? Is it natural? Is it useful? Is it dangerous? Does it enhance or inhibit societal control? Does it assist or dominate societal organization and development? Is its development scientific or political? Does it serve people, or commit them to serving it? AI is a diverse field: can we categorize it conceptually or structurally into segments capable of different applications and outcomes? Defining its segments and understanding their commonalities and disparities might be a good place to start. Any thoughts?
Greg McMillan’s Responses:
Peter Morgan provided best practices for when control systems should be modernized. Here I seek to address the 5 W's with an imaginative view of the future.
Where are we and should we be? Why the disparity?
There have been incredible advances in PID control: the use of analyzers and wireless devices via the enhanced PID, more effective cascade and override control and dead-time compensation via external-reset feedback, and the two-degrees-of-freedom structure, all documented in the ISA Technical Report ISA-TR5.9-2023. There have also been advances in Model Predictive Control (MPC) in terms of being faster, adaptive, and easier to interface and deploy.
There have been advances in control valve technology and in the understanding of the need for less lost motion, better resolution, and a faster 86% response time, documented in ISA-TR75.25.02-2023 Annex A. Coriolis mass flow meters, resistance temperature detectors (RTDs), and radar level measurements offer incredible 5Rs (repeatability, resolution, rangeability, reliability, and response time).
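To make the external-reset idea above concrete, here is a minimal Python sketch (my illustration, not code from ISA-TR5.9-2023) of a PI controller whose integral action is realized as a positive-feedback filter on an external-reset signal. Feeding back the measured valve position, or a dead-time-delayed copy of the output as shown here, halts integration whenever the final element cannot follow, which is the essence of external-reset feedback.

```python
from collections import deque

class ExternalResetPI:
    """PI controller with integral action via a positive-feedback filter
    (external-reset feedback). Tuning values below are illustrative."""

    def __init__(self, kc, ti, dt, deadtime=0.0):
        self.kc, self.ti, self.dt = kc, ti, dt
        # Delay line models loop dead time in the reset path.
        n = max(1, int(round(deadtime / dt)))
        self.delay = deque([0.0] * n, maxlen=n)
        self.reset = 0.0  # filtered external-reset signal

    def update(self, setpoint, pv, external_reset=None):
        e = setpoint - pv
        # Use the measured valve position if available; otherwise a
        # dead-time-delayed copy of our own output.
        fb = self.delay[0] if external_reset is None else external_reset
        # First-order filter with time constant Ti replaces explicit
        # integration; when fb stops moving, integration stops too.
        self.reset += (fb - self.reset) * self.dt / self.ti
        out = self.kc * e + self.reset
        self.delay.append(out)
        return out
```

With `external_reset=None` and zero dead time this reduces to a conventional PI; passing the actual valve position instead provides inherent anti-windup and, with the delay, a simple form of dead-time compensation.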
However, we are seeing a decline in the innovation and motivation needed to provide the time, resources, and capital to make the most of these advances and to improve, or even just sustain, process control system performance. Here are some of the many possible reasons for this decline in using the advances in instrumentation, PID, and MPC.
There has been research showing that the PID algorithm is inherently the best for load disturbance rejection and setpoint response if the proper tuning, form, and structure are used. MPC has an incredibly extensive track record for dealing with interactions and constraints, and for facilitating optimization.
Control valve specifications still do not include any requirement that the valve actually respond. There are no entries for the needed 86% response time, lost motion, and resolution. The entries for capacity and leakage lead users to think bigger and tighter, which, along with an emphasis on lower costs, results in on-off rotary valves with horrible response being used for throttling service. Smart positioners cannot even detect the problem because their feedback is corrupted by backlash and shaft windup. The problem is discussed in the Control feature article, “Is Your Control Valve an Imposter?”
Lower-cost measurements are used due to a lack of awareness of the consequences of poor 5Rs.
There are excessive expectations of, and consequential distractions from, algorithms developed by university professors and students seeking a PhD, data-driven algorithms, the Industrial Internet of Things (IIoT), and particularly artificial intelligence (AI) hopes and dreams. I personally saw what was then the fourth-largest chemical company waste more than 50 million dollars and several decades of effort on expert systems and neural networks. The disconnect between universities and industry is discussed in the Control Talk column, “The Real Deal with Process Control Education.”
The invisibility of process control improvement has been a major reason for regimentation to copy jobs with an emphasis on costs and schedules. This problem is well addressed in the Control Talk column, “The Invisibility of Process Control.” The leading statement by Sigifredo Nino, a protégé of Greg Shinskey, gets at the heart of the matter: “The problem is perhaps best summarized by Karl Åström in his statement, “Control has become a hidden technology.” What he wanted to highlight is the fact that if all the controls work fine, nobody notices their existence, and that people have always been accustomed to defining control as devices and equipment rather than ideas.” What we need are more plant online metrics for both control loop and process performance, as discussed in the Control Talk column, “Top of the Plant Bottom Line.”
Another reason is the loss of expertise, mentors, and management advocates due to retirement and the dissolution of technology departments. I was fortunate to work for twenty years in a technology department of nearly 100 specialists in modeling and control, where I was given complete time and freedom to find, explore, and develop process control improvements. My group leader and section chairman were both experts in process control. In fact, for the entire 33 years I worked in industry, my managers were practitioners with extensive automation expertise.
The CEO was an engineer with plant experience. The technology department was dissolved, and its leaders forced into retirement, when a new CEO arrived with legal and business degrees and no plant experience. Today’s managers, particularly at higher levels, tend to have business degrees and backgrounds. Many of the leading experts in the former ISA Mentor Program who provided answers to the questions posed by mentees are in their late sixties or even seventies.
The same is true of the authors of the practical books: industry experts Ed Farmer and Bela Liptak, and university professors like Karl Åström, Thomas Edgar, William Luyben, Russ Rhinehart, James Riggs, Sigurd Skogestad, and Cecil Smith. Other authors, notably Peter Harriott, Harold Wade, and by far the greatest source, Greg Shinskey, have passed. I have personally seen decreasing participation in article writing by practitioners working in the process industry, due to constraints imposed by legal departments concerned about sharing knowledge. I have been fortunate to have the continued participation of Hector Henry Torres, seen in the Control Talk column “How to Deal with Dead Time Compensation Revelations.”
All of the books by the greatest mind in process control, Greg Shinskey, are out of print. Most of my books are as well. Magazine website redesigns have also made it difficult to find articles, blogs, and columns unless you know the title. Some that are older than four years may be gone.
Most articles and posts these days are by people trying to sell products or services that do not incorporate the knowledge of the advances in instrumentation, PID, and MPC. There is also a reliance on IIoT and AI without an understanding of past mistakes and the actual limitations imposed by dynamics, most notably dead time. The Control feature article, “At the IIoT Crossroads,” provides a perspective on the past and future IIoT role.
Where are we going and where should we be going?
We will be seeing more copy jobs and less innovation, with a focus on just updating current systems to prevent obsolescence. We will see an increasing information technology (IT) effort to provide more data, and AI expected to provide the expertise needed to deal with problems.
We should be developing more expertise through online courses, Q&As, and webinars like those I have done for ISA since 2010. Practitioners should be given the time and freedom to publish and to explore, develop, test, implement, and improve process control systems using dynamic models and the actual control system, as described in the InTech article, “Digital Twins for the Virtual Plant.” Online metrics should be deployed to identify control system and process performance for batches, shifts, and monthly real-time accounting.
Online inferential measurements should be used to improve the performance of sensors and analyzers, as detailed in the Control Talk column “Inferential Measurements are the Future.” AI should be grounded in ISA Technical Reports and the books by the authors mentioned, particularly all the publications by Greg Shinskey. AI should be viewed as an opportunity to open minds to better data analytics, diagnostics, instrumentation, procedure automation, and process control improvements by providing ideas and details that are then developed, thoroughly tested, and improved by practitioners using a digital twin and virtual plant. Hopefully, useful conversations with AI are possible in which you ask for the views of key experts like Greg Shinskey and request that load disturbances, the 5Rs, and all sources of lags and delays, especially in instrumentation, be considered.
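As one purely illustrative reading of an online inferential measurement, the sketch below uses recursive least squares (RLS), a standard soft-sensor workhorse rather than the column's specific method, to predict a slow lab or analyzer value from fast process measurements and update itself each time a new lab result arrives. All names and tuning values are my own assumptions.

```python
class RLSSoftSensor:
    """Linear soft sensor updated by recursive least squares.
    Illustrative sketch; forgetting factor and covariance seed are arbitrary."""

    def __init__(self, n_inputs, forgetting=0.99):
        self.w = [0.0] * n_inputs  # model weights
        # Covariance matrix seeded large so early data dominates.
        self.p = [[1e3 if i == j else 0.0 for j in range(n_inputs)]
                  for i in range(n_inputs)]
        self.lam = forgetting

    def predict(self, x):
        # Inferential estimate from fast measurements x.
        return sum(wi * xi for wi, xi in zip(self.w, x))

    def update(self, x, y):
        # Standard RLS: k = P x / (lam + x' P x); w += k * error; P = (P - k x'P)/lam
        n = len(x)
        px = [sum(self.p[i][j] * x[j] for j in range(n)) for i in range(n)]
        denom = self.lam + sum(x[i] * px[i] for i in range(n))
        k = [px[i] / denom for i in range(n)]
        err = y - self.predict(x)
        self.w = [self.w[i] + k[i] * err for i in range(n)]
        self.p = [[(self.p[i][j] - k[i] * px[j]) / self.lam
                   for j in range(n)] for i in range(n)]
```

Between lab results the sensor's `predict` runs at the fast measurement rate; each new lab value calls `update`, keeping the inferential measurement synchronized with the analyzer it augments.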
Books should have best practices summarized at the end of each section or chapter, like what I have done with my McGraw-Hill 2019 Process/Industrial Instruments and Controls Handbook Sixth Edition, ISA 2020 New Directions in Bioprocess Modeling and Control Second Edition, and ISA 2024 Advanced pH Measurement and Control Fourth Edition. We should learn how to sell ideas to our boss as depicted in the enlightening and entertaining Control Talk column, “Want to be a Hero?”
When will we get where we are going and should be going?
I think both the bad and good scenarios will accelerate and become significant in the next 5 to 10 years.
What happens when we get to where we are going and should be going?
If we do not correct our course, processes will rely more on safety systems to prevent dangerous operation. There will be more shutdowns, an increasing loss of process capacity and efficiency, and an inability to deal with changing process and equipment conditions. If we go where we should be going, the opposite will be true.
Michel Ruel’s Remarks:
Ed poses intriguing questions, and I won't provide answers, just a few comments. Artificial intelligence (AI) is a rapidly evolving field whose trajectory is challenging to predict. Instruments on the market have advanced significantly in recent years, offering powerful information technology tools. What lies ahead remains uncertain, but we should capitalize on emerging technologies; I'm optimistic. My concern is that new engineers and college graduates seem to be acquiring diminishing knowledge of process control.
I have specific examples in mind:
First example: Approximately 12 years ago, in a metallurgical process involving an arc furnace, we faced unidentified issues. The arc was lost several times a day, leading to reduced production and process restarts. Attempts to predict this loss two minutes in advance were unsuccessful until a Ph.D. student undertook a thesis project. After months of work using around 30 variables over 60-minute windows, he successfully implemented a neural network solution that predicted the arc loss 80% of the time.
While modern AI can achieve the same outcome in minutes, the challenge remains in demystifying the solution. I believe using similar technologies not only to predict but also to explain the arc loss would be the next step. This transition turns the solution into a "white box," enabling us to enhance process control and overall improvement.
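The data preparation behind such a predictor can be sketched as follows. This is my own illustration of the general technique of windowing multivariate time series for a classifier; the window length, horizon, and variable counts are illustrative, not those of the actual project.

```python
def make_windows(series, window, horizon):
    """Turn per-step variable readings into (features, predicted-step) pairs.

    series  -- list where series[t] is the list of variable readings at step t
    window  -- number of trailing steps flattened into one feature vector
    horizon -- how many steps ahead the event (e.g., arc loss) is predicted
    """
    examples = []
    for t in range(window, len(series) - horizon):
        # Flatten the trailing window of all variables into one vector.
        flat = [v for step in series[t - window:t] for v in step]
        # The label for this vector comes from step t + horizon,
        # e.g., whether the arc is lost there.
        examples.append((flat, t + horizon))
    return examples
```

Each feature vector would then be paired with a binary arc-lost/arc-held label at the predicted step and fed to whatever model is chosen, a neural network in the thesis project, or any modern classifier today.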
Second example: Currently, tools (software) exist in the market to detect oscillations, stiction, and abnormal situations, employing advanced statistics and mathematics. If AI can enhance these tools and integrate them into process control systems, we stand to gain substantial benefits through intelligent alarming.
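One classic oscillation-detection idea along these lines checks whether the control error crosses zero at regular intervals, in the spirit of Hägglund's well-known detection work. The sketch below is my own simplified illustration; the thresholds are arbitrary choices, not values from any commercial tool.

```python
def detect_oscillation(error, dt, max_cv=0.3):
    """Flag a likely sustained oscillation in a control-error record.

    error  -- list of control-error samples
    dt     -- sample interval
    max_cv -- max coefficient of variation of crossing periods (illustrative)
    """
    # Indices where the error changes sign.
    crossings = [i for i in range(1, len(error))
                 if error[i - 1] * error[i] < 0]
    # Time between successive zero crossings (half-periods).
    periods = [(b - a) * dt for a, b in zip(crossings, crossings[1:])]
    if len(periods) < 4:
        return False  # too few cycles to judge
    mean = sum(periods) / len(periods)
    var = sum((p - mean) ** 2 for p in periods) / len(periods)
    cv = (var ** 0.5) / mean  # regularity of the spacing
    return cv < max_cv  # regular, repeated crossings => oscillation
```

A real tool would add amplitude thresholds and stiction signatures, but even this regularity test shows how such detectors can feed intelligent alarming.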
Third example: Forty years ago, technicians were trained to calibrate instruments regularly. Today, a purchased flowmeter often remains uncalibrated yet maintains acceptable performance, an evident improvement.
Conclusion: We witness the emergence of numerous valuable tools. However, the challenge lies in finding proficient individuals to leverage them. Even with the right personnel, ensuring they have the time to harness these tools becomes crucial.
Michael A. Taube’s Remarks:
Mentoring is something I have been observing and pontificating about on social media (mostly LinkedIn) for nearly 15 years. More recently, I have published several LinkedIn articles on the topic; see below. Ironically, this is not a new issue: one of my “virtual mentors,” Admiral Hyman G. Rickover (aka the “Father of the Nuclear Navy”), fought the same battle for the last 20 years of his 60-year career in the US Navy!
I am sorry to say that—as the root cause for our observations is based on business/organizational practices—most anything we publish in our “usual and customary” (technical) venues will amount to “preaching to the choir.” Thus, I have been determined to find a different venue for making the point—and maybe getting some traction.
Enter the Mary Kay O’Connor Process Safety Center (MKOPSC) 2024 Safety & Risk Conference.
I am compiling a paper, “Improving Safety Performance: Compliance versus Competence,” to submit to this conference. This paper contains many, if not all, of the issues, concerns, and remedies that I have contemplated or suggested over the years. I expect it to be a fairly long paper. I am collaborating with a colleague from the High Reliability Group (HRG), Frank Gardner; the HRG is composed entirely of retired US Navy Nukes.
I have co-authored papers with Frank before. Frank spent the last five years of his 30-year career as the Command Master Chief over the Naval Nuclear Propulsion Program (NNPP), so he knows quite a bit about high reliability organizations. I also welcome anyone from ISA to collaborate or contribute to the paper by contacting me at S&D Consulting, Inc. Alternatively, submit one independent of me or Frank—more voices are better!
Some may ask, why process safety? The answer is simple: process safety management (PSM) is one area where management will pay some attention.