
Ask the Automation Pros: The Need and Use of Simulation in Process Control

The following discussion is part of an occasional series, "Ask the Automation Pros," authored by Greg McMillan, industry consultant, author of numerous process control books and 2010 ISA Life Achievement Award recipient. Program administrators will collect submitted questions and solicit responses from automation professionals. Past Q&A videos are available on the ISA YouTube channel; you can view the playlist here. You can read posts from this series here.

Looking for additional career guidance, or to offer support to those new to automation? Sign up for the ISA Mentor Program.

Russ Rhinehart’s Questions

How important is simulating a new control approach prior to implementing it? How does one include realistic issues in a simulation (noise, disturbances, constraints, nonlinearity, range of dynamics, etc.)?

Brian Hrankowsky’s Response

Not to derail, but I am curious:

I find that when I have enough information to model the items in Russ’s question, I tend to have enough information to design the right strategy without simulating. In many cases, the model is useful to convince others who don’t understand control theory very well that the strategy works. I’ll have created some ad hoc Excel calculations to help me crunch numbers at different operating points, but not usually much else. To frame this, my experience is with pharmaceutical manufacturing and associated utility systems. What are examples of processes where running models is more necessary?

Don’t get me wrong — we do tons of simulation using tiebacks and other low- to medium-fidelity techniques — it’d be nearly impossible to program and test sequencing and batch operations without it. But typically, if I have enough engineering information to make a model, I can apply all the books you guys have written to design the right solution 😊 and just worry about tuning it in during startup.
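
A minimal sketch of the kind of tieback Brian describes, for readers newer to the technique: the controller output is "tied back" to its PV through a gain and a first-order lag, which is often enough to exercise sequencing, interlocks and mode logic offline. All names and tuning values here are invented for illustration.

```python
# Low-fidelity tieback: PV follows the controller output (OP) through a
# gain and a first-order lag. Values are illustrative, not from a real loop.
KP, TAU = 2.0, 30.0        # tieback gain and lag time constant (s)
KC, TI = 0.5, 20.0         # PI tuning (illustrative)
DT = 1.0                   # execution interval (s)

pv, integral = 0.0, 0.0
sp = 50.0                  # setpoint step at t = 0

for t in range(300):
    error = sp - pv
    integral += error * DT / TI
    op = max(0.0, min(100.0, KC * (error + integral)))   # clamp to 0-100%
    pv += (KP * op - pv) * DT / TAU                      # first-order lag
    if t % 30 == 0:
        print(f"t={t:3d}s  SP={sp:5.1f}  PV={pv:6.2f}  OP={op:6.2f}")
```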

Reiterating: my experience is not terribly varied in terms of process types — so I am not willing to say simulation is not valuable for assessing different control strategies.

Hunter Vegas’s Response

My experience mirrors Brian’s, but likely even more so. I very rarely have to model much of anything; however, most of my work is in specialty chemicals (continuous and batch) and the reactions are usually well understood. There have certainly been occasions where some simulation was required — usually associated with particularly aggressive or exothermic reactions — but the other 99.9% of the time we just configure what makes sense and adjust during the run.

Michel Ruel’s Response

In my experience, the use of digital twins and simulation is indispensable for certain projects. For example, in an ore processing project in South America, the simulation revealed crucial insights regarding batch capacity and buffering. The minimal cost of the simulation was pivotal in readjusting the design.

Furthermore, in another case, simulation was instrumental in validating the control strategy for optimizing capacity in filling 8 bins with 2 shuttle conveyors from different sources. The cost of the simulation was negligible compared to the overall investment.

In conclusion, in the integration of automation and process control, simulation is undeniably advantageous. I strongly recommend the use of simulation, especially when the design is creative or deviates from traditional approaches; in both examples above, its cost was a small fraction of the investment.

Hector Torres’s Response

Dynamic process simulations play a crucial role in replicating real manufacturing unit operations. The fidelity of a simulation refers to its ability to accurately represent steady-state and dynamic conditions. Fidelity depends on its capability to respond to changes from the control system, limits and internal disturbances. Simulations can generally be classified as low, medium or high fidelity, depending on their level of accuracy.

The choice of simulation fidelity is determined by the intended purpose and cost considerations. Dynamic simulations serve various purposes, including control system checkout, operator training and testing new control strategies.

Simulations consist of individual models, interconnected and working together to represent the operation of the real process or manufacturing unit.

Low-fidelity simulations help to ensure control strategies respond in the intended direction to changes in valve positions.

A high-fidelity simulation involves strict material and energy balances, both in steady-state and in dynamic scenarios. It does not rely on empirical relationships. Due to the complexity and cost associated with high-fidelity simulations, they are not always feasible.

In my experience, a medium-fidelity simulation is often the practical choice. It can incorporate a mix of model fidelities. Its main objective is to verify process control sequences (debugging) and ensure that control strategies respond as intended; it also helps reframe design considerations for the control sequences. For operator training and testing of new control strategies, a medium-fidelity simulation offers a cost-effective solution, striking a good balance between accuracy and practicality.

Medium-fidelity simulation models are extremely helpful for saving time and headaches when starting up a new unit or at cutover after a migration. They help in understanding all the interactions, process- and equipment-wise, that take place, as well as what the operator will be seeing, which improves situation awareness. The more the code is exercised using simulation, the more code bugs are encountered and fixed, the better the operators’ training goes and the better the start-up experience. It is an awful experience to be in the control room, with people looking over your shoulder (and at their watches), waiting for you to fix bugs at start-up showtime.

Patrick Dixon’s Response

How important is simulating a new control approach prior to implementing it?

Very important.  

How does one include realistic issues in a simulation?

That is the hard part.

I have many examples of using simulation, but the most recent is the selection of an advanced control platform in my role as VP of Automation for Pulmac. When I joined Pulmac, the challenge was to find a way to use their paper mill fiber and furnish sensor technology for an automated solution. I saw that the application fit the development of virtual online analyzers (VOAs) to be used as controlled variables (CVs) of final product quality at the reel of a paper machine. The strength of a sheet of paper is measured in many ways (tensile, burst, compression, tear, etc.), but no online sensors exist to measure these properties. Without knowing the quality of the raw materials, which are wood fibers, you are unlikely to get usable VOAs of these properties. Once I have VOAs I can trust, I can plug them into a multivariable predictive control (MPC) package as CVs.

We had to build the simulation capabilities necessary to give a realistic test bed for evaluating these platforms. We obviously are not going to conduct this evaluation on a real paper machine; we need to test failure modes and bad conditions that would not be tolerable in a profitable business. Therefore, we developed the following:

  • A dataset generator that yields a dynamic and realistic dataset for training VOAs. This is an open-source, generic tool that can be configured to produce a simulated dataset reflecting relationships such as fiber quality to paper strength properties. This tool was the subject of the paper “Open Source Dataset Generator for Data Analytics” at the TappiCon 2023 conference in Atlanta.
  • A dynamic digital twin of a paper machine. Logic was configured with function blocks that implement configurable steady-state and dynamic models with noise and disturbances. This allowed us to connect various MPC packages through OPC UA in the same way they would connect to a real paper machine. We could induce model mismatch and see if the platform would identify the change and produce a matching model. We then evaluated the packages and chose the one that best fit our needs. (A minimal sketch of such a model block follows this list.)
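
A minimal sketch of such a configurable model block, assuming a generic first-order-plus-deadtime form with measurement noise, a slowly drifting disturbance and a mid-run gain change to induce model mismatch. The class and all parameter values are invented for illustration; this is not Pulmac’s actual tool.

```python
import random
from collections import deque

class FOPDTBlock:
    """Generic first-order-plus-deadtime model 'function block' with noise
    and a slowly drifting disturbance. All parameters are illustrative."""

    def __init__(self, gain, tau, deadtime_steps, dt=1.0, noise_sd=0.2):
        self.gain, self.tau, self.dt = gain, tau, dt
        self.noise_sd = noise_sd
        self.delay = deque([0.0] * deadtime_steps, maxlen=max(deadtime_steps, 1))
        self.pv = 0.0
        self.disturbance = 0.0

    def step(self, op):
        delayed_op = self.delay[0]          # OP from deadtime_steps scans ago
        self.delay.append(op)
        self.disturbance += random.gauss(0.0, 0.02)   # slow random walk
        self.pv += (self.gain * delayed_op + self.disturbance - self.pv) \
            * self.dt / self.tau
        return self.pv + random.gauss(0.0, self.noise_sd)  # noisy measurement

block = FOPDTBlock(gain=1.5, tau=60.0, deadtime_steps=10)
for t in range(600):
    if t == 300:
        block.gain = 0.9    # induce model mismatch mid-run, as described above
    measured = block.step(op=40.0)
```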

Building simulations that are useful means not only making them behave like the process but also understanding abnormal behavior. Real processes have noise, nonlinearity, disturbances, failures and dynamics (integrators, deadtime, lags). This requires subject matter expertise in both the process and the simulation environment.

I have used simulation in many migrations and advanced control applications. If you don’t test against simulation, you can be in for a rude awakening at commissioning. It is far less costly, in terms of economics and mental health, to simulate first.

Mark Darby’s Response

I think it is helpful to consider the possible challenges behind the new control approach.

Is it a question of testing a different algorithm or feature in a control system that has not been used before?

Here it may be sufficient to simulate the controller by tying controller outputs to PVs, either statically or with lags, and to run the test either in the control system itself or offline (in an emulator).

Is it a question of determining the effectiveness of a different control strategy on a given process, i.e., one for which the control engineer does not have experience? Here the question could be more about the process itself than the control system.

A steady-state process simulation could be used to determine sensitivities, i.e., the gains from independent variables (manipulated and/or disturbance) to the desired controlled variables, thereby determining whether the control pairing is sensible. The impact of static nonlinearity associated with different operating points can also be assessed. Simpler process or engineering models might be used as well.
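
As a concrete illustration of using steady-state gains to judge pairing, the gains can be collected into a matrix and screened with a relative gain array (RGA). The 2x2 gain matrix below is invented for illustration.

```python
import numpy as np

# Steady-state gain matrix K[i, j] = d(CV_i)/d(MV_j), e.g., obtained by
# perturbing each MV in a steady-state simulation. Values are illustrative.
K = np.array([[2.0, 0.5],
              [0.8, 1.5]])

# Relative gain array: elementwise product of K and the transpose of K^-1.
rga = K * np.linalg.inv(K).T
print(rga)
# Diagonal elements near 1 suggest the diagonal pairing (CV1-MV1, CV2-MV2)
# is sensible; elements far from 1, or negative, flag problematic pairings.
```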

Depending on the complexity of the controls or the process, especially if there are safety challenges, it can make sense to use a dynamic process simulation and connect to the control system. Nonlinearities, both static and dynamic, can be assessed. I’m thinking of the commercial simulators that provide steady-state and dynamic capabilities.

It could be a question of proving a new controller technology before deploying. I’ve been involved with projects where MPC is new to a plant or company and an existing dynamic simulation is available (common with new plants). Here it made sense to tie the MPC to the simulator and develop different scenarios to show the benefits of MPC. Note: such a set-up can be used for operator training. For these situations, it is useful to include sensor noise to make the simulation more realistic.

With more advanced controllers, especially multivariable controllers like MPC, simulation is a necessary step to verify acceptable closed-loop behavior and constraint priorities and to develop initial tuning. The impact of model mismatch can also be studied. Simulation capabilities are often built into the product.

Matthew Howard’s Response

Not being as seasoned as most of the folks reading this, I have a more basic understanding of simulation.

I have found simulation particularly necessary when using unfamiliar system features. Reading up on how the function blocks change modes, initialize and compute is great, but it often requires me to set up some parallel “dummy” or “shadow” logic to confirm how the logic will function with live values. This is equally true for control scheme modifications and for new systems. Successive applications of similar logic usually do not require simulation in this way; similar to others, experience leads us to “simulate” in our heads how the process will respond and react.

When possible, I have also enjoyed installing the logic and operator graphics for a new process before I/O installation. This is only possible if the process is standalone and new. By forcing and faking analog and digital signals, the basic logic configuration can be checked out, including control action, interlocking and operator interface design. We still have to check it out during commissioning, but this first software FAT of sorts reduces the number of errors and the rework to be completed under the gun. This may seem like a no-brainer to those in consulting, but as mill engineers, we are often scrambling right up to the deadline due to break-in work and distractions.

I guess a third comment about basic simulation concerns fluid flow and process dynamics. On my projects, the pumps, valves, instruments and piping are often designed by a project engineer or team. I know one engineer who uses fairly advanced piping simulations to do her designs. She uses pump curves, geometry and pipe properties to calculate all the pressure drops, head requirements, etc. I have never been on a project with her in which the controls were hard to get right. Simple systems like flow loops, heat exchangers, desuperheaters and pressure controllers are all designed by someone. I would consider the design of these systems to be a form of simulation.
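
For illustration, the core of such a hydraulic calculation can be as simple as intersecting a pump curve with a system curve; all coefficients below are invented.

```python
import math

# Pump curve H = A - B*Q^2 meets the system curve H_static + K*Q^2 at the
# operating point. Coefficients are invented for illustration.
A_PUMP, B_PUMP = 60.0, 0.002   # pump curve (m of head, Q in m^3/h)
H_STATIC = 15.0                 # elevation plus pressure head (m)
K_FRICTION = 0.004              # lumped pipe friction coefficient

# A - B*Q^2 = H_static + K*Q^2  =>  Q = sqrt((A - H_static) / (B + K))
q_op = math.sqrt((A_PUMP - H_STATIC) / (B_PUMP + K_FRICTION))
head = A_PUMP - B_PUMP * q_op**2
print(f"operating point: {q_op:.1f} m^3/h at {head:.1f} m of head")
```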

Ed Farmer’s Response

Simulation usually helps with understanding something, testing something, or both. Accurate and realistic simulation consumes time and resources, so it’s always a good idea to carefully understand what you hope or need to learn from it. Writing a description of the objective(s) of the simulation is a good place to start, and often the precursor to lots of long and sometimes intense lunches with interesting people in your organization. The critical question is, “What is it, exactly, that we need to know?” The obvious second question is, “OMG! Can we really do that?” followed by, “…and at the end we will know it’s right because…?”

Is it less expensive and/or more illuminating than working directly with the involved process? Developing a useful simulation can be pretty simple or unfathomably complex. There may be a PhD thesis hiding in all the stuff that will be involved. On the other hand, is monitoring and bumping the real process adequate for the requirements motivating the simulation? In my first major refinery project in process control, one of the artifacts discovered in storage was an analog computer for part of the process made entirely from pneumatic devices. I’ve always wondered how well it worked.

Fundamentally, our science in process simulation and control is pretty good. If we understand what the process does and have some performance data, we can characterize it with mathematics. Understanding is crucial and may involve adding some process data measurements to help us understand subtle nuances in what happens versus what we expect. When we understand the functionality and mathematics, it’s not difficult to simulate stimuli and watch the responses evolve on our computer monitor. Sometimes there’s a lot to learn, but that can be a good thing.

Creating realistic conditions for testing presumes we know enough to produce realistic perturbations. I used to save data files from the operation of processes I worked on to facilitate creating things like realistic noise patterns. There are books about various kinds of noise and how to model them. Modeling common random noise isn’t difficult, but sometimes there are process inter-relationships that result in condition-specific periodic noise. All this suggests that the object of the specific work assignment might not be as “typical” as the broader science foresees.
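
A small sketch of that kind of noise synthesis: common Gaussian measurement noise overlaid with a periodic component that only appears above a load threshold. The amplitudes, period and threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
t = np.arange(0.0, 600.0, 1.0)           # 10 minutes at 1 s resolution

white = rng.normal(0.0, 0.3, t.size)     # common random measurement noise

# Condition-specific periodic noise, e.g., a pump stroke or control
# interaction that only appears above a load threshold (values invented).
load = np.linspace(40.0, 90.0, t.size)
periodic = np.where(load > 70.0, 0.8 * np.sin(2 * np.pi * t / 15.0), 0.0)

noisy_pv = 50.0 + white + periodic       # overlay on a nominal PV of 50
```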

In modeling, an important step involves comparing model performance to process performance over the pertinent range of conditions. If it seems likely that might be especially difficult or expensive, it might be useful to take a deep breath and rethink exactly what you need to meet the motivating requirements.

Russ Rhinehart’s Follow-Up

I greatly appreciate the diverse responses.

Of course, I believe that simulation can be very important in testing and evaluating novel control approaches, which is what my academic research career has been about. (It is a one-upmanship game of publishing a control approach that is better than what the other guy published. And what I find embarrassingly lacking in academic publications are credible simulation demonstrations.)

I am pleased to learn that industry folks also see the value of testing control structures before commissioning. When I was in industry (13 years starting in 1969), control system design was intuitive, and I still believe that folks who are experienced in both the process and in control can be effective in designing without simulating. But when selling the cost/benefit of an advanced application to management, or previewing and fine-tuning a holistic design prior to implementation, I did think that simulation would be important.

How is this simulation learned? Short courses, vendor courses, informally?

The chemical engineering undergraduate curriculum uses steady-state simulators in the process design course(s), and students do intuitively optimize the design to maximize an economic profitability metric. Students are shown how to develop steady-state first-principles models of individual unit operations (heat exchangers, reactors, fluid flow systems, etc.) in their sophomore- and junior-level courses. But rarely do they use dynamic models, and even less often do they study the impact of disturbances and noise on process operation. In the process control course, they might use a Laplace transform representation of a linear, constant-gain, third-order fictitious process unit to represent the process. In just a few hours of instruction (about 32 per semester per course, with 50-minute class periods and time out for tests and organizational requirements), teachers can only take complete novices so far!

I would think that, for relevant process control system design, simulation ability should include the following (a small sketch combining a few of these items appears after the list):

  • Dynamic simulation, including discrete events such as mode switching in units, piping diversions, valve by-passing for maintenance, and MAN/AUTO/REMOTE/FF in the controller.
  • Numerical methods: to solve ODEs, for root-finding, and for optimization.
  • Operating a simulation software environment, including customizing inputs and output data processing.
  • Calibrating model coefficient values to match the process (ambient losses, friction factor in complicated piping systems, reactivity, etc.).
  • Choosing fast sub-processes to be at pseudo steady state.
  • Including noise (on measurements), drifting influences (fuel BTU content, ambient temperature, rain, fouling, raw material composition, etc.) and calibration error (I/P, valve stem, sensor/measurement, drift, failure, etc.).
  • Using long simulation times with the stochastic influences to evaluate the probability and degree of specification and constraint violation.
  • Start-up, shut-down and operation throughout the entire production and product range.
  • Batch and continuous operation.
  • Using transition time to move to a desired operating point, with penalties for violations, to evaluate process economics; and using control system devices to evaluate the cost and maintenance expenses of design choices.
  • Investigating controller action (too aggressive or too sluggish) over the entire operating range.
  • Understanding the economic assessment of the process in the business context, and the desired results when throughput is constrained and unconstrained.
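
A small sketch combining a few of the items above (an ODE solved by Euler integration, measurement noise, a drifting disturbance and a long run to count constraint violations); the tank and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
DT = 1.0                                  # s
n = int(24 * 3600 / DT)                   # one simulated day

area = 2.0                                # tank cross-section (m^2)
level = 1.0                               # m
inflow, outflow = 0.010, 0.010            # m^3/s; inflow drifts, outflow fixed
violations = 0

for _ in range(n):
    inflow += rng.normal(0.0, 1e-6)                  # slowly drifting influence
    level = max(level + (inflow - outflow) / area * DT, 0.0)  # dh/dt = (Fin - Fout)/A
    measured = level + rng.normal(0.0, 0.005)        # measurement noise
    if measured > 1.5:                               # high-level constraint
        violations += 1

print(f"fraction of samples above the high limit: {violations / n:.4f}")
```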

Because including noise and continually changing disturbances has been important in my research on nonlinear controllers, I included methods in my new ISA book, Nonlinear Model-Based Control Using First-Principles Models.

I think that such a skill set is far from being teachable in an undergraduate degree program.

Again, how is it learned? Does this indicate an opportunity for an ISA training course?

Brian Hrankowsky’s Follow-Up

So to be clear: I am NOT advocating anyone stop using simulation. If it adds value, do it! I.e., this is one time where I am not arguing with anyone. 😊

All the uses folks have mentioned are great. We use simulation for initial training and offline testing extensively. A key practice is to demo the batch processes to the process engineers, operations and manufacturing scientists before investing in our regulatory-required testing efforts with the actual application and graphics. Our startups would be a miserable failure if we didn’t show up with software that could at least run offline with some basic tieback simulations. (Nothing like finding competing interlocks in the field….)

Were you on the ISA panel discussion (2008?) where the topic was what schools should teach? There were three university professors and three hiring managers discussing the two different views. I can’t remember if I said it out loud during the session or to one of the professors afterward: it would be nice if the education on each unit op included the considerations you mentioned above and the typical control strategies (see the process control and optimization handbook). Maybe the Aspen models that are created for homework should include parts and pieces of sequencing and control strategy implementations?

I am finding that the variation in controls education has grown significantly since I graduated in 2001. Setting aside that my degree is in controls, we have some engineering graduates coming in WITH NO CONTROLS COURSE OR INSTRUCTION AT ALL, others who don’t have practical application under their belt but have a good understanding of topics like gain schedules, gain arrays, feedforward, etc., and every level in between.

Michael A. Taube’s Response

The following paper used a high-fidelity model to explore if/how electrification of distillation column reboilers would affect tower operations and what the controls design would have to consider to handle sudden reboiler duty load changes. It’s in this type of case study that high-fidelity dynamic modeling comes in very handy: one couldn’t (and wouldn’t want to) try this with real equipment!

“Implications for control systems in highly volatile energy markets: Using a high purity distillation electrification case study,” Isuru A. Udugama, Michael A. Taube, Rob Kirkpatrick, Christoph Bayer, Brent R. Young, Chemical Engineering Research and Design, 203 (2024), 431–440.

Erik Cornelson’s Response

That’s very interesting. Here is the link to the paper on ScienceDirect: “Implications for control systems in highly volatile energy markets: Using a high purity distillation electrification case study.”

Electrification is a big trend now, and I know some industrial processes in large plants are being electrified, such as calcination or drying in large furnaces.

Michael, I haven't read the paper yet, but what are your key insights related to the electrification and modeling of these processes? 

Michael A. Taube’s Follow-Up

Thanks for adding the link, Erik; very helpful!

Yes, electrification is a trending thing. Ironically, one of my specialty chemical clients in North America just recently replaced their electric reboilers with a hot oil (fired heater) system! But that was mostly due to poor performance from expansion and aging equipment.

As the bulk of the authors are from New Zealand, where electrification is gaining lots of attention, the point of the paper was to assess the feasibility of using the inherent thermodynamic properties of the process as a “process battery” to help absorb some of the load change when electricity rates move in the “wrong direction.” The control-related portion focused on how the controls might be designed to help facilitate a smooth transition from one state to another. A young colleague (whom I referred to as my “Kiwi Apprentice”), Dr. Isuru Udugama, did all the heavy lifting in creating the model and configuring the control function blocks available in the simulation package (HYSYS, I think).

A couple of items I addressed were that not all processes may benefit from the “process battery” concept, due to the inherent thermodynamic properties of the process fluid. We had a fair amount of water in the process which, when the tower pressure is quickly reduced, generates some steam, thus somewhat mitigating the loss of available reboiler duty. The other aspect was in regard to mechanical design considerations versus economics: the higher “normal” operating pressure could shift capital costs significantly in the wrong direction. There was also the issue of pressure cycles on mechanical integrity (much like the airframe industry discovered with the first pressurized aircraft and more recently — the 1970s/80s? — with jets operating very short routes with lots of pressure cycles). All of the above, of course, was tagged as subjects for future papers!

The results of the study indicated that it is potentially feasible to design facilities using electric heat sources (rather than burning hydrocarbons), but with the understanding of the caveats above.

Greg McMillan’s Response

There are so many different application details, including process conditions, goals, instrumentation capabilities and installations, that it is extremely difficult to generalize what is best for another application, even if the applications are in the same plant and have the same types of process variables. Publications focus on the story of success and not on application details, due to proprietary information limitations, a lack of understanding of fundamentals and a marketing approach.

I always wanted to know the “why” that gets you to the “how.” The heart of this approach is the scientific method: seeking causes for effects, realizing the importance of experimentation, opening my mind to the possibility that a solution could be wrong, and recognizing that we often learn the most from what is wrong.

I was fortunate throughout my 50-year career in process control to be encouraged to pursue knowledge discovery by sharing mistakes and seeking an understanding of first principle relationships and instrumentation dynamics. I started out as a lead E&I engineer and discovered that the prevalent rule used by the contractor to not use positioners on fast loops was wrong and dangerous. I had to install positioners during checkout of what would be the world’s largest acrylonitrile (AN) plant. Many of my 30 books flag this and the even worse problem of substituting a booster for a positioner on a diaphragm actuator that resulted from an extension of the same rule.

With a goal of knowledge discovery and experimentation, I built a dynamic simulation on my own time for compressor surge control as the lead E&I engineer on the AN project. After startup, I was invited to move to Engineering Technology (ET), working with the world’s leading simulation experts. The steady-state simulator FLOWTRAN, developed by these experts, was given to Aspen Research, a newly created US government agency, to make this modeling capability widely available. The steady-state simulation software and physical property package ended up in AspenTech, a company newly created by members of Aspen Research.

Throughout my career, I found and developed many equations for mass balances, energy balances, charge balances and momentum balances. Most were ordinary differential equations, but some were partial differential equations, most notably for sheet lines. I also developed equations for mixing lags, injection delays, and final control element and measurement 5Rs (resolution, repeatability, rangeability, reliability and response time), as well as an equation for lost motion, more commonly referred to as backlash. I am diligent about including disturbances and dealing with the near-zero inventory seen in batch operations and in the startup and shutdown of continuous operations. I am particularly proud of the charge balance for pH, extended for activity coefficients, which provides a universal, general, simple and reliable interval-halving solution for modeling pH with no limitation as to the complexity of components. I think that my becoming a worldwide expert in pH is largely attributable to the knowledge I gained from doing dynamic simulations of pH systems and the consequent plant process control improvements.
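
To make the interval-halving idea concrete (though without Greg’s activity-coefficient extension), here is a minimal sketch for the simplest strong acid/strong base case; the concentrations are illustrative.

```python
KW = 1.0e-14   # water dissociation constant at 25 degC

def charge_balance(ph, na, cl):
    """Net charge at a given pH: [H+] + [Na+] - [OH-] - [Cl-]."""
    h = 10.0 ** (-ph)
    return h + na - KW / h - cl

def solve_ph(na, cl, lo=0.0, hi=14.0, tol=1e-6):
    # The balance decreases monotonically with pH, so interval halving
    # (bisection) converges reliably no matter how many ionic species
    # are added to the balance.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if charge_balance(mid, na, cl) > 0.0:
            lo = mid     # net positive charge: the root is at higher pH
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(solve_ph(na=0.000, cl=0.001))   # 0.001 M excess strong acid -> ~3.0
print(solve_ph(na=0.001, cl=0.001))   # balanced -> ~7.0
```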

I am proud of the Michaelis-Menten and Convenient Cardinal equations for bioreactions that I found and incorporated into a dynamic model for the production of modern biologics. The equations, unlike what is typically cited in the literature, are easy to set up and adjust to match test results. Batches that normally take two weeks can be run within hours using kinetic speedup factors. Extensive experimentation and testing of bioreactor operating conditions and control system design using a digital twin can be done in the research and pilot plant phases. However, for some reason, there is a lack of recognition by scientists and process engineers of this opportunity.
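
As a generic illustration of the kinetic speedup idea (a Monod-form rate expression with invented parameters, not the equations from Greg’s book):

```python
# Batch bioreactor sketch: biomass x grows on substrate s with saturation
# kinetics, and a speedup factor compresses a long batch into a short run.
MU_MAX, KS = 0.05, 0.5       # 1/h and g/L (invented)
YIELD_XS = 0.5               # g biomass per g substrate
SPEEDUP = 50.0               # kinetic speedup factor
DT = 0.01                    # h of sped-up time per step

x, s, t = 0.1, 30.0, 0.0     # biomass, substrate (g/L), time (h)
while s > 0.05:
    mu = MU_MAX * s / (KS + s) * SPEEDUP   # sped-up specific growth rate
    dx = mu * x * DT
    x += dx
    s = max(s - dx / YIELD_XS, 0.0)
    t += DT

print(f"substrate exhausted after {t:.1f} h of sped-up simulation "
      f"(~{t * SPEEDUP:.0f} h of real batch time)")
```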

What I love about first-principles dynamic relationships is learning causes and effects and having the ability to experiment and speed up the tests so that the main limitation is your imagination. I am hoping that practitioners partnering with simulation experts can invest in developing some case studies that open management and project team members to the value of innovation and process control improvement, so that we are not simply doing copy jobs when migrating and instead use much more of the advanced capabilities of modern control systems.

Dynamic simulation can be used to provide better data by conducting an extensive design of experiments (DOE) for training data analytics and neural networks, and for validating whether correlations developed from plant data are actual cause-and-effect relationships. Most plant data is unfortunately not produced by a DOE but is simply a gathering of closed-loop control data in which variability is transferred from controlled variables to manipulated variables. Dynamic simulation can offer a huge improvement and be a possible source of data for developing artificial intelligence.
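
A tiny sketch of that workflow: a full-factorial DOE driven through a stand-in process model to generate independent, cause-and-effect training data. The model and factor ranges are invented; in practice each row would be a dynamic simulation run.

```python
from itertools import product

def process_model(feed_rate, temperature, catalyst):
    """Stand-in steady-state model; replace with a dynamic simulation run."""
    return 0.04 * feed_rate + 0.8 * temperature + 12.0 * catalyst

levels = {
    "feed_rate": [80.0, 100.0, 120.0],    # illustrative factor levels
    "temperature": [150.0, 160.0],
    "catalyst": [0.5, 1.0],
}

# Full-factorial design: every combination of factor levels is exercised,
# unlike closed-loop historian data where inputs move together.
dataset = [(feed, temp, cat, process_model(feed, temp, cat))
           for feed, temp, cat in product(*levels.values())]

for row in dataset[:3]:
    print(row)
```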

The following site has weekly “Ask Greg” posts on how dynamic simulation can provide the best instrumentation and control systems: https://prosera.com/AskGregMcMillan

The following recent ISA books particularly detail the use and value of simulations, including the breakthroughs I mentioned for the charge balance and bioreaction kinetics; see the book list at the end of this post (use promo code ISAGM10 for a 10% discount on Greg’s ISA books).

Here are Control feature articles that show the use and value of simulation for innovation and process control improvement:

In the Control Talk column “Dynamic World of Modeling and Control,” Julie Smith, global automation and process control technology leader at DuPont, describes how her group uses dynamic models to develop process understanding and process control improvement (PCI), including plantwide control strategies to increase process performance and provide training to increase operator performance.

In the Control Talk column “Simulation Benefits in Mineral Processing,” Michael Schaffer, president of Portage Technologies, details how simulation enables greater and deeper knowledge of the process and the use of smart controls and optimization to turn what were controlled variables into manipulated variables.

Greg McMillan
Greg McMillan has more than 50 years of experience in industrial process automation, with an emphasis on the synergy of dynamic modeling and process control. He retired as a Senior Fellow from Solutia and a senior principal software engineer from Emerson Process Systems and Solutions. He was also an adjunct professor in the Washington University Saint Louis Chemical Engineering department from 2001 to 2004. Greg is the author of numerous ISA books and columns on process control, and he has been the monthly Control Talk columnist for Control magazine since 2002. He is the leader of the monthly ISA “Ask the Automation Pros” Q&A posts that began as a series of Mentor Program Q&A posts in 2014. He started and guided the ISA Standards and Practices committee on ISA-TR5.9-2023, PID Algorithms and Performance Technical Report, and he wrote “Annex A - Valve Response and Control Loop Performance, Sources, Consequences, Fixes, and Specifications” in ISA-TR75.25.02-2000 (R2023), Control Valve Response Measurement from Step Inputs. Greg’s achievements include the ISA Kermit Fischer Environmental Award for pH control in 1991, appointment to ISA Fellow in 1991, the Control magazine Engineer of the Year Award for the Process Industry in 1994, induction into the Control magazine Process Automation Hall of Fame in 2001, selection as one of InTech magazine’s 50 Most Influential Innovators in 2003, several ISA Raymond D. Molloy awards for bestselling books of the year, the ISA Life Achievement Award in 2010, the ISA Mentoring Excellence award in 2020, and the ISA Standards Achievement Award in 2023. He has a BS in engineering physics from Kansas University and an MS in control theory from Missouri University of Science and Technology, both with emphasis on industrial processes.

Books:

Advances in Reactor Measurement and Control
Good Tuning: A Pocket Guide, Fourth Edition
New Directions in Bioprocess Modeling and Control: Maximizing Process Analytical Technology Benefits, Second Edition
Essentials of Modern Measurements and Final Elements in the Process Industry: A Guide to Design, Configuration, Installation, and Maintenance
101 Tips for a Successful Automation Career
Advanced pH Measurement and Control: Digital Twin Synergy and Advances in Technology, Fourth Edition
The Funnier Side of Retirement for Engineers and People of the Technical Persuasion
The Life and Times of an Automation Professional - An Illustrated Guide
Advanced Temperature Measurement and Control, Second Edition
Models Unleashed: Virtual Plant and Model Predictive Control Applications
