ISA Interchange

Welcome to the official blog of the International Society of Automation (ISA).

This blog covers numerous topics on industrial automation such as operations & management, continuous & batch processing, connectivity, manufacturing & machine control, and Industry 4.0.

The material and information contained on this website is for general information purposes only. ISA blog posts may be authored by ISA staff and guest authors from the automation community. Views and opinions expressed by a guest author are solely their own, and do not necessarily represent those of ISA. Posts made by guest authors have been subject to peer review.


How Can We Improve Control of Dead Time Dominant Applications?

 

The following discussion is part of an occasional series, "Ask the Automation Pros," authored by Greg McMillan, industry consultant, author of numerous process control books, and 2010 ISA Life Achievement Award recipient. Program administrators collect submitted questions and solicit responses from automation professionals. Past Q&A videos are available on the ISA YouTube channel. View the playlist here. You can read all posts from this series here.

Looking for additional career guidance, or to offer support to those new to automation? Sign up for the ISA Mentor Program.

Erik Cornelsen’s Question 

In a continuous process, the operators have to manually adjust the actuators of an extrusion machine to get the product within spec. However, the width and thickness gauges are located hundreds of meters farther down the line for process-related reasons and cannot be moved. The dead time for width and thickness control is between four and eight minutes, depending on the line speed.

What would be a good control strategy to fully automate this dead time dominant process?

I saw in the literature some mentions of model predictive control (MPC) in dead time dominant applications; however, I did not find an MPC function block integrated into modern, state-of-the-art programmable logic controllers (PLCs). In this case, what would be the implementation steps of an MPC in a modern PLC (not a distributed control system (DCS))? In addition, how can we empirically get the required MPC parameters (e.g., manipulated variables, control variables, disturbance variables, prediction time, mathematical model definition, etc.) that are used by the algorithm?

Russ Rhinehart’s Answer 

Many of our simpler control algorithms (for instance, the Smith predictor and internal model control (IMC)) were designed for processes with delays. But even these simple controllers require an array to store the variables from now until the end of the delay, which might not be possible in your PLC. Hopefully, however, your PLC has delay, lead, lag, and proportional-integral-derivative (PID) functions that could be structured as either the Smith predictor or IMC. If it is possible, then because your delay varies with line speed, the delay in the controller needs to be scaled with speed. The Smith predictor and IMC are relatively simple solutions, but they require a good estimate of the total loop delay, including measurements.
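To make the structure concrete, here is a minimal Smith predictor sketch in Python, assuming a first-order-plus-dead-time internal model and a delay buffer whose length is recomputed from line speed each execution. The class, tuning values, and model form are illustrative assumptions, not vendor function blocks.

```python
# Minimal Smith predictor sketch (illustrative names and values only).
# Internal model: first order plus dead time; the delay buffer length is
# recomputed from line speed so the model dead time tracks the process.
from collections import deque

class SmithPredictor:
    def __init__(self, kp, ki, model_gain, model_tau_s, gauge_distance_m, dt_s):
        self.kp, self.ki = kp, ki              # PI tuning (derivative omitted)
        self.km, self.taum = model_gain, model_tau_s
        self.distance = gauge_distance_m       # actuator-to-gauge distance
        self.dt = dt_s                         # controller execution period
        self.integral = 0.0
        self.model_out = 0.0                   # undelayed model output
        self.delay_line = deque([0.0])         # delayed model output buffer

    def _scale_delay(self, line_speed_mps):
        # Dead time = transport distance / line speed, converted to samples
        samples = max(1, int(self.distance / max(line_speed_mps, 1e-6) / self.dt))
        while len(self.delay_line) < samples:
            self.delay_line.appendleft(self.delay_line[0])
        while len(self.delay_line) > samples:
            self.delay_line.popleft()

    def step(self, setpoint, measurement, line_speed_mps):
        self._scale_delay(line_speed_mps)
        delayed_model = self.delay_line[0]     # model output ~one dead time ago
        # Smith structure: add the undelayed-minus-delayed model difference
        feedback = measurement + (self.model_out - delayed_model)
        error = setpoint - feedback
        self.integral += self.ki * error * self.dt
        output = self.kp * error + self.integral
        # Advance the undelayed first-order model and shift the delay line
        self.model_out += (self.km * output - self.model_out) * self.dt / self.taum
        self.delay_line.append(self.model_out)
        self.delay_line.popleft()
        return output
```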

Another solution would be to implement human logic as the controller. Unfortunately, the classic name for this is “fuzzy logic,” which sounds like faulty thinking, but it is just a way to automate human intuitive action. Humans are effective in complex tasks such as driving a car and catching a ball. If continual operator adjustment is an effective way to control the process, then implementing that logic will be both relatively simple and very effective. 
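As a rough illustration of how such operator rules of thumb can be automated, the sketch below maps a width error into a small actuator adjustment with triangular memberships and a weighted-average defuzzification. The membership ranges, rule outputs, and units are invented for illustration; real rules would also saturate outside the shown error range.

```python
# Minimal fuzzy-logic sketch of an operator's adjustment rules (illustrative).

def tri(x, left, center, right):
    """Triangular membership: 1 at center, falling to 0 at left and right."""
    if x <= left or x >= right:
        return 0.0
    if x <= center:
        return (x - left) / (center - left)
    return (right - x) / (right - center)

def fuzzy_adjustment(width_error_mm):
    # Fuzzify the error into the operator's informal categories
    narrow = tri(width_error_mm, -2.0, -1.0, 0.0)    # "a bit narrow"
    on_spec = tri(width_error_mm, -1.0, 0.0, 1.0)    # "about right"
    wide = tri(width_error_mm, 0.0, 1.0, 2.0)        # "a bit wide"
    # Rules: narrow -> open the actuator, on spec -> hold, wide -> close it
    actions = {+0.5: narrow, 0.0: on_spec, -0.5: wide}   # % actuator change
    total = sum(actions.values())
    if total == 0.0:
        return 0.0
    # Defuzzify with a membership-weighted average of the rule outputs
    return sum(change * weight for change, weight in actions.items()) / total
```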

In batch processes, the control action to regulate the end-of-batch results cannot be implemented until after the end of the batch. The control action cannot affect what was made, but can affect the next batch. Extreme dead time processes are very similar, even if the process is continuous. You might investigate batch-to-batch recipe control methods. 
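A minimal run-to-run (batch-to-batch) correction might look like the sketch below, where a filtered fraction of the last measured error is folded into the recipe bias applied to the next batch or roll. The process gain and filter factor are illustrative assumptions.

```python
# Run-to-run (batch-to-batch) correction sketch (illustrative values only).

def next_recipe_bias(prev_bias, target, measured_result,
                     process_gain=1.0, ewma_lambda=0.3):
    """EWMA-style update: move a fraction of the apparent error, scaled by
    the process gain, into the recipe bias used for the next batch."""
    error = target - measured_result
    return prev_bias + ewma_lambda * error / process_gain
```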

If there are uncontrolled disturbances that continually affect the delayed results, then you might consider using algorithmic statistical process control (automated SPC) to temper either the measurement or the controller output. Do not make changes until the evidence gives 95% confidence that a change is warranted. See Rhinehart, R. R. “A Statistically Based Filter”, ISA Transactions, Vol. 41, No. 2, April 2002, pp 167-175, or Muthiah, N., and R. R. Rhinehart, “Evaluation of a Statistically-Based Controller Override on a Pilot-Scale Flow Loop”, ISA Transactions, Vol. 49, No. 2, pp 154-166, 2010.
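The sketch below conveys the "do not act until the evidence is statistically significant" idea with a simple window-based test. It is not the published algorithm from the cited papers, only an illustration of holding a value until there is roughly 95% confidence of a shift.

```python
# Illustrative hold-until-significant filter (not the cited ISA Transactions
# algorithm): the passed value only moves when the recent mean differs from
# it by more than ~1.96 standard errors (about 95% confidence).
import statistics

class HoldUntilSignificant:
    def __init__(self, window=20, z_critical=1.96):
        self.window = window
        self.z = z_critical
        self.samples = []
        self.held_value = None

    def update(self, new_measurement):
        self.samples.append(new_measurement)
        self.samples = self.samples[-self.window:]
        if self.held_value is None:
            self.held_value = new_measurement
        elif len(self.samples) >= self.window:
            mean = statistics.fmean(self.samples)
            sem = statistics.stdev(self.samples) / len(self.samples) ** 0.5
            if sem > 0.0 and abs(mean - self.held_value) / sem > self.z:
                self.held_value = mean          # evidence of a real shift
        return self.held_value
```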

Greg McMillan's Answer 

There are many misconceptions about the use of controllers for dead time dominant loops. Here are some prominent ones.

  1. Both dead time compensating PID controllers (e.g., Smith predictor) and model predictive control (MPC) for dead time dominant loops are more adversely affected by an overestimate than by an underestimate of the dead time. A model dead time just 20% greater than the actual dead time can cause rapid oscillations, whereas an underestimate of the model dead time just results in a more sluggish response. This is the opposite of a conventional PID controller, where underestimates and overestimates of the dead time used for tuning cause an oscillatory and a sluggish response, respectively. Tests discussed in the upcoming June Control Talk column show that for dead time compensation of lag dominant loops, a model dead time much larger than the actual dead time gives better control.
  2. The potential improvement in performance from dead time compensating PID is much greater for lag dominant loops. In fact, both Greg Shinskey and I have confirmed that the benefit of dead time compensation diminishes, becoming negligible in nearly pure dead time loops.
  3. The better dead time compensating controller (PIDθ) found by Shinskey and me is not the Smith predictor but a PID with a dead time block simply inserted in the external-reset feedback. This eliminates the need to set an open loop gain and time constant (and the consequential error) and the need to create a separate operator interface to restore the controlled variable as the process variable and its setpoint. The PIDθ requires more aggressive tuning, especially a smaller reset time that approaches nearly zero when the model dead time exactly equals the total loop dead time. The dead time block should be updated, particularly for changes in production rate. You can insert a small filter on the PID output to smooth out the high frequency oscillations that occur when the model dead time is slightly larger than the actual. However, you need a PID standard form with the positive feedback implementation of integral action creating external-reset feedback, which is available from only a few suppliers. A minimal sketch of this structure follows this list.

  4. Dead time dominance is not by itself a reason to go to MPC instead of PID, even though this statement is pervasive in the literature. The dead time compensating PID can do a better job than MPC for unmeasured process input (load) disturbances. The use of feedforward and ratio control can also do a better job than disturbance variables in an MPC unless the dynamics are complex (e.g., inverse response) or there are interactions and constraints.
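As referenced in item 3, below is a hedged sketch of a PID with the positive feedback (external-reset) implementation of integral action and a dead time block inserted in the reset path. It follows the structure described above; the class, names, and values are illustrative, not a supplier's algorithm.

```python
# PID with external-reset feedback and a dead time block in the reset path
# (illustrative sketch of the dead time compensating PID described above).
from collections import deque

class DeadTimePID:
    def __init__(self, kp, reset_time_s, dead_time_s, dt_s):
        self.kp = kp
        self.reset = reset_time_s
        self.dt = dt_s
        self.filter_state = 0.0                    # positive feedback filter
        self.reset_delay = deque([0.0] * max(1, int(dead_time_s / dt_s)))

    def set_dead_time(self, dead_time_s):
        # Update the model dead time, e.g., when production rate changes
        n = max(1, int(dead_time_s / self.dt))
        while len(self.reset_delay) < n:
            self.reset_delay.appendleft(self.reset_delay[0])
        while len(self.reset_delay) > n:
            self.reset_delay.popleft()

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        # Integral action via positive feedback: a first-order filter (time
        # constant = reset time) acting on the dead-time-delayed output
        delayed_output = self.reset_delay[0]
        self.filter_state += (delayed_output - self.filter_state) * self.dt / self.reset
        output = self.kp * error + self.filter_state
        # The controller output recirculates through the dead time block
        self.reset_delay.append(output)
        self.reset_delay.popleft()
        return output
```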

One supplier offers a small standalone DCS that has as its default a PID Standard Form with positive feedback implementation of integral action. You simply need to turn on the dynamic reset limit option to get external-reset feedback. The DCS also includes a small MPC eliminating the need for external software and interface. A Simulate-Pro package allows you to try out and develop the PID and its tuning for your application on your laptop.

Tests have shown that a PID with structure options, properly tuned for load response and with setpoint weights for the best setpoint response, offers better performance than fuzzy logic control (FLC). Several of the mentors implemented FLC decades ago when it was all the rage, but concluded that a PID controller could have done the job and would have been more understandable and maintainable. If the measurement response is not reproducible and a dynamic model for tuning a PID is not possible, which apparently occurs in the pulp and paper industry, FLC may be the solution.

The simplest, most reliable, and most maintainable solution in my book is a dead time compensating PID controller with feedforward, ratio, and override control, or an MPC with disturbance and constraint variables for more complex applications. For sheet line cross-sectional thickness and optical clarity, a series of MPCs was found to perform better than a series of Smith predictors in an extensive simulation study and pilot plant sheet line tests done decades ago.

If the dead time dominance were due to an analyzer cycle time, an enhanced PID per Annex E of ISA-TR5.9 could be used without the need to model the dead time or even to update tuning settings if the time between updates increases. The PID gain can approach the inverse of the open loop gain, and the reset time can approach twice the process dead time (excluding the larger dead time from the analyzer cycle). This enhanced PID is also useful for slow wireless updates on fast loops.
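A rough sketch of the enhanced PID idea follows: the reset contribution is only advanced when a fresh analyzer or wireless value arrives, using the elapsed time since the last update. This illustrates the concept; it is not the Annex E algorithm itself, and all names and signatures are assumptions.

```python
# Enhanced-PID-style sketch: integral (reset) action advances only on new
# measurement updates, over the full elapsed time since the last update.
import math

class EnhancedPID:
    def __init__(self, kp, reset_time_s):
        self.kp = kp
        self.reset = reset_time_s
        self.reset_contrib = 0.0          # positive feedback (reset) term
        self.last_output = 0.0
        self.last_update_time = None

    def step(self, setpoint, held_measurement, now_s, new_measurement):
        if new_measurement and self.last_update_time is not None:
            elapsed = now_s - self.last_update_time
            # Exponential reset update over the time since the last fresh
            # value, instead of once per controller execution
            self.reset_contrib += (self.last_output - self.reset_contrib) * (
                1.0 - math.exp(-elapsed / self.reset))
        output = self.kp * (setpoint - held_measurement) + self.reset_contrib
        if new_measurement:
            self.last_update_time = now_s
        self.last_output = output
        return output
```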

Not commonly discussed is the possible use of first principles and data-driven models to provide an inferential measurement of the controlled variable, without the dead time, for the PID or MPC. A combination of the two types of models is increasingly being considered. The estimated dead time is inserted into the estimator, whose delayed output is compared to the measurement of the actual controlled variable; a fraction of the error (e.g., 0.4) then provides a bias correction of the estimator output used as the PID or MPC inferential measurement.
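A minimal sketch of such an inferential measurement with a delayed bias correction is shown below; the model callable, sample-based delay, and 0.4 correction fraction are placeholders for illustration.

```python
# Inferential (soft sensor) measurement with delayed bias correction sketch.
from collections import deque

class InferentialMeasurement:
    def __init__(self, model, dead_time_samples, correction_fraction=0.4):
        self.model = model                   # callable: inputs -> CV estimate
        self.frac = correction_fraction
        self.bias = 0.0
        self.delayed = deque([0.0] * max(1, dead_time_samples))

    def update(self, model_inputs, measured_cv=None):
        estimate = self.model(model_inputs) + self.bias
        # Delay a copy of the estimate so it lines up in time with the gauge
        self.delayed.append(estimate)
        delayed_estimate = self.delayed.popleft()
        if measured_cv is not None:
            # A fraction of the synchronized error becomes a bias correction
            self.bias += self.frac * (measured_cv - delayed_estimate)
        return estimate          # undelayed value for the PID or MPC to use
```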

Mark Darby's Answer 

I concur with the responses above about the use of PID on dead time dominant processes and the need to get a good dead time estimate if dead time compensation is used. If the dead time varies, it is important to update it as a function of process operation (such as throughput or hold-up) to ensure good performance.
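For a transport delay like the one in the question, updating the dead time can be as simple as recomputing it from geometry and line speed each execution, as in this illustrative helper:

```python
# Illustrative dead time update from process operation: transport delay is
# gauge distance divided by line speed (or hold-up volume over throughput).
def transport_dead_time_s(gauge_distance_m, line_speed_m_per_s):
    return gauge_distance_m / max(line_speed_m_per_s, 1e-6)
```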

Some DCS vendors do provide MPC capability in the DCS, which could be an option, particularly for a multivariable problem with constraints. Note, though, that a DCS-based MPC will normally have fewer features and capabilities than MPCs that run in a separate, networked computer, so a DCS-based MPC would need to be evaluated. If you do not have a multivariable problem or constraints, I would recommend a PID-based scheme.

As Greg points out, a PID can usually outperform MPC for load disturbances due to its (typically) faster execution and due to the MPC's simple bias model update, which can be slow to adapt to unmeasured load disturbances. In practice, MPC applications often incorporate well-functioning PID regulatory controllers, so such a cascade approach may also be useful for the dead time dominant process. If dead time is modeled in the MPC and the dead time changes, the MPC controller would need to support an online update of the dead time and be able to execute sufficiently faster than the dead time.

A final comment on the traditional MPC model updating scheme: there are ways to improve the disturbance rejection capability through a different MPC updating scheme that can attribute the model error to an unmeasured disturbance that is assumed to enter at the process input. With such a model, the MPC prediction will continue to evolve in the future, causing the MPC to take more aggressive, anticipatory action. Not all MPC controllers support this capability natively, but there may be custom ways to implement this. This is a good topic to take up with MPC suppliers.
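The sketch below illustrates the input-disturbance updating idea for a simple first-order model: the prediction error is attributed to a disturbance entering at the process input, so the prediction keeps evolving instead of being shifted by a constant output bias. The model, observer gain, and names are assumptions for illustration only.

```python
# Input-disturbance model updating sketch for y[k+1] = a*y[k] + b*(u[k] + d).

def predict_with_input_disturbance(y_now, planned_moves, a, b, d_hat):
    """Open-loop prediction over the planned moves, including the current
    input-disturbance estimate d_hat (so the prediction continues to move)."""
    prediction, y = [], y_now
    for u in planned_moves:
        y = a * y + b * (u + d_hat)
        prediction.append(y)
    return prediction

def update_disturbance_estimate(d_hat, y_measured, y_one_step_prediction, b,
                                observer_gain=0.5):
    """Attribute the one-step prediction error to the input disturbance."""
    return d_hat + observer_gain * (y_measured - y_one_step_prediction) / b
```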

Greg McMillan
Greg McMillan has more than 50 years of experience in industrial process automation, with an emphasis on the synergy of dynamic modeling and process control. He retired as a Senior Fellow from Solutia and a senior principal software engineer from Emerson Process Systems and Solutions. He was also an adjunct professor in the Washington University Saint Louis Chemical Engineering department from 2001 to 2004. Greg is the author of numerous ISA books and columns on process control, and he has been the monthly Control Talk columnist for Control magazine since 2002. He is the leader of the monthly ISA “Ask the Automation Pros” Q&A posts that began as a series of Mentor Program Q&A posts in 2014. He started and guided the ISA Standards and Practices committee on ISA-TR5.9-2023, PID Algorithms and Performance Technical Report, and he wrote “Annex A - Valve Response and Control Loop Performance, Sources, Consequences, Fixes, and Specifications” in ISA-TR75.25.02-2000 (R2023), Control Valve Response Measurement from Step Inputs. Greg’s achievements include the ISA Kermit Fischer Environmental Award for pH control in 1991, appointment to ISA Fellow in 1991, the Control magazine Engineer of the Year Award for the Process Industry in 1994, induction into the Control magazine Process Automation Hall of Fame in 2001, selection as one of InTech magazine’s 50 Most Influential Innovators in 2003, several ISA Raymond D. Molloy awards for bestselling books of the year, the ISA Life Achievement Award in 2010, the ISA Mentoring Excellence award in 2020, and the ISA Standards Achievement Award in 2023. He has a BS in engineering physics from Kansas University and an MS in control theory from Missouri University of Science and Technology, both with emphasis on industrial processes.

Books:

Advances in Reactor Measurement and Control
Good Tuning: A Pocket Guide, Fourth Edition
New Directions in Bioprocess Modeling and Control: Maximizing Process Analytical Technology Benefits, Second Edition
Essentials of Modern Measurements and Final Elements in the Process Industry: A Guide to Design, Configuration, Installation, and Maintenance
101 Tips for a Successful Automation Career
Advanced pH Measurement and Control: Digital Twin Synergy and Advances in Technology, Fourth Edition
The Funnier Side of Retirement for Engineers and People of the Technical Persuasion
The Life and Times of an Automation Professional - An Illustrated Guide
Advanced Temperature Measurement and Control, Second Edition
Models Unleashed: Virtual Plant and Model Predictive Control Applications
