ISA Interchange

Welcome to the official blog of the International Society of Automation (ISA).

This blog covers numerous topics on industrial automation such as operations & management, continuous & batch processing, connectivity, manufacturing & machine control, and Industry 4.0.

The material and information contained on this website is for general information purposes only. ISA blog posts may be authored by ISA staff and guest authors from the automation community. Views and opinions expressed by a guest author are solely their own, and do not necessarily represent those of ISA. Posts made by guest authors have been subject to peer review.


Ask the Automation Pros: How Do You Improve Use of Analyzers for Continuous Control?

The following discussion is part of an occasional series, "Ask the Automation Pros," authored by Greg McMillan, industry consultant, author of numerous process control books, and 2010 ISA Life Achievement Award recipient. Program administrators will collect submitted questions and solicit responses from automation professionals. Past Q&A videos are available on the ISA YouTube channel. View the playlist here. You can read all posts from this series here.

Looking for additional career guidance, or to offer support to those new to automation? Sign up for the ISA Mentor Program.

Erik Cornelsen’s Question:

When implementing loops using at-line and lab analyzers in a continuous process application, how can this new time-stamped process information help to improve process control? Note that the test duration varies, and it takes between 15 and 25 minutes to get the new sample result.

Russ Rhinehart’s Responses:

If the analyzer delay is on the order of (or longer than) the process response time to the controller action, and disturbance events are frequent, then the controller will not be able to control that measurement deviation.

For example, if the process open loop settling time (the time for the process to respond to the controller output: the FOPDT dead time plus three time constants) is 15 to 25 minutes, then the process will take that long to respond to control action. If a changing disturbance causes a deviation, then by the time the controller learns of the deviation, it will be too late to fix it.
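
In symbols (a minimal restatement of that settling estimate, with assumed notation):

```latex
% Open loop settling time for an FOPDT process (~95% complete response)
t_{\mathrm{settle}} \approx \theta_p + 3\tau_p
% \theta_p = process dead time, \tau_p = process time constant
% e.g., \theta_p = 5 min and \tau_p = 5 min give t_settle of about 20 min
```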

A new deviation will be affecting the process. By the time the controller can implement a fix, that disturbance event will have passed, and the fix for the past event will be superimposed on a new disturbance event.

In this case, the controller will be responding to noise and tampering with the process, and I would suggest implementing an SPC (statistical process control) filter on the controller output. See Muthiah, N., and R. Russell Rhinehart, “Evaluation of a Statistically-Based Controller Override on a Pilot-Scale Flow Loop,” ISA Transactions, Vol. 49, No. 2, pp. 154-166, 2010. Alternately, one could detune the controller.
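
As a rough sketch of the general idea (not the algorithm in the cited paper), such a filter might hold the controller output unless the measurement deviation is statistically significant:

```python
# Rough sketch of an SPC-style override that holds the controller output
# unless the deviation is statistically significant. This illustrates the
# general idea only; it is not the algorithm in the cited paper.

class SPCOverride:
    def __init__(self, noise_sigma, n_sigma=3.0):
        self.noise_sigma = noise_sigma  # noise std dev, estimated from steady data
        self.n_sigma = n_sigma          # significance threshold (multiples of sigma)
        self.held_output = None

    def filter(self, pid_output, pv, sp):
        """Pass the PID output through only on a significant deviation;
        otherwise hold the last output to avoid tampering."""
        if self.held_output is None or abs(pv - sp) > self.n_sigma * self.noise_sigma:
            self.held_output = pid_output
        return self.held_output
```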

If the disturbances are infrequent, persist for a long time relative to the analyzer cycle, and the changes in analyzer delay are also infrequent, then classic tuning for dead time dominant processes should work. Use the current analyzer delay to adjust the dead time in, for instance, Cohen and Coon tuning. I do not know of a solution if the delay time cannot be forecast, except to accept a detuned conventional controller.

A standard solution is to consider cascade control. If there is an easy-to-measure secondary variable (temperature, pressure, differential pressure, conductivity, etc.) that could provide an early indication of the primary analyzer variable value, then control that secondary variable, and use feedback from the delayed analysis to adjust the set point for the secondary variable. The primary controller feedback could even use heuristic rules to adjust the secondary set point.

David Bruton’s Responses:

Analyzers with long measurement delays make continuous control particularly difficult. They add extra delays to the loop's dead time, which is the main factor in the ability to control. We can borrow a solution from another control application that has long & variable delays: wireless control.

Advances in control techniques have produced enhanced PID controllers (e.g., DeltaV PIDPlus) precisely for this purpose. When this feature is enabled, the enhanced PID will assess the amount of time since the last measurement update and adjust its control response to match the update rate.
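
As a rough illustration of the concept (a simplified sketch, not the vendor's algorithm), a PI controller can scale its reset action by the elapsed time since the last measurement update:

```python
import math

# Rough sketch of a measurement-update-driven PI controller, loosely in
# the spirit of enhanced PIDs such as DeltaV PIDPlus. This is NOT the
# vendor's algorithm; structure and names are illustrative assumptions.

class UpdateDrivenPI:
    def __init__(self, kc, reset_time_s):
        self.kc = kc                    # controller gain
        self.reset_time_s = reset_time_s
        self.reset = 0.0                # positive-feedback reset contribution
        self.out = 0.0                  # last controller output
        self.last_update_s = None

    def on_new_measurement(self, pv, sp, now_s):
        """Execute only when a new (possibly delayed) measurement arrives;
        integral action is scaled by the actual elapsed time."""
        if self.last_update_s is not None:
            elapsed = now_s - self.last_update_s
            # Move the reset contribution toward the last output over the
            # elapsed time so reset action matches the update rate.
            self.reset += (self.out - self.reset) * (
                1.0 - math.exp(-elapsed / self.reset_time_s))
        self.last_update_s = now_s
        self.out = self.kc * (sp - pv) + self.reset
        return self.out
```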

If the analyzer is not used directly for feedback control, lab analysis might be used to bias the controller or correct an online reading. Boiler conductivity, for example, is measured online with occasional lab analysis; the bias on the PID controller is adjusted based on the lab analyzer's readings.

Sometimes at-line measurements are corrected to match lab readings. It is crucial that this method only be used on sufficiently slow processes, since the lab measurement is assumed to be the same as the at-line measurement. Also, the controller should not be allowed to respond to at-line measurement changes until all redundant instruments are adjusted. This helps prevent offsets between measurements.

Confirmation of the reliability of the lab analysis and its time stamping is essential before it is used to bias a controller output or analyzer reading. If the analyzer is used for feedback control, the confirmation process must be especially stringent, and the fraction used for correction of the analyzer reading must be extremely small.

Because analyzers provide composition measurements essential for quantifying process performance, their readings may indicate that the process must slow down or speed up. Look to preemptively correct for disturbances to the impacted controllers by implementing feedforward and ratio control, and consider developing the inferential measurements (soft sensors) detailed in Greg McMillan's responses.

Michel Ruel’s Responses:

Erik, an interesting question. As a general rule in process control, if you have relevant information, you should use it. With a signal updated every 20 minutes, for example, it becomes difficult to control the process properly; the dead time (the enemy in process control) is too long. With a dead time of 20 minutes, the closed loop time response would be longer than one hour.

With analyzer or laboratory information, we generally use this data to update a soft sensor (I personally prefer the term “virtual sensor”). The soft sensor uses process signals (e.g., temperature, density, pressure) to infer the measurement produced by the analyzer. The soft sensor generates a continuous signal.

For process control, you use the (continuous) signal from the soft sensor, but when a result from the analyzer is available, it is used to update (or reset or shift) the soft sensor. You absolutely need some form of logic or intelligence to properly adjust the soft sensor. Usually, with a new update from the analyzer, the soft sensor is updated with a bias (zero shift), but it can be more complex. You do not want to upset the process, so this bias is added gradually; with more complex logic, you can recalibrate the soft sensor.
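
A minimal sketch of that gradual bias shift (the class and the ramp-over-N-executions scheme are illustrative assumptions, not a vendor implementation):

```python
# Sketch of gradually shifting a soft sensor to a new analyzer result
# (names and ramp scheme are illustrative assumptions).

class SoftSensorBias:
    def __init__(self, ramp_steps=20):
        self.bias = 0.0       # current zero shift applied to the soft sensor
        self.step = 0.0       # per-execution bias increment
        self.steps_left = 0
        self.ramp_steps = ramp_steps

    def on_analyzer_result(self, analyzer_value, corrected_value):
        """corrected_value is the (already biased) soft sensor output at the
        sample time; spread the remaining mismatch over several executions
        so the correction does not upset the process."""
        target = self.bias + (analyzer_value - corrected_value)
        self.step = (target - self.bias) / self.ramp_steps
        self.steps_left = self.ramp_steps

    def apply(self, raw_soft_sensor_value):
        """Call every controller execution; returns the corrected signal."""
        if self.steps_left > 0:
            self.bias += self.step
            self.steps_left -= 1
        return raw_soft_sensor_value + self.bias
```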


It is also possible to use neural networks, fuzzy logic, and artificial intelligence to develop complex soft sensors.


Greg McMillan’s Responses:

The dead time from an at-line analyzer is the sum of the sample transport time, half the cycle time, and the analysis time. Normally the analysis result arrives at the end of the cycle, in which case the analysis time equals the cycle time, making the dead time the sample transport time plus 1.5 times the cycle time.
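
In equation form, with t_t the sample transport time, t_c the cycle time, and t_a the analysis time (notation assumed):

```latex
\theta_{\mathrm{analyzer}} = t_t + \tfrac{1}{2}\,t_c + t_a
\quad\Rightarrow\quad
\theta_{\mathrm{analyzer}} = t_t + 1.5\,t_c \;\; \text{when } t_a = t_c
```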

For analyzers in analyzer houses, the sample transport time can be huge. For inline analyzers, the dead time is mostly just the stream transportation delays to the analyzer, often in a recirculation line. If the analyzer is downstream of a heat exchanger due to a hot process fluid, the transportation delay from the heat exchanger volume can be considerable.

As mentioned by David Bruton, for at-line analyzers, the enhanced PID developed for wireless measurements offers better performance and simpler tuning. See Annex E in the newly completed ISA technical report ISA-TR5.9, PID Algorithms and Performance, for details on the enhanced PID.

Tests show that if the dead time from the analyzer is greater than the 63% response time, the tuning for an enhanced PID can be simplified to setting the PID gain equal to the inverse of the maximum open loop gain and setting the reset time based on the loop dead time without the analyzer dead time. If there are disturbances in an opposite direction during the time between analyzer results, the PID gain must be accordingly decreased. Lab samples can even be used with long and variable dead times between reported and validated analyzer results without retuning the controller.


However, a long dead time waiting for an analyzer update greatly decreases loop performance. As noted by Michel Ruel, dead time is the enemy. For unmeasured disturbances, the minimum possible peak error increases in proportion to the ratio of the total loop dead time with the analyzer to that without it, and the minimum possible integrated error increases in proportion to that ratio squared.
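
Written out, with θo the total loop dead time without the analyzer and θa the analyzer dead time (notation assumed):

```latex
r = \frac{\theta_o + \theta_a}{\theta_o}, \qquad
\frac{E_{\mathrm{peak}}^{\mathrm{with}}}{E_{\mathrm{peak}}^{\mathrm{without}}} \propto r, \qquad
\frac{E_{\mathrm{int}}^{\mathrm{with}}}{E_{\mathrm{int}}^{\mathrm{without}}} \propto r^{2}
```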


If the analyzer dead time is twice the total loop dead time without the analyzer, the peak error and integrated error are three and nine times the original errors, respectively, where the original errors are those for the total loop dead time without the analyzer dead time. Also, as noted by Russ Rhinehart, if the disturbance changes sign within the analyzer dead time, the correction based on an analyzer result could do more harm than good.

Lab analysis may be used to gradually correct an analyzer reading only if stringent requirements are met. For example, the lab analysis must be a time-stamped, verified, reasonable result that is compared to the analyzer reading back in time to when the sample was taken, and only a small fraction (e.g., < 0.4) of the difference between the lab and analyzer readings should be used to provide a bias correction of the analyzer reading.
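
A minimal sketch of this correction logic, assuming a historian read-back function and tag name (both hypothetical):

```python
# Sketch of biasing an at-line analyzer from a time-stamped lab result.
# `read_historian` and the tag name are hypothetical; the fraction follows
# the "small fraction (e.g., < 0.4)" guidance above.

CORRECTION_FRACTION = 0.3

def update_analyzer_bias(current_bias, lab_value, sample_time, read_historian):
    """Compare the verified lab result to the analyzer reading at the time
    the sample was taken and apply only a fraction of the difference."""
    analyzer_then = read_historian("AT-101/PV", sample_time)  # hypothetical tag
    return current_bias + CORRECTION_FRACTION * (lab_value - analyzer_then)
```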


This correction must be proven over time and the fraction of correction adjusted to be effective. Performing three lab measurements and taking the middle value helps provide a more accurate and reliable lab analysis. Also, precautions must be taken so that sample temperature changes and the evaporation or precipitation of components do not change the lab sample composition.
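
The middle-value selection is a one-liner:

```python
def middle_of_three(a, b, c):
    """Middle value of three lab measurements; more robust than averaging
    because a single outlier cannot pull the result."""
    return sorted((a, b, c))[1]
```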


Inferential measurements (also known as virtual sensors, soft sensors, and dynamic estimators) described in the June Control Talk “Top of the Plant Bottom Line” can eliminate the analyzer dead time. Inferential measurements for biological and chemical processes use accurate flow, stream and equipment temperature, conductivity, dielectric spectroscopy, dissolved oxygen, dissolved carbon dioxide, pH, and turbidity measurements.

Model Predictive Control (MPC) software can be used to identify and implement the dynamics of the inferential measurement to match at-line analyzer results. These are commonly termed dynamic linear estimators. The dynamics are an open loop gain, a total loop dead time that includes the analyzer dead time, and primary and secondary time constants. The inferential measurement without the analyzer dead time is used for process control, and the version with the analyzer dead time is used for correction against new analyzer results.
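
A minimal sketch of such an estimator (names, structure, and the example numbers are illustrative; identification packages implement this differently), keeping only the primary time constant for brevity:

```python
from collections import deque

# Sketch of a dynamic linear estimator: a linear combination of process
# inputs passed through the identified dynamics (first-order lag plus
# dead time). The secondary time constant is omitted for brevity.

class DynamicLinearEstimator:
    def __init__(self, gains, tau_s, dead_time_steps, dt_s):
        self.gains = gains                  # identified open loop gains
        self.alpha = dt_s / (tau_s + dt_s)  # first-order filter coefficient
        self.lagged = 0.0
        # FIFO buffer implements a delay of dead_time_steps samples
        self.buffer = deque([0.0] * (dead_time_steps + 1),
                            maxlen=dead_time_steps + 1)

    def update(self, inputs):
        steady_state = sum(k * u for k, u in zip(self.gains, inputs))
        self.lagged += self.alpha * (steady_state - self.lagged)  # lag
        self.buffer.append(self.lagged)                           # delay
        return self.buffer[0]

# Two copies: one without the analyzer dead time (for control) and one
# including it (for correction against new analyzer results).
for_control = DynamicLinearEstimator([0.8, -0.2], tau_s=120, dead_time_steps=6, dt_s=10)
for_correction = DynamicLinearEstimator([0.8, -0.2], tau_s=120, dead_time_steps=96, dt_s=10)
```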


Open loop tests of dynamic first principle models can help find the open loop gains and stream measurements that affect the inferential measurements. The open loop gains should be updated online, eliminating the linear restriction. Partial least squares (PLS) analysis or a multivariable autoregressive (ARX) model may also help identify unsuspected relationships between process inputs and process outputs.


Mark Darby's Responses:

There are dynamics associated with both the analyzer, such as a gas chromatograph (GC), and the process itself. For the analyzer, it is predominantly the delay associated with the length of tubing between the sample location and the analyzer. The process part includes delays and lags (time constants) that reflect the dynamics of flows and volumes/holdups that exist between the measurement location and the analyzer sample location. The process measurements we normally think of are temperatures or flows.

To minimize dead time associated with the analyzer, a “fast loop” that continuously circulates process fluid is normally used. Samples are then “pulled” from the fast loop to the analyzer. This is standard practice. The analyzer cycle time determines how often an analysis result is reported; for control purposes, it is obviously best to minimize it.

Both analyzer and process dynamics are present when modeling the relationship between the analyzer and a process measurement. This is often modeled with a first order plus dead time model, although the effect of multiple lags is usually observed. With an online process analyzer, the time stamp of a new analyzer result is not needed for control because of the dynamic relationship (normally assumed fixed) that exists between the process measurement and the analyzer, although this relationship can only be observed when a new analysis is reported. Note that the dead time does not change with the cycle time of the analyzer.

The other situation is when a process analyzer is not present and samples are instead collected manually and taken to a lab. For this case, an accurate time stamp is necessary to match or sync up the sample result with process conditions. In my experience, sample times are not sufficiently accurate for control applications, and sample collection procedures must be changed.

As an example, it is common that a sample is routinely taken at the start of a shift; however, the actual time of the sample could easily vary by 30 minutes, perhaps longer than one hour. One solution to this problem is for the lab technician or outside operator to notify the console operator when the sample is taken. The console operator then manually changes a digital point or switch in the DCS, which then records the current time or stores current values (often averaged) of the corresponding process measurements.

An automatic way to do this is to install a raw thermocouple in the sample collection line. The resulting temperature spike from a sample collection is used to trigger a switch to record the current time or store current values.

Implications for control: There are two ways to use a process analyzer for control. One is as the PV to a PID loop. For the best performance, the controller should only execute upon a new analyzer result. If the analyzer comes into the DCS via a digital interface, there is often a digital point associated with a new analysis result that can be used to trigger the PID loop. If the signal comes into the DCS as an I/O point (4-20 mA), logic is normally required to infer a new analysis based on a change in the analyzer signal.
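
A minimal sketch of that detection logic (the threshold is an illustrative assumption):

```python
# Rough sketch of inferring a new analysis result from a 4-20 mA analyzer
# signal: trigger only when the reading changes by more than the noise band.

NOISE_BAND = 0.05  # engineering units; set from observed signal noise

def is_new_result(previous_value, current_value, noise_band=NOISE_BAND):
    """True when the change exceeds the noise band, indicating the
    analyzer has reported a new result."""
    return abs(current_value - previous_value) > noise_band
```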

For MPC, most products have a new-result switch associated with each controlled variable that will need to be triggered in the same way. When brought in as an analog input, there is usually some noise superimposed on the analyzer signal that must be accounted for in the detection logic. There may also be additional analyzer information that can be quite helpful to use, including status/validity information and when the analyzer is being calibrated.

The other way is to use the analyzer result to update an inferential or soft sensor model using continuously measured values like temperature and pressure. Similar logic is required to update the model upon a new analysis. This approach is also applicable to lab measurements. When a new lab analysis is reported, the time stamp is used to retrieve historical values, or stored values are used with the new lab result to update the model.

Note that unless accurate time stamps are already in place, it may be difficult to develop an inferential model. In addition, there may not be sufficient data and/or movement in the process, and a dedicated test may be required.

When using analyzers in control, it is worthwhile to consider how they can be sped up. Options include changing the stream sequence or reallocating analyzers to different services.

It is also important to consider whether insulation of the sample line is adequate to avoid the condensation and/or evaporation that can occur at ambient temperature.

Greg McMillan
Greg McMillan has more than 50 years of experience in industrial process automation, with an emphasis on the synergy of dynamic modeling and process control. He retired as a Senior Fellow from Solutia and a senior principal software engineer from Emerson Process Systems and Solutions. He was also an adjunct professor in the Washington University Saint Louis Chemical Engineering department from 2001 to 2004. Greg is the author of numerous ISA books and columns on process control, and he has been the monthly Control Talk columnist for Control magazine since 2002. He is the leader of the monthly ISA “Ask the Automation Pros” Q&A posts that began as a series of Mentor Program Q&A posts in 2014. He started and guided the ISA Standards and Practices committee on ISA-TR5.9-2023, PID Algorithms and Performance Technical Report, and he wrote “Annex A - Valve Response and Control Loop Performance, Sources, Consequences, Fixes, and Specifications” in ISA-TR75.25.02-2000 (R2023), Control Valve Response Measurement from Step Inputs. Greg’s achievements include the ISA Kermit Fischer Environmental Award for pH control in 1991, appointment to ISA Fellow in 1991, the Control magazine Engineer of the Year Award for the Process Industry in 1994, induction into the Control magazine Process Automation Hall of Fame in 2001, selection as one of InTech magazine’s 50 Most Influential Innovators in 2003, several ISA Raymond D. Molloy awards for bestselling books of the year, the ISA Life Achievement Award in 2010, the ISA Mentoring Excellence award in 2020, and the ISA Standards Achievement Award in 2023. He has a BS in engineering physics from Kansas University and an MS in control theory from Missouri University of Science and Technology, both with emphasis on industrial processes.

Books:

Advances in Reactor Measurement and Control
Good Tuning: A Pocket Guide, Fourth Edition
New Directions in Bioprocess Modeling and Control: Maximizing Process Analytical Technology Benefits, Second Edition
Essentials of Modern Measurements and Final Elements in the Process Industry: A Guide to Design, Configuration, Installation, and Maintenance
101 Tips for a Successful Automation Career
Advanced pH Measurement and Control: Digital Twin Synergy and Advances in Technology, Fourth Edition
The Funnier Side of Retirement for Engineers and People of the Technical Persuasion
The Life and Times of an Automation Professional - An Illustrated Guide
Advanced Temperature Measurement and Control, Second Edition
Models Unleashed: Virtual Plant and Model Predictive Control Applications
