ISA Interchange

Welcome to the official blog of the International Society of Automation (ISA).

The Best Level for Optimization and Organization in the Smart Factory

The following discussion is part of an occasional series showcasing the ISA Mentor Program, authored by Greg McMillan, industry consultant, author of numerous process control books, 2010 ISA Life Achievement Award recipient, and retired Senior Fellow from Solutia, Inc. (now Eastman Chemical). Greg will be posting questions and responses from the ISA Mentor Program, with contributions from program participants.


Zhang (Frank) Yi is a senior control system engineer at Suez North America with five years of industry experience. Frank directs and manages control system improvement master plans, related projects, and five-year budget estimation. He also provides project and technical support and guidance for the Pennsylvania, Delaware, Idaho, and South Jersey business units. He acts as a functional expert in smart SCADA, enterprise SCADA data centers, data management, process optimization, and digital process modernization, and he assists in design consulting for the enterprise SCADA data center architecture and its construction.

 

Zhang (Frank) Yi’s Questions

  1. What is the best level of embedding advanced automation for optimizing production performance (e.g., cloud, SCADA, or edge)?
  2. What are organizational requirements for enabling the smart factory?

 

Samuel Arce’s Answers

Samuel Arce is a PhD candidate at Brigham Young University. He is part of the Process Research and Intelligent Systems (PRISM) group and he has participated in academic research that applies data science to a number of industrial applications ranging from process analytics and control to unmanned aerial systems optimization. For more on this program, see the Control Talk column “Optimizing process control education and application.” Samuel has developed his professional experience in the mining industry, where he currently develops digital technologies to enhance industrial automation. He has also participated in commissioning and startup projects for major mining companies in the U.S., Canada, and Mexico.

  1. I would say SCADA is currently the best level of embedding advanced solutions because the maturity of products already in the market is higher than it is for cloud or edge. However, major automation companies are making big investments in cloud technologies. I think edge computing is currently limited in capabilities, but I would say that, when used in combination with SCADA or cloud, it can improve performance significantly.
  2. We recently had a discussion with experts in the field. I think they answer this question better than I can: https://youtu.be/5BRG0w66j0E

The proper training and early involvement of operators in the development of digital technologies cannot be overemphasized.

 

Mark Nixon’s Answers

  1. The answer to the first question is, “It depends on what the application is.” We have a lot of experience inside the DCS with utilizing Model Predictive Control (MPC) and optimizers for improving plant performance and optimizing equipment usage. We have utilized simulators for both offline process design and online optimizations. For example, dynamic process simulators are now being used with advanced control to optimize start-up times and respond to process changes.

    Edge, or on-premises, solutions have also been in use for many years. Some of these applications optimized equipment and unit usage before the term “edge” came into use. Cloud solutions have also emerged over the past few years; examples include equipment integrity monitoring, energy loss monitoring, pump health monitoring, and valve monitoring. The choice of where to perform supervisory control, optimization, and monitoring often depends on the organization's readiness to utilize the various platforms and on which group in the organization drives the projects.
  2. There are many requirements for enabling smart factories. At the field level, it has often been difficult to get to information trapped in devices such as flow devices and valves. The wide availability of technologies such as WirelessHART has made it possible for many organizations to monitor vibration on pumps and other equipment, monitor for steam loss, and add measurements to remote locations, such as well heads, that were not very accessible in the past. Looking ahead, the adoption of Ethernet field-level networks will make it possible for the same device to be connected to the control system as well as to monitoring systems, which should improve the adoption of edge and cloud applications.

    Security has been another factor holding many users back. The wide adoption of the ISA/IEC 62443 series of standards in combination with the integration of security technologies into the DCS and plant information systems is enabling the promise of smart factories. The adoption of mobile technology is another factor. Today, sites are deploying mobile technology and devices to both operations and IT personnel. This wide-scale adoption is driving considerable innovation and forcing, or rather enabling, cooperation between IT and operations staff.

    Another roadblock has been the disparity between information models. One recent standard, driven by the FieldComm Group, is PA-DIM, the Process Automation Device Information Model. Standardizing information models is a critical requirement for smart factories.

 

Patrick Dixon’s Answers

  1. The best level for automation of production optimization is a function of several factors. First, we have to define some terms. I have been involved in developing the “Industry 4.0 Lexicon.” Some of the definitions below are arguable, but they are applicable here:

    Production performance: If we are trying to optimize production, we want to make the most product in the least time. With production performance, I assume we are trying to do the same thing while also ensuring we meet all quality criteria so that we can sell what we make, minimize resources such as raw materials and energy, and maximize profit.

    MPC: A multi-input, multi-output (MIMO) control with an objective to find the best solution to meet targets while also minimizing actuator movement and resources (economic costs)

    Optimization: Eliminating error is not necessarily optimization. Compared to having no control or poorly tuned control, a well-designed and tuned PID might be considered optimization. However, to optimize production, we are typically using MPC.

    Level 0: Field instrumentation (sensors, valves, motors, and so on)

    Level 1: Data acquisition and logic, such as a PLC or DCS controller

    Level 2: The supervisory level, including HMI, historian, supervisory applications, and connectivity among Level 1 devices. This is often referred to as SCADA.

    SCADA: A system consisting of Level 1 and Level 2 components that are purchased separately and then integrated

    DCS: The same as SCADA, except it is purchased as a complete system with all components integrated by design when the system is purchased

    Edge: A device that performs logic processing and simultaneously connects process data with the outside world (the internet). An edge device could be Level 0, Level 1, or Level 2.

    Cloud: Processing provided by an outside provider such as Amazon or Oracle with internet connectivity to the process, reducing the footprint and maintenance for on-premise processing

    Given these definitions, the general principle in industry has been to put controls as close to the process as practically possible. Where a fast response is needed (such as a one-second scan time), the control belongs at the lowest level that can provide it.

    There are some vendors that offer MPC controllers in Level 1 devices, but they may not have optimizers. Most MPC that I have seen is done at Level 2, but the scan rate of such applications should not be so fast as to burden the network with too much data or drive Level 1 loops too fast. Normally, the processing requirement for these applications can exceed the capabilities of a Level 1 device, and you do not want to impact time-critical PID, interlocks, and alarms in these devices. Cloud provides a way to host Level 2 MPC offsite, but it introduces security issues and a lack of deterministic processing over the internet.

    Above Level 2 (where MES and ERP live), cloud is probably the ideal solution. Since edge could be Level 1 or Level 2, it is viable either way. That leaves SCADA, which is not a level but an alternative to a DCS. Either a DCS or a SCADA system can have a production performance optimization application.

    The bottom line is that a native DCS/SCADA server, edge, or cloud at Level 2 is the most common and preferred approach, unless the objective involves production optimization across multiple production facilities, in which case cloud at the MES or ERP level would be best.
  2. I assume that when you refer to organizational requirements, you mean the structure of the management and the business. My answer is to have an organization that puts priority on the foundation upon which you are building the castle. The most elaborate castle will crumble on a poor foundation.

    Then we have to define “smart factory.” “Smart” does not mean digital. We have been digital for 50 years. If “smart” means Industry 4.0, then we need to understand the difference between Industry 3.0 and 4.0.

    The difference is the internet. That means we now have a public infrastructure that allows connectivity from Level 0 to the top of the enterprise. Therefore, the organizational requirements are to have people and processes in place to maintain Level 0, since this is the foundation upon which everything else is built. Big Data cannot work if you feed it garbage.

    Then we need Level 1 to have well-tuned loops, functioning alarm systems, and appropriate interlocks and permissives. As we go up the network, we need to ensure that the network can support what is put on top of it. I was on a startup a few years ago where the network suddenly died (HMIs had bad values and red blinking indicators, engineering stations couldn’t connect to PLCs to see the logic, and so on). Someone had added 2,000 tags to the historian and sucked up the network bandwidth. To conclude, the organization needs to prioritize maintaining the Industry 3.0 system to enable a smart Industry 4.0 system.
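The MPC definition given in answer 1 (meet targets while minimizing actuator movement) can be illustrated with a minimal sketch. This is a hedged, self-contained example: the first-order model gains, horizon, setpoint, and move-suppression weight are all assumed values for illustration, and the stacked least-squares formulation is just one simple way to solve an unconstrained single-loop MPC problem, not any vendor's implementation.

```python
import numpy as np

# Illustrative finite-horizon MPC for one loop (all values assumed).
# Plant model: y[k+1] = a*y[k] + b*u[k]
a, b = 0.9, 0.1
N = 10        # prediction/control horizon
r = 1.0       # target
lam = 0.01    # move-suppression weight (penalizes actuator movement)
y0 = 0.0      # current measurement

# Prediction over the horizon: y = F*y0 + G @ u
F = np.array([a ** (k + 1) for k in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b

# Minimize sum (y[k]-r)^2 + lam * sum (u[k]-u[k-1])^2, with u[-1] = 0,
# by stacking both terms into one least-squares problem.
D = np.eye(N) - np.eye(N, k=-1)          # difference matrix for moves
A = np.vstack([G, np.sqrt(lam) * D])
rhs = np.concatenate([r - F * y0, np.zeros(N)])
u = np.linalg.lstsq(A, rhs, rcond=None)[0]

y_pred = F * y0 + G @ u
print(np.round(u, 2))        # planned moves
print(np.round(y_pred, 2))   # predicted response approaching the target
```

With a small move penalty the solution drives the predicted output close to the target within the horizon; raising `lam` trades tracking speed for gentler actuator moves, which is exactly the MPC trade-off described above.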

 

Greg McMillan’s Answers

  1. The most successful technologies for optimization have proven to be PID and MPC, for the many reasons cited in the Control Talk blog column “Keys to successful process control technologies.” The location that provides the best interface, response times, and functionality of these technologies should be prioritized. I greatly appreciated having the extensive PID and MPC with a built-in Linear Program optimizer in a DCS that could be prototyped in a digital twin using imports and exports of actual configuration and connected to a first principle dynamic simulation without the need for translation or external interfaces.

    For simple optimization applications, such as maximizing chiller or cooling tower temperatures or minimizing pump or compressor discharge pressures, a PID can be used as a valve position controller to maximize user valve positions and save energy, as detailed in the Control feature article “Don’t Over Look PID in APC.” Procedure automation (state-based control), where PID modes and outputs are sequenced, can be used to handle abnormal situations, as discussed in the Control Talk column “Continuous Improvement of Continuous Processes.”
  2. A smart factory cannot be really smart without extensive smart measurements. This means accurate measurements of all variables important for process performance and equipment monitoring, particularly those that cannot be calculated or modeled from other measurements. Wireless measurements offer opportunities to expand the scope for measurements and to find the best location. Fortunately, smart digital transmitters narrow the focus to the best sensor type and installation to achieve the needed accuracy.

    For example, an RTD has two orders of magnitude greater sensitivity and accuracy than a thermocouple, taking into account the potential drift, for temperatures less than 400°. A spring-loaded, sheathed element in a stepped thermowell of sufficient insertion length, in a fluid velocity greater than 1 fps, greatly reduces measurement lags and heat conduction errors.

There is a great opportunity for more effective use of analyzers to provide critically important knowledge of stream compositions. While analyzer technology has improved, there is a considerable need for improved reliability and accuracy, often achieved by better calibration and maintenance that can be facilitated by smarter diagnostics via data analytics, as discussed in the Control Talk column “Analyzing analyzers.”
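The valve position controller mentioned above can be sketched in a few lines. This is a hypothetical illustration only: the inverse pressure-to-valve-position relation, the 90% position target, and the integral gain are assumptions made for the example, not values from the article or any product.

```python
# Illustrative integral-only valve position controller (VPC) sketch.
# Assumption: at constant flow demand, the most-open user valve position
# varies inversely with discharge pressure (hypothetical plant relation).
def most_open_valve(pressure):
    return 60.0 / pressure  # fraction open; illustrative model only

def vpc_step(pressure_sp, valve_pos, target=0.90, ki=20.0):
    # Integral-only action: while the most-open valve still has margin
    # below the target, keep lowering pressure to save pumping energy;
    # if a valve opens past the target, pressure is raised back up.
    error = target - valve_pos
    return pressure_sp - ki * error

pressure = 100.0  # initial discharge pressure setpoint (arbitrary units)
for _ in range(200):
    pressure = vpc_step(pressure, most_open_valve(pressure))

print(round(pressure, 1), round(most_open_valve(pressure), 3))
```

The slow integral-only action matters in practice: the VPC must move the pressure setpoint gently so it never fights the faster flow loops, which is why such controllers are tuned far slower than the loops beneath them.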

 

Additional Mentor Program Resources

See the ISA book 101 Tips for a Successful Automation Career that grew out of this Mentor Program to gain concise and practical advice. See the Control Talk column "How to effectively get engineering knowledge" with the ISA Mentor Program protégée Keneisha Williams on the challenges faced by young engineers today, and the column "How to succeed at career and project migration" with protégé Bill Thomas on how to make the most out of yourself and your project. Providing discussion and answers besides Greg McMillan and co-founder of the program Hunter Vegas (project engineering manager at Wunderlich-Malec) are resources Mark Darby (principal consultant at CMiD Solutions), Brian Hrankowsky (consultant engineer at a major pharmaceutical company), Michel Ruel (executive director, engineering practice at BBA Inc.), Leah Ruder (director of global project engineering at the Midwest Engineering Center of Emerson Automation Solutions), Nick Sands (ISA Fellow and Manufacturing Technology Fellow at DuPont), Bart Propst (process control leader for the Ascend Performance Materials Chocolate Bayou plant), Angela Valdes (automation manager of the Toronto office for SNC-Lavalin), Daniel Warren (senior instrumentation/electrical specialist at D.M.W. Instrumentation Consulting Services, Ltd.), and Ryan Simpson (process analytics engineer at Eastman Chemical).

Greg McMillan
Gregory K. McMillan, CAP, is a retired Senior Fellow from Solutia/Monsanto where he worked in engineering technology on process control improvement. Greg was also an affiliate professor for Washington University in Saint Louis. Greg is an ISA Fellow and received the ISA Kermit Fischer Environmental Award for pH control in 1991, the Control magazine Engineer of the Year award for the process industry in 1994, was inducted into the Control magazine Process Automation Hall of Fame in 2001, was honored by InTech magazine in 2003 as one of the most influential innovators in automation, and received the ISA Life Achievement Award in 2010. Greg is the author of numerous books on process control, including Advances in Reactor Measurement and Control and Essentials of Modern Measurements and Final Elements in the Process Industry. Greg has been the monthly "Control Talk" columnist for Control magazine since 2002. Presently, Greg is a part time modeling and control consultant in Technology for Process Simulation for Emerson Automation Solutions specializing in the use of the digital twin for exploring new opportunities.
