
Process Plant Historians Deliver Insights and Improve Operations

This article was written by Bill Lydon, automation industry authority, journalist, and former chief editor of InTech magazine.

Plant historians became popular in process plants in the 1980s. More recently, they have been used broadly throughout industry to improve operations, identify problems, and find opportunities for enhancement.

Early forerunners of historians were electromechanical pen-and-paper chart recorders, which produced a graph of one or more measured values over a period of time for analysis and a permanent record of critical information.

Plant historians acquire real-time data from automation and other systems, storing time-stamped data at high speed to maintain a chronology of information. This industrial process information is then available to any user for reference and analysis, including production status, performance monitoring, quality assurance, troubleshooting, tracking, and genealogy.

A major advantage of historians is the ability to research and correlate any data easily to identify trends and relationships. Initially historians were very expensive, but they have now gained broader use. Lower-cost computers and storage make it possible to record large amounts of plant data from operators, sensors, and processes at a reasonable cost. Compared to keeping paper records, it is much simpler to manage data, analyze it, and archive it electronically.

Big data

Before the term “big data” came into use with the Internet of Things, plant historians were already handling large volumes of time-synchronized data. Consider sampling and storing a single temperature sensor value every half second (500 milliseconds): that generates more than 63 million samples per year. Tracking 1,000 temperature sensors in a plant would generate more than 63 billion stored samples per year. For sequential samples where values do not change, data compression techniques are often used to conserve storage. It is also common to filter measurements before sending data to a historian to reduce the significant amount of “noise” in the data. In an automation system, typical items tracked and stored include temperature, flow rate, pressure, level, machine cycles, run time, and overall equipment effectiveness. The recognition that big data can be valuable has put greater emphasis on historians to capture more data in plants than in the past. Fortunately, technology enables the collection and handling of big data at lower costs.
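
To make the compression idea concrete, here is a minimal sketch of deadband-style compression, one common historian technique; the sample structure, deadband value, and function names are illustrative, not any vendor's implementation.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp_ms: int  # time tag assigned at capture
    value: float

def deadband_compress(samples: list[Sample], deadband: float) -> list[Sample]:
    """Store a sample only when it moves more than `deadband` from the
    last stored value, so long runs of unchanged readings collapse to one."""
    stored: list[Sample] = []
    for s in samples:
        if not stored or abs(s.value - stored[-1].value) > deadband:
            stored.append(s)
    return stored

# One reading every 500 ms is 2 samples/s; over a year that is
# 2 * 60 * 60 * 24 * 365 = 63,072,000 samples for a single sensor.
readings = [Sample(i * 500, 21.0) for i in range(1000)]  # a flat temperature
print(len(deadband_compress(readings, deadband=0.5)))    # -> 1 stored sample
```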

Leveraging data

Information is increasingly being put to use by a wide variety of people, including plant management, engineers, operators, accountants, business analysts, scientists, quality control workers, and information technology (IT) specialists. Data alone does not deliver benefits; it is the people who use the information to solve problems and make improvements who deliver them. The ISA95 international standard for the integration of enterprise and control systems has been widely adopted to organize and communicate historic information. There are several use cases for data captured by historians:

Legal and compliance verifications

In both process and discrete industries, companies need to maintain a genealogy record of production and quality tests for legal and compliance reasons. Historians hold detailed data that can be used in defense against litigation and to determine what products must be recalled, if necessary. Requirements in the food and pharmaceutical industries are long standing, and recent quality incidents and recalls in the automotive industry illustrate the need to capture and keep production data. Pharmaceutical applications of historians must comply with 21 CFR Part 11, the electronic records and electronic signatures regulation; given the importance of the data, the FDA imposes strict requirements on access, security, and the ability to edit such records.

Track and trace serialization

Track and trace (TnT) serialization initiatives to maintain the history of products in the pharmaceutical and food industries are creating a greater need to capture and retain historic production records. The pharmaceutical industry is gearing up for TnT as government agencies and companies take measures to reduce drug counterfeiting and product diversion and to increase patient safety. Motivations include contamination recalls, counterfeit drugs (Viagra is the most counterfeited drug), and drug thefts. The street value of drugs can easily range from $15 to $50 per tablet or, in the case of codeine, $200 to $300 per pint. Thefts of pharmaceuticals in transit have ranged from $2 million to $80 million per incident. Countries worldwide are planning to require track and trace, increasing the need for plant history information.

Root-cause analysis

When production issues occur, historic data is fundamental to identifying sources of problems using root-cause analysis. A root cause is a cause that, when removed from the problem fault sequence, prevents the final undesirable event from recurring. For example, common issues in process applications that can be identified include operator error, inadequate cleaning times, poor cleaning solution strengths, device malfunctions, and wrong process temperatures.
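
As a sketch of the kind of query involved, the snippet below scans a hypothetical historian export for temperature excursions around a bad batch; the file name, column names, time window, and limits are all illustrative assumptions.

```python
import pandas as pd

# Hypothetical historian export: one row per time-stamped sample.
history = pd.read_csv("historian_export.csv",
                      parse_dates=["timestamp"], index_col="timestamp")

# Look at the window around a known bad batch (times are illustrative).
window = history.loc["2024-03-01 06:00":"2024-03-01 09:00"]

# Flag samples where the reactor temperature left its specified band,
# a typical first step when hunting for a root cause.
out_of_spec = window[(window["reactor_temp_C"] < 70.0) |
                     (window["reactor_temp_C"] > 75.0)]
print(out_of_spec.head())
```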

Troubleshooting

Many times, having chronological data is fundamental to pinpointing automation, control, and equipment problems. Chronological data gives troubleshooters deep insight into the behavior of processes at any point in time.

Optimization

Chronological historic data supports the in-depth system performance analysis that companies use when developing methods to optimize processes. This information can serve as input to simulation software to understand the production process and to evaluate optimization methods. At the annual Pharmaceutical Automation Roundtable, a major pharmaceutical company described how it uses historian data to gain new insights for improving the control of biological processes that are complex and difficult to predict. Using historic information and off-the-shelf analytic software, it learned about interactions between process and equipment data that had not been apparent in the past. This approach also helps identify a range of other problems, including maintenance issues.

Power and energy monitoring

Energy spending has often been treated as a blank check in industrial plants. By using a historian along with submetering and power monitoring, however, plants can allocate energy costs to production steps and products to achieve closer cost accounting, find problems, and identify areas for improvement.
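
A toy illustration of that allocation follows; the batch ids, step names, submeter readings, and tariff are all invented for the example.

```python
import pandas as pd

# Made-up submeter readings joined to production steps.
energy = pd.DataFrame({
    "batch_id": ["B100", "B100", "B101", "B101"],
    "step":     ["mix",  "heat", "mix",  "heat"],
    "kwh":      [120.0,  480.0,  115.0,  470.0],
})
RATE_PER_KWH = 0.11  # assumed tariff in USD

# Allocate energy cost to each production step of each batch.
energy["cost_usd"] = energy["kwh"] * RATE_PER_KWH
print(energy.groupby(["batch_id", "step"])["cost_usd"].sum())
```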

Predictive analytics

Advanced analytic capabilities are being applied to data trends and patterns to predict failures and events. Such predictive analytics are only possible with the depth of data a historian provides.
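
A deliberately simple sketch of the idea: fit a trend to recent historian samples and project when an alarm limit will be crossed. The readings and limit are invented, and real predictive analytics use far richer models than a straight line.

```python
import numpy as np

# Last 24 hourly vibration readings from the historian (invented data).
hours = np.arange(24.0)
vibration = 2.0 + 0.05 * hours + np.random.normal(0, 0.02, 24)

# Fit a straight-line trend and project the crossing of an alarm limit.
slope, intercept = np.polyfit(hours, vibration, 1)
ALARM_LIMIT = 4.0  # assumed limit for illustration
hours_to_alarm = (ALARM_LIMIT - intercept) / slope
print(f"Projected alarm crossing in roughly {hours_to_alarm:.0f} hours")
```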

Investment justification

Justifying investments to replace and upgrade equipment can be done more accurately using actual historic plant data, as opposed to estimates, and historic data lends integrity to investment proposals made to management. I recently asked an automation manager from a major pharmaceutical company about the value of historians, and he commented that they are invaluable for providing the data store needed for true process understanding based on analysis, especially for batch processes. He also noted that many managers think of historians as simply a data store to cover a firm for regulatory purposes and do not clearly see the untapped value of information that can be harvested from historians for process improvement, process optimization, or root-cause analysis.

Identifying data

There are different philosophies about how much data to capture and store in a historian. One theory is to capture every possible data point, since you never know what you will need in the future; this seems a bit extreme. The goal is to determine, in each process or production area, what would be needed for analyzing long-term performance and what would be needed for identifying immediate and short-term problems. It is probably more productive to gather ideas about what data to capture from the various departments based on their needs. Creating lists of questions for which people want factual answers can help stimulate ideas for data to capture.

The application of data is an ongoing activity to explore and understand. Initially users may not know enough about the data to put it to productive use, and over time they may recognize the need to capture more data points. This is the idea of “peeling the onion”: as you learn more, there are typically more questions to research. Wireless sensors are making it more cost-effective to monitor and capture data in historians without installing wiring, leading to new operational insights.

Embedded historians

There is a new breed of embedded historians in controllers and standalone field devices that collect historic data remotely. These are rugged field hardware devices with solid-state memory that are part of controllers, plug into backplanes, or communicate through industrial networks to capture data, time-tag it, and store it. Because data is captured and time-tagged immediately at the source, it is more accurate.

The data is then forwarded to the corporate or cloud database for long-term storage. This store-and-forward method allows data aggregation, so the central historian database transaction does not need to occur synchronously with the sampling rate. Onboard software rules engines may be incorporated into these devices and configured to perform analysis for optimizing processes and production.
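
The sketch below shows the store-and-forward pattern in miniature: samples are time-tagged at the source and buffered locally, and a separate step uploads them in batches, decoupled from the sampling rate. The upload function is a placeholder, not a real API.

```python
import queue
import time

buffer: queue.Queue = queue.Queue()

def capture(tag: str, value: float) -> None:
    """Time-tag at the source and buffer locally."""
    buffer.put((time.time(), tag, value))

def forward_batch(batch_size: int = 100) -> None:
    """Drain up to batch_size samples and upload them in one transaction,
    independent of how fast capture() is being called."""
    batch = []
    while not buffer.empty() and len(batch) < batch_size:
        batch.append(buffer.get())
    if batch:
        upload_to_central_historian(batch)

def upload_to_central_historian(batch) -> None:
    print(f"uploaded {len(batch)} samples")  # placeholder for a network call

capture("Line1.ReactorTemp", 72.4)
capture("Line1.ReactorTemp", 72.5)
forward_batch()
```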

Having the absolute time relationship between data points is critical to proper operations or analysis in some applications. For these applications, a number of controllers offer options for more precise time stamping. Controllers that support the IEEE 1588 standard (Standard for a Precision Clock Synchronization Protocol for Networked Measurement and Control Systems) can communicate with a precision time reference. Another method is for the controller to use a global positioning system (GPS) radio receiver as a highly accurate time reference. Using the open OPC UA industrial Web services standard, the information can be sent to historians, business intelligence databases, and enterprise systems, and queried over the Internet or in-house networks.
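
For example, reading a historian-bound value over OPC UA might look like the following, using the open-source asyncua Python package; the endpoint URL and node id are placeholders for your server's actual values.

```python
import asyncio
from asyncua import Client  # open-source OPC UA package (pip install asyncua)

async def read_tag() -> None:
    # Endpoint and node id are placeholders; substitute your server's values.
    async with Client(url="opc.tcp://historian.example.com:4840") as client:
        node = client.get_node("ns=2;s=Line1.ReactorTemp")
        value = await node.read_value()
        print("Line1.ReactorTemp =", value)

asyncio.run(read_tag())
```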

Database interfaces

Historian databases are high-performance designs optimized to capture and time-tag data at high speed. Sending this data to a computer industry-standard database, such as a SQL relational database, allows users to take advantage of a wide range of commercial reporting and analysis software. In addition, pushing information from historians to cloud servers is a way to store a large amount of data at lower cost without having to own, maintain, and manage more servers.
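
As a minimal sketch of that hand-off, the snippet below mirrors historian samples into a standard SQL table (SQLite here for portability); the schema and tag names are illustrative, not any vendor's.

```python
import sqlite3

conn = sqlite3.connect("plant_history.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS samples (
        ts_utc TEXT NOT NULL,   -- ISO-8601 time tag
        tag    TEXT NOT NULL,   -- e.g. 'Line1.ReactorTemp'
        value  REAL NOT NULL
    )
""")
rows = [
    ("2024-03-01T06:00:00Z", "Line1.ReactorTemp", 72.4),
    ("2024-03-01T06:00:00Z", "Line1.FlowRate",    15.1),
]
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
conn.commit()

# Any off-the-shelf SQL reporting tool can now run standard queries:
for tag, avg in conn.execute("SELECT tag, AVG(value) FROM samples GROUP BY tag"):
    print(tag, avg)
```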

Cloud analytics

An interesting development is the availability of analytic software tools and analysis platforms as cloud applications at a very reasonable cost. Another advantage of this approach is that it leverages high-performance computers to accomplish analysis more efficiently.

Business intelligence and historians

Historians and business intelligence (BI) systems have developed independently, but there is a growing recognition that they need to be linked. Historians are unique in that they chronologically capture high-speed, real-time data; in the business world, the times of transactions and events are typically measured in minutes, hours, or days rather than milliseconds. Historians and BI systems share the goal of capturing data to provide historical, current, and predictive views of operations for reporting, online analytical processing, analytics, data mining, process mining, complex event processing, business performance management, benchmarking, predictive analytics, and prescriptive analytics.
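
One common bridge between the two time scales is to resample the high-speed historian series down to the BI system's granularity, as in this pandas sketch with invented data:

```python
import pandas as pd

# One hour of 2 Hz (500 ms) historian samples, invented for illustration.
idx = pd.date_range("2024-03-01 06:00", periods=7200, freq="500ms")
temps = pd.Series(range(7200), index=idx, name="reactor_temp", dtype=float)

# Roll the millisecond-resolution series up to 15-minute BI-friendly rows.
summary = temps.resample("15min").agg(["mean", "min", "max"])
print(summary)
```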

The goal is to have a resource for information that people can easily access and use. The new challenge is integrating silos of dissimilar data and transforming it into useful information. Having a plan to link systems and provide a framework that allows teams to combine isolated “islands of information” (including building management, plant utilities, process control, production, and business data) lets users view and analyze disparate data in a variety of ways, leveraging their standard automation infrastructure.

Developing a master plan with an integrated way for users to leverage data from multiple systems (e.g., historians, business intelligence, MES, LIMS) will achieve the goal. The solution should connect to other databases and systems to access data elsewhere without replicating the master data. Standard interfaces to other data sources provide the capability to bring in data from other sources and monitoring tools.

In addition, it is an advantage to allow access to this information via the Web using handheld devices as well as laptops and desktop PCs. One user told me his vision is that users should be only “two clicks away” from the data when browsing with a tool. He wants to make the data easily available to the right people through Web browsers, with proper access control, over the company intranet. This will provide links to knowledge management tools and contextualized data from various applications (ERP, LIMS, QTS, historians, etc.). More accessible data will be used to improve operations; examples include data for analytics colleagues to build models and relevant data for continuous improvement champions.

Ad hoc reporting is an important function of such a system, enabling people to focus on specific issues and investigations. The solution should use commercial off-the-shelf technology to benefit from widely accepted industry standards, ensuring timely migration to the latest technology while avoiding the large expense of custom coding and maintenance.

Cooperation

Knowledge is power if you harness it and make it available to make better decisions and improve automation. Doing this takes cooperation between people in the silos of your organization. History has taught that increasing access to information enables people to gain insights and make improvements. Knowledge is dynamic, so the systems need to be responsive to users and extendable over time. Having sound data is only the starting point; it takes knowledgeable and skilled people to use it. The technology implemented should have the goal of lowering the time users need to gain insights.

To most people outside of a few early adopters, technology by itself has negative value: most people are afraid of change, so it is important to educate users. Meanwhile, the tremendous growth in IT investment is creating a range of off-the-shelf software for tapping data sources, analyzing big data, and closing the loop to optimize business operations and processes, including manufacturing. The overall goal of closing the entire loop from business operations through manufacturing, enabled by the Internet of Things, may well be the next force driving the integration of IT and automation.

The plant historian's role is to serve as the single location for capturing and storing large amounts of real-time data. Big data concepts are knitting together silos of data more holistically to improve business operations. In the manufacturing and process industries, the plant historian is an important data source, along with distributed data located in automation controllers and devices.

About the Author
Bill Lydon is an automation industry expert, author, and journalist who formerly served as chief editor of InTech magazine. Lydon has been active in manufacturing automation for more than 25 years. He started his career as a designer of computer-based machine tool controls; in other positions, he applied programmable logic controllers and process control technology. In addition to experience at various large companies, he co-founded and was president of a venture-capital-funded industrial automation software company. Lydon believes the success factors in manufacturing are changing, making it imperative to apply automation as a strategic tool to compete.


A version of this article was also published in InTech magazine.
