ISA Interchange


Leveraging Big Data at Manufacturing Plants in Real Time

 

This post was written by Jim Petrusich, vice president of global sales at Northwest Analytics.

 

Most manufacturing plants hold weekly or monthly production meetings where they review productivity and waste. Obviously, the reports do not let them change what has happened in the past, yet in most cases, they also provide little insight into how to change the future. Many are stuck in a paradigm of "inspect and reject."

This post focuses on companies that have switched their paradigm from "inspect and reject" to "predict and prevent" using a holistic approach to real-time analytics.

"A database is not a vault; it is a garden," says Rena Marie Pacella of Popular Science

 

Too often, databases are treated like vaults. We store information in them and draw it out when needed. But databases can be more like gardens, where, if cultivated correctly, they can produce valuable, actionable information when needed and on their own. This approach has tremendous benefits, but, historically, real-time monitoring tools lacked the intelligence to leverage the potential of the data.

Companies have been using real-time dashboards to monitor their plants for years, but in most cases these rely on simple trend charts and alarm only on fixed limits. If information is monitored by analytics at all, it usually comes from a single silo of information in an individual database. This is one of the big obstacles for manufacturers, who frequently have many individual databases that end up as a series of vaults rather than a single, cultivated garden.

Manufacturing intelligence software is designed to address this. Successful implementations are only partially about the technology, however. Great gardens do not create themselves; we also have to work through our organizations to set up effective implementations. So this article will also focus on three keys to developing effective real-time analytic solutions.

Real-time alarms

Many plant operators are flooded with alarms on production line consoles or control room displays. Frequently, these signals are simply ignored. According to Yiqun Ying, senior staff process control engineer at Husky's Prince George refinery, "Some alarms don't mean anything, and the operator doesn't have time to respond. They just hit the keyboard and acknowledge it, and sometimes hit suppress." Other companies tell similar stories. Before DuPont began to overhaul its alarm management system, it sometimes had 150,000 alarms going off per week.

If databases are like gardens, then these gardens are full of weeds. Showing excessive alarms to operators can be detrimental for two reasons. First, operators may see something and take action when there is nothing wrong; people are amazingly good at finding meaning in data where no meaning really exists. With the new world of "big data," we are filling our plants with more and more sensors and capturing far more data, so we should expect to find more and more false positives.

Second, placing more and more alarms in front of operators makes them more likely to overlook critical alarms in the sea of noise. Knowing which alarms to pay attention to when 150,000 are going off per week is quite a challenge. We need to filter out the noise and focus on significance.

Big data and manufacturing intelligence

Manufacturing plants are used to working with many specialized databases: manufacturing execution systems/manufacturing operations management, distributed control systems, enterprise resource planning (ERP), quality systems, lab information management systems, historians to capture process data, and more. Manufacturing intelligence software allows companies to analyze their most important parameters, no matter where the data is located. The traditional method for doing this was to copy data from each of the individual databases into one of the other databases. Frequently people chose the ERP system or historian.

This method has several challenges. For one, replicated data tends to deviate over time, as corrections and adjustments end up creating two versions of the data. Additionally, these are not trivial projects and usually take years to implement. By the time the data is ready to be analyzed, the data analysts may have already retired . . . and not just for the day. Finally, these databases store information in very different formats. Whether a company has five, 50, or 500 plants, no two seem to be at the same level of automation, measure the same variables, or even make the same products. So the projects frequently become colossal.

Today's technology allows companies to analyze the data in real time without copying it into yet another database. The software polls the appropriate data, runs calculations, and produces real-time analytic alarming. Using this technology, companies have achieved meaningful results very quickly.
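As a rough sketch of this pattern (poll the relevant sources where they live, run a calculation, and alarm only on a significant deviation), the fragment below uses hypothetical connector functions, read_historian and read_lims, in place of any particular product's adapters; the tag names and 3-sigma rule are illustrative assumptions.

```python
# Minimal sketch of "no-copy" real-time analytic alarming: poll each source
# in place, run a calculation, and alarm only on a significant deviation.
# The connectors and tag names below are hypothetical stand-ins, not a
# specific product's API.

def read_historian(tag: str) -> list[float]:
    """Hypothetical: return recent values for a process tag from a historian."""
    raise NotImplementedError

def read_lims(test: str) -> list[float]:
    """Hypothetical: return recent lab results from a LIMS."""
    raise NotImplementedError

SOURCES = {
    "reactor_temp": read_historian,
    "product_purity": read_lims,
}

def poll_and_alarm(baseline: dict[str, tuple[float, float]]) -> list[str]:
    """Poll each source and flag values outside a 3-sigma band around baseline."""
    alarms = []
    for name, reader in SOURCES.items():
        values = reader(name)
        if not values:
            continue
        mean, sigma = baseline[name]        # historical mean and standard deviation
        latest = values[-1]
        if abs(latest - mean) > 3 * sigma:  # alarm only on a significant deviation
            alarms.append(f"{name}: {latest:.2f} outside the 3-sigma band")
    return alarms
```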

Dow Chemical has been rolling out these types of analytics systems and achieving impressive payback. Lloyd Colegrove, who directs the Analytical Technology group at Dow, stated, "This is very cost effective, because we are already generating the data. We are already spending the money on the infrastructure for the data. Why would we spend all that money and not look at the data in a new and better way?"

Creating successful projects also requires thoughtful planning and implementation. Below are three key methods that Dow and other companies are using to set up these systems. The project described was the initial implementation in one of Dow's largest business units.

Three keys to successful solutions

Create a parameter-centric model. When Dow formed a team of experts at the beginning of its project, the team's first task was to limit the parameters they would track. They wanted to focus on the vital few that had the biggest impact on product quality and plant efficiency. This meant they would alarm on perhaps 35 to 50 parameters, rather than the thousands being tracked across all the databases. This was not an easy task, as they had to debate and agree on each one. Yet they found that when the team finished this step, the business unit had already benefited: even before deploying any technology, everyone could see the most important variables to pay attention to.
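One simple way to picture such a parameter-centric model is as a small registry in which each vital parameter carries its own source system, alarm rule, and owner. The entries below are purely illustrative, not Dow's actual parameter list.

```python
# Illustrative parameter-centric model: a handful of vital parameters, each
# with its source system, alarm rule, and owning group. Names are hypothetical.

VITAL_PARAMETERS = [
    {"name": "reactor_temp",   "source": "historian", "rule": "spc",  "owner": "operations"},
    {"name": "product_purity", "source": "lims",      "rule": "spec", "owner": "quality"},
    {"name": "feed_rate",      "source": "dcs",       "rule": "spc",  "owner": "operations"},
]
```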

With this list in hand, they next focused on how they would define real time. One of the members of the team described his perspective in three categories: First, there is short-term transactional data that is usually the focus of operators. Next, there is medium-term tactical data, which might be used more by supervisors and management. Finally, there is longer-term strategic data. Each real-time context has important value, so the Dow team decided to measure the same parameters over multiple definitions of real time.

What they found was that alarms triggered in the shorter-term transactional or tactical dashboards let them quickly address real issues, while longer-term drifts in the process showed up only in the long-term strategic dashboard. Each of these dashboards added context that provided a better view than had ever been achieved before.
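A minimal sketch of what measuring the same parameter over multiple definitions of real time can look like, assuming a 5-minute sampling interval; the window lengths are illustrative assumptions, not Dow's actual settings.

```python
# Track one parameter over several rolling windows, one per definition of
# "real time." Window sizes assume a 5-minute sampling interval.

from collections import deque
from statistics import mean

WINDOWS = {
    "transactional": 12,    # roughly the last hour
    "tactical": 288,        # roughly the last day
    "strategic": 2016,      # roughly the last week
}

buffers = {name: deque(maxlen=size) for name, size in WINDOWS.items()}

def update(value: float) -> dict[str, float]:
    """Add a new reading and return the rolling average at each time horizon."""
    summaries = {}
    for name, buf in buffers.items():
        buf.append(value)
        summaries[name] = mean(buf)
    return summaries
```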

Leverage the protocol plan. One of the objectives of the project was to work toward replacing the existing protocol plan. The company followed a set of procedures, maintained in an Excel spreadsheet, for each situation that might occur in the plant. Over time this spreadsheet became complex, yet only one person maintained it. Most people felt that nobody else would be able to modify or maintain the spreadsheet if this individual ever left the company.

The plan was to convert the protocol plan into new, configurable off-the-shelf software that others could maintain. Yet they also realized that they had only one protocol plan. Now that they had multiple definitions of real time, they found there was no protocol plan in place for longer-term drifts in the process, so the team began creating a new protocol plan to address this.

Finally, they wanted a plan to leverage the assignable cause/corrective action (AC/CA) data stored in the dashboard system. When the team met each month, they reviewed how the new protocols were performing compared to the old ones, and the dashboards gave them the guidance needed to make adjustments. If operators had not followed the protocols, the team could review that as well, because the AC/CA data showed exactly what action was taken in each situation. In some cases, operators chose not to follow the protocol because they believed they had better knowledge of the situation.
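As a rough sketch of how AC/CA data might be captured alongside each alarm so that protocol deviations can be reviewed later, the fragment below uses hypothetical field names rather than the actual dashboard schema.

```python
# Illustrative AC/CA logging: one record per alarm response, so a monthly
# review can compare operator actions against the protocol. Field names
# are hypothetical, not a vendor schema.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ACCARecord:
    alarm_id: str
    assignable_cause: str
    corrective_action: str
    followed_protocol: bool
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

acca_log: list[ACCARecord] = []

def record_action(alarm_id: str, cause: str, action: str, followed_protocol: bool) -> None:
    """Append an AC/CA entry; the monthly review can filter on followed_protocol."""
    acca_log.append(ACCARecord(alarm_id, cause, action, followed_protocol))
```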

When operators deviated from the protocol in this way, the team could review the actions. If the actions outperformed the protocol, those situations became candidates for new protocols. If the actions underperformed the protocol, the dashboards provided the training data to show operators where following the protocol would have done better. Using these methods, the plant can continuously improve over time.

Reduce the noise. Dow had already taken a major step toward reducing noise by limiting the parameters to the vital few, which let the team focus on the most important information without being distracted. Yet not all parameters were treated equally. In some cases, alarms were triggered by specific pattern rules, while others were triggered by statistical process control (SPC) violations or specification limits. Only actionable alarms were shown to the operators, and there was a protocol for each alarm shown.
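As an illustration of the rule-based alarming just described, the sketch below fires only on common SPC run-rule violations rather than on every excursion. The specific rules (a point beyond 3 sigma, or eight consecutive points on one side of the mean) are textbook examples chosen for illustration, not necessarily the rules Dow used.

```python
# Alarm only on SPC run-rule violations, not on every excursion.
# The two rules below are common textbook examples.

def spc_alarms(values: list[float], mean: float, sigma: float) -> list[str]:
    """Return alarm messages for SPC violations in a series of readings."""
    alarms = []
    # Rule 1: the latest point falls outside the 3-sigma control limits.
    if values and abs(values[-1] - mean) > 3 * sigma:
        alarms.append("point beyond the 3-sigma control limit")
    # Rule 2: eight consecutive points on one side of the mean (a drift).
    last_eight = values[-8:]
    if len(last_eight) == 8 and (all(v > mean for v in last_eight)
                                 or all(v < mean for v in last_eight)):
        alarms.append("eight consecutive points on one side of the mean")
    return alarms
```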

Next, they grouped alarms by type or location, so that only one alarm might be visible for a whole category of sensors. If this grouped alarm went off, the operator could click on the indicator and drill into the specific location of the individual parameter. This eliminated the noise of having many tags on one screen while keeping the power of the system intact. Finally, they tuned the email and short message service (SMS) notification systems for alarms. This is an area that usually requires some adjustment over time, because too many electronic messages cause people to ignore the whole channel of communication.

The project in this plant was evaluated over one year. During this period, the plant had its longest catalyst run ever, both in time and in product generated, and the value was visible almost immediately. When the dashboard was first displayed to the team, one of the members asked whether the data was live or a prototype.

When he was told it was live data, he remarked that the plant was ignoring a key parameter. He emailed the plant immediately after the meeting. The next time the group met, everyone could see not only that the adjustment had been made right after the previous meeting, but also that the plant was more in control as a result. The dashboards also generated many conversations among the team members, because certain situations required actions that involved compromises. According to Mary Beth Seasholtz, senior data scientist at Dow Chemical, the business unit experienced more than just a new technology advantage. There was a culture change away from an "inspect and reject" (or, in their case, downgrade) paradigm to a collaborative process focused on using real-time analytic dashboards to predict and prevent events from occurring in the first place. This was the major win.

By using technology and doing careful planning, Dow developed a powerful manufacturing intelligence system in one of its largest plants. Yet one final observation should be noted. The team that was formed was essential to the success of the project. In some cases, projects initiated within a plant can achieve success.

Yet without fully engaging the right resources, projects can lose their sponsorship and funding and never reach their full potential. Similarly, projects pushed down from headquarters can also miss the mark: without engaging plant expertise, the project may fall flat. Most importantly, the combined perspective of a broad core group of individuals, each able to contribute key insights to the process, will most likely produce the most effective solution. The team's motto: "Where data is working for us, and we are not working for it."

 

About the Author
This post was written by Jim Petrusich, vice president of global sales at Northwest Analytics and a frequent contributor to the global discussion of real-time analytic monitoring systems. He has worked on implementing real-time systems in manufacturing, transportation, energy, and defense projects in more than 40 countries.

 


 

A version of this article was also published in InTech magazine.

 

