
How to Improve Quality with Industry 4.0

Introduction

Along with increases in efficiency, higher quality levels are a staple of the Industry 4.0 (I4.0) and smart manufacturing movement. However, unlike the steady progression seen through the previous three industrial revolutions, driven by tighter tolerances and better inspections, the methods brought on by the Fourth Industrial Revolution either supercharge previous quality control (QC) and quality assurance (QA) methods or introduce completely new ones.

The Fourth Industrial Revolution is widely considered an information revolution, and the data made available and accessible by modern technologies and connectivity can be leveraged to significantly improve the quality of not only in-process products but delivered products as well.

Connecting the Lab to the Floor

The laboratory has always been seen as the be-all and end-all of quality, whether in the QC or QA lab. The lab could perform tests far more advanced than anything available in the process environment, and the data from these tests were used to identify potential process issues or to catch quality issues in the final product and prevent its release. In the case of process monitoring, the information gained from these tests was used to evaluate a specific part of the process (e.g., oil monitoring applications) or the condition of the manufacturing process at a certain stage. In the case of final product testing, these tests ensured that any product leaving the facility was free from defects and within specification limits. Industry 4.0 brings two major advances that significantly impact this area: advanced sensors and full connectivity.

For sensor availability, many of the underlying technologies have been around for years, but only recently has processing speed made them viable. For instance, online bioburden analyzers use Mie scattering, which traces its roots to the early 20th century, and laser-induced fluorescence (LIF), which was discovered in the late 1960s. Recent advances in processing power and optics have finally allowed the measurement to be performed fast enough to be useful in process. These sensors can identify single organisms in water in real time, 24/7, with results updated second by second. The equivalent lab test? Growing samples in a petri dish, which can take 7 to 21 days to provide results.

Other technologies, especially spectral analysis, have been used successfully in process for decades, but they are getting a massive boost from the ability to link process readings to laboratory tests. Many spectral measurements rely on models built by comparing instrument readings against known reference values. Working both for and against this type of measurement is its sensitivity: while it can detect minute changes in the process, it can also be influenced by changes in the instrument, raw material source changes, and other variables the measurement should ignore (though some models may be tuned to look for these as well). Because of this, the measurement model requires regular maintenance in which the inline reading is compared against a laboratory measurement. In a fully connected system where the inline process analyzer is connected to the laboratory information management system (LIMS), this maintenance can be automated by continuously feeding lab data back into the analyzer to keep the model current.
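
As a rough illustration of that automated feedback loop, the minimal sketch below maintains a simple slope-and-offset correction for an inline analyzer using paired lab results. It assumes a hypothetical LIMS feed delivers (inline reading, lab value) pairs; the class and variable names are made up for illustration, not any vendor's interface.

```python
# Minimal sketch of automated calibration maintenance for an inline analyzer.
# Assumes paired records (inline reading, lab reference value) arrive from a
# hypothetical LIMS feed; names and structure are illustrative.
import numpy as np

class CalibrationModel:
    """Simple slope/offset correction maintained against lab results."""

    def __init__(self, window=50):
        self.window = window          # number of recent lab pairs to keep
        self.inline = []              # analyzer readings
        self.lab = []                 # matching LIMS reference values
        self.slope, self.offset = 1.0, 0.0

    def add_lab_result(self, inline_reading, lab_value):
        """Store a new paired sample and refit the correction."""
        self.inline.append(inline_reading)
        self.lab.append(lab_value)
        # keep only the most recent pairs so the model tracks drift
        self.inline = self.inline[-self.window:]
        self.lab = self.lab[-self.window:]
        if len(self.inline) >= 2:
            # least-squares fit: lab ≈ slope * inline + offset
            self.slope, self.offset = np.polyfit(self.inline, self.lab, 1)

    def correct(self, inline_reading):
        """Translate a raw inline reading into a lab-equivalent value."""
        return self.slope * inline_reading + self.offset

# Usage: each time the LIMS releases a result, feed it back automatically.
model = CalibrationModel()
model.add_lab_result(inline_reading=0.82, lab_value=0.79)
model.add_lab_result(inline_reading=0.91, lab_value=0.87)
print(model.correct(0.85))   # drift-corrected estimate
```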

This constant connection between the laboratory and the process also opens the door to an even more interesting measurement method: virtual sensors. Much like spectral models, where a correlation is established between the field measurement and the laboratory result, virtual sensors are based on statistical models and artificial intelligence (AI). Process data is recorded and compared against readings from the lab to find correlations between the two. Once this model has been built, laboratory-type measurements can be inferred from one or many sensors in a timeframe much closer to real time than waiting for lab results.
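
A minimal sketch of that idea follows, assuming a few routine process variables (temperature, pressure, flow) and matching lab results; the numbers and names are made up for illustration, and a real soft sensor would be built on far more history with proper validation and periodic retraining.

```python
# Minimal sketch of a "virtual sensor": a regression model that infers a
# lab-style quality measurement from routine process data. The variables and
# synthetic data below are illustrative assumptions, not a real plant.
import numpy as np
from sklearn.linear_model import Ridge

# Historical process data: one row per sample
# (e.g., temperature, pressure, flow), aligned with the lab result for that sample.
X_hist = np.array([
    [72.1, 1.30, 14.2],
    [74.8, 1.28, 13.9],
    [71.5, 1.35, 14.6],
    [73.0, 1.31, 14.1],
])
y_lab = np.array([0.92, 0.88, 0.95, 0.91])   # lab-measured quality attribute

# Fit the statistical model that links process conditions to the lab value.
soft_sensor = Ridge(alpha=1.0).fit(X_hist, y_lab)

# In operation, current sensor readings give a near-real-time quality estimate
# long before the next physical lab sample is taken and analyzed.
current_reading = np.array([[73.4, 1.29, 14.0]])
print(soft_sensor.predict(current_reading))
```

The structure is the same regardless of how sophisticated the model becomes: historical process data and lab results in, an inferred lab-style value out.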

If these measurements can be done inline, does this mean the end of the laboratory? No; quite the opposite, in fact. These new or enhanced sensors rely on laboratory measurements as their foundation. Without lab tests, virtual sensors will not have the data needed to be built, and their models will degrade over time. Further, this new role for the lab will require highly skilled individuals who can provide the high-quality data these measurements demand. With the right tools and the right people, facilities will be able to get the quality they expect from their laboratories in the timeframe they expect from their process sensors, allowing them to predict quality issues before they happen.

Real-World Feedback

Connecting the lab and the process is a major step forward, but it could be argued that this is more of an evolution than a revolution. Measurements in the lab have always been compared to the process and vice versa. If that is the evolution, then the revolution is occurring after the product has left the facility.

Many of the pillar technologies of smart manufacturing, specifically better sensors, better communications, and better data handling and analysis, are allowing organizations to extend their QA/QC reach past the shipping dock and follow the product long after delivery. While this may not be practical for all products, larger, more complicated ones are prime candidates for this new capability and offer significant advantages. This capability is being leveraged in many ways; two of the biggest are the ability to automatically correct quality issues in the field and the ability to reduce downtime proactively.

Regularly improving a product through updates has been a staple of the software industry for years. In the physical media age, an update only came with the next edition or, if the issue was big enough, through a patch on a physical disk. As computers have become more connected, updates have become far more common and less visible; many times, users are unaware of updates that have patched a system or added a small new feature. This constant attention allows the software company not only to continually address quality issues but also to increase the value of its product over time.

With connectivity extending to equipment and products via the Internet of Things (IoT) and the Industrial Internet of Things (IIoT), this functionality is beginning to cross over into the physical world. Tesla is probably the best-known example, with over-the-air (OTA) updates adding new functions and addressing issues on a regular basis. However, even smaller systems like the Nest Learning Thermostat use this capability to increase efficiency and provide feedback on how the system is used.
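
In rough outline, the OTA pattern looks something like the sketch below: a connected device periodically checks an update service and installs a newer firmware image if one is available. The endpoint, payload fields, and apply_firmware step are hypothetical placeholders, not any vendor's actual API.

```python
# Minimal sketch of an over-the-air (OTA) update check. The URL, manifest
# fields, and firmware step are illustrative assumptions only.
import time
import requests

UPDATE_URL = "https://updates.example.com/device-model-x/latest"  # hypothetical
CURRENT_VERSION = "1.4.2"

def apply_firmware(image_url: str) -> None:
    # Placeholder: a real device would download the image, verify its
    # signature, write it to a standby partition, and reboot into it.
    print(f"Installing firmware from {image_url}")

def check_for_update() -> None:
    manifest = requests.get(UPDATE_URL, timeout=10).json()
    if manifest["version"] != CURRENT_VERSION:
        apply_firmware(manifest["image_url"])

while True:
    check_for_update()
    time.sleep(24 * 60 * 60)   # check once a day
```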

One of the best examples of leveraging smart manufacturing to reduce downtime can be found at Rolls-Royce, which streams data from its aircraft engines to a central point where the information is analyzed for potential issues or anomalies. Any issues found are presented to the operator, who can schedule maintenance at a time that minimizes engine-related downtime. Further, because Rolls-Royce can identify the source of the issue, it can ensure that the parts required to fix the engine are available where and when they are needed. This means not only that flying is safer, thanks to the right maintenance being performed at the right time, but also that more aircraft are available to fly.
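
Stripped of vendor specifics, the underlying pattern is to watch each streamed signal for departures from its own recent behavior and flag them for maintenance planning. The sketch below uses a simple rolling z-score check as a stand-in for the far more sophisticated analytics such systems employ; the signal name and thresholds are assumptions for illustration, not a description of Rolls-Royce's actual system.

```python
# Minimal sketch of streaming anomaly detection: flag readings that deviate
# strongly from recent history so maintenance can be planned proactively.
from collections import deque
import statistics

class AnomalyMonitor:
    def __init__(self, window=100, threshold=3.0):
        self.history = deque(maxlen=window)   # recent readings for this signal
        self.threshold = threshold            # std deviations counted as anomalous

    def ingest(self, value):
        """Return True if the new reading looks anomalous versus recent history."""
        anomalous = False
        if len(self.history) >= 10:
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history)
            if stdev > 0 and abs(value - mean) > self.threshold * stdev:
                anomalous = True
        self.history.append(value)
        return anomalous

# Usage: one monitor per streamed signal (e.g., an exhaust temperature channel)
egt_monitor = AnomalyMonitor()
for reading in [612, 615, 611, 613, 610, 614, 612, 611, 613, 612, 655]:
    if egt_monitor.ingest(reading):
        print(f"Flag reading {reading} for maintenance review")
```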

Summary

One of the most interesting aspects of smart manufacturing is how large a role creativity plays; what can be done is limited by the imagination of the individual or organization rather than by the technology. Technology is no longer the barrier.

There are few situations where technical capability is the limiting factor, and quality is no different. Every day, new approaches leverage modern technologies to improve product reliability and the end-user experience in ways that were never possible before, and the factory walls no longer define the boundary.

Ryan Kershaw
Ryan Kershaw is a Senior Member of ISA and holds the Certified Automation Professional (CAP) designation. Ryan works with Litmus Automation and is part of the Smart Manufacturing and IIoT Division within ISA, where he works with the Industry Maturity and Readiness Committee. Ryan lives just outside of Toronto, Canada, with his wife, three kids, and his dog, and like many Canadians, uses his love of hockey to get through the winters.
