
Edge Analytics Can Help When Cloud Computing and Big Data Fall Short

 

This post was authored by Andrew Hopkins, managing director of Accenture Mobility, and Brian Irwin, head of Accenture’s Industrial Group in North America.

 

Traditional industrial analytics brings gains in operating efficiency, machine uptime, and risk and hazard mitigation. But what if collecting and crunching millions of data points through a big data solution is too slow, too costly, or even impossible? Big data analytics in manufacturing has become a well-established concept: collect data from connected products, machines, factory lines, and entire plant functions, and enable businesses to gain immediate insight into their operations so they can improve efficiency, quality, or safety. Adopting "edge analytics" (technology that analyzes data at the "edge" of a network, without sending it back to the cloud at "the core") can significantly further that goal.

Large-scale big data analytics deployments can be useful, but they are still few and far between. That is partly because collecting and crunching all that data is a challenge. Big data requires industrial equipment to be connected. It requires the data from these machines to be collected and brought together for analysis. And it requires a lot of computing power to run the analyses, especially if sophisticated routines like machine learning are involved. As a result, almost all state-of-the-art big data solutions use high-bandwidth Internet connections and the cloud: two technologies that can be costly and are not always the best way to achieve the desired result.

 

Consider, for example, a large network of gas sensors in a huge chemical production plant, installed to collect data that might help predict critical equipment failures. In most cases, the data points collected by these sensors will not indicate any failure that must be resolved; and yet, with big data technology, 100 percent of the data is streamed to, and analyzed by, expensive network, cloud, and big data infrastructure.

Or consider large underground mining equipment: operational analysis is an obvious use case for data analytics, with applications ranging from early warning of minor maintenance issues to predictions of critical system failures. Unfortunately, high-bandwidth Internet does not always exist in a mining setting, so most of the data generated by the machine cannot be streamed to a cloud and analyzed in real time.

In manufacturing environments, by contrast, the challenge may have more to do with latency than connectivity. Breakdowns in automated manufacturing processes (particularly for process manufacturers) can cause significant downtime and lost production. In these situations, any latency in collecting and analyzing data and "triggering" actions can be very expensive. Here, solutions that deliver near-real-time warnings to machines and operators, or even preventive machine shutdowns, can reduce risks and costs far better than cloud-based solutions prone to latency.
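
To make the latency argument concrete, here is a minimal sketch, assuming a hypothetical machine guard running entirely on an edge device. The sensor, threshold, and function names are invented stand-ins, not a real vendor API; the point is that the reaction time is bounded by the local polling interval rather than a network round trip.

```python
# Minimal sketch of a latency-critical edge rule; all names and
# thresholds are hypothetical, for illustration only.
import random
import time

VIBRATION_LIMIT_MM_S = 7.1  # hypothetical trip point


def read_vibration_sensor() -> float:
    """Stand-in for a local sensor read; replace with real I/O."""
    return random.uniform(2.0, 9.0)


def shut_down_machine() -> None:
    """Stand-in for a local actuator command; no cloud round trip."""
    print("Vibration limit exceeded: issuing local shutdown.")


def edge_guard(poll_interval_s: float = 0.1, cycles: int = 50) -> None:
    # The decision loop runs entirely on the edge device, so the
    # worst-case reaction time is one poll interval, not a network
    # round trip to a remote cloud service.
    for _ in range(cycles):
        if read_vibration_sensor() > VIBRATION_LIMIT_MM_S:
            shut_down_machine()
            break
        time.sleep(poll_interval_s)


if __name__ == "__main__":
    edge_guard()
```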

Gaining an edge

Similar issues exist in numerous other industries: in farming and mining, for example, at industrial sites in remote areas with low-bandwidth Internet connections, or in plants and factories with highly complex machines and operations. This is why networking hardware vendors such as Cisco, Dell, HPE, and Intel have found ways to use the analytics capabilities built into devices like network routers and switches, which operate very near, or even next to, the data-generating equipment. Highly specialized analytics firms like the U.S. startup Lone Star Analysis have refined this approach and improved the analytics capabilities, so that these devices can analyze the data coming from the equipment and derive decisions and instructions from it in near real time with relatively limited computing power.

It is an approach that not only solves many infrastructure and efficiency challenges associated with cloud-based big data analytics by generating faster analytical results, but also enables decentralized decision making. These two benefits can help address key operational challenges.

Faster analytics, for instance, benefits a wide range of industrial situations that require real-time insights. One example is enabling "self-optimizing machines," which cannot be realized with slower, cloud-based big data analytics. Imagine a process manufacturing line that attempts to self-optimize for the highest output with minimal wear through cloud-based big data analytics: the time lag between data generation and interpretation, caused by bandwidth limits and network latency, would very likely produce inaccurate measurements and ill-fitting instructions.
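
As an illustration only, the toy loop below shows what closing such an optimization loop locally might look like. The plant model, gain values, and constants are invented for the sketch; a real line would read its own sensors and actuators.

```python
# Toy self-optimizing loop for a hypothetical line whose output
# and wear both rise with speed; constants are illustrative.


def run_line(speed: float) -> tuple[float, float]:
    """Hypothetical plant model: returns (output, wear) for a speed."""
    output = 100.0 * speed
    wear = 0.5 * speed ** 2
    return output, wear


def self_optimize(target_output: float, steps: int = 50) -> float:
    speed = 1.0
    for _ in range(steps):
        output, wear = run_line(speed)
        # Local proportional correction each cycle: because the loop
        # closes on the device, there is no network lag between
        # measurement and adjustment.
        error = target_output - output
        speed += 0.001 * error - 0.01 * wear
    return speed


if __name__ == "__main__":
    print(f"Converged speed: {self_optimize(target_output=250.0):.3f}")
```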

Decentralized decision making, on the other hand, is an excellent way to mitigate some of the risks that come with "smart" equipment. Imagine 1,500 oil pumps controlled from the cloud. If their connection to the cloud failed, or if the cloud ever ran into what IT experts call a "disaster," those 1,500 "smart" pumps would become ineffective in a matter of seconds and remain that way until the infrastructure was back up and running, very likely affecting safety and productivity in significant ways.
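
A minimal sketch of the fallback behavior that decentralized decision making enables, assuming a hypothetical pump controller that keeps its last known-good setpoint when the cloud link drops:

```python
# Hypothetical pump controller with a local fallback; the cloud
# call is simulated as permanently unreachable for illustration.
import time


class PumpController:
    def __init__(self, default_setpoint: float) -> None:
        self.setpoint = default_setpoint

    def fetch_cloud_setpoint(self) -> float | None:
        """Stand-in for a cloud call; returns None when unreachable."""
        return None  # simulate a lost connection

    def apply(self, setpoint: float) -> None:
        """Stand-in for the local actuator command."""
        print(f"Pump running at setpoint {setpoint:.1f}")

    def control_cycle(self) -> None:
        new_setpoint = self.fetch_cloud_setpoint()
        if new_setpoint is not None:
            self.setpoint = new_setpoint
        # On cloud failure the pump does not stall: it keeps running
        # on the last known-good setpoint held locally.
        self.apply(self.setpoint)


if __name__ == "__main__":
    controller = PumpController(default_setpoint=42.0)
    for _ in range(3):
        controller.control_cycle()
        time.sleep(0.1)
```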

Studies show that big data analytics in the cloud typically is not fast enough for many Industrial Internet of Things (IIoT) uses. Whether an analysis can be considered "real time" depends greatly on the underlying use, and in many cases machines need results fast. In a survey of 203 Internet of Things professionals, research firm Dimensional Research found that many experts consider the "timely collection, processing, and analysis of data" the chief technology challenge of IIoT implementation; 92 percent of those surveyed said they simply could not capture data fast enough. And when asked about the business impact of "better" analytics, 86 percent of respondents said they believed faster and more flexible analytics would increase the return on their IIoT investments.

Benefits for manufacturers

Performing sophisticated analysis at the edge means that targeted, condition- or prediction-based actions can be triggered at the level of machines, components, or even individual parts, with very short response times and without high-performance network and cloud infrastructure. This makes analytics much more efficient, much faster, and therefore much more effective.

By acquiring, monitoring, and interpreting data at the component level, edge analytics can identify a cause before its effect materializes, enabling earlier and more specific reactions. Rather than identifying and analyzing an effect (excessive motor bearing vibration, for example), edge algorithms identify and act on root causes at a more granular level: for example, a voltage leak that causes a bearing temperature spike, degrades the other bearings, and ultimately produces the vibration.
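
As a rough illustration, a cause-level edge rule might check the upstream signal before the downstream symptom. The signal names and thresholds below are hypothetical, chosen only to mirror the voltage-leak example above:

```python
# Illustrative cause-level diagnosis: inspect the upstream cause
# (shaft voltage) before the downstream effect (vibration).
# All thresholds are hypothetical.


def diagnose(shaft_voltage_v: float, bearing_temp_c: float,
             vibration_mm_s: float) -> str:
    if shaft_voltage_v > 0.5:
        # Upstream cause caught before the downstream effect
        # has fully materialized.
        return "voltage leak: schedule insulation repair"
    if bearing_temp_c > 85.0:
        return "bearing overheating: inspect lubrication"
    if vibration_mm_s > 7.1:
        return "vibration detected: damage may already exist"
    return "healthy"


if __name__ == "__main__":
    print(diagnose(shaft_voltage_v=0.8, bearing_temp_c=70.0,
                   vibration_mm_s=2.0))
```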

All of this can be of tremendous value to industrial businesses. They can use edge condition monitoring and predictive analytics to improve equipment uptime far more effectively and efficiently than they could with big data analytics alone. They can significantly reduce maintenance costs and planned downtime by scheduling maintenance predictively and giving maintenance experts granular, machine-specific status insights. Or they can package all of these capabilities into service offerings and begin to build entirely new business models around them.

Edge analytics models can be tailored to the requirements of an individual device or system. This might mean reading sensors directly associated with certain components, or inferring results from known and validated calculations. The right sensor package for a piece of equipment is guided by the organization's desired business value: the model can define how an asset or system should be configured to achieve a business goal at minimum cost.

Putting edge analytics to work

Edge analytics cannot, however, do everything. In particular, it cannot replace big data and cloud computing when it comes to storing, analyzing, and interpreting vast data sets or running resource-intensive technologies like machine learning. Businesses that plan to use edge analytics to their advantage should therefore think of the two technologies not as an either-or choice, but as complements.

In fact, both unlock the highest possible business value when used together. When integrated with a big data analytics cloud, edge technology brings precise insights and improvements on the component or machine level while relying on the cloud to do the same at the "collective" level. In addition, edge technology can reduce the volume of data sent to the cloud, while improving data quality. This makes for more efficient and effective analytics on all levels of a business operation.

The way this works has much to do with the data models and analytics algorithms used in edge analytics. Because the computing power of routers and switches is nowhere near that of even a single server, let alone a cloud, most edge analytics solutions use highly efficient models (very different from big data models) and execute them through algorithms that need relatively little computing power.

These models are usually built to solve very specific analytics tasks or deliver very specific outcomes, which limits the data they need to only what is necessary to run the model reliably. This is almost the opposite of big data, which analyzes large swathes of data until correlations or other patterns emerge. This "outcome-centered" approach is what unlocks the gains in data efficiency and quality: only data that is relevant and insightful is used and, if need be, pushed to the cloud.
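
A simple sketch of what that outcome-centered filtering could look like on an edge device, with invented limits and field names: routine readings are collapsed into one aggregate, and only anomalous values travel to the cloud in raw form.

```python
# Hypothetical edge-side filter: forward raw values only for
# anomalies and summarize everything else, cutting upload volume.
from statistics import mean

ANOMALY_LIMIT = 90.0  # invented threshold for illustration


def filter_for_cloud(readings: list[float]) -> dict:
    anomalies = [r for r in readings if r > ANOMALY_LIMIT]
    return {
        "count": len(readings),
        "mean": mean(readings),
        "anomalies": anomalies,  # only these go to the cloud raw
    }


if __name__ == "__main__":
    raw = [72.0, 75.5, 91.2, 74.8, 95.0, 73.3]
    print(filter_for_cloud(raw))  # six readings become one small record
```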

To build these models in the first place, edge analytics teams usually need expert domain knowledge, supported by historical data and, in some cases, a cloud-based big data analytics resource to build, test, and optimize their statistics and algorithms.

New business models

Edge capabilities also can help shape new business models. Consider, for example, the potential impact of edge analytics on just-in-time parts management. A remote piece of equipment could self-monitor and analyze its condition and that of its subassemblies and individual components. An edge analytics model could then predict which components will fail and when, and assess that failure's effect on the equipment's operation.

This could then trigger part replacement notifications to the original equipment manufacturer (OEM), the equipment operator, and any third party, such as a dealer, that will handle the replacement. The OEM could then dispatch the replacement part to the dealer, who could, in turn, confirm a maintenance schedule with the operator. Most importantly, the timing of each stage in the process would be determined by the equipment's original time-to-failure prediction, laying the foundation for a new business model.
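
Purely as an illustration, the notification fan-out described above might look something like the sketch below; the parties, part identifier, and lead times are invented, not drawn from any real OEM workflow.

```python
# Hypothetical fan-out from an edge time-to-failure prediction to
# the OEM, the dealer, and the operator; lead times are assumed.
from datetime import datetime, timedelta


def schedule_replacement(part_id: str, hours_to_failure: float) -> list[str]:
    failure_at = datetime.now() + timedelta(hours=hours_to_failure)
    ship_by = failure_at - timedelta(hours=72)      # assumed shipping lead
    service_at = failure_at - timedelta(hours=24)   # assumed service buffer
    return [
        f"OEM: ship {part_id} by {ship_by:%Y-%m-%d %H:%M}",
        f"Dealer: schedule service for {service_at:%Y-%m-%d %H:%M}",
        f"Operator: expect downtime around {service_at:%Y-%m-%d %H:%M}",
    ]


if __name__ == "__main__":
    for message in schedule_replacement("bearing-7734", hours_to_failure=240.0):
        print(message)
```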

Build analytics capabilities for the future

The benefits of edge analytics' problem-focused approach will initially be seen at the edge of the network. That means optimization will first occur at the level of individual pieces of equipment. Realizing the wider benefits of edge capabilities across the whole organization will require analysis and optimization further up the chain.

Just-in-time parts management will, for example, maximize the uptime and utilization of each piece of equipment with edge capabilities. But optimizing maintenance scheduling and parts management across an entire fleet means aggregating all the individual equipment outputs and then applying more traditional forms of analytics. Most organizations will thus opt for a hybrid analytics approach, incorporating both edge and cloud capabilities, optimized for their requirements and circumstances.
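
A hedged sketch of that hybrid pattern, with hypothetical per-machine edge outputs aggregated cloud-side into a fleet-wide maintenance schedule:

```python
# Hypothetical hybrid step: edge devices report compact predictions,
# and the central side ranks machines due within a planning horizon.

edge_outputs = [
    {"machine": "pump-01", "hours_to_failure": 120.0},
    {"machine": "pump-02", "hours_to_failure": 960.0},
    {"machine": "pump-03", "hours_to_failure": 48.0},
]


def fleet_schedule(outputs: list[dict], horizon_h: float = 168.0) -> list[str]:
    # Cloud-side aggregation: pick machines whose predicted failure
    # falls inside the horizon so one visit can cover several of them.
    due = [o for o in outputs if o["hours_to_failure"] <= horizon_h]
    due.sort(key=lambda o: o["hours_to_failure"])
    return [o["machine"] for o in due]


if __name__ == "__main__":
    print(fleet_schedule(edge_outputs))  # ['pump-03', 'pump-01']
```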

But starting at the edge and working toward the center could be the fastest route to realizing material benefits for a business. Building and testing an edge model offers a predicted benefits statement that can be used to build a wider business case. In this way, edge capabilities can be used to kick-start a wider program. They can represent the first step in an organization's journey to capture the immense value that lies in the billions of connected devices set to join the Industrial Internet of Things today and in the future.

 

About the Author
Andrew D. Hopkins, managing director, Accenture Mobility, part of Accenture Digital, serves as the IoT lead for multiple industries, including industrial equipment, infrastructure and transportation, automotive, consumer goods and services, and retail. He is an IoT enthusiast who believes passionately that IoT will help address many of today’s most pressing social problems. IoT offers an opportunity to take action based on a wealth of new understanding in areas such as healthcare, the environment, safety, and the prevention of avoidable disasters.

 

About the Author
Brian Irwin leads Accenture's Industrial Group in North America, responsible for the company's automotive, construction, freight and logistics, industrial equipment, and public transport industry segments. He helps clients exploit the advantages being created by Industry X.0 and the digital disruption that is driving innovation and new products, services, and business models. Irwin has more than 20 years of consulting experience with industrial organizations in areas such as product development, operations, supply chain, logistics, and disruptive technologies. Prior to consulting, he worked for General Motors.

 


A version of this article was also published in InTech magazine.

 

