ISA Interchange

Welcome to the official blog of the International Society of Automation (ISA).

This blog covers numerous topics on industrial automation such as operations & management, continuous & batch processing, connectivity, manufacturing & machine control, and Industry 4.0.

What is Edge Architecture?

Why is “Edge Computing” Happening?

As modern Industry 4.0 trends take hold, the locations where data resides and where it is processed are increasingly distributed. In simpler times, some remote I/O, a PLC, and a historian server were considered modern. Today, those same functions can be performed by applications in the cloud, collecting data across a varied set of platforms and using components that buffer and process data at multiple levels.

There are many factors behind the change, but prime among them are the economies of scale and business benefits of cloud hosting. One application can now service multiple facilities (ensuring consistency), costs and resources can be scaled dynamically as the business grows, change management is simpler, and data from multiple facilities can be aggregated into a single location for business-wide analytics. At the same time, we expect the conveniences we experience in consumer electronics such as our smartphones: we want to access data from anywhere, and we expect systems to be 100% resilient against connectivity outages, without loss of data.

Accomplishing this requires a distributed computing model, where functions such as store-and-forward are performed at every level. Data often needs to be conditioned, or protocols translated, at various levels of the computing architecture as well. Finally, it can be advantageous to run not only embedded processing near the PLC, but entire applications, which are becoming quicker and easier to deploy.
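To make the store-and-forward idea concrete, here is a minimal Python sketch of an edge-side buffer that queues records while the upstream link is down and drains them in order once it returns. The `StoreAndForwardBuffer` class, its `publish` and `flush` methods, and the `send` callback are hypothetical names for illustration, not part of any specific platform.

```python
from collections import deque

class StoreAndForwardBuffer:
    """Hypothetical edge-side buffer: queues records locally when the
    upstream link is down and flushes them in order once it returns."""

    def __init__(self, send, max_size=10_000):
        self._send = send                     # callable that transmits one record upstream
        self._queue = deque(maxlen=max_size)  # bounded; oldest records drop first if full

    def publish(self, record):
        # Queue first, so nothing is lost if send() fails mid-call.
        self._queue.append(record)
        self.flush()

    def flush(self):
        # Drain in arrival order; stop at the first failure and
        # retry on the next publish() or flush() call.
        while self._queue:
            try:
                self._send(self._queue[0])
            except ConnectionError:
                return False                  # link still down; data stays buffered
            self._queue.popleft()
        return True
```

In a real deployment the `send` callback would wrap whatever transport the platform provides (an MQTT publish, an HTTPS POST, and so on), and the buffer would typically persist to disk rather than memory so data survives a device restart.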

Where is the Edge?

Distributed computing models and IIoT systems blur the lines between levels in the Purdue model. For example, a single VM host can run multiple virtualized instances of applications that perform I/O, control, and supervisory functions. Likewise, cloud-based functionality may reside in a public cloud (such as AWS) but can also be performed near the process, further confounding the layman.

A typical way to define the edge is to say “near the process that’s generating data.” But exactly where that is can vary based on the system or process. For that reason, it is best to think of “the edge” in relation to the distributed system as a whole. For example, a remote control center that’s monitoring a fleet of solar energy facilities may consider the entire solar facility to be the “edge” of the system’s architecture. In contrast, a cloud-based OEE system monitoring manufacturing processes would consider production units to be the “edge,” rather than the entire facility. The physical location of “the edge” is not important; it is more about identifying the sources of data and points of consumption.

Applying an Edge Architecture

Exactly how an edge architecture is applied varies widely. Some technology platforms include a web store of applications that can be loaded onto edge hardware and used to process data before transmitting it to a centralized data concentrator or cloud application. Other software ecosystems provide an “edge” flavor of the software, meant to run on lightweight hardware and perform core functionality during a cloud connectivity outage.
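As a rough illustration of the kind of preprocessing an edge application might perform before transmitting upstream, the sketch below condenses raw sensor readings into per-window summaries so that only aggregated data crosses the (possibly metered or intermittent) link to the cloud. The `summarize_window` function and its record layout are invented for this example, under the assumption that readings arrive as (timestamp, value) pairs.

```python
from statistics import mean

def summarize_window(readings, window=60):
    """Aggregate raw (timestamp_seconds, value) readings into one
    min/max/avg summary per fixed time window (hypothetical sketch)."""
    buckets = {}
    for ts, value in readings:
        # Integer-divide the timestamp to find which window it falls in.
        buckets.setdefault(ts // window, []).append(value)
    # Emit one summary record per window, in chronological order.
    return [
        {"window_start": key * window,
         "min": min(vals), "max": max(vals), "avg": mean(vals)}
        for key, vals in sorted(buckets.items())
    ]
```

For example, one minute of one-second readings collapses to a single record, cutting upstream traffic by roughly 60x while preserving the statistics the central application actually needs.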

Understanding the user requirements is key for an edge-cloud architecture, as is aligning with a global cloud and Industry 4.0 strategy. In general, cloud-edge architectures and deployments shouldn’t be considered on an isolated plant-by-plant basis. In certain industries such as life sciences, the collection and retention of data is a matter of regulatory compliance; fines or scrapped product can result if data is not handled properly and validated. In other cases, especially those performing control-related functions, it is critical that a user interface remain available at the edge even when other components are not. Choosing a technology platform that works for the entire organization, aligns with the larger strategy, and satisfies all regulatory and organizational requirements will ultimately reduce the system’s total cost of ownership and deliver a strategic benefit as the business grows into Industry 4.0.

Jacob Chapman
Jacob Chapman has a background in automation engineering, project management, account management, industrial networking, and ICS cybersecurity within the food and beverage, pharmaceutical, and energy generation sectors, among others. Jacob currently leads the industrial IT and cybersecurity solutions and services at Grantek, which help manufacturers develop their facility infrastructures, including their industrial network architectures, local and cloud computing systems, and cybersecurity programs. As Grantek’s leader in the space, Jacob maintains involvement and leadership positions in international societies and standards bodies, including serving as Cybersecurity Committee Chair of ISA’s Smart Manufacturing & IIoT Division, as a Registered U.S. Expert to IEC TC65, and as a member of the ISA99 standards development committee.
