The undiscovered country: the future of industrial automation — Part 1


By Paul McLaughlin and Rohan McAdam, Honeywell Process Solutions*
Wednesday, 14 June, 2017



The Industrial Internet of Things (IIoT) has the potential to be the most disruptive influence on automation systems since the advent of microprocessor-based distributed control systems. This article examines that potential, and how the conventional Purdue model for automation systems, and today’s installed base, fit with new IIoT architectures.

The Industrial Internet of Things is the intelligent application of digital technology to solving the automation needs of an analog world; it could equally be called the Intelligent Internet of Things. There are two fundamental pillars to IIoT: digital automation systems and the internet itself. Although current DCS and discrete manufacturing systems have been described as IIoT solutions, they are not true IIoT systems without the internet and internet-based cloud technology.

The Internet of Things

The development of the internet over the past three decades has led to connectivity between people, organisations and businesses on a scale that would have been difficult to imagine when it first emerged in the 1980s. This ubiquitous connectivity is rapidly extending beyond people to ‘things’ as all manner of devices, sensors, controllers and actuators become connected in what is now referred to as the Internet of Things, or IoT. However, simply connecting vast numbers of objects from daily life into an Internet of Things is not, by itself, enough to enable interesting and useful new ways of living and doing business; that requires platforms, tools, algorithms and applications to analyse, distribute and act on the huge amounts of data this connectivity produces. Consequently, the IoT, as it is currently understood, lies broadly at the intersection of ubiquitous device connectivity, cloud storage for the very large amounts of data produced, statistical and machine learning algorithms for analysing and acting on that data, and new human-computer interaction technologies provided by mobile and wearable computing devices.

The emergence of IoT

The IoT has its roots in the late 1960s and early 1970s, and can be considered to have an epoch date of 1969: the year the internet’s precursor, ARPANET, was first deployed, and the year UNIX was created at Bell Labs. Given the premise that the IoT is based on the harmonious alignment of the internet with smart digital sensors and devices, it is clear that ARPANET and modern industrial control systems are foundational pillars of the IIoT concept. UNIX is foundational as well, as it formed the underlying basis and structure for client-server computing, workstations, personal computing, server farms and virtualisation.

Evolution of these core systems has yielded smart, internet-connected sensors and ubiquitous computing that weaves computation into every aspect of life, allowing it to occur anywhere, in whatever form makes sense for a particular situation. But the current IoT landscape is characterised by a large number of emerging application areas and supporting technologies, many of which are still in the early stages of development. Nonetheless, the confluence of technologies coming together in IoT approaches does enable new sorts of applications and business models that will undoubtedly create new markets and disrupt existing ones.

The Internet of Things has, to a large extent, been enabled by the rapid emergence of a series of technology inflections: virtualisation, cloud computing, pervasive networking, big data analytics and machine learning, smart devices and device mobility. Together, these technologies enable the new types of systems now recognised as the IoT.

Virtualisation

Virtualisation decouples software workloads (entire operating systems, individual applications and so on) from the hardware on which they run, enabling a range of deployment scenarios that can significantly reduce cost, simplify management, improve availability and avoid the problems associated with churn in the underlying hardware platforms. Virtualised computing resources can be deployed either on-premises or off-premises.

Cloud computing

Cloud computing provides virtualised platforms with elastic compute and storage capabilities. Cloud platforms are usually run as a service in large data centres, making it easy to acquire additional computational and storage capacity as it is needed. This can entirely remove the need for a software-based enterprise to acquire and manage its own computing infrastructure.

Pervasive networking

More and more devices, in the consumer, commercial and industrial markets, come equipped with some form of connectivity. This connectivity can take the form of direct internet connectivity via 3G and 4G cellular networks, indirect internet connectivity via Wi-Fi, or local connectivity via Bluetooth or near-field communication. This allows devices to participate directly in cloud-based services or indirectly through gateway devices such as a smartphone that is connected to a cloud-based service. Pervasive networking provides opportunities for establishing relationships between elements of the physical world that do not exist otherwise, enabling a range of new applications and services.
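As a concrete illustration, the sketch below shows one common pattern for a directly connected device: readings published to a broker over MQTT, a lightweight publish/subscribe protocol widely used in IoT applications. This is a minimal sketch, not any particular vendor's implementation; the broker address, topic name and read_temperature() helper are illustrative assumptions, and it assumes the open-source paho-mqtt client library (1.x-style API).

```python
# Minimal sketch: a device publishing periodic readings over MQTT.
# Broker host, topic and read_temperature() are illustrative assumptions.
import json
import random
import time

import paho.mqtt.client as mqtt  # assumed dependency: paho-mqtt (1.x-style API)


def read_temperature() -> float:
    """Placeholder for a real sensor read."""
    return 20.0 + random.random() * 5.0


client = mqtt.Client()                      # plain TCP; TLS would normally be configured here
client.connect("broker.example.com", 1883)  # assumed broker host and standard MQTT port
client.loop_start()                         # handle network traffic on a background thread

while True:
    payload = json.dumps({"ts": time.time(), "temp_c": read_temperature()})
    client.publish("plant/area1/temperature", payload, qos=1)
    time.sleep(5)
```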

Big data analytics and machine learning

The elastic compute and storage provided by cloud computing, together with pervasive networking, make it possible to collect very large amounts of data from an increasingly wide range of sources. Collecting lots of data about a lot of things (big data) provides opportunities for analysing phenomena that could not be analysed otherwise. Big data is characterised by volumes of data (in the order of terabytes and petabytes) that require new techniques to store and analyse, such as massively parallel data stores and statistical techniques for identifying correlations and patterns in the data. The availability of such data has also spurred the application of machine learning techniques, including artificial intelligence algorithms, to these big data stores.
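As a simple illustration of the kind of analysis this enables, the sketch below loads a table of historical sensor readings, computes pairwise correlations between tags and flags statistical outliers. It is only a sketch: the file name and column layout are illustrative assumptions, and a production system would draw on a massively parallel data store rather than a local CSV file. It assumes the pandas library.

```python
# Minimal sketch: correlation and outlier analysis over collected sensor history.
# File name and column layout are illustrative assumptions.
import pandas as pd

# Assumed layout: one row per timestamp, one column per sensor tag.
history = pd.read_csv("sensor_history.csv", parse_dates=["timestamp"],
                      index_col="timestamp")

# Pairwise correlation across tags often surfaces unexpected process couplings.
correlations = history.corr()
print(correlations.round(2))

# Simple statistical outlier flagging: rows where any tag is more than
# three standard deviations from its mean.
z_scores = (history - history.mean()) / history.std()
outliers = (z_scores.abs() > 3).any(axis=1)
print(history[outliers])
```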

Smart devices

Not only are devices becoming more connected, they are becoming smarter. The availability of small, low-power processors that can be embedded in many devices allows them to act as more than mere sensors or actuators. Local computational resources allow devices to act on their local environment, becoming more interactive and autonomous. The trend towards increasing connectivity and capability in a wider range of finer-grained smart devices is also known as ubiquitous computing. The aim of ubiquitous computing is for computers to blend into the surroundings so that they become an integral part of the physical and virtual worlds, available when needed without any explicit use of a conventional computer interface.
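The sketch below illustrates, in very simple terms, what this local intelligence can look like: the device smooths its own raw readings and reports only meaningful changes (report-by-exception), rather than streaming every raw sample upstream. The read_sensor() and transmit() helpers are placeholders for real device I/O, and the tuning constants are illustrative assumptions.

```python
# Minimal sketch: on-device smoothing and report-by-exception.
# read_sensor(), transmit() and the tuning constants are illustrative placeholders.
import random
import time


def read_sensor() -> float:
    """Placeholder for a real ADC/sensor read."""
    return 100.0 + random.gauss(0, 0.5)


def transmit(value: float) -> None:
    """Placeholder for sending the value upstream (radio, fieldbus, etc)."""
    print(f"reported {value:.2f}")


ALPHA = 0.2      # exponential smoothing factor
DEADBAND = 1.0   # only report changes larger than this

smoothed = read_sensor()
last_reported = smoothed

while True:
    smoothed = ALPHA * read_sensor() + (1 - ALPHA) * smoothed
    if abs(smoothed - last_reported) > DEADBAND:
        transmit(smoothed)
        last_reported = smoothed
    time.sleep(1)
```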

Mobility

One area where smart devices are making a very large impact is mobile computing. Smartphones and tablets enable a wide range of highly interactive, context-sensitive (location, time, task, etc) applications. However, the current trend is towards disassembling the smartphone and distributing its capabilities across a series of smart wearable devices; the current crop of smartwatches and head-up displays are good examples of this.

IoT architecture

The technologies outlined above come together in a general architecture that consists of three main domains — the cloud, the network and the edge — as illustrated in Figure 1.

Figure 1: IoT architecture.


The cloud includes compute and storage mechanisms together with applications including analytics, reporting, control and user interfaces. The user interfaces may actually live at the edge and are often combined with sensors, as in the case of smartphones. Network connectivity is built on IP-based protocols, some of which are conventional protocols such as HTTP, while others are more specialised protocols designed specifically for IoT applications involving large amounts of data collection and distribution.

The edge consists of the ‘things’ in the IoT such as sensors, actuators and controllers. In some cases these devices are connected directly to the network via 3G/4G cellular or Wi-Fi. In other cases an intermediary — an edge gateway — provides connectivity to one or more devices that support only local connectivity.
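The sketch below illustrates the role of such an edge gateway: it polls devices that support only local connectivity and forwards their readings to a cloud ingestion endpoint over HTTP. This is a minimal sketch under stated assumptions; the endpoint URL, device names and read_local_device() helper are illustrative, and it assumes the requests library.

```python
# Minimal sketch: an edge gateway bridging local-only devices to a cloud endpoint.
# Endpoint URL, device list and read_local_device() are illustrative assumptions.
import time

import requests  # assumed dependency

CLOUD_ENDPOINT = "https://iot.example.com/ingest"   # assumed ingestion API
LOCAL_DEVICES = ["pump-01", "valve-07", "tank-03"]  # devices reachable only locally


def read_local_device(device_id: str) -> dict:
    """Placeholder for a Bluetooth/serial/fieldbus read of one device."""
    return {"device": device_id, "ts": time.time(), "value": 42.0}


while True:
    batch = [read_local_device(d) for d in LOCAL_DEVICES]
    try:
        requests.post(CLOUD_ENDPOINT, json=batch, timeout=5)
    except requests.RequestException:
        # Here the batch is simply dropped; an industrial gateway would
        # typically buffer it locally instead (see the later sketch).
        pass
    time.sleep(10)
```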

The Industrial Internet of Things

As noted in the previous section, the core ideas of the IoT have broad applicability. The Industrial IoT is, in broad terms, the application of these ideas to the planning, running, analysis and optimisation of industrial enterprises. The application of these ideas to industry is being developed in a number of initiatives such as Industry 4.0 and the Industrial Internet Consortium, and aims to bring together the means of production (the physical plant) with advanced internet-based computational and analytic capabilities to create cyber-physical systems that transcend the capabilities and scope of current automation systems. In simplistic terms, the IIoT connects the world of industrial things (sensors, actuators, controllers, robots, etc) to computational capabilities residing in internet-based storage and analytics.

IIoT vs IoT

IIoT differs from the more generic concept of the IoT with respect to several key quality requirements that result in architectures that expand on IoT approaches. A fundamental difference is that IIoT aims to enhance the operation and management of industrial production processes, many of which involve exothermic reactions for which safety is a primary concern. Security of IIoT-based systems is also of paramount importance, not just from a safety perspective but also where essential and strategically important goods and services are being produced. This results in more stringent security, reliability and availability requirements, and the need to continue operation with intermittent access to internet resources. When failures do occur, the system must continue operation where possible or degrade gracefully, deterministically and safely.
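One simple technique for tolerating intermittent internet access is local store-and-forward: readings are always accepted into a bounded local buffer, and the buffer is drained to the cloud only when the link is available, so local operation never depends on connectivity. The sketch below illustrates the idea only; the endpoint URL is an illustrative assumption, and it assumes the requests library.

```python
# Minimal sketch: store-and-forward buffering for intermittent connectivity.
# The endpoint URL is an illustrative assumption.
import collections
import time

import requests  # assumed dependency

CLOUD_ENDPOINT = "https://iot.example.com/ingest"  # assumed ingestion API
buffer = collections.deque(maxlen=100_000)         # bounded local queue; oldest samples drop first


def record(sample: dict) -> None:
    """Always accept new samples locally, regardless of cloud availability."""
    buffer.append(sample)


def drain() -> None:
    """Push buffered samples to the cloud; on failure, keep them for later."""
    while buffer:
        sample = buffer[0]
        try:
            requests.post(CLOUD_ENDPOINT, json=sample, timeout=5)
        except requests.RequestException:
            return           # link is down; try again on the next cycle
        buffer.popleft()     # only discard once the upload has succeeded


while True:
    record({"ts": time.time(), "value": 42.0})
    drain()
    time.sleep(1)
```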

Another fundamental difference between IIoT and consumer applications of IoT is that an industrial plant is a very long-lived, capital-intensive asset requiring long-term support in the face of rapid technological advances. This requires support for existing, often ageing equipment and infrastructure, and a means of protecting investments in intellectual property concerning the planning, execution and optimisation of production activities. In contrast, other applications of IoT involve short product life cycles that are often driven by the whims of fashion and budget, and consumers are willing to rip and replace to get improved functionality. It is, on the other hand, very expensive to shut down an industrial process to replace or upgrade equipment, so industrial enterprises favour keeping things running as long as possible — as exemplified by the huge spare parts business for obsolete systems. One consequence of this is that many devices that will form part of the IIoT will continue to communicate via existing, often old protocols, and will need special mechanisms to integrate them into the wider IIoT environment.
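A common integration pattern for such devices is a small protocol adapter that reads the legacy registers and republishes them as self-describing values in engineering units. The sketch below is only illustrative: the register map, scaling factors and read_legacy_register() helper stand in for whatever serial or fieldbus driver the old device actually requires.

```python
# Minimal sketch: adapting a legacy field device for the IIoT.
# read_legacy_register(), the register map and scaling are illustrative assumptions.
import json
import time


def read_legacy_register(register: int) -> int:
    """Placeholder for a raw read over an old serial or fieldbus protocol."""
    return 12345


# Mapping from legacy register addresses to modern, self-describing tags.
REGISTER_MAP = {
    40001: ("reactor_temp_c", 0.1),        # raw counts scaled to engineering units
    40002: ("reactor_pressure_kpa", 0.01),
}

while True:
    readings = {}
    for register, (tag, scale) in REGISTER_MAP.items():
        readings[tag] = read_legacy_register(register) * scale
    print(json.dumps({"ts": time.time(), **readings}))  # hand off to the IIoT side
    time.sleep(5)
```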

Many applications of IoT are human-centred, with information delivery and interaction aimed primarily at human users. The IIoT, on the other hand, focuses on the automation of industrial processes, with a trend towards less manual human involvement in production. Human participation is still a key element, but increasing levels of automation continue to move the human participants in an industrial enterprise away from direct control of the process and towards higher-level planning and supervisory roles. The ultimate goal of IIoT might be considered completely autonomous ‘lights out’ operation over an increasingly large scope of production: from the autonomous operation of individual units today to the autonomous operation of an entire site, or collection of sites, in the future.

IIoT architecture

A large DCS is a complex network of sensors, actuators, controllers and computational capabilities. The lower layers of a DCS tend to be autonomous with responsibility for the real-time control of the process and can operate with a high degree of safety and reliability. The layers above provide various supervisory capabilities including advanced and supervisory control, and HMIs for management of the process by human operators. Above this are facilities for capturing and analysing a continuous historical record of the process and tools for planning and scheduling production activities that are passed down to the lower layers for execution.

It is tempting to draw a direct comparison between the DCS of today and the IIoT-based automation system of the future and claim that we are already doing IIoT. To do so, however, ignores the significant changes to the DCS (as we understand it) that will occur with the introduction of the IIoT. The IIoT arises from the combination of core DCS concepts, such as local, high-availability, real-time control of industrial processes, with the technologies and architectures that enable the IoT.

Figure 2: IIoT architecture.

In the case of the IIoT, the applications running against cloud-based storage include those geared towards industrial enterprises, such as planning and scheduling, optimisation and engineering.

This model provides an overview of the general structure of IIoT-based systems. This is not a one-size-fits-all model. There will be variations on this architecture for specific types of systems and sites. In some cases the cloud environment may be deployed on-premise (either at the site or in the organisation’s data centre).

One of the main differences between IoT and IIoT architectures concerns the nature of the edge computing environment. In the IIoT the edge computing environment provides the opportunity to address key requirements in the areas of performance and robustness needed in industrial process control. Another significant characteristic of the edge computing environment in the IIoT that sets it apart from the IoT is a high degree of heterogeneity in the devices used and the protocols with which they communicate.

In Part 2

In Part 2 of this article we will look at the IIoT architecture in more detail, and how it can be reconciled with the Purdue Enterprise Reference Architecture Model.

*Paul McLaughlin is Chief Engineer at Honeywell Process Solutions. He has worked in the automation industry for 35 years and been with Honeywell for the past 30. Paul is currently responsible for the development of the HPS Industrial Internet of Things strategy, and has degrees in mathematics and computer science from the University of Delaware and the University of Pennsylvania.

*Rohan McAdam is Chief Architect for Honeywell Process Solutions. He has worked in the field of industrial process control since 1988. Prior to joining Honeywell in 1993, he worked in the alumina industry in Western Australia, and has a mathematics degree from Charles Sturt University, a master’s degree in cognitive science from the University of New South Wales and a PhD in computer science from Charles Sturt University.

Image: ©stock.adobe.com/kinwun
