A new era of connectivity, thanks to OPC UA
By Harry Mulder, Engineering Manager, Omron Electronics
Tuesday, 04 September, 2018
OPC UA fulfils the connectivity requirements of industrial applications by providing both vertical and horizontal communications to industrial controllers.
One of the biggest frustrations expressed by the manufacturing industry over the years has been the inability of controllers on the factory floor to exchange data with each other and with supervisory systems. This lack of connectivity and interoperability between devices has meant that basic data and other vital information have gone unreported. The main cause has been the absence of an adequate protocol and a method of data exchange that’s appropriate for the automation industry. This article will discuss how many of these problems have been resolved with the advent of OPC UA.
In the early days of industrial controllers, every automation vendor created their own (often proprietary) protocol over serial links. Despite these protocols all performing essentially the same function, incompatibility was immediately created whenever controllers from different vendors were used together. This prevented communication between devices and made connection to higher-level supervisor systems, like SCADA, all the more difficult. However, because ubiquitous internet connections did not exist, these networks were closed to the outside world and had none of the concerns we find today, such as cybersecurity risk.
Open vendor communication networks
With the inevitable push from end users, changes started to happen around 25 years ago, firstly in field networks. Vendor-neutral associations like the ODVA produced open network standards like DeviceNet and promoted them throughout industry. They gained popularity as more and more vendors embraced them in their products. This trend towards openness continues unabated to this day. EtherNet/IP, EtherCAT, Profibus and Profinet, to name a few, are all open vendor networks, meaning anyone can produce devices for them at minimal cost.
Open networks allow one vendor’s controllers to connect to, say, the drives of another vendor and the I/O blocks of a third vendor. The interoperability of devices on a network gives end users two distinct benefits. Firstly, as multiple vendors produce similar items, end users can choose a ‘best of breed’ technology, improving their overall systems. Pricing is also kept honest, as end users are no longer locked into one particular vendor who controls the price.
However, the trend towards openness in data or controller-level networks has been slower. This may be due in part to the uniqueness of the requirements; namely, the real-time exchange of complex data. Modbus RTU is probably the most commonly used industrial serial protocol. It became a de facto standard when vendors began using it in their products. While serial links are now considered legacy, the Modbus protocol can be encapsulated within the Ethernet TCP framework to form Modbus/TCP. Its adoption has been quite widespread, mainly due to its simplicity, which allows comparatively easy implementation. However, its relatively inefficient master/slave polling methodology and its treatment of data as being flat and dimensionless (ie, a set of raw bits, without data type) preclude it from many applications, such as real-time control.
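The flat, typeless nature of Modbus data can be illustrated with a short sketch: because holding registers are just 16-bit words, a 32-bit floating-point value must be manually packed into two registers, and both ends must agree on the byte and word ordering out of band. A minimal Python illustration (the function names are ours, not part of any Modbus library):

```python
import struct

def float_to_registers(value):
    """Pack a 32-bit float into two 16-bit Modbus holding registers (big-endian)."""
    raw = struct.pack(">f", value)          # 4 bytes, network byte order
    return list(struct.unpack(">HH", raw))  # two 16-bit register values

def registers_to_float(regs):
    """Reassemble the float; both ends must agree on word order out of band."""
    return struct.unpack(">f", struct.pack(">HH", *regs))[0]

regs = float_to_registers(21.5)
print(regs)                       # [16812, 0] - no hint these words form a float
print(registers_to_float(regs))   # 21.5
```

Nothing in the register values themselves indicates a data type; that metadata lives only in external documentation, which is exactly the limitation OPC set out to address.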
The first real attempt at producing a robust, open vendor protocol, with high-level functionality required for both the automation and process control industries, came with the OPC DA standard in 1996 (now often referred to as OPC Classic). It was created by an industrial automation taskforce headed by Microsoft, and it represented a significant step forward from serial protocols.
OPC defines a set of interfaces, objects and methods, as well as events that can be based on a client’s criteria. Variable data is handled in a much more sophisticated way — it’s presented hierarchically with data type and with three attributes: value, quality and timestamp. These relate to the measurement, its trustworthiness and its freshness respectively. In addition, data indicating the status of the controller, such as its operating mode, presence of errors and much more, are automatically served up.
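The three attributes can be sketched as a simple data model. This is an illustration of the concept only, not the actual OPC UA DataValue encoding (the class and field names here are ours):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Quality(Enum):
    """Trustworthiness of the reading."""
    GOOD = "good"
    UNCERTAIN = "uncertain"
    BAD = "bad"

@dataclass
class DataValue:
    """Illustrative model of an OPC item: value, quality and timestamp."""
    value: float
    quality: Quality = Quality.GOOD
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

reading = DataValue(value=42.7)
print(reading.value, reading.quality.value)   # 42.7 good
```

A client receiving such a record can decide for itself whether a stale or uncertain value should be acted upon, something a raw Modbus register can never convey.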
Client/server architecture is utilised by OPC primarily for scalability, as nodes can easily be added to the network. Here, a server node holds data and makes it accessible to one or more client nodes. Clients can be distributed across a network and can request data from a centralised server. The server listens on the network and responds to clients’ requests. Networks can consist of multiple OPC servers, which can connect to multiple clients as required. A typical network is shown in Figure 1.
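The client/server relationship can be sketched in a few lines. This is an in-process toy, not a network implementation, and the node identifier syntax is merely modelled on OPC UA's "ns=…;s=…" convention:

```python
class OpcServer:
    """Toy server: holds named nodes and answers read requests."""
    def __init__(self):
        self._nodes = {}

    def write(self, node_id, value):
        self._nodes[node_id] = value

    def read(self, node_id):
        return self._nodes[node_id]

class OpcClient:
    """Toy client: sees only the server's address space, not its internals."""
    def __init__(self, server):
        self._server = server

    def read(self, node_id):
        return self._server.read(node_id)

server = OpcServer()
server.write("ns=2;s=Tank1.Level", 73.5)

# Multiple clients can share the one server, and more can be added freely
hmi, historian = OpcClient(server), OpcClient(server)
print(hmi.read("ns=2;s=Tank1.Level"))   # 73.5
```

The scalability the article mentions falls out of this shape: adding another client requires no change to the server or to the other clients.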
HMI/SCADA vendors were among the earliest adopters of OPC DA. Previously they had been forced to create their own custom drivers to extract data from third-party devices (mainly controllers). This consumed considerable resources and required much maintenance. The OPC DA standard allowed them to create a single driver for an OPC DA client, and abstracted them away from the more difficult task of interfacing to third-party devices. In many cases, the driver for the OPC DA server was sourced from the same vendor that supplied the hardware. As the producer of the device, they had intimate knowledge of it and were the obvious choice for a driver.
The OPC Foundation
The OPC Foundation was formed in 1994 to take on the administration and promotion of OPC. It’s a vendor-neutral, non-profit consortium which today has over 500 members. It has been instrumental in the advancement of the suite of OPC standards, which includes Alarm & Events, Historical Data Access, Batch and others.
One important role of the OPC Foundation is the certification of products to ensure they fully comply with the standard. Two methods are available for this: self-certification via the CTT (Compliance Test Tool) offered to members, and independent real-world testing by third-party laboratories.
OPC DA underwent three revisions over the years, but with the rapid adoption of the internet as a means of interconnecting industrial sites, it became clear that a more rigorous revamp was required. The end result was OPC UA (Unified Architecture), which was released in 2008.
Highest on its list of design priorities was backward compatibility with the OPC DA standard, so that existing installations did not need to be modified. Also high on the agenda was the move away from Microsoft’s COM technology, which had restricted OPC’s use exclusively to Microsoft’s Windows platforms. The need for DCOM, which handled networked connections between computers, was also removed by the creation of a completely new communication stack. This was a relief as DCOM had proved problematic for many users.
The decision to make OPC UA usable on almost any hardware platform or operating system was so significant that it may have been the catalyst for changing the meaning of the OPC acronym. It originally stood for OLE for Process Control (DA signified Data Access). OLE means Object Linking and Embedding, a technique tied solely to the Microsoft Windows operating system. OPC now denotes Open Platform Communication, to signify its move to open systems.
OPC UA details
OPC UA utilises a service-orientated architecture (SOA), with well-defined services that negate the need for WSDL (Web Services Description Language). It uses two protocols: a binary protocol optimised for high performance and a web service protocol (SOAP over HTTP). Another handy feature is the ability to discover devices on a network, which is useful in large networks.
OPC UA was made firewall-friendly expressly for internet connections, which also demand strong security. Authentication is supported, for both human users and applications, via either a password or digital certificate. The X.509 format of digital certificates is used by OPC UA to safeguard the integrity of data messaging. Digital certificates are used to generate public encryption keys which are exchanged between nodes. They can be issued either by a trusted, external certifying authority (CA) or created locally from user details (shown in Figure 2), in which case they are known as self-signed certificates. Certificates can be revoked and will eventually expire and require re-authentication.
Users decide on a security policy, which establishes the cipher strength (128 or 256 bits), encoding method and whether to just digitally sign or to sign and encrypt each message. Encryption ensures confidentiality, while a digital signature (a unique code generated by a hash function that is appended to each message) verifies a sender’s authenticity, ensures non-repudiation and protects against man-in-the-middle attacks.
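The sign-and-verify step can be sketched with Python's standard library. For brevity this sketch uses a shared secret (HMAC), whereas OPC UA actually derives its message keys from the asymmetric X.509 certificate exchange described above:

```python
import hmac
import hashlib

SECRET = b"session-key"   # stand-in; real sessions negotiate keys via certificates

def sign(message: bytes) -> bytes:
    """Append an HMAC-SHA256 signature (32 bytes) to the message."""
    return message + hmac.new(SECRET, message, hashlib.sha256).digest()

def verify(signed: bytes) -> bytes:
    """Split off the signature and check it; raise if the message was altered."""
    message, sig = signed[:-32], signed[-32:]
    expected = hmac.new(SECRET, message, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("signature mismatch - message altered in transit")
    return message

signed = sign(b"WriteRequest ns=2;s=Valve1 OPEN")
print(verify(signed))   # original message returned intact
```

Flipping even one bit of the signed payload causes verification to fail, which is what defeats in-transit tampering; adding encryption on top of the signature then also hides the payload from eavesdroppers.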
OPC UA is also designed for extensibility. The OPC Foundation continues to develop the standard, and the latest version (v1.04) was released in November 2017. This version added publish/subscribe functionality as an alternative to polling, to achieve more efficient use of network bandwidth.
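The bandwidth saving of publish/subscribe over polling can be sketched as follows. This is a conceptual toy, not the OPC UA PubSub wire format; the class name is ours:

```python
class PubSubNode:
    """Toy publish/subscribe: callbacks fire only when the value changes."""
    def __init__(self, value=None):
        self._value = value
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, value):
        if value != self._value:          # no traffic for unchanged data
            self._value = value
            for cb in self._subscribers:
                cb(value)

updates = []
temperature = PubSubNode(20.0)
temperature.subscribe(updates.append)

for sample in (20.0, 20.0, 21.5, 21.5, 19.8):
    temperature.publish(sample)

print(updates)   # [21.5, 19.8] - two notifications for five samples
```

A polling client would have issued five read requests for the same information; the subscriber received only the two changes, which is where the bandwidth saving comes from.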
More than an isolated standard
As OPC UA is an international standard (IEC 62541), it seeks to cooperate with other established industry standards. It integrates into the widely adopted IEC 61131-3 industrial programming standard and is listed as a recommendation for the RAMI 4.0 (Reference Architecture Model Industrie 4.0) paradigm.
OPC UA serves as a basis for both the PackML (Packaging Markup Language) standard for the packaging industry (ANSI/ISA-TR88) and EUROMAP 77, which defines the data exchange between injection moulding machines and MES systems.
The next logical step
There can be little doubt that OPC UA is a robust standard with a long history in industrial automation. But for all its advantages, it is worth asking why it has not been more widely implemented.
Up until now, most OPC implementations have relied on gateway devices (usually PCs) that gather data from multiple field devices and serve it up as OPC UA. While workable, this configuration needs gateways — which means more hardware and software, adding to costs and maintenance. It also introduces latency, which can be a problem for real-time applications.
A simpler, more logical topology is for industrial controllers themselves to act as OPC UA servers. Controllers are already connected to most field devices and are usually good repositories of data, which they need to manipulate within their programs. OPC UA then just becomes another network protocol they support. This arrangement makes OPC UA more accessible and easier to use.
OPC UA effectively fulfils the unique requirements of industrial applications by providing both vertical and horizontal communications to industrial controllers, regardless of the vendor. This link has been missing for too long in many installations, but will need to be functioning if modern trends towards the IIoT, and the resulting cloud-based big data, are to become a reality.
Table 1 lists OPC UA’s main design criteria. Fortunately, OPC UA manages to satisfy the rather divergent priorities of both the OT (operational technology) and IT (information technology) aspects of industrial control systems. It has been designed with internet connections in mind and is as secure and reliable as the underlying technology will allow.