Understanding OPC:
Open Connectivity via Open Standards
BY TONY PAINE
The OPC Foundation

In the mid-1990s, a group of vendors convened to address the growing concern regarding connectivity to the plant floor, referred to as the "Device Driver Problem."

At that time, HMI and SCADA vendors were responsible for building their own driver libraries. This approach created great solutions when it included all the connectivity requirements that their end users would need, but incomplete solutions when it did not. The vendors were faced with a decision: they either needed to invest resources in application-level functionality or extend connectivity.

Some vendors decided to create their own Application Programming Interfaces (APIs) or Driver Toolkits. Although this solved their own connectivity needs, it limited how end users could approach purchasing additional solutions. Luckily, it was not too long before the market persuaded the vendors to collaborate and make changes that were in the end users' best interests.

The initial task force consisted of half a dozen companies, including Fisher-Rosemount, Intellution, and Rockwell Software, among others. They took off their competitive hats and ventured out to solve this problem. The direction was pretty clear: all software development at the time was targeting Microsoft Windows as the platform of choice. Microsoft's client/server technology, Object Linking and Embedding (OLE), was used to share information between applications using vendor-specified interfaces and rules. The group's initial plan was to create a solution that would generalize data transfer between applications and any data source. The result was the first OPC specification, referred to as OPC Data Access (OPC DA), released in 1996.

The OPC initialism originally stood for OLE for Process Control, but its meaning has changed over the years as a result of changes in the market. First, Microsoft rebranded OLE as Component Object Model (COM) not long after OPC DA was released, which essentially made OLE feel like legacy technology. Second, OPC has since found a home in many automation environments, not just process control. Therefore, the OPC Foundation saw it necessary to update OPC's meaning to reflect the changes in market terminology and in the technology's application. Today, OPC simply stands for Open Connectivity via Open Standards.

OPC leveraged Microsoft's COM technology for quite some time. It was the basis for Alarm & Events (A&E), Historical Data Access (HDA), and several other less-adopted specifications (like Commands, Batch, Security, and Complex Data).

Over the years, data has transformed into information, or data with context. As such, the classic standards have evolved as best as possible to meet the needs of today. The latest generation of OPC is known as OPC Unified Architecture (OPC UA). Like its predecessors, OPC UA provides the same benefits, such as device connectivity, while offering much more. As a member community, the OPC Foundation has since learned that it can keep what it likes and change what it does not. Doing so has created a much more robust and cohesive technology for linking different domains than the classic specifications.

The OPC Foundation continues to enjoy growth and success since its beginnings in 1994. Today, it counts more than 500 companies as members, most of which build multiple OPC-enabled applications, including servers and clients that support one or more technologies like Data Access, A&E, HDA, and so forth.

Despite its North American origins, most of the OPC Foundation's members are now in Europe (48%), followed by North America (35%), Japan (8%), and China (3%). All other regions of the world make up the remaining 6% of membership. OPC has clearly become a global standard; the latest OPC UA specification has also achieved well-deserved status by becoming an International Electrotechnical Commission (IEC) standard.
There are now thousands of OPC-enabled products registered by OPC members alone. There are also many non-members who have developed their own OPC-enabled solutions, both client and server applications, with the use of OPC toolkits.

Programs and Evangelism

Unfortunately, open standards are not enough to ensure the best solutions for end users. Vendors can interpret the specifications differently, which results in non-interoperable or inconsistent solutions. Although the specifications have been clarified and many ambiguities removed over the years, the success of any standard relies on some sort of conformance tool. Fortunately, the OPC Foundation has solved this problem.

The OPC Foundation allows vendors to certify their products. There are two types of certification: self-certification and independent lab certification. The ability to perform self-certification came first, through a Compliance Test Tool (CTT) that members could download, install, and run against their applications. There has always been a better offering of server-based compliance tools because servers must implement all the required functionality that is defined by the OPC specifications, whereas clients only need to implement the functionality that makes sense for their application. It can be difficult to determine whether a client application is operating correctly without specific knowledge of the product. Luckily, Compliance Test Tools are very rich and allow vendors to easily understand the product areas that are non-compliant. They also assist with the debugging and resolution of these problematic areas. If an application can run through a CTT suite of testing without errors, a report is generated and sent to the OPC Foundation for a self-certified logo. These tests exist for a majority of OPC interfaces, including DA, A&E, HDA, and UA.

One key disadvantage of this type of testing is that vendors may not always test in a real-world scenario. It's much easier to run a product reliably and consistently in an environment that is tightly controlled.

To address this and to provide end users with a more confident stamp of approval, the OPC Foundation had the idea of creating an independent test lab. There is currently one in Germany (run by a company known as Ascolab) and one in Scottsdale, Arizona (run by the OPC Foundation at their North American headquarters). Self-certification tests are run before the labs put a real-world test to the OPC-enabled applications. The labs also support client-side testing, which is very hard to accomplish using a self-automated tool. Lab-certified products are then used to assist with the third-party certification of other products. If successful, the vendor is presented with a Lab Certified logo that specifies the product and version that underwent testing.

Another testing method leverages the basic concept behind the OPC Foundation: it gets vendors (who are sometimes competitors) together to test clients against servers using a pre-defined set of tests. These interoperability (IOP) workshops pre-date both self-certification and third-party certification, and give vendors an excellent opportunity to test and validate against the very products that their end users will utilize. There are three IOP workshops each year: one in North America, one in Europe, and one in Asia. In 2012, most companies focused on testing first-generation UA-enabled products, because most classic OPC-enabled products had already matured and stabilized.

Self-certification, investment in third-party certification, and participation in IOP workshops are key differentiators between product vendors.

In order to educate and update the engineering community on OPC advancements, the OPC Foundation began hosting OPC Roadshows. The events were held 6 to 8 times per year in various cities in North America. They were free to end users and paid for by sponsors, who gained the opportunity to engage with potentially new customers during breaks. Presentations were given by various sponsors to evangelize and educate the attendees, not to advertise their companies.
Instead of hosting several small events, European members decided to arrange fewer, larger events called "OPC Days." OPC Days are very similar to the North American road shows in that sponsors attend and exhibit to help with the cost of the event; however, attendees must pay a nominal charge. OPC Days aim to educate the community on the latest OPC advancements and encourage end users to present their OPC success stories. They also provide a venue for networking, and for demonstrations that show OPC interoperability in a live setting.

Today, the OPC Foundation continues its attempts to simplify and act in the best interests of end users. It recognizes that end users face a challenge in selecting from a vast array of standards and products. Questions often arise regarding standards in the same market: How do I choose? Which standard is better? Which standard will meet my needs? Do these standards complement each other or compete with one another?

In an effort to encourage interoperability across the board, the OPC Foundation has partnered with different standards organizations to determine how to leverage beneficial features and produce an optimal result for the market. They are currently working with PLCopen, Field Device Integration (FDI), and Electronic Device Description Language (EDDL) on how to configure devices with OPC UA. They are also working on how to leverage information models that already dominate in certain verticals (such as BACnet for Building Automation, DNP for Power, and WITSML for Oil & Gas) with the power and abundance of OPC products.

OPC is not being positioned to take over these existing specifications, but rather to provide the glue that binds the different standards and information models together. It will be interesting to see how well these standards organizations work together, because there are clearly some cases where they would compete.

The Foundations of OPC Data Access

Although most people in the automation industry are likely more knowledgeable about OPC Data Access (OPC DA) than any other OPC standard, it is helpful to review some key elements of this "Classic" specification. It is worth noting that "Classic" does not mean "Legacy": OPC DA is not going away any time soon. In a way, it can be compared to Microsoft's DDE. Although DDE is a very old technology by Microsoft standards, it is not old by industrial automation standards, where hardened and proven technologies stay in use for quite some time. The same may be said for OPC DA (or OPC Classic).

OPC generalized Data Access down to a value, a quality, and a timestamp. The value represents the data, the quality indicates whether the data is trustworthy, and the timestamp indicates the data's freshness. To make the data usable, the OPC Foundation created a well-known interface, an Application Programming Interface (API), for OPC client and server applications to adhere to. It also provided the redistributable binaries that are required to enable OPC on a Windows machine.

The API provides a mechanism to discover both the OPC DA servers that are available and the information or data that they contain. It also gives client applications the ability to read, write, and/or subscribe to data. Clients can decide whether they want to read data from a device or from a cache that is updated independently of the client request. They can also select whether to poll the server for data periodically or to subscribe only to the data that has changed within a specified interval. To clarify, a change in data is a change in either the value or the quality associated with the value. To a client and the end user, it is essentially a data event of importance.
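To make the value/quality/timestamp triple and the read model concrete, here is a minimal sketch using OpenOPC, a third-party open-source Python wrapper for OPC DA (not an OPC Foundation deliverable). The server ProgID and tag name refer to a commonly used simulation server and are illustrative assumptions; any OPC DA server and item would behave the same way.

```python
import OpenOPC

opc = OpenOPC.client()                  # rides on the local COM/DCOM infrastructure (Windows)
print(opc.servers())                    # discover the OPC DA servers registered on this machine

opc.connect('Matrikon.OPC.Simulation')  # assumed simulation server; substitute your own ProgID

# A single read returns the classic OPC DA triple: value, quality, timestamp
value, quality, timestamp = opc.read('Random.Int4')
print(value, quality, timestamp)        # e.g. 18329, 'Good', '06/24/16 15:30:00'

opc.close()
```

A subscribing client would instead register for updates and let the server push only the data events described above, rather than re-reading every item on every cycle.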
Because OPC is built on Microsoft COM (the new OLE) technologies, it benefits from the distributed nature of COM, referred to as Distributed COM or DCOM. DCOM has its pros and cons. Many end users see DCOM as challenging to configure, and an outright pain to correct when it does not work well. For example, some users have experienced situations where the client can communicate with the server, but the server fails to update the client. DCOM has also been known to lock up for over six minutes if communications are lost between the client machine and the server machine. Six minutes is a long time to wait for data updates in a manufacturing environment.

Luckily, OPC DA evolved over the years and gained new functionalities that help end users bypass DCOM anomalies and enhance DA applications. The greatest improvement is the server's ability to periodically send Keep-Alive requests, which ensures that a subscription callback has not failed.

The Benefits of OPC Data Access

The benefits of OPC DA are quite simple. This technology continues to solve the device connectivity problem that prompted the establishment of the OPC Foundation almost 20 years ago.

When the specification was released, it was expected that there would only be one OPC server per piece of hardware, developed and provided by the device manufacturer. This was not the case. Instead, a new market was created for companies that specialize in developing OPC-based connectivity solutions for a wide variety of data sources, making it viable for end users to have a consistent connectivity experience.

OPC's distributed nature and underlying technologies allow data requests to be aggregated through a single server that feeds data to many client applications. Multiple clients with native drivers no longer need to make the same requests for the same data to the same devices. OPC has reduced the burden on both the devices and the communications infrastructure.

Today, OPC DA is a mature specification that has not experienced changes for several years. This means that the products based on the technology have also matured, and any issues have already been identified. It is highly probable that OPC DA implementations will succeed in solving end users' connectivity needs.

Vendors who wanted to bring alarming and historical Data Access trends into their solutions did not wait for the OPC Foundation to develop a specification around those types of data. Instead, they built the support on top of OPC DA. Many Alarm & Events Manager or Historian products are simply Data Access clients. For end users, there are many OPC-enabled products from which to choose.

The Marketplace Acceptance of OPC Data Access

Vendors, system integrators, and end users have all become familiar with OPC. They understand what a typical OPC server installation looks like and how to configure an OPC client. Even though many end users have experienced some challenging DCOM situations, many have either become accustomed to or accepting of the DCOM security model.

The breadth of available products enabled OPC to become widely adopted among the people who configure automation systems, even those who do not entirely understand how OPC works behind the scenes. This is similar to the way many people interact with the different peripherals connecting to personal computers: they just know that the components are going to work.

OPC DA technology is proven. Any specification that can remain untouched for a period of time and still be leveraged today really speaks to the robustness of the technology. As such, OPC has become part of an automation engineer's toolkit. Engineers have learned the tricks of the trade and understand what to expect from implementation and performance. End users have become accustomed to OPC Classic's ease of use (including available data, data types, read/write permissions, update rates, and additional properties).
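That ease of use is easy to see programmatically. Continuing the hedged OpenOPC sketch from earlier (again assuming a simulation server; the browse path and tag are illustrative), browsing and item properties expose the available data, data types, access rights, and more:

```python
import OpenOPC

opc = OpenOPC.client()
opc.connect('Matrikon.OPC.Simulation')   # assumed server ProgID, as before

# Browse the server's namespace for available items
for tag in opc.list('Simulation Items.Random'):
    print(tag)

# Each item carries standard OPC DA properties: data type,
# access rights, scan rate, timestamps, and so forth
for prop_id, description, value in opc.properties('Random.Int4'):
    print(prop_id, description, value)

opc.close()
```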
Furthermore, Web Services are firewall-friendly for a reason: they use one-way communications. In this architecture, a client makes a request over an outbound connection and the server responds on that same connection, using the well-known HTTP ports relied on daily in our modern world.

Because XML is text-based and "fat" in nature, the OPC Foundation decided they would need to allow some "state" to be kept in the server, and that a continuous polling model (closely emulating a subscription) would be needed in order to guarantee performance. This is known as a Polled Refresh. Depending on the cleverness of the client, the server is able to achieve performance on par with true subscription-based behavior over Web Services.
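The Polled Refresh pattern is easy to picture in code. The sketch below is a generic illustration, not part of any OPC specification or toolkit, and every name in it is hypothetical: the server keeps the subscription state, and the client repeatedly asks "what changed since my last request?" over a plain request/response transport.

```python
import time

def polled_refresh(fetch_changes, on_change, interval=0.5):
    """Emulate a subscription over a one-way request/response
    transport. `fetch_changes` stands in for a web-service call
    that returns only the items whose value or quality changed
    since the previous poll; the server tracks that state."""
    while True:
        for item, value, quality, timestamp in fetch_changes():
            on_change(item, value, quality, timestamp)
        time.sleep(interval)  # pacing; a clever client tunes this

# Hypothetical usage against an imagined SOAP client:
#   polled_refresh(lambda: soap_client.get_changes(subscription_id),
#                  lambda *event: print(*event))
```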
The Evolution of OPC Unified Architecture

Shortly after the XML DA specification was released, an initiative began to create XML companion specifications for Alarm & Events and Historical Data Access. The question was raised whether the OPC Foundation would have to go through that same exercise for other specifications that had similar but also dissimilar interfaces. What about in five years' time, when there are likely better ways to exchange data between applications? What about when XML Web Services are replaced?

The OPC Foundation decided to step back and identify the commonalities shared by the different specifications. They determined that it was necessary to decouple the API from the underlying wire protocols, so that the new technology could be mapped to any communications transport or medium in the future without requiring the specifications to be rewritten. With this insight, OPC Unified Architecture was born.

The team began its research by looking to the past for answers. What features are liked? What development should have been done differently? What were the problems that OPC had tried to solve but had previously been limited by? Were there others in the software industry with similar problems and solutions that could be leveraged?

After compiling a long list of questions, several key objectives became obvious. The first objective was to create a technology that could run on any platform (not just Windows or Linux): something that could run anywhere from the highest layers of an enterprise down to an appliance or embedded device. Vendors should be allowed to implement the technology on a wide range of systems, independent of the tools available for any particular platform, regardless of whether it is running a particular operating system or whether the applications require a particular programming language.

About a year into the UA effort, the OPC Foundation came to the conclusion that it had to invent its own high-performing, OPC-specific wire protocol in order to achieve the expected performance. In doing so, they decided that it was also necessary to develop a set of UA protocol stacks in multiple programming languages that application vendors could utilize.

The second objective for the new UA architecture was to consolidate the service set to deal with all the types of information in which users are interested: real-time data, alarm and events data, historical data, and so forth. To do this, the OPC Foundation looked for commonalities. In an effort to avoid rewriting the technology in a few years due to a new way of exchanging data over a wire, they considered how software vendors moved data between different levels in an enterprise. The answer was Service Oriented Architecture (SOA).

SOA allows users to create a very abstract set of services that can be mapped to multiple transports without affecting the interface between the application and the service set. Ideally, the application could take advantage of new transports without having to be rebuilt with specific knowledge of the new medium.
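A tiny sketch can show what decoupling the service set from the transport means in practice. Everything below is hypothetical and illustrative (these are not the actual UA service definitions): the application codes against an abstract service set, and a transport binding can be swapped without touching application logic.

```python
from abc import ABC, abstractmethod

class DataServices(ABC):
    """Abstract service set: read or write 'something' by name."""

    @abstractmethod
    def read(self, node_id: str): ...

    @abstractmethod
    def write(self, node_id: str, value) -> None: ...

class InMemoryTransport(DataServices):
    """Toy binding so the sketch runs without a network. A real
    binding would carry these calls over an OPC-specific binary
    protocol or XML Web Services; the interface above is unchanged."""

    def __init__(self):
        self._store = {}

    def read(self, node_id):
        return self._store.get(node_id)

    def write(self, node_id, value):
        self._store[node_id] = value

def application(services: DataServices):
    # Application logic knows only the abstract services
    services.write("Line1.Temperature", 72.5)
    print(services.read("Line1.Temperature"))

application(InMemoryTransport())
```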
These requirements result in a low-level base set of services, like building blocks, that can be specified by OPC and extended for different types of information. For example: to be able to read "something," to be able to write "something," to be able to discover "something." This "something" is much more than some primitive piece of data. It is an object that has one to many properties. A property could be the name of the object, the data type of the object, the value of the object, and so forth. These objects are discoverable within the UA server's address space, which is not the classic tree-based hierarchical browse space. Instead, it is a fully-integrated address space that can be viewed as a hierarchy, indicates relationships between different nodes, and allows native complex or structured-type data to be defined and accessed generically.

OPC UA can layer on the specifics to deal with real-time data, alarms and conditions data, historical data, and so forth. It can also work with other standards organizations to map their data models to OPC and, finally, allow vendors to extend the generic information model for their own needs.

The third objective for the new UA architecture was to ensure the security of the information as it is delivered between client and server applications. Moving outside of a firewalled domain requires the development of a technology that protects the data's authenticity and integrity. The UA security model allows for user authentication, communication integrity and confidentiality, and verification of functional claims. These features ensure that only authenticated users or applications can communicate, that the information being exchanged cannot be compromised by an external agent, and that a client and server can predefine what the other is capable of doing. This is accomplished both by exchanging certificates and by obtaining UA-specific profiles that indicate the level of UA conformance for each party involved in the conversation.

Generically, UA requires that clients create a secure channel to ensure authenticity. It also requires a session to ensure message integrity; this is usually only done once due to its expense. The underlying implementation is transparent to the application levels, completed by the communications stack, and depends on the protocol or transport used to exchange messages. For example, UA Binary over TCP may be secured with the use of secure sockets (SSL), and XML Web Services may be secured with the use of HTTPS.

Furthermore, UA allows any client/server interaction to be audited and traced, which is useful and often required in regulated environments.

The Benefits of OPC Unified Architecture

The first clear benefit of OPC UA is the decoupling of the API from the wire. UA is designed to fit into Field Devices, Control Layer Applications, Manufacturing Execution Systems (MES), and Enterprise Resource Planning (ERP) applications. Its generic information model supports primitive data types (such as integers, floating point values, and strings), binary structures (such as timers, counters, and PIDs), and XML documents (which can be thought of as text-based structures). OPC UA delivers an interoperability standard that provides access from shop floor to top floor.

UA moved away from the Microsoft-centric security model to something that is more familiar to IT departments. Most of this is defined by the protocol or transport that is utilized. Although that protocol or transport may change dramatically in the future, it should have little effect on UA as it is specified today.

Lastly, UA supports an information model that can be extended and defined to interoperate with both the simplest and the most complex systems.
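As a concrete taste of these benefits, the sketch below uses python-opcua, one of several community OPC UA client libraries (not an OPC Foundation product). The endpoint URL, the commented security string, and the NodeId are illustrative assumptions for a local test server.

```python
from opcua import Client

client = Client("opc.tcp://localhost:4840/example/")   # hypothetical endpoint

# Optional secure channel parameters (certificate and key paths are assumptions):
# client.set_security_string("Basic256Sha256,SignAndEncrypt,client_cert.pem,client_key.pem")

client.connect()
try:
    # Discover objects in the server's fully-integrated address space
    for node in client.get_objects_node().get_children():
        print(node.get_browse_name())

    # Read the value attribute of one object (the NodeId is an assumption)
    temperature = client.get_node("ns=2;s=Demo.Temperature")
    print(temperature.get_value())
finally:
    client.disconnect()
```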
OPC Unified Architecture Environments

As described earlier, OPC UA knows no boundaries. Its environment can be anywhere from the plant floor to the Internet. It can work on any platform, and be used in markets even where OPC is not known today. It will be exciting to see what the future holds for this technology.

It is interesting to note that OPC .NET was developed as a layer that can sit on top of an unmodified OPC Classic application. Vendors are encouraged to take the reference implementation wrapper and brand it on a per product basis. With vendors deciding where to put their efforts, this approach greatly simplifies the decision on how to adopt OPC .NET.
Conclusion
© 2016, PTC Inc. All rights reserved. Information described herein is furnished
for informational use only, is subject to change without notice, and should
not be taken as a guarantee, commitment, condition or offer by PTC. PTC, the
PTC logo, Kepware, KEPServerEX and all other PTC product names and logos
are trademarks or registered trademarks of PTC and/or its subsidiaries in the
United States and other countries. All other product or company names are
property of their respective owners.