The Interoperability Illusion

Pharmaceutical Commerce, June 2019, Volume 14, Issue 2

Your serialization systems aren’t as interoperable as you might think


Fig. 1. Where data becomes reality: in-line marking of pharma packages. Credit: Videojet

Interoperability. This is how the internet derives its usefulness. When you purchase a smartphone, tablet or computer, you’re not worried about whether you’ll be able to connect to Google or Amazon, for example. You expect those connections to happen, and they do. Interoperability, in short, is a good thing. It’s such a good thing, in fact, that the word “interoperable” was written into the law that has driven serialization efforts in the United States for the last six years.


The Drug Supply Chain Security Act (DSCSA), part of the Drug Quality and Security Act passed in 2013, is the law that contains the regulatory specifics around pharmaceutical serialization. Among other things, the DSCSA spells out who must serialize, what must be serialized, and how serialized goods are to be handled by authorized trading partners. The net intent of the law is to create an interoperable network of systems that can verify the authenticity of drugs as they travel through the supply chain and, ultimately, to the consumer.

Having said that, aside from some specifics around the barcode symbology to be used for serialization, the DSCSA is devoid of detail about the standards to be used during manufacturing, during data transmission and for overall interoperability among serialization systems. The government essentially told industry, “You figure it out”… and this has led to some problems.

Before we dive into the interoperability being experienced today, let’s briefly revisit how we got here, to provide some context for what follows. Before the federal serialization mandates and the California “e-Pedigree” act, there was the Wal-Mart Radio Frequency ID (RFID) mandate. Odd as it may seem, the Wal-Mart RFID mandate was the genesis of many of the standards we use in pharmaceutical serialization today. In 2003, Wal-Mart mandated that its top 100 suppliers affix a passive ultra-high-frequency RFID tag to each case and pallet of every product shipped. The effect on tag suppliers, reader manufacturers, RFID software vendors and the consulting space was massive, and during the early phases of the mandate it quickly became evident that prior RFID standards, such as PML (originally developed by MIT’s Auto-ID Labs), were going to be inadequate. There needed to be an industry-led standards body to align all players.

EPCglobal to the rescue

As a result, the EPCglobal group was formed with GS1 to develop and maintain a family of interoperability standards for an RFID-based ecosystem. Out of this emerged the first version of the Electronic Product Code Information Services (EPCIS) standard, along with some other lesser-known standards. The Wal-Mart mandate was later cancelled, leaving behind the formation of EPCglobal and the development of EPCIS as artifacts.

Because it was created primarily to handle RFID data, EPCIS does not mirror the real world for barcoding the way it does for RFID. For example, to meet DSCSA compliance, manufacturers are required to label their goods exclusively using GS1 barcode symbologies and encodings, yet the EPCIS protocol uses an abstract Uniform Resource Name (URN) format, along with Instance Lot Master Data (ILMD), to express what data is present within a given serialized barcode. To arrive at a well-formed EPCIS URN, certain barcode values are dropped, reshuffled and repositioned, such that the end result is unrecognizable relative to what is actually on the barcode. Why is this? Due to their origins in RFID, the URN formats we use today were largely developed because RFID readers reported tag data as long hexadecimal values, making any GS1 data embedded on a tag, such as a GTIN or SSCC, completely unrecognizable to the human eye and ill-suited for certain types of machine logic.
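To make the reshuffling concrete, here is a minimal sketch (in Python, with illustrative values) of how a GTIN-14 plus serial number from a barcode is rearranged into an SGTIN EPC URN: the check digit is dropped, the indicator digit is repositioned, and the company-prefix length, which the barcode itself does not carry, must be supplied from external master data.

```python
# A minimal sketch of GS1 barcode data -> EPC "pure identity" URN conversion,
# illustrating why the URN no longer resembles the printed barcode.
# Assumes a known company-prefix length (7 here); GS1 does not encode this
# in the barcode itself -- it must come from external master data.

def sgtin_urn(gtin14: str, serial: str, company_prefix_len: int = 7) -> str:
    if len(gtin14) != 14 or not gtin14.isdigit():
        raise ValueError("expected a 14-digit GTIN")
    indicator = gtin14[0]          # packaging-level indicator digit
    body = gtin14[1:13]            # company prefix + item reference
    # the GS1 check digit (the last digit) is simply dropped from the URN
    company_prefix = body[:company_prefix_len]
    item_ref = body[company_prefix_len:]
    # the indicator digit is repositioned to the front of the item reference
    return f"urn:epc:id:sgtin:{company_prefix}.{indicator}{item_ref}.{serial}"

# Barcode element string: (01)10312345678902(21)4711  -- illustrative values
print(sgtin_urn("10312345678902", "4711"))
# -> urn:epc:id:sgtin:0312345.167890.4711
```

Nothing in the resulting URN visually matches the element string printed beneath the barcode, which is precisely the debugging problem described next.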

In addition, the structure of the data physically encoded on an RFID tag correlates strongly with its contextual representation in an EPC URN, so arriving at a human-readable format such as the EPC URN makes perfect sense in the context of an RFID protocol. In barcoding, however, there is a loss of fidelity between what is ultimately on a given barcode and that barcode’s URN representation. Over the years, this has led to confusion and difficulty when debugging problems with systems and labels, and when working with various types of barcode scanners in the field.

Since its initial release, EPCIS has been revised twice, and supplementary standards have been developed to enhance the expression of supply-chain business context within EPCIS messages. For example, shortly after EPCIS 1.1 was released, the GS1 US Healthcare (USHC) standard was developed. With EPCIS 1.2, the GS1 Core Business Vocabulary (CBV) and the core EPCIS standard were expanded to supplant much of the USHC standard. In general, both were needed developments and noteworthy efforts; however, each of these supplementary standards, in its own way, introduced unforeseen problems into the interoperability equation. While the EPCIS protocol was designed so that future versions remain backwards compatible (and they do), the USHC standard was implemented inside its own namespace (a mechanism that allows one protocol to exist within another), so, technically, USHC was never officially part of EPCIS.

In a similar fashion, the CBV changes that accompanied EPCIS 1.2 also fell within their own namespace and overlapped significantly with USHC in expressing certain types of business context. Fast-forward to today and you have a situation in which systems have two perfectly good standards that can express identical business conditions. Some use both, some use one. Some implement both incorrectly; some implement them correctly but within different versions of EPCIS; some skip them altogether and implement custom namespaces to solve the same problems. To handle all of these variances in the “standard,” serialization systems now need to understand multiple flavors (not to be confused with versions) of EPCIS to accomplish what is often the same business objective.
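As an illustration of the “flavors” problem, consider the same lot number expressed under two different namespaces inside otherwise-identical EPCIS payloads. The namespace URIs below are placeholders, not the actual GS1 URIs; the point is the shape of the problem, not the exact strings.

```python
# Two ILMD fragments carrying the same lot number under different namespaces.
# The URIs are illustrative placeholders, not the real GS1 namespace strings.
import xml.etree.ElementTree as ET

USHC = "http://example.com/ushc"   # stand-in for the USHC namespace
CBV = "http://example.com/cbv"     # stand-in for the CBV namespace

event_a = f'<ilmd xmlns:ushc="{USHC}"><ushc:lotNumber>AB1234</ushc:lotNumber></ilmd>'
event_b = f'<ilmd xmlns:cbv="{CBV}"><cbv:lotNumber>AB1234</cbv:lotNumber></ilmd>'

def lot_number(xml: str) -> str:
    """A receiving system must probe every known 'flavor' for the same fact."""
    root = ET.fromstring(xml)
    for ns in (USHC, CBV):
        node = root.find(f"{{{ns}}}lotNumber")
        if node is not None:
            return node.text
    raise ValueError("lot number expressed in an unrecognized namespace")

assert lot_number(event_a) == lot_number(event_b) == "AB1234"
```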

Fig. 2. Serialization systems, demonstrating the need for interfacing among VRS systems.

Bring in the connectors

Due to the above, it is currently more the rule than the exception for serialization platforms to carry multiple “connectors” that handle all of the permutations of these standards, malformed expressions of the standards and even vendor-specific “standards.” The net effect is that every time a vendor developed a connector to get around an EPCIS, USHC, CBV or platform-specific issue, the ecosystem in which serialization systems must interoperate did not self-correct, improve or become more efficient. In fact, the opposite has occurred: today we are no closer to true standards-based interoperability between systems than we were five years ago.
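In practice, a “connector” is little more than a per-flavor normalizer behind a common dispatch layer. The sketch below shows the pattern (the flavor names and detection rules are hypothetical), and also why it scales badly: every new variant means another entry in the table.

```python
# A sketch of the connector pattern: per-flavor normalizers behind one
# dispatcher. Flavor names and sniffing rules here are hypothetical.
from typing import Callable, Dict

NORMALIZERS: Dict[str, Callable[[str], dict]] = {}

def connector(flavor: str):
    """Register a normalizer for one flavor of inbound message."""
    def register(fn: Callable[[str], dict]):
        NORMALIZERS[flavor] = fn
        return fn
    return register

@connector("epcis-1.2-cbv")
def from_cbv(xml: str) -> dict:
    return {"flavor": "epcis-1.2-cbv"}    # real parsing elided

@connector("epcis-1.1-ushc")
def from_ushc(xml: str) -> dict:
    return {"flavor": "epcis-1.1-ushc"}   # real parsing elided

@connector("vendor-x")                    # a vendor-specific "standard"
def from_vendor_x(xml: str) -> dict:
    return {"flavor": "vendor-x"}         # real parsing elided

def ingest(xml: str) -> dict:
    # Crude namespace sniffing stands in for real flavor detection.
    if "ushc" in xml:
        flavor = "epcis-1.1-ushc"
    elif "cbv" in xml:
        flavor = "epcis-1.2-cbv"
    else:
        flavor = "vendor-x"
    return NORMALIZERS[flavor](xml)
```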

While EPCIS is used for supply chain events such as manufacturing, shipping and repackaging, number range distribution, the process by which systems share serial numbers for use during printing and manufacturing, has been widely implemented across industry with no official standard at all. For the most part, the gap has been filled using the connector approach coupled with what one might call the “do it for me” approach. The connector approach is identical to the one systems vendors have used to get around EPCIS message abnormalities; the “do it for me” approach is where number range information isn’t shared at all.
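Because no official standard governs the exchange, every number range interface ends up bespoke. A minimal, purely hypothetical request/response shape might look like the following; none of these field names come from any published protocol.

```python
# A hypothetical number-range allocation exchange. Field names are
# illustrative only; no published standard defines this interface.
import json
import secrets

def allocate_range(gtin: str, count: int) -> dict:
    """Provider side: return `count` randomized serial numbers for one GTIN."""
    return {
        "gtin": gtin,
        "serials": [str(secrets.randbelow(10**12)) for _ in range(count)],
    }

request = {"gtin": "10312345678902", "count": 3}
response = allocate_range(request["gtin"], request["count"])
print(json.dumps(response, indent=2))
```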

The “do it for me” approach has been adopted where parties have thrown their hands up in frustration, found the integration effort too overwhelming right off the bat, or had exclusive business relationships that made the probability of a product being manufactured elsewhere remote. In conjunction with many software vendors and industry sponsors, OPEN-SCS, part of the OPC Foundation, did develop a series of protocols that attempted to address the number range issue; however, due to the overwhelmingly slow progress of the protocols’ development and looming FDA deadlines, industry forged ahead without OPEN-SCS.

Lack of reference implementations

Many believe that the absence of the reference implementations and open-source protocol stacks typical of most industry-specific landscapes has contributed to the lack of system interoperability we see today. For example, before the internet went mainstream, the US Department of Defense had already put its full weight behind the uniform adoption of the open TCP/IP protocol suite. Similarly, the industrial control space was reshaped when Allen-Bradley opened up and shared DeviceNet, an industrial control protocol, with industry. In 1991, the first version of the Linux kernel, an open operating system, was released; today it is the de facto platform for IoT, cloud computing, machine learning and robotics. In each of these cases, there were both open standards and real-world systems available to serve as reference implementations.

Currently, the QU4RTET open-source project stands relatively alone as a widely available reference implementation of some of the more recent revisions of EPCIS and related standards. “Standards are pointless if they are not adopted and utilized. Open source brings together people on the premise of collaboration, transparency, standards and a belief that shared problems are solved faster. In addition, supported open source provides a means for the enterprise to consume rapid innovations in a supported manner. Red Hat Enterprise Linux is an example, where Red Hat contributes to the upstream innovations and makes them available in an open, verified, tested, documented and supported way that is backwards compatible,” said Atif Chaughtai, Chief Technologist of Healthcare for Red Hat (one of the world’s largest open-source software companies).

PDSA to the rescue?

There are some recent industry-led efforts underway to address these interoperability concerns. The Pharmaceutical Distribution Security Alliance (PDSA), a group of more than 30 companies and trade associations, has proposed an industry-run governance body to help standardize future interoperability efforts. (See: https://pdsaonline.org/white-paper-download/)

Such a group would be a great start in getting industry aligned and unified on solving the interoperability problems we face today, and it would help push the pace of interoperability, and of innovation, along. Having said that, until such a body is established and its principles adhered to, any new technology vendor that wishes to innovate in serialization by grabbing the current standards documentation and having a go at it will be in for a rude awakening.

ABOUT THE AUTHOR

Rob Magee is a graduate of Temple University and has been building and developing serialization systems and businesses for over 15 years. He is one of the founders and primary contributors of the SerialLab open-source initiative and currently runs the IT and Security practice for Vantage Solutions. He was the founder and president of Apostrophe Systems, which became part of Systech International in 2012, and has worked in the past with Optel Vision, Acsis, Systech and VeriSign as a serialization consultant and software systems architect.
