Food Safety Modernization Act requirements for traceability


by Frederic Clulow

What you need to know

On March 7th, GS1 US announced the publication of its guidelines to assist the food industry in implementing the new traceability requirements under Section 204 of the Food Safety Modernization Act (FSMA). GS1 is an international organization that standardizes the identification of products and the exchange of supply-chain data between parties. If you’ve ever scanned a barcode, you’ve likely used a GS1 standard.

Kezzler participated in the GS1 workgroup responsible for drafting the guidelines, along with many other industry partners and technology organizations. The issuance of these guidelines signals positive progress toward the practical implementation of the FDA’s vision of Smarter Food Safety. Why are these guidelines so important? To answer that, it helps to look at some of the challenges posed by traceability implementation.

What is the Food Safety Modernization Act (FSMA) Section 204?

If you are reading this, I assume you are familiar with Section 204, at least broadly. I can’t really imagine too many people reading a blog post on enhancing food traceability to improve food safety just for the fun of it. So, I will gloss over the particulars of the FSMA rule and get straight to the not-often-discussed complexities of implementing the rule, boiling them down to the essentials. 

FSMA Section 204 is a response to the growing food safety problem in the form of the Food Traceability Rule. As a part of the Smarter Food Safety initiative, the rule encourages companies in the food value chain to invest in digital traceability technology. Traceability contributes to everything from more effective recalls to the swift and safe removal of contaminated food products.

Advice on the what and when of FSMA data – but not the how of traceability

Despite the demand for traceability, the rule does not recommend any particular technology to achieve its traceability aims. It does not endorse a data format, database structure, or data exchange standard. It makes no requirements as to the how.

It does, however, make specific demands about what and when data must be collected. Thus, we know which data is to be captured for particular events at specific points throughout an item’s lifecycle. We also know, as a minimum requirement, that this data must be collated into an electronically sortable spreadsheet within 24 hours of a request.
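To make the "electronically sortable spreadsheet" requirement concrete, here is a minimal sketch of collating event records into a sortable CSV. The field names and values are illustrative placeholders, not the FDA's official KDE labels:

```python
import csv
import io

# Hypothetical event records as they might be pulled from internal systems.
events = [
    {"lot_code": "LOT-0042", "event": "shipping", "location": "DC-07",
     "timestamp": "2024-01-16T09:30:00Z", "quantity": "120 cases"},
    {"lot_code": "LOT-0017", "event": "receiving", "location": "PLANT-02",
     "timestamp": "2024-01-15T14:05:00Z", "quantity": "80 cases"},
]

def to_sortable_sheet(records):
    """Collate event records into a CSV string that opens as an
    electronically sortable spreadsheet (one KDE per column)."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["lot_code", "event", "location", "timestamp", "quantity"],
    )
    writer.writeheader()
    # Pre-sort by lot code, then time, so the sheet arrives ready to review.
    for row in sorted(records, key=lambda r: (r["lot_code"], r["timestamp"])):
        writer.writerow(row)
    return buf.getvalue()

print(to_sortable_sheet(events))
```

The hard part, of course, is not writing the CSV; it is getting the `events` list populated from many disconnected systems in the first place.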

Therein lies the rub. And why the GS1 guidelines are so important.

The reality of the supply chain and data flows

In the public imaginarium – and, it is fair to say, in the minds of many a professional – modern supply chains are paragons of efficiency fueled by complex, algorithmic, data-driven predictive analysis. The reality is far from that. Supply chains rely on a collection of independent or semi-independent operations united by the physical flow of goods. But while raw materials and products flow from one operation to the next, the same is not true of data.

[Figure: illustrative of an end-to-end connected supply-chain traceability paradigm; not necessarily reflective of FSMA 204 KDE collection requirements]

In general, product-relevant data remains bound to the physical operation and at a higher level, to the corporate entity that had custody of the product at the moment the data was created. The data seldom leaves those boundaries. The rule seeks to improve this by incentivizing the sharing of some key data elements and keeping the traceability lot code intact throughout. Even when operations are under the same corporate umbrella this does not guarantee the seamless amalgamation of data.

Overcoming lack of standardization to get to FSMA-compliant traceability data

Section 204 will require you to collate information stored in a WMS, an MES, and an ERP, at a minimum (and this list can be vastly expanded). But different events are stored in vastly different systems that simply don’t talk to each other. This is compounded by the fact that complex operations run multiple installs of each of these systems across different supply-chain nodes, often on different versions of the same software. Many of these solutions are not designed to support a centralized query and operate as standalone deployments (even if they are cloud based). To make matters worse, they usually speak different languages: even if you could get information out to another system, you would also need to make sure the receiving system can properly read it.
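One common answer to systems that don’t talk to each other is a normalization layer: an adapter per source system that maps its records onto a shared event schema, so one query can span all of them. The sketch below uses entirely hypothetical field names for the WMS and ERP exports:

```python
# Adapters map each system's native record shape onto one shared schema.
# All source field names (lotNo, batch_id, etc.) are hypothetical.

def from_wms(rec):
    return {"lot": rec["lotNo"], "step": rec["movementType"],
            "site": rec["warehouse"], "when": rec["ts"]}

def from_erp(rec):
    return {"lot": rec["batch_id"], "step": rec["transaction"],
            "site": rec["plant"], "when": rec["posting_date"]}

wms_events = [{"lotNo": "LOT-0042", "movementType": "shipping",
               "warehouse": "DC-07", "ts": "2024-01-16T09:30:00Z"}]
erp_events = [{"batch_id": "LOT-0042", "transaction": "goods_issue",
               "plant": "PLANT-02", "posting_date": "2024-01-16"}]

# One list, one schema: queryable regardless of the originating system.
unified = [from_wms(r) for r in wms_events] + [from_erp(r) for r in erp_events]
lots = {e["lot"] for e in unified}
```

Real adapters are messier (units, time zones, missing fields), but the principle is the same: translate once at the boundary, query in one vocabulary.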

In essence, what we have are data islands (more accurately, data silos, but bear with me here). Each island is home to a lot of relevant information, but that information has no way to get off the island. A recent Forrester Consulting study conducted for Rockwell Automation highlighted exactly this challenge: 35% of respondents cited a lack of sufficient standardization, and 36% cited the difficulty of stitching data together, as major hurdles to achieving traceability.

[Figure: illustrative of the data integrations needed for an end-to-end connected supply-chain traceability paradigm; not necessarily reflective of FSMA 204 CTE collection requirements]

Easing the challenges of FSMA 204

Back to rule 204. At first glance, the requirement to provide electronically sortable spreadsheets makes collecting this information seem easy. Perhaps this is true for a smaller operation. But the requirement can rapidly scale in complexity with size. Once you account for the key data elements (KDEs) that need to be collected, the number of steps that products covered by the rule go through, and the number of lots for which this information must be made available, data requests can generate hundreds if not thousands of data points.
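The scaling is simple multiplication, which is exactly why it sneaks up on people. A back-of-the-envelope sketch, with every number an illustrative assumption rather than a figure from the rule:

```python
# Back-of-the-envelope scale of a single records request.
# All numbers below are illustrative assumptions.
kdes_per_event = 6      # key data elements captured at each event
ctes_per_product = 5    # critical tracking events in a product's lifecycle
lots_in_scope = 40      # lots covered by one request

data_points = kdes_per_event * ctes_per_product * lots_in_scope
print(data_points)  # 1200 data points for one modest request
```

Even modest assumptions land in the four figures, and each of those data points may live in a different system at a different site.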

Good luck collecting all this data in 24 hours*.

Add to this the fact that the systems mentioned above are built to fulfill very specific roles. Corporations will often force an existing system (usually the ERP) to double as a traceability platform. The result is inadequate data output and a poor fit for the intended use. A system designed as a traceability layer ensures that the required data can be pulled together in one location. Granted, this adds complexity by introducing another application, but at least it is designed for the job.

Taming data chaos and breaking down data silos with standards

If we go back to the island analogy, we need to build a ferry system to transport the needed information, as well as the docking infrastructure to load and unload it. Of course, the entire system must be standardized and organized so that ships can dock, load, and unload at the right port with the right information.

This is precisely why the work undertaken by GS1 US and all members of this workgroup was so important. A common understanding and data exchange model sets the basis for industry-wide compatibility and interoperability regardless of technology. Different actors across the supply chain can now use a unifying syntax to share data irrespective of the systems they use.

At Kezzler we are big fans of this approach. My colleague Johan Borg penned a post on GS1’s Electronic Product Code Information Services (EPCIS) and the interoperable supply-chain future. While EPCIS adoption remains relatively low compared to EDI, a transition to EPCIS would provide a qualitative leap in CTE/KDE collection and facilitate the creation of an ecosystem that generates additional value from the data collected for FSMA 204 compliance.
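To give a flavor of that unifying syntax, here is a minimal EPCIS 2.0-style ObjectEvent sketched as JSON (built as a Python dict for illustration). The identifiers and locations are placeholders, not real EPCs; consult the GS1 EPCIS standard for the authoritative event structure:

```python
import json

# A minimal EPCIS 2.0-style ObjectEvent. All identifiers are
# illustrative placeholders, not real EPCs or locations.
event = {
    "type": "ObjectEvent",
    "eventTime": "2024-01-16T09:30:00Z",
    "eventTimeZoneOffset": "+00:00",
    "action": "OBSERVE",
    "bizStep": "shipping",          # CBV business step
    "disposition": "in_transit",    # CBV disposition
    "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"],
    "readPoint": {"id": "urn:epc:id:sgln:0614141.00001.0"},
}
print(json.dumps(event, indent=2))
```

Because every party emits and consumes the same event vocabulary, the receiving system knows what "shipping" means without a bespoke integration for each trading partner.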

But that may be a topic for another post.


*I fear this is a complexity many are overlooking. My suggestion for companies seeking to leverage technology to achieve traceability compliance is to evaluate the following elements for selection:

1. Data scalability: How many data points can the platform process without diminishing performance?

2. Low latency: How quickly can the platform retrieve an individual record and maintain performance with an increase in data complexity?

3. Interoperability: Is the platform designed to integrate with other systems, collecting and sharing data by leveraging common standards?