Why the supply chain needs API-led connectivity: data complexity

Managing complexity is at the heart of supply chain strategy. Whether it’s in logistics, planning, procurement, or warehousing, supply chains provide firms with a competitive advantage by adding intricate, automated, or resource-heavy capabilities that can’t be easily replicated.

Nowhere is complexity more visible than in the supply chain data a firm keeps, such as vast stock keeping unit (SKU) libraries, bills of materials (BOMs) with thousands of parts, high-volume transactional ledgers, and partner inventories, to name just a few.

There’s so much complexity that few firms have figured out how to use this data to their advantage. Instead, it sits siloed, inaccessible, and even untrusted. With widespread analytics initiatives and mergers and acquisitions (M&A), the problem is only getting worse, in spite of large investments in IT systems and talent.

According to PwC’s Industry 4.0 survey, only 9% of companies have real-time, end-to-end integration and planning platforms, and only 18% are digitally connected with external partners. Data and integration complexity and cost are usually the biggest impediments.

Fortunately, in recent years a new technological path has emerged to drive ROI, reduce risk, and improve time-to-value results in digital supply chain initiatives: API-led connectivity (ALC).

Forecasting failure

Before getting into ALC, let’s first look at a case study of a global device remanufacturing company that in 2018 made a multi-million dollar investment in digital supply chain transformation. The firm’s supply chain was — and is — a competitive advantage because of its complexity, creating strong value for its customers and a high barrier to competitors.

After years of spreadsheet-based supply and demand planning, a project was launched to capture the value of statistically- and ML-driven planning algorithms. With a couple of billion dollars of annual spend on parts and materials, even a two to five percent improvement in sourcing and inventory costs could represent tens of millions in annual savings.
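The savings math above is worth making concrete. A quick back-of-the-envelope calculation, using the round figures from this paragraph:

```python
# Back-of-the-envelope savings estimate for the planning project.
annual_spend = 2_000_000_000  # roughly $2B annual parts and materials spend

for improvement in (0.02, 0.05):  # 2% to 5% sourcing/inventory improvement
    savings = annual_spend * improvement
    print(f"{improvement:.0%} improvement -> ${savings / 1e6:,.0f}M annual savings")
```

Even at the low end of the range, the project pays for a multi-million dollar investment many times over, which explains the strong executive sponsorship described below.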

Global advisory and integration firms led the roughly 12-month effort, which included the implementation of a leading cloud-based advanced planning software package. Project work focused on implementing the software, refining the statistical models (e.g. ARIMA using R), and capturing the planning teams’ specialized forecasting requirements. With talented teams and unwavering executive support, the project seemed on track for success.
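To give a flavor of the statistical forecasting work, here is a minimal stand-in sketched in Python (the project itself used R, per the paragraph above): fitting an AR(1) model, the simplest ARIMA(1,0,0) case, to a demand history by ordinary least squares. The demand series is synthetic and for illustration only.

```python
# Fit y[t] = c + phi * y[t-1] by ordinary least squares, then forecast
# the next period -- a toy version of ARIMA-style demand planning.
def fit_ar1(series):
    """Estimate intercept c and autoregressive coefficient phi."""
    x = series[:-1]   # lagged values
    y = series[1:]    # current values
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var = sum((a - mean_x) ** 2 for a in x)
    phi = cov / var
    c = mean_y - phi * mean_x
    return c, phi

def forecast_next(series):
    """One-step-ahead forecast from the fitted AR(1) model."""
    c, phi = fit_ar1(series)
    return c + phi * series[-1]

demand = [100, 104, 103, 108, 110, 109, 114, 116]  # synthetic monthly demand
print(forecast_next(demand))
```

Real planning packages layer seasonality, differencing, and ML-driven variants on top of this idea, but the core loop is the same: fit on history, forecast forward, and refit as new data arrives, which is why a reliable, automated data feed matters so much.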

Yet two years later, the project had still not fulfilled its promise — and it was increasingly unclear if it ever would. Why, and could this have been predicted?

It’s about the data

One of the best ways to understand API-led connectivity is to understand what happens in its absence.

Planning systems require huge amounts of quality and (ideally) real-time data. In the case study above, relevant data was distributed across dozens of systems, so it had to be retrieved, transformed, and consolidated for consumption by the planning system. This process had to be operational and automated before anything else could “work.”
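The retrieve-transform-consolidate step described above can be sketched in a few lines. This is an illustrative toy, with the source systems stubbed as in-memory records and all system names and field names hypothetical; in the real project each source was a separate live system with its own schema and access path.

```python
# Sketch of the consolidation step a planning system depends on: pull
# inventory records from two source systems with different schemas,
# normalize field names, and merge on SKU.
erp_records = [
    {"sku": "A-100", "on_hand_qty": 40, "unit_cost_usd": 12.5},
    {"sku": "B-200", "on_hand_qty": 15, "unit_cost_usd": 3.2},
]
wms_records = [
    {"SKU_CODE": "A-100", "QTY": 12},   # warehouse system's own schema
    {"SKU_CODE": "C-300", "QTY": 7},
]

def consolidate(erp, wms):
    """Merge ERP and WMS views of inventory into one record per SKU."""
    merged = {r["sku"]: {"sku": r["sku"],
                         "erp_qty": r["on_hand_qty"],
                         "wms_qty": 0,
                         "unit_cost_usd": r.get("unit_cost_usd")}
              for r in erp}
    for r in wms:
        sku = r["SKU_CODE"]  # transform: map source fields to one schema
        entry = merged.setdefault(sku, {"sku": sku, "erp_qty": 0,
                                        "wms_qty": 0, "unit_cost_usd": None})
        entry["wms_qty"] += r["QTY"]
    return sorted(merged.values(), key=lambda e: e["sku"])

for row in consolidate(erp_records, wms_records):
    print(row)
```

Multiply this by dozens of systems, each with its own schema, access method, and gatekeepers, and the scale of the "getting the data" problem becomes clear.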

Unfortunately, this data integration challenge was not the priority of the consulting teams, planning teams, or executives sponsoring the effort. In fact, integration planning was considered so secondary to the statistical modeling and analytics work that it began months after the other project workstreams.

Getting the data turned out to be hard. With no standardized or advanced integration toolset, a hodgepodge of tools was used, including data warehouses, orchestration tools, and reams of custom code. Each data source required specialized IT talent to access, and that talent was often backlogged, forcing project managers to fight for resources.

Planning system algorithms are iterative by nature, so integration requirements changed throughout the course of the project; in time, multiple algorithms were implemented to give planners flexibility in creating “what-if” scenarios. These changes wreaked havoc on the largely custom-coded integration effort.

Even when the project’s first phase was delivered five months late, the planning team did not fully trust the data. IT teams were still catching up with change requests, slowing down phase two work and encouraging teams to cut corners.

A better path

This case shows the risk of treating data and integration complexity as an afterthought. So how does a firm proactively construct a more complete integration strategy and protect such critical investments? API-led connectivity offers a strong, modern model to consider.

Simply stated, ALC removes integration complexity by “productizing” how a firm accesses source systems and their data. This makes it easier for supply chain teams (and IT) to find and consume data, even with no expertise in source systems. They just have to tie into the appropriate productized system API and orchestrate data flows — in many cases with little or no code.

ALC also uses “modern APIs,” meaning APIs are more than just gateways to source system data. System connectivity and data transformation can be built into the productized API itself, further reducing complexity and improving security. The results? ALC has been shown to speed up IT projects by three to five times or more, and it can have a tremendous impact on long-term manageability as requirements evolve.
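The idea of building connectivity and transformation into the API itself can be made concrete with a small sketch. Everything here is hypothetical: the legacy fetch function stands in for a real source system (its SAP-style field names are illustrative), and the "system API" is a toy showing the contract consumers would see.

```python
# A "productized" system API in the ALC sense: connectivity to the source
# system and data transformation live inside the API, so consumers get one
# stable, documented contract instead of the legacy schema.

def legacy_erp_fetch(sku):
    """Stand-in for a legacy source system with an awkward native schema."""
    return {"MATNR": sku, "LABST": "42", "MEINS": "EA"}  # SAP-style fields

def inventory_system_api(sku):
    """Productized system API: hides connectivity, normalizes the payload."""
    raw = legacy_erp_fetch(sku)
    return {                      # transformation built into the API itself
        "sku": raw["MATNR"],
        "on_hand_qty": int(raw["LABST"]),
        "unit_of_measure": raw["MEINS"],
    }

# A planning team consumes clean data with no source-system expertise:
print(inventory_system_api("A-100"))
```

The design point is that the consumer never sees `MATNR` or `LABST`; if the source system changes, only the API's internals change, and every downstream data flow keeps working.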

No complex data integration project is easy to execute, but it is easy to imagine how the above case might have turned out differently with ALC.

Executive (re)vision

In the above case, also consider that the supply chain organization owned and funded the project, while the integration work was delivered by a separately operated IT organization.

Everyone in IT knew how difficult the data work would be, yet during project planning they had no clear visibility into its complex data requirements. The supply chain team knew it had no visibility into, or control over, the data integration solution it depended on, but it trusted IT leadership to deliver on the timeline. This turned out to be a major planning failure.

ALC offers a fundamentally different approach to fix this without changing the organization. ALC’s methodology clarifies IT’s role as the producer of productized integration assets (and their data) for supply chain teams to consume.

Now project planning across organizations is about how the integration platform enables the requisite data flows, not how the data is accessed from each source system. Over time, across multiple projects, the impact of this role alignment is game-changing, and it is part of why ALC has exploded in recent years across enterprise IT and is gaining presence in digital supply chain investment.

Supply chains are more complex and volatile than ever, so why not take the complexity out of data integration? Learn more about leading firms who have seen phenomenal supply chain modernization results with ALC.


