Blue Yonder aims for ‘one speed’ supply chains with new AI and knowledge graph updates
Summary:
Blue Yonder’s latest product updates are a culmination of work that’s been happening behind the scenes. The foundations are now in place for buyers to reconsider how they think about their supply chains.
As was highlighted by diginomica in a recent interview with Blue Yonder CEO Duncan Angove, the supply chain vendor has been going through a radical product transformation - backed by significant investment from its owners, Panasonic. Blue Yonder’s ambition is to shift enterprises away from stitching together ‘best in class’ supply chain products, where data interoperability is challenging and real-time insights are hard to get to, towards an interoperable suite of products, which are now built on Snowflake’s data cloud.
Blue Yonder’s argument has been that, historically, enterprises have had to rely on a spaghetti network of applications sitting in disparate silos, which makes it difficult to respond to macro changes in the market, disruption, and various areas of risk. By building a new interoperable suite of applications on Snowflake, buyers can enable data sharing and collaboration, as well as take advantage of new workflows and AI/ML technologies.
This work has been ongoing for a number of years and a recent product release from the vendor highlights a culmination of many of these behind the scenes efforts. Key to the latest product announcement is how the data flowing between different supply-chain functions is advancing Blue Yonder’s knowledge graph capabilities, which should allow business users to better understand the relationship between different supply chain functions (something that has been hard to do ‘on the fly’).
diginomica got the chance to sit down with Gurdip Singh, Blue Yonder’s Chief Product Officer, to discuss the latest release. He said:
One of the most fascinating things that's happening throughout the globe, and most certainly in the markets we serve, is we are being asked by customers to help deal with this problem of uncertainty. It’s being compounded with all of the events that have occurred around us. And what that has meant is that this old paradigm, where it was okay for applications to work in their individual data silos, is no longer tenable.
The clock speed of decision making that businesses now need to operate, both within the four walls of the enterprise and beyond it into their suppliers, and perhaps their supplier’s suppliers, just means that you need to have the construct of all of the data available in a manner where it actually is connected to, and seamlessly available for, these disparate decision and execution services to communicate and talk.
It’s clear from Singh’s comments here that the key thing to understand about Blue Yonder’s product strategy is that better use of data is needed to allow supply chain managers to respond more quickly to risks that emerge. You can read the product announcement in full here, but some of the key components include:
- Integrated demand and supply planning with AI and ML - Blue Yonder has integrated demand and supply planning processes into a single ‘motion’. The solution claims to simultaneously analyze changing demand patterns and supply availability to better support planning for efficiency, resiliency and service level attainment. This release also provides demand planners with AI- and ML-based recommendations to understand available inventory to service specific customers and regions.
- Enhanced integrated business planning - AI and ML updates provide ‘near real-time recommendations’ to drive alignment across supply chain and financial goals.
- Single view of online and in-store inventory - planners can now update assortment and strategic space to ‘bridge the gap’ between online and offline worlds to better meet demand and reduce inventory holds.
- Enhanced returns orchestration - integrations between warehouse management, commerce and returns management combine order data with details on returning stock to improve inventory control.
- AI-tuned data model for ongoing intelligence - updates to Blue Yonder’s common data model now allow embedded AI and ML to deliver more accurate and explainable insights, as well as faster recommendations and root cause analysis. This will be the basis for linking to external, generative AI agents that can engage with Blue Yonder and third party data.
- Custom ML - ML Studio is now available for customers to meet custom data model requirements, providing the ability to test different models.
Speaking to the common data model that Blue Yonder has deployed and the capabilities of an interoperable suite of applications, Singh said:
This release, in a very comprehensive way, both for manufacturers and for retailers, but also for logistic service providers, [allows them to say]: I have a demand problem, my sales are falling short and I'm going to apply the price lever, I can see right there and then whether I have the inventory or manufacturing capacity at a supplier level to support the increase.
And on the other hand, if there was a supply problem and the warehouse couldn't deal with the influx of inventory, you now need to be able to shape your customers’ actions, taking into account what's happening in the warehouse, what's happening in your logistics network, what's happening in your supplier network.
This release basically says, not only is the data there, which is something that we've been working towards, but our ability to respond to events, as opposed to thinking about these as batch processes, or even micro batch processes - a business can now choose to manage their supply chain at a latency of their choosing.
The role of AI agents
It’s clear from Blue Yonder’s approach that an integrated data model allows enterprise buyers to be more responsive across their supply chains - both in terms of demand and supply planning, but also in terms of risk assessment.
Singh explained that the complexity and demands being placed on supply chains - and the increasing waves of data being made available - make it almost impossible for humans to scale alongside. And even if an enterprise could hire enough people to scale with the demands of supply chain management, those people probably wouldn’t be the experts needed to make the right decisions.
This is where Singh and Blue Yonder see AI and ML as being essential to the future of supply chains:
The other half of this has been to bring the power of machine learning, both predictive and generative, to allow for, on the one hand, a lot of these decisions to be automated, so you don't actually need human intervention; and then, on the other hand, setting the stage for a number of generative AI agents to now become available.
Blue Yonder is using third party risk services, for example weather data or natural disaster reports, to feed into the Blue Yonder repository, with that information being applied across the logistics and inventory network. This is where AI agents could play a role. Singh said:
Using that risk, we know there's a fire in LA, or bad weather that's happening somewhere in some other part of the world, that's going to impact the flow of inventory - which specific purchase orders are going to get impacted because of that?
Because we now know that a natural disaster somewhere else is happening at a certain point in time, and we know which purchase orders are expected to flow through at that time, we can then take that risk, and in an agentic framework, allow for that risk to be applied to the network. We can understand, using an agent, which purchase orders and shipments are impacted, and then using a separate agent, evaluate what options may be available. And then make a recommendation.
Without it, there just aren't enough people to keep track of everything that's going on throughout the globe, which is the reality of the supply chain today. In a very true way, we're allowing for the first time, real capabilities in systems to deal with the uncertainty and to do it at scale, leveraging the power of machine learning and leveraging the power of the data cloud that we've been building towards.
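Singh’s description of handing a risk event to one agent that finds the impacted purchase orders, and a second agent that evaluates options, can be sketched as a simple pipeline. This is purely illustrative - the names (`PurchaseOrder`, `impact_agent`, `options_agent`) and the toy policies are assumptions for the sake of the example, not Blue Yonder APIs:

```python
from dataclasses import dataclass

# Hypothetical sketch of the two-agent flow described above:
# a risk event is matched against in-flight purchase orders,
# then options are proposed for the impacted ones.

@dataclass
class PurchaseOrder:
    po_id: str
    lane: str       # transport lane the order flows through
    eta_day: int    # expected arrival day

@dataclass
class RiskEvent:
    lanes: set      # lanes disrupted by the event
    start_day: int
    end_day: int

def impact_agent(orders, event):
    """First 'agent': flag purchase orders whose lane and timing overlap the risk."""
    return [po for po in orders
            if po.lane in event.lanes and event.start_day <= po.eta_day <= event.end_day]

def options_agent(impacted):
    """Second 'agent': propose an option per impacted order (reroute, expedite, ...)."""
    return {po.po_id: "reroute" if po.eta_day <= 2 else "expedite" for po in impacted}

orders = [PurchaseOrder("PO-1", "LA-port", 1),
          PurchaseOrder("PO-2", "LA-port", 5),
          PurchaseOrder("PO-3", "NY-port", 1)]
fire = RiskEvent(lanes={"LA-port"}, start_day=0, end_day=3)

impacted = impact_agent(orders, fire)
recommendations = options_agent(impacted)
print(recommendations)   # only PO-1 overlaps the disrupted lane and window
```

The point of the separation is the one Singh makes: impact detection and option evaluation are distinct reasoning steps, so they can be delegated to distinct agents and scaled independently.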
The role of the knowledge graph
What’s particularly compelling about Blue Yonder’s approach to supply chain management is how it’s combining its common data model approach with an abstracted knowledge graph that incorporates process intelligence. diginomica has highlighted on a number of occasions that for AI agents to move beyond hype towards actual usability, process understanding and data context are essential.
Blue Yonder appears to be tackling this head on. Singh provided an example of an integrated retailer that is using the vendor for financial planning, forecasting, and allocation and replenishment. He said the retailer had been bringing data in at different levels of granularity. For instance, financial planning decisions are made at a monthly, quarterly or weekly level, but not at the lower level of ‘style, color, size or day’ - whereas allocation decisions are made at exactly that ‘style, color, size or day’ level. Singh said:
That’s no longer the case. You can now bring the data in at the lowest level, because not only are we saying there's a common way to bring the data in, but we're also saying, and this is equally important, there is a knowledge graph that lives on top of this common data that allows you to say, for financial planning, ‘my processes require this cadence and this level of detail and these are the business relationships that I care about’.
Not only have we brought the data together, we've also put in place this semantic layer that lives on top of the underlying data, that gives businesses this flexibility to make decisions quickly and then to see that immediately being impacted, rather than it being an integration change.
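The idea of one common dataset at the lowest grain, with each planning process declaring the cadence and level of detail it needs, can be sketched as follows. This is an assumption-laden toy (the rows, grains and the `rollup` helper are invented for illustration, not Blue Yonder’s data model):

```python
# Illustrative sketch: one common table at the lowest grain, with per-process
# "views" that roll it up to whatever cadence each planning function declares.

rows = [
    # (month, day, style, color, size, units)
    ("2024-01", "2024-01-03", "tee", "red",  "M", 4),
    ("2024-01", "2024-01-09", "tee", "blue", "L", 6),
    ("2024-02", "2024-02-02", "tee", "red",  "M", 5),
]

def rollup(rows, key):
    """Aggregate the common data to the grain a process declares it needs."""
    out = {}
    for row in rows:
        out[key(row)] = out.get(key(row), 0) + row[-1]
    return out

# Financial planning: monthly cadence, no style/color/size detail.
finance_view = rollup(rows, key=lambda r: r[0])
# Allocation: day + style/color/size grain, read from the same data.
allocation_view = rollup(rows, key=lambda r: (r[1], r[2], r[3], r[4]))

print(finance_view)   # {'2024-01': 10, '2024-02': 5}
```

Both views derive from the same rows, which is the substance of Singh’s claim: changing a process’s cadence becomes a declaration against the semantic layer rather than an integration change.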
This context provided by the knowledge graph allows Blue Yonder to understand the relationship between sourcing and supply, or consumption of raw materials and the finished product. This ‘knowledge’ typically gets captured in some bespoke manner in a table with relationships and foreign keys, and so on. Now, however, users can see that knowledge abstracted to a higher level. Singh said:
You can now capture that information and then use it for business decision making. Most supply chains don't operate at one speed. Just as an example, if you think about a product that's being sold in just the retail channel, versus a product that's being sold online in a marketplace and in the store, versus a product that's sold in the wholesale channel - customers have different supply chains for each of these different channels.
The product might be the same, but I need to have different speeds and gears in terms of inventory policies, flow policies, to manage that flow of product through my supply chain. I need to forecast for it differently as well. But all of a sudden now, if I capture that nuance in my relationship in a knowledge graph, I can apply, without coding it into a SQL statement or embedding it into an integration somewhere, a semantic construct that says: I have omni-channel products, I have wholesale products, and I have retail products.
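Singh’s ‘different speeds per channel’ construct can be illustrated with a minimal graph of product-to-channel relationships, where inventory policy is resolved from the relationship rather than hard-coded in SQL or an integration. All names and policies here are hypothetical, for illustration only:

```python
# Minimal knowledge-graph sketch: (subject, relation, object) triples relate
# the same product to different channels; each channel carries its own "speed".

edges = [
    ("tee-shirt", "sold_via", "retail"),
    ("tee-shirt", "sold_via", "marketplace"),
    ("hoodie",    "sold_via", "wholesale"),
]

channel_policy = {                 # one inventory policy per channel, declared once
    "retail": "weekly-replenish",
    "marketplace": "daily-replenish",
    "wholesale": "bulk-monthly",
}

def policies_for(product):
    """Walk the graph: the channels a product is sold_via, and their policies."""
    return {ch: channel_policy[ch]
            for subj, rel, ch in edges
            if subj == product and rel == "sold_via"}

print(policies_for("tee-shirt"))
# {'retail': 'weekly-replenish', 'marketplace': 'daily-replenish'}
```

Changing a product’s channel mix, or a channel’s policy, is then an edit to the graph, not a code change - which is the flexibility the quote is describing.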
The benefits of this should be felt immediately by supply chain managers, Singh said, as they are presented with opportunities that they may not have previously considered. But it also opens a path to a more sophisticated use of AI agents too:
There's a whole conversation to be had around how, as we get into agentic frameworks and leverage the relationships that are now available, we move away from what would have been rules of thumb, or business practices that people have been doing and relying on.
We can now go back and say: ‘you may have thought that you were sourcing these products always in this way, but guess what, because of reliability or other reasons, six times out of 10 you're overriding it. And quite literally, you have an opportunity to do a completely different negotiation’ - which we couldn't have told them before.
So there's a plethora of different ways in which we are beginning to see the benefit of that knowledge graph being applied. There's lots more to come as we go forward, as we apply generative AI and agents to exploit that information.
My take
When talking to enterprise buyers in recent years, one of the things that is often highlighted as a ‘top concern’ or as ‘high risk’ is their supply chain. Blue Yonder is absolutely correct in its argument that buyers want more responsive, interoperable systems - ones which flag emerging risks and can offer solutions to unforeseen problems. It is also true that the human capital required to do this often can’t reach the scale of the challenge. ML and AI could offer a path to helping enterprises achieve some of this (and I appreciate that Blue Yonder has focused on getting the building blocks in place to make that happen, rather than launching ‘AI agents everywhere’ because that’s what the market wants to hear).
However, as Duncan Angove has noted himself, the big challenge now is a change management one. Organizations have, for many years, been operating their supply chains in silos and this has led to embedded practices and fragmented data. Overcoming that to target a ‘one speed supply chain’ is a huge education process - one that I know Blue Yonder is considering and investing in internally.
That being said, the use cases will speak for themselves and we will be speaking to customers in the coming months about how they are using AI and ML to become more responsive.