No, AI is not coming to your demand planning job




By integrating artificial intelligence and machine learning into S&OP processes, companies can become more agile in responding to changes in demand. Data sources that were previously difficult to analyze due to their complexity or sheer volume are becoming standard planning inputs.

For example, demand planners can use these tools to draw on data from sources such as online reviews, competitors’ ads, and even tweets.

With all that AI/ML can do to enhance the planning process, does this mean that the current role of demand planning is doomed? Should those currently working in the field of demand planning start looking for a new job?

Not so fast. While AI and ML provide new and powerful ways to manage demand inputs to the planning process, they also have some significant limitations. And I think we will need human demand planners to ensure that AI and ML are truly effective in the planning process. To better understand how these technologies and human demand planners can complement each other, let’s first look at some of the often overlooked limitations of these tools.

1. Patterns everywhere

The main way these tools can aid planning is by finding patterns in large amounts of complex data. As companies collect more and more data about their customers and businesses, the amount of data that needs to be analyzed to make good decisions exceeds what human demand planners can manage. By training AI/ML systems to find patterns buried in these mountains of data, companies can leverage data that was previously inaccessible due to volume or complexity.

But patterns can be a trap. Just because your customers have bought a lot of product every September for the past six years doesn’t mean the same thing will happen this year. Other factors that may influence this pattern, such as pricing, product features, and competitive products, need to be evaluated. So while AI/ML is great at surfacing these sales patterns, human input and research will be needed to determine whether it’s worth betting on the pattern repeating this year. In time, it may be possible to train the system to incorporate these additional sources of input. But until then, human insight will be needed to fill the gap.
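To make the trap concrete, here is a minimal sketch with made-up numbers of what a purely pattern-based forecast does: it averages the historical September spikes and projects another one, with no visibility into the factors a planner would check first.

```python
# Hypothetical September unit sales for the past six years.
september_sales = [1200, 1350, 1280, 1410, 1390, 1450]

# A purely pattern-based forecast: average the historical spikes
# and project the same spike for this year.
naive_forecast = sum(september_sales) / len(september_sales)
print(round(naive_forecast))  # 1347

# Nothing in this data captures pricing, product changes, or a new
# competitor -- exactly the inputs a human planner would review
# before betting on the pattern repeating.
```

The point is not that the averaging is wrong, but that everything that could break the pattern lives outside the data the model was trained on.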

2. Intelligence and common sense

Even when humans can deduce things from common sense, AI/ML can stumble. Consider this scenario: a man went to a restaurant, ordered a steak, and left a big tip. If you ask a friend what the man ate, they’ll probably say steak. But most AI/ML systems will struggle to reach this answer, because the statements never explicitly say what the man ate, only what he ordered. From experience, we know that what we order in a restaurant is usually what we eat. This kind of context-based, common-sense inference is difficult for AI/ML systems.

In planning scenarios, this can be a big problem. Suppose a customer always orders large quantities of a certain product for Thanksgiving, but then returns about 20% of it because it didn’t sell. Does this mean the customer doesn’t understand how much to buy? Or does it mean the excess quantity is needed to keep displays full throughout the sales season? AI/ML can’t answer these questions, but a human planner can reach out to the customer and assess what is really going on.
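As a hedged illustration with invented figures, the arithmetic here is trivial; what the numbers cannot do is explain themselves:

```python
# Hypothetical Thanksgiving order history for one customer.
ordered_units = 10_000
return_rate = 0.20  # roughly 20% comes back after the season

net_sell_through = ordered_units * (1 - return_rate)
print(int(net_sell_through))  # 8000 units actually sold

# The data alone cannot distinguish the two explanations: a customer
# who over-buys and a customer who needs full displays all season
# both produce the same 20% return rate. Only a conversation with
# the customer can tell them apart.
```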

3. There are limits to AI’s adaptability

One of the strengths of human intelligence is that the human mind can easily adapt to new information. If I tell you that a customer has just gone bankrupt, you will know from experience how this will affect your business. You can quickly adjust your processes to accommodate this change. AI/ML cannot react that quickly. These systems need to be retrained to know what to do in this situation.

And because each situation is slightly different, training provided for one scenario will only have limited application to subsequent scenarios. These systems require continuous training to become truly agile and adaptable.

4. AI cannot understand cause and effect

Humans instinctively understand cause and effect from experience. If you drop a glass onto a hard floor, it will break. Strictly speaking, though, the drop itself doesn’t break the glass; the impact with the floor does. Here’s another example: we know from experience that when the sun rises, the rooster crows. AI/ML has no problem learning this relationship. But when it comes to whether the rooster’s crowing causes the sun to rise or vice versa, these systems are stuck.
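A tiny sketch with synthetic, hypothetical data shows why pattern-finding alone cannot settle the rooster-and-sunrise question: the correlation coefficient is symmetric in its two arguments, so it carries no information about direction.

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired observations: minutes after 6:00 that the sun
# rose, and minutes after 6:00 that the rooster crowed, over six days.
sunrise = [2, 5, 9, 14, 20, 26]
rooster = [4, 7, 10, 16, 21, 28]

# The coefficient is identical in either direction, so a strong
# correlation says the series move together but nothing about
# which one (if either) drives the other.
print(pearson(sunrise, rooster) == pearson(rooster, sunrise))  # True
```

This is the formal version of the article’s point: the model sees that the two series move together, and that is all it sees.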

Using the customer bankruptcy example above, from our experience it is usually easy to assess possible causes such as lack of sales, high expenses, loss of funds, and increased competition. Our experience allows us to make these mental jumps with ease. However, without extensive training, AI/ML systems will have difficulty associating cause and effect.

5. AI lacks ethics

AI/ML systems reflect the biases and perspectives of the humans who trained them. They cannot distinguish between right and wrong. Programming these systems to reflect the complexity of human values and how to adapt these systems to different changing circumstances is extremely difficult. Therefore, allowing them to make certain types of decisions can be dangerous.

For example, when planning how much credit to extend to a customer, you can train a system to analyze the business factors that make a customer a good or bad business risk. However, these systems do not tell us how well our customers are managing their environmental and social impacts. If these factors are important to the decision, human input is required.

Complementary, not competitive

The AI/ML limitations described here may make these tools seem less useful than you initially thought. In fact, they can be very useful if you are working with large amounts of data and have the time, skills, and resources to properly train them. What they lack is what human planners can provide. In the best case, I believe that the combination of a properly trained AI/ML system and an experienced demand planner can be highly effective in unlocking all the insights hidden in the data.

To make the most of this relationship, demand planners will need to develop some new skills. Although much of the data analysis can be left to the system, human insight based on extensive experience is required to get the most out of the analysis the system provides. By combining human soft skills such as relationship building, listening, innovation, and strategic thinking with AI input, planning can become more agile and effective.

This article was originally published in the Fall 2021 issue of the Business Forecasting Journal. As an IBF member, you’ll receive journals delivered to your door every quarter, discounted admission to IBF conferences and events, member-only tutorials and workshops, access to the entire IBF Knowledge Library, and more. Get your membership.




