This article appeared in Foresight, the quarterly journal of the International Institute of Forecasters.  The "Preview and Key Points" below are their summary of it. 


Preview and Key Points 


Chris Gray, noted S&OP author, summarizes the most important elements of the business world’s new focus on proper management of the forecasting process. The primary requirements as he sees them:

Maintain accountability and transparency
Compare different forecasting methods
Recognize that one size does not fit all
Apply statistical process control
Broaden forecasting into sales planning
Document assumptions
Improve forecastability through improved product design


New Directions in Forecasting Management

Chris Gray




If you “grew up” in manufacturing in the 1970s, as I did, you may recall that for the first few decades of the computer age there were two common misconceptions about forecasting: 


First, that it was possible to develop a “right number” – that by devising more and more sophisticated and complicated forecasting algorithms, it would be possible to compute the one right number.


Second, that it was possible to develop a single technique that would work to forecast all items. 

We might call this the tools era of forecasting. The focus was on building the best models and using the computer to apply them universally across items and product families, in effect eliminating people from the modeling activity. 

Today, businesses have shifted their focus away from the purely mathematical and statistical hammers and nails of forecasting and toward better management of the forecasting process.  Fewer companies think they’ll find a mathematical “magic bullet” that will cure their forecasting ills or eliminate people from the forecasting process.



Most companies now take a more encompassing view. 

Ensure that the demand-related processes of the company, including forecasting, don’t violate the basic system principles of accountability and transparency. 

In the end, computers cannot be accountable for the results of forecasting and demand-management business processes; that responsibility resides with people. Computer logic should support these important activities, but it should not attempt to replace people through automation, or confuse them by undermining the basic accountabilities of those individuals involved in the business process. 

Preserve the transparency of the system. Without the ability to understand where the numbers are coming from or how they were developed, people will resist being held accountable.

Given the choice between two forecasting methods producing the same results where one is simple and the other complex, it is generally better to use the simpler method. As Önkal and Gönül (2005) said in their article in the inaugural issue of Foresight, an important factor in creating user confidence is the forecaster’s ability to thoroughly explain the forecasting method and the justification for choosing it. 

Use the computer for evaluating and recommending a forecasting method, rather than simply producing an “optimized” result based on a single arbitrary method. 

For example, in many kinds of manufacturing, instead of seeking a single result from an optimization algorithm, it makes more sense to use the massive data manipulation capability of today’s computers to evaluate the behavior of various forecasting strategies to see which, if any, would have worked best in the recent past. In the best-designed systems, people can review the system’s recommendations about which method would work best, choosing the specific technique that makes the most sense for a given item or family.  
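The evaluate-and-recommend approach can be sketched in a few lines. This is a minimal illustration, not a production system: the demand series and the three candidate methods (naive, moving average, simple exponential smoothing) are hypothetical stand-ins for whatever a real forecasting system would evaluate, and the selection rule is simply lowest one-step-ahead error over a recent holdout.

```python
# Sketch: recommend a forecasting method by backtesting candidates
# against recent history, leaving the final choice to a person.

def naive(history):
    # Next period equals last observed demand.
    return history[-1]

def moving_average(history, n=3):
    # Average of the last n periods.
    return sum(history[-n:]) / n

def exp_smoothing(history, alpha=0.3):
    # Simple exponential smoothing; the smoothed level is the forecast.
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def backtest(series, method, holdout=6):
    """Mean absolute error of one-step-ahead forecasts over the last `holdout` periods."""
    errors = []
    for t in range(len(series) - holdout, len(series)):
        forecast = method(series[:t])
        errors.append(abs(series[t] - forecast))
    return sum(errors) / len(errors)

def recommend(series, methods):
    """Score each candidate method and flag the best performer for human review."""
    scores = {name: backtest(series, m) for name, m in methods.items()}
    best = min(scores, key=scores.get)
    return scores, best

# Hypothetical monthly demand for one item.
demand = [102, 98, 110, 105, 99, 120, 115, 108, 112, 118, 121, 117]
scores, best = recommend(demand, {
    "naive": naive,
    "moving_average": moving_average,
    "exp_smoothing": exp_smoothing,
})
```

The point of returning all the scores, not just the winner, is transparency: a planner can see how close the alternatives were and override the recommendation for an item where the history is known to be unrepresentative.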

Recognize that one size does not fit all when it comes to forecasting methods. 

As Stellwagen and Tashman (2013) noted in a recent article in Foresight, “Forecasters should switch between different methods as appropriate, rather than taking a one-size-fits-all approach.” Forecasting methods that may make sense for manufacturing enterprises supplying parts to other manufacturers often don’t work well at all for companies managing large-scale distribution networks with store-level sales data. In high-volume retail supply-chain situations with millions of SKU/store combinations, for example, specialized forecasting methods (including ones for handling low-volume intermittent sales) along with “flowcasting” (DRP-like) methodologies to the distribution centers and manufacturing supply points may be better for getting a more accurate picture of demand. 

Take advantage of analytical tools originally developed as part of statistical process control (SPC).

The tools of SPC promise to illuminate the “normal” behavior of demand, to identify where statistical forecasting is appropriate (or inappropriate), to suggest when changes to the forecast are warranted (and not), to help identify and eliminate bias in the forecast, and to monitor and manage the “accuracy” of forecast demand. 

In their article in this issue of Foresight, Martin Joseph and Alec Finney explain how control charts, which were popularized in industry because of their application to monitoring manufacturing processes, can be used to monitor demand processes. They describe how control charts can distinguish “normal” variation from a signal that the process is changing and is possibly out of control, how different types of signals can mark shifts in demand behavior versus normal noise, and how control charts can help limit the amount of unwarranted forecast “tampering.” 
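A minimal version of the control-chart idea, assuming demand is roughly stable over a baseline window: compute limits from an individuals/moving-range chart, then flag points outside the 3-sigma limits and long runs on one side of the center line (the run length of 8 is one common rule of thumb, not the only choice). The demand numbers are illustrative.

```python
# Sketch: Shewhart-style individuals control chart applied to demand.

def control_limits(baseline):
    """Center line and 3-sigma limits using the moving-range estimate of sigma."""
    mean = sum(baseline) / len(baseline)
    moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128          # d2 constant for subgroups of size 2
    return mean, mean - 3 * sigma, mean + 3 * sigma

def signals(series, mean, lcl, ucl, run_length=8):
    """Flag out-of-limit points and sustained runs on one side of the center line."""
    out_of_limits = [i for i, x in enumerate(series) if x < lcl or x > ucl]
    runs = []
    side, count = 0, 0
    for i, x in enumerate(series):
        s = 1 if x > mean else -1 if x < mean else 0
        if s != 0 and s == side:
            count += 1
        else:
            side, count = s, (1 if s != 0 else 0)
        if count == run_length:
            runs.append(i)      # possible shift in the level of demand
    return out_of_limits, runs

baseline = [100, 104, 97, 101, 99, 103, 98, 102, 100, 96]
mean, lcl, ucl = control_limits(baseline)
new_demand = [101, 99, 140, 103]    # the 140 is outside the limits
out_of_limits, run_signals = signals(new_demand, mean, lcl, ucl)
```

Points inside the limits with no run signal are “normal” variation: the chart argues against tampering with the forecast on their account, which is exactly the discipline Joseph and Finney describe.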


Broaden forecasting into sales planning.


The experience of operating effective systems suggests that the keys to better demand numbers include coupling forecasting techniques to good human judgment, formulating plans for the specific actions needed to achieve the numbers, and clearly defining who will be held accountable. 

Forecasts are estimates of future demand, in terms of quantities and timing. These raw estimates, which are purely quantitative, must be complemented by judgment and accountability in order to be effective. As John Dougherty and I wrote in our book Sales and Operations Planning - Best Practices (2006), “… the statistical forecast is based heavily on past history. As long as the future is going to be much like the past, then everything works well. But usually changes in the product line, the customer base, the competition, promotion plans, the economy, and so on make the future quite different from the past. It’s the job of people, using their innate intelligence and their knowledge of current conditions and the expected future outlook, to adjust the statistics and establish the best forecast possible.” 

People formulate sales plans – defining the specific activities that must occur in the future to achieve company targets and goals, and the market activities required to realize the forecasted demands. This also includes processes to document the assumptions that went into the original numerical data as well as those associated with activities and assignments. Sales planning involves supplementing original forecast numbers – the raw estimates from above – with human judgment about external factors, such as:   

Market-development activities

Product portfolio adjustments and new-product introductions
Pricing changes
Collaboration with key accounts to gain market intelligence about likely future demand not reflected in history
Account plans for specific customers
Product placement considerations
Trade-show plans

These, along with the existing customer order backlog, specific assignments and accountabilities for work that must be done, and the documentation of the major assumptions that went into the final demand numbers become the sales plan for the product or products. 

Document assumptions. 


Every forecast makes some assumption about the numbers that were developed. These could be gross assumptions such as “the future will be like the past,” “the best method for forecasting future demand is the method that would have worked best in the recent past,” “we’ll have normal weather this summer,” or “housing starts will increase by x%.” Or they could be much more granular, like “pricing changes will give x% lift over the baseline demand,” “new product A at price point P will cannibalize sales of product B by Y% per period once it is introduced nationwide,” or “the top 6 customers will increase their purchases by Z units because of our targeted pricing.” 


When the real orders appear and are different from forecast, as they inevitably are, the questions management will ask are: 

Why is actual demand different from forecast?

Was our forecast flawed or is this just normal variability? 

Did we make bad assumptions that might explain the variability? 

Did the key elements of our sales plan affect demand differently from what we anticipated? 


Without having written down the basic assumptions that went into the original forecast, it’s difficult to make much sense of the difference between forecast and actual. In retrospect, everything that happened makes “sense.” But write down the assumptions and then evaluate the actual demands in light of those assumptions, and you have some hope of learning something about the nature of your demand and what levers you have that are actually effective. 


In “The Forecaster as Leader of the Forecasting Process” (Foresight, 2007), Borneman discusses the importance of these postmortems in the pharmaceutical industry: “The final step, and one not consistently applied in many companies, is performance analysis, a monthly review of the variances between actual and forecasted outcomes. Here we attempt to allocate a variance across the major assumptions in the forecast: How much of the variance is due to departures from our assumptions about market size, market share, dosing, inventory, and pricing?”


Use improved product design and supply-chain management methods to reduce the need for forecasting or eliminate it entirely. 


Here are some common methods, each of which can be employed alone and in combination with the others: 


In cases where your customer has an effective planning and scheduling system, get close enough to use its output to reduce or replace your internal item-level forecasts. And irrespective of the quality of the customer’s planning system or the geography of your supply chain, involve all the supply-chain partners in your product design, supply-chain design, and inventory and capacity planning processes so you can respond rapidly to demand changes. 

In cases where shortening supply-chain lead times, both internally and externally, is possible, make a transition to a make-to-order or finish-to-order fulfillment model. This is especially important when there is a high degree of commonality of parts and assemblies at lower levels of your product’s bill of material. Moving away from the level of “entangled” features (the finished-product level) to the level of disentangled options/modules/components will allow you to forecast a smaller number of items with more accuracy. 

Ensure that sensible strategies for inventory and capacity exist across the entire supply chain. Determine where you and your supply-chain partners can best hold buffer inventory or buffer capacity as a way to absorb demand variability. 

Increase manufacturing flexibility to enable short-term schedule changes to respond to shifts in demand from the customers.

Use strategic safety stocks at appropriate stages in the manufacturing process to reduce cumulative lead times (and hence the forecasting horizon), and to increase flexibility to respond to forecast error. 

Plan for safety capacity and/or use selective “overplanning” to accommodate variability as well as shifts in demand. 




Ultimately, the answer to the problem of managing forecasts and demand has much less to do with statistical techniques than with appropriate use of the computer to assist people in decision making, accountability, and transparency; use of appropriate analytical tools to understand the nature of demand and how it may be changing; design of sensible forecasting and sales-planning processes; documentation of assumptions; and sensible product and supply-chain design. In the bibliography, you’ll find some useful recent writings on these subjects, some of which were referred to above. 


Perhaps most exciting: we have not yet scratched the surface on the kinds of improvements we can make to forecasting and demand management. The last word on these subjects is far from being written. 




References

Borneman, J. (2007). The Forecaster as Leader of the Forecasting Process, Foresight, Issue 7 (Summer 2007), 41-44.


Deschamps, E. (2005). Six Steps to Overcome Bias in the Forecast Process, Foresight, Issue 2 (October 2005), 6-11.


Dougherty, J. (2012). Dealing With Inaccurate Forecasts, Retrieved from Partners For Excellence Web Site: http://www.partnersforexcellence.com/newsv4.htm


Landvater, D. & Gray, C. (1989). The MRP II Standard System, New York: John Wiley and Sons.


Mello, J. (2009). The Impact of Sales Forecast Game Playing on Supply Chains, Foresight, Issue 13 (Spring 2009), 13-22.


Önkal, D. & Gönül, M. S. (2005). Judgmental Adjustment: A Challenge for Providers and Users of Forecasts, Foresight, Issue 1 (June 2005), 13-17.


Stellwagen, E. & Tashman, L. (2013). ARIMA, the Models of Box and Jenkins, Foresight, Issue 30 (Summer 2013), 28-33.