Demand forecasting: models or experts, who knows best?

Are experts no longer necessary? Are you better off relying on mathematical models, and are models alone enough? Or can you improve the accuracy of model forecasts by combining them with human assessments and estimates (judgemental forecasts)? These are all questions to which there is not yet a simple answer. Ann Vereecke, Professor in Operations and Supply Chain Management, Karlien Vanderheyden, Professor in Organisational Behaviour, and Philippe Baecke, Professor in Marketing Analytics, shed some light on what we already know and the direction research will now take.

The importance of good demand forecasts - projections or predictions of demand - can hardly be overestimated. Without such forecasts, a company does not know what it should purchase, produce or supply, where and when. These forecasts used to be based on experience or on experts' instincts; statistical models then gradually gained ground. Today, companies have increasingly extensive and high-quality data at their disposal. Big data and analytics are not just passing trends, and the arsenal of tools, techniques and models available to recognise patterns and produce forecasts is becoming ever more effective.

But models cannot predict everything. "Models use data from the past, and this cannot always simply be extrapolated to the future, even with sophisticated models," says Ann. "There are factors that cannot be identified in the available data, and these mean that the future will be different from the past. This is why we need to keep combining models with other information, obtained from conversations with customers or from insights into how a particular product or design could catch on in the market, for example".

Broken leg cues

Karlien nods in agreement: "In contrast to models, people can take what is known in the jargon as 'broken leg cues' into account. If you go to the cinema every Thursday evening but you break your leg on Thursday afternoon, a regression model would still predict that you will go to the cinema that evening. Someone who knows that you have broken your leg will correctly predict that this time you will not. We recently conducted research for a company that distributes magazines. Their model accurately predicts how many magazines need to be on the shelves every week, but when a free music CD comes with the magazine, the model is less accurate. In such cases an expert can better estimate how many extra magazines will be needed, based on the specific CD".
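To make this concrete, here is a minimal sketch of how a judgemental override can be layered on top of a purely statistical forecast. It is written in Python with invented numbers; the function names, the "free music CD" uplift factor and all figures are illustrative assumptions, not output of the research described here.

```python
# Minimal sketch of a "broken leg cue" override on a statistical forecast.
# All numbers and names are illustrative assumptions, not research results.

def statistical_forecast(history):
    """Naive base-rate model: forecast the average of past demand."""
    return sum(history) / len(history)

def judgemental_forecast(model_forecast, cues):
    """Let an expert adjust the model when they know something it cannot.

    `cues` maps an event description to an expert-estimated demand factor,
    e.g. a free music CD bundled with this week's magazine.
    """
    adjusted = model_forecast
    for event, factor in cues.items():
        print(f"Expert adjustment for '{event}': x{factor}")
        adjusted *= factor
    return adjusted

weekly_sales = [100, 98, 103, 101, 99]      # stable demand in past weeks
base = statistical_forecast(weekly_sales)   # ~100: extrapolates the past
final = judgemental_forecast(base, {"free music CD this week": 1.4})
print(f"Model forecast: {base:.0f} copies, after expert override: {final:.0f}")
```

The point is not the arithmetic but the division of labour: the model captures the stable pattern, while the expert supplies the one-off context that the data cannot contain.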

"Purely statistical models struggle to take unexpected events, unstable environmental factors or missing data into consideration. The models would become too complex," Philippe explains. "And this is precisely where the added value of judgemental forecasts comes in. Experts can interpret the context and take changes into account. The more complicated and volatile the environment, the greater the added value expert judgements can provide".

Better to do something than nothing?

Unfortunately, people, experts included, are not infallible. So far little field research has been carried out but, according to Karlien, we can already draw some conclusions from experimental studies with students and from research carried out by Goodwin, for example. "People have a tendency to see patterns in a random set of data, even when there are none. They are also often too optimistic and are more likely to adjust demand forecasts upwards. People overestimate the importance of their own judgement and will adjust the outcome of a model to show that they do have something to contribute. Research has shown that most adjustments are minor, although major adjustments are actually better: there is usually carefully considered reasoning behind them, whilst small adjustments are usually made just for the sake of doing something".

The right motivation

Everyone agrees that the motivation behind any adjustments is one of the main problems. Ann explains with an example: "Sales and Operations are often not on the same wavelength. If Sales are rewarded on the basis of the volume sold, this department will overestimate demand so that Operations produce enough for Sales to supply. But if Operations are then assessed on the basis of stock levels, this department will in turn tend to hold as little stock as possible. If the Sales department is rewarded when the demand forecast is exceeded, however, they will systematically underestimate demand. Operations will then base production on this underestimate and will not be able to meet actual demand. So this results in conflicts. If you allow people to adjust the demand forecast, you need a good understanding of how the incentive system works".

"The accuracy of forecasts should be used as a KPI for bonuses and incentives much more often," Philippe thinks, "but you don't see systems like this very much".

A few tips

Further field research is needed to determine whether additional input from experts adds value, what the pitfalls are and how best to approach these matters.

Even though the results of this research are not yet known, we would like to provide a few tips:

  • Continue to have model forecasts adjusted by a group of experts, and use the average of their adjustments. Take group dynamics into account if the adjustments are made during a meeting.
  • Ask for an explanation and motivation for each adjustment.
  • Give feedback: compare the model's forecast and the expert-adjusted forecast with actual demand (a simple illustration follows this list).
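
As a rough illustration of the last two tips, the sketch below averages the adjustments proposed by a panel of experts and then compares the accuracy of the model forecast and the adjusted forecast against actual demand. It is a Python sketch with invented figures: the expert names, adjustments and demand numbers are hypothetical, and MAPE is used here simply as one common accuracy measure.

```python
# Illustrative sketch of the tips above: average the experts' adjustments
# and give feedback by comparing forecasts against actual demand.
# All figures are invented for the example.

def mape(forecasts, actuals):
    """Mean absolute percentage error, a common forecast-accuracy KPI."""
    return sum(abs(f - a) / a for f, a in zip(forecasts, actuals)) / len(actuals) * 100

model_forecasts = [100, 120, 110, 130]   # one forecast per period, from the model
actual_demand   = [105, 140, 108, 150]

# Each expert proposes an adjustment (in units) per period; take the average.
expert_adjustments = {
    "expert_a": [0, 15,  0, 10],
    "expert_b": [5, 25, -5, 20],
    "expert_c": [0, 20,  0, 15],
}
n = len(expert_adjustments)
avg_adjustment = [sum(adj[i] for adj in expert_adjustments.values()) / n
                  for i in range(len(model_forecasts))]
adjusted_forecasts = [m + a for m, a in zip(model_forecasts, avg_adjustment)]

# Feedback step: did the experts' adjustments actually improve accuracy?
print(f"Model only:    MAPE = {mape(model_forecasts, actual_demand):.1f}%")
print(f"After experts: MAPE = {mape(adjusted_forecasts, actual_demand):.1f}%")
```

In this invented example the averaged adjustments reduce the error, but the feedback step is just as valuable when they do not: it shows everyone involved whether the adjustments are paying off.
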
Discover how you can improve your forecasts too

Every company benefits from a demand forecast that is as accurate as possible: the more accurate the forecast, the more reliable the supply.

Our recently established Forecasting Research Centre brings companies from various sectors together with a multidisciplinary team of researchers, led by Ann, Karlien, Philippe and Professor Marc Buelens. Marc has years of experience and expertise in the field of executive decision making.

During the initial phase, a maturity assessment model will be developed to map where the research centre's member companies currently stand. This kind of model provides insight into the quality of their forecasting process and into what can still be improved. During the second phase, research will be conducted into (1) when judgemental forecasting is useful, (2) the impact of incentive systems, the motivation for adjustments and feedback, (3) how the forecasting process can best be implemented and (4) how expert judgements can be built into models.

One of the researchers closely associated with the Centre is Shari De Baets. She began her PhD research into judgemental forecasting in November.

It is still possible to join the research centre. Interested? Don't hesitate to contact us.