Forecasting Errors - Main Discussion
1. INTRODUCTION
Forecasting provides all the supply chain planning processes with market expectations, so measuring its quality matters. Measuring forecast accuracy (or error) is not an easy task, as there is no one-size-fits-all indicator: only experimentation will show you which Key Performance Indicator (KPI) is best for you. As you will see, each indicator avoids some pitfalls but is prone to others.
The first distinction we have to make is the difference between the precision and the bias of a forecast.
Bias represents the historical average error. Basically, will your forecasts be on average too high (i.e., you overshot the demand) or too low (i.e., you undershot the demand)? This gives the overall direction of the error.
Precision measures how much spread you will have between the forecast and the actual value. The precision of a forecast gives an idea of the magnitude of the errors, but not of their overall direction.
Error: Let’s start by defining the error (et) as the forecast (ft) minus the demand (dt): et = ft − dt. Note that with this definition, if the forecast overshoots the demand, the error will be positive, and if the forecast undershoots the demand, the error will be negative. We now discuss the main indicators built on this error.
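As a quick illustration of these definitions, here is a minimal sketch in Python; the demand and forecast numbers are invented for the example:

```python
# e_t = f_t - d_t: positive when the forecast overshoots demand,
# negative when it undershoots. Numbers below are illustrative only.
forecasts = [105, 98, 110, 92]
demands   = [100, 100, 100, 100]

errors = [f - d for f, d in zip(forecasts, demands)]

# Bias is the historical average error; its sign gives the direction.
bias = sum(errors) / len(errors)
print(errors)  # [5, -2, 10, -8]
print(bias)    # 1.25
```

A small positive bias like this says the forecasts were, on average, slightly too high, even though individual periods erred in both directions.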
2. MAD
MAD (Mean Absolute Deviation) measures the size of the error in units. It is calculated as the average of the unsigned (absolute) errors. The MAD is a good statistic to use when analyzing the error for a single item. However, if you aggregate MADs over multiple items, you need to be careful about high-volume products dominating the results.
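Continuing the sketch above, the MAD is simply the average of the unsigned errors (again with invented numbers):

```python
forecasts = [105, 98, 110, 92]
demands   = [100, 100, 100, 100]

# MAD: average of the absolute (unsigned) errors, expressed in units.
mad = sum(abs(f - d) for f, d in zip(forecasts, demands)) / len(demands)
print(mad)  # 6.25
```

Note how the MAD (6.25 units) is much larger than the bias (near zero here), because positive and negative errors no longer cancel out.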
The MAD is often used as a basis for calculating the allowable margin of error for forecasts, and for approximating the standard deviation of forecast error when setting safety stocks.
Over n periods, the MAD is calculated as:

MAD = Σ ABS( base demand(i) − base forecast(i) ) / n

Key:
ABS( ) = The absolute amount of a difference (without minus sign)
i = Period number
n = Number of periods
Base demand and base forecast represent demand and forecast, respectively, for a period adjusted for seasonal variations and the effect of a varying number of working days.
Example
The following MAD values are calculated for December using the methods listed, each averaging the absolute errors of the last four periods:
By exponential smoothing:
MAD = (16 + 13 + 3 + 4) / 4 = 9
MAD = (13 + 12 + 5 + 4) / 4 = 8.5
Regardless of whether data values are zero, positive, or negative, the MAD can never be negative. This is because the MAD is calculated by taking the absolute values of the deviations (or differences) and then averaging them; absolute values are never negative, so neither is their average.
3. MAPE
MAPE measures the size of the error in percentage terms. The Mean Absolute Percent Error is the most common percentage measure of how accurate a forecasting system is. For each time period, take the absolute value of the forecast minus the actual, divide by the actual, and average these percentages over all periods:

MAPE = Σ ( ABS( forecast(i) − actual(i) ) / actual(i) ) / n × 100%
Many organizations focus primarily on MAPE when assessing forecast accuracy. Most people are comfortable thinking in percentage terms, which makes the MAPE easy to interpret. It can also convey information when you don’t know the item’s demand volume. However, MAPE is scale sensitive and should not be used when working with low-volume data. Notice that because "Actual" is in the denominator of the equation, the MAPE is undefined when actual demand is zero. Furthermore, when the Actual value is not zero but quite small, MAPE will often take on extreme values. This scale sensitivity makes the MAPE a poor error measure for low-volume data.
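A minimal MAPE sketch; skipping the zero-actual periods where the measure is undefined is an assumption of this sketch, one of several possible conventions:

```python
def mape(actuals, forecasts):
    # Because "Actual" is in the denominator, periods with zero actuals
    # are undefined; this sketch skips them rather than failing.
    terms = [abs(a - f) / a for a, f in zip(actuals, forecasts) if a != 0]
    return 100 * sum(terms) / len(terms)

# Small actuals produce extreme percentage errors (scale sensitivity).
print(mape([100, 50, 200], [110, 45, 190]))  # about 8.33 (%)
print(mape([2, 100], [10, 100]))             # 200.0, dominated by the tiny actual
```

The second call shows the low-volume pitfall: one period with an actual of 2 drags the whole measure to an extreme value.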
Be careful about setting arbitrary forecasting performance goals (for example, labeling one MAPE range Excellent and MAPE < 20% Good) without the context of the forecastability of the data.
Since MAPE is a measure of error, high numbers are bad and low numbers are good. For reporting purposes, some companies will translate this into an accuracy figure such as 100% − MAPE. When your MAPE is negative, it says you have larger problems than just the MAPE calculation itself: since MAPE = Abs(Act − Forecast) / Actual and the numerator is always non-negative, a negative MAPE can only arise from negative actuals or a broken calculation.
SUMMARY
Measuring forecast error can be a tricky business. The MAPE and MAD are the
most commonly used error measurement statistics; however, both can be
misleading under certain circumstances. The MAPE is scale sensitive and care
needs to be taken when using the MAPE with low-volume items. All error
measurement statistics can be problematic when aggregated over multiple items
and as a forecaster you need to carefully think through your approach when doing
so.
4. OTHER MEASURES
MAPE and MAD are by far the most commonly used error measurement statistics.
There are a slew of alternative statistics in the forecasting literature, many of which
are variations on the MAPE and the MAD. A few of the more important ones are
listed below:
5. MAD/Mean Ratio
As stated previously, the MAPE cannot be calculated when the actual equals zero and can take on extreme values when dealing with low-volume data. These issues become magnified when you start to average MAPEs over multiple time series. The MAD/Mean ratio tries to overcome this problem by dividing the MAD by the Mean, essentially rescaling the error to make it comparable across time series of varying scales. The statistic is calculated exactly as the name suggests: it is simply the MAD divided by the Mean.
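A sketch of the MAD/Mean ratio under these definitions (invented numbers); note that it stays defined even when an individual actual is zero:

```python
def mad_mean_ratio(actuals, forecasts):
    n = len(actuals)
    # MAD in units, then rescaled by the series' own mean level.
    mad = sum(abs(a - f) for a, f in zip(actuals, forecasts)) / n
    mean = sum(actuals) / n
    return mad / mean

# A zero actual in one period is fine here, unlike with the MAPE.
print(mad_mean_ratio([100, 0, 50, 50], [90, 10, 60, 40]))  # 0.2
```

Because the result is scale-free, ratios from series of very different volumes can be compared or averaged more safely than raw MADs or MAPEs.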
6. GMRAE
The GMRAE (Geometric Mean Relative Absolute Error) is used to measure out-of-sample forecast performance. It is calculated using the relative error between the naïve model (i.e., next period’s forecast is this period’s actual) and the currently selected model. A GMRAE of 0.54 indicates that the size of the current model’s error is only 54% of the size of the error generated using the naïve model for the same data set. Because the GMRAE is based on a relative error, it is less scale sensitive than the MAPE and the MAD.
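One way to sketch the GMRAE; skipping periods with a zero model or naïve error is an assumption of this sketch, not part of the definition above:

```python
import math

def gmrae(actuals, forecasts):
    # Naive model: next period's forecast is this period's actual,
    # so relative errors can only be formed from period 1 onward.
    ratios = []
    for t in range(1, len(actuals)):
        model_err = abs(actuals[t] - forecasts[t])
        naive_err = abs(actuals[t] - actuals[t - 1])
        if model_err != 0 and naive_err != 0:  # assumption: skip zero errors
            ratios.append(model_err / naive_err)
    # Geometric mean of the relative absolute errors.
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# A value below 1.0 means the model beats the naive forecast.
print(gmrae([100, 110, 105, 120], [95, 105, 108, 115]))
```

The geometric mean (rather than the arithmetic mean) keeps a single huge ratio from dominating the statistic.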
SMAPE (Symmetric Mean Absolute Percentage Error) is a variation on the MAPE that is calculated using the average of the absolute value of the actual and the absolute value of the forecast in the denominator. This statistic is preferred to the MAPE by some practitioners and has been used as the accuracy measure in several forecasting competitions.
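A sketch of the SMAPE as just described, with the average of the absolute actual and absolute forecast in the denominator (invented numbers):

```python
def smape(actuals, forecasts):
    # Denominator: average of |actual| and |forecast|, which keeps the
    # measure defined when only one of the two is zero.
    terms = [abs(a - f) / ((abs(a) + abs(f)) / 2)
             for a, f in zip(actuals, forecasts)]
    return 100 * sum(terms) / len(terms)

print(smape([100, 50], [110, 40]))  # about 15.87 (%)
```

Unlike the plain MAPE, a period where the actual is zero but the forecast is not still yields a finite term here.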
Tracking Signal (TS) is used to detect large deviations of the cumulative forecast error, in both the plus and the minus direction. It is calculated as the running sum of forecast errors (demand minus forecast, in this section’s convention) divided by the MAD:

TS = (running sum of forecast errors) / MAD

A positive tracking signal denotes that the demand is higher than the forecast; a negative signal indicates that the demand is lower than the forecast. The better tracking signal is the one with the smaller cumulative error. The tracking signal is thus a measure used to evaluate whether actual demand reflects the assumptions in the forecast about the level and perhaps trend of demand. If the forecast is persistently higher than the actual demand quantity, there is persistent over-forecasting; if it is persistently lower, there is persistent under-forecasting. A common rule of thumb is that TS < -3.75 or TS > 3.75 implies a forecast bias.
So what is magical about 3.75? It is an approximation based on the relationship between the standard deviation and the mean absolute deviation: for normally distributed errors, one standard deviation is approximately 1.25 MAD. If you are targeting a high service level, you will be using a 3-sigma control limit; expressed in MADs, 3 sigma is about 3 × 1.25 = 3.75 MAD, hence 3.75 as the threshold for the tracking signal.
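Putting the pieces together, a sketch of the tracking signal with the 3.75 rule of thumb; the data are invented, and the errors follow this section's demand-minus-forecast convention:

```python
def tracking_signal(actuals, forecasts):
    # This section's convention: positive error = demand above forecast.
    errors = [a - f for a, f in zip(actuals, forecasts)]
    mad = sum(abs(e) for e in errors) / len(errors)
    return sum(errors) / mad  # cumulative error expressed in MADs

# A persistently low forecast drives the signal past the threshold.
ts = tracking_signal([100, 120, 130, 140], [90, 100, 105, 110])
print(ts, abs(ts) > 3.75)  # 4.0 True
```

In practice the signal would be recomputed each period as new errors arrive, and a breach of the ±3.75 band would trigger a review of the forecasting model.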