Page 94 - DMGT523_LOGISTICS_AND_SUPPLY_CHAIN_MANAGEMENT

Unit 4: Demand Planning and Forecasting




          Unlike regression models, which are discussed in the next section, exponential smoothing does
          not make use of information from series other than the one being forecast. These models are
          also readily available in standard computer software and require limited data storage and
          computational capacity.
          The Exponential Smoothing method is:

              Easy to adjust for past errors,
              Easy to prepare follow-on forecasts from, and
              Ideal for situations where many forecasts need to be prepared.
          Since exponential smoothing is an iterative process, we only need to define an initial value.

          Single Exponential Smoothing: The Single Exponential Smoothing method calculates the values
          for a smoothed series. You choose a damping coefficient, called the weighting factor, which
          is used to smooth the data. It can take any value from '0' to '1' and determines the
          sensitivity of the smoothing effect. The exponential relationship that was shown earlier can
          now be written using standard notation as:
                     F_{t+1} = αD_t + (1 − α)F_t
          Where: D_t is the actual value
                 F_t is the forecasted value
                 α is the weighting factor, which ranges from 0 to 1
                 t is the current time period.
          Since      F_{t+1} = αD_t + (1 − α)F_t
                     F_t = αD_{t−1} + (1 − α)F_{t−1}, and so on
          Therefore  F_{t+1} = αD_t + (1 − α)(αD_{t−1} + (1 − α)F_{t−1}) …
                     F_{t+1} = αD_t + α(1 − α)D_{t−1} + α(1 − α)²D_{t−2} + α(1 − α)³D_{t−3} …
          Thus, the forecast for the next period is the algebraic sum of the forecast for the last period and
          α times the error in the forecast for the last period, i.e. F_{t+1} = F_t + α(D_t − F_t).
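The recursion above can be checked numerically. The sketch below (function and variable names are our own, not from the text) computes one-step-ahead forecasts with the smoothing form F_{t+1} = αD_t + (1 − α)F_t and verifies that the error-correction form F_{t+1} = F_t + α(D_t − F_t) gives identical values; the demand figures are made up for illustration.

```python
def ses_forecasts(demand, alpha, initial_forecast):
    """One-step-ahead forecasts F_1..F_{n+1} via F_{t+1} = a*D_t + (1-a)*F_t."""
    forecasts = [initial_forecast]
    for d in demand:
        forecasts.append(alpha * d + (1 - alpha) * forecasts[-1])
    return forecasts

demand = [100, 120, 110, 130, 125]          # illustrative demand history
fc = ses_forecasts(demand, alpha=0.3, initial_forecast=100.0)

# Equivalent error-correction form: F_{t+1} = F_t + alpha * (D_t - F_t)
fc2 = [100.0]
for d in demand:
    fc2.append(fc2[-1] + 0.3 * (d - fc2[-1]))
```

Both loops produce the same series, which confirms algebraically rearranging the smoothing equation into "last forecast plus α times last error" changes nothing.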
          Exponential Smoothing assigns exponentially decreasing weights as the observations get older. This
          means that recent observations are given relatively more weight in forecasting than older observations.

          A small α smooths the values by assigning lower weight to recent changes, while a large α
          provides a fast response to recent changes in the time series but provides a smaller amount
          of smoothing.
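This trade-off is easy to see on a hypothetical series with a sudden level shift (the data and names below are our own illustration, not from the text): a large α tracks the jump quickly, while a small α lags behind.

```python
def smooth(series, alpha, start):
    """Exponentially smooth a series: S_{t+1} = a*x_t + (1-a)*S_t."""
    out = [start]
    for x in series:
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out

# Demand sits at 100, then jumps to 150 at period 6.
shifted = [100] * 5 + [150] * 5
slow = smooth(shifted, alpha=0.1, start=100.0)   # heavy smoothing, slow response
fast = smooth(shifted, alpha=0.9, start=100.0)   # light smoothing, fast response
```

After five periods at the new level, the α = 0.9 forecast is nearly at 150, while the α = 0.1 forecast is still well below it.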
          When the data is smoothed exponentially, the smoothed value becomes the forecast for period
          't + 1'. Also, only three items of data are required for each update: the latest actual value,
          the latest forecast and the smoothing constant. This is unlike the moving averages, where the
          first value is available only in the fifth week. It is interesting to note how, for this particular
          series, the moving average, the weighted moving average and simple exponential smoothing all smooth
          out the seasonality. The difference between the different weighting factors becomes increasingly
          visible as the number of readings increases.

          The basic decision that needs to be taken by the manager is the selection of the smoothing
          constant. How should it be chosen? The constant α has to be equal to or between the values
          '0' and '1'. The variance of the error increases as α increases. To minimize the error, we
          would like to make α as small as possible (near 0), but this makes the forecast unresponsive to a
          change in the underlying time series. To make the forecast responsive to changes, we want α as
          large as possible (near 1), but this increases the error variance.
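One common way to balance this trade-off in practice is to pick the α that minimizes the one-step-ahead forecast error on historical data. The sketch below (a simple grid search over our own made-up demand history; the function name is illustrative) shows the idea.

```python
def one_step_mse(demand, alpha, start):
    """Mean squared one-step-ahead forecast error for a given alpha."""
    forecast, sq_err = start, 0.0
    for d in demand:
        sq_err += (d - forecast) ** 2
        forecast = alpha * d + (1 - alpha) * forecast
    return sq_err / len(demand)

demand = [100, 120, 110, 130, 125, 140, 135, 150]   # illustrative history
candidates = [a / 10 for a in range(1, 10)]          # 0.1, 0.2, ..., 0.9
best = min(candidates, key=lambda a: one_step_mse(demand, a, start=100.0))
```

The chosen α is whichever candidate tracked the historical series with the least squared error; for a series with a persistent trend, larger values of α tend to win, while for a noisy but stable series, smaller values do.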




                                           LOVELY PROFESSIONAL UNIVERSITY                                   89