Transcript and Presenter's Notes

Title: Forecasting Part I


1
Forecasting Part I
2
Forecasting
  • Trying to predict the future behavior of some
    process/variable based on past data
  • Fundamental to business planning
  • Most business decisions based to some extent on
    forecasts
  • Key forecasts in business
  • Future demand for products
  • Future price of various commodities

3
Why forecast demand?
  • We need to know how much to make ahead of time,
    i.e. our production schedule
  • How much raw material
  • How many workers
  • How much to ship to the warehouse in Denver
  • We need to know how much production capacity to
    build

4
Monthly Demand for Furniture
5
3-Month Lead time from Factory in China
6
Whenever you have
  • Significant lead time in production
  • Variation in demand
  • Need for fast customer service (no back orders)
  • You will need to maintain inventory
  • The more accurate the forecast, the less the
    inventory required (why?)

7
Why Forecast Raw Material Price?
8
Other Examples of Time Series Data
9
Monthly Australian Red Wine Sales
10
Yearly Level of Lake Huron
11
Monthly new polio cases in the U.S.A., 1970-1983
12
Monthly Traffic Injuries (G.B.) beginning in
January 1975
13
U.S. Pop., 10-year Intervals, 1790--1980
14
Annual Canadian Lynx Trappings
15
Daily Dow Jones All Ordinaries (Australia)
Indices
16
1st and 2nd Laws of Forecasting
  • 1. In forecasting, we assume the future
    will behave like the past
  • If behavior changes, our forecasts can be
    terrible
  • 2. Even given 1, there is a limit to how
    accurate forecasts can be (or nothing can be
    predicted with complete accuracy)
  • The achievable accuracy depends on the magnitude
    of the noise component

17
What if this happened? Could you foresee it at
period 23?
18
NID(0, σ²): Optimal Forecast is Xt = 0
19
Forecast Error
  • Not only must we forecast future values
  • (Point estimates)
  • We must estimate the accuracy of our forecast
  • We can use various measures associated with the
    forecast error.
  • e.g. prediction intervals

20
(No Transcript)
21
(No Transcript)
22
Why is it crucial to estimate forecast error?
  • Today's stock price: $42/share
  • Forecast of tomorrow's price: $43/share
  • $43/share ± $0.10 (with 95% confidence)
  • $43/share ± $100 (with 95% confidence)
  • Obviously your decisions might change

23
Why is it crucial to estimate forecast error?
  • Next month's demand estimate: 700 units
  • Production lead time is 1 month
  • 700 ± 5 (with 95% confidence)
  • 700 ± 300 (with 95% confidence)
  • How much inventory do you need to assure demand
    is met?

24
Forecasting Techniques
  • Model based methods
  • Trend and Seasonal Decomposition
  • Time based regression
  • Time Series Methods (e.g. ARIMA Models)
  • Multiple Regression using leading indicators
  • Forecasting methods
  • More heuristic approaches that attempt to track
    a signal

25
Univariate Time Series Models Based on
Decomposition
  • Xt: the time series to forecast
  • Xt = Tt + St + Nt
  • Where
  • Tt is a deterministic trend component
  • St is a deterministic seasonal/periodic component
  • Nt is a random noise component

26
S(Xt) = 0.257
27
(No Transcript)
28
Simple Linear Regression Model
Xt = 2.877174 + 0.020726t
29
Use Model to Forecast into the Future
30
Residuals: et = Actual - Predicted
et = Xt - (2.877174 + 0.020726t)
S(et) = 0.211
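As a concrete illustration (not from the original slides), here is a minimal NumPy sketch of fitting this trend line by ordinary least squares; the demand values are hypothetical placeholders, not the lecture's data.

```python
import numpy as np

# Hypothetical stand-in for the demand series (the real data is not given).
demand = np.array([3.1, 2.7, 2.9, 3.2, 2.8, 3.0, 3.4, 2.9, 3.1])
t = np.arange(1, len(demand) + 1)

# Design matrix [1, t]; least squares gives the intercept and slope.
A = np.column_stack([np.ones_like(t, dtype=float), t])
(b0, b1), *_ = np.linalg.lstsq(A, demand, rcond=None)

# Residuals and their standard deviation (n - 2 degrees of freedom).
e = demand - (b0 + b1 * t)
print(f"Xt = {b0:.6f} + {b1:.6f} t,  S(et) = {e.std(ddof=2):.3f}")
```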
31
Simple Seasonal Model
  • Estimate a seasonal adjustment factor for each
    period within the season
  • e.g. T_September

32
Sorted by season
Season averages
33
Trend + Seasonal Model
  • Xt = 2.877174 + 0.020726t + T_mod(t,3)
  • Where
  • T1 = 0.250726055
  • T2 = -0.242500035
  • T3 = -0.008226125
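A minimal sketch of the season-averages step that produces these Tk values, assuming the residuals e and time index t from the trend fit above; the mod-3 indexing is illustrative.

```python
import numpy as np

def seasonal_factors(residuals, t, period=3):
    # Average the detrended residuals within each position in the season,
    # then center so the factors sum to zero over a full season.
    raw = np.array([residuals[t % period == k].mean() for k in range(period)])
    return raw - raw.mean()

# T = seasonal_factors(e, t)          # three additive seasonal terms
# fitted = b0 + b1 * t + T[t % 3]     # trend + seasonal model
```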

34
(No Transcript)
35
e't = Xt - (2.877174 + 0.020726t + T_mod(t,3))
S(e't) = 0.145
36
Can use other trend models
  • Xt = β0 + β1·sin(2πt/k) (where k is the period)
  • Xt = β0 + β1t + β2t² (multiple regression)
  • Xt = β0 + β1·e^(kt)
  • etc.
  • Examine the plot, pick a reasonable model
  • Test model fit, revise if necessary

37
(No Transcript)
38
(No Transcript)
39
Model Xt = Tt + St + Nt
  • After extracting trend and seasonal components we
    are left with the Noise
  • Nt = Xt - (Tt + St)
  • Can we extract any more predictable behavior from
    the noise?
  • Use Time Series analysis
  • Akin to signal processing in EE

40
A zero-mean, aperiodic time series. Is our best
forecast 0?
41
AR(1) Model
  • This data was generated using the model
  • Nt = 0.9Nt-1 + Zt
  • Where Zt ~ N(0, σ²)
  • Thus to forecast Nt+1, we could use 0.9Nt
  • For Nt+2: 0.9(0.9)Nt, etc.
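A small sketch of these multi-step AR(1) forecasts, with φ = 0.9 as on the slide: the k-step-ahead forecast is φ^k times the last observation, decaying geometrically toward the series mean of zero.

```python
def ar1_forecasts(n_t, phi=0.9, steps=5):
    # k-step-ahead forecast of an AR(1) model: phi**k * N_t.
    return [phi ** k * n_t for k in range(1, steps + 1)]

print(ar1_forecasts(2.0))  # [1.8, 1.62, 1.458, 1.3122, 1.18098]
```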

42
(No Transcript)
43
(No Transcript)
44
Time Series Models
  • Examine the correlation of the time series to
    past values.
  • This is called autocorrelation
  • If Nt is correlated to Nt-1, Nt-2, ...
  • Then we can forecast better than 0

45
Sample Autocorrelation Function
46
Sample Spectrum
Series is dominated by low-frequency components;
thus there is a signal that can be extracted
47
Back to our Demand Data
48
No Apparent Significant Autocorrelation
49
Residuals Appear Normal
50
Thus one possible model is
  • Xt = 2.877174 + 0.020726t + T_mod(t,3)
  • Where
  • T1 = 0.250726055
  • T2 = -0.242500035
  • T3 = -0.008226125

51
Multiple Linear Regression
  • Y = β0 + β1X1 + β2X2 + ... + βpXp + ε
  • Where
  • Y is the dependent variable you want to
    predict
  • The Xi's are the independent variables you want to
    use for prediction (known)
  • The model is linear in the βi's

52
Examples of MLR in Forecasting
  • Yt = β0 + β1t + β2t² + β3·sin(2πt/k) + β4·e^(kt)
  • i.e. a trend model, a function of t
  • Yt = β0 + β1X1t + β2X2t
  • Where X1t and X2t are leading indicators
  • Yt = β0 + β1Yt-1 + β2Yt-2 + β12Yt-12 + β13Yt-13
  • An autoregressive model

53
Example: Sales and Leading Indicator
54
Poor Man's MVARIMA Models
  • ARIMA and MVARIMA models require significant
    expertise to apply
  • We will illustrate a simplified approach that can
    be useful
  • It should be used with great caution

55
Make shifted copies of columns, chop off top and
bottom.
56
to get time-shifted data, then regress
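A hypothetical NumPy sketch of this shift-and-regress step; the array names (sales, lead) and the choice of three lags are illustrative, not the lecture's actual spreadsheet.

```python
import numpy as np

def fit_lagged_model(sales, lead, lags=(1, 2, 3)):
    # Build shifted copies of the sales column, chop the rows lost to
    # shifting, and regress Sales(t) on its lags and the leading indicator.
    m, n = max(lags), len(sales)
    cols = [np.ones(n - m)]                          # intercept
    cols += [sales[m - k : n - k] for k in lags]     # Sales(t-1), (t-2), (t-3)
    cols.append(lead[m:])                            # Lead(t)
    X, y = np.column_stack(cols), sales[m:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```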
57
(No Transcript)
58
Regress again with significant variables
59
Final Model Is
  • Sales(t) = -3.93 + 0.83·Sales(t-3)
    - 0.78·Sales(t-2) + 1.22·Sales(t-1)
    - 5.0·Lead(t)
  • SD(Sales) = 21.19
  • SD(Residuals) = 1.31

60
(No Transcript)
61
(No Transcript)
62
No apparent residual autocorrelations
63
Let's Try Raw Material Price
64
Shift Data
65
(No Transcript)
66
(No Transcript)
67
(No Transcript)
68
Forecasting Part II
  • Forecasting Techniques
  • Techniques that try to follow the time series
  • Often include an updating feature
  • Can typically be traced to an underlying model
    for which the method is optimal

69
M-Period Moving Average
  • Ft+1(t) = (At + At-1 + ... + At-M+1)/M
  • i.e. the average of the last M data points
  • Basically assumes a stable (trend-free) series
  • How should we choose M?
  • Advantages of large M?
  • Advantages of small M?
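A one-line sketch of the M-period moving-average forecast described above.

```python
import numpy as np

def moving_average_forecast(actuals, M):
    # Forecast for t+1: the mean of the last M actual observations.
    return np.mean(actuals[-M:])
```

Larger M averages out more noise when the series is stable; smaller M reacts faster when the level shifts.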

70
Simple Exponential Smoothing
  • Ft+1(t) = forecast for time t+1 made at time t
  • At = actual outcome at time t
  • 0 < α < 1 is the smoothing parameter
  • Ft+1(t) = Ft(t-1) + α(At - Ft(t-1))
  • Adjust forecast based on last forecast error
  • OR
  • Ft+1(t) = (1 - α)Ft(t-1) + αAt
  • Weighted average of last forecast and last actual
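A minimal sketch of simple exponential smoothing in the error-correction form above; initializing the forecast with the first actual is one common convention, assumed here since the slides don't specify it.

```python
def exp_smooth(actuals, alpha=0.2):
    # F(t+1) = F(t) + alpha * (A(t) - F(t)): adjust by the last error.
    f = actuals[0]            # assumed initialization
    forecasts = []
    for a in actuals:
        forecasts.append(f)   # forecast made before observing a
        f = f + alpha * (a - f)
    return forecasts
```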

71
Simple Exponential Smoothing
  • Is appropriate when the underlying time series
    behaves like a constant plus noise
  • Xt = μ + Nt
  • Or when the mean μ is moving very slowly
  • That is, for a quite stable process

72
Simple Exponential Smoothing
  • Ft+1(t) = (1 - α)Ft(t-1) + αAt
  • Ft+1(t) = (1 - α)²Ft-1(t-2) + αAt + α(1 - α)At-1
  • Ft+1(t) = (1 - α)³Ft-2(t-3) + αAt + α(1 - α)At-1
    + α(1 - α)²At-2
  • Ft+1(t) = αAt + α(1 - α)At-1 + α(1 - α)²At-2
    + α(1 - α)³At-3 + ...
  • Is a weighted average of past observations
  • Weights decay geometrically as we go backwards in
    time

73
(No Transcript)
74
Simple Exponential Smoothing
  • Ft+1(t) = αAt + α(1 - α)At-1 + α(1 - α)²At-2
    + α(1 - α)³At-3 + ...
  • Large α adjusts more quickly to changes
  • Smaller α provides more averaging and thus
    lower variance when things are stable
  • Exponential smoothing is intuitively more
    appealing than moving averages

75
Exponential Smoothing Examples
76
Zero Mean White Noise
77
(No Transcript)
78
(No Transcript)
79
Shifting Mean + Zero-Mean White Noise
80
(No Transcript)
81
(No Transcript)
82
(No Transcript)
83
Recommended Alpha
  • Typically α should be in the range 0.05 to
    0.3
  • If RMS analysis indicates a larger α,
    exponential smoothing may not be appropriate

84
(No Transcript)
85
(No Transcript)
86
Might look good, but is it?
87
(No Transcript)
88
(No Transcript)
89
Series and Forecast Using α = 0.9
[Figure: the series and its smoothed forecast plotted
over periods 1-16]
90
(No Transcript)
91
(No Transcript)
92
(No Transcript)
93
(No Transcript)
94
(No Transcript)
95
(No Transcript)
96
(No Transcript)
97
Exponential smoothing will lag behind a trend
  • Suppose Xt = b0 + b1t
  • And St = (1 - α)St-1 + αXt
  • Can show that St lags behind Xt by a constant
    amount, ((1 - α)/α)·b1, in steady state

98
(No Transcript)
99
Double Exponential Smoothing
  • Modifies exponential smoothing for following a
    linear trend
  • i.e. Smooth the smoothed value

100
St lags
St(2) (the doubly smoothed value) lags even more
101
2St - St(2) doesn't lag
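A sketch of Brown's double exponential smoothing, the standard method behind these slides: smooth once, smooth the smoothed value again, then combine. The slope estimate (α/(1 - α))·(St - St(2)) is the textbook result, assumed here rather than shown on the slides.

```python
def double_smooth_forecast(actuals, alpha=0.2, horizon=1):
    s = s2 = actuals[0]                      # assumed initialization
    for a in actuals:
        s = alpha * a + (1 - alpha) * s      # first smoothing (lags)
        s2 = alpha * s + (1 - alpha) * s2    # smooth the smoothed value
    level = 2 * s - s2                       # 2*St - St(2): removes the lag
    slope = alpha / (1 - alpha) * (s - s2)   # estimated trend per period
    return level + slope * horizon
```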
102
(No Transcript)
103
(No Transcript)
104
(No Transcript)
105
Example
106
α = 0.2
107
Single smoothing lags a trend
108
[Figure: a trending series with single and double
smoothing over periods 1-101. Double smoothing
over-shoots a change (it must re-learn the slope).
Legend: Trend, Series Data, Single Smoothing,
Double Smoothing]
109
Winters' Seasonal Methods
  • Exponential smoothing for seasonal data
  • Two models, multiplicative and additive
  • Models contain trend and seasonal components
  • Models smooth, i.e. place greater weight on
    more recent data

110
Winters' Multiplicative Model
  • Xt = (b1 + b2t)·ct + εt
  • Where ct are seasonal terms that sum to L over a
    season
  • Note that the amplitude depends on the level of
    the series

111
  • Example

112
(1 + 0.04t)
113
150
114
50
115
  • The seasonal terms average 100 (i.e. 1)
  • Thus, summed over a season, the ct must add to L
  • Each period we go up or down some percentage of
    the current trend value
  • The amplitude increasing with level seems to
    occur frequently in practice

116
Recall Australian Red Wine Sales
117
Smoothing
  • In Winters' model, we smooth the permanent
    component, the trend component, and the
    seasonal component
  • We may have a different smoothing parameter for
    each (α, β, γ)
  • Think of the permanent component as the current
    level of the series (without trend)
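A sketch of a single update step of Winters' multiplicative method under the usual (α, β, γ) parameterization; the exact equations appeared as images on the following slides, so this follows the standard textbook form.

```python
def winters_update(x, level, trend, c, t, L, alpha, beta, gamma):
    # Deseasonalize the observation, then smooth the level, the trend,
    # and the seasonal factor for this position in the season.
    c_old = c[t % L]
    new_level = alpha * (x / c_old) + (1 - alpha) * (level + trend)
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    c[t % L] = gamma * (x / new_level) + (1 - gamma) * c_old
    forecast = (new_level + new_trend) * c[(t + 1) % L]  # one step ahead
    return new_level, new_trend, forecast
```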

118
(No Transcript)
119
Current Observation
120
Current Observation deseasonalized
121
Estimate of permanent component from last time:
last level plus slope × 1
122
(No Transcript)
123
(No Transcript)
124
observed slope
125
observed slope
previous slope
126
(No Transcript)
127
(No Transcript)
128
Winters' Additive Method
  • Xt = b1 + b2t + ct + εt
  • Where ct are seasonal terms that sum to 0 over a
    season
  • Similar to the model from the last lecture except
    we smooth estimates of b1, b2, and the ct

129
Forecast Errors and Prediction Intervals
  • Obviously our forecasts cannot predict with
    perfect accuracy
  • How much error might we expect?
  • Can we produce prediction intervals, i.e.
    forecast ± K with 95% probability?

130
Forecast Errors
  • The forecast error at time T of a forecast made τ
    periods ago is eτ(T) = XT - FT(T - τ)
  • We are interested in V(eτ(T))

131
Prediction Intervals
  • Given V(eτ(T))
  • And assuming a normal distribution
  • A 95% prediction interval for XT+τ at time T is
    FT+τ(T) ± 1.96·√V(eτ(T))
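A trivial sketch of turning an estimated forecast-error variance into this interval.

```python
import numpy as np

def prediction_interval(forecast, error_variance, z=1.96):
    # 95% interval assuming normally distributed forecast errors.
    half = z * np.sqrt(error_variance)
    return forecast - half, forecast + half
```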

132
Forecast Error Variance
133
Forecast Error Variance
  • Assuming a model of the behavior of Xt
  • Given a forecast equation f(Xt, Xt-1, ...)
  • We can derive the forecast error variance
  • But it ain't easy

134
Example
  • Suppose Xt = b1 + b2t + εt
  • Where εt is NID(0, σε²)
  • Assume that it is now time T and we use standard
    least squares to estimate b1 and b2
  • Then we can show, with much work, that

135
(No Transcript)
136
(No Transcript)
137
(No Transcript)
138
Huge Assumption is
  • This assumes that the underlying model does not
    change.
  • As we get more data, the estimate of the trend
    line gets better.
  • However, the underlying model always changes

139
Adaptive Forecasting
  • The underlying behavior of the time series often
    changes
  • If series behavior seems stable, we want to
    weight data far into the past in our forecast.
  • (to take advantage of averaging)
  • If it is changing, we want to only consider
    recent data
  • (since past data no longer is valid for current
    behavior)

140
Basic Idea of Adaptive Forecasting
  • Monitor the forecast errors
  • If errors are stable, use a small α
  • If errors increase, increase α
  • We use tracking signals to monitor the forecast
    error

141
Tracking signals
  • Let e1(T) be the one-step-ahead forecast error at
    time T
  • Let Y(T) = Y(T-1) + e1(T) be the error total
  • That is, the cumulative sum of errors
  • Typically errors will be both positive and
    negative
  • Thus Y(T) should stay near zero

142
Tracking Signals
  • However, if the forecast starts to under- or
    over-predict the true value, the error total will
    behave like a trend
  • Example

143
(No Transcript)
144
(No Transcript)
145
Tracking Signal
  • TS = Error Total / MAD of the e1(T)
  • Dividing by the M.A.D. standardizes the error
    total
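A minimal sketch of the tracking signal; the ±4 threshold mentioned in the comment is a common rule of thumb, not something stated on these slides.

```python
import numpy as np

def tracking_signal(one_step_errors):
    # Cumulative error total divided by the mean absolute deviation (MAD).
    # |TS| drifting beyond roughly 4 suggests a biased forecast, i.e. it
    # is time to increase the smoothing parameter.
    e = np.asarray(one_step_errors, dtype=float)
    return e.sum() / np.mean(np.abs(e))
```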

146
(No Transcript)
147
(No Transcript)
148
Control Limits
149
Reduce smoothing parameter