1
Time Series Analysis and Forecasting I
2
Introduction
  • A time series is a set of observations generated
    sequentially in time
  • Continuous vs. discrete time series
  • The observations from a discrete time series,
    made at some fixed interval h, at times τ1, τ2,
    …, τN may be denoted by z(τ1), z(τ2), …, z(τN)

3
Introduction (cont.)
  • Discrete time series may arise in two ways
  • 1- By sampling a continuous time series
  • 2- By accumulating a variable over a period of
    time (both ways are sketched below)
  • Characteristics of time series
  • Time periods are of equal length
  • No missing values

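A minimal Python sketch of these two ways (not from the original slides; the finely observed record x and the interval h = 24 are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=240)                     # a finely observed record

    h = 24                                       # fixed interval h (assumed)
    sampled = x[::h]                             # 1- sampling every h-th point
    accumulated = x.reshape(-1, h).sum(axis=1)   # 2- accumulating over each period

    # Both results have time periods of equal length and no missing
    # values, matching the characteristics listed above.
    print(sampled.shape, accumulated.shape)
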
4
Components of a time series
Zt = Ft + at, where Ft is the pattern (fit) component
and at is the random error component
5
Areas of application
  • Forecasting
  • Determination of a transfer function of a system
  • Design of simple feed-forward and feedback
    control schemes

6
Forecasting
  • Applications
  • Economic and business planning
  • Inventory and production control
  • Control and optimization of industrial processes
  • Lead time of the forecasts is the period over
    which forecasts are needed
  • Degree of sophistication
  • Simple ideas
  • Moving averages
  • Simple regression techniques
  • Complex statistical concepts
  • Box-Jenkins methodology

7
Approaches to forecasting
  • Self-projecting approach
  • Cause-and-effect approach

8
Approaches to forecasting (cont.)
  • Self-projecting approach
  • Advantages
  • Quickly and easily applied
  • A minimum of data is required
  • Reasonably accurate short- to medium-term
    forecasts
  • They provide a benchmark against which forecasts
    developed through other models can be measured
  • Disadvantages
  • Not useful for forecasting into the far future
  • Do not take into account external factors
  • Cause-and-effect approach
  • Advantages
  • Bring more information into the forecast
  • More accurate medium- to long-term forecasts
  • Disadvantages
  • Forecasts of the explanatory time series are
    required

9
Some traditional self-projecting models
  • Overall trend models
  • The trend could be linear, exponential,
    parabolic, etc.
  • A linear trend has the form
  • Trendt = A + B·t
  • Short-term changes are difficult to track
  • Smoothing models
  • Respond to the most recent behavior of the series
  • Employ the idea of weighted averages
  • They range in the degree of sophistication
  • The simple exponential smoothing method (a
    minimal sketch follows below)

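A minimal sketch of the simple exponential smoothing method mentioned above; the smoothing constant alpha = 0.3 and the series y are illustrative assumptions:

    import numpy as np

    def simple_exponential_smoothing(y, alpha=0.3):
        # s_t = alpha * y_t + (1 - alpha) * s_{t-1}: a weighted average
        # that responds most strongly to recent behavior of the series
        s = np.empty(len(y))
        s[0] = y[0]                  # initialize with the first observation
        for t in range(1, len(y)):
            s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
        return s

    y = np.array([12.0, 13.5, 12.8, 14.1, 13.9, 15.2])
    print(simple_exponential_smoothing(y))
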
10
Some traditional self-projecting models (cont.)
  • Seasonal models
  • Very common
  • Most seasonal time series also contain long- and
    short-term trend patterns
  • Decomposition models
  • The series is decomposed into its separate
    patterns
  • Each pattern is modeled separately (a
    decomposition sketch follows below)

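A minimal sketch of a decomposition model, using seasonal_decompose from statsmodels; the synthetic monthly series and period = 12 are illustrative assumptions:

    import numpy as np
    from statsmodels.tsa.seasonal import seasonal_decompose

    # A synthetic monthly series: linear trend + seasonal cycle + noise
    rng = np.random.default_rng(1)
    t = np.arange(120)
    z = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=2, size=120)

    result = seasonal_decompose(z, model="additive", period=12)
    print(result.seasonal[:12])   # each separate pattern (trend, seasonal,
    print(result.trend[6:10])     # residual) can now be modeled on its own
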
11
Drawbacks of the use of traditional models
  • There is no systematic approach for the
    identification and selection of an appropriate
    model, and therefore, the identification process
    is mainly trial-and-error
  • There is difficulty in verifying the validity of
    the model
  • Most traditional methods were developed from
    intuitive and practical considerations rather
    than from a statistical foundation
  • Too narrow to deal efficiently with all time
    series

12
ARIMA models
  • Autoregressive Integrated Moving-average
  • Can represent a wide range of time series
  • A stochastic modeling approach that can be used
    to calculate the probability of a future value
    lying between two specified limits

13
ARIMA models (Cont.)
  • In the 1960s Box and Jenkins recognized the
    importance of these models in the area of
    economic forecasting
  • Time Series Analysis: Forecasting and Control
  • George E. P. Box and Gwilym M. Jenkins
  • The 1st edition appeared in 1970, with a revised
    edition in 1976
  • Often called the Box-Jenkins approach

14
Transfer function modeling
  • Yt = ν(B)Xt where
  • ν(B) = ν0 + ν1B + ν2B^2 + …
  • B is the backshift operator
  • B^m Xt = Xt-m (sketched below)

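A minimal numpy sketch of the backshift operator and a finite transfer function; the weights nu and the series x are illustrative assumptions:

    import numpy as np

    def backshift(x, m):
        # B^m x_t = x_{t-m}; the first m values are undefined (NaN)
        out = np.full(len(x), np.nan)
        out[m:] = x[:-m]
        return out

    def transfer(x, nu):
        # y_t = nu_0*x_t + nu_1*x_{t-1} + ...; missing early lags drop out
        y = np.zeros(len(x))
        for j, w in enumerate(nu):
            y += w * (x.astype(float) if j == 0 else np.nan_to_num(backshift(x, j)))
        return y

    x = np.arange(10.0)
    print(backshift(x, 2))               # [nan nan 0. 1. ... 7.]
    print(transfer(x, [0.5, 0.3, 0.2]))
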
15
Transfer function modeling (cont.)
  • The study of process dynamics can achieve
  • Better control
  • Improved design
  • Methods for estimating transfer function models
  • Classical methods
  • Based on deterministic perturbations
  • Uncontrollable disturbances (noise) are not
    accounted for, and hence, these methods have not
    always been successful
  • Statistical methods
  • Make allowance for noise
  • The Box-Jenkins methodology

16
Process control
  • Feed-forward control
  • Feedback control

17
Process control (cont.)
18
Process control (cont.)
  • The Box-Jenkins approach to control is to typify
    the disturbance by a suitable time series or
    stochastic model and the inertial characteristics
    of the system by a suitable transfer function
    model
  • The control equation allows the action that
    should be taken at any given time to be
    calculated from the present and previous states
    of the system
  • A control action called for by the control
    equation can be executed in various ways,
    corresponding to various levels of technological
    sophistication

19
The Box-Jenkins model building process
Model identification → Model estimation → Is the
model adequate? If no, modify the model and repeat
the cycle; if yes, proceed to forecasts
20
The Box-Jenkins model building process (cont.)
  • Model identification
  • Autocorrelations
  • Partial-autocorrelations
  • Model estimation
  • The objective is to minimize the sum of squares
    of errors
  • Model validation
  • Certain diagnostics are used to check the
    validity of the model
  • Model forecasting
  • The estimated model is used to generate forecasts
    and confidence limits of the forecasts (all four
    stages are sketched in code below)

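A minimal end-to-end sketch of the four stages using statsmodels; the simulated AR(1) data, the candidate order (1, 0, 0), and the Ljung-Box diagnostic are illustrative assumptions, not a prescription from the slides:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.stattools import acf, pacf
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(2)
    z = np.zeros(200)
    for t in range(1, 200):              # simulate an AR(1) with phi_1 = 0.7
        z[t] = 0.7 * z[t - 1] + rng.normal()

    # 1- Identification: inspect the sample ACs and PACs
    print(acf(z, nlags=10), pacf(z, nlags=10))

    # 2- Estimation: fit the candidate model
    fit = ARIMA(z, order=(1, 0, 0)).fit()
    print(fit.params)

    # 3- Validation: residuals should resemble white noise
    print(acorr_ljungbox(fit.resid, lags=[10]))

    # 4- Forecasting: point forecasts with confidence limits
    fc = fit.get_forecast(steps=5)
    print(fc.predicted_mean, fc.conf_int())
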
21
Important Fundamentals
  • A Normal process
  • Stationarity
  • Regular differencing
  • Autocorrelations (ACs)
  • The white noise process
  • The linear filter model
  • Invertibility

22
A Normal process (A Gaussian process)
  • The Box-Jenkins methodology analyzes a time
    series as a realization of a stochastic process
  • The observation zt at a given time t can be
    regarded as a realization of a random variable zt
    with probability density function p(zt)
  • The observations at any two times t1 and t2 may
    be regarded as realizations of two random
    variables zt1 and zt2 with joint probability
    density function p(zt1, zt2)
  • If the probability distribution associated with
    any set of times is a multivariate Normal
    distribution, the process is called a normal or
    Gaussian process

23
Stationary stochastic processes
  • In order to model a time series with the
    Box-Jenkins approach, the series has to be
    stationary
  • In practical terms, the series is stationary if
    it tends to wander more or less uniformly about
    some fixed level
  • In statistical terms, a stationary process is
    assumed to be in a particular state of
    statistical equilibrium, i.e., p(zt) is the same
    for all t

24
Stationary stochastic processes (cont.)
  • The process is called strictly stationary
  • if the joint probability distribution of any m
    observations made at times t1, t2, …, tm is the
    same as that associated with m observations made
    at times t1+k, t2+k, …, tm+k
  • When m = 1, the stationarity assumption implies
    that the probability distribution p(zt) is the
    same for all times t

25
Stationary stochastic processes (cont.)
  • In particular, if zt is a stationary process,
    then the first difference ∇zt = zt - zt-1 and
    higher differences ∇^d zt are stationary
  • Most time series are nonstationary

26
Achieving stationarity
  • Regular differencing (RD)
  • (1st order) ∇zt = (1 - B)zt = zt - zt-1
  • (2nd order) ∇^2 zt = (1 - B)^2 zt = zt - 2zt-1 + zt-2
  • B is the backward shift operator
  • It is unlikely that more than two regular
    differences would ever be needed
  • Sometimes regular differencing by itself is not
    sufficient and a prior transformation is also
    needed (differencing is sketched below)

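A minimal numpy sketch of regular differencing; the random-walk series is an illustrative assumption:

    import numpy as np

    rng = np.random.default_rng(3)
    z = np.cumsum(rng.normal(size=300))   # a random walk: nonstationary

    d1 = np.diff(z, n=1)                  # 1st RD: z_t - z_{t-1}
    d2 = np.diff(z, n=2)                  # 2nd RD

    # The level of z wanders, while d1 fluctuates uniformly
    # about a fixed level (stationary).
    print(z[:5], d1[:5], d2[:5])
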
27
Some nonstationary series
28
Some nonstationary series (cont.)
29
Some nonstationary series (cont.)
How can we determine the number of regular
differences needed?
30
Autocorrelations (ACs)
  • Autocorrelations are statistical measures that
    indicate how a time series is related to itself
    over time
  • The autocorrelation at lag 1 is the correlation
    between the original series zt and the same
    series lagged by one period (zt-1)

31
Autocorrelations (cont.)
  • The theoretical autocorrelation function:
    ρk = γk / γ0, where γk = Cov(zt, zt+k) is the
    autocovariance at lag k
  • The sample autocorrelation:
    rk = Σ (zt - z̄)(zt+k - z̄) / Σ (zt - z̄)^2,
    where z̄ is the sample mean, the numerator sums
    over t = 1, …, N - k and the denominator over
    t = 1, …, N (a hand computation is sketched below)

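A minimal numpy sketch of the sample autocorrelation rk, checked against acf from statsmodels; the series z is an illustrative assumption:

    import numpy as np
    from statsmodels.tsa.stattools import acf

    def sample_acf(z, k):
        # r_k = sum_t (z_t - zbar)(z_{t+k} - zbar) / sum_t (z_t - zbar)^2
        zbar = z.mean()
        den = np.sum((z - zbar) ** 2)
        num = den if k == 0 else np.sum((z[:-k] - zbar) * (z[k:] - zbar))
        return num / den

    rng = np.random.default_rng(4)
    z = rng.normal(size=100)              # at least 50 observations, per slide 32
    print([round(sample_acf(z, k), 3) for k in range(4)])
    print(np.round(acf(z, nlags=3), 3))   # the two should agree
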
32
Autocorrelations (cont.)
  • A graph of the correlation values is called a
    correlogram
  • In practice, to obtain a useful estimate of the
    autocorrelation function, at least 50
    observations are needed
  • The estimated autocorrelations rk should be
    calculated up to a lag no larger than N/4

33
A correlogram of a nonstationary time series
34
After one RD
35
After two RD
36
The white noise process
  • The Box-Jenkins models are based on the idea that
    a time series can be usefully regarded as
    generated from (driven by) a series of
    uncorrelated, independent shocks at
  • Such a sequence at, at-1, at-2, … is called a
    white noise process

37
The linear filter model
  • A linear filter is a model that transforms the
    white noise process at into the process that
    generated the time series zt

38
The linear filter model (cont.)
  • ψ(B) is the transfer function of the filter:
    zt = μ + at + ψ1 at-1 + ψ2 at-2 + … = μ + ψ(B)at

39
The linear filter model (cont.)
  • The linear filter can be put in another form, in
    which the current deviation z̃t = zt - μ is
    regressed on its own past values:
    z̃t = π1 z̃t-1 + π2 z̃t-2 + … + at
  • This form can be written π(B)z̃t = at (a filter
    simulation is sketched below)

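A minimal sketch of a linear filter driven by white noise; the ψ weights and the level μ = 10 are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(5)
    n, mu = 500, 10.0
    a = rng.normal(size=n)            # the white noise shocks a_t
    psi = [1.0, 0.8, 0.4, 0.1]        # psi_0 = 1, then psi_1, psi_2, ...

    z = np.full(n, mu)
    for j, w in enumerate(psi):       # z_t = mu + sum_j psi_j * a_{t-j}
        z[j:] += w * a[: n - j]

    print(z[:5])                      # a realization of the filtered process
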
40
Stationarity and invertibility conditions for a
linear filter
  • For a linear process to be stationary, the ψ
    weights must form a convergent series, Σ |ψj| < ∞,
    so that ψ(B) converges on or within the unit
    circle
  • If the current observation zt depends on past
    observations with weights which decrease as we go
    back in time, the series is called invertible
  • For a linear process to be invertible, the π
    weights must form a convergent series, Σ |πj| < ∞,
    so that π(B) converges on or within the unit
    circle

41
Model building blocks
  • Autoregressive (AR) models
  • Moving-average (MA) models
  • Mixed ARMA models
  • Nonstationary models (ARIMA models)
  • The mean parameter
  • The trend parameter

42
Autoregressive (AR) models
  • An autoregressive model of order p:
    z̃t = φ1 z̃t-1 + φ2 z̃t-2 + … + φp z̃t-p + at,
    i.e., φ(B)z̃t = at, where
    φ(B) = 1 - φ1B - φ2B^2 - … - φpB^p
  • The autoregressive process can be thought of as
    the output from a linear filter with a transfer
    function φ^-1(B), when the input is white noise at
  • The equation φ(B) = 0 is called the
    characteristic equation (an AR simulation is
    sketched below)

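A minimal sketch simulating an AR(2) with ArmaProcess from statsmodels; the coefficients are illustrative assumptions (the ar argument takes the φ(B) polynomial, so the φs enter with a minus sign):

    import numpy as np
    from statsmodels.tsa.arima_process import ArmaProcess

    phi = np.array([0.75, -0.5])          # phi_1, phi_2 (assumed values)
    ar_poly = np.r_[1, -phi]              # phi(B) = 1 - 0.75B + 0.5B^2
    ma_poly = np.array([1.0])             # no MA terms

    process = ArmaProcess(ar_poly, ma_poly)
    print(process.isstationary)           # True iff the roots of phi(B) = 0
                                          # lie outside the unit circle
    z = process.generate_sample(nsample=300)
    print(z[:5])
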
43
Moving-average (MA) models
  • A moving-average model of order q:
    z̃t = at - θ1 at-1 - θ2 at-2 - … - θq at-q,
    i.e., z̃t = θ(B)at, where
    θ(B) = 1 - θ1B - θ2B^2 - … - θqB^q
  • The moving-average process can be thought of as
    the output from a linear filter with a transfer
    function θ(B), when the input is white noise at
  • The equation θ(B) = 0 is called the
    characteristic equation

44
Mixed AR and MA (ARMA) models
  • A moving-average process of 1st order can be
    written as z̃t = (1 - θ1B)at, so that
    at = (1 - θ1B)^-1 z̃t and hence
    z̃t = -θ1 z̃t-1 - θ1^2 z̃t-2 - … + at
  • Hence, if the process were really MA(1), we would
    obtain a non-parsimonious representation in terms
    of an autoregressive model (one with infinitely
    many parameters)

45
Mixed AR and MA (ARMA) models (cont.)
  • In order to obtain a parsimonious model,
    sometimes it will be necessary to include both AR
    and MA terms in the model
  • An ARMA(p, q) model:
    z̃t = φ1 z̃t-1 + … + φp z̃t-p + at - θ1 at-1 - … - θq at-q,
    i.e., φ(B)z̃t = θ(B)at
  • The ARMA process can be thought of as the output
    from a linear filter with a transfer function
    θ(B)/φ(B), when the input is white noise at (a
    simulation is sketched below)

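A minimal sketch simulating and re-estimating an ARMA(1, 1); the values φ1 = 0.6 and θ1 = 0.3 are illustrative assumptions (statsmodels reports the MA coefficient with the opposite sign convention, so ma.L1 should be near -0.3):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.arima_process import ArmaProcess

    # phi(B) = 1 - 0.6B and theta(B) = 1 - 0.3B
    process = ArmaProcess(np.array([1.0, -0.6]), np.array([1.0, -0.3]))
    z = process.generate_sample(nsample=1000)

    fit = ARIMA(z, order=(1, 0, 1)).fit()
    print(fit.params)    # ar.L1 near 0.6, ma.L1 near -0.3
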
46
The Box-Jenkins model building process
  • Model identification
  • Autocorrelations
  • Partial-autocorrelations
  • Model estimation
  • Model validation
  • Certain diagnostics are used to check the
    validity of the model
  • Model forecasting

47
Partial-autocorrelations (PACs)
  • Partial-autocorrelations are another set of
    statistical measures that are used to identify
    time series models
  • A PAC is similar to an AC, except that when
    calculating it, the ACs with all the elements
    within the lag are partialled out (Box and
    Jenkins, 1976)

48
Partial-autocorrelations (cont.)
  • PACs can be calculated from the values of the
    ACs, where each PAC is obtained from a different
    set of linear equations that describe a pure
    autoregressive model of an order equal to the
    value of the lag of the partial-autocorrelation
    computed (a Yule-Walker sketch follows below)
  • The PAC at lag k is denoted by φkk
  • The double subscript kk is to emphasize that φkk
    is the autoregressive parameter φk of the
    autoregressive model of order k

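A minimal numpy sketch of that calculation: for each lag k, solve the Yule-Walker equations of a pure AR(k) model built from the ACs and keep the last coefficient as φkk. The comparison uses statsmodels' pacf with method="ywm" (Yule-Walker without small-sample adjustment, matching the biased ACs used here); the AR(1) series is an illustrative assumption:

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    def pacf_from_acf(z, maxlag):
        r = acf(z, nlags=maxlag)
        out = []
        for k in range(1, maxlag + 1):
            # Yule-Walker system R * phi = rho for an AR(k) model
            R = np.array([[r[abs(i - j)] for j in range(k)] for i in range(k)])
            phi = np.linalg.solve(R, r[1 : k + 1])
            out.append(phi[-1])          # phi_kk: the last AR coefficient
        return np.array(out)

    rng = np.random.default_rng(6)
    z = np.zeros(400)
    for t in range(1, 400):
        z[t] = 0.6 * z[t - 1] + rng.normal()

    print(np.round(pacf_from_acf(z, 5), 3))
    print(np.round(pacf(z, nlags=5, method="ywm")[1:], 3))   # should agree
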
49
Model identification
  • The sample ACs and PACs are computed for the
    series and compared to the theoretical
    autocorrelation and partial-autocorrelation
    functions of the candidate models under
    investigation

50
Stationarity and invertibility conditions
  • For a linear process to be stationary, the ψ
    weights must satisfy Σ |ψj| < ∞
  • For a linear process to be invertible, the π
    weights must satisfy Σ |πj| < ∞

51
Stationarity requirements for AR(1) model
  • For an AR(1) to be stationary
  • -1 < φ1 < 1
  • i.e., the roots of the characteristic equation
    1 - φ1B = 0 lie outside the unit circle
  • For an AR(1) it can be shown that
  • ρk = φ1 ρk-1, which with ρ0 = 1 has the solution
  • ρk = φ1^k, k > 0
  • i.e., for a stationary AR(1) model, the
    theoretical autocorrelation function decays
    exponentially to zero, whereas the theoretical
    partial-autocorrelation function has a cut-off
    after the 1st lag (a numerical check is sketched
    below)

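A minimal numeric check of these two facts; φ1 = 0.7 and the sample size are illustrative assumptions:

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(7)
    phi1, n = 0.7, 5000
    z = np.zeros(n)
    for t in range(1, n):                     # a stationary AR(1), -1 < phi_1 < 1
        z[t] = phi1 * z[t - 1] + rng.normal()

    print(np.round(acf(z, nlags=4)[1:], 2))   # ~ phi1^k: 0.70 0.49 0.34 0.24
    print(np.round(pacf(z, nlags=4)[1:], 2))  # ~ 0.7 then ~ 0: cut-off after lag 1
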
52
Invertibility requirements for a MA(1) model
  • For a MA(1) to be invertible
  • -1 < θ1 < 1
  • i.e., the roots of the characteristic equation
    1 - θ1B = 0 lie outside the unit circle
  • For a MA(1) it can be shown that
  • ρ1 = -θ1 / (1 + θ1^2) and ρk = 0 for k > 1
  • i.e., for an invertible MA(1) model, the
    theoretical autocorrelation function has a
    cut-off after the 1st lag, whereas the
    theoretical partial-autocorrelation function
    decays exponentially to zero (a numerical check
    is sketched below)

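A minimal numeric check; θ1 = 0.5 is an illustrative assumption, so ρ1 = -0.5 / (1 + 0.25) = -0.4:

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(8)
    theta1, n = 0.5, 5000
    a = rng.normal(size=n + 1)
    z = a[1:] - theta1 * a[:-1]               # MA(1): z_t = a_t - theta_1 a_{t-1}

    print(np.round(acf(z, nlags=4)[1:], 2))   # ~ -0.4 then ~ 0: cut-off after lag 1
    print(np.round(pacf(z, nlags=4)[1:], 2))  # magnitudes decay toward zero
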
53
Higher order models
  • For an AR model of order p > 1
  • The autocorrelation function consists of a
    mixture of damped exponentials and damped sine
    waves
  • The partial-autocorrelation function has a
    cut-off after lag p
  • For an MA model of order q > 1
  • The autocorrelation function has a cut-off after
    lag q
  • The partial-autocorrelation function consists of
    a mixture of damped exponentials and damped sine
    waves

54
Permissible regions for the AR and MA parameters
55
Theoretical ACs and PACs
56
Theoretical ACs and PACs (cont.)
57
Model identification
58
Model estimation
59
Model verification