First steps in time series - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
First steps in time series
2
Time series look different
3
But they are not
xt = systematic pattern + noise
The noise obscures the pattern
The pattern is often made of two basic components:
trend
seasonality (periodic)
The underlying model may be linear or non-linear, time-changing, uncanny
4
Monthly car production in Spain
5
  • Trend analysis
  • Smooth data using an underlying model
  • Moving average
  • Exponential moving average
  • Fit any function
  • linear, splines, log, exp, your guess

Good for one-period-ahead forecasting (e.g., weather)
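The smoothing step above can be sketched as follows. This is a minimal illustration, assuming Python with NumPy; the function names are mine, not from the slides:

```python
import numpy as np

def moving_average(x, window):
    """Trailing moving average; the first window-1 values are undefined (NaN)."""
    x = np.asarray(x, dtype=float)
    out = np.full(len(x), np.nan)
    kernel = np.ones(window) / window
    out[window - 1:] = np.convolve(x, kernel, mode="valid")
    return out

def exponential_moving_average(x, alpha):
    """EMA: s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    x = np.asarray(x, dtype=float)
    s = np.empty_like(x)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s
```

Fitting any function (linear, splines, log, exp) would replace the kernel with a least-squares fit, as in the detrending sketch on the next slide.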
6
Detect and model the trend (e.g., a linear fit, or ma(12)), then subtract it
Goal: stationarity (in mean and variance)
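A minimal detrending sketch for the linear case (NumPy assumed; `detrend_linear` is my name for it):

```python
import numpy as np

def detrend_linear(x):
    """Fit x_t = a*t + b by least squares; return (residuals, fitted trend)."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x))
    a, b = np.polyfit(t, x, 1)
    trend = a * t + b
    return x - trend, trend
```

The residuals are the candidate stationary series on which seasonality is then studied.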
7
  • Seasonality
  • Look for autocorrelations
  • absolute values
  • increments
  • other combinations of variables!
    (intuition + expertise)

8
lag  1   npat 58   corr -0.0397427758
lag  2   npat 57   corr -0.274627552
lag  3   npat 56   corr -0.265544135
lag  4   npat 55   corr  0.20316958
lag  5   npat 54   corr -0.0689740424
lag  6   npat 53   corr -0.0261346967
lag  7   npat 52   corr -0.106164043
lag  8   npat 51   corr  0.242284075
lag  9   npat 50   corr -0.245981648
lag 10   npat 49   corr -0.315532046
lag 11   npat 48   corr -0.0585207867
lag 12   npat 47   corr  0.94479132
lag 13   npat 46   corr -0.0744984938
lag 14   npat 45   corr -0.255095906
lag 15   npat 44   corr -0.272366416
lag 16   npat 43   corr  0.197473796
lag 17   npat 42   corr  2.4789972E-05
lag 18   npat 41   corr -0.0635903421
lag 19   npat 40   corr -0.138785333
lag 20   npat 39   corr  0.311166354
lag 21   npat 38   corr -0.224766545
lag 22   npat 37   corr -0.337246239
lag 23   npat 36   corr -0.056239266
lag 24   npat 35   corr  0.916841357
xt vs xt-lag: the structure (peaks at lags 12 and 24) reveals seasonality; note the anticorrelations at the other lags
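The lag/npat/corr tables on these slides can be reproduced with a small helper (NumPy assumed; `lag_corr` is my name, and `inc` stands for the number of differencing passes, matching the "increments" tables that follow):

```python
import numpy as np

def lag_corr(x, lag, inc=0):
    """Correlation of x_t with x_{t-lag}.

    inc > 0 first takes increments (first differences) of the series
    inc times. Returns (npat, corr), where npat is the number of
    overlapping pairs, as in the slides.
    """
    x = np.asarray(x, dtype=float)
    for _ in range(inc):
        x = np.diff(x)                      # increments
    a, b = x[lag:], x[:-lag]                # x_t vs x_{t-lag}
    return len(a), np.corrcoef(a, b)[0, 1]
```

On a purely 12-periodic series this gives a correlation near 1 at lag 12 and near -1 at lag 6, the same pattern the tables show for the car-production data.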
9
First increments (inc 1): Δxt vs Δxt-lag
lag  1   inc 1   npat 57   corr -0.390023157
lag  2   inc 1   npat 56   corr -0.101618385
lag  3   inc 1   npat 55   corr -0.184521702
lag  4   inc 1   npat 54   corr  0.275796168
lag  5   inc 1   npat 53   corr -0.11691902
lag  6   inc 1   npat 52   corr  0.0605651902
lag  7   inc 1   npat 51   corr -0.146978369
lag  8   inc 1   npat 50   corr  0.309853501
lag  9   inc 1   npat 49   corr -0.18309118
lag 10   inc 1   npat 48   corr -0.127671412
lag 11   inc 1   npat 47   corr -0.35079422
lag 12   inc 1   npat 46   corr  0.944415972
lag 13   inc 1   npat 45   corr -0.390599355
12-month correlation!
10
Second increments (inc 2):
lag  1   inc 2   npat 56   corr  0.0635841158
lag  2   inc 2   npat 55   corr -0.661845284
lag  3   inc 2   npat 54   corr -0.14900299
lag  4   inc 2   npat 53   corr  0.236847899
lag  5   inc 2   npat 52   corr  0.0775265432
lag  6   inc 2   npat 51   corr -0.1446445
lag  7   inc 2   npat 50   corr  0.0427922726
lag  8   inc 2   npat 49   corr  0.268605346
lag  9   inc 2   npat 48   corr -0.13164323
lag 10   inc 2   npat 47   corr -0.660207622
lag 11   inc 2   npat 46   corr  0.0676404933
lag 12   inc 2   npat 45   corr  0.956921988
lag 13   inc 2   npat 44   corr  0.0444132255
2-month anticorrelation!
Danger: correlations vs. error (a high correlation does not by itself guarantee a low forecast error)
11
Fourier analysis uncovers periodicity
const term
2π/48.131
single tick: 2π
More sophisticated analysis is possible but
brings little further information
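A minimal version of this Fourier step, assuming NumPy (`dominant_period` is my name): locate the strongest periodic component from the FFT power spectrum, ignoring the constant (zero-frequency) term.

```python
import numpy as np

def dominant_period(x):
    """Period (in samples) of the strongest component in the power spectrum."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                       # remove the const term
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x))        # cycles per sample
    k = np.argmax(power[1:]) + 1           # skip the DC bin
    return 1.0 / freqs[k]
```

For the monthly production data this would pick out the 12-month seasonality found in the autocorrelation tables.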
12
Reasonable bets
13
Best linear fit vs. neural fit, each predicting from the last 12 months
14
Magnification of the last 12 months (8 training patterns + 4 predictions)
15
The linear fit depends heavily on one variable; the neural net finds non-linear relations that enhance the correlations
16
Simple is good
  • If in doubt, start with a simple dependency, e.g. xt ≈ xt-lag
  • lag 1 day
    • Weather forecast
    • Donuts
  • lag 7 days
    • Donuts
    • Electricity load curve
  • lag 1 year
    • Electricity load curve
    • Sales

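The simple dependency xt ≈ xt-lag is just a persistence forecast. A sketch, assuming NumPy (both function names are mine):

```python
import numpy as np

def persistence_forecast(x, lag):
    """Simplest dependency xt ≈ xt-lag: the next value is the one lag steps back."""
    return np.asarray(x, dtype=float)[-lag]

def persistence_errors(x, lag):
    """In-sample errors of the rule xt ≈ xt-lag."""
    x = np.asarray(x, dtype=float)
    return x[lag:] - x[:-lag]
```

On a series with an exact 7-day cycle (e.g., a weekly load curve), lag 7 gives zero error; this baseline is what any fancier model has to beat.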
17
ARIMA
Define:
  Backward shift: B xt = xt-1
  Backward difference: ∇xt = xt − xt-1 = (1 − B) xt
  Polynomials φ(B) of degree p and θ(B) of degree q
Auto-Regressive Integrated Moving Average time series models
18
AR(p): φ(B) xt = εt
I(d): difference d times, ∇^d xt
MA(q): xt = θ(B) εt
Together: φ(B) ∇^d xt = θ(B) εt
Zeros of the polynomials must lie outside the unit circle (stationarity and invertibility)
Example: ARIMA(1,0,0): xt = φ1 xt-1 + εt
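A sketch of the ARIMA(1,0,0) example, assuming NumPy: simulate xt = φ xt-1 + εt and recover φ by a simple lag-1 least-squares regression (a stand-in for a full ARIMA fit).

```python
import numpy as np

rng = np.random.default_rng(0)

# ARIMA(1,0,0) = AR(1): x_t = phi * x_{t-1} + eps_t.
# |phi| < 1 puts the zero of phi(z) = 1 - phi*z (at 1/phi) outside the unit circle.
phi, n = 0.7, 5000
eps = rng.normal(0.0, 1.0, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Estimate phi by regressing x_t on x_{t-1} (least squares, no intercept).
phi_hat = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
```

With |φ| ≥ 1 the zero would sit on or inside the unit circle and the simulated series would not be stationary.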
19
NN enhancement
  • Rather than using a recursive NN:
  • carry out the linear analysis
  • preprocess the data ARIMA-like
  • perform a linear forecast
  • feed a NN with all of the linear analysis:
    • the preprocessed data
    • the linear prediction

The NN will learn the underlying law (if any) controlling the departures of the real data from the linear analysis
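The steps above can be sketched end to end. This is a toy illustration, assuming NumPy: the series (a logistic map) has a non-linear law a linear model cannot capture; a tiny hand-rolled one-hidden-layer net (my implementation, not the slides') is fed the lag features plus the linear prediction and learns the departures from the linear forecast.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_patterns(x, nlags):
    """Lag features X[t] = (x_{t-nlags}, ..., x_{t-1}), target y[t] = x_t."""
    X = np.array([x[t - nlags:t] for t in range(nlags, len(x))])
    return X, x[nlags:]

def train_residual_nn(Z, r, hidden=8, lr=0.05, epochs=4000):
    """One-hidden-layer tanh net, full-batch gradient descent on residuals r."""
    n, d = Z.shape
    W1 = rng.normal(0.0, 0.3, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.3, hidden); b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(Z @ W1 + b1)
        err = H @ W2 + b2 - r                      # prediction error
        gH = np.outer(err, W2) * (1.0 - H ** 2)    # backprop through tanh
        W2 -= lr * (H.T @ err) / n; b2 -= lr * err.mean()
        W1 -= lr * (Z.T @ gH) / n; b1 -= lr * gH.mean(0)
    return lambda Q: np.tanh(Q @ W1 + b1) @ W2 + b2

# Toy series with a non-linear law the linear model misses (logistic map).
n = 400
x = np.empty(n); x[0] = 0.2
for t in range(1, n):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

X, y = make_patterns(x, 3)
Xb = np.column_stack([np.ones(len(X)), X])     # intercept + lags
w = np.linalg.lstsq(Xb, y, rcond=None)[0]
lin = Xb @ w                                   # linear forecast
Z = np.column_stack([X, lin])                  # preprocessed data + linear prediction
Z = (Z - Z.mean(0)) / (Z.std(0) + 1e-12)       # standardize for the net
hybrid = lin + train_residual_nn(Z, y - lin)(Z)
```

The hybrid forecast corrects the linear one wherever the NN has found systematic, learnable departures.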
20
Leave-one-out NN
When the number of data points is very small, use leave-one-out:
  step k (k = 1 .. n): leave out pattern k (its vars and goal),
  train on the remaining n − 1 patterns, predict pattern k
Collect statistics over the n predictions
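The leave-one-out loop can be sketched as follows (NumPy assumed; for brevity the learner is linear least squares standing in for the NN the slide trains at each step):

```python
import numpy as np

def leave_one_out(X, y, fit, predict):
    """Step k: leave out pattern k, train on the other n-1, predict pattern k."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    preds = np.empty(len(y))
    for k in range(len(y)):
        mask = np.ones(len(y), dtype=bool)
        mask[k] = False                      # leave pattern k out
        model = fit(X[mask], y[mask])
        preds[k] = predict(model, X[k])
    return preds

# Stand-in learner: linear least squares (the slide trains a NN at each step).
fit_ls = lambda A, b: np.linalg.lstsq(A, b, rcond=None)[0]
predict_ls = lambda w, row: float(row @ w)
```

The collected `preds` vs. `y` give the error statistics, using every pattern once as a test case.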
21
About noise
We have discarded the noise so far. Be careful: in finance,
asset1 = trend1 + noise1
asset2 = trend2 + noise2
and the noises are correlated
Cov(i,j) for 3000 × 3000 assets: huge CPU time
Brownian motion: increments ~ N(0, Δt)
VaR
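A minimal Monte-Carlo sketch of the last two points, assuming NumPy: Brownian paths built from independent N(0, Δt) increments, and a 95% VaR read off as a percentile of the terminal P&L (the parameter choices here are illustrative).

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 10000, 252
dt = 1.0 / n_steps

# Brownian motion: independent increments distributed N(0, dt) (variance dt).
increments = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
terminal = increments.sum(axis=1)            # W(1) ~ N(0, 1)

# Monte-Carlo 95% VaR: the loss exceeded in only 5% of scenarios.
var95 = -np.quantile(terminal, 0.05)
```

With correlated noises across 3000 assets one would draw correlated increments from the full covariance matrix instead, which is where the huge CPU time comes from.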
22
Summary
  • Time series are often structured
  • Analyze trend, seasonality, and noise
  • Build a linear model with preprocessed data
  • Build NN on top of previous analysis