Lecture Notes in Statistics 145 Chapter 3 Part 2
The AR(1) process is
$$Y_t = \delta + \phi Y_{t-1} + \varepsilon_t.$$
Taking expectations on both sides,
$$E[Y_t] = E[\delta + \phi Y_{t-1} + \varepsilon_t] = \delta + \phi E[Y_{t-1}] + E[\varepsilon_t].$$
By stationarity $E[Y_t] = E[Y_{t-1}] = \mu$, and $E[\varepsilon_t] = 0$, so
$$\mu = \delta + \phi\mu \quad\Longrightarrow\quad \mu(1 - \phi) = \delta \quad\Longrightarrow\quad \mu = \frac{\delta}{1 - \phi}.$$
Remarks:
- The effect of the constant term $\delta$ on the mean depends on the autoregressive parameter $\phi$.
- Note that the mean is not defined when $\phi = 1$. The process whose AR parameter is unity is not stationary.
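The mean formula can be checked by simulation. A minimal sketch, assuming illustrative parameter values $\delta = 1$, $\phi = 0.5$, $\sigma = 1$ (none of which come from the notes):

```python
import numpy as np

# Simulate Y_t = delta + phi * Y_{t-1} + eps_t, with eps_t ~ N(0, sigma^2).
# Parameter values are illustrative, not from the notes.
delta, phi, sigma, n = 1.0, 0.5, 1.0, 100_000
rng = np.random.default_rng(0)
eps = rng.normal(0.0, sigma, n)

y = np.empty(n)
y[0] = delta / (1.0 - phi)        # start at the theoretical mean
for t in range(1, n):
    y[t] = delta + phi * y[t - 1] + eps[t]

theoretical_mean = delta / (1.0 - phi)   # delta / (1 - phi) = 2.0 here
sample_mean = y.mean()                   # close to 2.0 for large n
```

For large $n$ the sample mean settles near $\delta/(1-\phi)$; with $\phi$ close to 1 the convergence is visibly slower, which previews the non-stationarity at $\phi = 1$.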
Substituting $\delta = (1 - \phi)\mu$ gives the mean-adjusted form of the process:
$$Y_t = \delta + \phi Y_{t-1} + \varepsilon_t$$
$$Y_t = (1 - \phi)\mu + \phi Y_{t-1} + \varepsilon_t$$
$$Y_t - \mu = \phi(Y_{t-1} - \mu) + \varepsilon_t$$
$$X_t = \phi X_{t-1} + \varepsilon_t, \qquad X_t = Y_t - \mu.$$
Note that the series and the mean-adjusted one have the same variance: $\gamma_0 = \mathrm{Var}(Y_t) = \mathrm{Var}(Y_t - \mu) = \mathrm{Var}(X_t)$. To derive it,
$$\mathrm{Var}(X_t) = E\big[X_t - E[X_t]\big]^2 = E\big[(\phi X_{t-1} + \varepsilon_t)^2\big]$$
$$= E\big[\phi^2 X_{t-1}^2 + \varepsilon_t^2 + 2\phi X_{t-1}\varepsilon_t\big]$$
$$= \phi^2\,\mathrm{Var}(X_{t-1}) + \sigma^2 + 0.$$
By stationarity $\mathrm{Var}(X_t) = \mathrm{Var}(X_{t-1}) = \gamma_0$, so $\gamma_0 = \phi^2\gamma_0 + \sigma^2$, i.e.
$$\gamma_0 = \frac{\sigma^2}{1 - \phi^2}.$$
Remarks:
- The effect of the noise variance $\sigma^2$ also depends on the autoregressive parameter $\phi$.
- Note that the variance is defined only when $|\phi| < 1$. This is the restriction on the AR parameter for the AR(1) process to be stationary.
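The variance formula $\gamma_0 = \sigma^2/(1-\phi^2)$ can likewise be checked by simulation; a sketch with illustrative values $\phi = 0.5$, $\sigma = 1$:

```python
import numpy as np

# Mean-adjusted AR(1): X_t = phi * X_{t-1} + eps_t.
# Parameter values are illustrative, not from the notes.
phi, sigma, n = 0.5, 1.0, 200_000
rng = np.random.default_rng(1)
eps = rng.normal(0.0, sigma, n)

x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

gamma0_theory = sigma**2 / (1.0 - phi**2)   # sigma^2 / (1 - phi^2) = 4/3 here
gamma0_sample = x.var()                     # close to 4/3 for large n
```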
The autocovariances follow from $\mathrm{Cov}(X_t, X_{t-k}) = E(X_t X_{t-k})$:
$$k = 1:\quad \gamma_1 = E(X_t X_{t-1}) = E[(\phi X_{t-1} + \varepsilon_t)X_{t-1}] = \phi E\big[X_{t-1}^2\big] + \underbrace{E[X_{t-1}\varepsilon_t]}_{=\,0} = \phi\gamma_0 = \frac{\phi\sigma^2}{1 - \phi^2}$$
$$k = 2:\quad \gamma_2 = E(X_t X_{t-2}) = E[(\phi X_{t-1} + \varepsilon_t)X_{t-2}] = \phi E[X_{t-1}X_{t-2}] + E[X_{t-2}\varepsilon_t] = \phi\gamma_1 = \phi^2\gamma_0 = \frac{\phi^2\sigma^2}{1 - \phi^2}$$
In general,
$$\gamma_k = E(X_t X_{t-k}) = \phi^k\gamma_0 = \frac{\phi^k\sigma^2}{1 - \phi^2}.$$
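A quick numerical check that the sample autocorrelations of a simulated AR(1) series decay like $\phi^k$ (illustrative values again):

```python
import numpy as np

# Simulate the mean-adjusted AR(1); phi and n are illustrative.
phi, n = 0.5, 200_000
rng = np.random.default_rng(2)
eps = rng.normal(0.0, 1.0, n)

x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

def acf(series, k):
    """Sample autocorrelation at lag k."""
    s = series - series.mean()
    return np.sum(s[k:] * s[:-k]) / np.sum(s * s)

# Theory: rho_k = phi**k, i.e. 0.5, 0.25, 0.125 for k = 1, 2, 3.
sample_rho = [acf(x, k) for k in (1, 2, 3)]
theory_rho = [phi**k for k in (1, 2, 3)]
```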
The covariance structure of AR(1) does not depend on time t but on
the distance k between the two variables.
Here are some simulated examples of AR(1) models. The red lines in the ACF and PACF plots show the expected behavior:
Noise: $y_t = \varepsilon_t$.

[Figures: simulated series (length 1000) with sample ACF and PACF]
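For the pure-noise example, every autocorrelation beyond lag 0 is zero in theory; a simulated check (seed and length are illustrative):

```python
import numpy as np

# White noise y_t = eps_t; sample ACF should be near zero at all lags >= 1.
rng = np.random.default_rng(3)
y = rng.normal(0.0, 1.0, 100_000)

s = y - y.mean()
sample_acf = np.array([np.sum(s[k:] * s[:-k]) / np.sum(s * s)
                       for k in range(1, 21)])
max_abs_acf = np.abs(sample_acf).max()   # of order 1/sqrt(n)
```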
High negative AR parameter: $y_t = -0.9\,y_{t-1} + \varepsilon_t$. Note the alternating sign of the ACF when the parameter is negative.

[Figures: simulated series (length 1000) with sample ACF and PACF]
The AR(2) process is
$$Y_t = \delta + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t.$$
Its mean satisfies
$$\mu(1 - \phi_1 - \phi_2) = \delta \quad\Longrightarrow\quad \mu = \frac{\delta}{1 - \phi_1 - \phi_2}.$$
The autocovariances satisfy the Yule–Walker equations:
$$\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \sigma^2 \qquad (1)$$
$$\gamma_1 = \phi_1\gamma_0 + \phi_2\gamma_1 \quad\Longrightarrow\quad \rho_1 = \frac{\phi_1}{1 - \phi_2}$$
$$\gamma_2 = \phi_1\gamma_1 + \phi_2\gamma_0 \quad\Longrightarrow\quad \rho_2 = \frac{\phi_1^2}{1 - \phi_2} + \phi_2$$
$$\gamma_k = \phi_1\gamma_{k-1} + \phi_2\gamma_{k-2}, \qquad k \ge 3.$$
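The recursion above generates the whole theoretical ACF from $\rho_1$; a sketch comparing it with a simulated AR(2) series, assuming illustrative values $\phi_1 = 0.5$, $\phi_2 = 0.3$:

```python
import numpy as np

phi1, phi2, n = 0.5, 0.3, 200_000   # illustrative, stationary values

# Theoretical ACF from the Yule-Walker recursion.
rho = [1.0, phi1 / (1.0 - phi2)]            # rho_0, rho_1
for k in range(2, 6):
    rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])

# Simulate the mean-adjusted AR(2): X_t = phi1*X_{t-1} + phi2*X_{t-2} + eps_t.
rng = np.random.default_rng(4)
eps = rng.normal(0.0, 1.0, n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + eps[t]

s = x - x.mean()
sample_rho = [np.sum(s[k:] * s[:-k]) / np.sum(s * s) for k in range(1, 6)]
```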
The partial autocorrelations of the AR(2) process are (derivation not shown):
$$\rho_1,\quad \frac{\rho_2 - \rho_1^2}{1 - \rho_1^2},\quad 0,\ 0,\ \ldots$$
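Plugging the Yule–Walker expressions for $\rho_1$ and $\rho_2$ into the second partial autocorrelation shows it recovers $\phi_2$ exactly; a quick numeric check with illustrative values:

```python
phi1, phi2 = 0.5, 0.3   # illustrative AR(2) parameters

rho1 = phi1 / (1.0 - phi2)
rho2 = phi1 * rho1 + phi2

# Second partial autocorrelation from the formula above.
phi22 = (rho2 - rho1**2) / (1.0 - rho1**2)
# phi22 equals phi2; all higher partials are zero for AR(2),
# which is why the PACF cuts off after lag 2.
```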
$$y_t = 0.5\,y_{t-1} + 0.3\,y_{t-2} + \varepsilon_t$$

[Figure: simulated series (length 1000) with sample ACF and PACF]
$$y_t = 0.8\,y_{t-2} + \varepsilon_t$$

[Figure: simulated series (length 1000) with sample ACF and PACF]
$$y_t = 1.3\,y_{t-1} - 0.8\,y_{t-2} + \varepsilon_t$$

[Figure: simulated series (length 1000) with sample ACF and PACF]
For the MA(1) process, $X_t = Y_t - \mu = \varepsilon_t + \theta\varepsilon_{t-1}$, so
$$\mathrm{Var}(Y_t) = \mathrm{Var}(X_t) = E\big[X_t^2\big] = E\big[(\varepsilon_t + \theta\varepsilon_{t-1})^2\big]$$
$$= E\big[\varepsilon_t^2\big] + \theta^2 E\big[\varepsilon_{t-1}^2\big] + 2\theta E[\varepsilon_t\varepsilon_{t-1}]$$
$$= \sigma^2 + \theta^2\sigma^2 = \big(1 + \theta^2\big)\,\sigma^2.$$
To derive the autocovariance function, note that $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(X_t, X_{t-k}) = E(X_t X_{t-k})$:
$$\gamma_1 = E(X_t X_{t-1}) = E[(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-1} + \theta\varepsilon_{t-2})]$$
$$= E\big[\varepsilon_t\varepsilon_{t-1} + \theta\varepsilon_t\varepsilon_{t-2} + \theta\varepsilon_{t-1}^2 + \theta^2\varepsilon_{t-1}\varepsilon_{t-2}\big] = \theta\sigma^2$$
$$\gamma_2 = E(X_t X_{t-2}) = E[(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-2} + \theta\varepsilon_{t-3})]$$
$$= E\big[\varepsilon_t\varepsilon_{t-2} + \theta\varepsilon_t\varepsilon_{t-3} + \theta\varepsilon_{t-1}\varepsilon_{t-2} + \theta^2\varepsilon_{t-1}\varepsilon_{t-3}\big] = 0$$
$$\vdots$$
$$\gamma_k = E(X_t X_{t-k}) = 0, \qquad k \ge 2.$$
The ACF follows immediately from the autocovariance function:
$$\rho_1 = \frac{\gamma_1}{\gamma_0} = \frac{\theta\sigma^2}{(1 + \theta^2)\sigma^2} = \frac{\theta}{1 + \theta^2}$$
$$\rho_k = \frac{\gamma_k}{\gamma_0} = \frac{0}{(1 + \theta^2)\sigma^2} = 0, \qquad k \ge 2.$$
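The MA(1) cutoff, $\rho_1 = \theta/(1 + \theta^2)$ and $\rho_k = 0$ for $k \ge 2$, can be checked by simulation; a sketch with an illustrative $\theta = 0.9$:

```python
import numpy as np

theta, n = 0.9, 200_000   # illustrative values
rng = np.random.default_rng(5)
eps = rng.normal(0.0, 1.0, n + 1)

# MA(1): X_t = eps_t + theta * eps_{t-1}, built by shifting the noise array.
x = eps[1:] + theta * eps[:-1]

s = x - x.mean()
sample_rho = [np.sum(s[k:] * s[:-k]) / np.sum(s * s) for k in (1, 2, 3)]
theory_rho = [theta / (1.0 + theta**2), 0.0, 0.0]   # cutoff after lag 1
```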
For the MA(2) process:
$$E[Y_t] = E[\mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2}] = \mu$$
$$\mathrm{Var}[Y_t] = \gamma_0 = \big(1 + \theta_1^2 + \theta_2^2\big)\sigma^2$$
Autocovariance and autocorrelation functions:
$$\gamma_1 = (\theta_1 + \theta_1\theta_2)\,\sigma^2 \qquad \rho_1 = \frac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2}$$
$$\gamma_2 = \theta_2\,\sigma^2 \qquad \rho_2 = \frac{\theta_2}{1 + \theta_1^2 + \theta_2^2}$$
$$\gamma_k = 0, \qquad \rho_k = 0, \qquad k \ge 3.$$
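The same style of check works for the MA(2) formulas, whose ACF cuts off after lag 2; here with illustrative values $\theta_1 = -1.3$, $\theta_2 = 0.85$:

```python
import numpy as np

theta1, theta2, n = -1.3, 0.85, 200_000   # illustrative values
rng = np.random.default_rng(6)
eps = rng.normal(0.0, 1.0, n + 2)

# MA(2): X_t = eps_t + theta1 * eps_{t-1} + theta2 * eps_{t-2}.
x = eps[2:] + theta1 * eps[1:-1] + theta2 * eps[:-2]

denom = 1.0 + theta1**2 + theta2**2
theory_rho = [(theta1 + theta1 * theta2) / denom,   # rho_1
              theta2 / denom,                        # rho_2
              0.0]                                   # rho_k = 0, k >= 3

s = x - x.mean()
sample_rho = [np.sum(s[k:] * s[:-k]) / np.sum(s * s) for k in (1, 2, 3)]
```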
$$y_t = \varepsilon_t + 0.9\,\varepsilon_{t-1}$$

[Figure: simulated series (length 1000) with sample ACF and PACF]
$$y_t = \varepsilon_t - 0.5\,\varepsilon_{t-1}$$

[Figure: simulated series (length 1000) with sample ACF and PACF]
$$y_t = \varepsilon_t - 1.3\,\varepsilon_{t-1} + 0.85\,\varepsilon_{t-2}$$

[Figure: simulated series (length 1000) with sample ACF and PACF]