
Partial Autocorrelations

Timotheus Darikwa
SSTA031: Time Series Analysis

University of Limpopo
Intro
• For MA(q) models the autocorrelation function is zero for lags beyond q.
• The autocorrelation function is therefore a good indicator of the order of an MA process.
Moving Average Processes
• MA(1): first-order MA process
  E(Y_t) = μ ≠ 0
  Y_t = μ + e_t − θe_{t−1}
  Y_t − μ = e_t − θe_{t−1}
• "Moving average":
  – Y_t is constructed from a weighted sum of the two most recent values of e.
Properties of MA(1)
E(Y_t − μ) = 0
Var(Y_t) = E(Y_t − μ)² = E(e_t − θe_{t−1})²
         = E(e_t² − 2θe_t e_{t−1} + θ²e_{t−1}²)
         = (1 + θ²)σ²
E[(Y_t − μ)(Y_{t−1} − μ)] = E[(e_t − θe_{t−1})(e_{t−1} − θe_{t−2})]
         = E[e_t e_{t−1} − θe_{t−1}² − θe_t e_{t−2} + θ²e_{t−1}e_{t−2}]
         = −θσ²
E[(Y_t − μ)(Y_{t−k} − μ)] = 0 for k > 1
MA(1)
• Covariance stationary
  – Mean and autocovariances are not functions of time
• Autocorrelation of a covariance-stationary process:
  ρ_k = γ_k / γ_0
• MA(1):
  ρ_1 = −θσ² / ((1 + θ²)σ²) = −θ / (1 + θ²)
Autocorrelation Function for MA(1): Y_t = e_t − 0.8e_{t−1}
[Figure: autocorrelation plotted against lag, 0–20; only the lag-0 and lag-1 values are nonzero.]
Intro
• However, the autocorrelations of an AR(p) model do not become zero after a certain number of lags: they die off rather than cut off.
• So a different function is needed to help determine the order of an autoregressive process.
Autoregressive Processes
• AR(1): first-order autoregression
  Y_t = c + φY_{t−1} + e_t
• Stationarity: we will assume |φ| < 1
• Can represent as an MA(∞):
  Y_t = (c + e_t) + φ(c + e_{t−1}) + φ²(c + e_{t−2}) + ⋯
      = c/(1 − φ) + e_t + φe_{t−1} + φ²e_{t−2} + ⋯
Properties of AR(1)
μ = E(Y_t) = c / (1 − φ)
γ_0 = E(Y_t − μ)²
    = E(e_t + φe_{t−1} + φ²e_{t−2} + ⋯)²
    = (1 + φ² + φ⁴ + ⋯)σ²
    = σ² / (1 − φ²)
Properties of AR(1), cont.
γ_k = E[(Y_t − μ)(Y_{t−k} − μ)]
    = E[(e_t + φe_{t−1} + ⋯ + φᵏe_{t−k} + ⋯)(e_{t−k} + φe_{t−k−1} + φ²e_{t−k−2} + ⋯)]
    = (φᵏ + φᵏ⁺² + φᵏ⁺⁴ + ⋯)σ²
    = φᵏ(1 + φ² + φ⁴ + ⋯)σ²
    = φᵏ σ² / (1 − φ²)
ρ_k = γ_k / γ_0 = φᵏ
Autocorrelation Function for AR(1): Y_t = 0.8Y_{t−1} + e_t
[Figure: autocorrelation plotted against lag, 0–20; the autocorrelations decay geometrically as 0.8ᵏ.]
Autocorrelation Function for AR(1): Y_t = −0.8Y_{t−1} + e_t
[Figure: autocorrelation plotted against lag, 0–20; the autocorrelations alternate in sign, decaying as (−0.8)ᵏ.]
Intro
• So a different function is needed to help determine the order of an autoregressive process.
• This function is called the partial autocorrelation function.
PACF
• If Y_t is a normally distributed time series, we can define the PACF as
  φ_kk = Corr(Y_t, Y_{t−k} | Y_{t−1}, Y_{t−2}, …, Y_{t−k+1})
• For a given ACF ρ_k, you find the PACF φ_kk using the Yule-Walker equations:
  ρ_j = φ_k1 ρ_{j−1} + φ_k2 ρ_{j−2} + φ_k3 ρ_{j−3} + ⋯ + φ_kk ρ_{j−k},
  j = 1, 2, …, k
PACF
• We have k linear equations as shown below:
  ρ_1 = φ_k1 ρ_0 + φ_k2 ρ_1 + φ_k3 ρ_2 + ⋯ + φ_kk ρ_{k−1}
  ρ_2 = φ_k1 ρ_1 + φ_k2 ρ_0 + φ_k3 ρ_1 + ⋯ + φ_kk ρ_{k−2}
  ⋮
  ρ_k = φ_k1 ρ_{k−1} + φ_k2 ρ_{k−2} + φ_k3 ρ_{k−3} + ⋯ + φ_kk ρ_0
• Here ρ_1, ρ_2, …, ρ_k are assumed to be known, and we must solve this system of linear equations for φ_k1, φ_k2, …, φ_kk.
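In matrix form this system is R φ = ρ, where R has entries R_ij = ρ_{|i−j|}. A minimal sketch of solving it numerically (illustrative names; a small hand-rolled Gaussian-elimination solver stands in for a library routine):

```python
# Build and solve the Yule-Walker system for phi_k1, ..., phi_kk
# given known autocorrelations rho_0, rho_1, ..., rho_k.

def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def yule_walker_phis(rho, k):
    """rho[j] = rho_j with rho[0] = 1; returns [phi_k1, ..., phi_kk]."""
    A = [[rho[abs(i - j)] for j in range(k)] for i in range(k)]
    b = [rho[j + 1] for j in range(k)]
    return solve_linear(A, b)

# Example: an AR(1) with phi = 0.6 has rho_j = 0.6**j.
rho = [0.6 ** j for j in range(5)]
print(yule_walker_phis(rho, 1))   # phi_11 = 0.6
print(yule_walker_phis(rho, 2))   # phi_22 = 0: the PACF cuts off
```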
Solving Systems of Linear Equations: Cramer's Rule
[Worked slides on Cramer's rule, including the 2×2 matrix case; the matrix examples did not survive extraction.]
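As a stand-in for the lost worked examples, here is the 2×2 case of Cramer's rule, x_i = det(A_i)/det(A) where A_i is A with column i replaced by b, applied to a k = 2 Yule-Walker system (values illustrative):

```python
# Cramer's rule for a 2x2 system A x = b.

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cramer_2x2(A, b):
    d = det2(A)
    x1 = det2([[b[0], A[0][1]], [b[1], A[1][1]]]) / d
    x2 = det2([[A[0][0], b[0]], [A[1][0], b[1]]]) / d
    return x1, x2

# k = 2 Yule-Walker system with rho_1 = 0.5, rho_2 = 0.4:
#   phi_21 + 0.5*phi_22 = 0.5
#   0.5*phi_21 + phi_22 = 0.4
A = [[1.0, 0.5], [0.5, 1.0]]
b = [0.5, 0.4]
phi21, phi22 = cramer_2x2(A, b)
print(phi21, phi22)   # 0.4, 0.2
```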
PACF
• We have k linear equations as shown below:
  φ_k1 ρ_0 + φ_k2 ρ_1 + φ_k3 ρ_2 + ⋯ + φ_kk ρ_{k−1} = ρ_1
  φ_k1 ρ_1 + φ_k2 ρ_0 + φ_k3 ρ_1 + ⋯ + φ_kk ρ_{k−2} = ρ_2
  ⋮
  φ_k1 ρ_{k−1} + φ_k2 ρ_{k−2} + φ_k3 ρ_{k−3} + ⋯ + φ_kk ρ_0 = ρ_k
• Here ρ_1, ρ_2, …, ρ_k are assumed to be known, and we must solve this system of linear equations for φ_k1, φ_k2, …, φ_kk.
Use Cramer's rule to solve the system of linear equations.
The sample partial autocorrelation function is defined by:

            | 1         ρ_1       …  ρ_{k−2}  ρ_1 |
            | ρ_1       1         …  ρ_{k−3}  ρ_2 |
            | ⋮         ⋮             ⋮        ⋮  |
            | ρ_{k−1}   ρ_{k−2}   …  ρ_1      ρ_k |
  φ̂_kk =  ───────────────────────────────────────
            | 1         ρ_1       …  ρ_{k−1} |
            | ρ_1       1         …  ρ_{k−2} |
            | ⋮         ⋮             ⋮      |
            | ρ_{k−1}   ρ_{k−2}   …  1       |

(the determinant of the correlation matrix with its last column replaced by (ρ_1, …, ρ_k)′, divided by the determinant of the correlation matrix itself; in practice the sample autocorrelations ρ̂_j are substituted for the ρ_j).
PACF: AR(1) Model
• For an AR(p) process the PACF cuts off after lag p:
  φ_pp ≠ 0 and φ_kk = 0 for k > p
• For AR(1): k = p = 1. Substitute into
  ρ_j = φ_k1 ρ_{j−1} + φ_k2 ρ_{j−2} + φ_k3 ρ_{j−3} + ⋯ + φ_kk ρ_{j−k}
  with j = 1: ρ_1 = φ_11 ρ_0 = φ_11
  so φ_11 = ρ_1 = φ, since ρ_k = φᵏ.
PACF: AR(2) Model
• For AR(2): k = p = 2; j = 1, 2. Substitute into
  ρ_j = φ_k1 ρ_{j−1} + φ_k2 ρ_{j−2} + φ_k3 ρ_{j−3} + ⋯ + φ_kk ρ_{j−k}:
  j = 1: ρ_1 = φ_21 + φ_22 ρ_1
  j = 2: ρ_2 = φ_21 ρ_1 + φ_22
PACF: MA(1) Model
ρ_j = φ_k1 ρ_{j−1} + φ_k2 ρ_{j−2} + φ_k3 ρ_{j−3} + ⋯ + φ_kk ρ_{j−k}
• For MA(1): ρ_1 = −θ/(1 + θ²) and ρ_j = 0 for j > 1, so with k = j = 1 we get φ_11 = ρ_1 = −θ/(1 + θ²).
• Unlike the AR case, the higher-lag partial autocorrelations of an MA model do not vanish: the PACF of an MA(q) process dies off gradually, while its ACF cuts off after lag q.