
LECTURE NOTES 2

THE AUTOCOVARIANCE AND THE
AUTOCORRELATION FUNCTIONS
• For a stationary process $\{Y_t\}$, the autocovariance between $Y_t$ and $Y_{t-k}$ is
$$\gamma_k = \operatorname{Cov}(Y_t, Y_{t-k}) = E\big[(Y_t - \mu)(Y_{t-k} - \mu)\big]$$
and the autocorrelation function (ACF) is
$$\rho_k = \operatorname{Corr}(Y_t, Y_{t-k}) = \frac{\gamma_k}{\gamma_0}.$$
THE AUTOCOVARIANCE AND THE
AUTOCORRELATION FUNCTIONS
PROPERTIES:
1. $\gamma_0 = \operatorname{Var}(Y_t)$; $\rho_0 = 1$.
2. $|\gamma_k| \le \gamma_0$; $|\rho_k| \le 1$.
3. $\gamma_k = \gamma_{-k}$ and $\rho_k = \rho_{-k}$ for all $k$.
THE PARTIAL AUTOCORRELATION
FUNCTION (PACF)
• PACF is the correlation between Yt and Yt-k after
their mutual linear dependency on the
intervening variables Yt-1, Yt-2, …, Yt-k+1 has been
removed.
• The conditional correlation
$$\phi_{kk} = \operatorname{Corr}(Y_t, Y_{t-k} \mid Y_{t-1}, Y_{t-2}, \ldots, Y_{t-k+1})$$
is usually referred to as the partial autocorrelation in time series.
e.g., $\phi_{11} = \operatorname{Corr}(Y_t, Y_{t-1}) = \rho_1$,
$$\phi_{22} = \operatorname{Corr}(Y_t, Y_{t-2} \mid Y_{t-1}).$$
WHITE NOISE (WN) PROCESS

• A white noise process $\{a_t\}$ is a sequence of uncorrelated random variables with zero mean and constant variance $\sigma_a^2$.
WHITE NOISE (WN) PROCESS
• It is a stationary process with autocovariance function
$$\gamma_k = \begin{cases} \sigma_a^2, & k = 0 \\ 0, & k \ne 0 \end{cases}$$
• ACF and PACF:
$$\rho_k = \begin{cases} 1, & k = 0 \\ 0, & k \ne 0 \end{cases} \qquad \phi_{kk} = \begin{cases} 1, & k = 0 \\ 0, & k \ne 0 \end{cases}$$
• Basic phenomenon: ACF = PACF = 0 for $k \ne 0$.
WHITE NOISE (WN) Process
• Memoryless process
• Building block from which we can construct
more complicated models

EXAMPLE
• RANDOM WALK: Let $e_1, e_2, \ldots$ be a sequence of i.i.d. random variables with mean 0 and variance $\sigma_e^2$. The observed time series $\{Y_t,\ t = 1, 2, \ldots, n\}$ is obtained as
$$Y_1 = e_1$$
$$Y_2 = e_1 + e_2 \;\Rightarrow\; Y_2 = Y_1 + e_2$$
$$Y_3 = e_1 + e_2 + e_3 \;\Rightarrow\; Y_3 = Y_2 + e_3$$
$$\vdots$$
$$Y_t = e_1 + \cdots + e_t \;\Rightarrow\; Y_t = Y_{t-1} + e_t$$
WHITE NOISE (WN) Process: Real-Life Applications
• The series formed by the number of cars passing through a certain location in successive periods of time is a clear example of a series of errors: there is no clear relationship between any count and the previous or upcoming counts, and therefore a set of these numbers cannot be relied upon to predict the next count.
• The winning lottery ticket numbers each month are also a series of errors: there is no relationship between the different winning numbers and previous winning numbers, and they cannot be relied upon to predict the winning ticket number for the next month.
From the previous examples it is clear that the error series has the following characteristics:
• There is no relationship between the different values of the series.
• A set of values in the series cannot be relied upon to predict future values of the series.
ESTIMATION OF THE MEAN, AUTOCOVARIANCE
AND AUTOCORRELATION
• THE SAMPLE MEAN:
$$\bar{y} = \frac{\sum_{t=1}^{n} y_t}{n}$$
with $E(\bar{Y}) = \mu$ and
$$\operatorname{Var}(\bar{Y}) = \frac{\gamma_0}{n} \sum_{k=-(n-1)}^{n-1} \left(1 - \frac{|k|}{n}\right) \rho_k .$$
Because $\operatorname{Var}(\bar{Y}) \to 0$ as $n \to \infty$, $\bar{Y}$ is a consistent estimator (CE) for $\mu$:
$$\lim_{n \to \infty} \bar{Y} = \mu \quad (\text{in mean square}).$$
If this holds, the process is ergodic for the mean.
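As a numerical illustration of this variance formula (a Python/numpy sketch; the zero-mean AR(1) test case with $\phi = 0.6$, for which $\gamma_0 = 1/(1-\phi^2)$ and $\rho_k = \phi^{|k|}$, is an assumed example, not from the lecture), one can compare the formula with the Monte Carlo variance of the sample mean.

```python
import numpy as np

rng = np.random.default_rng(3)
phi, n, reps = 0.6, 200, 5000

# Theoretical ingredients for a zero-mean AR(1): gamma_0 = 1/(1-phi^2), rho_k = phi^|k|
gamma0 = 1.0 / (1.0 - phi**2)
k = np.arange(-(n - 1), n)
formula_var = gamma0 / n * np.sum((1 - np.abs(k) / n) * phi ** np.abs(k))

# Monte Carlo: simulate many AR(1) series and look at the spread of their sample means
means = np.empty(reps)
for r in range(reps):
    y = np.zeros(n)
    e = rng.normal(size=n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    means[r] = y.mean()

print(formula_var, means.var())  # the two numbers should be close
```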
ERGODICITY
• In a strictly stationary or covariance stationary
stochastic process no assumption is made about
the strength of dependence between random
variables in the sequence. However, in many
contexts it is reasonable to assume that the
strength of dependence between random
variables in a stochastic process diminishes the
farther apart they become. This diminishing
dependence assumption is captured by the
concept of ergodicity.
ERGODICITY
• A process in which every sequence or sizable
sample is equally representative of the whole.
• The law of large numbers (LLN) tells us that if $Y_i \sim \text{i.i.d.}(\mu, \sigma^2)$ for $i = 1, \ldots, n$, then we have the following limit for the ensemble average:
$$\bar{Y}_n = \frac{\sum_{i=1}^{n} Y_i}{n} \to \mu .$$
• In time series, we have a time-series average, not an ensemble average. Hence, the mean is computed by averaging over time. Does the time-series average converge to the same limit as the ensemble average? The answer is yes, if $Y_t$ is stationary and ergodic.
ERGODICITY
• The statistical properties computed by averaging over time are the same as those computed by averaging over realizations (the ensemble), if the process is ergodic.
• A covariance stationary process is said to be ergodic for the mean if the time-series average converges to the population mean.
• Similarly, if the sample average provides a consistent estimate of the second moment, then the process is said to be ergodic for the second moment.
THE SAMPLE AUTOCOVARIANCE
FUNCTION

1 n k
̂k   Yt  Y Yt  k  Y 
n t 1
or
1 n k
̂k   Yt  Y Yt  k  Y 
n  k t 1

15
THE SAMPLE AUTOCORRELATION
FUNCTION
n k
 Yt  Y Yt  k  Y 
ˆ k rk  t 1 n
, k 0,1,2,...
 Yt  Y 
2
t 1
• A plot ̂ k versus k a sample correlogram
• For large sample sizes, ̂ k is normally
distributed with mean k and variance is
approximated by Bartlett’s approximation for
processes in which k=0 for k>m.
16
THE SAMPLE AUTOCORRELATION
FUNCTION
$$\operatorname{Var}(\hat{\rho}_k) \approx \frac{1}{n}\left(1 + 2\rho_1^2 + 2\rho_2^2 + \cdots + 2\rho_m^2\right)$$
• In practice, the $\rho_i$'s are unknown and are replaced by their sample estimates $\hat{\rho}_i$. Hence, we have the following large-lag standard error of $\hat{\rho}_k$:
$$s_{\hat{\rho}_k} = \sqrt{\frac{1}{n}\left(1 + 2\hat{\rho}_1^2 + 2\hat{\rho}_2^2 + \cdots + 2\hat{\rho}_m^2\right)}$$
THE SAMPLE AUTOCORRELATION
FUNCTION
• For a WN process, we have
$$s_{\hat{\rho}_k} = \frac{1}{\sqrt{n}} .$$
• The approximate 95% confidence interval for $\rho_k$ is
$$\hat{\rho}_k \pm \frac{2}{\sqrt{n}} .$$
For a WN process, $\hat{\rho}_k$ must be close to zero.
• Hence, to test whether the process is WN or not, draw $\pm 2/\sqrt{n}$ lines on the sample correlogram. If all $\hat{\rho}_k$ are inside the limits, the process could be WN (we need to check the sample PACF, too).
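The following sketch (Python with numpy; the helper name `sample_acf`, the simulated data, and the $\pm 2/\sqrt{n}$ band check are illustrative assumptions, not from the lecture) computes the sample ACF from the formula above and flags lags that fall outside the approximate white-noise limits.

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelation rho_hat_k for k = 1..max_lag."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    ybar = y.mean()
    denom = np.sum((y - ybar) ** 2)
    return np.array([
        np.sum((y[: n - k] - ybar) * (y[k:] - ybar)) / denom
        for k in range(1, max_lag + 1)
    ])

rng = np.random.default_rng(0)
y = rng.normal(size=300)          # simulated white noise
acf = sample_acf(y, max_lag=20)

bound = 2 / np.sqrt(len(y))       # approximate 95% limits under the WN hypothesis
outside = np.flatnonzero(np.abs(acf) > bound) + 1
print("lags outside +/- 2/sqrt(n):", outside)  # should be few or none for white noise
```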
THE SAMPLE PARTIAL
AUTOCORRELATION FUNCTION
ˆ11 ˆ1
k1
ˆ k   ˆk  1, j ˆ k  j
j 1
ˆkk  k1
1   ˆk  1, j ˆ k  j
j 1

where ˆkj ˆk  1, j  ˆkkˆk  1,k  j , j 1,2,, k  1.

• For a WN process,
1
Var kk  
ˆ
n
• 2/n1/2 can be used as critical limits on kk to
test the hypothesis of a WN process. 19
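A minimal sketch of this recursion in Python/numpy (the function name `sample_pacf`, the array layout, and the pairing with the hypothetical `sample_acf` helper from the previous slide are my own assumptions):

```python
import numpy as np

def sample_pacf(rho, max_lag):
    """Sample PACF phi_hat_{kk} for k = 1..max_lag, computed from the sample
    autocorrelations via the recursion on the slide (Durbin-Levinson).
    `rho` must satisfy rho[0] = 1 and rho[k] = rho_hat_k."""
    phi = np.zeros((max_lag + 1, max_lag + 1))
    pacf = np.zeros(max_lag + 1)
    phi[1, 1] = pacf[1] = rho[1]
    for k in range(2, max_lag + 1):
        num = rho[k] - np.sum(phi[k - 1, 1:k] * rho[k - 1:0:-1])
        den = 1.0 - np.sum(phi[k - 1, 1:k] * rho[1:k])
        phi[k, k] = pacf[k] = num / den
        for j in range(1, k):
            phi[k, j] = phi[k - 1, j] - phi[k, k] * phi[k - 1, k - j]
    return pacf[1:]

# Example use together with the sample_acf sketch from the previous slide:
# rho = np.concatenate(([1.0], sample_acf(y, max_lag=20)))
# pacf = sample_pacf(rho, max_lag=20)
# For white noise, |pacf| should rarely exceed 2 / sqrt(len(y)).
```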
CHARACTERISTICS OF THE WHITE
NOISE PROCESS

BACKSHIFT (OR LAG) OPERATORS
• The backshift operator $B$ is defined as
$$B^j Y_t = Y_{t-j}, \quad j \ge 0, \quad \text{with } B^0 = 1.$$
$$B Y_t = Y_{t-1}, \qquad B^2 Y_t = Y_{t-2}, \qquad B^{12} Y_t = Y_{t-12}$$
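As a small illustration (Python; using `pandas.Series.shift` as a stand-in for the backshift operator is my own implementation choice, not something stated in the lecture), applying $B^j$ simply moves the series back $j$ steps:

```python
import pandas as pd

y = pd.Series([10, 12, 9, 14, 11], name="Y")

# B Y_t = Y_{t-1}: each value is replaced by the previous observation
b1 = y.shift(1)
# B^2 Y_t = Y_{t-2}
b2 = y.shift(2)

print(pd.DataFrame({"Y_t": y, "B Y_t": b1, "B^2 Y_t": b2}))
```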
MOVING AVERAGE REPRESENTATION
OF A TIME SERIES
• Also known as Random Shock Form or Wold
(1938) Representation.
• Let $\{Y_t\}$ be a time series. Wold stated that any stationary process $\{Y_t\}$ can be written as a linear combination of a sequence of uncorrelated (WN) random variables.
• A GENERAL LINEAR PROCESS:
$$Y_t = \mu + a_t + \psi_1 a_{t-1} + \psi_2 a_{t-2} + \cdots = \mu + \sum_{j=0}^{\infty} \psi_j a_{t-j}$$
where $\psi_0 = 1$, $a_t$ is a zero-mean WN process, and $\sum_{j=0}^{\infty} \psi_j^2 < \infty$.
MOVING AVERAGE REPRESENTATION
OF A TIME SERIES


$$Y_t = \mu + a_t + \psi_1 B a_t + \psi_2 B^2 a_t + \cdots = \mu + \sum_{j=0}^{\infty} \psi_j B^j a_t$$
$$= \mu + \left(1 + \psi_1 B + \psi_2 B^2 + \cdots\right) a_t$$
$$= \mu + \psi(B)\, a_t \quad \text{where } \psi(B) = 1 + \psi_1 B + \psi_2 B^2 + \cdots = \sum_{j=0}^{\infty} \psi_j B^j .$$
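A minimal sketch of building such a general linear process by truncating the infinite sum (Python/numpy; the geometric choice $\psi_j = 0.6^j$, the truncation point, and the normal shocks are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, mu, J = 500, 0.0, 50               # J = truncation point for the infinite sum
psi = 0.6 ** np.arange(J + 1)         # psi_0 = 1, psi_j = 0.6**j (square-summable)

a = rng.normal(size=n + J)            # zero-mean white noise; first J values are pre-sample shocks

# Y_t = mu + sum_{j=0}^{J} psi_j * a_{t-j}, where the shock at time t is a[t + J]
Y = mu + np.array([np.dot(psi, a[t : t + J + 1][::-1]) for t in range(n)])
```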
AUTOREGRESSIVE REPRESENTATION
OF A TIME SERIES
• This representation is also known as
INVERTED FORM.
• Regress the value of Yt at time t on its own
past plus a random error.

$$Y_t = \delta + \pi_1 Y_{t-1} + \pi_2 Y_{t-2} + \cdots + a_t$$
$$\left(1 - \pi_1 B - \pi_2 B^2 - \cdots\right) Y_t = \delta + a_t$$
$$\pi(B)\, Y_t = \delta + a_t$$
AUTOREGRESSIVE REPRESENTATION
OF A TIME SERIES
• It is an invertible process (it is important for
forecasting). Not every stationary process is
invertible (Box and Jenkins, 1978).
• Invertibility provides uniqueness of the
autocorrelation function.
• It means that different time series models can
be re-expressed by each other.

INVERTIBILITY RULE USING THE
RANDOM SHOCK FORM
• For a linear process
$$Y_t = \psi(B)\, a_t$$
to be invertible, the roots of $\psi(B) = 0$ as a function of $B$ must lie outside the unit circle.
• If $\lambda$ is a root of $\psi(B)$, then $|\lambda| > 1$.
STATIONARITY RULE USING THE
INVERTED FORM
• For a linear process
$$\pi(B)\, Y_t = \delta + a_t$$
to be stationary, the roots of $\pi(B) = 0$ as a function of $B$ must lie outside the unit circle.
• If $\lambda$ is a root of $\pi(B)$, then $|\lambda| > 1$.
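A quick way to check either condition numerically (Python/numpy; the example lag polynomial $1 - 1.1B + 0.3B^2$ is an illustrative choice, not from the lecture) is to find the roots of the polynomial and verify that they all lie outside the unit circle:

```python
import numpy as np

# Lag polynomial pi(B) = 1 - 1.1*B + 0.3*B^2, written highest power first for np.roots
coeffs = [0.3, -1.1, 1.0]
roots = np.roots(coeffs)

print("roots:", roots)                                         # here: 2.0 and about 1.667
print("all outside unit circle:", np.all(np.abs(roots) > 1))   # stationarity/invertibility check
```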
RANDOM SHOCK FORM AND
INVERTED FORM
• The AR and MA representations are not the model form, because they contain an infinite number of parameters that are impossible to estimate from a finite number of observations.
TIME SERIES MODELS
• In the inverted form of a process, if only a finite number of $\pi$ weights are non-zero, i.e.,
$$\pi_1 = \phi_1, \; \pi_2 = \phi_2, \; \ldots, \; \pi_p = \phi_p \quad \text{and} \quad \pi_k = 0, \; k > p,$$
the process is called an AR($p$) process.
TIME SERIES MODELS

• In the random shock form of a process, if only a finite number of $\psi$ weights are non-zero, i.e.,
$$\psi_1 = -\theta_1, \; \psi_2 = -\theta_2, \; \ldots, \; \psi_q = -\theta_q \quad \text{and} \quad \psi_k = 0, \; k > q,$$
the process is called an MA($q$) process.
TIME SERIES MODELS
• AR(p) Process:
$$Y_t - \mu = \phi_1 (Y_{t-1} - \mu) + \cdots + \phi_p (Y_{t-p} - \mu) + a_t$$
$$Y_t = c + \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + a_t \quad \text{where } \mu = \frac{c}{1 - \phi_1 - \cdots - \phi_p} .$$
• MA($q$) Process:
$$Y_t = \mu + a_t - \theta_1 a_{t-1} - \cdots - \theta_q a_{t-q} .$$
TIME SERIES MODELS
• The number of parameters in a model can be large. A natural alternative is the mixed AR and MA process, the ARMA($p,q$) process:
$$Y_t = c + \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + a_t - \theta_1 a_{t-1} - \cdots - \theta_q a_{t-q}$$
$$\left(1 - \phi_1 B - \cdots - \phi_p B^p\right) Y_t = c + \left(1 - \theta_1 B - \cdots - \theta_q B^q\right) a_t$$
• For a fixed number of observations, the more parameters in a model, the less efficient the estimation of the parameters. Choose a simpler model to describe the phenomenon.
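To make the model form concrete, here is a minimal simulation sketch written directly from the equation above (Python/numpy; the parameter values $c = 0$, $\phi_1 = 0.7$, $\theta_1 = 0.4$ are illustrative assumptions, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
c, phi1, theta1 = 0.0, 0.7, 0.4       # illustrative ARMA(1,1) parameters

a = rng.normal(size=n)                # white-noise shocks a_t
Y = np.zeros(n)
for t in range(1, n):
    # Y_t = c + phi_1 * Y_{t-1} + a_t - theta_1 * a_{t-1}
    Y[t] = c + phi1 * Y[t - 1] + a[t] - theta1 * a[t - 1]
```

Setting `theta1 = 0` gives an AR(1) simulation and `phi1 = 0` an MA(1) simulation, so the same loop covers the special cases above.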
Real-life applications of AR Models

Y t Y t  1   (1  )  t  Y t    (Y t  1   )  t
33
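For instance (a small worked example with assumed values $\mu = 10$ and $\phi = 0.5$, not from the lecture), if $Y_{t-1} = 12$ then
$$E[Y_t \mid Y_{t-1}] = 10 + 0.5\,(12 - 10) = 11,$$
so the process is expected to move halfway back toward its mean each period.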
Real-life applications of MA Models
• In the agricultural commodities market, the current price of a commodity is determined by the interaction of the forces of supply and demand, but practical experience indicates that there is a degree of uncertainty regarding the future replenishment of the commodity by farmers, which prompts market participants to store the commodity. When new news reaches the market, such as the commodity's crop being damaged by storms or bad weather, this news will inevitably affect the price of the commodity.
• If the effect of the emergency news (crop damage) persists for several subsequent days, then the change in the commodity price can be represented by a higher-order moving average model.
Real-life applications of ARMA Models

Below are some reasons for interest in studying ARMA models:
1- The time series may be described by one of the moving average models or one of the autoregressive models, but only by using a large number of parameters, which reduces the validity and efficiency of interpreting these parameters. ARMA models in this case help describe the time series with a suitable model using fewer parameters.
• If $X_t$ is ARMA(1,3) and $Y_t$ is ARMA(1,5), then $X_t + Y_t$ is ARMA(2,6). In general, the sum of two independent ARMA($p_1,q_1$) and ARMA($p_2,q_2$) processes is ARMA($p_1+p_2,\ \max(p_1+q_2,\ p_2+q_1)$) (possibly with lower orders if terms cancel); here $p = 1 + 1 = 2$ and $q = \max(1+5,\ 1+3) = 6$.
