AMFE Module 3: AR and ARMA Models
Financial Econometrics
Auto Regressive Process
Auto Regressive Moving Average Process
Course Instructor:
Dr. Devasmita Jena
Pth-order Autoregressive Process, AR(p)
• Y_t = c + φ1 Y_{t-1} + φ2 Y_{t-2} + … + φp Y_{t-p} + ε_t --- (1)
• An AR model is one where the current value of Y depends on its own previous-period
values and an error term ε_t (white noise with the usual properties)
• Eq. (1) => Y_t − φ1 Y_{t-1} − … − φp Y_{t-p} = c + ε_t --- (2), or, using the L-operator:
• φ(L) Y_t = c + ε_t --- (3)
• φ(L) = 1 − φ1 L − φ2 L² − … − φp L^p --- (4); where L^k Y_t = Y_{t-k}
• Setting c = 0, we get φ(L) Y_t = ε_t, or Y_t = φ(L)^{-1} ε_t --- (5)
• The process is stationary if φ(L)^{-1} converges to zero => the autocorrelations will decline
as the lag length increases
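As a quick numerical sketch of eq. (1) for p = 1 (the parameter values c = 1, φ = 0.5 are illustrative assumptions, not from the lecture), a simulated stationary AR(1) should have sample mean close to its unconditional mean c/(1 − φ):

```python
import numpy as np

# Illustrative AR(1) simulation: Y_t = c + phi*Y_{t-1} + eps_t (eq. (1) with p = 1).
# Parameter values below are assumptions chosen for illustration only.
rng = np.random.default_rng(0)
c, phi, n = 1.0, 0.5, 200_000

eps = rng.standard_normal(n)   # white noise with the usual properties
y = np.empty(n)
y[0] = c / (1 - phi)           # start at the unconditional mean
for t in range(1, n):
    y[t] = c + phi * y[t - 1] + eps[t]

# For a stationary AR(1), E(Y_t) = c / (1 - phi) = 2.0
print(round(y.mean(), 2))
```

With |φ| < 1 the simulated path fluctuates around 2.0 rather than drifting off, which is the practical meaning of stationarity here.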
MA(∞) Representation
• If the AR(p) process is stationary, the coefficients of its MA(∞) representation will decline
gradually as the lag length increases. If it is not stationary, the coefficients will not
converge to zero.
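The declining MA(∞) weights can be computed directly: for an AR(p), the weights ψ_j obey the same recursion as the process itself. A minimal sketch for an AR(2) (φ1 = 0.6, φ2 = 0.2 are illustrative assumptions that give a stationary process):

```python
# MA(inf) weights psi_j of an AR(2), from the recursion
# psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}, with psi_0 = 1 and psi_1 = phi1.
# Parameter values are illustrative assumptions (a stationary AR(2)).
phi1, phi2 = 0.6, 0.2

psi = [1.0, phi1]
for j in range(2, 30):
    psi.append(phi1 * psi[-1] + phi2 * psi[-2])

# For a stationary AR(p) the weights die out as the lag grows
print([round(p, 4) for p in psi[:5]], round(psi[-1], 4))
```

Rerunning with, say, φ1 = 1.05 makes the weights grow without bound, illustrating the non-stationary case.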
Pth-order Autoregressive Process, AR(p)
• The general condition for testing the stationarity of an AR(p) process:
• All the roots of the characteristic equation 1 − φ1 z − φ2 z² − … − φp z^p = 0 lie outside the unit circle
• Equivalently, the roots of the characteristic equation are all greater than 1 in absolute value
• The characteristic equation determines the characteristics of the process Y_t. For
example, the ACF of an AR process depends on the roots of this equation.
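This root condition is easy to check numerically. A minimal sketch for an AR(2) (the coefficients are illustrative assumptions):

```python
import numpy as np

# Stationarity check for an AR(2): all roots of 1 - phi1*z - phi2*z^2 = 0
# must lie outside the unit circle. Parameter values are illustrative assumptions.
phi1, phi2 = 0.6, 0.2

# np.roots expects coefficients from the highest power down: -phi2*z^2 - phi1*z + 1
roots = np.roots([-phi2, -phi1, 1.0])
is_stationary = np.all(np.abs(roots) > 1.0)
print(np.abs(roots), is_stationary)
```

Here both roots exceed 1 in absolute value, so this AR(2) is stationary.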
• Examples:
Wold's Decomposition Theorem
• Any covariance-stationary series can be decomposed into the sum of two unrelated
processes – a purely deterministic part and a purely stochastic part, which will be an
MA(∞) process.
• The importance of this theorem is that the autocovariances and the ACF of an AR(p)
process can be obtained, using the properties of the error terms, by solving a set of
simultaneous equations called the Yule–Walker equations.
• Multiplying both sides of the AR(p) equation by Y_{t-k} and taking expectations gives
γ_k = φ1 γ_{k-1} + φ2 γ_{k-2} + … + φp γ_{k-p} for k ≥ 1
• And so on for higher lags.
• These are the Yule–Walker equations. For known values of φ1, …, φp, the 1st-lag autocovariance can be
obtained. Higher-order autocovariances can be obtained by using the recursive relations. The AR(p) ACF tails off as k
gets larger. It does so as a mixture of exponential decays and/or dampened sine waves, depending on whether the
roots are real or complex.
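The Yule–Walker recursion can be run directly on the autocorrelations ρ_k = γ_k/γ_0. A minimal sketch for a stationary AR(2) (coefficients are illustrative assumptions):

```python
# ACF of an AR(2) from the Yule-Walker equations:
# rho_1 = phi1/(1 - phi2), then rho_k = phi1*rho_{k-1} + phi2*rho_{k-2} for k >= 2.
# Parameter values are illustrative assumptions (a stationary AR(2)).
phi1, phi2 = 0.6, 0.2

rho = [1.0, phi1 / (1 - phi2)]
for k in range(2, 25):
    rho.append(phi1 * rho[-1] + phi2 * rho[-2])

# The ACF tails off gradually rather than cutting to zero
print([round(r, 3) for r in rho[:5]])
```

With real roots, as here, the decline is a smooth exponential decay; complex roots would give a dampened sine wave instead.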
Partial Autocorrelation Function
• The PACF measures the correlation between an observation k periods ago and the current observation,
after controlling for observations at intermediate lags, i.e., the correlation between Y_t and Y_{t-k}
after removing the effects of Y_{t-k+1}, Y_{t-k+2}, …, Y_{t-1}.
• In the case of AR(p), there will be direct connections between Y_t and Y_{t-s} for s ≤ p, but no direct
connection for s > p.
• Thus, the PACF will have non-zero partial autocorrelation coefficients for lags up to the order of the model, but
zero partial autocorrelation coefficients thereafter.
• What shape would the PACF take for an MA process?
• Hint: Think of the MA model as being transformed into an AR model in order to consider whether Y_t and Y_{t-k}, k = 1, 2, 3, …,
are directly connected. In fact, as long as the MA(q) process is invertible, it can be expressed as an AR process of
infinite order.
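The PACF cutoff for an AR process can be verified from the theoretical ACF via the Durbin–Levinson recursion (a standard algorithm, used here as an illustration; the AR(2) parameters are the same illustrative assumptions as before):

```python
def pacf_from_acf(rho, nlags):
    """Theoretical PACF via the Durbin-Levinson recursion, given an ACF rho[0..]."""
    pacf = [1.0]
    phi_prev = []
    for k in range(1, nlags + 1):
        if k == 1:
            phi_kk = rho[1]
            phi_curr = [phi_kk]
        else:
            num = rho[k] - sum(phi_prev[j] * rho[k - 1 - j] for j in range(k - 1))
            den = 1.0 - sum(phi_prev[j] * rho[j + 1] for j in range(k - 1))
            phi_kk = num / den
            phi_curr = [phi_prev[j] - phi_kk * phi_prev[k - 2 - j]
                        for j in range(k - 1)] + [phi_kk]
        pacf.append(phi_kk)
        phi_prev = phi_curr
    return pacf

# Theoretical ACF of a stationary AR(2) (illustrative parameters) via Yule-Walker
phi1, phi2 = 0.6, 0.2
rho = [1.0, phi1 / (1 - phi2)]
for k in range(2, 10):
    rho.append(phi1 * rho[-1] + phi2 * rho[-2])

pacf = pacf_from_acf(rho, 6)
# The PACF of an AR(2) cuts off: non-zero at lags 1 and 2, zero thereafter
print([round(v, 6) for v in pacf])
```

Note that pacf[2] recovers φ2 exactly and every coefficient beyond lag p = 2 is zero, which is the identification pattern the slide describes.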
Autoregressive Moving Average Process (ARMA)
• An ARMA(p,q) process includes both AR and MA terms
• Y_t = c + φ1 Y_{t-1} + … + φp Y_{t-p} + ε_t + θ1 ε_{t-1} + … + θq ε_{t-q} --- (1)
• Using the L-operator:
• (1 − φ1 L − φ2 L² − … − φp L^p) Y_t = c + (1 + θ1 L + θ2 L² + … + θq L^q) ε_t --- (2)
• Dividing both sides by (1 − φ1 L − φ2 L² − … − φp L^p) --- (3):
• Y_t = μ + ψ(L) ε_t --- (4); ψ(L) = (1 + θ1 L + … + θq L^q)/(1 − φ1 L − … − φp L^p) and
μ = c/(1 − φ1 − φ2 − … − φp)
• Eq. (4) is stationary if ψ(L) converges
• E(Y_t) = μ; V(Y_t) = γ0
• Stationarity of an ARMA process depends on the roots of the
characteristic equation attached to the AR terms; the roots should
lie outside the unit circle.
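The ψ-weights in eq. (4) can be computed explicitly for an ARMA(1,1): polynomial division of (1 + θL) by (1 − φL) gives ψ_0 = 1, ψ_1 = φ + θ, and ψ_j = φψ_{j-1} thereafter. A minimal sketch (φ and θ are illustrative assumptions):

```python
# psi-weights of an ARMA(1,1): Y_t = mu + psi(L)*eps_t, as in eq. (4).
# Dividing (1 + theta*L) by (1 - phi*L) gives psi_0 = 1, psi_1 = phi + theta,
# and psi_j = phi*psi_{j-1} for j >= 2. Parameters are illustrative assumptions.
phi, theta = 0.7, 0.4

psi = [1.0, phi + theta]
for j in range(2, 20):
    psi.append(phi * psi[-1])

# Convergence depends only on phi (|phi| < 1): the weights decay geometrically
print([round(p, 4) for p in psi[:5]])
```

Notice θ only affects the level of the weights, not whether they converge, previewing the next slide's point that stationarity rests on the AR parameters alone.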
ARMA(p,q) Process
• The stationarity of an ARMA process depends entirely on the AR
parameters (φ1, φ2, …, φp), and not on the MA parameters (θ1, …, θq).
• Autocovariances and ACF of ARMA(p,q):
• are solved using the Yule–Walker equations
• The autocovariances and ACF of an ARMA(p,q) model follow the same pth-order
difference equation as the process itself
A note on the ACF and PACF of ARMA Process
• The ACF and PACF can distinguish between a pure AR and a pure MA
process:
• An AR process has a geometrically declining ACF, but a PACF
which cuts off to zero after the pth lag
• An MA process has a geometrically declining PACF, but an ACF
which cuts off to zero after the qth lag
• An ARMA process will have a geometrically declining ACF as well
as a geometrically declining PACF
Invertibility
• Invertibility of the MA(1) process: Y_t = ε_t + θ ε_{t-1} --- (1); ε_t is white noise with the usual properties and |θ| < 1
• The invertibility condition prevents the model, under its AR(∞) representation, from exploding. That is, the
coefficients of the AR(∞) representation decline as the lag length increases.
• The only difference: the invertibility condition is applied to the MA representation and the stationarity
condition to the AR representation. Both representations have the same 1st- and 2nd-order
moments
• In the borderline case with θ = ±1, there is only one representation, and it
is non-invertible
• Not only do the invertible and non-invertible representations have the same moments, both
are equally valid descriptions of an MA(1) process. Either
representation could characterize any given data equally well.
• We prefer the invertible representation over the non-invertible one:
• to find the value of ε_t for date t with the invertible representation, we need only current and past values of Y
• to find the value of ε_t for date t with the non-invertible representation, we need to use future values of Y
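The "same 2nd-order moments" claim is easy to verify: an MA(1) with parameter θ and one with 1/θ share the same first-order autocorrelation ρ1 = θ/(1 + θ²). A minimal sketch (θ = 0.5 and its non-invertible counterpart 1/0.5 = 2 are illustrative choices):

```python
# Both theta and 1/theta give an MA(1) with the same autocorrelation:
# rho_1 = theta / (1 + theta^2). Values are illustrative assumptions.
def ma1_rho1(theta):
    # first-order autocorrelation of Y_t = eps_t + theta*eps_{t-1}
    return theta / (1.0 + theta ** 2)

invertible = ma1_rho1(0.5)      # |theta| < 1: invertible representation
non_invertible = ma1_rho1(2.0)  # |theta| > 1: non-invertible counterpart
print(invertible, non_invertible)
```

Both evaluate to 0.4, so the ACF alone cannot tell the two representations apart; the invertible one is chosen for the reason in the last bullet.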
Invertibility of MA(q) process
• MA(q) process: Y_t = θ(L) ε_t, where θ(L) = 1 + θ1 L + … + θq L^q --- (1); ε_t is white noise with the usual properties
• Dividing both sides of (1) by θ(L): ε_t = θ(L)^{-1} Y_t
• The MA(q) process is invertible if all the roots of θ(z) = 0 lie outside the unit circle
• When the model is not invertible, the error terms can still be represented, but only in terms of future-period
values of Y
• For ARMA(p,q) to be stationary, the roots of the AR characteristic polynomial 1 − φ1 z − … − φp z^p = 0 should all exceed 1 in absolute value
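Invertibility of an MA(q) can be checked the same way stationarity was checked for the AR side: compute the roots of θ(z) numerically. A minimal sketch for an MA(2) (the θ values are illustrative assumptions):

```python
import numpy as np

# Invertibility check for an MA(2): all roots of theta(z) = 1 + theta1*z + theta2*z^2
# must lie outside the unit circle. Parameter values are illustrative assumptions.
theta1, theta2 = 0.5, 0.06

roots = np.roots([theta2, theta1, 1.0])   # coefficients, highest power first
is_invertible = np.all(np.abs(roots) > 1.0)
print(np.abs(roots), is_invertible)
```

For a full ARMA(p,q) specification, one would run this check on the AR polynomial for stationarity and on the MA polynomial for invertibility.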