Time Series Data Analysis
Main Reading: Gujarati, Chapter 21.
Hamid Ullah

Key Concepts
- Stationarity and non-stationarity.
- Autoregressive and moving average processes.
- Unit roots.
- The Dickey-Fuller test.
- Cointegration and spurious regressions.
- Testing for cointegration.

Univariate Time Series Models
Aim to describe the behaviour of a variable in terms of its past values
Why use these models?
- Simple
- Lack of theory
- Useful for forecasting

Autoregressive processes:
An AR(1) process is written as
y_t = φ y_{t-1} + ε_t
where ε_t ~ IID(0, σ²)
i.e. the current value of y_t is equal to φ times its previous value plus an unpredictable component ε_t
This can be extended to an AR(p) process
y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
where ε_t ~ IID(0, σ²)
i.e. the current value of y_t depends on p past values plus an unpredictable component ε_t
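As a quick illustration (not part of the original notes), the short NumPy sketch below simulates an AR(1) process; the helper name simulate_ar1, the sample size, and the coefficient value are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(phi, n=500, sigma=1.0):
    """Simulate y_t = phi * y_{t-1} + eps_t with eps_t ~ N(0, sigma^2), y_0 = 0."""
    eps = rng.normal(0.0, sigma, size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + eps[t]
    return y

y = simulate_ar1(phi=0.5)   # |phi| < 1 gives a stationary-looking series
print(y.mean(), y.var())
```

An AR(p) simulation would simply keep p lagged terms in the same recursion.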
Moving Average processes:
A MA(1) process is written as
y_t = ε_t + θ ε_{t-1}
where ε_t ~ IID(0, σ²)
i.e. the current value is given by an unpredictable component ε_t plus θ times the previous period's error
This can also be extended to an MA(q) process
y_t = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}
where ε_t ~ IID(0, σ²)
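In the same spirit, here is a minimal sketch (again assuming NumPy; names and values are illustrative) of simulating an MA(1) process.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ma1(theta, n=500, sigma=1.0):
    """Simulate y_t = eps_t + theta * eps_{t-1} with eps_t ~ N(0, sigma^2)."""
    eps = rng.normal(0.0, sigma, size=n)
    y = eps.copy()
    y[1:] += theta * eps[:-1]
    return y

y = simulate_ma1(theta=0.8)
print(y.var())   # should be close to sigma^2 * (1 + theta^2)
```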
STOCHASTIC PROCESSES
A random or stochastic process is a collection of random variables ordered in time and is often denoted by Y_t, where t = 1, ..., T (the subscript t represents time).
Time series data has a temporal ordering, unlike cross-section data.
What is Stationarity?
A stochastic process is said to be stationary if its mean and variance are constant over time and the covariance between two time periods depends only on the distance or gap between the two periods and not on the actual time at which the covariance is computed.

Stationarity may be strong (strict) or weak (covariance). Weak (covariance) stationarity requires that:
- E(y_t) is independent of time;
- Var(y_t) is a finite, positive constant, independent of time;
- Cov(y_t, y_k) is a finite function of t-k, but not of t or k.
Strong (strict) stationarity requires that the whole distribution of the variable does not depend on time.
Thus, this weaker form of stationarity requires only that the mean and variance are constant across time, and that the covariance between two time periods depends only on the distance, gap, or lag between them and not on the actual time at which the covariance is computed. If a time series is stationary, its mean, variance, and autocovariance (at various lags) remain the same no matter at what point in time we measure them; that is, they are time invariant.
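As a worked illustration (not part of the original notes), the AR(1) process defined earlier, y_t = φ y_{t-1} + ε_t with |φ| < 1, satisfies these weak-stationarity conditions with:

```latex
\begin{align*}
E(y_t) &= 0, \\
\operatorname{Var}(y_t) &= \frac{\sigma^2}{1-\phi^2}, \\
\operatorname{Cov}(y_t, y_{t-k}) &= \phi^{k}\,\frac{\sigma^2}{1-\phi^2},
\end{align*}
```

so every moment depends at most on the lag k, never on t itself.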
Non-Stationarity
A non-stationary time series will have a time-varying mean or a time-varying variance, or both.
Although our interest is in stationary time series, one often encounters non-stationary time series, the classic example being the random walk model (RWM).
It is often argued that asset prices, such as stock prices or exchange rates, follow a random walk; that is, they are non-stationary.

Non-stationary Time Series
Non-stationary series can be due to a deterministic trend
Y_t = α + βt + e_t
or a stochastic trend (the series has a unit root)
Y_t = Y_{t-1} + e_t
or both!

Random Walks
A random walk is an AR(1) model where φ = 1, meaning the series is not weakly dependent. With a random walk, the expected value of y_t is a constant (it doesn't depend on t), but Var(y_t) = σ_e² t, so the variance increases with t. We say a random walk is highly persistent since E(y_{t+h} | y_t) = y_t for all h ≥ 1.
Types of Random Walks
There are two types of RWM:
(1) Random walk without drift (i.e., no constant or intercept term)
(2) Random walk with drift (i.e., a constant term is present).

Random Walk without Drift
The value of Y at time t is equal to its value at time (t-1) plus a random shock; thus it is an AR(1) model:
Y_t = Y_{t-1} + u_t
Proponents of the efficient capital market hypothesis argue that stock prices are essentially random and therefore there is no scope for profitable speculation in the stock market: if one could predict tomorrow's price on the basis of today's price, we would all be millionaires. Substituting recursively,
Y_1 = Y_0 + u_1
Y_2 = Y_1 + u_2 = Y_0 + u_1 + u_2
Y_3 = Y_2 + u_3 = Y_0 + u_1 + u_2 + u_3
In general, if the process started at some time 0 with a value of Y_0, we have
Y_t = Y_0 + Σ u_i
i.e. the starting value plus the accumulated shocks. The mean of Y is equal to its initial, or starting, value, which is constant, but as t increases its variance increases indefinitely, thus violating a condition of stationarity. In short, the RWM without drift is a non-stationary stochastic process. Its first difference, however,
(Y_t - Y_{t-1}) = ΔY_t = u_t
is stationary. In other words, the first differences of a random walk time series are stationary, so the random walk is a difference-stationary process (DSP).

Random Walk with Drift
Y_t = δ + Y_{t-1} + u_t
where δ is known as the drift parameter. The name drift comes from the fact that if we write the preceding equation as
Y_t - Y_{t-1} = ΔY_t = δ + u_t
it shows that Y_t drifts upward or downward, depending on δ being positive or negative. Note that the above model is also an AR(1) model. For the RWM with drift, E(Y_t) = Y_0 + tδ and Var(Y_t) = tσ², so the mean as well as the variance increases over time, again violating the conditions of (weak) stationarity.
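A brief simulation sketch (not from the original notes; NumPy assumed, parameter values arbitrary) mirrors this discussion: the levels of a random walk wander and their dispersion grows with t, while the first differences are just the stationary shocks.

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma, delta = 1000, 1.0, 0.1

u = rng.normal(0.0, sigma, size=n)
rw = np.cumsum(u)                # random walk without drift: Y_t = Y_{t-1} + u_t
rw_drift = np.cumsum(delta + u)  # random walk with drift:    Y_t = delta + Y_{t-1} + u_t

# Dispersion of the levels keeps growing as more shocks accumulate ...
print(rw[: n // 2].var(), rw.var())
# ... but the first differences recover the stationary shocks (variance near sigma^2).
print(np.diff(rw).var(), np.diff(rw_drift).var())
```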
The RWM with drift is likewise a DSP, because the non-stationarity in Y_t can be eliminated by taking first differences of the time series.

Trending Series
The general tendency of time series data to increase or decrease over a long period of time is called the secular trend, long-term trend, or simply the trend. The concept of trend does not include short-range oscillations. A trend may have either an upward or a downward movement: series such as production, prices, and income show an upward trend, while a downward trend is noticed in time series relating to deaths, epidemics, etc. A trending series cannot be stationary, since its mean is changing over time. If a series is weakly dependent and is stationary about its trend, we will call it a trend-stationary process.
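For a trend-stationary series, removing the deterministic trend is enough to obtain a stationary series. A minimal sketch (not from the original notes; NumPy assumed) of detrending by regressing on time:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(200)

# Trend-stationary example: linear trend plus stationary noise.
y = 5.0 + 0.2 * t + rng.normal(size=t.size)

# Fit y_t = b0 + b1*t by least squares and keep the residuals as the detrended series.
b1, b0 = np.polyfit(t, y, deg=1)
detrended = y - (b0 + b1 * t)
print(detrended.mean(), detrended.var())   # roughly 0 and the noise variance
```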
Types of Trends
Linear or straight-line trend: if we get a straight line when the values are plotted on a graph, it is called a linear trend:
y_t = β_0 + β_1 t + e_t,  t = 1, 2, ...
Non-linear trend: if we get a curve after plotting the time series values, it is called a non-linear or curvilinear trend.
a. One possibility is an exponential trend, which can be modeled as
log(y_t) = β_0 + β_1 t + e_t,  t = 1, 2, ...
b. Another possibility is a quadratic trend, which can be modeled as
y_t = β_0 + β_1 t + β_2 t² + e_t,  t = 1, 2, ...

Unit Root
We can write the AR(1) model as y_t = φ y_{t-1} + ε_t. More precisely, if φ is 1 [a random walk], we face what is known as a unit root problem. In fact, this is exactly the random walk (without drift) described above. Note that the variance is not stationary.

Non-stationary to Stationary
- Add a trend (i.e. make it trend-stationary);
- Difference the series (i.e. remove a stochastic trend, making it difference-stationary);
- Or add a trend as well as difference the series.
Stationary series are I(0), i.e. integrated of order 0. For an I(1) series, difference once to make it stationary; for an I(2) series, difference twice to make it stationary, and so on.

Graphical Test for Stationarity
One simple test of stationarity is based on the so-called autocorrelation function (ACF). The ACF at lag k, denoted ρ_k, is defined as the autocovariance at lag k divided by the variance, ρ_k = γ_k / γ_0, and can be estimated by the sample ACF. In other words, the autocorrelation function ρ_k is the correlation between a variable (say, Y) and Y lagged k periods.
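A small sketch (not from the original notes; NumPy assumed) of computing the sample ACF directly; in practice a routine such as statsmodels' plot_acf produces the usual correlogram. For a stationary series the sample ACF dies out quickly, while for a random walk it stays near one and decays very slowly.

```python
import numpy as np

def sample_acf(y, max_lag=20):
    """Sample ACF: rho_hat_k = gamma_hat_k / gamma_hat_0 for k = 1, ..., max_lag."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    n = len(y)
    gamma0 = np.dot(y, y) / n
    return np.array([np.dot(y[k:], y[: n - k]) / n / gamma0 for k in range(1, max_lag + 1)])

rng = np.random.default_rng(4)
u = rng.normal(size=1000)
print(sample_acf(np.cumsum(u))[:5])   # random walk: autocorrelations near 1, fading slowly
print(sample_acf(u)[:5])              # white noise: autocorrelations near 0
```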
Testing for Unit Roots
Consider an AR(1): y_t = α + φ y_{t-1} + e_t
Let H_0: φ = 1 (i.e. assume there is a unit root). Define θ = φ - 1 and subtract y_{t-1} from both sides to obtain
Δy_t = α + θ y_{t-1} + e_t
Unfortunately, a simple t-test is inappropriate, since under the null this is an I(1) process. A Dickey-Fuller test uses the t-statistic on θ, but with different critical values.

Testing for Unit Roots (cont'd)
We can add p lags of Δy_t to allow for more dynamics in the process. We still want to calculate the t-statistic for θ. It is now called an augmented Dickey-Fuller test, but the critical values are the same. The lags are intended to clear up any serial correlation; if too few are included, the test won't be right.

Testing for Unit Roots with Trends
If a series is clearly trending, then we need to adjust for that, or we might mistake a trend-stationary series for one with a unit root. We can simply add a trend to the model. We still look at the t-statistic for θ, but the critical values for the Dickey-Fuller test change.

The Augmented Dickey-Fuller (ADF) Test
The ADF test consists of estimating the following equation:
ΔY_t = β_1 + β_2 t + δ Y_{t-1} + Σ_{i=1}^{m} α_i ΔY_{t-i} + e_t
If β_2 = 0, there is no deterministic trend. If δ = 0, the series has a stochastic trend. For example, δ = (φ - 1) for the AR(1) model (where, if you recall, a unit root means φ = 1).

AR(p) Model
The deterministic trend with a stationary AR(1) component can be written as
Y_t = β_1 + β_2 t + β_3 Y_{t-1} + e_t
and its generalization with p lags can be written as
Y_t = β_1 + β_2 t + β_3 Y_{t-1} + ... + β_{p+2} Y_{t-p} + e_t
which can be written as
ΔY_t = β_1 + β_2 t + δ Y_{t-1} + α_1 ΔY_{t-1} + ... + α_m ΔY_{t-m} + e_t
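In practice the ADF test is usually run with a library routine rather than by hand. Below is a sketch (not from the original notes) using statsmodels' adfuller on simulated data; regression="c" includes only a constant, while "ct" would add the deterministic trend discussed above.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
u = rng.normal(size=500)
random_walk = np.cumsum(u)   # has a unit root (I(1))
white_noise = u              # stationary (I(0))

for name, series in [("random walk", random_walk), ("white noise", white_noise)]:
    stat, pvalue, usedlag, nobs, crit, icbest = adfuller(series, regression="c", autolag="AIC")
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")

# A large p-value (failure to reject H0) is consistent with a unit root;
# a small p-value suggests the series is stationary.
```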