
Time Series Analysis

Time Series
A time series is a set of observations, each one being recorded at a specified time t. In other words, an arrangement of statistical data in accordance with its time of occurrence is known as a time series.

According to Ya-lun Chou, “A Time Series may be defined as a collection of readings belonging
to different time periods, of some economic variable or composite of variables”.

Mathematically, a time series is defined by the functional relationship

Y_t = f(t),

where Y_t is the value of the phenomenon (variable) under consideration at time t.

Thus, if the values of a phenomenon or variable at times t_1, t_2, ..., t_n are Y_{t_1}, Y_{t_2}, ..., Y_{t_n} respectively, then the series

{(t_1, Y_{t_1}), (t_2, Y_{t_2}), ..., (t_n, Y_{t_n})}

constitutes a time series.

Thus a time series invariably gives a bivariate distribution, one of the two variables being time
and the other being the value of the phenomenon at different points of time.

Example:
1. Many time series arise in economics, such as share prices on successive days, average
incomes in successive months, company profits in successive years etc.
2. Many types of time series occur in the physical sciences, particularly in meteorology, marine science and geophysics. Examples are rainfall on successive days and air temperature measured in successive hours or months.
3. The analysis of sales figures in successive weeks or months is an important problem in commerce. Marketing data have many similarities to economic data. It is important to forecast future sales so as to plan production. It is also of interest to examine the relationship between sales and other time series, such as advertising expenditure, in successive time periods.
4. The population of Bangladesh measured every ten years is an example of a demographic time series.
5. A special type of time series arises when observations can take one of only two values, usually denoted by 0 and 1. Time series of this type, called binary processes, occur particularly in communication theory. For example, the position of a switch, either 'on' or 'off', could be recorded as 1 or 0 respectively.
6. A different type of time series occurs when we consider a series of events occurring 'randomly' in time, for example the record of the dates of major railway disasters. A series of events of this type is known as a 'point process'.

What are the Different Types of Time Series?


Answer:
There are two types of time series:
i) Discrete time series,
ii) Continuous time series.

Discrete Time Series:


A time series is said to be discrete when observations are taken only at specific times.

Continuous Time Series:


A time series is said to be continuous when observations are recorded continuously over some
time interval.

What are the Assumptions of a Time Series?


Answer:
The main assumptions of a time series are given below:
i) The time gaps between the various values should be, as far as possible, equal.
ii) It must consist of a homogeneous set of values.
iii) Data must be available for a long period.

What are the Uses of Time Series?



Answer:
The time series analysis is of great importance not only to the businessman or the economist but also to people working in various disciplines in the natural, social and physical sciences. Some of its uses are enumerated below:
i) It enables us to determine the type and nature of the variations in the data.
ii) It helps in planning and forecasting.
iii) It is essential in business and economics; with the help of time series we can make plans for the future.
iv) It helps to compare actual current performance or accomplishments with the expected ones and to analyze the causes of such variation, if any.
v) It enables us to predict, estimate or forecast the future behavior of the phenomenon, which is very essential for business planning.
vi) It helps us to compare the changes in the values of different phenomena at different times or places.

Describe in Brief the Important Components of Time Series.


Answer:
The important components of time series are:
i) Trend or secular trend or long term movement.
ii) Periodic changes or short term movement
a. Seasonal variations
b. Cyclic variations
iii) Random or irregular variation.

i) Trend:
The general tendency of the time series data to increase or decrease during a long period of time
is called the secular trend or long term trend or simply trend. The concept of trend does not
include short range oscillations. This is true in most of business and economic statistics. Trend
may have either upward or downward movement, such as production, prices, income are upward
trend while a downward trend is noticed in the time series relating to deaths, epidemics etc.
trend is the general, smooth, long term, average tendency.

There are different types of trend:

a) Linear or Straight Line Trend:


If we get a straight line when the values are plotted on a graph, then it is called a linear trend.

See Figure 1, or Figures 1-1 (p. 2), 1-5 (p. 5), 1-9 (p. 11) and 1-16 (p. 21) in Brockwell.

b) Non-Linear Trend:
If we get a curve after plotting the time series values, then it is called a non-linear or curvilinear trend.

See Figure 2, or Figure 1-5 (p. 5) in Brockwell.

ii) Periodic Changes:


It would be observed that in many social and economic phenomena there are forces at work
which prevent the smooth flow of the series in a particular direction and tend to repeat
themselves over a period of time. These forces do not act continuously but operate in a regular
spasmodic manner. These changes may be classified as:
a) Seasonal Variations and
b) Cyclic Variations
a) Seasonal Variations:
A variation which occurs with some degree of regularity within a specified period of one year or less is called seasonal variation. There are also hourly, daily, weekly, monthly and half-yearly variations.

See Figure 3, or Figure 1-3 (p. 3) and Figure 1-11 (p. 11) in Brockwell.


Most economic time series are influenced by seasonal swings; e.g., prices, production and consumption of commodities, sales and profits in a department store, and bank clearings and bank deposits are all affected by seasonal variations.

The seasonal variations may be attributed to the following two causes:


1) Those resulting from natural forces, i.e., the weather and the seasons.
2) Those resulting from man-made conventions, e.g., Eid, Durga Puja, Christmas, etc.

b) Cyclic Variation:
The oscillatory movements in a time series with a period of oscillation of more than one year are termed cyclic variations. One complete period is called a cycle. The cyclic movements in a time series are generally attributed to the so-called "business cycle", which may also be referred to as the four-phase cycle of
1) Prosperity
2) Recession
3) Depression
4) Recovery
and normally lasts from seven to eleven years. Most economic and commercial series, e.g., series relating to prices, production and wages, are affected by business cycles.

See Figure 4, or Figure 1-4 (p. 4) and Figure 1-18 (p. 25) in Brockwell.

iii) Random or Irregular Variation:


Apart from the regular variations, almost all series contain another factor, called random, irregular or residual variation, which is not accounted for by the secular trend or by the seasonal and cyclic variations. These variations arise from uncertain and unusual causes. They are purely random and beyond human control, yet they are a part of the system. They are caused by earthquakes, wars, floods, famines, revolutions, epidemics, etc.

What are the Different Types of Models Used in Time Series?

Answer:
The following models are commonly used for the decomposition of a time series into its components:
i) Decomposition by additive hypothesis
ii) Decomposition by multiplicative hypothesis
iii) Mixed models.

i) Decomposition by Additive Hypothesis:


According to the additive model, a time series can be expressed as

Y_t = T_t + S_t + C_t + R_t,

where Y_t is the time series value at time t, T_t is the trend value at time t, S_t is the seasonal variation at time t, C_t is the cyclic variation at time t, and R_t is the random variation at time t.

Assumptions:
i) S_t, C_t and R_t operate with equal absolute effect irrespective of the trend.
ii) S_t will have positive or negative values, and the total of the positive and negative values for any cycle (and any year) will be zero.
iii) R_t will also have positive or negative values, and in the long run its total will be zero.

ii) Decomposition by Multiplicative Hypothesis:


According to the multiplicative model, a time series can be expressed as

Y_t = T_t × S_t × C_t × R_t.

The multiplicative decomposition of a time series is the same as the additive decomposition of the logarithmic values of the original time series, i.e.,

log Y_t = log T_t + log S_t + log C_t + log R_t.

Assumptions:
i) All the values are positive.
ii) The geometric means of S_t in a year, of C_t in a cycle, and of R_t in the long run are unity.

iii) Mixed Model:
The various combinations of the additive and multiplicative models are known as mixed models; for example, some components may combine multiplicatively while the random component enters additively.
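As an illustration of these hypotheses, here is a minimal Python sketch (assuming numpy is available; the particular trend, seasonal and noise functions, and the seed, are invented for illustration). It generates one series under the additive model and one under a multiplicative model, and shows that the latter is additive on the log scale.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(120)                          # 10 years of monthly data

trend = 100 + 0.5 * t                       # trend component T_t
seasonal = 10 * np.sin(2 * np.pi * t / 12)  # period-12 seasonal component S_t
noise = rng.normal(0, 3, size=t.size)       # random component R_t

# Additive hypothesis: components add with equal absolute effect.
y_add = trend + seasonal + noise

# Multiplicative hypothesis: positive factors multiply, so the seasonal
# swings grow with the level of the series.
y_mult = trend * (1 + seasonal / 100) * (1 + noise / 100)

# The multiplicative model is additive on the log scale.
log_y = np.log(y_mult)
```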

Time Plot:
The first step in analyzing a time series is to plot the observations against time. This plot is known as a time plot. It is similar to a scatter diagram. It will show up important features such as trend, seasonality, discontinuities and outliers.

See Figure 5, or Figure 1-2 (p. 3) and Figure 1-7 (p. 9) in Brockwell.

Time Series Model:


A time series model for the observed data {x_t} is a specification of the joint distributions (or possibly only the means and covariances) of a sequence of random variables {X_t} of which {x_t} is postulated to be a realization.

Some Zero-Mean Models:

IID Noise:



The simplest model for a time series is one in which there is no trend or seasonal component and in which the observations are simply independent and identically distributed (iid) random variables with zero mean. We refer to such a sequence of random variables X_1, X_2, ... as IID noise.

By definition we can write, for any positive integer n and real numbers x_1, ..., x_n,

P(X_1 ≤ x_1, ..., X_n ≤ x_n) = P(X_1 ≤ x_1) ⋯ P(X_n ≤ x_n) = F(x_1) ⋯ F(x_n),

where F(·) is the cumulative distribution function of each of the identically distributed random variables X_1, X_2, .... In this model there is no dependence between observations. In particular, for all h ≥ 1 and all x, x_1, ..., x_n,

P(X_{n+h} ≤ x | X_1 = x_1, ..., X_n = x_n) = P(X_{n+h} ≤ x).

This shows that knowledge of X_1, ..., X_n is of no value for predicting the behavior of X_{n+h}.

Given the values of X_1, ..., X_n, the function f of (X_1, ..., X_n) that minimizes the mean squared error E[(X_{n+h} − f(X_1, ..., X_n))²] is in fact identically zero.

Although this means that IID noise is a rather uninteresting process for forecasters, it plays an important role as a building block for more complicated time series models.

Example (Binary Process):


Let us consider a sequence {X_t} of iid random variables with

P(X_t = 1) = p and P(X_t = −1) = 1 − p,

where p = 1/2. The time series obtained by tossing a penny repeatedly and scoring +1 for each head and −1 for each tail is usually modeled as a realization of this process.

Random Walk:
The random walk {S_t, t = 0, 1, 2, ...} is obtained by cumulatively summing (or 'integrating') iid random variables. Thus a random walk with zero mean is obtained by defining S_0 = 0 and

S_t = X_1 + X_2 + ⋯ + X_t, for t = 1, 2, ...,

where {X_t} is IID noise. If {X_t} is the binary process defined above, then {S_t, t = 0, 1, 2, ...} is called a simple symmetric random walk.
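A minimal simulation sketch of these two processes, assuming numpy (the length n and seed are invented for illustration): the binary process is generated by fair coin tosses scored ±1, and the simple symmetric random walk is its cumulative sum.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Binary process: iid X_t with P(X_t = 1) = P(X_t = -1) = 1/2.
x = rng.choice([1, -1], size=n)

# Simple symmetric random walk: S_0 = 0, S_t = X_1 + ... + X_t.
s = np.concatenate(([0], np.cumsum(x)))
```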

Models with Trend and Seasonality:


For data such as the population series, which contains no apparent periodic component, the model will be of the form

X_t = m_t + Y_t, t = 1, ..., n,

where m_t is a slowly changing function known as the trend component and Y_t has zero mean. The trend m_t is estimated by the method of least squares.

In the least squares procedure we attempt to fit a parametric family of functions, e.g.,

m_t = a_0 + a_1 t + a_2 t²,

to the data by choosing the parameters a_0, a_1 and a_2 to minimize Σ_t (x_t − m_t)².
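A sketch of this least squares fit using numpy (np.polyfit minimizes exactly this sum of squares; the simulated series below, including its coefficients and seed, is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100)
y = 5 + 0.3 * t + 0.02 * t**2 + rng.normal(0, 2, size=t.size)  # data with a quadratic trend

# Choose (a0, a1, a2) to minimize sum_t (y_t - m_t)^2.
a2, a1, a0 = np.polyfit(t, y, deg=2)
m_hat = a0 + a1 * t + a2 * t**2   # estimated trend m_t
residuals = y - m_hat             # estimated Y_t; should have mean near zero
```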

A General Approach to Time Series Modeling:


1) Plot the series and examine the main features of the graph, checking in particular whether there is:
a) a trend,
b) a seasonal component,
c) any apparent sharp changes in behavior,
d) any outlying observations.
2) Remove the trend and seasonal components to get stationary residuals (or irregular components). To achieve this goal it may sometimes be necessary to apply a preliminary transformation to the data.

For example, if the magnitude of the fluctuations appears to grow roughly linearly with the level of the series, then the transformed series {ln X_t} will have fluctuations of more constant magnitude. There are several ways in which trend and seasonality can be removed, some involving estimating the components and subtracting them from the data, and others depending on differencing the data, i.e., replacing the original series {x_t} by {y_t = x_t − x_{t−d}} for some positive integer d. Whichever method is used, the aim is to produce a stationary series.
3) Choose a model to fit the residuals, making use of various sample statistics, including the sample auto-correlation function.
4) Forecasting is achieved by forecasting the residuals and then inverting the transformations described above to arrive at forecasts of the original series {x_t}.

Stochastic Process:
A stochastic process may be defined as a collection of random variables {X_t, t ∈ T}, where T denotes the set of time points. We denote the random variable at time t by X_t if T is discrete and by X(t) if T is continuous. Thus a stochastic process is a collection of random variables that are ordered in time.



In time series analysis it is often impossible to make more than one observation at a given time, so that we may have only one observation on the random variable at time t. Nevertheless, we may regard the observed time series as just one example of the infinite set of time series that might have been observed. This infinite set of time series is sometimes called the ensemble. Every member of the ensemble is a possible realization of the stochastic process. The observed time series can be thought of as one particular realization, denoted by {x_1, ..., x_n} if the time points are discrete.

Stationary Process:
A time series {X_t, t = 0, ±1, ...} is said to be stationary if it has statistical properties similar to those of the "time-shifted" series {X_{t+h}, t = 0, ±1, ...} for each integer h. Put simply, a time series is stationary if its statistical properties are independent of time t.

Non-Stationary:
A time series is said to be non-stationary if its statistical properties depend on time t.
Weakly Stationary:
A time series {X_t} is said to be weakly stationary if its mean function and covariance function are independent of time t, i.e.,
i) μ_X(t) is independent of t, and
ii) γ_X(t + h, t) is independent of t for each h.
Usually "weakly stationary" is termed simply "stationary".

Strictly Stationary:
A time series {X_t, t = 0, ±1, ...} is said to be strictly stationary if the joint distribution of (X_{t_1}, ..., X_{t_n}) is the same as the joint distribution of (X_{t_1+h}, ..., X_{t_n+h}) for all integers h and all n ≥ 1.

Mean Function of Time Series:


Let {X_t} be a time series with E[X_t²] < ∞. Then the mean function of {X_t} is

μ_X(t) = E[X_t].

Covariance Function:
The covariance function of {X_t} is defined as

γ_X(r, s) = Cov(X_r, X_s) = E[(X_r − μ_X(r))(X_s − μ_X(s))]

for all integers r and s.

Properties of a Strictly Stationary Time Series {X_t}:

The main properties of {X_t} are given below:
i) The random variables X_t are identically distributed.
ii) (X_t, X_{t+h}) has the same joint distribution as (X_1, X_{1+h}) for all integers t and h.
iii) {X_t} is weakly stationary if E[X_t²] < ∞ for all t.
iv) Weak stationarity does not imply strict stationarity.
v) An iid sequence is strictly stationary.

Show that a Strictly Stationary Process with E[X_t²] < ∞ is Weakly Stationary, or: Properties of a Weakly Stationary Time Series.

Proof:
If E[X_t²] < ∞, then by the properties of strict stationarity:

i) the random variables X_t are identically distributed, so μ_X(t) = E[X_t] = E[X_1] is independent of t;

ii) (X_t, X_{t+h}) has the same joint distribution as (X_1, X_{1+h}) for all integers t and h, so

γ_X(t + h, t) = Cov(X_t, X_{t+h}) = Cov(X_1, X_{1+h}),

which is also independent of t. Thus {X_t} is weakly stationary.

Auto-Covariance Function:
Let {X_t} be any stationary time series. Then the auto-covariance function of {X_t} at lag h is defined as

γ_X(h) = Cov(X_{t+h}, X_t).
Auto-Correlation Function:

Let {X_t} be any stationary time series. Then the auto-correlation function of {X_t} at lag h is defined as

ρ_X(h) = γ_X(h)/γ_X(0) = Cor(X_{t+h}, X_t).
Linearity Property of Covariances:

If E[X²] < ∞, E[Y²] < ∞, E[Z²] < ∞ and a, b, c are any real constants, then

Cov(aX + bY + c, Z) = a Cov(X, Z) + b Cov(Y, Z).


Example: (IID noise):
If {X_t} is IID noise with E[X_t²] = σ² < ∞, then it is obvious that E[X_t] = 0 for all t, so the first requirement of weak stationarity is satisfied.

Since the X_t are independent,

γ_X(t + h, t) = σ² if h = 0, and γ_X(t + h, t) = 0 if h ≠ 0,

which does not depend on t.

Hence IID noise with finite second moment is stationary. We write {X_t} ~ IID(0, σ²) to indicate that the random variables X_t are independent and identically distributed, each with mean 0 and variance σ².

Show that when .

Proof:
By definition we have,



Example: (White Noise):
If {X_t} is a sequence of uncorrelated random variables, each with zero mean and variance σ², then clearly {X_t} is stationary with the same covariance function as IID noise:

γ_X(h) = σ² if h = 0, and γ_X(h) = 0 if h ≠ 0,

which does not depend on t. Such a sequence is referred to as white noise. This is indicated by the notation

{X_t} ~ WN(0, σ²).

Note: Every IID(0, σ²) sequence is WN(0, σ²), but not conversely.

Example: (Random Walk):


The random walk {S_t} (starting at zero) is obtained by cumulatively summing (or "integrating") iid random variables.

Let {S_t} be the random walk S_t = X_1 + ⋯ + X_t, where {X_t} is IID noise with E[X_t] = 0 and Var(X_t) = σ² for all t.

Obviously E[S_t] = 0 and E[S_t²] = tσ² < ∞ for all t.

We have, for h ≥ 0,

γ_S(t + h, t) = Cov(S_{t+h}, S_t) = Cov(S_t + X_{t+1} + ⋯ + X_{t+h}, S_t) = Cov(S_t, S_t) = tσ².

So γ_S(t + h, t) depends on t, and therefore the random walk {S_t} is non-stationary.



First Order Moving Average or MA(1) Process:
Let the series {X_t} be defined by the equation

X_t = Z_t + θZ_{t−1}, t = 0, ±1, ...,

where {Z_t} ~ WN(0, σ²) and θ is a real-valued constant.

Now, the mean function of {X_t} is

E[X_t] = E[Z_t] + θE[Z_{t−1}] = 0,

i.e., the mean function of {X_t} is independent of t.

Squaring the equation and taking expectations, we have

E[X_t²] = E[(Z_t + θZ_{t−1})²] = σ²(1 + θ²) < ∞.

Now the auto-covariance function of {X_t} is

γ_X(t + h, t) = Cov(Z_{t+h} + θZ_{t+h−1}, Z_t + θZ_{t−1}),

so that

γ_X(h) = σ²(1 + θ²) if h = 0; γ_X(h) = θσ² if h = ±1; γ_X(h) = 0 if |h| > 1,

which is independent of t. So we can say that {X_t} is stationary.

Finally, the auto-correlation function of {X_t} is ρ_X(h) = γ_X(h)/γ_X(0), so that

ρ_X(h) = 1 if h = 0; ρ_X(h) = θ/(1 + θ²) if h = ±1; ρ_X(h) = 0 if |h| > 1.
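A simulation sketch (numpy assumed; the values of n, θ, σ and the seed are invented for illustration) comparing the sample lag-1 autocorrelation of a simulated MA(1) series with the theoretical value θ/(1 + θ²):

```python
import numpy as np

rng = np.random.default_rng(7)
n, theta, sigma = 10_000, 0.6, 1.0

z = rng.normal(0, sigma, size=n + 1)   # Z_t ~ WN(0, sigma^2)
x = z[1:] + theta * z[:-1]             # X_t = Z_t + theta * Z_{t-1}

xbar = x.mean()
gamma0 = np.mean((x - xbar) ** 2)                   # sample gamma(0)
gamma1 = np.mean((x[1:] - xbar) * (x[:-1] - xbar))  # sample gamma(1)
print(gamma1 / gamma0, theta / (1 + theta**2))      # both close to 0.44
```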

First Order Auto-regressive or AR(1) Process:

Let us assume that {X_t} is a stationary series satisfying the equations

X_t = φX_{t−1} + Z_t, t = 0, ±1, ...,   (1)

where {Z_t} ~ WN(0, σ²), |φ| < 1, and Z_t is uncorrelated with X_s for each s < t. Find a solution of equation (1).

Solution:
We have, by repeated substitution,

X_t = φX_{t−1} + Z_t = φ(φX_{t−2} + Z_{t−1}) + Z_t = ⋯ = Z_t + φZ_{t−1} + φ²Z_{t−2} + ⋯ = Σ_{j=0}^{∞} φ^j Z_{t−j},

which is the required solution of equation (1).

Now, the mean function of {X_t} is

E[X_t] = Σ_{j=0}^{∞} φ^j E[Z_{t−j}] = 0,

i.e., the mean function of {X_t} is independent of t.

Again, multiplying equation (1) by X_{t−h} (h > 0) and taking expectations on both sides, we have

E[X_t X_{t−h}] = φ E[X_{t−1} X_{t−h}] + E[Z_t X_{t−h}].

Since E[Z_t X_{t−h}] = 0 for h > 0, the auto-covariance function satisfies

γ_X(h) = φ γ_X(h − 1) = ⋯ = φ^h γ_X(0).

Similarly, from the solution we have

γ_X(0) = Var(X_t) = σ² Σ_{j=0}^{∞} φ^{2j} = σ²/(1 − φ²).

The auto-correlation function is therefore

ρ_X(h) = γ_X(h)/γ_X(0) = φ^{|h|}.
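A sketch (numpy assumed; the parameter values and seed are invented for illustration) that simulates an AR(1) series and checks that the sample autocorrelations decay approximately like φ^h:

```python
import numpy as np

rng = np.random.default_rng(11)
n, phi, sigma = 20_000, 0.7, 1.0

z = rng.normal(0, sigma, size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t]        # X_t = phi * X_{t-1} + Z_t

def sample_acf(x, h):
    """Sample autocorrelation at lag h."""
    xbar = x.mean()
    g0 = np.mean((x - xbar) ** 2)
    gh = np.mean((x[h:] - xbar) * (x[:-h] - xbar))
    return gh / g0

for h in (1, 2, 3):
    print(h, sample_acf(x, h), phi**h)  # sample ACF vs. theoretical phi^h
```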


The Sample Auto-Correlation Function:
We know how to compute the auto-correlation function for a given time series model. In practical problems, however, we do not start with a model but with observed data x_1, x_2, ..., x_n. To assess the degree of dependence in the data, and to select a model for the data, we use the sample auto-correlation function (sample ACF) of the data. If the data are a realization of a stationary time series {X_t}, then the sample ACF provides an estimate of the ACF of {X_t}. The sample functions are given below.

Sample Mean: Let x_1, ..., x_n be observations of a time series. Then the sample mean is

x̄ = (1/n) Σ_{t=1}^{n} x_t.

Sample Auto-Covariance Function: The sample auto-covariance function is defined as

γ̂(h) = (1/n) Σ_{t=1}^{n−|h|} (x_{t+|h|} − x̄)(x_t − x̄), −n < h < n.

Sample Auto-Correlation Function: The sample auto-correlation function is defined as

ρ̂(h) = γ̂(h)/γ̂(0), −n < h < n.
Sample Covariance Matrix:


The sample covariance matrix is denoted by Γ̂_k and is defined as

Γ̂_k = [γ̂(i − j)], i, j = 1, ..., k.

Sample Correlation Matrix:

The sample correlation matrix is denoted by R̂_k and is defined by R̂_k = [ρ̂(i − j)] = Γ̂_k/γ̂(0), which is non-negative definite. Each of its diagonal elements is equal to 1, since ρ̂(0) = 1.

Non-Negative Definite:
A real-valued function κ defined on the integers is non-negative definite if

Σ_{i,j=1}^{n} a_i κ(i − j) a_j ≥ 0

for all positive integers n and all vectors a = (a_1, ..., a_n)′ with real-valued components a_i.

Importance of the Sample Auto-Correlation Function:

The sample auto-correlation function provides a useful measure of the degree of dependence between the values of a time series at different times, and for this reason plays an important role when we consider the prediction of future values of the series in terms of the past and present values.
Question: State and Justify the Basic Properties of the Auto-Covariance and Auto-Correlation Functions of a Stationary Time Series.

Answer:
The basic properties of the auto-covariance function γ(·) of a stationary time series are:
1. γ(0) ≥ 0,
2. |γ(h)| ≤ γ(0) for all h,
3. γ(·) is an even function, i.e., γ(h) = γ(−h) for all h.

Proof of the properties:

1. The first property is simply the statement that Var(X_t) ≥ 0.

2. By the Cauchy–Schwarz inequality,

|γ(h)| = |Cov(X_{t+h}, X_t)| ≤ √(Var(X_{t+h}) Var(X_t)) = γ(0).

3. By stationarity,

γ(h) = Cov(X_{t+h}, X_t) = Cov(X_t, X_{t−h}) = γ(−h).

The auto-covariance function has another fundamental property, namely that of non-negative definiteness.

Example:
Suppose we have simulated values of normally distributed IID(0, 1) noise. Since the auto-correlation function satisfies ρ(h) = 0 for h > 0, we expect the corresponding sample auto-correlations to be near 0. It can be shown, in fact, that for IID noise with finite variance, the sample auto-correlations ρ̂(h), h > 0, are approximately IID N(0, 1/n) for large n. Hence approximately 95% of the sample auto-correlations should fall between the bounds ±1.96/√n (since 1.96 is the 0.975 quantile of the standard normal distribution). If the sample auto-correlation function is calculated at lags 1, ..., 40, we would therefore expect roughly two values to fall outside the bounds.



Remark:
The sample auto-correlation and auto-covariance functions can be computed for any data set {x_1, ..., x_n} and are not restricted to observations from a stationary time series. For data containing a trend, |ρ̂(h)| will exhibit slow decay as h increases, and for data with a substantial deterministic periodic component, ρ̂(h) will exhibit similar behavior with the same periodicity. Thus ρ̂(·) can be useful as an indicator of non-stationarity.

Estimation and Elimination of Trend and Seasonal Components:


The first step in the analysis of any time series is to plot the data. If there are any apparent discontinuities in the series, such as a sudden change of level, then break it into homogeneous segments. If there are outlying observations, they should be studied carefully to check whether there is any justification for discarding them. The classical decomposition model is

X_t = m_t + s_t + Y_t,

where m_t is the trend component, s_t is the seasonal component and Y_t is the random noise component.

If the seasonal and noise fluctuations appear to increase with the level of the process, then a preliminary transformation of the data is often used to make the transformed data more compatible with the model. Our aim is to estimate and extract the deterministic components m_t and s_t in the hope that the residual or noise component Y_t will turn out to be a stationary time series. We can then use the theory of such processes to find a satisfactory probabilistic model for the process {Y_t}, to analyze its properties, and to use it in conjunction with m_t and s_t for purposes of prediction and simulation of {X_t}.

Another approach, developed by Box and Jenkins, is to apply differencing operators repeatedly to the series {X_t} until the differenced observations resemble a realization of some stationary time series {W_t}. We can then use the theory of stationary processes for the modeling, analysis and prediction of {W_t}, and hence of the original process.

Estimation and Elimination of Trend in the Absence of Seasonality:


In the absence of a seasonal component the model is defined by

X_t = m_t + Y_t, t = 1, ..., n, with E[Y_t] = 0.


a) Estimation of Trend:
The following methods can be used to estimate the trend:
1) Freehand or graphical method
2) Semi-average method
3) Smoothing:
i. Smoothing with a finite moving average filter
ii. Exponential smoothing
iii. Smoothing by elimination of high-frequency components
iv. Polynomial fitting by the principle of least squares



i. Smoothing with a Finite Moving Average Filter:
Let q be a non-negative integer and consider the two-sided moving average

W_t = (2q + 1)⁻¹ Σ_{j=−q}^{q} X_{t−j}

of the process {X_t} defined by X_t = m_t + Y_t. Then, for q + 1 ≤ t ≤ n − q,

W_t = (2q + 1)⁻¹ Σ_{j=−q}^{q} m_{t−j} + (2q + 1)⁻¹ Σ_{j=−q}^{q} Y_{t−j} ≈ m_t,

assuming that m_t is approximately linear over the interval [t − q, t + q] and that the average of the error terms over this interval is close to zero. Thus the moving average provides us with the estimates

m̂_t = (2q + 1)⁻¹ Σ_{j=−q}^{q} X_{t−j}, q + 1 ≤ t ≤ n − q.

Since X_t is not observed for t ≤ 0 or t > n, we cannot use this formula for t ≤ q or t > n − q.

It is useful to think of {m̂_t} as a process obtained from {X_t} by application of a linear operator or filter

m̂_t = Σ_j a_j X_{t−j}

with weights a_j = (2q + 1)⁻¹ for −q ≤ j ≤ q and a_j = 0 for |j| > q.

This particular filter is a low-pass filter in the sense that it takes the data {x_t} and removes from it the rapidly fluctuating (high-frequency) component, leaving the slowly varying estimated trend term {m̂_t}. This filter is only one of many that could be used for smoothing.

For large q, provided (2q + 1)⁻¹ Σ_{j=−q}^{q} Y_{t−j} ≈ 0, the filter will not only reduce noise but will also allow linear trend functions m_t = c_0 + c_1 t to pass without distortion. We must beware of choosing q too large, since if m_t is not linear the filtered process will not be a good estimate of m_t. The Spencer 15-point moving average is a filter that passes polynomials of degree 3 without distortion. Its weights are

a_j = 0 for |j| > 7, a_j = a_{−j}, and (a_0, a_1, ..., a_7) = (1/320)(74, 67, 46, 21, 3, −5, −6, −3).

Applied to the process X_t = m_t + Y_t with m_t a cubic polynomial, it gives

Σ_{j=−7}^{7} a_j X_{t−j} = Σ_{j=−7}^{7} a_j m_{t−j} + Σ_{j=−7}^{7} a_j Y_{t−j} ≈ Σ_{j=−7}^{7} a_j m_{t−j} = m_t,

where the last step depends on the assumed form of m_t, i.e., on the facts that Σ_j a_j = 1 and Σ_j j^r a_j = 0 for r = 1, 2, 3.
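A sketch of the equal-weight two-sided moving average filter described at the start of this subsection, using numpy convolution (the function name and NaN-padding of the undefined edges are choices made for this illustration):

```python
import numpy as np

def moving_average_trend(x, q):
    """Two-sided moving average estimate of the trend m_t,
    defined only for q+1 <= t <= n-q; edges are returned as NaN."""
    x = np.asarray(x, dtype=float)
    w = np.full(2 * q + 1, 1.0 / (2 * q + 1))   # equal weights a_j
    m_hat = np.full(len(x), np.nan)
    m_hat[q:len(x) - q] = np.convolve(x, w, mode="valid")
    return m_hat
```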

ii. Exponential Smoothing:


Since exponential smoothing is based on a moving average of past values only, it is often used for forecasting, the smoothed value at the present time being used as the forecast of the next value.

For any fixed α ∈ [0, 1], the one-sided moving averages m̂_t, t = 1, ..., n, are defined by the recursions

m̂_t = αX_t + (1 − α)m̂_{t−1}, t = 2, ..., n, with m̂_1 = X_1.

In forecasting notation the same recursion is often written as

F_{t+1} = αY_t + (1 − α)F_t,

where F_{t+1} is the forecast of the time series for period t + 1, Y_t is the actual value of the time series in period t, α is the smoothing constant, and F_t is the forecast of the time series for period t; the first forecast is set equal to the first observed value.

Application of these recursions is referred to as exponential smoothing, since they imply that, for t ≥ 2,

m̂_t = Σ_{j=0}^{t−2} α(1 − α)^j X_{t−j} + (1 − α)^{t−1} X_1,

a weighted moving average of X_t, X_{t−1}, ..., X_1 with weights decreasing exponentially (except for the last one).
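A minimal sketch of the recursion (numpy assumed; alpha is the smoothing constant):

```python
import numpy as np

def exponential_smoothing(x, alpha):
    """One-sided exponentially weighted moving average:
    m[0] = x[0]; m[t] = alpha * x[t] + (1 - alpha) * m[t-1]."""
    x = np.asarray(x, dtype=float)
    m = np.empty_like(x)
    m[0] = x[0]
    for t in range(1, len(x)):
        m[t] = alpha * x[t] + (1 - alpha) * m[t - 1]
    return m

# The smoothed value at time t serves as the forecast of the value at t + 1.
```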

iii. Polynomial Fitting:


A trend of the form m_t = a_0 + a_1 t + a_2 t² can be fitted to the data {x_t} by choosing the parameters a_0, a_1 and a_2 to minimize the sum of squares Σ_{t=1}^{n} (x_t − m_t)².

The method of least squares can also be used to estimate higher-order polynomial trends in the same way.

b) Trend Elimination by Differencing:


We can eliminate the trend term from a time series by differencing. For this we define the lag-1 difference operator ∇ by

∇X_t = X_t − X_{t−1} = (1 − B)X_t,

where B is the backward shift operator, BX_t = X_{t−1}.

Powers of the operators B and ∇ are defined in the obvious way: B^j X_t = X_{t−j} and ∇^j(X_t) = ∇(∇^{j−1}(X_t)), j ≥ 1, with ∇⁰(X_t) = X_t. Polynomials in B and ∇ are manipulated in precisely the same way as polynomial functions of real variables.

For example,

∇²X_t = ∇(∇X_t) = (1 − B)(1 − B)X_t = (1 − 2B + B²)X_t = X_t − 2X_{t−1} + X_{t−2}.

If the operator ∇ is applied to a linear trend function m_t = c_0 + c_1 t, then we obtain the constant function

∇m_t = m_t − m_{t−1} = c_0 + c_1 t − (c_0 + c_1(t − 1)) = c_1.

In the same way, any polynomial trend of degree k can be reduced to a constant by application of the operator ∇^k.

For example, if X_t = m_t + Y_t, where m_t = Σ_{j=0}^{k} c_j t^j and Y_t is stationary with mean zero, application of ∇^k gives

∇^k X_t = k! c_k + ∇^k Y_t,

a stationary process with mean k! c_k.

These considerations suggest the possibility, given any sequence {x_t} of data, of applying the operator ∇ repeatedly until we find a sequence {∇^k x_t} that can plausibly be modeled as a realization of a stationary process.
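A sketch of trend elimination by differencing (numpy assumed; the simulated series and its coefficients are invented for illustration). Each call to np.diff applies the operator ∇ once:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(200)
x = 2.0 + 0.5 * t + 0.01 * t**2 + rng.normal(0, 1, size=t.size)  # quadratic trend + noise

d1 = np.diff(x)        # applies the difference operator once: (1 - B)X_t
d2 = np.diff(x, n=2)   # applies it twice: (1 - B)^2 X_t

# For a degree-2 trend, d2 fluctuates around the constant 2! * c_2 = 0.02.
print(d2.mean())
```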

Advantage of Differencing:
This method has the advantage that it usually requires the estimation of fewer parameters and
does not rest on the assumption of a trend that remains fixed throughout the observation period.

Estimation and Elimination of Both Trend and Seasonality:


Suppose that a time series can be partitioned into trend, seasonal and noise components such that

X_t = m_t + s_t + Y_t, t = 1, ..., n,

where E[Y_t] = 0, s_{t+d} = s_t and Σ_{j=1}^{d} s_j = 0, where d is the number of seasons (the period of a year).

a) Estimation of Trend and Seasonal Component:


Suppose we have observations {x_1, ..., x_n}. The trend is first estimated by applying a moving average filter specially chosen to eliminate the seasonal component and to reduce the noise.

If the period d is even, i.e., d = 2q, then we use

m̂_t = (0.5x_{t−q} + x_{t−q+1} + ⋯ + x_{t+q−1} + 0.5x_{t+q})/d, q < t ≤ n − q.

If the period d is odd, i.e., d = 2q + 1, then we use the simple moving average

m̂_t = (2q + 1)⁻¹ Σ_{j=−q}^{q} x_{t−j}, q + 1 ≤ t ≤ n − q.

The second step is to estimate the seasonal component. For each k = 1, ..., d, we compute the average w_k of the deviations {(x_{k+jd} − m̂_{k+jd}), q < k + jd ≤ n − q}. Since these average deviations do not necessarily sum to zero, we estimate the seasonal component as

ŝ_k = w_k − d⁻¹ Σ_{i=1}^{d} w_i, k = 1, ..., d, with ŝ_k = ŝ_{k−d} for k > d.

The deseasonalized (seasonally adjusted) data are then defined to be the original series with the estimated seasonal component removed:

d_t = x_t − ŝ_t, t = 1, ..., n.

Finally, we re-estimate the trend from the deseasonalized data. The estimated noise series is then given by

Ŷ_t = x_t − m̂_t − ŝ_t, t = 1, ..., n,

obtained by subtracting the estimated seasonal and trend components from the original data.
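A sketch of these estimation steps for an even period d = 2q (numpy assumed; the function name, the NaN handling at the edges, and the omission of the final trend re-estimation step are simplifications made for this illustration):

```python
import numpy as np

def classical_decomposition(x, d):
    """Estimate trend and seasonal components for even period d = 2q."""
    x = np.asarray(x, dtype=float)
    n, q = len(x), d // 2

    # Step 1: centered moving average with half weights at the two ends.
    w = np.r_[0.5, np.ones(d - 1), 0.5] / d          # length 2q + 1, sums to 1
    m_hat = np.full(n, np.nan)
    m_hat[q:n - q] = np.convolve(x, w, mode="valid")

    # Step 2: average the deviations x_t - m_hat_t within each season k,
    # then center the averages so the seasonal estimates sum to zero.
    dev = x - m_hat
    w_k = np.array([np.nanmean(dev[k::d]) for k in range(d)])
    s_one_period = w_k - w_k.mean()
    s_hat = np.resize(s_one_period, n)               # repeat with period d

    deseasonalized = x - s_hat                       # seasonally adjusted data
    noise = x - m_hat - s_hat                        # estimated noise Y_t
    return m_hat, s_hat, deseasonalized, noise
```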

b) Elimination of Trend and Seasonal Component by Differencing:


Let us suppose that the time series data contain trend and seasonality with period d, so that the time series can be partitioned as

X_t = m_t + s_t + Y_t, where E[Y_t] = 0 and s_{t+d} = s_t.

The first step is to eliminate the seasonal component from {X_t} by introducing the lag-d differencing operator ∇_d defined by

∇_d X_t = X_t − X_{t−d} = (1 − B^d)X_t.

Applying the operator ∇_d to the model X_t = m_t + s_t + Y_t, where {s_t} has period d, we obtain

∇_d X_t = m_t − m_{t−d} + Y_t − Y_{t−d},

which gives a decomposition of the difference ∇_d X_t into a trend component (m_t − m_{t−d}) and a noise term (Y_t − Y_{t−d}). The trend m_t − m_{t−d} can then be eliminated from ∇_d X_t by applying a power of the operator ∇. The resulting differenced series needs to be analyzed in order to see whether it is stationary or not.



Testing the Estimated Noise Sequence:
The residuals are obtained either by differencing the data or by estimating and subtracting the trend and seasonal components. If there is no dependence between these residuals, we can regard them as observations of independent random variables, and there is no further modeling to be done except to estimate their mean and variance.

We now examine some simple tests for checking the hypothesis that the residuals are independent and identically distributed random variables.

This can be tested by the following methods:

i) The sample auto-correlation function
ii) The Portmanteau test
iii) The turning point test
iv) The difference-sign test
v) The rank test
vi) Checking for normality

i) The Sample Auto-correlation Function:


Let Y_1, ..., Y_n be an iid sequence with finite variance. Then for large n the sample auto-correlations of the sequence are approximately iid N(0, 1/n).

Hence, if y_1, ..., y_n is a realization of the iid sequence, about 95% of the sample auto-correlations should fall between the bounds ±1.96/√n. Suppose we compute ρ̂(h) up to lag 40.

Decision Rule:
If more than two or three values of ρ̂(h) fall outside the bounds ±1.96/√n, or if one value falls far outside the bounds, then we reject the null hypothesis that the sequence is iid; otherwise we do not reject it.

ii) The Portmanteau Test:



To test the iid hypothesis we can consider the test statistic

Q = n Σ_{j=1}^{h} ρ̂²(j).

We know that if Y_1, ..., Y_n is an iid sequence, then Q is approximately distributed as the sum of squares of h independent N(0, 1) random variables, i.e., as chi-squared with h degrees of freedom.

A large value of Q suggests that the sample auto-correlations of the data are too large for an iid sequence. We therefore reject the iid hypothesis at level α if Q > χ²_{1−α}(h), where χ²_{1−α}(h) is the (1 − α) quantile of the chi-squared distribution with h degrees of freedom.

Refinements of this Test:

a) Ljung and Box (1978) modified Q, defining

Q_LB = n(n + 2) Σ_{j=1}^{h} ρ̂²(j)/(n − j),

whose distribution is better approximated by the chi-squared distribution with h degrees of freedom.

b) Another Portmanteau test, formulated by McLeod and Li (1983), can be used for testing whether the data are an iid sequence of normally distributed random variables. They defined the test statistic

Q_ML = n(n + 2) Σ_{j=1}^{h} ρ̂²_WW(j)/(n − j),

where ρ̂_WW(j) is the sample auto-correlation of the squared data at lag j.

The hypothesis of iid normal data is then rejected at level α if the observed value of Q_ML is larger than the (1 − α) quantile of the χ²(h) distribution.
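A sketch of the Ljung–Box statistic (numpy and scipy assumed; scipy.stats.chi2.sf supplies the chi-squared tail probability):

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(x, h):
    """Ljung-Box statistic Q_LB over lags 1..h and its chi-squared p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    g0 = np.mean((x - xbar) ** 2)
    rho = np.array([np.sum((x[j:] - xbar) * (x[:n - j] - xbar)) / (n * g0)
                    for j in range(1, h + 1)])
    q = n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, h + 1)))
    return q, chi2.sf(q, df=h)   # reject the iid hypothesis at level alpha if p < alpha
```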

iii) The Turning Point Test:

If y_1, ..., y_n is a sequence of observations, we say that there is a turning point at time i, 1 < i < n, if

y_{i−1} < y_i and y_i > y_{i+1}, or y_{i−1} > y_i and y_i < y_{i+1}.

(See Figure 6.)

For three successive points of an iid sequence, four of the six equally likely orderings give a turning point at the middle point, so the probability of a turning point at time i is 2/3. If T is the total number of turning points of an iid sequence of length n, then the expected value of T is

μ_T = E[T] = 2(n − 2)/3,

and the variance of T is

σ²_T = Var(T) = (16n − 29)/90.

A value of T much larger than μ_T indicates that the series is fluctuating more rapidly than expected for an iid sequence. On the other hand, a value of T much smaller than μ_T indicates a positive correlation between neighboring observations. For an iid sequence with large n, T is approximately N(μ_T, σ²_T).

The test statistic is

Z = (T − μ_T)/σ_T, which is approximately N(0, 1).

This means we can carry out a test of the iid hypothesis, rejecting it at level α if |Z| > z_{1−α/2}, where z_{1−α/2} is the (1 − α/2) quantile of the standard normal distribution.
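A sketch of the turning point test (numpy and scipy assumed; the function name and two-sided p-value output are choices made for this illustration):

```python
import numpy as np
from scipy.stats import norm

def turning_point_test(y):
    """Two-sided turning point test of the iid hypothesis; returns (T, p-value)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    mid, left, right = y[1:-1], y[:-2], y[2:]
    t = np.sum(((mid > left) & (mid > right)) | ((mid < left) & (mid < right)))
    mu = 2 * (n - 2) / 3                    # E[T] under the iid hypothesis
    sigma = np.sqrt((16 * n - 29) / 90)     # sqrt(Var(T))
    z = (t - mu) / sigma
    return t, 2 * norm.sf(abs(z))           # reject iid at level alpha if p < alpha
```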

iv) The Difference Sign Test:


Let y_1, ..., y_n be a sequence of observations. Let S be the number of values of i, i = 2, ..., n, for which the difference y_i − y_{i−1} is positive. Assume that there are no zero differences in the sequence; if a zero difference occurs, it is dropped from the analysis and n is reduced accordingly. For an iid sequence we have

μ_S = E[S] = (n − 1)/2 and σ²_S = Var(S) = (n + 1)/12.

If n is large, then S is approximately N(μ_S, σ²_S).

The test statistic is

Z = (S − μ_S)/σ_S, which is approximately N(0, 1).

A large positive (or negative) value of S − μ_S indicates the presence of an increasing (or decreasing) trend in the data. We therefore reject the assumption of no trend in the data if |Z| > z_{1−α/2}, where z_{1−α/2} is the (1 − α/2) quantile of the standard normal distribution.



v) The Rank Test:
The rank test is particularly useful for detecting a linear trend in the data. Define P to be the number of pairs (i, j) such that y_j > y_i and j > i, i = 1, ..., n − 1.

There is a total of n(n − 1)/2 pairs (i, j) with j > i. For an iid sequence each event {y_j > y_i} has probability 1/2, and the mean of P is

μ_P = E[P] = n(n − 1)/4.

It can also be shown that the variance of P is

σ²_P = Var(P) = n(n − 1)(2n + 5)/72.

If n is large, then P is approximately N(μ_P, σ²_P).

The test statistic is

Z = (P − μ_P)/σ_P, which is approximately N(0, 1).

A large positive (negative) value of P − μ_P indicates the presence of an increasing (decreasing) trend in the data. The assumption that the sample is from an iid sequence is therefore rejected at level α if |Z| > z_{1−α/2}, where z_{1−α/2} is the (1 − α/2) quantile of the standard normal distribution.

vi) Checking for Normality:


If the noise process is Gaussian, i.e., if all of its joint distributions are normal, then stronger conclusions can be drawn when a model is fitted to the data. The following test enables us to check whether it is reasonable to assume that observations from an iid sequence are also Gaussian.

Let Y_(1) < Y_(2) < ⋯ < Y_(n) be the order statistics of a random sample Y_1, ..., Y_n from the distribution N(μ, σ²). If X_(1) < X_(2) < ⋯ < X_(n) are the order statistics from an N(0, 1) sample of size n, then

E[Y_(j)] = μ + σ m_j,

where m_j = E[X_(j)], j = 1, ..., n. Thus a graph of the points (m_1, Y_(1)), ..., (m_n, Y_(n)) (known as a Gaussian q-q plot) should be approximately linear. However, if the sample values are not normally distributed, then the graph should be nonlinear. Consequently, the squared correlation of the points (m_j, Y_(j)), j = 1, ..., n, should be near 1 if the normal assumption is correct. The assumption of normality is therefore rejected if the squared correlation is sufficiently small. If we approximate m_j by Φ⁻¹((j − 0.5)/n), then the squared correlation reduces to

R² = [Σ_{j=1}^{n} (Y_(j) − Ȳ)m_j]² / [Σ_{i=1}^{n} (Y_(i) − Ȳ)² Σ_{j=1}^{n} m_j²],

where Ȳ = (Y_1 + ⋯ + Y_n)/n. Percentage points for the distribution of R², assuming normality of the sample values, are given by Shapiro and Francia (1972) for sample sizes n < 100. For n = 200, P(R² < 0.987) ≈ 0.05 and P(R² < 0.989) ≈ 0.10.
