
Stationary process


In mathematics and statistics, a stationary process (also called a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as mean and variance also do not change over time.

Since stationarity is an assumption underlying many statistical procedures used in time series analysis, non-stationary data are often transformed to become stationary. The most common cause of violation of stationarity is a trend in the mean, which can be due either to the presence of a unit root or of a deterministic trend. In the former case of a unit root, stochastic shocks have permanent effects, and the process is not mean-reverting. In the latter case of a deterministic trend, the process is called a trend-stationary process, and stochastic shocks have only transitory effects after which the variable tends toward a deterministically evolving (non-constant) mean.

A trend-stationary process is not strictly stationary, but can easily be transformed into a stationary process by removing the underlying trend, which is solely a function of time. Similarly, processes with one or more unit roots can be made stationary through differencing. An important type of non-stationary process that does not exhibit trend-like behavior is a cyclostationary process, which is a stochastic process that varies cyclically with time.

For many applications strict-sense stationarity is too restrictive. Other forms of stationarity, such as wide-sense stationarity or N-th-order stationarity, are then employed. The definitions for different kinds of stationarity are not consistent among different authors (see Other terminology).

Strict-sense stationarity

Definition

Formally, let $\{X_t\}$ be a stochastic process and let $F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau})$ represent the cumulative distribution function of the unconditional (i.e., with no reference to any particular starting value) joint distribution of $\{X_t\}$ at times $t_1+\tau, \ldots, t_n+\tau$. Then, $\{X_t\}$ is said to be strictly stationary, strongly stationary or strict-sense stationary[1]: p. 155 if

$$F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau}) = F_X(x_{t_1}, \ldots, x_{t_n}) \quad \text{for all } \tau, t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \mathbb{N}_{>0} \tag{Eq. 1}$$

Since $\tau$ does not affect $F_X(\cdot)$, $F_X$ is independent of time.

Examples

[Figure: Two simulated time series processes, one stationary and the other non-stationary. The augmented Dickey–Fuller (ADF) test statistic is reported for each process; non-stationarity cannot be rejected for the second process at a 5% significance level.]
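The comparison described in the figure is straightforward to recreate. The sketch below, assuming NumPy and the statsmodels package (whose adfuller function implements the ADF test), simulates a stationary AR(1) process and a random walk with a unit root, then applies the test to each; the coefficient 0.5 and the series length are illustrative choices.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller  # augmented Dickey-Fuller test

rng = np.random.default_rng(0)
n = 500
eps = rng.standard_normal(n)

# Stationary AR(1): x_t = 0.5 * x_{t-1} + eps_t (|phi| < 1, mean-reverting).
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + eps[t]

# Non-stationary random walk (unit root): y_t = y_{t-1} + eps_t.
y = np.cumsum(eps)

for name, series in [("AR(1)", x), ("random walk", y)]:
    stat, pvalue, *_ = adfuller(series)
    # Null hypothesis: a unit root is present; a small p-value rejects it.
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
```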

White noise is the simplest example of a stationary process.

An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving average processes, which are both subsets of the autoregressive moving average model. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values, and important non-stationary special cases are where unit roots exist in the model.

Example 1

Let $Y$ be any scalar random variable, and define a time series $\{X_t\}$ by

$$X_t = Y \quad \text{for all } t.$$

Then $\{X_t\}$ is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by $Y$, rather than taking the expected value of $Y$. The time average of $X_t$ does not converge since the process is not ergodic.
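A short simulation can make the failure of the law of large numbers concrete. This sketch assumes $Y$ is standard normal (any scalar random variable would do): each realisation is a constant path, so its time average is that realisation's own random $Y$ rather than $\mathrm{E}[Y]$.

```python
import numpy as np

rng = np.random.default_rng(1)

def realisation(n=1000):
    # Draw Y once per realisation, then hold it fixed: X_t = Y for all t.
    Y = rng.standard_normal()
    return np.full(n, Y)

# Five realisations give five different (random) time averages, not
# E[Y] = 0: the time average and the ensemble average disagree.
print([realisation().mean() for _ in range(5)])

# Averaging over many independent realisations does recover E[Y].
print(np.mean([realisation().mean() for _ in range(10000)]))  # close to 0
```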

Example 2

As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let $Y$ have a uniform distribution on $[0, 2\pi]$ and define the time series $\{X_t\}$ by

$$X_t = \cos(t + Y) \quad \text{for } t \in \mathbb{R}.$$

Then $\{X_t\}$ is strictly stationary since $(t + Y)$ modulo $2\pi$ follows the same uniform distribution as $Y$ for any $t$.
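As an illustrative check (a simulation sketch, not part of the article's argument), one can draw many random phases and verify that the marginal moments do not depend on $t$, even though every individual path is a clean cosine.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 20.0, 201)

# One realisation per draw of Y ~ Uniform[0, 2*pi]: a phase-shifted cosine.
paths = np.array([np.cos(t + rng.uniform(0.0, 2.0 * np.pi))
                  for _ in range(5000)])

# The marginal mean (~0) and variance (~1/2) are constant across t,
# consistent with strict stationarity.
print(paths.mean(axis=0)[:5])
print(paths.var(axis=0)[:5])
```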

Example 3

Keep in mind that a weakly white noise is not necessarily strictly stationary. Let $\omega$ be a random variable uniformly distributed in the interval $(0, 2\pi)$ and define the time series $\{z_t\}$ by

$$z_t = \cos(t\omega) \quad (t = 1, 2, \ldots)$$

Then

$$\mathrm{E}(z_t) = \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\, d\omega = 0,$$
$$\operatorname{Var}(z_t) = \frac{1}{2\pi}\int_0^{2\pi} \cos^2(t\omega)\, d\omega = \frac{1}{2},$$
$$\operatorname{Cov}(z_t, z_j) = \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\cos(j\omega)\, d\omega = 0 \quad \text{for all } t \neq j.$$

So $\{z_t\}$ is a white noise in the weak sense (the mean and cross-covariances are zero, and the variances are all the same); however, it is not strictly stationary.
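A Monte Carlo sketch (an illustration added here, not from the source) confirms the weak white-noise moments and hints at why strict stationarity fails: $z_2 = 2z_1^2 - 1$ by the double-angle formula, a deterministic relation that differs from the one linking, say, $z_2$ and $z_3$, so the joint law is not shift-invariant.

```python
import numpy as np

rng = np.random.default_rng(3)
n_draws, T = 200000, 6

# omega ~ Uniform(0, 2*pi); z_t = cos(t * omega) for t = 1..T.
omega = rng.uniform(0.0, 2.0 * np.pi, size=n_draws)
z = np.cos(np.outer(np.arange(1, T + 1), omega))  # shape (T, n_draws)

print(z.mean(axis=1))    # E[z_t] ~ 0 for every t
print(z.var(axis=1))     # Var(z_t) ~ 1/2 for every t
print(np.cov(z)[0, 1:])  # Cov(z_1, z_j) ~ 0 for j != 1

# Strict stationarity fails: z_2 is an exact function of z_1.
print(np.max(np.abs(z[1] - (2.0 * z[0] ** 2 - 1.0))))  # ~0
```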

Nth-order stationarity

In Eq.1, the distribution of $n$ samples of the stochastic process must be equal to the distribution of the samples shifted in time for all $n$. N-th-order stationarity is a weaker form of stationarity where this is only requested for all $n$ up to a certain order $N$. A random process $\{X_t\}$ is said to be N-th-order stationary[1]: p. 152 if:

$$F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau}) = F_X(x_{t_1}, \ldots, x_{t_n}) \quad \text{for all } \tau, t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \{1, \ldots, N\} \tag{Eq. 2}$$

Weak or wide-sense stationarity

Definition

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the 1st moment (i.e. the mean) and autocovariance do not vary with respect to time and that the 2nd moment is finite for all times. Any strictly stationary process which has a finite mean and covariance is also WSS.[2]: p. 299

So, a continuous-time random process $\{X_t\}$ which is WSS has the following restrictions on its mean function $m_X(t) \triangleq \mathrm{E}[X_t]$ and autocovariance function $K_{XX}(t_1, t_2) \triangleq \mathrm{E}[(X_{t_1} - m_X(t_1))(X_{t_2} - m_X(t_2))]$:

$$m_X(t) = m_X(t + \tau) \quad \text{for all } \tau, t \in \mathbb{R}$$
$$K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R}$$
$$\mathrm{E}[|X_t|^2] < \infty \quad \text{for all } t \in \mathbb{R} \tag{Eq. 3}$$

The first property implies that the mean function $m_X(t)$ must be constant. The second property implies that the autocovariance function depends only on the difference between $t_1$ and $t_2$ and only needs to be indexed by one variable rather than two.[1]: p. 159 Thus, instead of writing $K_{XX}(t_1 - t_2, 0)$, the notation is often abbreviated by the substitution $\tau = t_1 - t_2$:

$$K_{XX}(\tau) \triangleq K_{XX}(t_1 - t_2, 0)$$

This also implies that the autocorrelation depends only on $\tau = t_1 - t_2$, that is

$$R_X(t_1, t_2) = R_X(t_1 - t_2, 0) \triangleq R_X(\tau).$$

The third property says that the second moments must be finite for any time $t$.
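Because Eq.3 lets the autocovariance be indexed by a single lag $\tau$, it can be estimated from one long realisation. Below is a minimal estimation sketch, assuming NumPy and using a stationary AR(1) process as the test signal; the helper sample_autocovariance is introduced here purely for illustration.

```python
import numpy as np

def sample_autocovariance(x, max_lag):
    """Biased sample autocovariance K(tau), tau = 0..max_lag, assuming the
    series is wide-sense stationary so one long realisation suffices."""
    x = np.asarray(x, dtype=float)
    n = x.size
    m = x.mean()  # estimate of the constant mean m_X
    return np.array([np.sum((x[:n - k] - m) * (x[k:] - m)) / n
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(4)
n, phi = 100000, 0.7
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):  # stationary AR(1), a standard WSS example
    x[t] = phi * x[t - 1] + eps[t]

# For a stationary AR(1) with unit noise variance,
# K(tau) = phi**tau / (1 - phi**2); the estimate should be close.
print(sample_autocovariance(x, 5))
print([phi ** k / (1 - phi ** 2) for k in range(6)])
```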

Motivation

The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let H be the Hilbert space generated by $\{x(t)\}$ (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure $\mu$ on the real line such that H is isomorphic to the Hilbert subspace of $L^2(\mu)$ generated by $\{e^{-2\pi i \xi \cdot t}\}$. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process $\omega_\xi$ with orthogonal increments such that, for all $t$,

$$X_t = \int e^{-2\pi i \lambda \cdot t} \, d\omega_\lambda,$$

where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with linear, time-invariant (LTI) filters, it is
helpful to think of the correlation function as a linear operator. Since it is a circulant
operator (depends only on the difference between the two arguments), its
eigenfunctions are the Fourier complex exponentials. Additionally, since the
eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS
random signals is highly tractable—all computations can be performed in the frequency
domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
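A small numerical sketch (assuming NumPy and SciPy; the 3-point moving-average filter is an arbitrary choice) illustrates this frequency-domain tractability: for a WSS input, the output power spectral density is the input's multiplied by $|H(f)|^2$.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
x = rng.standard_normal(200000)  # white noise: the simplest WSS input

# A simple LTI filter (3-point moving average); any stable LTI system works.
b, a = np.ones(3) / 3.0, np.array([1.0])
y = signal.lfilter(b, a, x)

# For WSS input, the output is WSS with S_yy(f) = |H(f)|^2 * S_xx(f).
# Check with Welch PSD estimates and the filter's frequency response.
f, Pxx = signal.welch(x, fs=1.0, nperseg=4096)
_, Pyy = signal.welch(y, fs=1.0, nperseg=4096)
_, H = signal.freqz(b, a, worN=f, fs=1.0)

print(np.max(np.abs(Pyy - np.abs(H) ** 2 * Pxx)))  # small estimation error
```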

Definition for complex stochastic process

In the case where $\{X_t\}$ is a complex stochastic process, the autocovariance function is defined as

$$K_{XX}(t_1, t_2) = \mathrm{E}\left[(X_{t_1} - m_X(t_1))\overline{(X_{t_2} - m_X(t_2))}\right]$$

and, in addition to the requirements in Eq.3, it is required that the pseudo-autocovariance function

$$J_{XX}(t_1, t_2) = \mathrm{E}\left[(X_{t_1} - m_X(t_1))(X_{t_2} - m_X(t_2))\right]$$

depends only on the time lag. In formulas, $\{X_t\}$ is WSS if

$$m_X(t) = m_X(t + \tau) \quad \text{for all } \tau, t \in \mathbb{R}$$
$$K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R}$$
$$J_{XX}(t_1, t_2) = J_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R}$$
$$\mathrm{E}[|X(t)|^2] < \infty \quad \text{for all } t \in \mathbb{R} \tag{Eq. 4}$$

Joint stationarity

The concept of stationarity may be extended to two stochastic processes.

Joint strict-sense stationarity

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly strict-sense stationary if their joint cumulative distribution $F_{XY}(x_{t_1}, \ldots, x_{t_m}, y_{t_1'}, \ldots, y_{t_n'})$ remains unchanged under time shifts, i.e. if

$$F_{XY}(x_{t_1}, \ldots, x_{t_m}, y_{t_1'}, \ldots, y_{t_n'}) = F_{XY}(x_{t_1+\tau}, \ldots, x_{t_m+\tau}, y_{t_1'+\tau}, \ldots, y_{t_n'+\tau}) \quad \text{for all } \tau, t_1, \ldots, t_m, t_1', \ldots, t_n' \in \mathbb{R} \text{ and for all } m, n \in \mathbb{N} \tag{Eq. 5}$$

Joint (M + N)th-order stationarity

Two random processes $\{X_t\}$ and $\{Y_t\}$ are said to be jointly (M + N)-th-order stationary[1]: p. 159 if:

$$F_{XY}(x_{t_1}, \ldots, x_{t_m}, y_{t_1'}, \ldots, y_{t_n'}) = F_{XY}(x_{t_1+\tau}, \ldots, x_{t_m+\tau}, y_{t_1'+\tau}, \ldots, y_{t_n'+\tau}) \quad \text{for all } \tau, t_1, \ldots, t_m, t_1', \ldots, t_n' \in \mathbb{R} \text{ and for all } m \in \{1, \ldots, M\}, n \in \{1, \ldots, N\} \tag{Eq. 6}$$

Joint weak or wide-sense stationarity

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly wide-sense stationary if they are both wide-sense stationary and their cross-covariance function $K_{XY}(t_1, t_2) = \mathrm{E}[(X_{t_1} - m_X(t_1))(Y_{t_2} - m_Y(t_2))]$ depends only on the time difference $\tau = t_1 - t_2$. This may be summarized as follows:

$$m_X(t) = m_X(t + \tau) \quad \text{for all } \tau, t \in \mathbb{R}$$
$$m_Y(t) = m_Y(t + \tau) \quad \text{for all } \tau, t \in \mathbb{R}$$
$$K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R}$$
$$K_{YY}(t_1, t_2) = K_{YY}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R}$$
$$K_{XY}(t_1, t_2) = K_{XY}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R} \tag{Eq. 7}$$
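As with the autocovariance, joint wide-sense stationarity lets the cross-covariance be estimated as a function of the lag alone. A sketch under stated assumptions (NumPy; the delayed-noise pair is a made-up example, and sample_crosscovariance is a hypothetical helper):

```python
import numpy as np

def sample_crosscovariance(x, y, max_lag):
    """Biased sample cross-covariance K_XY(tau), tau = 0..max_lag, under
    the joint wide-sense stationarity assumptions of Eq. 7."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    mx, my = x.mean(), y.mean()
    return np.array([np.sum((x[:n - k] - mx) * (y[k:] - my)) / n
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(6)
e = rng.standard_normal(100000)
x = e              # white noise
y = np.roll(e, 2)  # the same noise delayed by two steps

# x and y are individually and jointly WSS; the cross-covariance peaks
# at the delay lag and is approximately zero elsewhere.
print(sample_crosscovariance(x, y, 4).round(2))  # ~[0, 0, 1, 0, 0]
```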

Relation between types of stationarity

● If a stochastic process is N-th-order stationary, then it is also M-th-order stationary for all $M \leq N$.
● If a stochastic process is second-order stationary ($N = 2$) and has finite second moments, then it is also wide-sense stationary.[1]: p. 159
● If a stochastic process is wide-sense stationary, it is not necessarily second-order stationary.[1]: p. 159
● If a stochastic process is strict-sense stationary and has finite second moments, it is wide-sense stationary.[2]: p. 299
● If two stochastic processes are jointly (M + N)-th-order stationary, this does not guarantee that the individual processes are M-th- respectively N-th-order stationary.[1]: p. 159

Other terminology

The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

● Priestley uses stationary up to order m if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order m.[3][4] Thus wide-sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.
● Honarkhah and Caers also use the assumption of stationarity in the context of multiple-point geostatistics, where higher n-point statistics are assumed to be stationary in the spatial domain.[5]

Differencing

One way to make some time series stationary is to compute the differences between consecutive observations. This is known as differencing. Differencing can help stabilize the mean of a time series by removing changes in the level of a time series, and so eliminating trends. It can also remove seasonality, if differences are taken appropriately (e.g. differencing observations 1 year apart to remove a yearly trend), as sketched below.
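A minimal differencing sketch, assuming NumPy (the drift term and the seasonal period of 12 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
eps = rng.standard_normal(1000)

# Random walk with drift: non-stationary in the mean.
y = np.cumsum(0.1 + eps)

# First differencing: diff[t] = y[t] - y[t-1] recovers a stationary series
# (here it equals 0.1 + eps[1:], i.e. white noise around the drift).
dy = np.diff(y)
print(dy.mean(), dy.std())  # roughly 0.1 and 1.0

# Seasonal differencing for monthly data with a yearly pattern: y[t] - y[t-12].
dy12 = y[12:] - y[:-12]
```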

Transformations such as logarithms can help to stabilize the variance of a time series.

One of the ways of identifying non-stationary time series is the ACF plot. Sometimes, patterns will be more visible in the ACF plot than in the original time series; however, this is not always the case.[6]
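A sketch of the ACF diagnostic, assuming NumPy and statsmodels' acf function (the two synthetic series are illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import acf  # sample autocorrelation function

rng = np.random.default_rng(8)
eps = rng.standard_normal(1000)

stationary = eps              # white noise
random_walk = np.cumsum(eps)  # unit-root process

# For a stationary series the sample ACF drops off quickly; for a
# non-stationary one it decays very slowly from values near 1.
print(acf(stationary, nlags=5).round(2))   # ~[1, 0, 0, 0, 0, 0]
print(acf(random_walk, nlags=5).round(2))  # e.g. [1, 0.99, 0.98, ...]
```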

Another approach to identifying non-stationarity is to look at the Laplace transform of a series, which will identify both exponential trends and sinusoidal seasonality (complex exponential trends). Related techniques from signal analysis such as the wavelet transform and Fourier transform may also be helpful.
See also
● Lévy process
● Stationary ergodic process
● Wiener–Khinchin theorem
● Ergodicity
● Statistical regularity
● Autocorrelation
● Whittle likelihood

References

● [1] Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
● [2] Florescu, Ionut (7 November 2014). Probability and Stochastic Processes. John Wiley & Sons. ISBN 978-1-118-59320-2.
● [3] Priestley, M. B. (1981). Spectral Analysis and Time Series. Academic Press. ISBN 0-12-564922-3.
● [4] Priestley, M. B. (1988). Non-linear and Non-stationary Time Series Analysis. Academic Press. ISBN 0-12-564911-8.
● [5] Honarkhah, M.; Caers, J. (2010). "Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling". Mathematical Geosciences. 42 (5): 487–517. Bibcode:2010MatGe..42..487H. doi:10.1007/s11004-010-9276-7.
● [6] "8.1 Stationarity and differencing". OTexts. Retrieved 2016-05-18.

Further reading

● Enders, Walter (2010). Applied Econometric Time Series (Third ed.). New York: Wiley. pp. 53–57. ISBN 978-0-470-50539-7.
● Jestrovic, I.; Coyle, J. L.; Sejdic, E. (2015). "The effects of increased fluid viscosity on stationary characteristics of EEG signal in healthy adults". Brain Research. 1589: 45–53. doi:10.1016/j.brainres.2014.09.035. PMC 4253861. PMID 25245522.
● Hyndman, R. J.; Athanasopoulos, G. (2013). Forecasting: Principles and Practice. OTexts. https://www.otexts.org/fpp/8/1
External links
● Spectral decomposition of a random function (Springer)
