Stationary process
In mathematics and statistics, a stationary process (also called a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as mean and variance also do not change over time.

A trend stationary process is not strictly stationary, but can easily be transformed into a stationary process by removing the underlying trend, which is solely a function of time. Similarly, processes with one or more unit roots can be made stationary through differencing. An important type of non-stationary process that does not include a trend-like behavior is a cyclostationary process, which is a stochastic process that varies cyclically with time.
Strict-sense stationarity

Definition

Formally, let $\{X_t\}$ be a stochastic process and let $F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau})$ represent the cumulative distribution function of the unconditional (i.e., with no reference to any particular starting value) joint distribution of $\{X_t\}$ at times $t_1+\tau,\ldots,t_n+\tau$. Then, $\{X_t\}$ is said to be strictly stationary, strongly stationary or strict-sense stationary[1]: p. 155 if

$F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau}) = F_X(x_{t_1},\ldots,x_{t_n})$ for all $\tau, t_1,\ldots,t_n \in \mathbb{R}$ and for all $n \in \mathbb{N}$.  (Eq.1)

Since $\tau$ does not affect $F_X(\cdot)$, $F_X$ is independent of time.

Examples
Figure: Two simulated time series processes, one stationary and the other non-stationary. The augmented Dickey–Fuller (ADF) test statistic is reported for each process; non-stationarity cannot be rejected for the second process at a 5% significance level.
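The figure's comparison can be reproduced in outline. The following is a minimal sketch; the caption does not specify the simulated processes, so a stationary AR(1) series and a random walk are assumed here. It uses the adfuller function from Python's statsmodels package.

```python
# Illustrative sketch, not the figure's actual code: ADF test on a
# stationary AR(1) process versus a non-stationary random walk.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500
eps = rng.standard_normal(n)

x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + eps[t]   # AR(1) with |phi| < 1: stationary
y = np.cumsum(eps)                   # random walk: unit root, non-stationary

for name, series in (("AR(1)", x), ("random walk", y)):
    stat, pvalue = adfuller(series)[:2]
    # The ADF null hypothesis is a unit root, so a p-value above 0.05
    # means non-stationarity cannot be rejected at the 5% level.
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
```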
Example 1

Let $Y$ be any scalar random variable, and define a time series $\{X_t\}$ by $X_t = Y$ for all $t$. Then $\{X_t\}$ is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by $Y$, rather than taking the expected value of $Y$. The time average of $X_t$ does not converge since the process is not ergodic.
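The failure of ergodicity is easy to see numerically: within any single realisation the time average equals whatever constant $Y$ happened to take, not $\operatorname{E}[Y]$. A minimal sketch, assuming for illustration that $Y$ is standard normal:

```python
# Minimal sketch of Example 1: X_t = Y for all t, with Y ~ N(0, 1) assumed.
import numpy as np

rng = np.random.default_rng(1)

for realisation in range(3):
    Y = rng.standard_normal()      # one draw of Y fixes the whole path
    X = np.full(1000, Y)           # X_t = Y for every t
    # The time average is exactly Y, not E[Y] = 0: the process is
    # stationary but not ergodic, so time and ensemble averages differ.
    print(f"realisation {realisation}: time average = {X.mean():+.3f}")
```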
Example 2

As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let $Y$ have a uniform distribution on $[0, 2\pi]$ and define the time series $\{X_t\}$ by

$X_t = \cos(t + Y)$ for $t \in \mathbb{R}$.

Then $\{X_t\}$ is strictly stationary since $(t + Y)$ modulo $2\pi$ follows the same uniform distribution as $Y$ for any $t$.
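Stationarity here can be checked empirically by sampling many realisations of $Y$ and comparing the marginal moments of $X_t$ at different times; they match because the marginal distribution does not depend on $t$. A small sketch of this check:

```python
# Sketch of Example 2: X_t = cos(t + Y), with Y ~ Uniform[0, 2*pi].
import numpy as np

rng = np.random.default_rng(2)
Y = rng.uniform(0.0, 2.0 * np.pi, size=100_000)  # one Y per realisation

for t in (0.0, 1.0, 7.3):
    X_t = np.cos(t + Y)
    # The marginal mean and variance are the same at every t
    # (mean ~ 0, variance ~ 1/2), reflecting strict stationarity.
    print(f"t = {t}: mean = {X_t.mean():+.4f}, var = {X_t.var():.4f}")
```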
Example 3

Keep in mind that a weakly white noise is not necessarily strictly stationary. Let $\omega$ be a random variable uniformly distributed in the interval $(0, 2\pi)$ and define the time series $\{z_t\}$ by

$z_t = \cos(t\omega)$  $(t = 1, 2, \ldots)$.

Then

$\operatorname{E}(z_t) = \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\, d\omega = 0,$
$\operatorname{Var}(z_t) = \frac{1}{2\pi}\int_0^{2\pi} \cos^2(t\omega)\, d\omega = \tfrac{1}{2},$
$\operatorname{Cov}(z_t, z_j) = \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\cos(j\omega)\, d\omega = 0 \quad \text{for all } t \neq j.$

So $\{z_t\}$ is a white noise in the weak sense (the mean and cross-covariances are zero, and the variances are all the same); however, it is not strictly stationary.
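These moments are straightforward to confirm by Monte Carlo. The sketch below also probes strict stationarity through a higher-order statistic: for this process $\operatorname{E}[z_t^2 z_{t+1}]$ works out to $1/4$ for $t = 1$ but $0$ for $t = 2$, a shift-dependence that a strictly stationary process could not exhibit.

```python
# Monte Carlo check of Example 3: z_t = cos(t * omega), omega ~ U(0, 2*pi).
import numpy as np

rng = np.random.default_rng(3)
omega = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
z = {t: np.cos(t * omega) for t in (1, 2, 3)}

# Weak white-noise properties: zero mean, variance 1/2, zero covariance.
print(f"E[z_1]        ~ {z[1].mean():+.4f}   (exact: 0)")
print(f"Var[z_1]      ~ {z[1].var():.4f}    (exact: 0.5)")
print(f"Cov[z_1, z_2] ~ {np.mean(z[1] * z[2]):+.4f}   (exact: 0)")

# Strict stationarity would force E[z_t^2 z_{t+1}] to be the same for
# every t, but here it is 1/4 at t = 1 and 0 at t = 2.
print(f"E[z_1^2 z_2]  ~ {np.mean(z[1]**2 * z[2]):+.4f}   (exact: 0.25)")
print(f"E[z_2^2 z_3]  ~ {np.mean(z[2]**2 * z[3]):+.4f}   (exact: 0)")
```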
Nth-order stationarity

In Eq.1, the distribution of $n$ samples of the stochastic process must be equal to the distribution of the samples shifted in time for all $n$. N-th-order stationarity is a weaker form of stationarity where this is only requested for all $n$ up to a certain order $N$. A random process $\{X_t\}$ is said to be N-th-order stationary[1]: p. 152 if:

$F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau}) = F_X(x_{t_1},\ldots,x_{t_n})$ for all $\tau, t_1,\ldots,t_n \in \mathbb{R}$ and for all $n \in \{1,\ldots,N\}$.  (Eq.2)

Weak or wide-sense stationarity

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the first moment (i.e. the mean) and the autocovariance do not vary with respect to time, and that the second moment is finite for all times. A continuous-time random process $\{X_t\}$ which is WSS has the following restrictions on its mean function $m_X(t) \triangleq \operatorname{E}[X_t]$ and autocovariance function $K_{XX}(t_1,t_2) \triangleq \operatorname{E}[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]$:

$m_X(t) = m_X(t+\tau)$ for all $\tau, t \in \mathbb{R}$,
$K_{XX}(t_1,t_2) = K_{XX}(t_1-t_2, 0)$ for all $t_1, t_2 \in \mathbb{R}$,
$\operatorname{E}[|X_t|^2] < \infty$ for all $t \in \mathbb{R}$.  (Eq.3)

The first property implies that the mean function $m_X(t)$ must be constant. The second property implies that the autocovariance function depends only on the difference between $t_1$ and $t_2$[1]: p. 159 and only needs to be indexed by one variable rather than two. Thus, instead of writing $K_{XX}(t_1-t_2, 0)$, the notation is often abbreviated by the substitution $\tau = t_1 - t_2$:

$K_{XX}(\tau) \triangleq K_{XX}(t_1-t_2, 0).$

This also implies that the autocorrelation depends only on $\tau = t_1 - t_2$, that is

$R_X(t_1,t_2) = R_X(t_1-t_2, 0) \triangleq R_X(\tau).$

The third property says that the second moments must be finite for any time $t$.
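In discrete time, the abbreviated notation corresponds to estimating the autocovariance as a function of the lag alone. A minimal sketch, using a simulated AR(1) process as an assumed WSS example:

```python
# Sketch: estimate K_XX(tau) for a discrete-time WSS process.
import numpy as np

rng = np.random.default_rng(4)
n, phi = 10_000, 0.8
x = np.zeros(n)
for t in range(1, n):                 # stationary AR(1), unit-variance noise
    x[t] = phi * x[t - 1] + rng.standard_normal()

def autocovariance(x, tau):
    """Sample estimate of K_XX(tau) = E[(X_t - m)(X_{t+tau} - m)]."""
    xc = x - x.mean()
    return np.mean(xc[: len(x) - tau] * xc[tau:])

# For a WSS process the estimate depends only on the lag tau, not on t;
# for this AR(1), theory gives K_XX(tau) = phi**tau / (1 - phi**2).
for tau in range(4):
    print(f"K_XX({tau}) ~ {autocovariance(x, tau):.3f}, "
          f"theory ~ {phi**tau / (1 - phi**2):.3f}")
```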
Motivation

The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let H be the Hilbert space generated by $\{x(t)\}$ (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure $\mu$ on the real line such that H is isomorphic to the Hilbert subspace of $L^2(\mu)$ generated by $\{e^{-2\pi i \xi \cdot t}\}$. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process $\omega_\xi$ with orthogonal increments such that, for all $t$,

$X_t = \int e^{-2\pi i \lambda \cdot t} \, d\omega_\lambda,$
where the integral on the right-hand side is interpreted in a suitable (Riemann) sense.
The same result holds for a discrete-time stationary process, with the spectral measure
now defined on the unit circle.
When processing WSS random signals with linear, time-invariant (LTI) filters, it is
helpful to think of the correlation function as a linear operator. Since it is a circulant
operator (depends only on the difference between the two arguments), its
eigenfunctions are the Fourier complex exponentials. Additionally, since the
eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS
random signals is highly tractable—all computations can be performed in the frequency
domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
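A short numerical sketch of this tractability, under assumed signals: passing a white-noise (hence WSS) input through an FIR filter and checking that the output power spectral density is $|H(f)|^2$ times the input's, as the frequency-domain picture predicts. It uses scipy.signal; the filter and tolerances are illustrative choices.

```python
# Sketch: LTI filtering of a WSS signal, checked in the frequency domain.
# Prediction: output PSD = |H(f)|^2 * input PSD.
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
x = rng.standard_normal(200_000)        # white noise: flat PSD (WSS input)
b = np.array([0.25, 0.5, 0.25])         # simple FIR low-pass filter
y = signal.lfilter(b, 1.0, x)           # LTI processing of the WSS input

f, Pxx = signal.welch(x, nperseg=1024)  # estimated input PSD
_, Pyy = signal.welch(y, nperseg=1024)  # estimated output PSD
_, H = signal.freqz(b, 1.0, worN=f, fs=1.0)  # filter frequency response

# The ratio Pyy / Pxx should track |H(f)|^2 across all frequencies.
err = np.max(np.abs(Pyy / Pxx - np.abs(H) ** 2))
print(f"max |Pyy/Pxx - |H|^2| ~ {err:.3f}")
```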
Definition for complex stochastic process

In the case where $\{X_t\}$ is a complex stochastic process, the autocovariance function is defined as $K_{XX}(t_1,t_2) = \operatorname{E}[(X_{t_1}-m_X(t_1))\overline{(X_{t_2}-m_X(t_2))}]$ and, in addition to the requirements in Eq.3, it is required that the pseudo-autocovariance function $J_{XX}(t_1,t_2) = \operatorname{E}[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]$ depends only on the time lag. In summary, $\{X_t\}$ is WSS if:

$m_X(t) = m_X(t+\tau)$ for all $\tau, t \in \mathbb{R}$,
$K_{XX}(t_1,t_2) = K_{XX}(t_1-t_2, 0)$ for all $t_1, t_2 \in \mathbb{R}$,
$J_{XX}(t_1,t_2) = J_{XX}(t_1-t_2, 0)$ for all $t_1, t_2 \in \mathbb{R}$,
$\operatorname{E}[|X(t)|^2] < \infty$ for all $t \in \mathbb{R}$.  (Eq.4)
Joint stationarity

The concept of stationarity may be extended to two stochastic processes.

Joint strict-sense stationarity

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly strict-sense stationary if their joint cumulative distribution $F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t_1'},\ldots,y_{t_n'})$ remains unchanged under time shifts, i.e. if

$F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t_1'},\ldots,y_{t_n'}) = F_{XY}(x_{t_1+\tau},\ldots,x_{t_m+\tau}, y_{t_1'+\tau},\ldots,y_{t_n'+\tau})$ for all $\tau, t_1,\ldots,t_m, t_1',\ldots,t_n' \in \mathbb{R}$ and for all $m, n \in \mathbb{N}$.  (Eq.5)

Joint (M + N)th-order stationarity

Two random processes $\{X_t\}$ and $\{Y_t\}$ are said to be jointly (M + N)-th-order stationary[1]: p. 159 if:

$F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t_1'},\ldots,y_{t_n'}) = F_{XY}(x_{t_1+\tau},\ldots,x_{t_m+\tau}, y_{t_1'+\tau},\ldots,y_{t_n'+\tau})$ for all $\tau, t_1,\ldots,t_m, t_1',\ldots,t_n' \in \mathbb{R}$ and for all $m \in \{1,\ldots,M\}$, $n \in \{1,\ldots,N\}$.  (Eq.6)
Joint weak or wide-sense stationarity

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly wide-sense stationary if they are both wide-sense stationary and their cross-covariance function $K_{XY}(t_1,t_2) = \operatorname{E}[(X_{t_1}-m_X(t_1))(Y_{t_2}-m_Y(t_2))]$ depends only on the time difference $\tau = t_1 - t_2$.
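Empirically, joint wide-sense stationarity means the sample cross-covariance should depend only on the lag. A minimal sketch with two artificial series, where $Y$ is assumed to be a noisy copy of $X$ delayed by two steps, so the cross-covariance peaks at $\tau = 2$:

```python
# Sketch: sample cross-covariance K_XY(tau) of two jointly WSS series.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
x = rng.standard_normal(n)
# y_t ~ x_{t-2} + noise (np.roll wraps at the boundary; negligible here)
y = np.roll(x, 2) + 0.5 * rng.standard_normal(n)

def cross_covariance(x, y, tau):
    """Estimate K_XY(tau) = E[(X_t - m_X)(Y_{t+tau} - m_Y)]."""
    xc, yc = x - x.mean(), y - y.mean()
    return np.mean(xc[: n - tau] * yc[tau:])

# For jointly WSS processes this depends only on tau, not on t; here the
# cross-covariance peaks near 1 at tau = 2, the lag built into y.
for tau in range(5):
    print(f"K_XY({tau}) ~ {cross_covariance(x, y, tau):+.3f}")
```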
Other terminology
The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

● Priestley uses stationary up to order m if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order m.[3][4]
● Honarkhah and Caers also use the assumption of stationarity in the context of multiple-point geostatistics, where higher n-point statistics are assumed to be stationary in the spatial domain.[5]
Differencing

One way to make some time series stationary is to compute the differences between consecutive observations. This is known as differencing. Differencing can help stabilize the mean of a time series by removing changes in the level of a time series, and so eliminating trends. It can also remove seasonality, if differences are taken appropriately (e.g. differencing observations 1 year apart to remove a yearly trend). Transformations such as logarithms can help to stabilize the variance of a time series.

One way of identifying non-stationary time series is the ACF plot. Sometimes, patterns will be more visible in the ACF plot than in the original time series; however, this is not always the case.[6]
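A minimal sketch of differencing in practice: a random walk fails the ADF test, while its first difference (white noise) passes. Assumes numpy and statsmodels:

```python
# Sketch: first-differencing a random walk to obtain a stationary series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)
walk = np.cumsum(rng.standard_normal(500))  # unit-root (non-stationary)
diff = np.diff(walk)                        # first differences: white noise

for name, series in (("random walk", walk), ("differenced", diff)):
    stat, pvalue = adfuller(series)[:2]
    print(f"{name}: ADF p-value = {pvalue:.3f}")

# Seasonal differencing would instead subtract observations one season
# apart, e.g. series[12:] - series[:-12] for monthly data with a yearly cycle.
```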
See also
● Lévy process
● Stationary ergodic process
● Wiener–Khinchin theorem
● Ergodicity
● Statistical regularity
● Autocorrelation
● Whittle likelihood
References

1. Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
2. Florescu, Ionut (7 November 2014). Probability and Stochastic Processes. John Wiley & Sons. ISBN 978-1-118-59320-2.
3. Priestley, M. B. (1981). Spectral Analysis and Time Series. Academic Press. ISBN 0-12-564922-3.
4. Priestley, M. B. (1988). Non-linear and Non-stationary Time Series Analysis. Academic Press. ISBN 0-12-564911-8.
5. Honarkhah, M.; Caers, J. (2010). "Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling". Mathematical Geosciences. 42 (5): 487–517. Bibcode:2010MatGe..42..487H. doi:10.1007/s11004-010-9276-7.
6. "8.1 Stationarity and differencing". Forecasting: Principles and Practice. OTexts. Retrieved 2016-05-18.
Further reading

● Enders, Walter (2010). Applied Econometric Time Series (Third ed.). New York: Wiley. pp. 53–57. ISBN 978-0-470-50539-7.
● Jestrovic, I.; Coyle, J. L.; Sejdic, E. (2015). "The effects of increased fluid viscosity on stationary characteristics of EEG signal in healthy adults". Brain Research. 1589: 45–53. doi:10.1016/j.brainres.2014.09.035. PMC 4253861. PMID 25245522.
● Hyndman, Rob J.; Athanasopoulos, George (2013). Forecasting: Principles and Practice. OTexts. https://www.otexts.org/fpp/8/1
External links
● Spectral decomposition of a random function (Springer)