SP5 Stoch Processes

This document provides an overview of stochastic processes. It defines stochastic processes as sequences of random variables where the ordering is important. Stochastic processes can be described using probability distributions like probability mass functions or probability density functions. For independent and identically distributed processes, the joint distribution is the product of the individual distributions. Examples of stochastic processes discussed include the Bernoulli process and Gaussian processes.


Stochastic processes – Lecture 5:

Stochastic processes – Ch. 13

Dr. Ir. Richard C. Hendriks

1
From scalars to stochastic processes

[Figure: mapping from the sample space S. A random variable maps an outcome s to a number, X(s) = X; a stochastic process maps an outcome s to a function, X(t, s) = X(t).]

One experiment ⇒ one outcome or realization of the stochastic/random process.

2
Stochastic Processes
Stochastic process:
• Stochastic: random
• Process: sequence of variables where the ordering is of importance.

[Figure: mapping from the sample space S, as on the previous slide: X(s) = X for a random variable, X(t, s) = X(t) for a stochastic process.]

Random variable: mapping from an outcome in the sample space to a real number.

Stochastic process: mapping from an outcome in the sample space to a function (depending on an ordering variable like time or space).

3
Stochastic Processes
• The stochastic process X(t).
• Sample function x(t, s1) is one particular realisation (outcome s1) of this process.

[Figure from R. D. Yates & D. J. Goodman, "Probability and Stochastic Processes", 3rd ed.]

4


Ensemble of a Stochastic Process

[Figure: the ensemble of sample functions. From R. D. Yates & D. J. Goodman, "Probability and Stochastic Processes", 3rd ed.]

The ensemble of a stochastic process: the set of all possible time functions that can result from an experiment.

5
Why stochastic processes? Remember Lecture 1

[Block diagram: input X(t) passes through a linear time-invariant (LTI) system with impulse response h(n), h(t), producing output Y(t).]
6
Why stochastic processes? Remember Lecture 1

[Block diagram: X(t) → LTI system with impulse response h(n), h(t) → Y(t).]

Statistical descriptions of X(t):

• mean µ_X
• autocorrelation function R_X(τ)

Statistical descriptions of Y(t):

• mean µ_Y = µ_X ∫ h(t) dt
• R_Y(τ) = h(τ) * h(−τ) * R_X(τ)
7
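As a quick illustration of the input–output mean relation, here is a minimal discrete-time sketch (not taken from the slides; the impulse response and input statistics are arbitrary choices). The sample mean of the filter output should approach µ_X Σ h[n], the discrete-time analogue of µ_Y = µ_X ∫ h(t) dt.

```python
import numpy as np

# Discrete-time sketch: a WSS input with known mean mu_X is passed through an
# LTI (FIR) filter h[n]; the output mean should approach mu_X * sum(h[n]),
# the discrete-time analogue of mu_Y = mu_X * integral of h(t) dt.
rng = np.random.default_rng(0)

mu_X = 2.0
x = mu_X + rng.normal(0.0, 1.0, size=200_000)   # input X[n]: mean 2, unit-variance noise
h = np.array([0.5, 0.3, 0.2])                   # arbitrary impulse response (assumption)

y = np.convolve(x, h, mode="valid")             # output Y[n] of the LTI system

print("sample mean of Y[n]:", y.mean())         # ~ 2.0
print("mu_X * sum(h)      :", mu_X * h.sum())   # = 2.0
```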
Notation
Classification by time axis and by amplitude of x(t, s):

• continuous time & continuous value: X(t)
• continuous time & discrete value: X(t)
• discrete time & continuous value: X(nT) = X_n
• discrete time & discrete value: X(nT) = X_n
8
Speech signals, weather, stock market

9
Example speech

• Experiment s: pronounce 'ssss'.

[Figure: three sample functions x(t, s1), x(t, s2), x(t, s3) of the recorded waveform, together forming the ensemble.]

• Sample function: one realisation of the waveform x(t, s).
10
Random sinusoidal process

• Amplitude A and phase Φ are random variables.

• Random process: X_n = A sin(2πfn + Φ)
11
Arrival times of packets in a network
[Figure: three sample functions x(t, s1), x(t, s2), x(t, s3) of packet arrival times plotted against t.]
12
Random Telegraph Signal
[Figure: three sample functions x(t, s1), x(t, s2), x(t, s3) of a random telegraph signal plotted against t.]
13
Binary bit pattern

14
Handwritten digits

Model trajectories, gestures

15
Description of a random process

[Figure: sample functions x(t, s1), x(t, s2), x(t, s3), with the random variables X(t1) and X(t2) marked at two time instances.]

• How to describe a stochastic process at one time instance, e.g., X(t1)?

• How to describe a stochastic process at multiple time instances, e.g., [X(t1), X(t2)]^T?

16
Description of a random process
As for RVs, we can use the pdf/pmf to describe stochastic processes:

• At any (fixed) time tk the stochastic process can be regarded as a random variable:

  X(tk) → f_{X(tk)}(x_{tk})

• This pdf may be different for each tk!

• The joint behaviour for multiple time instances t1, ..., tk is given by the joint pdf:

  [X(t1), ..., X(tk), ...]^T → f_{X(t1), X(t2), ..., X(tk), ...}(x_{t1}, x_{t2}, ..., x_{tk}, ...)

17
Description of a random process - example

Let X(t) = R |cos(2πft)| with

  f_R(r) = (1/10) e^{−r/10}   for r ≥ 0
           0                  otherwise

• What is f_{X(t)}(x)?
• First calculate the CDF F_{X(t)}(x).
• Then calculate the pdf f_{X(t)}(x) = dF_{X(t)}(x)/dx.
18
Description of a random process - example
• F_{X(t)}(x) = P[X(t) ≤ x] = P[R |cos(2πft)| ≤ x] = P[R ≤ x / |cos(2πft)|]
  = ∫_{0}^{x/|cos(2πft)|} f_R(r) dr = 1 − e^{−x / (10 |cos(2πft)|)}

• When cos(2πft) ≠ 0:

  F_{X(t)}(x) = 0                                 for x < 0
                1 − e^{−x / (10 |cos(2πft)|)}     for x ≥ 0

• When cos(2πft) ≠ 0:

  f_{X(t)}(x) = dF_{X(t)}(x)/dx = 0                                                  for x < 0
                                  (1 / (10 |cos(2πft)|)) e^{−x / (10 |cos(2πft)|)}   for x ≥ 0

• When cos(2πft) = 0, X(t) = 0 and f_{X(t)}(x) = δ(x).

19
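A Monte Carlo sanity check of this result (a sketch, not part of the original slides; the values of f and t below are arbitrary): sample R from the exponential pdf above, form X(t) = R |cos(2πft)| for a fixed t, and compare the empirical CDF with the derived 1 − e^{−x/(10|cos(2πft)|)}.

```python
import numpy as np

# Monte Carlo check of F_X(t)(x) = 1 - exp(-x / (10 |cos(2*pi*f*t)|)).
# f = 1 and t = 0.1 are arbitrary (any t with cos(2*pi*f*t) != 0 works).
rng = np.random.default_rng(1)

f, t = 1.0, 0.1
c = abs(np.cos(2 * np.pi * f * t))

r = rng.exponential(scale=10.0, size=500_000)   # R with f_R(r) = (1/10) exp(-r/10)
x = r * c                                       # samples of X(t) = R |cos(2*pi*f*t)|

for q in (2.0, 5.0, 10.0, 20.0):
    empirical = np.mean(x <= q)                 # empirical CDF at q
    analytic = 1.0 - np.exp(-q / (10.0 * c))    # derived CDF at q
    print(f"x = {q:5.1f}   empirical {empirical:.4f}   analytic {analytic:.4f}")
```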
Notice that…
[X(t1), ..., X(tk), ...]^T → f_{X(t1), X(t2), ..., X(tk), ...}(x_{t1}, x_{t2}, ..., x_{tk}, ...)

resembles a vector random variable

• ... but can be of infinite dimensionality
• ... and the ordering (in time) of the X(tk)'s is essential

With the exception of a few special cases, the joint pdf

f_{X(t1), X(t2), ..., X(tk), ...}(x_{t1}, x_{t2}, ..., x_{tk}, ...)

is very difficult to get in practice. Exceptions:

• iid random sequence/process
• Gaussian stochastic process
• Poisson stochastic process

20

IID Random Sequences
• IID means: Independent, Identically Distributed
• This means that
  – all X(tk) are mutually independent random variables for all tk
  – all X(tk) have the same pdf for all tk

Let Xn denote a (time-discrete) iid random sequence with sample vector X = [X_{n1}, ..., X_{nk}].

• For a discrete-valued Xn, the joint PMF is thus given by

  P_X(x) = P_{X1}(x1) ··· P_{Xk}(xk) = ∏_{i=1}^{k} P_X(x_i)

21
IID Random Sequences

• For a continuous-valued Xn, the joint pdf is thus given by

  f_X(x) = f_{X1}(x1) ··· f_{Xk}(xk) = ∏_{i=1}^{k} f_X(x_i)
22
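A small sketch of how this factorisation is used in practice (the standard-normal marginal is an assumption made for illustration only; the slides do not fix a particular pdf): the joint density of an iid sample is the product of the marginal densities, usually evaluated as a sum of log-densities.

```python
import numpy as np

# Sketch: joint pdf of an iid sample as the product of its marginals.
# The standard-normal marginal is an assumption made for illustration only.
rng = np.random.default_rng(2)
x = rng.normal(size=5)                                  # one sample vector [x_1, ..., x_5]

log_marginals = -0.5 * x**2 - 0.5 * np.log(2 * np.pi)   # log f_X(x_i) for N(0, 1)
print("joint pdf f_X(x) =", np.exp(log_marginals.sum()))
```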
Bernoulli Process (time discrete)
One realisation of a Bernoulli process (e.g., a packet is present or not).

PMF for one time instance k:

  P_{Xk}(xk) = p        for xk = 1
               1 − p    for xk = 0
               0        otherwise

23
Bernoulli Process (time discrete)
Notice that the PMF

  P_{Xk}(xk) = p        for xk = 1
               1 − p    for xk = 0
               0        otherwise

can be rewritten as

  P_{Xk}(xk) = p^{xk} (1 − p)^{1 − xk}   for xk = 0, 1
               0                         otherwise

For two time instances t1 and t2 we then get (IID process!)

  P_{X1,X2}(x1, x2) = P_{X1}(x1) P_{X2}(x2) = p^{x1} (1 − p)^{1 − x1} · p^{x2} (1 − p)^{1 − x2}
                    = p^{x1 + x2} (1 − p)^{2 − x1 − x2}

24
Bernoulli Process (time discrete)

For k time instances we thus get

  P_X(x) = ∏_{i=1}^{k} p^{xi} (1 − p)^{1 − xi} = p^{x1 + ··· + xk} (1 − p)^{k − (x1 + ··· + xk)}
25
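The closed form can be checked numerically (a sketch; the values p = 0.3 and k = 8 are arbitrary choices): the product of the per-sample PMFs equals p^{x1+···+xk}(1−p)^{k−(x1+···+xk)} for any realisation.

```python
import numpy as np

# Sketch: joint PMF of one realisation of a Bernoulli process, evaluated both as
# the product of the marginal PMFs and via the closed form above.
rng = np.random.default_rng(3)

p, k = 0.3, 8                                    # assumed values for illustration
x = rng.binomial(1, p, size=k)                   # one realisation x_1, ..., x_k

per_sample = p**x * (1 - p)**(1 - x)             # P_Xi(x_i) for each i
closed_form = p**x.sum() * (1 - p)**(k - x.sum())

print("product of marginals:", per_sample.prod())
print("closed form         :", closed_form)     # identical values
```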
Gaussian Process
Gaussian processes occur quite often in nature (remember the central limit theorem!).

The process X(t) is a Gaussian stochastic process (sequence) if and only if X = [X(t1) ··· X(tk)]^T is a Gaussian random vector for any integer k > 0 and any set of time instances t1, t2, ..., tk.

The pdf?

[Figure: sample functions x(t, s1), x(t, s2), x(t, s3) of a Gaussian process.]

26
Gaussian Process

Remember that for Gaussian random vectors (lecture 1) X = [X1, X2, ..., XN]^T we have:

  f_X(x) = exp[ −(1/2) (x − E[X])^T C_X^{−1} (x − E[X]) ] / ( (2π)^{N/2} det(C_X)^{1/2} )

For a Gaussian stochastic process X(t), the distribution of X = [X(t1), X(t2), ..., X(tk)]^T is thus also given by f_X(x).
27
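A sketch of how sample functions of a Gaussian process can be generated on a finite time grid: pick time instances t1, ..., tk, build the mean vector and covariance matrix C_X, and draw from the resulting Gaussian random vector. The zero mean and squared-exponential covariance below are only assumed examples; the slides do not prescribe any particular C_X.

```python
import numpy as np

# Sketch: draw sample functions of a Gaussian process on a finite time grid by
# sampling the Gaussian random vector [X(t_1), ..., X(t_k)].
# The zero mean and squared-exponential covariance are assumptions for illustration.
rng = np.random.default_rng(4)

t = np.linspace(0.0, 5.0, 200)                   # time instances t_1, ..., t_k
mean = np.zeros_like(t)

length_scale = 0.5
C = np.exp(-0.5 * ((t[:, None] - t[None, :]) / length_scale) ** 2)
C += 1e-9 * np.eye(t.size)                       # jitter for numerical stability

samples = rng.multivariate_normal(mean, C, size=3)   # three sample functions x(t, s_i)
print(samples.shape)                             # (3, 200)
```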
Characterization of Stochastic Processes
Remember from lecture 1:
• The pdf/pmf specifies the complete probability model.
• In practice the pdf/pmf is often unknown. We therefore often use parameters that characterize the behaviour of the RVs:
  – Expected value: E[X]
  – Variance: Var[X]
  – Covariance between two RVs: Cov(X, Y)
• We apply these (known) concepts to stochastic processes.
• Notice that as the pdf of the stochastic process X(t) can be time dependent, its descriptors can be time dependent as well!
28
Expected value of a stochastic process

[Figure: sample functions x(t, s1), x(t, s2), x(t, s3), with the random variables X(t1) and X(t2) marked at two time instances.]

Expected value of X(tk) at time tk:

  E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx

29
Example sinusoidal process

X(t) = A sin(ωt + Φ)

• Amplitude A and phase Φ are independent RVs.
• A is uniformly distributed on [−1, +1].
• Φ is uniformly distributed on [0, 2π].
30
Example sinusoidal process
• Expected value of this process:

  E[X(t)] = E[A sin(ωt + Φ)] = E[A] E[sin(ωt + Φ)]

  E[sin(ωt + Φ)] = ∫_{0}^{2π} sin(ωt + φ) f_Φ(φ) dφ = (1/2π) ∫_{0}^{2π} sin(ωt + φ) dφ = 0

  E[A] = ∫_{−1}^{1} a f_A(a) da = (1/2) ∫_{−1}^{1} a da = 0
31
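A quick Monte Carlo check of E[X(t)] = 0 (a sketch; the chosen ω and t are arbitrary):

```python
import numpy as np

# Monte Carlo check that E[X(t)] = 0 for X(t) = A sin(w*t + Phi),
# A ~ Uniform[-1, 1], Phi ~ Uniform[0, 2*pi] (w and t below are arbitrary).
rng = np.random.default_rng(5)

w, t, n = 2 * np.pi, 0.37, 1_000_000
A = rng.uniform(-1.0, 1.0, size=n)
Phi = rng.uniform(0.0, 2 * np.pi, size=n)

x_t = A * np.sin(w * t + Phi)                    # samples of the random variable X(t)
print("sample mean of X(t):", x_t.mean())        # close to 0
```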
Another Example 1
  X(t) = 0       for t < 0         f_{At}(x) = 1   for 0 ≤ x ≤ 1
         At·t    for t ≥ 0                     0   otherwise

• Random variables At are stochastically independent for all t.

[Figure: one realisation x(t, s).]
32
Another Example 1
  X(t) = 0       for t < 0         f_{At}(x) = 1   for 0 ≤ x ≤ 1
         At·t    for t ≥ 0                     0   otherwise

• Random variables At are stochastically independent for all t.

• For t ≥ 0, E[X(t)] = E[At·t] = t E[At] = 0.5t
• For t ≥ 0, Var[X(t)] = Var[At·t] = t² Var[At] = t²/12

For t < 0, E[X(t)] = 0 and Var[X(t)] = 0.
33
Another Example 2
Assume that the amplitude is still a random variable, but it does NOT depend on time t:

  X(t) = 0      for t < 0          f_A(x) = 1   for 0 ≤ x ≤ 1
         A·t    for t ≥ 0                   0   otherwise

• What do the realizations of this process look like? (See the sketch below.)
• What are the expected value and variance of this process?
34
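A sketch that contrasts the two examples on a discrete time grid (the grid itself is an arbitrary choice): Example 1 redraws A_t at every time instance, so each realisation fluctuates inside the widening band [0, t]; Example 2 draws a single A per realisation, so each realisation is a straight ramp with a random slope (with E[X(t)] = 0.5t and Var[X(t)] = t²/12, by the same calculation as in Example 1).

```python
import numpy as np

# Sketch: one sample function of each process on a discrete time grid.
# Example 1: X(t) = A_t * t, a fresh independent A_t ~ Uniform[0, 1] per time instance.
# Example 2: X(t) = A * t,   a single A ~ Uniform[0, 1] drawn once per realisation.
rng = np.random.default_rng(6)
t = np.linspace(0.0, 10.0, 101)

x_example1 = rng.uniform(0.0, 1.0, size=t.size) * t   # jagged, fluctuates inside [0, t]
x_example2 = rng.uniform(0.0, 1.0) * t                # straight ramp with a random slope

print("Example 1 (last 5 samples):", np.round(x_example1[-5:], 3))
print("Example 2 (last 5 samples):", np.round(x_example2[-5:], 3))
```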
Autocovariance

[Figure: sample functions x(t, s1), x(t, s2), x(t, s3), with X(t1) and X(t2) = X(t1 + τ) marked.]

How can we express how much X(t1) says about X(t2)?

Remember the covariance Cov(X, Y) between RV X and RV Y.

• The covariance Cov(X, Y) indicates how much information X provides about Y. (When |Cov(X, Y)| is large, an observation of X provides accurate information about Y.)

35
Autocovariance

[Figure: sample functions x(t, s1), x(t, s2), x(t, s3), with X(t1) and X(t2) = X(t1 + τ) marked.]

• In the same way, Cov(X(t1), X(t2)) indicates how much the process is likely to change from t1 to t2.
• High covariance: sample function unlikely to change.
• Zero covariance: sample function expected to change rapidly.

36
Autocovariance and Autocorrelation
The covariance of a stochastic process at two different time instances is called the "autocovariance":

  C_X(t, τ) = Cov[X(t), X(t + τ)]
            = E[(X(t) − E[X(t)]) (X(t + τ) − E[X(t + τ)])]
            = E[X(t) X(t + τ)] − E[X(t)] E[X(t + τ)]

where E[X(t) X(t + τ)] is similar to the cross-correlation E[XY].

The autocorrelation is then defined as

  R_X(t, τ) = E[X(t) X(t + τ)]

  C_X(t, τ) = R_X(t, τ) − E[X(t)] E[X(t + τ)]

37
Autocovariance and Autocorrelation
The covariance of a stochastic process at two different time instances is called the "autocovariance":

  C_X(t, τ) = Cov[X(t), X(t + τ)]
            = E[(X(t) − E[X(t)]) (X(t + τ) − E[X(t + τ)])]
            = E[X(t) X(t + τ)] − E[X(t)] E[X(t + τ)]

What is C_X(t, τ = 0)?

  C_X(t, 0) = E[X(t) X(t)] − E[X(t)] E[X(t)] = Var[X(t)]
38
The Autocorrelation Function
Notice that the exact formulation of the autocorrelation depends on whether time and amplitudes are discrete or continuous:

  R_X(t, τ) = E[X(t) X(t + τ)] = ∫∫ x y f_{X(t) X(t+τ)}(x, y) dx dy

  R_X(n, k) = E[X(n) X(n + k)] = E[X_n X_{n+k}] = ∫∫ x y f_{X_n X_{n+k}}(x, y) dx dy

  R_X(n, k) = E[X_n X_{n+k}] = Σ_x Σ_y x y P[X_n = x, X_{n+k} = y]

  R_X(t, τ) = E[X(t) X(t + τ)] = Σ_x Σ_y x y P[X(t) = x, X(t + τ) = y]
39
Remember: Example sinusoidal process

X(t) = A sin(ωt + Φ)

• Amplitude A and phase Φ are independent RVs.
• A is uniformly distributed on [−1, +1].
• Φ is uniformly distributed on [0, 2π].
40
Remember: Example sinusoidal process
  R_X(t, τ) = E[X(t) X(t + τ)]
            = E[A² sin(ωt + Φ) sin(ω(t + τ) + Φ)]
            = E[A²] E[sin(ωt + Φ) sin(ω(t + τ) + Φ)]
            = ( ∫_{−1}^{1} (a²/2) da ) E[sin(ωt + Φ) sin(ω(t + τ) + Φ)]
              (Q: what is the variable over which we integrate here?)
            = (1/3) E[sin(ωt + Φ) sin(ω(t + τ) + Φ)]
            = (1/6) E[cos(ωτ) − cos(2ωt + ωτ + 2Φ)]
            = (1/6) cos(ωτ) − (1/6) E[cos(2ωt + ωτ + 2Φ)]   (second term = 0)
            = (1/6) cos(ωτ)

Here we use sin(A) sin(B) = (1/2)[cos(A − B) − cos(A + B)].
Remember: Example sinusoidal process
So we found: R_X(t, τ) = (1/6) cos(ωτ)

• High correlation for ωτ = 0.
• Very negative correlation for ωτ = ±π.
• Zero correlation for ωτ = ±π/2.
42
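A Monte Carlo check of this result (a sketch; ω = 2π and t = 0.3 are arbitrary choices): the ensemble average of X(t) X(t + τ) should match cos(ωτ)/6 and should not depend on t.

```python
import numpy as np

# Monte Carlo check of R_X(t, tau) = (1/6) cos(w * tau) for
# X(t) = A sin(w*t + Phi), A ~ Uniform[-1, 1], Phi ~ Uniform[0, 2*pi].
rng = np.random.default_rng(7)

w, n = 2 * np.pi, 1_000_000
A = rng.uniform(-1.0, 1.0, size=n)
Phi = rng.uniform(0.0, 2 * np.pi, size=n)

t = 0.3                                          # the estimate should not depend on t
for tau in (0.0, 0.125, 0.25, 0.5):
    R_est = np.mean(A * np.sin(w * t + Phi) * A * np.sin(w * (t + tau) + Phi))
    print(f"tau = {tau:5.3f}   estimate {R_est:+.4f}   cos(w*tau)/6 = {np.cos(w * tau) / 6:+.4f}")
```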
Remember: Another Example 1
  X(t) = 0       for t < 0         f_{At}(x) = 1   for 0 ≤ x ≤ 1
         At·t    for t ≥ 0                     0   otherwise

• Random variables At are stochastically independent for all t.

43
Remember: Another Example 1
  X(t) = 0       for t < 0         f_{At}(x) = 1   for 0 ≤ x ≤ 1
         At·t    for t ≥ 0                     0   otherwise

• Random variables At are stochastically independent for all t.

For τ ≠ 0:

  R_X(t, τ) = E[X(t) X(t + τ)]
            = E[At·t · A_{t+τ}·(t + τ)]
            = t(t + τ) E[At A_{t+τ}]
            = t(t + τ) E[At] E[A_{t+τ}] = t(t + τ)/4

  R_X(t, τ = 0) = t² E[At²] = t²/3
44
Pictures of the Autocorrelation function
  R_X(t, τ) = t(t + τ)/4

[Figure: surface plot of R_X(t, τ) as a function of t and τ.]

• R_X(t, τ) is thus a function of two variables.
• This R_X(t, τ) looks a bit strange... It will turn out that this X(t) is in fact non-stationary.

45
Uncorrelated processes
Stationary processes
Wide-sense stationary processes

46
Uncorrelated and orthogonal processes
If all pairs X(t), X(t + τ) are uncorrelated, i.e.,

  C_X(t, τ) = Var[X(t)]   for all t and τ = 0
              0           for all t and τ ≠ 0

then X(t) is called an uncorrelated process.

If all pairs X(t), X(t + τ) are orthogonal, i.e.,

  R_X(t, τ) = E[X²(t)]   for all t and τ = 0
              0          for all t and τ ≠ 0

then X(t) is called an orthogonal process.

47
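As a concrete illustration (an assumed example, not from the slides): an iid zero-mean sequence is an uncorrelated process, which an ensemble-average estimate of C_X makes visible.

```python
import numpy as np

# Sketch: an iid zero-mean sequence (an assumed example) is an uncorrelated process.
# Estimate C_X(n, k) by averaging over an ensemble of realisations.
rng = np.random.default_rng(8)
x = rng.normal(0.0, 1.0, size=(10_000, 50))      # 10000 realisations, 50 time instances

n = 20                                           # one reference time instance
for k in (0, 1, 5):
    a = x[:, n] - x[:, n].mean()
    b = x[:, n + k] - x[:, n + k].mean()
    print(f"C_X(n, k={k}) ~ {np.mean(a * b):+.4f}")   # ~1 for k = 0, ~0 otherwise
```

Since this sequence is zero mean, R_X(t, τ) = C_X(t, τ) here, so the same process also qualifies as an orthogonal process.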
Stationary Process
A stochastic process is stationary if and only if every joint pdf is shift invariant:

  f_{X(t1), X(t2), ..., X(tk)}(x1, x2, ..., xk) = f_{X(t1+Δt), X(t2+Δt), ..., X(tk+Δt)}(x1, x2, ..., xk)

• Consequence I
  – The marginal pdfs are independent of t:

    f_{X(t)}(x) = f_{X(t+Δt)}(x) = f_X(x)

  – The marginal pdfs are identical for all tk's!
  – Therefore: expected value and variance are time independent.
48
Stationary Process

• Consequence II
  – The 2D joint pdf is shift invariant:

    f_{X(t1), X(t2)}(x1, x2) = f_{X(t1+Δt), X(t2+Δt)}(x1, x2) = f_{X(0), X(t2−t1)}(x1, x2)

  – ... only the "distance" between t2 and t1 matters.
  – Therefore:

    R_X(t, τ) = R_X(τ)
    C_X(t, τ) = C_X(τ) = R_X(τ) − E[X]²
49
Stationary Process
Examples of stationary processes:
• iid process
• Bernoulli process
• Poisson process

Remember the PMF for the Bernoulli stochastic process:

  P_X(x) = ∏_{i=1}^{k} p^{xi} (1 − p)^{1 − xi} = p^{x1 + ··· + xk} (1 − p)^{k − (x1 + ··· + xk)},

which does not depend on the actual time.

Notice that non-stationary processes are difficult to model and to handle in practice.

50
Wide-Sense Stationary (WSS) Processes
• To show that a process is stationary, we need the overall joint pdf.
  – Pretty much impossible to get, except for special cases.
• However, we can often estimate
  – the expected value
  – the autocorrelation function
• If only the expected value and the autocorrelation function satisfy the property of stationarity, we call this process wide-sense stationary (WSS).
  – Hence, we don't know anything about other properties of the process!

51
Wide-Sense Stationary (WSS) Processes
• A process is wide-sense stationary if and only if
  – the expected value E[X(t)] does not depend on time: E[X(t)] = c;
  – the autocorrelation function depends only on the time difference τ and not on the actual time:

    R_X(t, τ) = R_X(τ)   or   R_X(n, k) = R_X(k)

• Example: sinusoidal random process

  C_X(τ) = R_X(τ) = (1/6) cos(ωτ)

52
WSS Processes and the autocorrelation function

Important properties of R_X(τ) for WSS processes:

  R_X(0) ≥ 0
  R_X(τ) = R_X(−τ)
  R_X(0) ≥ |R_X(τ)|

Example:

  R_X(t, τ) = (1/6) cos(ωτ)

53
Notice

Stationary process: reflects properties of the joint pdf.

WSS process: reflects only properties of the expected value and the autocorrelation function!

54
Cross correlation for Stochastic Processes
In addition to the autocorrelation function, we can also define the cross-correlation between two stochastic processes:

  R_XY(t, τ) = E[X(t) Y(t + τ)]

  R_XY[n, k] = E[X_n Y_{n+k}]

Two random processes X(t) and Y(t) are jointly wide-sense stationary if X(t) and Y(t) are wide-sense stationary and

  R_XY(t, τ) = R_XY(τ).

If X(t) and Y(t) are jointly WSS, then

  R_XY(τ) = R_YX(−τ).

(And of course similarly for time-discrete processes.)

55

Example

Xn is a WSS stochastic process. Given is

  Y_n = (−1)^n X_n

• R_Y[n, k] = E[Y_n Y_{n+k}] = E[(−1)^n X_n (−1)^{n+k} X_{n+k}] = (−1)^{2n+k} E[X_n X_{n+k}] = (−1)^k R_X[k]

• Process Y_n is WSS as R_Y only depends on k.

• R_XY[n, k] = E[X_n Y_{n+k}] = E[X_n (−1)^{n+k} X_{n+k}] = (−1)^{n+k} E[X_n X_{n+k}] = (−1)^{n+k} R_X[k].

• X and Y are not jointly WSS as the cross-correlation function depends on both n and k.
56
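A numerical check of this example (a sketch; the MA(1) process X_n = W_n + 0.5 W_{n−1} with iid standard-normal W_n is an assumed stand-in for the unspecified WSS process X_n):

```python
import numpy as np

# Assumed WSS process: X_n = W_n + 0.5 W_{n-1}, W_n iid N(0, 1),
# so that R_X[0] = 1.25 and R_X[1] = 0.5. Then Y_n = (-1)^n X_n as in the example.
rng = np.random.default_rng(9)

n_real, n_time = 100_000, 32
w = rng.normal(size=(n_real, n_time + 1))
x = w[:, 1:] + 0.5 * w[:, :-1]                   # realisations of X_n
y = (-1.0) ** np.arange(n_time) * x              # Y_n = (-1)^n X_n

n, k = 10, 1                                     # an even time index and lag 1
print("R_X[1] ~", round(np.mean(x[:, n] * x[:, n + k]), 3))          # ~ 0.5
print("R_Y[1] ~", round(np.mean(y[:, n] * y[:, n + k]), 3))          # ~ (-1)^1 R_X[1] = -0.5
print("R_XY[n, 1] at even n:", round(np.mean(x[:, n] * y[:, n + k]), 3),
      "  at odd n:", round(np.mean(x[:, n + 1] * y[:, n + 1 + k]), 3))
# The cross-correlation flips sign with n, so X_n and Y_n are not jointly WSS.
```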
Practice before next lecture

Some selected exercises:


• 3rd ed: 13.1.1, 13.3.1, 13.7.1, 13.7.3, 13.7.5, 13.9.3, 13.9.5,
13.9.7, 13.10.1, 13.10.3

57
