
Yates' Chapter 6, 10: Stochastic Processes & Stochastic Filtering

The document defines stochastic (random) processes and their basic properties. It discusses defining a stochastic process as a mapping from a probability space to real-valued time functions. It also covers the mean and autocorrelation function of stochastic processes as well as stationary processes where the mean is constant over time. Filtering of stochastic processes allows estimation of signals from noisy measurements.

Yates’ Chapter 6, 10:

Stochastic Processes & Stochastic Filtering


(Yates’ Chapter 6, 10)

• Definition of a Stochastic Process

• Basic Properties of Stochastic Processes

• Filtering of Stochastic Processes

May 10, 2004, by Profs. W. Zhuang & K. T. Wong


Definition of a Stochastic (Random) Process

Random Process: a mapping from an outcome e_i of an experiment (with sample space S) to a real-valued time function (i.e., a waveform):

e_i → X(t, e_i) = x_i(t), a real-valued waveform, under the mapping X(t, ·)

[Figure: outcomes e1, e2, e3, e4 in the sample space S, each mapped by X(t, ·) to a sample waveform x1(t), x2(t), x3(t), x4(t) plotted against t.]

S → R², a two-dimensional signal space (x, t), under X(t, ·)

Interpretations:
A random process X(t, e) is
− a family of deterministic functions, where t and e are variables: X(t, e) = {x(t, e_i) | e_i ∈ S};
− a random variable at t = t0: X = X(t0, e), where e varies depending on the outcome of a particular trial;
− a single time function (a sample of the given process) when e is fixed: X(t, e) = x(t), where t is variable;
− a real number if both t and e are fixed.

[Figure: the ensemble x(t, e) viewed as a surface over time t and the outcomes e1, …, en.]

X (t ) is commonly used as a shorthand to represent X (t, e )
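These four viewpoints can be made concrete with a small simulation. A minimal sketch (the phase-modulated cosine and all constants are illustrative choices, not from the slides): rows of the matrix are sample functions (e fixed), columns are realizations of the random variable X(t0) (t fixed).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ensemble: each outcome e_i is a random phase, and the mapped
# waveform is x_i(t) = cos(2*pi*t + phase_i).
t = np.linspace(0.0, 1.0, 201)                  # time axis
phases = rng.uniform(0.0, 2.0 * np.pi, size=5)  # five outcomes e_1..e_5
ensemble = np.cos(2.0 * np.pi * t + phases[:, None])  # rows: outcomes, cols: time

sample_function = ensemble[0]       # e fixed: a deterministic function of t
random_variable = ensemble[:, 100]  # t fixed (t0 = 0.5): one number per outcome
scalar = ensemble[0, 100]           # both fixed: a single real number
```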

Example: X(t) = A cos(ω_c t + Θ), where A and ω_c are deterministic scalar constants and Θ is a random variable uniformly distributed on [0, 2π).

Θ = θ, a deterministic scalar constant ⇒ X(t) = A cos(ω_c t + θ) = x(t) is a deterministic waveform.

t = t0 ⇒ X(t0) = A cos(ω_c t0 + Θ) = X is a random scalar variable.

F_X(x) = P(X ≤ x) = P(a ≤ Θ ≤ b). What are a and b?

If X = x, then A cos(ω_c t0 + θ) = x, where θ = a or b:
a = arccos(x/A) − ω_c t0 ∈ [0, 2π), b = 2π − arccos(x/A) − ω_c t0 ∈ [0, 2π)

F_X(x) = ∫_a^b f_Θ(θ) dθ = (1/2π)(b − a) = 1 − (1/π) arccos(x/A), ∀x ∈ (−A, A)

f_X(x) = (d/dx) F_X(x) = 1 / (π √(A² − x²)), ∀x ∈ (−A, +A)

[Figure: A cos(ω_c t0 + θ) versus θ + ω_c t0, marking the two crossings of level x at cos⁻¹(x/A) and 2π − cos⁻¹(x/A).]
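The derived CDF can be checked by sampling Θ directly. A minimal sketch, with A, ω_c, and t0 chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
A, wc, t0 = 2.0, 5.0, 0.3                # illustrative constants

theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)
x = A * np.cos(wc * t0 + theta)          # samples of the random variable X(t0)

# Empirical CDF vs. the derived F_X(x) = 1 - arccos(x/A)/pi on (-A, A).
test_points = np.array([-1.5, 0.0, 1.0])
empirical = np.array([np.mean(x <= xv) for xv in test_points])
analytic = 1.0 - np.arccos(test_points / A) / np.pi
```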
Description of Random Processes
− analytical description:
X(t): {x(t, e) | e ∈ S} + probability information of S
− statistical description:
Definition: A "complete" statistical description of a random process X(t) is: for any integer n and any choice of sampling instants t1, t2, …, tn, the joint PDF f_{X(t1), X(t2), …, X(tn)}(x1, x2, …, xn) of (X(t1), X(t2), …, X(tn)) is given.

Statistical averages
− mean or expectation: m_X(t)

[Figure: sample functions x(t, e1), x(t, e2), x(t, e3), with the means m(t1) = E(X(t1)) and m(t2) = E(X(t2)) marked at instants t1 and t2. The mean of a random process.]

Definition: The mean (a.k.a. statistical expectation) of a random process X(t) is the deterministic function of time m_X(t) that at each time instant t0 equals the mean of the random variable X(t0). That is, m_X(t) = E(X(t)) for all t.

At t = t0, X(t0) is described by a PDF f_{X(t0)}(x):

m_X(t0) = E(X(t0)) = ∫_{−∞}^{∞} x · f_{X(t0)}(x) dx

Example: Find the mean of the random process X(t) = A cos(ω_c t + Θ), where Θ is a random variable uniformly distributed on [0, 2π), i.e.,

f_Θ(θ) = 1/(2π) for 0 ≤ θ < 2π, and 0 otherwise

⇒ m_X(t) = E(X(t)) = E[A cos(ω_c t + Θ)] = ∫_0^{2π} A cos(ω_c t + θ) f_Θ(θ) dθ = 0.
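A quick Monte Carlo check that this mean is zero at every t (the constants and test instants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
A, wc = 1.0, 2.0 * np.pi                 # illustrative constants
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

# The ensemble average of A*cos(wc*t + Theta) should be ~0 at every t.
means = np.array([np.mean(A * np.cos(wc * t + theta)) for t in (0.0, 0.25, 1.7)])
```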

Correlation Function:
Definition: The autocorrelation function R_X(t1, t2) of a random process X(t):
R_X(t1, t2) = E[X(t1) X(t2)].
R_X(t1, t2) provides a way of describing the interdependence of two random variables obtained by observing a random process X(t) at t1 and t2. The more rapidly each realization X(t, e) changes with time, the more rapidly R_X(t1, t2) decreases from its maximum value R_X(t1, t1) as |t2 − t1| increases.

Example
The autocorrelation function of the random process X(t) = A cos(ω_c t + Θ) is
R_X(t1, t2) = E[A cos(ω_c t1 + Θ) · A cos(ω_c t2 + Θ)]
= A² E[ ½ cos(ω_c(t1 − t2)) + ½ cos(ω_c(t1 + t2) + 2Θ) ]
= (A²/2) cos(ω_c(t1 − t2)) + (A²/2) ∫_0^{2π} cos(ω_c(t1 + t2) + 2θ) (1/2π) dθ
= (A²/2) cos(ω_c(t1 − t2))   (the second term integrates to 0)
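The closed form R_X(t1, t2) = (A²/2) cos(ω_c(t1 − t2)) can be verified empirically; the constants and time instants below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(3)
A, wc = 2.0, 3.0                          # illustrative constants
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

# Empirical R_X(t1, t2) versus the derived (A^2/2) cos(wc*(t1 - t2)).
t1, t2 = 0.4, 1.1
empirical = np.mean(A * np.cos(wc * t1 + theta) * A * np.cos(wc * t2 + theta))
analytic = (A**2 / 2.0) * np.cos(wc * (t1 - t2))
```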

Stationary Stochastic Processes
Definition: A random process X(t) is wide-sense stationary (WSS) if m_X(t) = constant
and R_X(t, t − τ) = R_X(τ).
For example, the random process in the preceding examples is WSS. ECE 316 will be
concerned almost exclusively with WSS processes.

Properties of R_X(τ):
(1) R_X(τ) = R_X(−τ) if X(t) is real-valued.
proof: R_X(τ) = R_X(t, t − τ) = E[X(t) X(t − τ)] = E[X(t − τ) X(t)] = R_X(−τ).

(2) |R_X(τ)| ≤ R_X(0).
proof: E[(X(t) ± X(t − τ))²] ≥ 0
⇒ E[X²(t) + X²(t − τ) ± 2 X(t) X(t − τ)] ≥ 0
⇒ E[X²(t)] + E[X²(t − τ)] ± 2 E[X(t) X(t − τ)] ≥ 0
⇒ R_X(0) + R_X(0) ± 2 R_X(τ) ≥ 0 ⇒ ∓R_X(τ) ≤ R_X(0) ⇒ |R_X(τ)| ≤ R_X(0)

(3) R_X(0) is the average power of X(t).
proof: The average power of X(t) is
P = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X²(t)] dt
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(0) dt = R_X(0)
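Property (3) can be illustrated numerically. For the cosine process of the earlier examples, the time-average power of even a single realization settles at R_X(0) = A²/2 (a sketch with illustrative constants; strictly, property (3) concerns the ensemble average, which for this process gives the same number):

```python
import numpy as np

A, wc = 2.0, 5.0                  # illustrative constants
theta0 = 1.234                    # one fixed outcome of Theta
t = np.arange(0.0, 2000.0, 1e-3)  # a long observation window
x = A * np.cos(wc * t + theta0)   # a single realization x(t)

# Average power over the window: should approach R_X(0) = A^2/2 = 2.
power = np.mean(x**2)
```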

Power & Power Spectral Density (PSD) of a Deterministic Signal x(t)

The time-average power of x(t) is

P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} [x(t)]² dt = lim_{T→∞} (1/T) ∫_{−∞}^{∞} [x(t) rect(t/T)]² dt
= lim_{T→∞} (1/T) ∫_{−∞}^{∞} |F[x(t) rect(t/T)]|² df = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |X_T(f)|² df
= ∫_{−∞}^{∞} lim_{T→∞} (1/T) |X_T(f)|² df = ∫_{−∞}^{∞} S_x(f) df

where rect(t/T) = 1 for −T/2 ≤ t ≤ T/2, and 0 otherwise.

The identity lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} [x(t)]² dt = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |X_T(f)|² df is called Parseval's theorem.

S_x(f) = lim_{T→∞} (1/T) |X_T(f)|² is called the PSD of x(t).

F⁻¹{S_x(f)} = ∫_{−∞}^{∞} S_x(f) e^{j2πfτ} df = ∫_{−∞}^{∞} lim_{T→∞} (1/T) |X_T(f)|² e^{j2πfτ} df
= lim_{T→∞} (1/T) ∫_{−∞}^{∞} [ ∫_{−T/2}^{T/2} x(t1) e^{−j2πf t1} dt1 ] [ ∫_{−T/2}^{T/2} x(t) e^{−j2πf t} dt ]* e^{j2πfτ} df
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) ∫_{−T/2}^{T/2} x(t1) [ ∫_{−∞}^{∞} e^{j2πf(τ − t1 + t)} df ] dt1 dt
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) ∫_{−T/2}^{T/2} x(t1) · δ(τ − t1 + t) dt1 dt
= lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(τ + t) x(t) dt, the autocorrelation function of x(t)
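Parseval's identity above has an exact discrete-time analogue, which the following sketch checks with an arbitrary sampled signal (the window length and sample step are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)
N, dt = 4096, 1e-3
x = rng.normal(size=N)           # an arbitrary real signal segment
T = N * dt

# Discrete analogue of Parseval's identity:
# (1/T) * sum(x^2) * dt  ==  (1/T) * sum(|X_T(f)|^2) * df
X_T = np.fft.fft(x) * dt         # Riemann-sum approximation of X_T(f)
df = 1.0 / T
power_time = np.sum(x**2) * dt / T
power_freq = np.sum(np.abs(X_T)**2) * df / T
```

The equality is exact for the DFT (up to floating-point round-off), not merely an approximation.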

Power and PSD of a Random Signal X(t)

For a random signal X(t), its time-average power lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt is a random variable.

The statistically-averaged, time-averaged power of X(t) is defined by

P = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X²(t)] dt

If X(t) is WSS, E[X²(t)] = R_X(0) = constant, then P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(0) dt = R_X(0)

The PSD of X(t) is defined as S_X(f) = E[ lim_{T→∞} (1/T) |F[X(t) · rect(t/T)]|² ].

Hence, P = ∫_{−∞}^{∞} S_X(f) df = R_X(0)

F⁻¹{S_X(f)} = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(τ + t) X(t) dt ] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X(τ + t) X(t)] dt = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(τ) dt = R_X(τ)

Wiener-Khintchine Relations for WSS stochastic processes:
S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{j2πfτ} df

Filtering of a Stochastic Signal through a Linear Time-Invariant (LTI) System

Input-output relationship:
y(t) = x(t) ⊗ h(t) = ∫_{−∞}^{∞} x(t − τ) h(τ) dτ
Y(f) = X(f) H(f), where H(f) = F[h(t)] = ∫_{−∞}^{∞} h(t) e^{−j2πft} dt

[Diagram: x(t) → h(t) → y(t)]

In the case of a stochastic input signal:

Mean: m_Y(t) = E[Y(t)] = E[ ∫_{−∞}^{∞} X(t − τ) h(τ) dτ ] = ∫_{−∞}^{∞} E[X(t − τ)] h(τ) dτ
= E[X(t)] ⊗ h(t) = m_X(t) ⊗ h(t)

If X(t) is WSS, then m_Y(t) = m_X ∫_{−∞}^{∞} h(τ) dτ = m_X · H(0) = constant

[Diagram: X(t) → h(t) → Y(t)]

Time-Domain Correlation Function:

R_Y(t, t + τ) = E[Y(t) Y(t + τ)] = E[ ( ∫_{−∞}^{∞} X(t − α) h(α) dα ) ( ∫_{−∞}^{∞} X(t + τ − β) h(β) dβ ) ]
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} E[X(t − α) X(t + τ − β)] h(α) h(β) dα dβ
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} R_X(t − α, t + τ − β) h(α) h(β) dα dβ

If X(t) is WSS, R_X(t − α, t + τ − β) = R_X(τ − β + α), then

R_Y(t, t + τ) = ∫_{−∞}^{∞} [ ∫_{−∞}^{∞} R_X(τ − β + α) h(β) dβ ] h(α) dα = ∫_{−∞}^{∞} [R_X(τ + α) ⊗ h(τ + α)] h(α) dα
= ∫_{−∞}^{∞} [R_X(τ − α) ⊗ h(τ − α)] h(−α) dα = R_X(τ) ⊗ h(τ) ⊗ h(−τ) = R_Y(τ)

Hence, if X(t) is WSS, then Y(t) is also WSS.

The PSD of Y(t) is
S_Y(f) = F[R_Y(τ)] = F[R_X(τ) ⊗ h(τ) ⊗ h(−τ)] = F[R_X(τ)] · F[h(τ)] · F[h(−τ)]
= S_X(f) H(f) H*(f) = |H(f)|² · S_X(f)
(output-signal's PSD = input-signal's PSD × PSD of the system's impulse response)

The statistically-averaged, time-averaged power of Y(t) is:
P_Y = R_Y(0) = ∫_{−∞}^{∞} |H(f)|² · S_X(f) df
Properties of the PSD

(1) P_X = R_X(0) = ∫_{−∞}^{∞} S_X(f) df
(2) S_X(0) = ∫_{−∞}^{∞} R_X(τ) dτ
(3) S_X(f) ≥ 0, ∀f
(4) S_X(f) = S_X(−f)
Proof: S_X(−f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2π(−f)τ} dτ
= ∫_{−∞}^{∞} R_X(−τ) e^{−j2πf(−τ)} dτ   (using R_X(τ) = R_X(−τ))
With u = −τ: = ∫_{∞}^{−∞} R_X(u) e^{−j2πfu} (−du) = ∫_{−∞}^{∞} R_X(u) e^{−j2πfu} du = S_X(f)

(5) S_Y(f) = |H(f)|² · S_X(f)

Example: Derive the PSD of the random process Y(t) = X(t) cos(2π f_C t + Θ), where X(t) is a WSS stochastic process with PSD S_X(f), f_C is a deterministic scalar constant, and Θ is a random variable uniformly distributed on [0, 2π) and statistically independent of X(t).

Solution:
Step 1: Show that Y(t) is WSS:
m_Y(t) = E[X(t) cos(2π f_C t + Θ)] = E[X(t)] · E[cos(2π f_C t + Θ)]
= m_X · ∫_0^{2π} cos(2π f_C t + θ) (1/2π) dθ = 0
R_Y(t, t + τ) = E[Y(t) Y(t + τ)] = E[X(t) cos(2π f_C t + Θ) · X(t + τ) cos(2π f_C t + 2π f_C τ + Θ)]
= E[X(t) X(t + τ)] · E[cos(2π f_C t + Θ) · cos(2π f_C t + 2π f_C τ + Θ)] = R_X(τ) · ½ cos(2π f_C τ)
= R_Y(τ) ⇒ Y(t) is WSS.
Step 2: Use the Wiener-Khintchine relation to derive the PSD of Y(t):
S_Y(f) = F[R_Y(τ)] = F[ ½ R_X(τ) cos(2π f_C τ) ] = F[ ¼ R_X(τ) exp(j2π f_C τ) + ¼ R_X(τ) exp(−j2π f_C τ) ]
= ¼ [S_X(f − f_C) + S_X(f + f_C)]
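A Monte Carlo check of Step 1 at lag τ = 0, i.e., R_Y(0) = ½ R_X(0) (the Gaussian baseband samples and all constants are illustrative; any X(t) independent of Θ would do):

```python
import numpy as np

rng = np.random.default_rng(6)
fc = 10.0                                    # illustrative carrier frequency
n = 200_000
t = rng.uniform(0.0, 100.0, size=n)          # random observation instants
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
x = rng.normal(scale=2.0, size=n)            # X(t) samples with R_X(0) = 4,
                                             # independent of Theta

y = x * np.cos(2.0 * np.pi * fc * t + theta)

# R_Y(0) = E[Y^2] should be (1/2) * R_X(0) = 2.0
r_y0 = np.mean(y**2)
```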

Example: A random process N(t) is "zero-mean white Gaussian noise" if
(1) N(t) is a Gaussian random variable at all t,
(2) E[N(t)] = 0 at all t, and
(3) R_N(τ) = (N0/2) δ(τ)

[Diagram: N(t) → h(t), H(f) → Y(t)]

These imply:
i) E[N(t1) N(t2)] = 0 if t1 ≠ t2;
ii) E[N²(t)] = R_N(0) = ∞, i.e., N(t) is physically unrealizable!
iii) S_N(f) = F[R_N(τ)] = N0/2, −∞ < f < ∞ (two-sided PSD)
iv) S_Y(f) = |H(f)|² S_N(f) = (N0/2) |H(f)|², where h(t) is usually realizable.

At the output, the mean is m_Y(t) = m_N(t) ⊗ h(t) = 0,
and the statistically-averaged, time-averaged power is:

E[Y²(t)] = ∫_{−∞}^{∞} S_Y(f) df = (N0/2) ∫_{−∞}^{∞} |H(f)|² df = σ_Y²

If the input to a linear time-invariant filter is Gaussian, then the output is also Gaussian; hence, at any time instant t0, Y(t0) ~ N(0, σ_Y²).
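The output-power formula can be checked with a sampled stand-in for white noise (independent Gaussians of variance N0/(2Δt)) driven through an illustrative RC-style filter; N0, the sample step, and the filter time constant are my own choices:

```python
import numpy as np

rng = np.random.default_rng(7)
N0, dt = 2.0, 1e-3                 # illustrative noise level and sample step

# Sampled stand-in for white noise with two-sided PSD N0/2:
# independent Gaussians with variance N0/(2*dt).
noise = rng.normal(scale=np.sqrt(N0 / (2.0 * dt)), size=200_000)

# Illustrative causal impulse response h(t) = exp(-t/tau)/tau (RC low-pass).
tau = 0.05
t_h = np.arange(0.0, 10 * tau, dt)
h = np.exp(-t_h / tau) / tau

y = np.convolve(noise, h, mode="valid") * dt

# sigma_Y^2 = (N0/2) * int |H(f)|^2 df = (N0/2) * int h(t)^2 dt  (Parseval);
# for this h, the integral is 1/(2*tau), so sigma_Y^2 ≈ 10.
sigma2_analytic = (N0 / 2.0) * np.sum(h**2) * dt
sigma2_empirical = np.var(y)
```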
Summary

(1) Definition and description of a random process:
mean: m_X(t) = E[X(t)]
correlation: R_X(t1, t2) = E[X(t1) X(t2)]

(2) WSS random processes:
• m_X(t) = constant
• R_X(t, t + τ) = R_X(τ) → R_X(τ) = R_X(−τ), R_X(0) ≥ |R_X(τ)|
• Wiener-Khintchine Relation: R_X(τ) ↔ S_X(f) (Fourier transform pair),
where S_X(f) ≜ E[ lim_{T→∞} (1/T) |F[X(t) · rect(t/T)]|² ] ≥ 0

(3) P_X = ∫_{−∞}^{∞} S_X(f) df = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] = R_X(0)

(4) For a WSS input X(t) to an LTI system h(t):
m_Y(t) = m_X(t) ⊗ h(t) = m_X · H(0)
R_Y(τ) = R_X(τ) ⊗ h(τ) ⊗ h(−τ)
S_Y(f) = S_X(f) |H(f)|²
P_Y = ∫_{−∞}^{∞} S_X(f) |H(f)|² df
[Diagram: X(t) (WSS) → LTI h(t) → Y(t) (WSS)]
