
Unit IV: Random Process and Stationary Process

Contents
1 Introduction
2 Stationary random processes
  2.1 Problems based on Stationary process
3 Auto correlation of the random process and its properties
4 Wide (Weakly) sense stationary process
  4.1 Problems of Wide sense stationary (WSS) process
  4.2 Examples of WSS for discrete random variables
5 Cross correlation and Jointly WSS process
  5.1 Properties of cross correlation
  5.2 Jointly WSS process
6 Ergodicity
  6.1 Mean-Ergodic Theorem

1 Introduction
In electrical systems, voltage or current waveforms are used as signals for collecting, trans-
mitting or processing information, as well as for controlling and providing power to a variety
of devices. These signals (voltage or current waveforms) are functions of time and are of two
classes: deterministic and random. Deterministic signals can be described by the usual
mathematical functions with time t as the independent variable. But a random signal always
has some element of uncertainty associated with it and hence it is not possible to determine
its value exactly at any given point of time. However, we may be able to describe the random
signal in terms of its average properties such as the average power in the random signal, its
spectral distribution and the probability that the signal amplitude exceeds a given value.
The probabilistic model used for characterising a random signal is called a random process
or stochastic process.
A random variable (RV) is a rule (or function) that assigns a real number to every
outcome of a random experiment, while a random process is a rule (or function) that assigns
a time function to every outcome of a random experiment. For example, consider the random
experiment of tossing a die at t = 0 and observing the number on the top face. The sample
space of this experiment consists of the outcomes {1, 2, 3, . . . , 6}. For each outcome of the
experiment, let us arbitrarily assign a function of time t(0 ≤ t < ∞) in the following manner.
Outcome 1: x1(t) = −4;   Outcome 2: x2(t) = −2;   Outcome 3: x3(t) = 2;
Outcome 4: x4(t) = 4;    Outcome 5: x5(t) = −t/2;  Outcome 6: x6(t) = t/2.
The set of functions {x1 (t), x2 (t), . . . , x6 (t)} represents a random process.
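
To make the ensemble idea concrete, the following short Python/NumPy sketch (our own illustration; the function name and constants are arbitrary) draws a few die outcomes and evaluates the corresponding sample function at several time instants.

```python
import numpy as np

def sample_function(outcome, t):
    """Return x_outcome(t) for the die-toss random process defined above."""
    table = {1: lambda t: -4.0, 2: lambda t: -2.0, 3: lambda t: 2.0,
             4: lambda t: 4.0, 5: lambda t: -t / 2.0, 6: lambda t: t / 2.0}
    return table[outcome](t)

rng = np.random.default_rng(0)
t_grid = [0.0, 1.0, 2.0, 4.0]
for s in map(int, rng.integers(1, 7, size=3)):   # three tosses of the die
    values = [sample_function(s, t) for t in t_grid]
    print(f"outcome {s}: X(s, t) at t = {t_grid} -> {values}")
```

Fixing the outcome s gives one sample function of t; fixing t instead gives a random variable over the six outcomes.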

Definition: A random process is a collection (or ensemble) of RVs {X(s, t)} that are
functions of a real variable, namely time t where s ∈ S (sample space) and t ∈ T (parameter
set or index set).
The set of possible values of any individual member of the random process is called state
space. Any individual member itself is called a sample function or a realisation of the process.
Note:
1. If s and t are fixed, then {X(s, t)} is a number.
2. If t is fixed, then {X(s, t)} is a random variable.
3. If s is fixed, then {X(s, t)} is a single time function.
4. If s and t are variables, then {X(s, t)} is a collection of random variables which are
time functions.

Classification of Random Processes


Depending on the continuous or discrete nature of the state space S and parameter set T , a
random process can be classified into four types:
1. If both T and S are discrete, the random process is called a discrete random
sequence. For example, if Xn represents the outcome of the nth toss of a fair
die, then {Xn , n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, . . .} and
S = {1, 2, 3, 4, 5, 6}.

2. If T is discrete and S is continuous, the random process is called a continuous random
sequence. For example, if Xn represents the temperature at the end of the nth hour
of a day, then {Xn , 1 ≤ n ≤ 24} is a continuous random sequence, since temperature
can take any value in an interval and hence continuous.

3. If T is continuous and S is discrete, the random process is called a discrete random


process. For example, if X(t) represents the number of telephone calls received in the
interval (0, t) then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, . . .}.

4. If both T and S are continuous, the random process is called a continuous random
process. For example, if X(t) represents the maximum temperature at a place in the
interval (0, t), then {X(t)} is a continuous random process. In the names given above,
the word ‘discrete’ or ‘continuous’ refers to the nature of the state space S, while the
word ‘sequence’ or ‘process’ refers to the nature of the parameter set T .

2 Stationary random processes


Mean: The ensemble mean or average of a random process {X(t)} is the expected value
of the random variable X(t) at time t, i.e.,

E[X(t)] = µX(t) = ∫_{−∞}^{+∞} x f(x, t) dx,

where f(x, t) is the first order density of the process.

Ensemble average of first order:  E[X(t)] = ∫_{−∞}^{+∞} x f(x, t) dx

Ensemble average of second order: E[X(t) · X(t + τ)] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x1 x2 f(x1, x2; t, t + τ) dx1 dx2,

where f(x1, x2; t1, t2) is the second order (joint) density of the process.

If the probability distributions (or the averages) of the random variables forming a random
process {X(t)} do not depend on the time t, then the random process is called a stationary
process.

• First order stationary processes: A random process {X(t)} is called stationary to


order 1, if its first order density function does not change with a shift in time origin.

f(x, t) = f(x, t + ∆), ∀t and for any real number ∆.

i.e.,

E[X(t)] = constant

• Second order stationary processes: A random process {X(t)} is called stationary


to order 2, if its second order density function does not change with a shift in time
origin.

f (x1 , x2 ; t1 , t2 ) = f (x1 , x2 ; t1 + ∆, t2 + ∆), ∀t1 , t2 and real number ∆.

i.e.,

E [X (t1 ) · X (t2 )] = E [X (t1 + ∆) · X (t2 + ∆)]

Note: All second order stationary processes are first order stationary, but the converse
need not be true.

• Stationary process (or) Strictly Stationary process (or) Strict sense Sta-
tionary process [SSS process]: If a random process is stationary to all orders,
then the random process is said to be a strict sense stationary process.

f(x1, x2, · · · , xn; t1, t2, · · · , tn) = f(x1, x2, · · · , xn; t1 + ∆, t2 + ∆, · · · , tn + ∆), ∀ti

and for any real number ∆.

• Evolutionary process: A random process X(t) that is not stationary in any sense
is called an evolutionary process.

2.1 Problems based on Stationary process


Problem 1: Prove that a first order stationary random process has a constant
mean.
Proof: Let X(t) be a first order stationary random process. Then we have

f (x, t + ∆) = f (x, t) (1)

where t, ∆ are arbitrary.


Now,

E[X(t + ∆)] = ∫_{−∞}^{∞} x f(x, t + ∆) dx
            = ∫_{−∞}^{∞} x f(x, t) dx          [by (1)]
            = E[X(t)]

Hence, a first order stationary random process has a constant mean.


i.e., the mean of a first order stationary random process is independent of time.
Problem 2: Prove that a first order stationary random process has a constant
variance.
Proof: Let X(t) be a first order stationary random process. Then we have

f (x, t + ∆) = f (x, t) (1)

where t, ∆ are arbitrary.

Now,

Var[X(t + ∆)] = ∫_{−∞}^{∞} (x − E[X(t)])² f(x, t + ∆) dx
              = ∫_{−∞}^{∞} (x − E[X(t)])² f(x, t) dx          [by (1)]
              = Var[X(t)]

Hence, a first order stationary random process has a constant variance.


Problem 3: Let the random process X(t) = cos(t + φ), where φ is a random variable
with f(φ) = 1/π, −π/2 < φ < π/2. Check whether X(t) is stationary or not.

Solution: Given: X(t) = Random process = cos(t + φ),
φ = a continuous random variable, −π/2 < φ < π/2,
f(φ) = p.d.f. of φ = 1/π.

To check whether X(t) is stationary or not:

E[X(t)] = ∫_{r.v. lower limit}^{r.v. upper limit} X(t) f(r.v.) d(r.v.)

        = ∫_{−π/2}^{π/2} cos(t + φ) · (1/π) dφ

        = (1/π) [sin(t + φ)]_{−π/2}^{π/2}

        = (1/π) { sin(t + π/2) − sin(t − π/2) }

        = (1/π) { cos t + cos t }          [∵ sin(t + π/2) = cos t, sin(t − π/2) = −cos t]

        = (2/π) cos t

∴ E[X(t)] depends on time ‘t’.
Hence X(t) is not stationary.
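
A quick Monte Carlo cross-check of this conclusion (an illustrative sketch, not part of the standard working; it assumes Python/NumPy): sampling φ uniformly on (−π/2, π/2), the estimated ensemble mean should track (2/π) cos t and therefore vary with t.

```python
import numpy as np

rng = np.random.default_rng(1)
phi = rng.uniform(-np.pi / 2, np.pi / 2, size=200_000)   # samples of the r.v. φ

for t in (0.0, np.pi / 4, np.pi / 2):
    estimate = np.cos(t + phi).mean()                    # ensemble mean E[X(t)]
    theory = (2 / np.pi) * np.cos(t)
    print(f"t = {t:.3f}: estimated E[X(t)] = {estimate:+.4f}, (2/pi) cos t = {theory:+.4f}")
```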
Problem 4: Show that X(t) = A cos(ωt + θ) is not stationary if A and ω are
constants and θ is uniformly distributed in (0, π).

Solution: Given: X(t) = Random process = A cos(ωt + θ),
where A and ω are constants,
θ = a uniformly distributed r.v. defined in (0, π),
f(θ) = p.d.f. of θ = 1/(π − 0) = 1/π.

We have to show that E[X(t)] is a function of t.

E[X(t)] = ∫_{0}^{π} A cos(ωt + θ) · (1/π) dθ

        = (A/π) [sin(ωt + θ)]_{0}^{π}

        = (A/π) { sin(ωt + π) − sin ωt }

        = (A/π) { −sin ωt − sin ωt }          [∵ sin(ωt + π) = −sin ωt]

        = −(2A/π) sin ωt

∴ E[X(t)] depends on time ‘t’.
Hence X(t) is not stationary.
Problem 5: Let X(t) = cos(ωt + θ), where θ is uniformly distributed in (−π, π).
Check whether X(t) is stationary or not. Also find the first and second moments of
the process.

Solution: Given: X(t) = Random process = cos(ωt + θ),
where ω is a constant,
θ = a uniformly distributed r.v. defined in (−π, π),
f(θ) = p.d.f. of θ = 1/(π − (−π)) = 1/(2π).

First moment of the random process:

E[X(t)] = ∫_{−π}^{π} cos(ωt + θ) · (1/2π) dθ

        = (1/2π) [sin(ωt + θ)]_{−π}^{π}

        = (1/2π) { sin(ωt + π) − sin(ωt − π) }

        = (1/2π) { −sin ωt + sin ωt }          [∵ sin(ωt + π) = −sin ωt, sin(ωt − π) = −sin ωt]

        = 0 = a constant, not a function of t

∴ First moment of the random process X(t): E[X(t)] = 0.

Second moment of the random process:

E[X²(t)] = E[cos²(ωt + θ)]

         = E[ (1 + cos 2(ωt + θ)) / 2 ]

         = (1/2) E[1] + (1/2) E[cos(2ωt + 2θ)]          [∵ E[constant] = constant]

         = 1/2 + (1/2) ∫_{−π}^{π} cos(2ωt + 2θ) · (1/2π) dθ

         = 1/2 + (1/4π) [ sin(2ωt + 2θ) / 2 ]_{−π}^{π}

         = 1/2 + (1/8π) { sin(2ωt + 2π) − sin(2ωt − 2π) }

         = 1/2 + (1/8π) { sin 2ωt − sin 2ωt }          [∵ sin(2π + A) = sin A, sin(A − 2π) = sin A]

         = 1/2

∴ Second moment of the random process X(t): E[X²(t)] = 1/2.
∴ Both the first and the second moment are free of t, so X(t) is stationary with respect to
the first and second moments.

Note: Variance of the random process X(t) = E[X²(t)] − {E[X(t)]}² = 1/2 − (0)² = 1/2.
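
The two moments found above can be verified numerically. The sketch below (our own check, with an arbitrary value of ω) estimates E[X(t)] and E[X²(t)] at several times t; both stay near 0 and 1/2 respectively.

```python
import numpy as np

rng = np.random.default_rng(2)
omega = 3.0                                        # any fixed constant ω (illustrative)
theta = rng.uniform(-np.pi, np.pi, size=200_000)   # θ uniform on (−π, π)

for t in (0.0, 0.7, 2.5):
    x = np.cos(omega * t + theta)
    print(f"t = {t}: E[X(t)] ~ {x.mean():+.4f} (theory 0), "
          f"E[X^2(t)] ~ {(x**2).mean():.4f} (theory 0.5)")
```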

3 Auto correlation of the random process and its properties
Auto correlation of the random process {X(t)} is denoted by RXX (t, t + τ ) (or) RXX (t1 , t2 )
and is defined as
RXX (t, t + τ ) = E[X(t).X(t + τ )]
(or)
RXX (t1 , t2 ) = E[X(t1 ).X(t2 )]
Properties of RXX (t, t + τ )
1. The mean square value of the random process may be obtained from the
Auto correlation function RXX (τ ), by putting τ = 0.
Proof: W.K.T., RXX (τ ) = E{X(t)X(t + τ )}
RXX (0) = E{X(t)X(t + 0)}
= E{X(t)X(t)}
= E{X 2 (t)}
i.e., RXX (0) is the mean square value.
i.e., 2nd moment of the random process.
2. RXX (τ ) is an even function of τ i.e., RXX (τ ) = RXX (−τ ).
Proof : W.K.T. RXX (τ ) = E{X(t)X(t + τ )}
RXX (−τ ) = E{X(t)X(t − τ )}
Put t − τ = P ⇒ t = P + τ

RXX (−τ ) = E{X(P + τ )X(P )}


= E{X(P )X(P + τ )}
= RXX (τ )

∴ RXX (τ ) is an even function.


3. The maximum value of RXX (τ ) is attained at the point τ = 0
i.e., |RXX (τ )| ≤ RXX (0).
Proof : Consider E{[X(t1) ± X(t2)]²} ≥ 0

E[X²(t1) + X²(t2) ± 2X(t1)X(t2)] ≥ 0
E[X²(t1)] + E[X²(t2)] ± 2E[X(t1)X(t2)] ≥ 0
RXX(0) + RXX(0) ± 2RXX(t1, t2) ≥ 0          [∵ RXX(0) = E[X²(t)]]
2RXX(0) ± 2RXX(τ) ≥ 0
∴ 2RXX(0) ≥ |2RXX(τ)|
i.e., RXX(0) ≥ |RXX(τ)|

4. If a random process X(t) has no periodic components and if X(t) is of non-zero
mean, then lim_{|τ|→∞} RXX(τ) = [E(X)]².

Proof : We have RXX(t1, t2) = E[X(t1) · X(t2)] = E[X1 · X2], where t2 = t1 + τ.
If there are no periodic components, then as |τ| → ∞, X1 and X2 can be considered
independent.

∴ RXX(τ) = E[X1] · E[X2]

Since X1 and X2 are from the same random process at two different time instants,
E[X1] = E[X2] = E[X].

∴ lim_{|τ|→∞} RXX(τ) = [E(X)]²

Note : If the random variables have zero mean, then the auto correlation function tends
to zero, i.e.,

lim_{|τ|→∞} RXX(t1, t2) = lim_{|τ|→∞} E[X(t1) · X(t1 + τ)] = E[X(t1)] E[X(t1 + τ)] = 0

5. If X(t) is periodic, then its auto correlation function is also periodic.


Proof : Let T0 be the period of X(t).

Consider R(τ ± T0 ) = E[X(t).X(t + τ ± T0 )]


= E[X(t).X(t + τ )][∵ T0 is period of X(t)]
= R(τ ).

Hence R(τ ) is a periodic function.

6. If the random process Z(t) = X(t) + Y (t) where X(t) and Y (t) are random processes,
then RZZ (τ ) = RXX (τ ) + RY Y (τ ) + RXY (τ ) + RY X (τ ).
Proof :

Consider RZZ (τ )
= E[Z(t).Z(t + τ )]
= E{[X(t) + Y (t)] · [X(t + τ ) + Y (t + τ )]}
= E{[X(t) · X(t + τ ) + X(t) · Y (t + τ ) + Y (t) · X(t + τ ) + Y (t) · Y (t + τ )]}
= E[X(t) · X(t+τ )]+E[X(t) · Y (t+τ )]+E[Y (t) · X(t+τ )]+E[Y (t) · Y (t+τ )]
= RXX (τ ) + RXY (τ ) + RY X (τ ) + RY Y (τ )
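
Properties 1–3 can be checked numerically by estimating RXX(t, t + τ) as an ensemble average over many realisations. The sketch below (illustrative only; the cosine process with a uniform phase and the chosen constants are our assumptions) shows that the estimate is largest at τ = 0 and is symmetric in τ.

```python
import numpy as np

rng = np.random.default_rng(3)
omega = 2.0
theta = rng.uniform(0, 2 * np.pi, size=100_000)   # one random phase per realisation

def R_hat(t, tau):
    """Ensemble estimate of R_XX(t, t + tau) for X(t) = cos(omega*t + theta)."""
    return np.mean(np.cos(omega * t + theta) * np.cos(omega * (t + tau) + theta))

t0 = 1.3
for tau in (0.0, 0.5, -0.5, 2.0, -2.0):
    print(f"tau = {tau:+.1f}: R_hat = {R_hat(t0, tau):+.4f}")
# The largest value occurs at tau = 0, and R_hat(tau) ~ R_hat(-tau), as the properties assert.
```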

4 Wide (Weakly) sense stationary process


A random process X(t) is said to be a Wide sense stationary (WSS) process if its mean is
a constant and its auto correlation depends only on the time difference, i.e.,

(1) E[X(t)] = constant

(2) RXX(t, t + τ) = E[X(t) · X(t + τ)] = RXX(τ) = a function of τ only
    (or)
    RXX(t1, t2) = E[X(t1) · X(t2)] = a function of the time difference (t2 − t1) (or) (t1 − t2)

Note: A Wide sense stationary process is also called a Weak sense stationary process (W.S.S.).

4.1 Problems of Wide sense stationary (WSS) process


Show that the random process X(t) = A cos(ωt + θ) is W.S.S. (or covariance
stationary) if A and ω are constants and θ is a uniformly distributed random variable
in (0, 2π).

Solution: Given: X(t) = Random process = A cos(ωt + θ),
where A and ω are constants,
θ = a uniformly distributed r.v. defined in (0, 2π),
f(θ) = p.d.f. of θ = 1/(2π − 0) = 1/(2π).

To show X(t) is wide sense stationary, we have to prove the following:

(i) Mean of X(t) = E[X(t)] = a constant      (1)
(ii) Autocorrelation of X(t) = E[X(t) · X(t + τ)] = a function of τ      (2)

Now, to prove (1):

E[X(t)] = ∫_{0}^{2π} A cos(ωt + θ) · (1/2π) dθ

        = (A/2π) [sin(ωt + θ)]_{0}^{2π}

        = (A/2π) { sin(ωt + 2π) − sin ωt }

        = (A/2π) { sin ωt − sin ωt }          [∵ sin(ωt + 2π) = sin ωt]

        = 0 = a constant

Now, to prove (2):

RXX(t, t + τ) = E[X(t) · X(t + τ)]

= E[ A cos(ωt + θ) · A cos(ω(t + τ) + θ) ]

= (A²/2) E[ cos{(ωt + θ) − (ωt + ωτ + θ)} + cos{(ωt + θ) + (ωt + ωτ + θ)} ]
                    [∵ cos A cos B = ½ {cos(A − B) + cos(A + B)}]

= (A²/2) E[ cos ωτ + cos(2ωt + ωτ + 2θ) ]          [∵ cos(−ωτ) = cos ωτ]

= (A²/2) cos ωτ + (A²/2) E[ cos(2ωt + ωτ + 2θ) ]          [∵ E[constant] = constant]

= (A²/2) cos ωτ + (A²/2) ∫_{0}^{2π} cos(2ωt + ωτ + 2θ) · (1/2π) dθ

= (A²/2) cos ωτ + (A²/4π) [ sin(2ωt + ωτ + 2θ) / 2 ]_{0}^{2π}

= (A²/2) cos ωτ + (A²/8π) { sin(2ωt + ωτ + 4π) − sin(2ωt + ωτ) }

= (A²/2) cos ωτ + 0          [∵ sin(4π + A) = sin A]

= (A²/2) cos ωτ = a function of τ

∴ (1) and (2) are proved.
Hence X(t) is WSS.
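
As a numerical cross-check of this result (a sketch under our own choice of A, ω and sample size): the estimated mean stays near 0 and the estimated autocorrelation at a fixed lag τ is essentially the same for different starting times t, matching (A²/2) cos ωτ.

```python
import numpy as np

rng = np.random.default_rng(4)
A, omega = 2.0, 5.0                                # illustrative constants
theta = rng.uniform(0, 2 * np.pi, size=200_000)

def X(t):
    return A * np.cos(omega * t + theta)

tau = 0.4
for t in (0.0, 1.0, 3.7):
    print(f"t = {t}: E[X(t)] ~ {X(t).mean():+.4f}, "
          f"R(t, t+tau) ~ {np.mean(X(t) * X(t + tau)):.4f}")
print("theory: mean 0, R = (A^2/2) cos(omega*tau) =", A**2 / 2 * np.cos(omega * tau))
```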
A random variable Y has characteristic function φ(ω) = E(e^{iωY}), and X(t) =
cos(λt + Y). Show that X(t) is WSS if φ(1) = φ(2) = 0.

Solution: Given: X(t) = random process = cos(λt + Y),
Y = a random variable,
φ(ω) = characteristic function of Y = φ_Y(ω) = E[e^{iωY}] = E[cos ωY] + i E[sin ωY].

Also given φ(1) = 0
⇒ E[cos Y] + i E[sin Y] = 0
⇒ E[cos Y] = 0, E[sin Y] = 0      (1)

and also φ(2) = 0
⇒ E[cos 2Y] + i E[sin 2Y] = 0
⇒ E[cos 2Y] = 0, E[sin 2Y] = 0      (2)

To show X(t) is WSS, we have to prove

(i) E[X(t)] = a constant
(ii) RXX(t, t + τ) = a function of τ (or) RXX(t1, t2) = a function of the time difference.

Now, to prove (i):

E[X(t)] = E[cos(λt + Y)]
        = E[cos λt cos Y − sin λt sin Y]
        = cos λt E[cos Y] − sin λt E[sin Y]
        = cos λt (0) − sin λt (0)          [∵ by (1)]
        = 0

Now, to prove (ii):

RXX(t1, t2) = E[X(t1) · X(t2)]
= E[ cos(λt1 + Y) · cos(λt2 + Y) ]
= ½ E[ cos λ(t1 − t2) + cos(λ(t1 + t2) + 2Y) ]
= ½ cos λ(t1 − t2) + ½ E[ cos λ(t1 + t2) cos 2Y − sin λ(t1 + t2) sin 2Y ]
                    [∵ cos(A + B) = cos A cos B − sin A sin B]
= ½ cos λ(t1 − t2) + ½ { cos λ(t1 + t2) E[cos 2Y] − sin λ(t1 + t2) E[sin 2Y] }
= ½ cos λ(t1 − t2) + ½ { cos λ(t1 + t2) (0) − sin λ(t1 + t2) (0) }          [∵ by (2)]
= ½ cos λ(t1 − t2)
= ½ cos λτ          (where τ = t1 − t2 or τ = t2 − t1)

∴ (i) and (ii) are proved.
Hence X(t) is WSS.
Let the random process X(t) = B cos(50t + φ), where B and φ are independent
random variables. B is a random variable with mean 0 and variance 1, and φ
is uniformly distributed in the interval (−π, π). Find the mean and auto correlation of
the process.

Solution: Given: X(t) = random process = B cos(50t + φ),
B = a r.v. with mean 0 and variance 1, i.e.,
E[B] = 0 and Var(B) = E[B²] − [E(B)]² = 1 ⇒ E[B²] = 1          [∵ E(B) = 0]
φ = a r.v. uniformly distributed in (−π, π),
f(φ) = p.d.f. of φ = 1/(π − (−π)) = 1/(2π).

Mean of the random process X(t):

E[X(t)] = E[B cos(50t + φ)]
        = E[B] · E[cos(50t + φ)]          [∵ B and φ are independent]
        = (0) · E[cos(50t + φ)]
        = 0

Autocorrelation of the random process X(t):

E[X(t) · X(t + τ)]
= E[ B cos(50t + φ) · B cos(50(t + τ) + φ) ]
= ½ E[B²] E[ cos(50t + φ − 50t − 50τ − φ) + cos(50t + φ + 50t + 50τ + φ) ]
                    [∵ B and φ are independent, cos A cos B = ½ {cos(A − B) + cos(A + B)}]
= ½ E[ cos(−50τ) + cos(100t + 50τ + 2φ) ]          [∵ E(B²) = 1]
= ½ cos 50τ + ½ E[ cos(100t + 50τ + 2φ) ]          [∵ cos(−50τ) = cos 50τ]
= ½ cos 50τ + ½ ∫_{−π}^{π} cos(100t + 50τ + 2φ) · (1/2π) dφ
= ½ cos 50τ + (1/4π) [ sin(100t + 50τ + 2φ) / 2 ]_{−π}^{π}
= ½ cos 50τ + (1/8π) { sin(100t + 50τ + 2π) − sin(100t + 50τ − 2π) }
= ½ cos 50τ + (1/8π) { sin(100t + 50τ) − sin(100t + 50τ) }          [∵ sin(A ± 2π) = sin A]
= ½ cos 50τ + 0
= ½ cos 50τ
Note : In the above problem, X(t) is WSS.
Show that the process X(t) = A cos λt + B sin λt is WSS, where A and B
are random variables, if
(a) E(A) = E(B) = 0, (b) E(A²) = E(B²), (c) E(AB) = 0.

Solution: Given: X(t) = random process = A cos λt + B sin λt,
where A and B are random variables with
(a) E(A) = E(B) = 0
(b) E(A²) = E(B²) = σ², say
(c) E(AB) = 0

To show X(t) is WSS, we have to prove

(i) E[X(t)] = a constant
(ii) RXX(t, t + τ) = a function of τ (or) RXX(t1, t2) = a function of the time difference.

Now, to prove (i):

E[X(t)] = E[A cos λt + B sin λt]
        = E[A] cos λt + E[B] sin λt
        = (0) cos λt + (0) sin λt          [∵ by (a)]
        = 0

Now, to prove (ii):

RXX(t1, t2) = E[X(t1) · X(t2)]
= E[ (A cos λt1 + B sin λt1) · (A cos λt2 + B sin λt2) ]
= E[ A² cos λt1 cos λt2 + AB cos λt1 sin λt2 + AB sin λt1 cos λt2 + B² sin λt1 sin λt2 ]
= E[A²] cos λt1 cos λt2 + E[AB] cos λt1 sin λt2 + E[AB] sin λt1 cos λt2 + E[B²] sin λt1 sin λt2
= σ² cos λt1 cos λt2 + 0 + 0 + σ² sin λt1 sin λt2          [∵ by (b) and (c)]
= σ² [cos λt1 cos λt2 + sin λt1 sin λt2]
= σ² cos λ(t1 − t2)          [∵ cos(A − B) = cos A cos B + sin A sin B]
= σ² cos λτ
= a function of the time difference

∴ (i) and (ii) are proved.
Hence X(t) is WSS.
If X(t) is a WSS process with auto correlation R(τ) = A e^{−α|τ|}, find the second
order moment of the random variable X(8) − X(5).

Solution: Given: X(t) = WSS random process,
R(τ) = autocorrelation of X(t) = E[X(t) · X(t + τ)] = A e^{−α|τ|}, τ = time difference.

Second order moment of the r.v. X(8) − X(5):

E[{X(8) − X(5)}²] = E[X²(8) − 2X(8) · X(5) + X²(5)]
                  = E[X(8) · X(8)] + E[X(5) · X(5)] − 2E[X(8) · X(5)]
                  = A e^{−α|8−8|} + A e^{−α|5−5|} − 2A e^{−α|8−5|}          [∵ R(τ) = A e^{−α|τ|}]
                  = A + A − 2A e^{−3α}
                  = 2A (1 − e^{−3α})

4.2 Examples of WSS for discrete random variables


The process X(t) has the probability distribution, under certain conditions,

P[X(t) = n] = (at)^{n−1} / (1 + at)^{n+1},   n = 1, 2, · · ·
            = at / (1 + at),                 n = 0.

Show that it is not stationary (or evolutionary).

Solution: Given:

P[X(t) = n] = (at)^{n−1} / (1 + at)^{n+1},   n = 1, 2, · · ·
            = at / (1 + at),                 n = 0,      (1)

where X(t) is a discrete random process.
To show X(t) is not stationary, it is enough to show that

E[X^r(t)] = a function of t, for some r = 1, 2, · · ·      (2)

When r = 1:

E[X(t)] = Σ_{n=0}^{∞} n Pn(t)

        = 0 · at/(1 + at) + Σ_{n=1}^{∞} n (at)^{n−1} / (1 + at)^{n+1}          [∵ by (1)]

        = 0 + 1 · 1/(1 + at)² + 2 · (at)/(1 + at)³ + 3 · (at)²/(1 + at)⁴ + · · ·

        = [1/(1 + at)²] { 1 + 2(at/(1 + at)) + 3(at/(1 + at))² + · · · }

        = [1/(1 + at)²] { 1 − at/(1 + at) }^{−2}          [∵ 1 + 2x + 3x² + · · · = (1 − x)^{−2}]

        = [1/(1 + at)²] { 1/(1 + at) }^{−2}

        = [1/(1 + at)²] (1 + at)²

        = 1 = a constant      (3)

When r = 2:

E[X²(t)] = Σ_{n=0}^{∞} n² Pn(t)

         = 0 + Σ_{n=1}^{∞} [n(n + 1) − n] Pn(t)          [∵ by (1)]

         = Σ_{n=1}^{∞} n(n + 1) Pn(t) − Σ_{n=1}^{∞} n Pn(t)

         = Σ_{n=1}^{∞} n(n + 1) (at)^{n−1} / (1 + at)^{n+1} − 1          [∵ by (3)]

         = [1/(1 + at)²] { 1(2) + 2(3)(at/(1 + at)) + 3(4)(at/(1 + at))² + · · · } − 1

         = [2/(1 + at)²] { 1 + 3(at/(1 + at)) + 6(at/(1 + at))² + · · · } − 1

         = [2/(1 + at)²] { 1 − at/(1 + at) }^{−3} − 1          [∵ 1 + 3x + 6x² + · · · = (1 − x)^{−3}]

         = [2/(1 + at)²] { 1/(1 + at) }^{−3} − 1

         = [2/(1 + at)²] (1 + at)³ − 1

         = 2(1 + at) − 1

         = 1 + 2at

         = a function of t

⇒ X(t) is not stationary.

Note : Var[X(t)] = E[X²(t)] − {E[X(t)]}² = 1 + 2at − (1)² = 2at.
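
The closed forms E[X(t)] = 1 and E[X²(t)] = 1 + 2at can be cross-checked by truncating the series numerically (a sketch with arbitrary sample values of a and t; the helper name is ours).

```python
import numpy as np

def truncated_moments(a, t, n_max=5000):
    n = np.arange(1, n_max + 1)
    # P[X(t) = n] for n >= 1, written to avoid overflow for large n
    p = (a * t / (1 + a * t)) ** (n - 1) / (1 + a * t) ** 2
    # the n = 0 term P[X(t) = 0] = at/(1+at) contributes nothing to the moments
    return np.sum(n * p), np.sum(n ** 2 * p)

a, t = 0.5, 2.0
m1, m2 = truncated_moments(a, t)
print(f"E[X(t)] ~ {m1:.4f} (theory 1), E[X^2(t)] ~ {m2:.4f} (theory {1 + 2 * a * t})")
```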

Assume a random process defined by the sample functions X(t, s1) = cos t, X(t, s2) = −cos t,
X(t, s3) = sin t, X(t, s4) = −sin t, where the four outcomes are equally likely events.
Show that it is WSS.

Solution: Given that the sample functions X(t, s1), X(t, s2), X(t, s3) and X(t, s4) correspond to
equally likely events, so

X(t, s1) = cos t   ⇒ P[X(t, s1)] = 1/4
X(t, s2) = −cos t  ⇒ P[X(t, s2)] = 1/4
X(t, s3) = sin t   ⇒ P[X(t, s3)] = 1/4
X(t, s4) = −sin t  ⇒ P[X(t, s4)] = 1/4      (1)

To show that X(t) is WSS, we have to show

(i) E[X(t)] = a constant
(ii) RXX(t, t + τ) = a function of τ

Now, to prove (i):

E[X(t)] = Σ_{i=1}^{4} X(t, si) P[X(t, si)]
        = X(t, s1) P[X(t, s1)] + X(t, s2) P[X(t, s2)] + X(t, s3) P[X(t, s3)] + X(t, s4) P[X(t, s4)]
        = (1/4) cos t − (1/4) cos t + (1/4) sin t − (1/4) sin t          [∵ by (1)]
        = 0 = a constant      (2)

Now, to prove (ii):

RXX(t, t + τ) = E[X(t) · X(t + τ)]
= Σ_{i=1}^{4} X(t, si) · X(t + τ, si) · P[X(t, si)]
= (cos t)(cos(t + τ))(1/4) + (−cos t)(−cos(t + τ))(1/4)
  + (sin t)(sin(t + τ))(1/4) + (−sin t)(−sin(t + τ))(1/4)
= (1/4) [ 2 cos t cos(t + τ) + 2 sin t sin(t + τ) ]          [∵ by (1)]
= (1/2) [ cos t cos(t + τ) + sin t sin(t + τ) ]
= (1/2) cos(t − (t + τ))          [∵ cos(A − B) = cos A cos B + sin A sin B]
= (1/2) cos(−τ)
= (1/2) cos τ
= a function of τ      (3)

∴ (i) and (ii) are proved.
Hence X(t) is WSS.
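
Because this ensemble has only four equally likely members, the averages can also be computed exactly by direct summation; the small sketch below (our own check) reproduces the mean 0 and RXX(t, t + τ) = ½ cos τ.

```python
import numpy as np

funcs = [np.cos, lambda t: -np.cos(t), np.sin, lambda t: -np.sin(t)]   # the 4 sample functions
p = 0.25                                                               # each equally likely

def mean(t):
    return sum(p * f(t) for f in funcs)

def R(t, tau):
    return sum(p * f(t) * f(t + tau) for f in funcs)

for t in (0.0, 1.0, 2.5):
    print(f"t = {t}: mean = {mean(t):+.4f}, R(t, t+0.8) = {R(t, 0.8):.4f}, "
          f"0.5*cos(0.8) = {0.5 * np.cos(0.8):.4f}")
```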

5 Cross correlation and Jointly WSS process


Let {X(t)} and {Y(t)} be two random processes. Then the cross correlation between them
is defined as RXY(t, t + τ) = E[X(t) Y(t + τ)]; when this depends only on τ, it is written RXY(τ).

5.1 Properties of cross correlation
1. RXY(τ) and RYX(τ) are related by RXY(−τ) = RYX(τ). (In general, the cross correlation
function is not an even function of τ.)

Proof : Consider RXY(t1, t2) = E[X(t1) · Y(t2)],
i.e., RXY(τ) = E[X(t) · Y(t + τ)].

RXY(−τ) = E[X(t) · Y(t − τ)]
Put t − τ = P ⇒ t = P + τ
∴ RXY(−τ) = E[X(P + τ) Y(P)]
           = E[Y(P) X(P + τ)]
           = RYX(τ)

Note: RYX(−τ) = RXY(τ).

2. If X(t) and Y(t) are two random processes and RXX(τ) and RYY(τ) are their
respective auto correlation functions, then [RXY(τ)]² ≤ RXX(0) · RYY(0).
Proof : By the Cauchy–Schwarz inequality,

{E[X(t) Y(t + τ)]}² ≤ E[X²(t)] · E[Y²(t + τ)]
i.e., [RXY(τ)]² ≤ RXX(0) · RYY(0)          (by the property of autocorrelation)

3. If X(t) and Y(t) are two random processes, then |RXY(τ)| ≤ ½ [RXX(0) + RYY(0)].
Proof :

W.K.T. RXX(0) = E[X²(t)] and RYY(0) = E[Y²(t)]          (by the property of autocorrelation)

Both are second moments, and second moments are always non-negative. Also, the geometric
mean of two non-negative quantities is less than or equal to their arithmetic mean, i.e.,

√(RXX(0) RYY(0)) ≤ ½ [RXX(0) + RYY(0)]      (1)

W.K.T. |RXY(τ)| ≤ √(RXX(0) RYY(0))          (by property 2)
               ≤ ½ [RXX(0) + RYY(0)]          (by (1))

∴ |RXY(τ)| ≤ ½ [RXX(0) + RYY(0)]

4. If the random processes X(t) and Y(t) are independent, then


RXY (τ ) = E(X) · E(Y ).

Proof :

RXY (τ ) = E[X(t) · Y (t+τ )]


= E[X(t)] · E[Y (t+τ )] [∵ X(t) and Y (t) are independent]
= E(X) · E(Y )

5. If the random processes X(t) and Y(t) are of zero mean, then
lim_{|τ|→∞} RXY(τ) = lim_{|τ|→∞} RYX(τ) = 0.
Proof : Consider lim_{|τ|→∞} RXY(τ) = lim_{|τ|→∞} E[X(t) Y(t + τ)].
As |τ| → ∞, X(t) and Y(t + τ) can be considered independent.

∴ lim_{|τ|→∞} RXY(τ) = lim_{|τ|→∞} E[X(t)] E[Y(t + τ)] = 0

Similarly, lim_{|τ|→∞} RYX(τ) = 0.
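
Property 4 is easy to see numerically: the sketch below (our own illustration, with two independent uniform phases and arbitrarily chosen offsets) builds two independent processes with E[X] = 1 and E[Y] = 2 and finds RXY(τ) ≈ 2 = E(X) · E(Y).

```python
import numpy as np

rng = np.random.default_rng(7)
N = 200_000
theta1 = rng.uniform(0, 2 * np.pi, N)     # phase of X, independent of the phase of Y
theta2 = rng.uniform(0, 2 * np.pi, N)

X = lambda t: 1.0 + np.cos(t + theta1)    # E[X(t)] = 1
Y = lambda t: 2.0 + np.sin(t + theta2)    # E[Y(t)] = 2

t, tau = 0.5, 1.3
print(f"R_XY ~ {np.mean(X(t) * Y(t + tau)):.4f}, E[X]*E[Y] = 2.0")
```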

5.2 Jointly WSS process


Random processes X(t) and Y (t) are said to be jointly WSS, if
1. Each of X(t) and Y(t) is individually WSS.
2. The cross correlation RXY(t, t + τ) = E[X(t) · Y(t + τ)] = a function of τ.
Two random processes X(t) and Y(t) are defined by
X(t) = A cos λt + B sin λt, Y(t) = B cos λt − A sin λt, where A and B are
random variables and λ is a constant. If A and B are uncorrelated with
zero means and the same variances, prove that X(t) and Y(t) are jointly WSS.

Solution: Given: X(t) = a random process = A cos λt + B sin λt,
Y(t) = a random process = B cos λt − A sin λt, where λ is a constant.

Given that A and B are random variables with zero means, i.e.,

E[A] = 0 and E[B] = 0      (1)

Given that A and B have the same variances, i.e.,

Var(A) = Var(B)
E[A²] − [E(A)]² = E[B²] − [E(B)]²
E[A²] = E[B²] = σ², say          [∵ by (1)]      (2)

Also given that A and B are uncorrelated, i.e., Cov(A, B) = 0:

E[AB] − E[A] · E[B] = 0
E[AB] = (0) · (0) = 0          [∵ by (1)]      (3)

To show that X(t) and Y(t) are jointly WSS, we have to prove

(i) X(t) is WSS: (a) E[X(t)] = a constant, (b) RXX(t1, t2) = a function of τ
(ii) Y(t) is WSS: (a) E[Y(t)] = a constant, (b) RYY(t1, t2) = a function of τ
(iii) RXY(t1, t2) = a function of the time difference

Now (i)(a):

E[X(t)] = E[A cos λt + B sin λt]
        = E[A] cos λt + E[B] sin λt
        = (0) cos λt + (0) sin λt          [∵ by (1)]
        = 0

Now (i)(b):

RXX(t1, t2) = E[X(t1) · X(t2)]
= E[ (A cos λt1 + B sin λt1) · (A cos λt2 + B sin λt2) ]
= E[A²] cos λt1 cos λt2 + E[AB] cos λt1 sin λt2 + E[AB] sin λt1 cos λt2 + E[B²] sin λt1 sin λt2
= σ² cos λt1 cos λt2 + 0 + 0 + σ² sin λt1 sin λt2          [∵ by (2) and (3)]
= σ² [cos λt1 cos λt2 + sin λt1 sin λt2]
= σ² cos λ(t1 − t2)          [∵ cos(A − B) = cos A cos B + sin A sin B]
= σ² cos λτ
= a function of the time difference

∴ (i)(a) and (i)(b) are proved. Hence X(t) is WSS.

Similarly, we can prove (ii)(a) and (ii)(b); ∴ Y(t) is also WSS.

Now (iii):

RXY(t1, t2) = E[X(t1) · Y(t2)]
= E[ (A cos λt1 + B sin λt1) · (B cos λt2 − A sin λt2) ]
= E[AB] cos λt1 cos λt2 − E[A²] cos λt1 sin λt2 + E[B²] sin λt1 cos λt2 − E[AB] sin λt1 sin λt2
= 0 − σ² cos λt1 sin λt2 + σ² sin λt1 cos λt2 − 0          [∵ by (2) and (3)]
= −σ² [cos λt1 sin λt2 − sin λt1 cos λt2]
= −σ² sin λ(t2 − t1)          [∵ sin(A − B) = sin A cos B − cos A sin B]
= −σ² sin λτ          (where τ = t2 − t1)
= a function of the time difference

∴ (iii) is proved.
Hence X(t) and Y(t) are jointly WSS.
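
A Monte Carlo sketch of this result (our own illustration; taking A and B as independent zero-mean normal variables with equal variance satisfies conditions (1)–(3)) shows RXX(t, t + τ) ≈ σ² cos λτ and RXY(t, t + τ) ≈ −σ² sin λτ, independently of t.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, sigma, N = 2.0, 1.5, 200_000                  # illustrative constants
A = rng.normal(0, sigma, N)                        # zero mean, variance sigma^2
B = rng.normal(0, sigma, N)                        # independent of A, same variance

X = lambda t: A * np.cos(lam * t) + B * np.sin(lam * t)
Y = lambda t: B * np.cos(lam * t) - A * np.sin(lam * t)

tau = 0.6
for t in (0.0, 1.0, 4.2):
    print(f"t = {t}: Rxx ~ {np.mean(X(t) * X(t + tau)):.4f}, "
          f"Rxy ~ {np.mean(X(t) * Y(t + tau)):+.4f}")
print("theory: Rxx =", sigma**2 * np.cos(lam * tau),
      ", Rxy =", -sigma**2 * np.sin(lam * tau))
```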

6 Ergodicity
When we wish to take a measurement of a variable quantity in the laboratory, we
usually obtain multiple measurements of the variable and average them to reduce
measurement errors. If the value of the variable being measured is constant and errors
are due to disturbances (noise) or due to the instability of the measuring instrument,
then averaging is, in fact, a valid and useful technique. ‘Time averaging’ is an extension
of this concept, which is used in the estimation of various statistics of random processes.
We normally use ensemble averages (or statistical averages) such as the mean and
autocorrelation function for characterising random processes. To estimate ensemble
averages, one has to compute a weighted average over all the member functions of the
random process.
For example, the ensemble mean of a discrete random process {X(t)} is computed by
the formula µX = Σ_i xi pi.
If we have access only to a single sample function of the process, then we use its
time-average to estimate the ensemble averages of the process.
Time-average: If {X(t)} is a random process, then

(1/2T) ∫_{−T}^{T} X(t) dt

is called the time-average of {X(t)} over (−T, T) and is denoted by X̄_T.

In general, ensemble averages and time averages are not equal except for a very special
class of random processes called ergodic processes. The concept of ergodicity deals
with the equality of time averages and ensemble averages.

Ergodic: A random process {X(t)} is said to be ergodic, if its ensemble averages are
equal to appropriate time averages. This definition implies that, with probability 1,
any ensemble average of {X(t)} can be determined from a single sample function of
{X(t)}.
Note: Ergodicity is a stronger condition than stationarity, so not every stationary random
process is ergodic. Moreover, ergodicity is usually defined with respect to one or more
ensemble averages (such as the mean and the autocorrelation function) as discussed below,
and a process may be ergodic with respect to one ensemble average but not others.
Mean-ergodic Process: If the random process {X(t)} has a constant mean E{X(t)} = µ
and if

X̄_T = (1/2T) ∫_{−T}^{T} X(t) dt → µ as T → ∞,

then {X(t)} is said to be mean-ergodic.
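
The cosine process with a uniform random phase is a standard example of a mean-ergodic process. The sketch below (our own numerical illustration; the constants are arbitrary) picks one outcome θ, i.e., one sample function, and shows its time average approaching the ensemble mean 0 as T grows.

```python
import numpy as np

rng = np.random.default_rng(6)
omega = 2.0
theta = float(rng.uniform(0, 2 * np.pi))          # one fixed outcome -> one sample function

def time_average(T, n=200_000):
    t = np.linspace(-T, T, n)
    # uniform grid, so the mean of the samples approximates (1/2T) * integral of X(t) dt
    return np.cos(omega * t + theta).mean()

for T in (10, 100, 1000):
    print(f"T = {T}: time average = {time_average(T):+.6f}  (ensemble mean is 0)")
```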

6.1 Mean-Ergodic Theorem


Statement: If {X(t)} is a random process with constant mean µ and if

X̄_T = (1/2T) ∫_{−T}^{T} X(t) dt,

then {X(t)} is mean-ergodic (or ergodic in the mean), provided

lim_{T→∞} Var(X̄_T) = 0.

Proof:

X̄_T = (1/2T) ∫_{−T}^{T} X(t) dt

∴ E(X̄_T) = (1/2T) ∫_{−T}^{T} E{X(t)} dt = (1/2T) ∫_{−T}^{T} µ dt = µ      (1)

By Tchebycheff's inequality,

P( |X̄_T − E(X̄_T)| ≤ ε ) ≥ 1 − Var(X̄_T)/ε²      (2)

Taking limits as T → ∞ and using (1), we get

P( lim_{T→∞} |X̄_T − µ| ≤ ε ) ≥ 1 − lim_{T→∞} Var(X̄_T)/ε²

∴ When lim_{T→∞} Var(X̄_T) = 0, (2) becomes

P( lim_{T→∞} |X̄_T − µ| ≤ ε ) ≥ 1, i.e., the probability equals 1,

i.e., lim_{T→∞} X̄_T = E{X(t)} = µ with probability 1.

Note: This theorem provides a sufficient condition for the mean-ergodicity of a random
process. That is, to prove the mean-ergodicity of {X(t)}, it is enough to prove that
lim_{T→∞} Var(X̄_T) = 0.
