RVSP Unit 3
UNIT-III
STOCHASTIC PROCESS-TEMPORAL CHARACTERISTICS
INTRODUCTION
A random variable is a real-valued function that assigns numerical values to the
outcomes of a physical experiment. If time is added to a random variable, the result
is called a random process.
Random processes are used to describe the time-varying nature of random variables.
They describe the statistical behavior of various real-time signals such as speech,
noise, atmospheric signals, etc.
Random processes are denoted by X(t, s) or X(t). If time is fixed, i.e., if any
specific time instant is taken, then the random process becomes a random variable.
The first order distribution function of a random process is defined as
F_X(x₁; t₁) = P{X(t₁) ≤ x₁}
Similarly, the first order density function of a random process is
f_X(x₁; t₁) = ∂F_X(x₁; t₁)/∂x₁
For two random variables at time instants t₁ and t₂, X(t₁) = X₁ and X(t₂) = X₂,
the second order distribution (joint distribution) function of a random process is
defined as
F_X(x₁, x₂; t₁, t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂}
and the corresponding second order density function is
f_X(x₁, x₂; t₁, t₂) = ∂²F_X(x₁, x₂; t₁, t₂) / (∂x₁ ∂x₂)
Extending to n time instants, the nth order density function is
f_X(x₁, x₂, …, x_n; t₁, t₂, …, t_n) = ∂ⁿF_X(x₁, x₂, …, x_n; t₁, t₂, …, t_n) / (∂x₁ ∂x₂ … ∂x_n)
First order Stationary Process
A random process X(t) is said to be first order stationary if its first order density
function does not change with a shift in the time origin.
f_X(x₁; t₁) = f_X(x₁; t₁ + Δ)
Whenever a random process is first order stationary, its average value or mean
is constant over time.
E[X(t₁)] = E[X(t₁ + Δ)] = E[X(t₂)] = constant
Second order Stationary Process
A random process X(t) is said to be second order stationary if its second order
density function does not change with time.
Let E[X(t₁) X(t₂)] denote the correlation between two random variables
X₁ and X₂ taken at time instants t₁ and t₂. With t₁ = t and t₂ = t + τ,
R_XX(t, t + τ) = E[X(t) X(t + τ)] = R_XX(τ)
If the mean is constant and this autocorrelation function depends only on the time
difference τ, i.e., is independent of absolute time, such a random process is called
a second order (wide sense) stationary process.
E[X(t)] = constant
R_XX(t, t + τ) = R_XX(τ)
nth order Stationary Process
A random process X(t) is said to be nth order stationary if its nth order density
function does not change with a shift in the time origin.
f_X(x₁, x₂, …, x_n; t₁, t₂, …, t_n) = f_X(x₁, x₂, …, x_n; t₁ + Δ, t₂ + Δ, …, t_n + Δ)
A random process that is stationary to all orders n is called a strict sense
stationary process.
TIME AVERAGES:
A random process is also characterized by time averages along with
statistical averages. The statistical average of a random process is calculated by
considering all sample functions at a given time instant.
Time averages are calculated along any one sample function. The time average
operator is defined as
A[·] = lim_{T→∞} (1/2T) ∫₋T^T [·] dt
Here A is used to denote a time average in a manner analogous to E for the statistical
average.
The time average (mean) of a sample function is
x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫₋T^T x(t) dt
and the time autocorrelation function is
ℜ_xx(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/2T) ∫₋T^T x(t) x(t + τ) dt
The time autocorrelation function measures the similarity between the values of a
single sample function at two time instants, while the time cross correlation
function measures the similarity between two different random processes.
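As a quick numerical illustration, the limiting integrals above can be approximated by Riemann sums over a finite window. The sketch below assumes the sample function x(t) = cos(ω₀t) with ω₀ = 2π, chosen purely for illustration, and estimates A[x(t)] and ℜ_xx(0):

```python
import math

def time_average(x, T, dt):
    """Approximate A[x] = lim (1/2T) * integral of x(t) over [-T, T] by a Riemann sum."""
    n = round(2 * T / dt)
    return sum(x(-T + k * dt) for k in range(n)) * dt / (2 * T)

def time_autocorrelation(x, tau, T, dt):
    """Approximate R_xx(tau) = lim (1/2T) * integral of x(t) x(t+tau) over [-T, T]."""
    n = round(2 * T / dt)
    return sum(x(-T + k * dt) * x(-T + k * dt + tau) for k in range(n)) * dt / (2 * T)

# Assumed sample function for the illustration: x(t) = cos(w0 t), w0 = 2*pi.
w0 = 2 * math.pi

def x(t):
    return math.cos(w0 * t)

T, dt = 50.0, 1e-3   # finite window standing in for T -> infinity

x_bar = time_average(x, T, dt)             # ≈ 0  (a cosine averages to zero)
r0 = time_autocorrelation(x, 0.0, T, dt)   # ≈ 1/2 (mean of cos^2 is 1/2)
print(x_bar, r0)
```

For a window spanning an integer number of periods the sums reproduce the analytical values almost exactly; for arbitrary T the error decays like 1/T.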
Ergodic Theorem
This theorem states that all time averages x̄ and ℜ_xx(τ) of a random process
are equal to the corresponding statistical averages X̄ and R_XX(τ).
E[X(t)] = A[x(t)] and E[X(t) X(t + τ)] = A[x(t) x(t + τ)]
i.e., X̄ = x̄ and R_XX(τ) = ℜ_xx(τ)
A random process is said to be mean ergodic (or ergodic in the mean) if the time
average of x(t) is equal to the statistical average of X(t):
E[X(t)] = A[x(t)], i.e., X̄ = x̄
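A minimal numerical check of mean ergodicity, assuming the classic random-phase process X(t) = cos(ω₀t + Θ) with Θ uniform on (0, 2π) and ω₀ = 2π picked only for illustration: the ensemble mean over many sample functions at one fixed t, and the time average of a single sample function, should both come out near zero.

```python
import math
import random

random.seed(0)
w0 = 2 * math.pi          # assumed frequency, illustration only
TWO_PI = 2 * math.pi

# Statistical (ensemble) average: many sample functions, one fixed time t.
t_fixed = 0.3
N = 200_000
ensemble_mean = sum(math.cos(w0 * t_fixed + random.uniform(0, TWO_PI))
                    for _ in range(N)) / N

# Time average: one sample function (one fixed theta) over a long window.
theta = random.uniform(0, TWO_PI)
T, dt = 100.0, 1e-3
n = round(2 * T / dt)
time_mean = sum(math.cos(w0 * (-T + k * dt) + theta)
                for k in range(n)) * dt / (2 * T)

print(ensemble_mean, time_mean)   # both ≈ 0, as mean ergodicity requires
```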
Let X(t) be a wide sense stationary process with X(t₁) = X₁ and X(t₂) = X₂.
Taking t₁ = t and t₂ = t₁ + τ = t + τ, the autocorrelation function is
R_XX(τ) = E[X(t) X(t + τ)].
7. If X(t) is ergodic, has zero mean, and has no periodic components, then the
autocorrelation function satisfies
lim_{|τ|→∞} R_XX(τ) = 0
Proof:
We know that
R_XX(τ) = E[X(t) X(t + τ)]
Since the process has no periodic components, as |τ| → ∞ the random variables
X(t) and X(t + τ) become independent, so
lim_{|τ|→∞} R_XX(τ) = E[X(t)] E[X(t + τ)] = X̄² = 0 (since the mean is zero)
8. Let there be a random process W(t) such that W(t) = X(t) + Y(t). Then the
autocorrelation function of the sum of the random processes is
𝑅𝑊𝑊 (𝜏) = 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Given 𝑤 (𝑡) = 𝑋(𝑡) + 𝑌 (𝑡)
𝑅𝑊𝑊 (𝜏) = 𝐸 [𝑊 (𝑡) 𝑊(𝑡 + 𝜏)]
= 𝐸[(𝑋(𝑡) + 𝑌(𝑡)) (𝑋(𝑡 + 𝜏) + 𝑌 (𝑡 + 𝜏))]
= E[X(t) X(t + τ)] + E[X(t) Y(t + τ)] + E[Y(t) X(t + τ)] + E[Y(t) Y(t + τ)]
𝑅𝑊𝑊 (𝜏) = 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)
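Property 8 can be checked numerically. The sketch below assumes toy processes X(t) = cos(ω₀t + Θ) and Y(t) = sin(ω₀t + Θ) sharing one random phase (so the cross terms are non-zero); all five correlations are estimated from the same draws, and the decomposition holds sample by sample.

```python
import math
import random

random.seed(1)
w0, t, tau = 2.0, 0.7, 0.4   # assumed values for this numeric check
N = 50_000

rxx = rxy = ryx = ryy = rww = 0.0
for _ in range(N):
    th = random.uniform(0, 2 * math.pi)              # one shared random phase
    x1, x2 = math.cos(w0 * t + th), math.cos(w0 * (t + tau) + th)
    y1, y2 = math.sin(w0 * t + th), math.sin(w0 * (t + tau) + th)
    w1, w2 = x1 + y1, x2 + y2                        # W(t) = X(t) + Y(t)
    rxx += x1 * x2
    rxy += x1 * y2
    ryx += y1 * x2
    ryy += y1 * y2
    rww += w1 * w2

rxx, rxy, ryx, ryy, rww = (v / N for v in (rxx, rxy, ryx, ryy, rww))
# R_WW(tau) equals R_XX(tau) + R_XY(tau) + R_YX(tau) + R_YY(tau) up to rounding
print(abs(rww - (rxx + rxy + ryx + ryy)))
```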
Consider two random processes X(t) and Y(t) that are at least jointly wide sense
stationary. Then the cross correlation function satisfies
R_XY(−τ) = R_YX(τ)
Proof:
We know that
R_XY(τ) = E[X(t) Y(t + τ)]
Replacing τ by −τ,
R_XY(−τ) = E[X(t) Y(t − τ)]
Let t − τ = u, so that t = u + τ. Then
R_XY(−τ) = E[X(u + τ) Y(u)] = E[Y(u) X(u + τ)] = R_YX(τ)
The cross correlation function also satisfies |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)).
Proof:
Since a squared quantity is non-negative,
[Y(t + τ) ± α X(t)]² ≥ 0
E[Y(t + τ) ± α X(t)]² ≥ 0
Expanding the expectation,
α² R_XX(0) ± 2α R_XY(τ) + R_YY(0) ≥ 0
This is a quadratic in α of the form aα² + bα + c, whose roots are given as
(−b ± √(b² − 4ac)) / 2a. Since the quadratic never becomes negative, it cannot have
two distinct real roots, so b² − 4ac ≤ 0:
4 R_XY²(τ) − 4 R_XX(0) R_YY(0) ≤ 0
∴ |R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
4. For two random processes X(t) and Y(t) that have non-zero means and are
statistically independent,
R_XY(τ) = X̄ Ȳ
Proof:
We know that
R_XY(τ) = E[X(t) Y(t + τ)]
As X(t) and Y(t) are statistically independent, the expectation factors:
R_XY(τ) = E[X(t)] E[Y(t + τ)] = X̄ Ȳ
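A quick Monte Carlo check of this property, under an assumed toy pair of independent processes X(t) = A and Y(t) = B (sample functions constant in time, with A ~ U(0,1) and B ~ U(1,3)), for which X̄Ȳ = 0.5 · 2 = 1:

```python
import random

random.seed(2)
N = 200_000
# Assumed toy pair: X(t) = A with A ~ U(0,1) and Y(t) = B with B ~ U(1,3),
# A and B statistically independent, so R_XY(tau) should equal 0.5 * 2 = 1.
r_xy = sum(random.uniform(0, 1) * random.uniform(1, 3) for _ in range(N)) / N
print(r_xy)   # ≈ 1.0
```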
COVARIANCE FUNCTION
The auto covariance function of a random process X(t) is defined as
C_XX(t, t + τ) = R_XX(t, t + τ) − E[X(t)] E[X(t + τ)]
For a wide sense stationary process this reduces to C_XX(τ) = R_XX(τ) − X̄².
Similarly, the cross covariance function of two processes X(t) and Y(t) is
C_XY(t, t + τ) = R_XY(t, t + τ) − E[X(t)] E[Y(t + τ)]
DESCRIPTIVE QUESTIONS
1. Explain numerous categories of random processes with examples.
2. Explain stationarity of random processes.
3. Interpret ergodic random processes.
4. Interpret the significance of time averages and ergodicity.
5. Choose necessary expressions to verify the properties of Auto correlation
function.
6. Choose relevant expressions to verify the properties of cross correlation
function.
7. Interpret the concepts of covariance with relevance to random processes.
PROBLEMS
1. A random process is described by X(t) = A, where A is a continuous random
variable and is uniformly distributed on (0,1). Show that X(t) is wide sense
stationary.
2. Verify whether the random process X(t) = A cos(ω₀t + θ), where A and ω₀ are
constants and θ is a uniform random variable on (0, 2π), is wide sense stationary.
10. Show that the random processes X(t) = A cos(ω₁t + θ) and Y(t) = B cos(ω₂t + Φ)
are jointly WSS, where A, B, ω₁ and ω₂ are constants, while Φ and θ are
statistically independent uniform random variables on (0, 2π).
Sol:
E[X(t)] = E[A] = ∫₀¹ A f_A(A) dA = ∫₀¹ A dA = [A²/2]₀¹ = 1/2 = constant
R_XX(t, t + τ) = E[X(t) X(t + τ)]
= ∫₋∞^∞ x(t) x(t + τ) f_A(A) dA
= ∫₀¹ A · A dA
= ∫₀¹ A² dA
= [A³/3]₀¹ = 1/3, independent of time
Since the mean is constant and the autocorrelation is independent of time, X(t) is
wide sense stationary.
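The result above can be sanity-checked by simulation: draw many values of A and estimate the mean and the autocorrelation (a sketch; the tolerances are loose Monte Carlo bounds).

```python
import random

random.seed(3)
N = 200_000
samples = [random.uniform(0, 1) for _ in range(N)]   # one A per sample function

mean = sum(samples) / N                   # estimates E[X(t)] = E[A] = 1/2
r_xx = sum(a * a for a in samples) / N    # estimates E[X(t)X(t+tau)] = E[A^2] = 1/3
print(mean, r_xx)   # ≈ 0.5 and ≈ 1/3
```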
Sol:
A random process is wide sense stationary if
E[X(t)] = constant
R_XX(t, t + τ) = R_XX(τ), independent of time
For a uniform random variable on (a, b), f_X(x) = 1/(b − a), a ≤ x ≤ b. Here θ is
uniform on (0, 2π), so
f_θ(θ) = 1/(2π − 0) = 1/(2π)
E[X(t)] = ∫₀^2π A cos(ω₀t + θ) (1/2π) dθ
= (A/2π) [sin(ω₀t + θ)]₀^2π
= (A/2π) [sin(ω₀t + 2π) − sin(ω₀t + 0)]
= (A/2π) [sin(ω₀t) − sin(ω₀t)] = 0
= constant
R_XX(t, t + τ) = E[X(t) X(t + τ)]
= (1/2π) ∫₀^2π A cos(ω₀t + θ) · A cos(ω₀(t + τ) + θ) dθ
= (1/2π) ∫₀^2π A² cos(ω₀t + θ) cos(ω₀t + θ + ω₀τ) dθ
Using cos A cos B = ½[cos(A − B) + cos(A + B)],
= (A²/4π) ∫₀^2π [cos(ω₀τ) + cos(2ω₀t + 2θ + ω₀τ)] dθ
= (A²/4π) [cos(ω₀τ) · θ]₀^2π + (A²/4π) [½ sin(2ω₀t + 2θ + ω₀τ)]₀^2π
= (A²/4π) cos(ω₀τ) · 2π + 0
= (A²/2) cos(ω₀τ), independent of time
Hence given RP is WSS.
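The derived autocorrelation (A²/2) cos(ω₀τ) can be verified by averaging over the random phase θ; evaluating the estimate at two different absolute times t should give approximately the same value. The sketch assumes A = 2, ω₀ = 3, τ = 0.5, chosen only for the check.

```python
import math
import random

random.seed(4)
A, w0, tau, N = 2.0, 3.0, 0.5, 200_000      # assumed values for this check
target = (A ** 2 / 2) * math.cos(w0 * tau)  # analytic R_XX(tau) from the derivation

def r_hat(t):
    """Monte Carlo estimate of E[X(t) X(t+tau)] over the random phase theta."""
    s = 0.0
    for _ in range(N):
        th = random.uniform(0, 2 * math.pi)
        s += A * math.cos(w0 * t + th) * A * math.cos(w0 * (t + tau) + th)
    return s / N

r_a, r_b = r_hat(0.0), r_hat(1.3)   # two different absolute times
print(r_a, r_b, target)             # all three ≈ (A^2/2) cos(w0 tau)
```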
3. Verify whether the sine wave process X(t) = B sin(ω₀t), where B is a uniform
random variable on (−1, 1), is wide sense stationary.
Sol:
A random process is wide sense stationary if
E[X(t)] = constant
R_XX(t, t + τ) = E[X(t) X(t + τ)], independent of time
Since B is uniform on (−1, 1), f_B(B) = 1/2, −1 ≤ B ≤ 1.
E[X(t)] = ∫₋∞^∞ x(t) f_B(B) dB = (sin ω₀t / 2) ∫₋₁¹ B dB
= (sin ω₀t / 2) [B²/2]₋₁¹
= (sin ω₀t / 2) [1/2 − 1/2] = 0 = constant
R_XX(t, t + τ) = E[X(t) X(t + τ)] = sin(ω₀t) sin(ω₀(t + τ)) ∫₋₁¹ B² (1/2) dB
= sin(ω₀t) sin(ω₀(t + τ)) · (1/2) [B³/3]₋₁¹
= (1/3) sin(ω₀t) sin(ω₀(t + τ))
Using sin A sin B = [cos(A − B) − cos(A + B)] / 2,
= (1/6) [cos(ω₀τ) − cos(2ω₀t + ω₀τ)]
This depends on t, so the autocorrelation is not independent of time. Hence the
given random process is not wide sense stationary.
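Since the derivation gives R_XX(t, t + τ) = (1/3) sin(ω₀t) sin(ω₀(t + τ)), the time dependence is easy to exhibit numerically (assumed values ω₀ = 2, τ = 0.3 for the check):

```python
import math

w0, tau = 2.0, 0.3   # assumed values for this check

# From the derivation: R_XX(t, t+tau) = (1/3) sin(w0 t) sin(w0 (t+tau))
def r_xx(t):
    return (1 / 3) * math.sin(w0 * t) * math.sin(w0 * (t + tau))

r1, r2 = r_xx(0.2), r_xx(1.0)
print(r1, r2)   # different values: R_XX depends on t, so X(t) is not WSS
```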
Sol: Given,
Y(t) = X(t) − X(t + τ)
where X(t) is wide sense stationary and E[X(t)] ≠ 0.
(i) Mean of Y(t):
E[Y(t)] = E[X(t) − X(t + τ)] = E[X(t)] − E[X(t + τ)] = X̄ − X̄ [∵ X(t) is WSS]
∴ E[Y(t)] = 0
(ii) Variance of Y(t):
σ_Y² = E[Y²(t)] − (E[Y(t)])²
= E[(X(t) − X(t + τ))²] − 0 [∵ E[Y(t)] = 0]
= E[X²(t)] − 2E[X(t) X(t + τ)] + E[X²(t + τ)]
= R_XX(0) − 2R_XX(τ) + R_XX(0)
∴ σ_Y² = 2[R_XX(0) − R_XX(τ)]
(iii) Given,
Y(t) = X(t) + X(t + τ)
Now,
E[Y(t)] = E[X(t) + X(t + τ)]
= E[X(t)] + E[X(t + τ)]
= E[X(t)] + E[X(t)] [∵ X(t) is WSS]
∴ E[Y(t)] = 2 E[X(t)]
Now,
σ_Y² = E[Y²(t)] − (E[Y(t)])²
= E[(X(t) + X(t + τ))²] − (2 E[X(t)])² [∵ E[Y(t)] = 2 E[X(t)]]
= R_XX(0) + R_XX(0) + 2R_XX(τ) − 4(E[X(t)])² [∵ R_XX(0) = E[X²(t)]]
∴ σ_Y² = 2[R_XX(0) + R_XX(τ)] − 4X̄²
Sol: Given X(t) = A cos(ω₀t) + B sin(ω₀t), where A and B are uncorrelated zero-mean
random variables with equal variance σ².
E(AB) = E(A) E(B)
But E(A) = E(B) = 0, so E(AB) = 0
Now,
σ_A² = E[A²] − (E[A])² = E[A²]
σ_B² = E[B²] − (E[B])² = E[B²]
∴ E[A²] = σ² and E[B²] = σ²
Mean:
E[X(t)] = E[A] cos(ω₀t) + E[B] sin(ω₀t) = 0 (since E[A] = E[B] = 0) = constant
Autocorrelation:
R_XX(t, t + τ) = E[(A cos ω₀t + B sin ω₀t)(A cos ω₀(t + τ) + B sin ω₀(t + τ))]
= E[A²] cos(ω₀t) cos(ω₀(t + τ)) + E[B²] sin(ω₀t) sin(ω₀(t + τ))
  + E[AB][cos(ω₀t) sin(ω₀(t + τ)) + sin(ω₀t) cos(ω₀(t + τ))]
= σ² [cos(ω₀t) cos(ω₀(t + τ)) + sin(ω₀t) sin(ω₀(t + τ))] (since E[AB] = 0)
= σ² cos(ω₀τ)
= R_XX(τ), independent of time
Hence X(t) is wide sense stationary.
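This conclusion can be checked by Monte Carlo. The sketch assumes, for concreteness, that A and B are independent zero-mean Gaussians with variance σ² (any zero-mean uncorrelated pair with equal variance works); the estimated autocorrelation at two different absolute times t should both match σ² cos(ω₀τ).

```python
import math
import random

random.seed(5)
sigma, w0, tau, N = 1.5, 2.0, 0.4, 100_000   # assumed values for this sketch
target = sigma ** 2 * math.cos(w0 * tau)      # analytic R_XX(tau)

def r_hat(t):
    """Monte Carlo estimate of E[X(t) X(t+tau)] over the random amplitudes A, B."""
    s = 0.0
    for _ in range(N):
        a = random.gauss(0, sigma)   # Gaussian chosen for the sketch: E[A]=0, E[A^2]=sigma^2
        b = random.gauss(0, sigma)   # independent of A, so E[AB]=0
        x1 = a * math.cos(w0 * t) + b * math.sin(w0 * t)
        x2 = a * math.cos(w0 * (t + tau)) + b * math.sin(w0 * (t + tau))
        s += x1 * x2
    return s / N

r1, r2 = r_hat(0.0), r_hat(2.1)   # two different absolute times
print(r1, r2, target)             # both estimates ≈ sigma^2 cos(w0 tau)
```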