
CH 6: Random Processes: Temporal Characteristics

CH 7: Random Processes: Spectral Characteristics


CH 8: Linear Systems with Random Inputs

Random Processes: Temporal Characteristics
Dr. Ali Muqaibel
Probability and Random Processes for Electrical Engineers
Ver. 5



Chapter 6. Random Processes: Temporal Characteristics

0. Introduction
1. The Random Process Concept
2. Joint Distribution, Independence, and Moments
3. Stationarity and Correlation Functions
4. Gaussian Random Process
5. Poisson Random Process (up to 6.5-4; joint probability density is not included)
6. Time Averages and Ergodicity
7. Measurement of Correlation Functions
Introduction
• In real life we deal with time (or space) waveforms: desired plus undesired.
• Our progress and development rely on our ability to deal with such waveforms.
• The set of all the available functions (the "menu") is called the ensemble of the random process.
• The graph of the function X(t, s) versus t, for s fixed, is called a realization, ensemble member, or sample function of the random process.
• For each fixed tk from the index set I, X(tk, s) is a random variable.
[Figure: three sample functions X(t, s1), X(t, s2), X(t, s3) of the same random process.]


Formal Definition
• Consider a random experiment specified by the outcomes s from some sample space S and by the probabilities on its events.
• Suppose that to every outcome s ∈ S we assign a function of time according to some rule: X(t, s), t ∈ I.
• We have created an indexed family of random variables, {X(t, s), t ∈ I}.
• This family is called a random process (or stochastic process).
• We usually suppress the s and use X(t) to denote a random process.



Classification of Random Processes
• A continuous-time stochastic process is one in which the index set I is continuous (e.g., thermal noise).
• A stochastic process is said to be discrete-time (a random sequence) if the index set I is a countable set (e.g., the set of integers or the set of nonnegative integers); we write X(nT) or X[n].
• Combining this with the alphabet (the set of values the process can take) gives four classes: continuous-alphabet random process, discrete-alphabet random process, continuous-alphabet random sequence, and discrete-alphabet random sequence.
Deterministic and Nondeterministic Processes
• Nondeterministic: future values cannot be predicted from current ones. Most random processes are nondeterministic.
• Deterministic: future values of any sample function can be predicted from its past, for example:
   X(t, s) = s cos(2πt), −∞ < t < ∞ (a sinusoid with random amplitude s in [−1, 1])
   Y(t, s) = cos(2πt + s) (a sinusoid with random phase s in (−π, +π))
6.2 Distribution and Density Functions
• A r.v. is fully characterized by a pdf or CDF. How do we characterize random processes?
▪ To fully define a random process, we need the Nth-order joint density function.
• Distribution and density functions:
• First order:
▪ F_X(x1; t1) = P[X(t1) ≤ x1]
• Second-order joint distribution function:
▪ F_X(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2]
• Nth-order joint distribution function:
▪ F_X(x1, …, xN; t1, …, tN) = P[X(t1) ≤ x1, …, X(tN) ≤ xN]
▪ f_X(x1, …, xN; t1, …, tN) = ∂^N F_X(x1, …, xN; t1, …, tN) / (∂x1 … ∂xN)


Correlation & Covariance (Auto & Cross)
Moments: mean m_X(t) = E[X(t)]
Autocorrelation function of X(t):
   R_XX(t1, t2) = E[X(t1)X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_X(x1, x2; t1, t2) dx1 dx2
Autocovariance function of X(t):
   C_XX(t1, t2) = E[(X(t1) − E[X(t1)])(X(t2) − E[X(t2)])] = R_XX(t1, t2) − E[X(t1)]E[X(t2)]
Cross-correlation function of X(t) & Y(t):
   R_XY(t1, t2) = E[X(t1)Y(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 y1 f_XY(x1, y1; t1, t2) dx1 dy1
Cross-covariance function of X(t) & Y(t):
   C_XY(t1, t2) = E[(X(t1) − E[X(t1)])(Y(t2) − E[Y(t2)])] = R_XY(t1, t2) − E[X(t1)]E[Y(t2)]
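As a numerical aside (a sketch of mine, not from the slides): these ensemble moments can be estimated by averaging across many realizations at fixed times. The sketch reuses the deterministic-process example X(t, s) = s cos(2πt) with s uniform on [−1, 1], for which E[s²] = 1/3 and hence R_XX(t1, t2) = (1/3) cos(2πt1) cos(2πt2).

% Ensemble estimates of m_X(t) and R_XX(t1,t2) for X(t,s) = s*cos(2*pi*t)
M = 1e5;                        % number of realizations
s = -1 + 2*rand(M,1);           % random amplitudes, uniform on [-1,1]
t1 = 0.1; t2 = 0.3;             % two fixed observation times
x1 = s*cos(2*pi*t1);            % samples of X(t1) across the ensemble
x2 = s*cos(2*pi*t2);            % samples of X(t2)
mhat = mean(x1);                % estimate of m_X(t1), theory: 0
Rhat = mean(x1.*x2);            % estimate of R_XX(t1,t2)
Rtrue = (1/3)*cos(2*pi*t1)*cos(2*pi*t2);   % E[s^2] = 1/3 for uniform [-1,1]
fprintf('m=%.4f  Rhat=%.4f  Rtrue=%.4f\n', mhat, Rhat, Rtrue);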
Independence & Correlation
• Statistical independence: X(t) & Y(t) are independent if
   f_XY(x1, …, xN, y1, …, yM; t1, …, tN, t′1, …, t′M) = f_X(x1, …, xN; t1, …, tN) f_Y(y1, …, yM; t′1, …, t′M)
• X(t) & Y(t) uncorrelated ⇔ C_XY(t1, t2) = 0
• X(t) & Y(t) orthogonal ⇔ R_XY(t1, t2) = 0
• Independence ⇒ uncorrelated (the converse does not hold in general).


Stationary Random Process
• Stationary: all statistical properties are invariant to a shift of the time origin.
• First-order stationary process:
   f_X(x1; t1) = f_X(x1; t1 + Δ); stationary to order one ⇒ E[X(t)] = X̄ = constant.
• Proof: if f_X(x1; t1) = f_X(x1; t1 + Δ), then with X1 = X(t1) and X2 = X(t2):
➢ E[X1] = E[X(t1)] = ∫_{−∞}^{+∞} x f_X(x; t1) dx
➢ E[X2] = E[X(t2)] = ∫_{−∞}^{+∞} x f_X(x; t2) dx
➢ Let t2 = t1 + Δ; then E[X2] = E[X(t2)] = ∫_{−∞}^{+∞} x f_X(x; t1 + Δ) dx = ∫_{−∞}^{+∞} x f_X(x; t1) dx
➢ Hence E[X(t1 + Δ)] = E[X(t1)] for every Δ, i.e., the mean is constant.
6.3 Stationarity: Second-Order Stationarity
Second-order stationary process (stationary to order two):
   f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ), ∀ t1, t2, Δ
By letting Δ = −t1:
   f_X(x1, x2; t1, t2) = f_X(x1, x2; 0, t2 − t1),
so the second-order density depends only on the time difference t2 − t1.

2nd-order stationary ⇒ 1st-order stationary, since marginalizing preserves the shift invariance:
   f_X(x1; t1) = ∫_{−∞}^{∞} f_X(x1, x2; t1, t2) dx2 = ∫_{−∞}^{∞} f_X(x1, x2; t1 + Δ, t2 + Δ) dx2 = f_X(x1; t1 + Δ)

2nd-order stationary ⇒ R_XX(t1, t1 + τ) = E[X(t1)X(t1 + τ)] = R_XX(τ), where τ = t2 − t1.
Levels of Stationarity
• Wide-sense stationary (W.S.S.) ⇔ E[X(t)] = X̄ = constant and E[X(t)X(t + τ)] = R_XX(τ)
• 2nd-order stationary ⇒ wide-sense stationary (the converse need not hold)
• Nth-order stationary:
   f_X(x1, …, xN; t1, …, tN) = f_X(x1, …, xN; t1 + Δ, …, tN + Δ)
• Nth-order stationary ⇒ kth-order stationary (k < N)
• Strict-sense stationary = stationary to all orders (the classes nest: strict-sense ⊂ 2nd-order ⊂ W.S.S.)
• Jointly wide-sense stationary: X(t) and Y(t) are individually w.s.s. and R_XY(t, t + τ) = E[X(t)Y(t + τ)] depends only on τ.
Example: Wide-Sense Stationary
Show that the random process X(t) = A cos(ω0 t + Θ) is wide-sense stationary if it is assumed that A and ω0 are constants and Θ is a uniformly distributed random variable on the interval (0, 2π).
• X(t) = A cos(ω0 t + Θ), A, ω0: constants
• f_Θ(θ) = (1/2π){u(θ) − u(θ − 2π)}
• E[X(t)] = ∫_0^{2π} A cos(ω0 t + θ) (1/2π) dθ = 0
• R_XX(t, t + τ) = E[X(t)X(t + τ)] = A² E[cos(ω0 t + Θ) cos(ω0 t + ω0 τ + Θ)]
• R_XX(t, t + τ) = (A²/2) E[cos(2ω0 t + ω0 τ + 2Θ) + cos(ω0 τ)]
• R_XX(t, t + τ) = (A²/2) cos(ω0 τ) + (A²/2) E[cos(2ω0 t + ω0 τ + 2Θ)] = (A²/2) cos(ω0 τ)
⇒ X(t) is w.s.s. (wide-sense stationary): the mean is constant and the autocorrelation depends only on τ.
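A quick MATLAB check of this example (a sketch; the particular numbers are my choice): estimate R_XX(t, t + τ) over an ensemble of random phases at several absolute times t and confirm it matches (A²/2) cos(ω0 τ) independently of t.

% Ensemble check that R_XX(t,t+tau) = (A^2/2)*cos(w0*tau) for any t
M = 1e5; A = 2; w0 = 2*pi; tau = 0.15;
Theta = 2*pi*rand(M,1);              % phase uniform on (0, 2*pi)
for t = [0 0.4 1.3]                  % several absolute times
    X1 = A*cos(w0*t + Theta);        % samples of X(t)
    X2 = A*cos(w0*(t + tau) + Theta);% samples of X(t+tau)
    fprintf('t=%.1f: Rhat=%.4f  theory=%.4f\n', t, mean(X1.*X2), A^2/2*cos(w0*tau));
end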


Practice
Consider the random process X(t) = 3t + b, where b is a uniformly distributed r.v. in the range (−2, 2). Determine the mean and the autocorrelation of X(t). Is X(t) wide-sense stationary? Justify your answer.


6.3 Correlation Functions
Properties of the autocorrelation function of a w.s.s. r.p., R_XX(t, t + τ) = E[X(t)X(t + τ)] = R_XX(τ):
1) |R_XX(τ)| ≤ R_XX(0)
2) R_XX(−τ) = R_XX(τ)
3) R_XX(0) = E[X²(t)]
4) Stationary & ergodic X(t) with no periodic components ⇒ lim_{|τ|→∞} R_XX(τ) = X̄²
5) Stationary X(t) has a periodic component ⇒ R_XX(τ) has a periodic component with the same period.
For additional properties, see the textbook.
Example 1: Correlation Functions
Example 6.3-1: Given the autocorrelation function of a wide-sense stationary process,
   R_XX(τ) = 25 + 4/(1 + 6τ²),
find the variance σ_X².
   lim_{|τ|→∞} R_XX(τ) = X̄² ⇒ E[X(t)] = X̄ = ±√25 = ±5
   σ_X² = E[X²(t)] − (E[X(t)])² = R_XX(0) − X̄² = 29 − 25 = 4


Example 2: Correlation Functions
• Example: Find the autocorrelation of Y(t), given that X(t) is w.s.s. with
   R_XX(τ) = e^{−a|τ|}, a > 0
   Y(t) = X(t) cos(ω0 t + Θ)
   X(t) & Θ are independent, ω0 = constant
   f_Θ(θ) = (1/2π)[u(θ + π) − u(θ − π)]
R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
   = E[X(t)X(t + τ) cos(ω0 t + Θ) cos(ω0 t + ω0 τ + Θ)]
   = E[X(t)X(t + τ)] E[cos(ω0 t + Θ) cos(ω0 t + ω0 τ + Θ)]   (X(t) & Θ independent)
   = R_XX(τ) (1/2) E[cos(2ω0 t + ω0 τ + 2Θ) + cos(ω0 τ)]
   = (1/2) R_XX(τ) cos(ω0 τ) = R_YY(τ)
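A hedged simulation sketch of this example: a first-order autoregressive sequence x[n] = ρ x[n−1] + w[n] with ρ = e^{−a} has autocorrelation e^{−a|k|}, so it can stand in for X(t) on a unit-spaced time grid; the time-averaged autocorrelation of y[n] = x[n] cos(ω0 n + θ) should then approach (1/2) e^{−a|k|} cos(ω0 k).

% AR(1) stand-in for X(t): R_xx(k) = exp(-a*|k|), then modulate by a random-phase cosine
N = 2e5; a = 0.1; rho = exp(-a); w0 = 0.5;
w = sqrt(1 - rho^2)*randn(N,1);      % drive noise scaled so that var(x) = 1
x = filter(1, [1 -rho], w);          % x[n] = rho*x[n-1] + w[n]
theta = -pi + 2*pi*rand;             % phase uniform on (-pi, pi)
n = (0:N-1)';
y = x.*cos(w0*n + theta);
k = 10;                              % lag at which to check
Rhat = mean(y(1:end-k).*y(1+k:end)); % time-average autocorrelation at lag k
fprintf('Rhat=%.4f  theory=%.4f\n', Rhat, 0.5*exp(-a*k)*cos(w0*k));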


Cross-Correlation Function and Its Properties
   R_XY(t, t + τ) = E[X(t)Y(t + τ)]
• If X and Y are jointly w.s.s., we may write R_XY(τ).
• Orthogonal processes: R_XY(t, t + τ) = 0.
• If X and Y are statistically independent:
   E[X(t)Y(t + τ)] = E[X(t)]E[Y(t + τ)]
• If, in addition to being independent, they are at least w.s.s.:
   E[X(t)Y(t + τ)] = X̄ Ȳ


Some Properties of R_XY for Jointly W.S.S. Processes
• R_XY(−τ) = R_YX(τ)
• |R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
• |R_XY(τ)| ≤ (1/2)[R_XX(0) + R_YY(0)]
• The geometric-mean bound is tighter than the arithmetic-mean bound:
   √(R_XX(0) R_YY(0)) ≤ (1/2)[R_XX(0) + R_YY(0)]


Example: Signal Plus Noise
Suppose we observe a process Y(t) = X(t) + N(t), which consists of a desired signal X(t) plus noise N(t). Find the cross-correlation between the observed signal and the desired signal, assuming that X(t) and N(t) are independent random processes.
• R_XY(t1, t2) = E[X(t1)Y(t2)]
• = E[X(t1){X(t2) + N(t2)}]
• = E[X(t1)X(t2)] + E[X(t1)N(t2)]
• = R_XX(t1, t2) + E[X(t1)]E[N(t2)]   (the third equality follows from the independence of X(t) and N(t))
• = R_XX(t1, t2) + m_X(t1) m_N(t2)
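A minimal sketch of this result at equal times t1 = t2 (my own construction): with zero-mean X(t) and independent zero-mean N(t), the m_X m_N term vanishes and the estimated cross-correlation of X and Y = X + N matches R_XX.

% Cross-correlation of Y = X + N with X, for independent zero-mean X and N
M = 2e5;
X = randn(M,1);               % desired signal samples (zero mean)
Nz = 0.8*randn(M,1);          % independent noise samples
Y = X + Nz;                   % observed process
fprintf('RXY=%.4f  RXX=%.4f\n', mean(X.*Y), mean(X.^2));  % nearly equal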


Example 6.3-3: Are X(t) and Y(t) jointly W.S.S.?
If A and B are uncorrelated zero-mean random variables with the same variance, show that X(t) & Y(t) below are W.S.S. and jointly W.S.S.
   A, B: r.v.'s, ω0 = constant
   E[A] = E[B] = 0, E[AB] = 0, E[A²] = E[B²] = σ²
   X(t) = A cos(ω0 t) + B sin(ω0 t), Y(t) = B cos(ω0 t) − A sin(ω0 t)
• E[X(t)] = E[A cos(ω0 t) + B sin(ω0 t)] = E[A] cos(ω0 t) + E[B] sin(ω0 t) = 0
• R_XX(t, t + τ) = E[X(t)X(t + τ)]
• = E[A² cos(ω0 t) cos(ω0 t + ω0 τ) + AB cos(ω0 t) sin(ω0 t + ω0 τ) + AB sin(ω0 t) cos(ω0 t + ω0 τ) + B² sin(ω0 t) sin(ω0 t + ω0 τ)]
• = σ²{cos(ω0 t) cos(ω0 t + ω0 τ) + sin(ω0 t) sin(ω0 t + ω0 τ)} = σ² cos(ω0 τ)
• ⇒ X(t) is W.S.S.


Continue Example 3: Jointly Wide-Sense Stationary
• Y(t) is also W.S.S. (by the same calculation, E[Y(t)] = 0 and R_YY(t, t + τ) = σ² cos(ω0 τ)).
• R_XY(τ) = E[X(t)Y(t + τ)]
• = E{[A cos(ω0 t) + B sin(ω0 t)][B cos(ω0(t + τ)) − A sin(ω0(t + τ))]}
• = E[AB cos(ω0 t) cos(ω0 t + ω0 τ) + B² sin(ω0 t) cos(ω0 t + ω0 τ) − A² cos(ω0 t) sin(ω0 t + ω0 τ) − AB sin(ω0 t) sin(ω0 t + ω0 τ)]
• = σ²[sin(ω0 t) cos(ω0 t + ω0 τ) − cos(ω0 t) sin(ω0 t + ω0 τ)]
• = −σ² sin(ω0 τ)
• ⇒ X(t) & Y(t) are jointly w.s.s.
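An ensemble check of this example (a sketch; Gaussian A and B are one convenient choice satisfying the stated moment conditions):

% Check R_XX(t,t+tau) = sig2*cos(w0*tau) and R_XY(tau) = -sig2*sin(w0*tau)
M = 2e5; sig2 = 4; w0 = 3; t = 0.7; tau = 0.25;
A = sqrt(sig2)*randn(M,1);    % zero mean, variance sig2
B = sqrt(sig2)*randn(M,1);    % independent of A, so E[AB] = 0
X1 = A*cos(w0*t) + B*sin(w0*t);
X2 = A*cos(w0*(t+tau)) + B*sin(w0*(t+tau));
Y2 = B*cos(w0*(t+tau)) - A*sin(w0*(t+tau));
fprintf('RXX: %.3f vs %.3f\n', mean(X1.*X2),  sig2*cos(w0*tau));
fprintf('RXY: %.3f vs %.3f\n', mean(X1.*Y2), -sig2*sin(w0*tau));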


Correlation Functions for Random Sequences
• Random sequence (= discrete-time r.p.): X(nTs) = X[n]
• Mean: E[X[n]]
• R_XX(n, n + k) = E[X[n] X[n + k]]
• C_XX(n, n + k) = E{(X[n] − X̄[n])(X[n + k] − X̄[n + k])} = R_XX(n, n + k) − X̄[n] X̄[n + k]
• R_XY(n, n + k) = E[X[n] Y[n + k]]
• C_XY(n, n + k) = E{(X[n] − X̄[n])(Y[n + k] − Ȳ[n + k])} = R_XY(n, n + k) − X̄[n] Ȳ[n + k]


In-Class Practice: Wide-Sense Stationary Random Process
• Let Xn be an iid sequence of Gaussian random variables with zero mean and variance σ², and let Yn be the average of two consecutive values of Xn:
   Yn = (Xn + Xn−1)/2
• Find the mean of Yn.
• Find the covariance C_YY(i, j).
• What is the distribution of the random variable Yn? Is it stationary?
Check the answer with Matlab; a sample run (for σ² = 1) printed
   0.9995 0.5000
   0.5000 0.5000
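A sketch for the MATLAB check (assuming σ² = 1; the 2×2 matrix printed on the slide appears to be the sample covariance of the pair (Xn, Yn), whose theoretical value is [1 0.5; 0.5 0.5]):

% Yn = (Xn + Xn-1)/2 for iid N(0,1) Xn
N = 1e5;
X = randn(N,1);                   % iid Gaussian, zero mean, unit variance
Y = (X(2:end) + X(1:end-1))/2;    % average of two consecutive values
C = cov(X(2:end), Y)              % theory: [1 0.5; 0.5 0.5]
% For the sequence itself: C_YY(i,i) = 1/2, C_YY(i,i+1) = 1/4, 0 otherwise;
% Yn is Gaussian with zero mean, and the covariance depends only on the lag.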
Gaussian Random Processes
A random process X(t) is a Gaussian random process if the samples
   X1 = X(t1), X2 = X(t2), …, Xk = X(tk)
are jointly Gaussian random variables for all k and all choices of t1, t2, …, tk.
This definition applies to discrete-time and continuous-time processes.
The joint pdf of jointly Gaussian random variables is determined by the vector of means and by the covariance matrix:
   f_{X1,X2,…,Xk}(x1, …, xk) = exp{−(1/2)(x − m)ᵀ C⁻¹ (x − m)} / [(2π)^{k/2} |C|^{1/2}]
where
   m = [m_X(t1), …, m_X(tk)]ᵀ
and C is the k × k matrix with entries C_X(ti, tj), i, j = 1, …, k.
A wide-sense stationary Gaussian process is also stationary in the strict sense. Why?
A Gaussian process can be completely defined by the vector of means and the correlations. Why? (See the next example.)
Example of a Gaussian Random Process
A Gaussian random process is W.S.S. with X̄ = 4 and R_XX(τ) = 25 e^{−3|τ|} + 16.
Specify the joint density function for the three r.v.'s X(ti), i = 1, 2, 3, with ti = t0 + (i − 1)/2, t0 constant.
• ti = t0 + (i − 1)/2, tk = t0 + (k − 1)/2
• tk − ti = (k − i)/2, i and k = 1, 2, 3
• R_XX(tk − ti) = 25 e^{−3|k−i|/2} + 16
• C_XX(tk − ti) = R_XX(tk − ti) − X̄² = 25 e^{−3|k−i|/2} + 16 − 4² = 25 e^{−3|k−i|/2}
• C_X = 25 × [ 1, e^{−3/2}, e^{−3} ; e^{−3/2}, 1, e^{−3/2} ; e^{−3}, e^{−3/2}, 1 ]
The joint density is then the trivariate Gaussian pdf with mean vector m = [4, 4, 4]ᵀ and covariance matrix C_X.
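The mean vector and covariance matrix can be assembled directly; a sketch (the evaluation point x is an arbitrary choice of mine):

% Mean vector and covariance matrix for X(t1), X(t2), X(t3)
m = 4*ones(3,1);                 % Xbar = 4 at every time
[K, I] = meshgrid(1:3, 1:3);     % index grids so that (K - I) gives k - i
C = 25*exp(-3*abs(K - I)/2)      % C_X(tk - ti) = 25*exp(-3|k-i|/2)
x = [4; 5; 3];                   % arbitrary point at which to evaluate the pdf
f = exp(-0.5*(x-m)'*(C\(x-m))) / sqrt((2*pi)^3 * det(C))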


Example: iid Gaussian Sequence
Let the discrete-time random process Xn be a sequence of independent Gaussian random variables with mean m and variance σ². The covariance matrix for the times t1, …, tk is
   C_X(ti, tj) = σ² δij, i.e., C = σ² I,
where δij = 1 when i = j and 0 otherwise, and I is the identity matrix. Thus the corresponding joint pdf is
   f_{X1,…,Xk}(x1, x2, …, xk) = (2πσ²)^{−k/2} exp{−Σ_{i=1}^{k} (xi − m)²/(2σ²)} = f_X(x1) f_X(x2) … f_X(xk)
Recall that the general form was
   f_{X1,X2,…,Xk}(x1, …, xk) = exp{−(1/2)(x − m)ᵀ C⁻¹ (x − m)} / [(2π)^{k/2} |C|^{1/2}]


EXAMPLES OF CONTINUOUS-TIME RANDOM PROCESSES: Poisson Process
• Consider a situation in which events occur at random instants of time at an average rate of λ events per second, such as the arrival of a customer to a service station or the breakdown of a component in some system.
• Let N(t) be the number of event occurrences in the time interval [0, t].
• N(t) is then a non-decreasing, integer-valued, continuous-time random process, as shown in the figure.
[Figure: a sample path of the Poisson counting process. The event occurrence times are denoted by S1, S2, …; the jth inter-event time is denoted by Xj = Sj − Sj−1.]


Poisson Process from the Binomial
• Divide the interval [0, t] into n subintervals of width δ = t/n. If the probability of an event occurrence in each subinterval is p, then the expected number of event occurrences in the interval [0, t] is np.
• Since events occur at a rate of λ events per second, the average number of events in the interval [0, t] is also λt. Thus we must have λt = np.
• The number of events in the n subintervals is binomial:
   P[Sn = j] = (n choose j) p^j (1 − p)^{n−j}, for 0 ≤ j ≤ n.
• Replace p with λt/n. If we now let n → ∞ (i.e., δ → 0) and p → 0 while np = λt remains fixed, the binomial distribution approaches a Poisson distribution with parameter λt. The number of event occurrences N(t) in the interval [0, t] therefore has a Poisson distribution with mean λt:
   P[N(t) = k] = ((λt)^k / k!) e^{−λt}, for k = 0, 1, …
• For this reason N(t) is called the Poisson process.
For a detailed derivation, see http://www.vosesoftware.com/ModelRiskHelp/index.htm#Probability_theory_and_statistics/Stochastic_processes/Deriving_the_Poisson_distribution_from_the_Binomial.htm


Poisson Random Process
• Also known as the Poisson counting process.
• Models random arrivals over t > 0: arrival of customers, failure of parts, lightning, internet traffic, ….
• Two conditions:
➢ Events do not coincide.
➢ The number of occurrences in any given time interval is independent of the number in any non-overlapping time interval (independent increments).
• Average rate of occurrence = λ.
• P[X(t) = k] = (λt)^k e^{−λt} / k!, k = 0, 1, 2, …, on [0, t]
• f_X(x) = Σ_{k=0}^{∞} ((λt)^k e^{−λt} / k!) δ(x − k)
• E[X(t)] = λt and variance = λt = mean
• E[X²(t)] = λt(1 + λt)
• The probability distribution of the waiting time until the next occurrence is an exponential distribution.
• Given the number of occurrences in an interval, the occurrence times are distributed uniformly over that interval.
http://en.wikipedia.org/wiki/Poisson_process
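A simulation sketch using the waiting-time property above: inter-event times of a rate-λ Poisson process are iid exponential with mean 1/λ, so a sample path of the counting process is obtained by cumulatively summing exponential draws (λ = 2 and the 3λT safety factor on the number of draws are arbitrary choices).

% Sample path of a Poisson counting process of rate lambda on [0, Tmax]
lambda = 2; Tmax = 10;
nDraw = ceil(3*lambda*Tmax);            % more draws than we expect to need
Xj = -log(rand(nDraw,1))/lambda;        % iid exponential inter-event times
S = cumsum(Xj);                         % event occurrence times S1, S2, ...
S = S(S <= Tmax);
stairs([0; S], (0:numel(S))');          % N(t) jumps by 1 at each event time
xlabel('t'); ylabel('N(t)');
fprintf('events: %d, E[N(Tmax)] = %.0f\n', numel(S), lambda*Tmax);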
Example I
   P[N(t) = k] = (λt)^k e^{−λt} / k!, for k = 0, 1, …
Inquiries arrive at a recorded message device according to a Poisson process of rate 15 inquiries per minute. Find the probability that in a 1-minute period, 3 inquiries arrive during the first 10 seconds and 2 inquiries arrive during the last 15 seconds.
The arrival rate in seconds is λ = 15/60 = 1/4 inquiries per second.
Writing time in seconds, the probability of interest is
   P[N(10) = 3 and N(60) − N(45) = 2]
By applying first the independent-increments property, and then the stationary-increments property, we obtain
   P[N(10) = 3 and N(60) − N(45) = 2] = P[N(10) = 3] P[N(60) − N(45) = 2]
   = P[N(10) = 3] P[N(60 − 45) = 2]
   = ((10/4)³ e^{−10/4} / 3!) × ((15/4)² e^{−15/4} / 2!) ≈ 0.0353
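Evaluating the arithmetic numerically (a sketch using the explicit pmf, so no toolbox function is needed):

% P[N(10)=3] * P[N(15)=2] with lambda = 1/4 inquiries per second
lam = 1/4;
p = @(k, lt) lt.^k .* exp(-lt) ./ factorial(k);   % Poisson pmf with mean lt
P = p(3, lam*10) * p(2, lam*15)                   % about 0.0353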
Practice Problem: Poisson Process
• Suppose that a secretary receives calls that arrive according to a Poisson process with a rate of 10 calls per hour.
• What is the probability that no calls go unanswered if the secretary is away from the office for the first and last 15 minutes of an hour?


Time Averages
• Time-averaging operator:
   A[·] = lim_{T→∞} (1/2T) ∫_{−T}^{T} [·] dt
• Time average (compare with the ensemble average, i.e., the mean E[X(t)] = ∫_{−∞}^{∞} x f_X(x; t) dx, or E[X(t)] = Σ_i xi P{X(t) = xi} for a discrete alphabet):
   x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt
• Time autocorrelation function:
   ℛ_xx(τ) = A[x(t)x(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t)x(t + τ) dt
• Time cross-correlation function:
   ℛ_xy(τ) = A[x(t)y(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t)y(t + τ) dt
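A sketch contrasting the two kinds of average for the random-phase cosine (which is mean- and correlation-ergodic, in the terminology of the next slide): one long realization suffices.

% Time average and time autocorrelation from one realization
w0 = 0.2; T = 500; dt = 0.01;
t = -T:dt:T;
theta = 2*pi*rand;                   % a single realization: one phase draw
x = cos(w0*t + theta);
xbar = trapz(t, x)/(2*T);            % A[x(t)]; ensemble mean is 0
k = round(5/dt);                     % lag tau = 5
Rxx = mean(x(1:end-k).*x(1+k:end));  % time autocorrelation at tau = 5
fprintf('xbar=%.4f  Rxx(5)=%.4f  theory=%.4f\n', xbar, Rxx, 0.5*cos(w0*5));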


Time Averages & Ergodicity
• What is the practical advantage of having an ergodic process?
▪ We can study one sample function across time instead of studying many samples.
• Mean-ergodic ⇒ x̄ = X̄
• Correlation-ergodic ⇒ ℛ_xx(τ) = R_XX(τ)
• Ergodic ⇔ x̄ = X̄ and ℛ_xx(τ) = R_XX(τ)
• Jointly ergodic ⇔ both X(t) and Y(t) are ergodic and ℛ_xy(τ) = R_XY(τ)


Measurement of Correlation Functions
• In real life, we can never measure the true correlation.
• We assume ergodicity and use a portion of the available time record.
• Ergodicity is assumed on physical grounds ("physical sense"), with no mathematical proof.
• Assuming the processes are jointly ergodic implies they are stationary.
The measuring system delays x(t) by T and y(t) by T − τ, multiplies the two, and integrates:
   R_o(t1 + 2T) = (1/2T) ∫_{t1}^{t1+2T} x(t) y(t + τ) dt
Let t1 = 0 and assume T is very large; then
   R_o(2T) ≈ ℛ_xy(τ) = R_XY(τ)
Similarly, we may find R_XX(τ) & R_YY(τ).
[Figure: block diagram with delay elements T and T − τ, a product, and an integrator (1/2T)∫_{t1}^{t1+2T}(·) dt producing R_o(t1 + 2T).]
Example
• Use the above system to measure R_XX(τ) for X(t) = A cos(ω0 t + θ).
• R_o(2T) = (1/2T) ∫_{−T}^{T} A² cos(ω0 t + θ) cos(ω0 t + θ + ω0 τ) dt
• = (A²/4T) ∫_{−T}^{T} [cos(ω0 τ) + cos(2ω0 t + 2θ + ω0 τ)] dt
• = (A²/2) cos(ω0 τ) + (A²/2) cos(ω0 τ + 2θ) sin(2ω0 T)/(2ω0 T) = R_XX(τ) + ε(T)
• If we require ε(T) to be at least 20 times less than the largest value of the true autocorrelation, |ε(T)| < 0.05 R_XX(0):
   A²/(4ω0 T) < 0.05 A²/2 ⇒ 1/(2ω0 T) ≤ 0.05 ⇒ T ≥ 10/ω0
• Wait enough time! The required T depends on the frequency.


Matlab: Measuring the Correlation
The bound T ≥ 10/ω0 with ω0 = 0.2 requires T ≥ 50. With T = 50 the error stays below 5% of R_XX(0); T = 20 does not satisfy the bound.

% Measurement of Correlation function
clear all
close all
clc
A=1;
T=50;        % T >= 10/omeg = 50 (try T=20 to see a larger error)
omeg=0.2;
t=-T:T;
thet=2*pi*rand(1,1);
X=A*cos(omeg*t+thet);
[R,tau]=xcorr(X,'unbiased');
%R=R/(2*T);
True_R=A^2/2*cos(omeg*tau);
Err=A^2/2*cos(omeg*tau+2*thet)*sin(2*omeg*T)/(2*omeg*T);
plot(tau,True_R,':',tau,R,'g-',tau,Err,'r.-','LineWidth',2)
title('Auto-correlation')
legend('Exact','Measured','Error')
axis([-40 40 -0.6 +0.6])


Practice Problems
• A random process X(t) is known to be wide-sense stationary with E[X²(t)] = 11.
• Give reasons why the following expressions cannot be the autocorrelation function of the process:
a) R_XX(t, t + τ) = sin(2τ)/(1 + τ²)   (give three reasons)
b) R_XX(t, t + τ) = cos(3t) e^{−(t+τ)²}   (give one reason)
Practice: Quiz 7 (112); solution available.
