
TSKS01 Digital Communication
Lecture 2
Repetition of Probability Theory & Introduction to Stochastic Processes

Emil Björnson
Department of Electrical Engineering (ISY)
Division of Communication Systems
Two-Dimensional Stochastic Variable

Sample space: $\Omega$; an outcome $\omega$ gives the stochastic variable $(X(\omega), Y(\omega))$.

Distribution: $F_{X,Y}(x,y) = \Pr\{X \le x, Y \le y\}$

Density: $f_{X,Y}(x,y) = \frac{\partial^2}{\partial x\,\partial y} F_{X,Y}(x,y)$

Properties: $f_{X,Y}(x,y) \ge 0$, $F_{X,Y}(x,y) \ge 0$, $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx\,dy = 1$

Marginal densities:
$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy$
$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$

Expectation of a function:
$E\{g(X,Y)\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\,dx\,dy$
2015-09-07  TSKS01 Digital Communication - Lecture 2


Dependencies

Definition: $X$ and $Y$ are independent if $F_{X,Y}(x,y) = F_X(x)\,F_Y(y)$ holds.

Theorem: Independent $\Leftrightarrow$ $f_{X,Y}(x,y) = f_X(x)\,f_Y(y)$ holds.

Definition: Covariance: $\mathrm{Cov}\{X,Y\} = E\{(X - m_X)(Y - m_Y)\}$

Theorem: $\mathrm{Cov}\{X,Y\} = E\{XY\} - m_X m_Y$

Definition: $X$ and $Y$ are uncorrelated if $\mathrm{Cov}\{X,Y\} = 0$ holds.

Theorem: Independent $\Rightarrow$ uncorrelated.

Note: $\mathrm{Var}\{X\} = \mathrm{Cov}\{X,X\}$

Theorem: Uncorrelated $\Leftrightarrow$ $E\{XY\} = E\{X\}E\{Y\}$ $\Leftrightarrow$ $\mathrm{Var}\{X+Y\} = \mathrm{Var}\{X\} + \mathrm{Var}\{Y\}$
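These identities are easy to check with a quick Monte Carlo sketch; the distributions $N(1,1)$ and $N(2,4)$ below are assumed purely for illustration:

```python
import numpy as np

# Sketch: draw independent X ~ N(1, 1) and Y ~ N(2, 4) (assumed example
# distributions) and check the covariance theorems on this slide empirically.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(1.0, 1.0, n)
Y = rng.normal(2.0, 2.0, n)

# Cov{X,Y} = E{XY} - m_X m_Y, close to 0 since X and Y are independent
cov = np.mean(X * Y) - np.mean(X) * np.mean(Y)
print(abs(cov) < 0.01)  # True

# Uncorrelated  =>  Var{X+Y} = Var{X} + Var{Y}
print(abs(np.var(X + Y) - (np.var(X) + np.var(Y))) < 0.05)  # True
```

The converse direction fails in general: uncorrelated variables need not be independent (a standard counterexample is $X \sim N(0,1)$ and $Y = X^2$).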
Bayes' Theorem

(Portrait: Thomas Bayes)

Joint and conditional probability:

$\Pr\{X=x, Y=y\} = \Pr\{X=x \mid Y=y\} \Pr\{Y=y\} = \Pr\{Y=y \mid X=x\} \Pr\{X=x\}$

$f_{X,Y}(x,y) = f_{X|Y}(x|y)\, f_Y(y) = f_{Y|X}(y|x)\, f_X(x)$

Bayes' theorem (discrete): $\Pr\{X=x \mid Y=y\} = \dfrac{\Pr\{Y=y \mid X=x\}}{\Pr\{Y=y\}} \Pr\{X=x\}$

(continuous): $f_{X|Y}(x|y) = \dfrac{f_{Y|X}(y|x)}{f_Y(y)}\, f_X(x)$

($X$ discrete, $Y$ continuous): $\Pr\{X=x \mid Y=y\} = \dfrac{f_{Y|X}(y|x)}{f_Y(y)} \Pr\{X=x\}$
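The mixed discrete/continuous form is the one used in detection. A minimal sketch, with an assumed setup not taken from the slides: binary $X \in \{+1,-1\}$ with equal priors, observed through $Y = X + N$ with Gaussian noise $N$:

```python
import numpy as np

# Hedged illustration (setup assumed): X in {+1, -1} with equal priors,
# observed as Y = X + N, N ~ N(0, sigma^2). Mixed-form Bayes gives the
# posterior Pr{X=x | Y=y} = f_{Y|X}(y|x) Pr{X=x} / f_Y(y).
sigma = 1.0

def f_Y_given_X(y, x):
    """Conditional density f_{Y|X}(y|x) of Y = X + N."""
    return np.exp(-(y - x) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def posterior(x, y):
    priors = {+1: 0.5, -1: 0.5}
    f_Y = sum(f_Y_given_X(y, xp) * p for xp, p in priors.items())  # total density
    return f_Y_given_X(y, x) * priors[x] / f_Y

# Posteriors sum to one, and y = 0 carries no information about X
print(posterior(+1, 0.8) + posterior(-1, 0.8))  # ~ 1.0
print(posterior(+1, 0.0))                        # ~ 0.5
```

Note that $f_Y(y)$ is obtained by total probability over the discrete values of $X$, exactly as the denominator in the slide's formula requires.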



Multi-Dimensional Stochastic Variables

Distribution: $F_{X_1,\dots,X_N}(x_1,\dots,x_N) = \Pr\{X_1 \le x_1, \dots, X_N \le x_N\}$

Density: $f_{X_1,\dots,X_N}(x_1,\dots,x_N) = \frac{\partial^N}{\partial x_1 \cdots \partial x_N} F_{X_1,\dots,X_N}(x_1,\dots,x_N)$

Vector notation: $\bar{X} = (X_1,\dots,X_N)$, $\bar{x} = (x_1,\dots,x_N)$, $F_{\bar{X}}(\bar{x})$, $f_{\bar{X}}(\bar{x})$

Mutual independence:
$F_{\bar{X}}(\bar{x}) = \prod_{i=1}^{N} F_{X_i}(x_i)$  and  $f_{\bar{X}}(\bar{x}) = \prod_{i=1}^{N} f_{X_i}(x_i)$

Pairwise uncorrelated: $X_i$ and $X_j$ are uncorrelated for all $i \ne j$


Example: Jointly Gaussian Variables

Definition: $\bar{X} = (X_1,\dots,X_N)$ is jointly Gaussian, denoted $N(\bar{m}, \Lambda_{\bar{X}})$, if

$f_{\bar{X}}(\bar{x}) = \dfrac{1}{\sqrt{(2\pi)^N \det(\Lambda_{\bar{X}})}}\, e^{-\frac{1}{2} (\bar{x}-\bar{m})\, \Lambda_{\bar{X}}^{-1}\, (\bar{x}-\bar{m})^T}$

where $\bar{m} = E\{\bar{X}\}$, $\Lambda_{\bar{X}} = \begin{pmatrix} \lambda_{1,1} & \cdots & \lambda_{1,N} \\ \vdots & \ddots & \vdots \\ \lambda_{N,1} & \cdots & \lambda_{N,N} \end{pmatrix}$, $\lambda_{i,j} = \mathrm{Cov}\{X_i, X_j\}$.

If $X_1,\dots,X_N$ are pairwise uncorrelated, then $\lambda_{i,j} = 0$ for $i \ne j$, so

$f_{\bar{X}}(\bar{x}) = \dfrac{1}{\sqrt{(2\pi)^N \lambda_{1,1} \cdots \lambda_{N,N}}}\, e^{-\frac{1}{2}\sum_{i=1}^{N} (x_i - m_i)^2 / \lambda_{i,i}} = \prod_{i=1}^{N} f_{X_i}(x_i) \;\Rightarrow\; \text{independence}$
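The role of $\bar{m}$ and $\Lambda_{\bar{X}}$ can be verified by sampling; the mean vector and covariance matrix below are assumed example values:

```python
import numpy as np

# Sketch: draw from N(m, Lambda) and check that the empirical mean and
# covariance recover m and Lambda (the values of m and Lam are assumed).
rng = np.random.default_rng(1)
m = np.array([0.0, 1.0])
Lam = np.array([[2.0, 0.5],
                [0.5, 1.0]])   # lambda_ij = Cov{X_i, X_j}, symmetric

x = rng.multivariate_normal(m, Lam, size=500_000)
print(np.allclose(x.mean(axis=0), m, atol=0.02))   # True
print(np.allclose(np.cov(x.T), Lam, atol=0.03))    # True
```

Setting the off-diagonal entry to zero makes the samples of $X_1$ and $X_2$ independent, which is the special property of the Gaussian case highlighted on this slide.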



Stochastic Process

Sample space: $\Omega$. Each outcome $\omega_i \in \Omega$ gives one realization $X_i(\omega_i, t)$, a function of time $t$. (Figure: three sample paths $X_1(\omega_1,t)$, $X_2(\omega_2,t)$, $X_3(\omega_3,t)$.)

Stochastic process = stochastic time-continuous signal ($N = \infty$)



Examples of Stochastic Processes

Example 1: Finite number of realizations:
$X(t) = \sin(t + \phi)$, $\phi \in \{0, \pi/2, \pi, 3\pi/2\}$.

Example 2: Infinite number of realizations:
$X(t) = A \cdot \sin(t)$, $A \sim N(0,1)$.
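Example 2 can be sketched numerically to emphasize that all randomness sits in $A$: each draw of $A$ fixes one entire realization.

```python
import numpy as np

# Sketch of Example 2: each draw of A ~ N(0,1) fixes one whole realization
# X(t) = A*sin(t); the randomness is in A, not in t.
rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 200)

realizations = [a * np.sin(t) for a in rng.normal(0.0, 1.0, 5)]

# Every realization is a scaled sine: X(t)/sin(t) is constant where sin(t) != 0
x = realizations[0]
mask = np.abs(np.sin(t)) > 1e-6
print(np.allclose(x[mask] / np.sin(t)[mask], x[mask][0] / np.sin(t)[mask][0]))  # True
```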



Examples of Stochastic Processes (cont'd)

Example 3: Infinite number of realizations:

$X(t) = \sum_{k} A_k\, g(t-k)$, where $g(t) = \begin{cases} \cos(\pi t), & |t| < 1/2 \\ 0, & \text{elsewhere} \end{cases}$

Each $A_k$ is independent and $N(0,1)$.

One realization: (figure omitted)
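A realization of Example 3 can be generated as follows; the finite range of $k$ is an assumption made here so the sketch terminates:

```python
import numpy as np

# Sketch of Example 3 (the range of k is assumed finite here): one
# realization of X(t) = sum_k A_k g(t - k) with the half-cosine pulse g.
def g(t):
    return np.where(np.abs(t) < 0.5, np.cos(np.pi * t), 0.0)

rng = np.random.default_rng(3)
K = 20
A = rng.normal(0.0, 1.0, K)    # one draw of i.i.d. A_0, ..., A_{K-1} ~ N(0,1)

def X(t):
    return sum(A[k] * g(t - k) for k in range(K))

# The pulses have width 1 and spacing 1, so they do not overlap and
# X(k) = A_k * g(0) = A_k at the pulse centers.
print(np.allclose(X(np.arange(K, dtype=float)), A))  # True
```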



Distributions and Densities

One time instant $t$:
§ Distribution: $F_{X(t)}(x) = \Pr\{X(t) \le x\}$
§ Density: $f_{X(t)}(x) = \frac{d}{dx} F_{X(t)}(x)$

Two time instants $t_1, t_2$:
§ Distribution: $F_{X(t_1),X(t_2)}(x_1,x_2) = \Pr\{X(t_1) \le x_1, X(t_2) \le x_2\}$
§ Density: $f_{X(t_1),X(t_2)}(x_1,x_2) = \frac{\partial^2}{\partial x_1\,\partial x_2} F_{X(t_1),X(t_2)}(x_1,x_2)$



Example

$X(t) = A \cdot \sin(t)$, for some stochastic variable $A$ with distribution $F_A(a)$.

$F_{X(t)}(x) = \Pr\{X(t) \le x\} = \Pr\{A \sin(t) \le x\} = \begin{cases} \Pr\{A \le \frac{x}{\sin(t)}\} = F_A\!\left(\frac{x}{\sin(t)}\right), & t:\ \sin(t) > 0 \\[4pt] \Pr\{0 \le x\} = u(x), & t:\ \sin(t) = 0 \\[4pt] \Pr\{A \ge \frac{x}{\sin(t)}\} = 1 - F_A\!\left(\frac{x}{\sin(t)}\right), & t:\ \sin(t) < 0 \end{cases}$

$f_{X(t)}(x) = \frac{d}{dx} F_{X(t)}(x) = \begin{cases} \frac{1}{\sin(t)}\, f_A\!\left(\frac{x}{\sin(t)}\right), & t:\ \sin(t) > 0 \\[4pt] \delta(x), & t:\ \sin(t) = 0 \\[4pt] -\frac{1}{\sin(t)}\, f_A\!\left(\frac{x}{\sin(t)}\right), & t:\ \sin(t) < 0 \end{cases}$
Multiple Time Instants

Vector notation
§ Time instants: $\bar{t} = (t_1,\dots,t_N)$
§ Variable: $X(\bar{t}) = (X(t_1),\dots,X(t_N))$
§ Realizations: $\bar{x} = (x_1,\dots,x_N)$

$N$ time instants
§ Distribution: $F_{X(\bar{t})}(\bar{x}) = \Pr\{X(t_1) \le x_1, \dots, X(t_N) \le x_N\}$
§ Density: $f_{X(\bar{t})}(\bar{x}) = \frac{\partial^N}{\partial x_1 \cdots \partial x_N} F_{X(\bar{t})}(\bar{x})$



Ensemble Averages

"Observe many realizations and take an average" – these averages are functions of time.

Expectation (mean):
$m_X(t) = E\{X(t)\} = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\,dx$

Quadratic mean (power):
$E\{X^2(t)\} = \int_{-\infty}^{\infty} x^2 f_{X(t)}(x)\,dx$

Variance:
$\sigma^2_{X(t)} = E\{(X(t) - m_X(t))^2\} = E\{X^2(t)\} - m_X^2(t)$

Auto-correlation function (ACF):
$r_X(t_1,t_2) = E\{X(t_1)X(t_2)\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f_{X(t_1),X(t_2)}(x_1,x_2)\,dx_1\,dx_2$

Symmetry: $r_X(t_1,t_2) = r_X(t_2,t_1)$

Power: $r_X(t,t) = E\{X^2(t)\}$

Special case: $A$ is a stochastic variable and $X(t) = g(t,A)$:
$r_X(t_1,t_2) = \int_{-\infty}^{\infty} g(t_1,a)\, g(t_2,a)\, f_A(a)\,da$
Example

$X(t) = A \cdot \sin(t)$, for some stochastic variable $A$ with distribution $F_A(a)$.

$m_X(t) = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\,dx = \int_{-\infty}^{\infty} a \sin(t)\, f_A(a)\,da = \sin(t)\, m_A$

$E\{X^2(t)\} = \int_{-\infty}^{\infty} x^2 f_{X(t)}(x)\,dx = \int_{-\infty}^{\infty} (a \sin(t))^2 f_A(a)\,da = \sin^2(t)\, E\{A^2\}$

$\sigma^2_{X(t)} = E\{X^2(t)\} - m_X^2(t) = \sin^2(t)\left(E\{A^2\} - m_A^2\right) = \sin^2(t)\, \sigma_A^2$

$r_X(t_1,t_2) = \int_{-\infty}^{\infty} a \sin(t_1)\, a \sin(t_2)\, f_A(a)\,da = \sin(t_1)\sin(t_2)\, E\{A^2\}$
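These closed forms can be confirmed by Monte Carlo; the particular choice $A \sim N(2, 9)$ below is an assumption for illustration:

```python
import numpy as np

# Monte Carlo check of the closed forms for X(t) = A*sin(t), with the
# assumed choice A ~ N(m_A, sigma_A^2) = N(2, 9).
rng = np.random.default_rng(4)
m_A, sig_A = 2.0, 3.0
A = rng.normal(m_A, sig_A, 1_000_000)

t1, t2 = 0.7, 1.9
X1, X2 = A * np.sin(t1), A * np.sin(t2)

# m_X(t) = sin(t) m_A
assert abs(X1.mean() - np.sin(t1) * m_A) < 0.02

# r_X(t1, t2) = sin(t1) sin(t2) E{A^2}, with E{A^2} = m_A^2 + sigma_A^2
EA2 = m_A**2 + sig_A**2
assert abs((X1 * X2).mean() - np.sin(t1) * np.sin(t2) * EA2) < 0.1
print("ensemble averages match the closed forms")
```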



Strict-Sense Stationarity

Stationarity is statistical invariance to a shift of the time origin.

Definition:
Consider time instants $\bar{t} = (t_1,\dots,t_N)$ and shifted time instants $\bar{u} = \bar{t} + \Delta = (t_1+\Delta,\dots,t_N+\Delta)$. The process $X(t)$ is said to be strict-sense stationary (SSS) if
$F_{X(\bar{t})}(\bar{x}) = F_{X(\bar{u})}(\bar{x})$
holds for all $N$ and all choices of $\bar{t}$ and $\Delta$.

Equivalence:
$F_{X(\bar{t})}(\bar{x}) = F_{X(\bar{u})}(\bar{x}) \;\Leftrightarrow\; f_{X(\bar{t})}(\bar{x}) = f_{X(\bar{u})}(\bar{x})$



Wide-Sense Stationarity

Definition: A stochastic process $X(t)$ is said to be wide-sense stationary (WSS) if
§ the mean satisfies $m_X(t) = m_X(t+\Delta)$ for all $\Delta$;
§ the ACF satisfies $r_X(t_1,t_2) = r_X(t_1+\Delta, t_2+\Delta)$ for all $\Delta$.

Interpretation:
Constant mean; the ACF depends only on the time difference $\tau = t_1 - t_2$.

Notation: mean $m_X$, ACF $r_X(\tau)$.



Gaussian Processes

Recall: $\bar{X} = (X_1,\dots,X_N)$ is jointly Gaussian, denoted $N(\bar{m}, \Lambda_{\bar{X}})$, if

$f_{\bar{X}}(\bar{x}) = \dfrac{1}{\sqrt{(2\pi)^N \det(\Lambda_{\bar{X}})}}\, e^{-\frac{1}{2} (\bar{x}-\bar{m})\, \Lambda_{\bar{X}}^{-1}\, (\bar{x}-\bar{m})^T}$

Definition: A stochastic process is called Gaussian if $X(\bar{t}) = (X(t_1),\dots,X(t_N))$ is jointly Gaussian for any $\bar{t} = (t_1,\dots,t_N)$.

Theorem: A Gaussian process that is wide-sense stationary is also stationary in the strict sense.



Power-Spectral Density (PSD)

Definition: The Fourier transform of the ACF of a WSS process:

$R_X(f) = \mathcal{F}\{r_X(\tau)\} = \int_{-\infty}^{\infty} r_X(\tau)\, e^{-j 2\pi f \tau}\,d\tau$

Inverse:

$r_X(\tau) = \mathcal{F}^{-1}\{R_X(f)\} = \int_{-\infty}^{\infty} R_X(f)\, e^{j 2\pi f \tau}\,df$

Power:

$E\{X^2(t)\} = r_X(0) = \int_{-\infty}^{\infty} R_X(f)\,df$
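A numerical sketch of this transform pair, using the assumed example ACF $r_X(\tau) = e^{-|\tau|}$ whose PSD is the known pair $R_X(f) = 2/(1+(2\pi f)^2)$:

```python
import numpy as np

# Numerical sketch of the ACF/PSD pair (the ACF below is an assumed example):
# r_X(tau) = e^{-|tau|}  <->  R_X(f) = 2 / (1 + (2 pi f)^2).
tau = np.linspace(-50.0, 50.0, 200_001)
dtau = tau[1] - tau[0]
r = np.exp(-np.abs(tau))

def R_numeric(f):
    # Direct quadrature of the Fourier integral; r is even, so the result is real
    return np.sum(r * np.cos(2 * np.pi * f * tau)) * dtau

def R_exact(f):
    return 2.0 / (1.0 + (2.0 * np.pi * f) ** 2)

errs = [abs(R_numeric(f) - R_exact(f)) for f in (0.0, 0.3, 1.0)]
print(max(errs) < 1e-4)  # True

# Power check: E{X^2(t)} = r_X(0) = integral of R_X(f) df = 1
f = np.linspace(-200.0, 200.0, 400_001)
power = np.sum(R_exact(f)) * (f[1] - f[0])
print(abs(power - 1.0) < 1e-3)  # True
```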



Filtering of Stochastic Processes

A stochastic process $X(t)$ passed through an LTI system with impulse response $h(t)$ gives a stochastic process $Y(t)$.

Input-output relation:

$Y(t) = (X * h)(t) = \int_{-\infty}^{\infty} h(\tau)\, X(t-\tau)\,d\tau$

Requires stability: $\int_{-\infty}^{\infty} |h(\tau)|\,d\tau < \infty$

Holds regardless of stationarity.



Expectation of the Filtered Output

A WSS stochastic process $X(t)$ through an LTI system $h(t)$ gives the stochastic process $Y(t) = \int_{-\infty}^{\infty} h(\tau)\, X(t-\tau)\,d\tau$.

Notation: $H(f) = \mathcal{F}\{h(t)\} = \int_{-\infty}^{\infty} h(t)\, e^{-j 2\pi f t}\,dt$

Expectation:
$m_Y(t) = E\left\{\int_{-\infty}^{\infty} h(\tau)\, X(t-\tau)\,d\tau\right\} = \int_{-\infty}^{\infty} h(\tau)\, E\{X(t-\tau)\}\,d\tau = m_X \int_{-\infty}^{\infty} h(\tau)\,d\tau = m_X H(0)$

(using that expectation is a linear operation and that $X(t)$ is WSS)

The output has constant mean when the input is WSS!
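A discrete-time sketch of $m_Y = m_X H(0)$: for an FIR filter, $H(0)$ is simply the sum of the taps. The filter and input statistics below are assumed for illustration:

```python
import numpy as np

# Discrete-time sketch of m_Y = m_X * H(0) (filter taps and input
# statistics are assumed): for an FIR filter, H(0) = sum of the taps.
rng = np.random.default_rng(5)
h = np.array([0.5, 0.3, 0.2, -0.1])        # LTI impulse response (taps)
m_X = 2.0
x = m_X + rng.normal(0.0, 1.0, 1_000_000)  # WSS input: mean m_X plus white noise

y = np.convolve(x, h, mode="valid")
H0 = h.sum()                               # H(0) = sum_n h[n]

assert abs(y.mean() - m_X * H0) < 0.01
print("output mean equals m_X * H(0)")
```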



ACF and PSD of the Filtered Output

ACF:
$r_Y(t_1,t_2) = E\{Y(t_1)\, Y(t_2)\} = E\left\{\int_{-\infty}^{\infty} h(\tau_1)\, X(t_1-\tau_1)\,d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\, X(t_2-\tau_2)\,d\tau_2\right\}$
$= \{\text{compute the expectation, change variables}\} = (h * \bar{h} * r_X)(\tau), \quad \tau = t_1 - t_2,$
where $\bar{h}(t) = h(-t)$. The output is WSS when the input is WSS!

PSD:
$R_Y(f) = \mathcal{F}\{r_Y(\tau)\} = H(f)\, H^*(f)\, R_X(f) = |H(f)|^2 R_X(f)$



Example: Filtering

Let $X(t)$ be a WSS process with $r_X(\tau) = e^{-|\tau|}$, filtered by an ideal lowpass LTI system:

$H(f) = \begin{cases} 1, & |f| \le 1 \\ 0, & \text{elsewhere} \end{cases}$

Determine the output power $E\{Y^2(t)\}$:

$E\{Y^2(t)\} = r_Y(0) = \int_{-\infty}^{\infty} R_Y(f)\,df = \int_{-\infty}^{\infty} |H(f)|^2 R_X(f)\,df = \int_{-1}^{1} R_X(f)\,df$

With $R_X(f) = \mathcal{F}\{e^{-|\tau|}\} = \dfrac{2}{1+(2\pi f)^2}$, evaluating the integral gives

$E\{Y^2(t)\} = \dfrac{2}{\pi} \arctan(2\pi)$
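The final answer can be checked by numerical quadrature of the band-limited PSD integral:

```python
import numpy as np

# Numerical check of the final answer: with R_X(f) = 2/(1 + (2 pi f)^2)
# (the PSD of r_X(tau) = e^{-|tau|}) and the ideal lowpass |f| <= 1,
# E{Y^2} = integral_{-1}^{1} R_X(f) df = (2/pi) * arctan(2 pi).
f = np.linspace(-1.0, 1.0, 200_001)
R_X = 2.0 / (1.0 + (2.0 * np.pi * f) ** 2)

power = np.sum(R_X) * (f[1] - f[0])  # simple quadrature on a fine grid
assert abs(power - (2.0 / np.pi) * np.arctan(2.0 * np.pi)) < 1e-3
print("output power matches (2/pi) arctan(2 pi)")
```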



www.liu.se
