7 Probability, Random Variables, and Random Processes in Communication

1 Introduction
2 Probability and Random Variables
3 Random Processes

Introduction
Events are subsets of the sample space for which measures of their occurrences, called probabilities, can be defined or determined.

The events E1, E2, E3, . . . are mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j, where ∅ is the null set.
Probability and Random Variables 7
Important Properties of the Probability Measure
We observe or are told that event E1 has occurred but are actually interested in event E2: knowledge that E1 has occurred changes the probability of E2 occurring.

If it was P(E2) before, it now becomes P(E2|E1), the probability of E2 occurring given that event E1 has occurred.
This conditional probability is given by
P(E2|E1) = P(E2 ∩ E1)/P(E1), if P(E1) ≠ 0; 0, otherwise.
Bayes' rule:
P(Ei|A) = P(A|Ei)P(Ei)/P(A) = P(A|Ei)P(Ei) / Σ_{j=1}^{n} P(A|Ej)P(Ej).
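As a quick numerical sketch (with made-up priors and likelihoods, not values from the slides), Bayes' rule can be evaluated directly:

```python
# Bayes' rule on a hypothetical example: events E1, E2 partition the
# sample space, and A is the observed event.
def bayes(prior, likelihood):
    """Return posteriors P(Ei|A) from priors P(Ei) and likelihoods P(A|Ei)."""
    total = sum(p * l for p, l in zip(prior, likelihood))  # P(A), total probability
    return [p * l / total for p, l in zip(prior, likelihood)]

# Hypothetical values: P(E1) = 0.6, P(E2) = 0.4; P(A|E1) = 0.9, P(A|E2) = 0.2.
post = bayes([0.6, 0.4], [0.9, 0.2])
print(post)  # posteriors sum to 1
```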
[Figure: a random variable as a mapping from sample points ω1, ω2, ω3, ω4 in the sample space to points x(ω1), . . . , x(ω4) on the real line R.]
[Figure: example cdfs Fx(x) over −∞ < x < ∞, each saturating at 1.0: (a) a staircase cdf with jumps at x = 1, 2, . . . , 6; (b) and (c) further example cdfs.]
[Figure: pdf fx(x) and cdf Fx(x) of a binary random variable taking the values 0 and 1 with probabilities (p) and (1 − p).]
[Figure: a discrete pdf fx(x) with masses between 0.05 and 0.30 at x = 0, 1, . . . , 6.]
[Figure: pdf fx(x) (height 1/(b − a)) and cdf Fx(x) of a uniform random variable on [a, b]; pdf and cdf of a continuous random variable centered at its mean µ, with Fx(µ) = 1/2.]
[Figure: a sample signal waveform, amplitude (volts) versus t (sec) for 0 ≤ t ≤ 1, and a histogram of its amplitude over −1 ≤ x ≤ 1 volts.]
fx(x) = (1/√(2πσx²)) e^{−(x−mx)²/(2σx²)} (Gaussian),
fx(x) = (a/2) e^{−a|x|} (Laplacian).
[Figure: the Gaussian and Laplacian pdfs fx(x) plotted for −15 ≤ x ≤ 15.]
Range (±kσx)                     k = 1    k = 2    k = 3    k = 4
P(mx − kσx < x ≤ mx + kσx)       0.683    0.955    0.997    0.999

Error probability                10^−3    10^−4    10^−6    10^−8
Distance from the mean           3.09σx   3.72σx   4.75σx   5.61σx
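The first row of the table can be reproduced from the Gaussian cdf; a short sketch (the function name is ours):

```python
import math

# P(mx - k*sigma < x <= mx + k*sigma) = erf(k / sqrt(2)) for a Gaussian x.
def prob_within(k):
    return math.erf(k / math.sqrt(2.0))

for k in (1, 2, 3, 4):
    print(k, prob_within(k))
```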
fx,y(x, y) = ∂²Fx,y(x, y)/(∂x ∂y).
When the joint pdf is integrated over one of the variables, one obtains the pdf of the other variable, called the marginal pdf:
∫_{−∞}^{∞} fx,y(x, y) dx = fy(y),
∫_{−∞}^{∞} fx,y(x, y) dy = fx(x).
Note that
∫_{−∞}^{∞} ∫_{−∞}^{∞} fx,y(x, y) dx dy = Fx,y(∞, ∞) = 1,
Fx,y(−∞, −∞) = Fx,y(−∞, y) = Fx,y(x, −∞) = 0.
The conditional pdf of the random variable y, given that the value
of the random variable x is equal to x, is defined as
fy(y|x) = fx,y(x, y)/fx(x), if fx(x) ≠ 0; 0, otherwise.
ρx,y = cov{x, y}/(σx σy).
ρx,y indicates the degree of linear dependence between two random
variables.
It can be shown that |ρx,y | ≤ 1.
ρx,y = ±1 implies an increasing/decreasing linear relationship.
If ρx,y = 0, x and y are said to be uncorrelated.
It is easy to verify that if x and y are independent, then ρx,y = 0:
Independence implies lack of correlation.
However, lack of correlation (no linear relationship) does not in
general imply statistical independence.
Examples of Uncorrelated Dependent Random Variables
[Figure: joint pmf of (y, z), with mass 1/4 at (−1, 1), 1/2 at (0, 0), and 1/4 at (1, 1).]

P(y = −1, z = 0) = 0
P(y = −1, z = 1) = P(x = −1) = 1/4
P(y = 0, z = 0) = P(x = 0) = 1/2
P(y = 0, z = 1) = 0
P(y = 1, z = 0) = 0
P(y = 1, z = 1) = P(x = 1) = 1/4

Therefore, E{yz} = (−1)(1)(1/4) + (0)(0)(1/2) + (1)(1)(1/4) = 0
⇒ cov{y, z} = E{yz} − my mz = 0 − (0)(1/2) = 0!
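The computation above can be checked mechanically; a minimal sketch, assuming x takes −1, 0, 1 with probabilities 1/4, 1/2, 1/4 and setting y = x, z = x²:

```python
from fractions import Fraction as F

# x takes -1, 0, 1 with probabilities 1/4, 1/2, 1/4; y = x and z = x**2,
# so z is completely determined by y (they are dependent).
pmf = {-1: F(1, 4), 0: F(1, 2), 1: F(1, 4)}

m_y = sum(p * x for x, p in pmf.items())          # E{y} = 0
m_z = sum(p * x**2 for x, p in pmf.items())       # E{z} = 1/2
E_yz = sum(p * x * x**2 for x, p in pmf.items())  # E{yz} = E{x^3} = 0
cov = E_yz - m_y * m_z

print(cov)  # 0: y and z are uncorrelated, yet clearly dependent
```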
Jointly Gaussian Distribution (Bivariate)
fx,y(x, y) = [1/(2πσxσy√(1 − ρx,y²))] × exp{ −[1/(2(1 − ρx,y²))] [(x − mx)²/σx² − 2ρx,y(x − mx)(y − my)/(σxσy) + (y − my)²/σy²] },
where mx, my are the means and σx, σy the standard deviations.
ρx,y is indeed the correlation coefficient.
The marginal densities are Gaussian: fx(x) ∼ N(mx, σx²) and fy(y) ∼ N(my, σy²).
When ρx,y = 0, fx,y(x, y) = fx(x)fy(y), i.e., the random variables x and y are statistically independent.
Thus uncorrelatedness of jointly Gaussian random variables implies their statistical independence; for general random variables this implication does not hold.
A weighted sum of two jointly Gaussian random variables is also Gaussian.
Joint pdf and Contours for σx = σy = 1

[Figures: joint pdf fx,y(x, y) and its contours for σx = σy = 1 and ρx,y = 0, 0.30, 0.70, 0.95; cross-sections are indicated for ρx,y = 0.30 and 0.70.]
Define x⃗ = [x1, x2, . . . , xn], a vector of the means m⃗ = [m1, m2, . . . , mn], and the n × n covariance matrix C with Ci,j = cov(xi, xj) = E{(xi − mi)(xj − mj)}.
The random variables {xi}, i = 1, . . . , n, are jointly Gaussian if:
fx1,x2,...,xn(x1, x2, . . . , xn) = [1/√((2π)ⁿ det(C))] exp{ −(1/2)(x⃗ − m⃗)C⁻¹(x⃗ − m⃗)ᵀ }.
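A direct sketch of this density (the helper name is ours; it is verified below only against the scalar case, where it must reduce to the one-dimensional Gaussian pdf):

```python
import numpy as np

def gauss_pdf(x, m, C):
    """Jointly Gaussian pdf at point x, with mean vector m and covariance C."""
    n = len(m)
    d = np.atleast_2d(x - m)                    # row vector (x - m)
    quad = (d @ np.linalg.inv(C) @ d.T).item()  # (x - m) C^{-1} (x - m)^T
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** n * np.linalg.det(C))

# Sanity check: for n = 1, m = 0, C = [[1]] this is the standard Gaussian at 0.
val = gauss_pdf(np.array([0.0]), np.array([0.0]), np.array([[1.0]]))
print(val)  # 1/sqrt(2*pi) ≈ 0.3989
```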
Random Processes 41
Random Processes I
[Figure: a random process as an ensemble of sample functions x1(t, ω1), x2(t, ω2), . . . , xM(t, ωM) indexed by outcomes ω1, ω2, . . . , ωM; at each fixed time tk, x(tk, ω) is a random variable taking a real number for each outcome ω.]
Random Processes II
Examples of Random Processes II
[Figure: example sample functions of random processes; Tb denotes the bit duration.]
Classification of Random Processes
Based on whether its statistics change with time: the process is
non-stationary or stationary.
Different levels of stationarity:
▶ Strictly stationary: the joint pdf of any order is independent of a shift in time.
▶ N th-order stationarity: the joint pdf does not depend on the time shift, but depends on time spacings:
fx(t1),x(t2),...,x(tN)(x1, x2, . . . , xN; t1, t2, . . . , tN) = fx(t1+t),x(t2+t),...,x(tN+t)(x1, x2, . . . , xN; t1 + t, t2 + t, . . . , tN + t).
Mean Value or the First Moment
The average is across the ensemble and if the pdf varies with time
then the mean value is a (deterministic) function of time.
If the process is stationary then the mean is independent of t, a constant:
mx = E{x(t)} = ∫_{−∞}^{∞} x fx(x) dx.
Mean-Squared Value or the Second Moment
This is defined as
MSVx(t) = E{x²(t)} = ∫_{−∞}^{∞} x² fx(t)(x; t) dx (non-stationary),
MSVx = E{x²(t)} = ∫_{−∞}^{∞} x² fx(x) dx (stationary).
Correlation
Power Spectral Density of a Random Process I
Taking the Fourier transform of the random process does not work.
[Figure: the time-domain ensemble x1(t, ω1), . . . , xM(t, ωM) and the corresponding frequency-domain ensemble of magnitude spectra |X1(f, ω1)|, . . . , |XM(f, ωM)|.]
Power Spectral Density of a Random Process II
It follows that
MSVx = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} E{xT²(t)} dt
= ∫_{−∞}^{∞} lim_{T→∞} [E{|XT(f)|²}/(2T)] df (watts).
Power Spectral Density of a Random Process IV
Finally,
Sx(f) = lim_{T→∞} E{|XT(f)|²}/(2T) (watts/Hz)
is the power spectral density of the process.
It can be shown that the power spectral density and the
autocorrelation function are a Fourier transform pair:
Rx(τ) ⟷ Sx(f) = ∫_{−∞}^{∞} Rx(τ) e^{−j2πfτ} dτ.
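A numerical sanity check of this transform pair, using an assumed exponential autocorrelation Rx(τ) = e^{−a|τ|} (not a process from the slides), whose PSD has the closed form Sx(f) = 2a/(a² + (2πf)²):

```python
import numpy as np

# Assumed autocorrelation Rx(tau) = exp(-a*|tau|); compare its numerically
# computed Fourier transform at one frequency against the closed form.
a = 1.0
tau = np.linspace(-40.0, 40.0, 400001)
dtau = tau[1] - tau[0]
Rx = np.exp(-a * np.abs(tau))

f = 0.3
# Rx is even, so the Fourier integral reduces to a cosine transform.
Sx_num = np.sum(Rx * np.cos(2 * np.pi * f * tau)) * dtau
Sx_exact = 2 * a / (a**2 + (2 * np.pi * f) ** 2)
print(Sx_num, Sx_exact)
```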
Time Averaging and Ergodicity
Random Processes and LTI Systems
[Figure: an LTI system h(t) with input statistics mx, Rx(τ) ↔ Sx(f), output statistics my, Ry(τ) ↔ Sy(f), and cross-correlation Rx,y(τ).]
my = E{y(t)} = E{ ∫_{−∞}^{∞} h(λ) x(t − λ) dλ } = mx H(0),
Sy(f) = |H(f)|² Sx(f),
Ry(τ) = h(τ) ∗ h(−τ) ∗ Rx(τ).
Thermal Noise in Communication Systems
Rw(τ) = kθG (e^{−|τ|/t0}/t0) (watts),
Sw(f) = 2kθG/(1 + (2πf t0)²) (watts/Hz).
[Figure: (a) power spectral density Sw(f) (watts/Hz): thermal noise versus the flat white-noise level N0/2, for f from −15 to 15 GHz; (b) autocorrelation Rw(τ) (watts): thermal noise versus the white-noise impulse (N0/2)δ(τ), for τ from −0.1 to 0.1 pico-sec.]
The noise PSD is approximately flat over the frequency range of 0 to 10 GHz ⇒ let the spectrum be flat from 0 to ∞:
Sw(f) = N0/2 (watts/Hz),
where N0 = 4kθG is a constant.
Noise that has a uniform spectrum over the entire frequency range is referred to as white noise.
The autocorrelation of white noise is
Rw(τ) = (N0/2) δ(τ) (watts).
Since Rw(τ) = 0 for τ ≠ 0, any two different samples of white noise, no matter how close in time they are taken, are uncorrelated.
Since the noise samples of white noise are uncorrelated, if the noise
is both white and Gaussian (for example, thermal noise) then the
noise samples are also independent.
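This uncorrelatedness is easy to see in simulation; a minimal sketch, assuming unit-variance white Gaussian samples:

```python
import numpy as np

# Samples of white Gaussian noise are uncorrelated, so the sample
# autocorrelation should be ~0 at any nonzero lag.
rng = np.random.default_rng(0)
w = rng.standard_normal(100000)

r0 = np.mean(w * w)           # lag 0: the noise power (~1 here)
r1 = np.mean(w[:-1] * w[1:])  # lag 1: should be near 0
print(r0, r1)
```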
Example
[Figure: white noise x(t) applied to a lowpass RL filter with output y(t) taken across the resistor R.]
H(f) = R/(R + j2πfL) = 1/(1 + j2πfL/R).
Sy(f) = (N0/2) · 1/(1 + (2πfL/R)²) ⟷ Ry(τ) = (N0R/(4L)) e^{−(R/L)|τ|}.
[Figure: Sy(f) (watts/Hz), with peak N0/2 at f = 0, and Ry(τ) (watts), with peak N0R/(4L) at τ = 0.]
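As a consistency check (with illustrative component values of our choosing), the area under Sy(f) should equal the total output power Ry(0) = N0R/(4L):

```python
import numpy as np

# Illustrative (assumed) values: N0 = 2 watts/Hz, R = 1 ohm, L = 1 mH.
N0, R, L = 2.0, 1.0, 1e-3

f = np.linspace(-1e5, 1e5, 200001)                 # df = 1 Hz
Sy = (N0 / 2) / (1 + (2 * np.pi * f * L / R) ** 2)
power = np.sum(Sy) * (f[1] - f[0])                 # Riemann-sum integral of Sy(f)

print(power, N0 * R / (4 * L))  # both close to 500 watts
```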