Part 1: Random Processes for Communications
System Models
o A good mathematical model for a system is the basis of its analysis.
o Two models are often considered:
n Deterministic model
o No uncertainty about its time-dependent behavior at any instant of time
n Random or stochastic model
o Uncertain about its time-dependent behavior at any instant of time
o but certain about its statistical behavior at any instant of time
Examples of Stochastic Models
o Channel noise and interference
o Source of information, such as voice



Notion of Relative Frequency
o How to determine the probability of “head
appearance” for a coin?
o Answer: Relative frequency.
Specifically, by carrying out n coin-tossing
experiments, the relative frequency of head
appearance is equal to Nn(A)/n, where Nn(A) is
the number of head appearances in these n
random experiments.
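
o A minimal simulation sketch of this relative-frequency idea (the fair-coin probability 0.5 and the sample sizes below are arbitrary choices):

```python
import random

def relative_frequency(n, p_head=0.5, seed=0):
    """Toss a coin n times and return Nn(A)/n for A = 'head appearance'."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p_head for _ in range(n))
    return heads / n

for n in (10, 100, 10_000):
    print(n, relative_frequency(n))   # tends toward 0.5 as n grows
```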



Notion of Relative Frequency
o Is relative frequency close to the true
probability (of head appearance)?
n It could occur that 4-out-of-10 tossing results are
“head” for a fair coin!
o Can one guarantee that the true “head
appearance probability” remains unchanged
(i.e., time-invariant) in each experiment
performed at different time instances?



Notion of Relative Frequency
o Similarly, the previous question can be
extended to “In a communication system, can
we estimate the noise by repetitive
measurements at consecutive but different time
instances?”

o Some assumptions on the statistical models are necessary!



Conditional Probability
o Definition of conditional probability:
$$P(B \mid A)\;\Bigl(\approx \tfrac{N_n(A \cap B)}{N_n(A)}\Bigr) = \frac{P(A \cap B)}{P(A)}$$

o Independence of events: P(B | A) = P(B)
n A knowledge of occurrence of event A tells us no
more about the probability of occurrence of event B
than we knew without this knowledge.
n Hence, they are statistically independent.
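
o A small simulation sketch of these two ideas (the two-dice experiment and the particular events are arbitrary illustrative choices): P(B|A) is estimated as Nn(A ∩ B)/Nn(A), and for the independent events used here it stays close to P(B).

```python
import random

rng = random.Random(1)
n = 100_000
count_A = count_B = count_AB = 0
for _ in range(n):
    d1, d2 = rng.randint(1, 6), rng.randint(1, 6)
    A = (d1 % 2 == 0)   # event A: the first die is even
    B = (d2 >= 5)       # event B: the second die shows 5 or 6 (independent of A)
    count_A += A
    count_B += B
    count_AB += (A and B)

print("P(B|A) estimate:", count_AB / count_A)   # Nn(A ∩ B) / Nn(A)
print("P(B) estimate  :", count_B / n)          # nearly equal, so A and B are independent
```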



Random Variable
o A non-negative function fX(x) satisfying
$$F_X(x) = \Pr[X \le x] = \int_{-\infty}^{x} f_X(t)\,dt$$
is called the probability density function (pdf) of
random variable X.
o If the pdf of X exists, then
$$f_X(x) = \frac{\partial F_X(x)}{\partial x}$$
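
o A quick numerical illustration (a sketch assuming SciPy is available, with a standard normal distribution chosen arbitrarily): differentiating the CDF recovers the pdf.

```python
import numpy as np
from scipy.stats import norm

x = np.linspace(-4.0, 4.0, 9)
h = 1e-5
# Central-difference approximation of dF_X(x)/dx
numeric_pdf = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)
print(np.allclose(numeric_pdf, norm.pdf(x), atol=1e-6))   # True
```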
Random Variable
o It is not necessarily true that:
n If
$$f_X(x) = \frac{\partial F_X(x)}{\partial x},$$
then the pdf of X exists and equals fX(x).



Random Vector
n If its joint density fX,Y(x,y) exists, then
$$f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}$$
where $F_{X,Y}(x,y) = \Pr[X \le x \text{ and } Y \le y]$.

n The conditional density of Y given that [X = x] is
$$f_{Y|X}(y|x) = \frac{f_{X,Y}(x,y)}{f_X(x)}$$
provided that $f_X(x) \ne 0$.


Random Process
o Random process: An extension of multi-
dimensional random vectors
n Representation of two-dimensional random vector
o (X,Y) = (X(1), X(2)) = {X(j), j ∈ I}, where the index set I
equals {1, 2}.
n Representation of m-dimensional random vector
o {X(j), j ∈ I}, where the index set I equals {1, 2, …, m}.



Random Process
n How about {X(t), t ∈ ℝ}?
o It is no longer a random vector since the index set is
continuous!
o This is a suitable model for, e.g., a noise because a
noise often exists continuously in time.



Stationarity
o The statistical property of a random process
encountered in real world is often independent
of the time at which the observation (or
experiment) is initiated.
o Mathematically, this can be formulated as follows:
for any t1, t2, …, tk and τ:
$$F_{X(t_1+\tau),\,X(t_2+\tau),\,\ldots,\,X(t_k+\tau)}(x_1, x_2, \ldots, x_k) = F_{X(t_1),\,X(t_2),\,\ldots,\,X(t_k)}(x_1, x_2, \ldots, x_k)$$
Stationarity
o Why introduce “stationarity”?
n With stationarity, we can be certain that the
observations made at different instances of time have
the same distributions!
n For example, X(0), X(T), X(2T), X(3T), ….

n Suppose that Pr[X(0) = 0] = Pr[X(0) = 1] = ½. Can we
guarantee that the relative frequency of “1’s
appearance” for experiments performed at several
different instances of time approaches ½ by stationarity?
No, we need an additional assumption!
Mean Function
o The mean of a random process X(t) at time t is
equal to:
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\,dx$$
where $f_{X(t)}(\cdot)$ is the pdf of X(t) at time t.

o If X(t) is stationary, µX(t) is a constant for all t.



Autocorrelation
o The autocorrelation function of a (possibly
complex) random process X(t) is given by
$$R_X(t_1, t_2) = E[X(t_1)\,X^*(t_2)]$$
o If X(t) is stationary, the autocorrelation
function RX(t1, t2) is equal to RX(t1 − t2, 0).
Autocorrelation
o A short-hand for the autocorrelation function of a stationary process:
$$R_X(\tau) \triangleq R_X(t+\tau, t) = E[X(t+\tau)\,X^*(t)]$$
Autocorrelation
o Conceptually,
n Autocorrelation function = “power correlation”
between two time instances t1 and t2.
n “Variance” is the degree of variation about the standard
value (i.e., the mean).



Autocovariance
o The autocovariance function of a (possibly complex) random process X(t) is
$$C_X(t_1, t_2) = E\bigl[(X(t_1) - \mu_X(t_1))\,(X(t_2) - \mu_X(t_2))^*\bigr] = R_X(t_1, t_2) - \mu_X(t_1)\,\mu_X^*(t_2)$$


Autocovariance
o If X(t) is stationary, CX(t1, t2) becomes a function of the time difference only:
$$C_X(t_1, t_2) = C_X(t_1 - t_2, 0) = R_X(t_1 - t_2, 0) - |\mu_X|^2$$


Wide-Sense Stationary (WSS)
o Since in most cases of practical interest only
the first two moments (i.e., µX(t) and CX(t1, t2))
are of concern, an alternative definition of
stationarity is introduced.
o Definition (Wide-Sense Stationarity) A
random process X(t) is WSS if
$$\begin{cases}\mu_X(t) = \text{constant};\\ C_X(t_1, t_2) = C_X(t_1 - t_2)\end{cases}\quad\text{or}\quad\begin{cases}\mu_X(t) = \text{constant};\\ R_X(t_1, t_2) = R_X(t_1 - t_2).\end{cases}$$
Wide-Sense Stationary (WSS)
o Alternative names for WSS
n weakly stationary
n stationary in the weak sense
n second-order stationary

o If the first two moments of a random process exist (i.e., are finite), then strictly stationary
implies weakly stationary (but not vice versa).



Cyclostationarity
o Definition (Cyclostationarity) A random
process X(t) is cyclostationary if there exists a
constant T such that
$$\begin{cases}\mu_X(t + T) = \mu_X(t);\\ C_X(t_1 + T, t_2 + T) = C_X(t_1, t_2).\end{cases}$$



Properties of Autocorrelation Function for
WSS Random Process
1. Mean Square Value: RX(0) = E[|X(t)|²]
2. Conjugate Symmetry: RX(−τ) = RX*(τ)

n Recall that autocorrelation function = “power
correlation” between two time instances t1 and t2.
n For a WSS process, this “power correlation” only
depends on the time difference.
n Hence, we only need to deal with RX(τ) here.



Properties of Autocorrelation Function for
WSS Random Process
3. Real Part Peaks at Zero: |Re{RX(τ)}| ≤ RX(0)
Proof: $E\bigl[|X(t+\tau) \pm X(t)|^2\bigr] = 2R_X(0) \pm 2\,\mathrm{Re}\{R_X(\tau)\} \ge 0.$
Hence, $|\mathrm{Re}\{R_X(\tau)\}| \le R_X(0),$
with equality holding when $X(t+\tau) = \pm X(t)$ with probability 1.


Properties of Autocorrelation Function for
WSS Random Process
o Operational meaning of autocorrelation
function:
n The “power” correlation of a random process at τ
seconds apart.
n The smaller RX(τ) is, the less the correlation
between X(t) and X(t+τ).
o Here, we assume 𝑋(𝑡) is a real-valued random process.



Properties of Autocorrelation Function for
WSS Random Process
n If RX(τ) decreases faster, the correlation decreases
faster.
(Figure: RX(τ) versus τ.)



Example: Signal with Random Phase
o Let X(t) = A cos(2πfct + Θ), where Θ is
uniformly distributed over [−π, π).
n Application: A local carrier at the receiver side may
have a random “phase difference” with respect to
the phase of the carrier at the transmitter side.



Example: Signal with Random Phase
(Block diagram: encoder → modulator → channel. Bits 0110… are mapped to …, −m(t), m(t), m(t), −m(t); the carrier wave is Ac cos(2πfct) and the channel noise is w(t) = 0. The receiver correlates s(t) with a local carrier cos(2πfct + Θ) over each symbol interval of length T and thresholds the output yT. An equivalent view: the local carrier is X(t) = A cos(2πfct + Θ).)
Example: Signal with Random Phase
Then
$$\begin{aligned}
\mu_X(t) &= E[A\cos(2\pi f_c t + \Theta)]\\
&= \int_{-\pi}^{\pi} A\cos(2\pi f_c t + \theta)\,\frac{1}{2\pi}\,d\theta\\
&= \frac{A}{2\pi}\int_{-\pi}^{\pi} \cos(\theta + 2\pi f_c t)\,d\theta\\
&= \frac{A}{2\pi}\Bigl[\sin(\theta + 2\pi f_c t)\Bigr]_{-\pi}^{\pi}\\
&= \frac{A}{2\pi}\bigl(\sin(\pi + 2\pi f_c t) - \sin(-\pi + 2\pi f_c t)\bigr)\\
&= 0.
\end{aligned}$$
Example: Signal with Random Phase
$$\begin{aligned}
R_X(t_1, t_2) &= E\bigl[A\cos(2\pi f_c t_1 + \Theta)\cdot A\cos(2\pi f_c t_2 + \Theta)\bigr]\\
&= A^2\int_{-\pi}^{\pi} \cos(\theta + 2\pi f_c t_1)\cos(\theta + 2\pi f_c t_2)\,\frac{1}{2\pi}\,d\theta\\
&= \frac{A^2}{2\pi}\int_{-\pi}^{\pi} \frac{1}{2}\Bigl(\cos[(\theta + 2\pi f_c t_1)+(\theta + 2\pi f_c t_2)] + \cos[(\theta + 2\pi f_c t_1)-(\theta + 2\pi f_c t_2)]\Bigr)d\theta\\
&= \frac{A^2}{2\pi}\int_{-\pi}^{\pi} \frac{1}{2}\Bigl(\underbrace{\cos\bigl(2\theta + 2\pi f_c (t_1+t_2)\bigr)}_{\text{integrates to }0} + \cos\bigl(2\pi f_c (t_1 - t_2)\bigr)\Bigr)d\theta\\
&= \frac{A^2}{2}\cos\bigl(2\pi f_c (t_1 - t_2)\bigr).
\end{aligned}$$
Hence, X(t) is WSS.
Example: Signal with Random Phase
(Figure: RX(τ) = (A²/2) cos(2πfcτ) plotted versus τ.)
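
o A minimal Monte Carlo sketch of this example (the amplitude, carrier frequency, times, and trial count are arbitrary choices) checking that μX(t) ≈ 0 and RX(t+τ, t) ≈ (A²/2)cos(2πfcτ):

```python
import numpy as np

rng = np.random.default_rng(0)
A, fc = 2.0, 5.0            # arbitrary amplitude and carrier frequency
t, tau = 0.13, 0.02         # arbitrary observation time and lag
trials = 200_000

theta = rng.uniform(-np.pi, np.pi, trials)          # one random phase per realization
x_t    = A * np.cos(2 * np.pi * fc * t + theta)
x_ttau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)

print("mean estimate         :", x_t.mean())              # close to 0
print("R_X(t+tau, t) estimate:", (x_ttau * x_t).mean())    # close to (A^2/2) cos(2*pi*fc*tau)
print("theory                :", A**2 / 2 * np.cos(2 * np.pi * fc * tau))
```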


Example: Signal with Random Delay
o Let
$$X(t) = \sum_{n=-\infty}^{\infty} A\cdot I_n\cdot p(t - nT - t_d)$$
where …, I−2, I−1, I0, I1, I2, … are independent,
and each Ij is either −1 or +1 with equal
probability, and
$$p(t) = \begin{cases}1, & 0 \le t < T\\ 0, & \text{otherwise.}\end{cases}$$
Example: Signal with Random Delay

I-4 I-3 I-2 I-1 I0 I1 I2 I3

I-4 I-2 I-1 I1 I2 I3

I-3 I0



Example: Signal with Random Delay
(Block diagram: encoder → modulator → channel with m(t) = p(t), no (or ignored) carrier wave, and noise w(t) = 0; the receiver integrates s(t) over [td, td + T] and thresholds the output yT. An equivalent view: X(t) = A p(t − td).)
Example: Signal with Random Delay
n By assuming that td is uniformly distributed over [0,
T) and independent of the In’s, we obtain:
$$\begin{aligned}
\mu_X(t) &= E\Bigl[\sum_{n=-\infty}^{\infty} A\cdot I_n\cdot p(t - nT - t_d)\Bigr]\\
&= \sum_{n=-\infty}^{\infty} A\cdot E[I_n]\cdot E[p(t - nT - t_d)]\\
&= \sum_{n=-\infty}^{\infty} A\cdot 0\cdot E[p(t - nT - t_d)]\\
&= 0.
\end{aligned}$$
Example: Signal with Random Delay
n A useful probabilistic rule: E[X] = E[E[X|Y]].
So, we have:
$$E[X(t_1)X(t_2)] = E\bigl[E[X(t_1)X(t_2)\mid t_d]\bigr]$$
Note: $E[X \mid Y = y] = \int_{\mathcal{X}} x\, f_{X|Y}(x|y)\,dx = g(y)$ and $E\bigl[E[X|Y]\bigr] = \int_{\mathcal{Y}} g(y)\, f_Y(y)\,dy.$


$$\begin{aligned}
E[X(t_1)X(t_2)\mid t_d]
&= E\Bigl[\Bigl(\sum_{n=-\infty}^{\infty} A\, I_n\, p(t_1 - nT - t_d)\Bigr)\Bigl(\sum_{m=-\infty}^{\infty} A\, I_m\, p(t_2 - mT - t_d)\Bigr)\Bigm|\, t_d\Bigr]\\
&= A^2 \sum_{n=-\infty}^{\infty}\sum_{m=-\infty}^{\infty} E[I_n I_m \mid t_d]\; E[p(t_1 - nT - t_d)\,p(t_2 - mT - t_d)\mid t_d]\\
&= A^2 \sum_{n=-\infty}^{\infty}\sum_{m=-\infty}^{\infty} E[I_n I_m]\; p(t_1 - nT - t_d)\,p(t_2 - mT - t_d)\\
&= A^2 \sum_{n=-\infty}^{\infty} E[I_n^2]\; p(t_1 - nT - t_d)\,p(t_2 - nT - t_d)\\
&= A^2 \sum_{n=-\infty}^{\infty} p(t_1 - nT - t_d)\,p(t_2 - nT - t_d)
\end{aligned}$$
since $E[I_n I_m] = E[I_n]E[I_m] = 0$ for $n \ne m$.
Among $-\infty < n < \infty$, there is at most one n that can make
$$p(t_1 - nT - t_d)\,p(t_2 - nT - t_d) = 1.$$


As a result, averaging over td uniform on [0, T) gives
$$R_X(t_1, t_2) = A^2\Bigl(1 - \frac{|t_1 - t_2|}{T}\Bigr)\quad\text{for } |t_1 - t_2| < T,$$
and $R_X(t_1, t_2) = 0$ otherwise, so X(t) is WSS.


Example: Signal with Random Delay
(Figure: the triangular autocorrelation RX(τ) = A²(1 − |τ|/T) for |τ| ≤ T, zero elsewhere.)
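
o A simulation sketch of this example (the amplitude, pulse width, observation time, and trial count are arbitrary choices) estimating E[X(t1+τ)X(t1)] and comparing it with the triangular RX(τ):

```python
import numpy as np

rng = np.random.default_rng(1)
A, T = 1.0, 1.0
t1 = 3.7                                        # arbitrary observation time
taus = np.linspace(-1.5 * T, 1.5 * T, 31)
trials = 50_000

td = rng.uniform(0.0, T, trials)                      # random delay, one per realization
bits = rng.choice([-1.0, 1.0], size=(trials, 8))      # I_0, ..., I_7 (enough for the times used)

def X(t):
    """X(t) = A * I_n * p(t - nT - td): pick the bit whose pulse covers time t."""
    n = np.floor((t - td) / T).astype(int)
    return A * bits[np.arange(trials), n]

x1 = X(t1)
estimate = np.array([np.mean(X(t1 + tau) * x1) for tau in taus])
theory = A**2 * np.maximum(0.0, 1.0 - np.abs(taus) / T)
print("max deviation from A^2 (1 - |tau|/T):", np.max(np.abs(estimate - theory)))
```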


Cross-Correlation
o The cross-correlation between two processes
X(t) and Y(t) is:
$$R_{X,Y}(t, u) = E[X(t)\,Y^*(u)]$$
o Sometimes, their correlation matrix is given instead for convenience:
$$\mathbf{R}_{X,Y}(t,u) = \begin{bmatrix} R_X(t,u) & R_{X,Y}(t,u)\\ R_{Y,X}(t,u) & R_Y(t,u)\end{bmatrix}$$
Cross-Correlation
o If X(t) and Y(t) are jointly WSS, then
$$\mathbf{R}_{X,Y}(t,u) = \mathbf{R}_{X,Y}(t-u) = \begin{bmatrix} R_X(t-u) & R_{X,Y}(t-u)\\ R_{Y,X}(t-u) & R_Y(t-u)\end{bmatrix}$$



Example: Quadrature-Modulated Random
Delay Processes
o Consider a pair of quadrature components of X(t):
$$\begin{cases} X_I(t) = X(t)\cos(2\pi f_c t + \Theta)\\ X_Q(t) = X(t)\sin(2\pi f_c t + \Theta)\end{cases}$$
where Θ is independent of X(t) and is uniformly
distributed over [0, 2π), and
$$X(t) = \sum_{n=-\infty}^{\infty} A\cdot I_n\cdot p(t - nT - t_d).$$



Example: Quadrature-Modulated Random
Delay Processes

RX I ,XQ
(t , u ) = E[ X I (t ) X Q (u )]
= E[ X (t ) cos(2pf c t + Q) × X (u ) sin(2pf c u + Q)]
= E[ X (t ) X (u )]E[sin( 2pf c u + Q) cos(2pf c t + Q)]
=0
é sin(2pf c (t + u ) + 2Q) + sin(2pf c (u - t )) ù
= RX (t , u ) E ê úû
ë 2
1
= - sin(2pf c (t - u )) RX (t , u )
2



Example: Quadrature-Modulated Random
Delay Processes



Example: Quadrature-Modulated Random
Delay Processes
n Notably, if t = u, i.e., the two quadrature components
are synchronized, then
$$R_{X_I, X_Q}(t, t) = 0,$$
which indicates that simultaneous observations of
the quadrature-modulated processes are “orthogonal”
to each other!
(See the Joint Moments discussion below for a formal
definition of orthogonality.)
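
o A small simulation sketch of this observation (the carrier frequency, observation time, and the simple ±1 stand-in for X(t) are arbitrary; only the independence of Θ and X(t) matters here), confirming that RXI,XQ(t, t) ≈ 0:

```python
import numpy as np

rng = np.random.default_rng(2)
fc, t = 3.0, 0.4                                  # arbitrary carrier frequency and time
trials = 500_000

theta = rng.uniform(0.0, 2 * np.pi, trials)       # Theta ~ uniform[0, 2*pi), independent of X(t)
x = rng.choice([-1.0, 1.0], trials)               # stand-in samples of X(t)
xi = x * np.cos(2 * np.pi * fc * t + theta)
xq = x * np.sin(2 * np.pi * fc * t + theta)

print(np.mean(xi * xq))    # close to 0: simultaneous samples of X_I and X_Q are orthogonal
```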



Ergodicity
o For a random-process-modeled noise (or random-
process-modeled source) X(t), how can we know its
mean and variance?
n Answer: Relative frequency.
n How can we get the relative frequency?
o By measuring X(t1), X(t2), …, X(tn), and calculating their average,
it is expected that this time average will be close to its mean.
o Question: Will this time average be close to its mean,
if X(t) is stationary ?
n Even though, for a stationary process, the mean function µX(t) is
independent of time t, the answer is negative!
Ergodicity
n An additional ergodicity assumption is necessary
for the time average to converge to the ensemble
average µX.



Time Average versus Ensemble Average
o Example
n X(t) is stationary.
n For any t, X(t) is uniformly distributed over {1, 2, 3,
4, 5, 6}.
n Then, ensemble average is equal to:
$$1\cdot\tfrac{1}{6} + 2\cdot\tfrac{1}{6} + 3\cdot\tfrac{1}{6} + 4\cdot\tfrac{1}{6} + 5\cdot\tfrac{1}{6} + 6\cdot\tfrac{1}{6} = 3.5$$



Time Average versus Ensemble Average
n We make a series of observations at time 0, T,
2T, …, 10T to obtain 1, 2, 3, 4, 3, 2, 5, 6, 4, 1.
(These observations are deterministic!)
n Then, the time average is equal to:
$$\frac{1 + 2 + 3 + 4 + 3 + 2 + 5 + 6 + 4 + 1}{10} = 3.1$$

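
o A minimal sketch contrasting the two averages (the "frozen die" process below is an assumed illustration): both processes are stationary with ensemble mean 3.5, but only the one whose observations keep refreshing has a time average that approaches it.

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs = 10_000

# Ergodic-in-the-mean case: a fresh, independent fair-die value at each observation time
ergodic_path = rng.integers(1, 7, n_obs)

# Stationary but NOT ergodic: one die roll, observed repeatedly at every time instant
frozen_path = np.full(n_obs, rng.integers(1, 7))

print("ensemble average   :", 3.5)
print("time avg (ergodic) :", ergodic_path.mean())   # close to 3.5
print("time avg (frozen)  :", frozen_path.mean())    # equals whatever the single roll was
```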


Ergodicity
o Definition. A stationary process X(t) is ergodic
in the mean if
1. $\Pr\bigl[\lim_{T\to\infty} \mu_X(T) = \mu_X\bigr] = 1$, and
2. $\lim_{T\to\infty} \operatorname{Var}[\mu_X(T)] = 0,$
where the time average is
$$\mu_X(T) = \frac{1}{2T}\int_{-T}^{T} X(t)\,dt$$
and µX is the ensemble average.



Ergodicity
o Definition. A stationary process X(t) is ergodic
in the autocorrelation function if
1. $\Pr\bigl[\lim_{T\to\infty} R_X(\tau; T) = R_X(\tau)\bigr] = 1$, and
2. $\lim_{T\to\infty} \operatorname{Var}[R_X(\tau; T)] = 0,$
where the time average is
$$R_X(\tau; T) = \frac{1}{2T}\int_{-T}^{T} X(t+\tau)\,X^*(t)\,dt$$
and RX(τ) is the ensemble average.
Ergodicity
o Experiments (or observations) on the same
process can only be performed at different time.
o “Stationarity” only guarantees that the
observations made at different time come from
the same distribution.
n Example. Rolling two different fair dice will give
two results, but the two results have the same
distribution.



Statistical Average of Random Variables
o Alternative names of ensemble average
n Mean
n Expected value, or expectation value
n Sample average
o How about the sample average of a function g( )
of a random variable X ?

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx$$


Statistical Average of Random Variables
o The nth moment of random variable X:
$$E[X^n] = \int_{-\infty}^{\infty} x^n\, f_X(x)\,dx$$
n The 2nd moment is also named mean-square value.
o The nth central moment of random variable X:
$$E[(X - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n\, f_X(x)\,dx$$
n The 2nd central moment is also named variance.


n Square root of the 2nd central moment is also named
standard deviation.
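
o A quick numerical sketch of these quantities (the exponential distribution and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=1.0, size=1_000_000)    # arbitrary example distribution

m1  = np.mean(x)              # 1st moment (mean); theory: 1
m2  = np.mean(x**2)           # 2nd moment (mean-square value); theory: 2
var = np.mean((x - m1)**2)    # 2nd central moment (variance); theory: 1
std = np.sqrt(var)            # standard deviation; theory: 1
print(m1, m2, var, std)
```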



Joint Moments
o The joint moment of X and Y is given by:
$$E[X^i\, Y^j] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^i\, y^j\, f_{X,Y}(x, y)\,dx\,dy$$
n When i = j = 1, the joint moment is specifically
named correlation.
n The correlation of centered random variables is
specifically named covariance.



Joint Moments
n Two random variables, X and Y, are uncorrelated if
Cov[X, Y] = 0.
n Two random variables, X and Y, are orthogonal if
E[XY*] = 0.
n The covariance, normalized by two standard
deviations, is named correlation coefficient of X
and Y.
$$\rho = \frac{\operatorname{Cov}[X, Y]}{\sigma_X\,\sigma_Y}$$
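
o A short sketch (the linear relation between X and Y below is an arbitrary construction) computing the covariance and the correlation coefficient from samples:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)          # correlated with x by construction

cov = np.mean((x - x.mean()) * (y - y.mean()))
rho = cov / (x.std() * y.std())
print("Cov[X, Y] estimate:", cov)         # theory: 1
print("rho estimate      :", rho)         # theory: 1/sqrt(1.25), about 0.894
```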



Stable Linear Time-Invariant (LTI) System
(Block diagram: X(t) → impulse response h(t) → Y(t).)
o Linear
n Y(t) is a linear function of X(t).
n Specifically, Y(t) is a weighted sum of X(t).
o Time-invariant
n The weights are time-independent.
o Stable
n Dirichlet’s condition (defined later) and $\int_{-\infty}^{\infty} |h(t)|^2\,dt < \infty$
n “Stability” implies that if the input is an energy function (i.e., has finite
energy), the output is an energy function.



Example of LTI Filter: Mobile Radio Channel
(Figure: a multipath mobile radio channel. The transmitted signal X(t) reaches the receiver over three paths with gains and delays (α1, τ1), (α2, τ2), (α3, τ3); the receiver observes Y(t).)


Example of LTI Filter: Mobile Radio Channel

(Figure: Transmitter → X(t) → channel → Y(t) → Receiver.)
$$Y(t) = \int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau$$



Stable Linear Time-Invariant (LTI) System
o What are the mean and autocorrelation
functions of the LTI filter output Y(t)?
n Suppose X(t) is stationary and has finite mean.
n Suppose $\int_{-\infty}^{\infty} |h(\tau)|\,d\tau < \infty$.
n Then
$$\mu_Y(t) = E[Y(t)] = E\Bigl[\int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau\Bigr] = \int_{-\infty}^{\infty} h(\tau)\,E[X(t - \tau)]\,d\tau = \mu_X \int_{-\infty}^{\infty} h(\tau)\,d\tau$$



Zero-Frequency (ZF) or Direct Current
(DC) Response

(Block diagram: constant input 1 → impulse response h(t) → DC response ∫h(τ)dτ.)

n The mean of the LTI filter output process is equal to
the mean of the stationary filter input multiplied by
the DC response of the system:
$$\mu_Y = \mu_X \int_{-\infty}^{\infty} h(\tau)\,d\tau$$



Autocorrelation Relation of LTI system
o By a similar calculation,
$$R_Y(t, u) = E[Y(t)\,Y^*(u)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h^*(\tau_2)\,R_X(t - \tau_1, u - \tau_2)\,d\tau_2\,d\tau_1.$$


Important Fact: WSS Input Induces
WSS Output
o From the above derivations, we conclude:
n For a stable LTI filter, a WSS input is guaranteed to
induce a WSS output.
n In general (not necessarily WSS),
$$\mu_Y(t) = \int_{-\infty}^{\infty} h(\tau)\,\mu_X(t - \tau)\,d\tau$$
$$R_Y(t, u) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h^*(\tau_2)\,R_X(t - \tau_1, u - \tau_2)\,d\tau_2\,d\tau_1$$
n Since the above two quantities are also related in
“convolution” form, a spectral analysis is perhaps
better suited to characterizing their relationship.
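
o A discrete-time sketch of the mean relation (the FIR taps, input mean, and sample size are arbitrary choices): filtering a WSS input and checking that μY ≈ μX Σk h[k], the discrete analogue of μY = μX ∫h(τ)dτ.

```python
import numpy as np

rng = np.random.default_rng(6)
h = np.array([0.5, 0.3, 0.2, 0.1])            # arbitrary, absolutely summable FIR impulse response
mu_x = 2.0
x = mu_x + rng.normal(size=200_000)           # WSS input: constant mean plus white noise

y = np.convolve(x, h, mode="valid")           # LTI filtering
print("mu_Y estimate :", y.mean())            # close to mu_X * sum(h)
print("mu_X * DC gain:", mu_x * h.sum())      # 2.0 * 1.1 = 2.2
```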
Summary
o Random variable, random vector and random process
o Autocorrelation and cross-correlation
o Definition of WSS
o Why ergodicity?
n Time average as a good “estimate” of ensemble average

