Random Process: Some Examples


• Quadrature Modulation Process Consider two random processes X1(t) and X2(t) defined by

  X1(t) = X(t) cos(ω0 t + Θ)

  X2(t) = X(t) sin(ω0 t + Θ)

where ω0 is a constant and Θ is a random variable, independent of X(t), that is uniformly distributed over the range 0 to 2π, that is,

  fΘ(θ) = 1/(2π),  0 ≤ θ ≤ 2π
        = 0,       elsewhere

The cross-correlation function of X1(t) and X2(t) is



R12(τ) = E[X1(t) X2(t + τ)]
       = E[X(t) X(t + τ) cos(ω0 t + Θ) sin(ω0 (t + τ) + Θ)]
       = RX(τ) E[cos(ω0 t + Θ) sin(ω0 (t + τ) + Θ)]          (since X(t) and Θ are independent)
       = RX(τ) ∫_0^{2π} (1/2π) cos(ω0 t + θ) sin(ω0 (t + τ) + θ) dθ
       = (1/2) RX(τ) sin(ω0 τ)

where the identity cos A sin B = (1/2)[sin(A + B) − sin(A − B)] has been used: the sin(A + B) term contains 2θ and integrates to zero over a full period, while −sin(A − B) = sin(ω0 τ).
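A quick numerical sanity check of this result is sketched below, under assumptions not taken from the notes: X(t) is modelled as a constant-in-time random amplitude A ~ N(0, 1), so that RX(τ) = 1 for every τ, and Θ is drawn uniformly from [0, 2π) independently of A. The Monte Carlo estimate of E[X1(t) X2(t + τ)] should then approach (1/2) sin(ω0 τ).

```python
# Monte Carlo check of R12(tau) = 0.5 * R_X(tau) * sin(w0 * tau).
# Illustrative assumptions (not from the notes): X(t) = A with A ~ N(0, 1),
# so R_X(tau) = 1 for all tau; Theta ~ Uniform(0, 2*pi), independent of A.
import numpy as np

rng = np.random.default_rng(0)
w0 = 2 * np.pi * 5.0          # carrier frequency in rad/s (arbitrary choice)
t = 0.3                       # fixed observation instant; the result should not depend on it
n_realisations = 200_000

A = rng.standard_normal(n_realisations)             # amplitude process X(t) = A
theta = rng.uniform(0.0, 2 * np.pi, n_realisations)

for tau in (0.0, 0.02, 0.05, 0.1):
    x1 = A * np.cos(w0 * t + theta)                 # X1(t)
    x2 = A * np.sin(w0 * (t + tau) + theta)         # X2(t + tau)
    estimate = np.mean(x1 * x2)                     # ensemble average over realisations
    theory = 0.5 * np.sin(w0 * tau)                 # 0.5 * R_X(tau) * sin(w0 * tau), with R_X = 1
    print(f"tau = {tau:5.2f}   estimate = {estimate:+.4f}   theory = {theory:+.4f}")
```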

Random Process: Time vs. Ensemble Averages
Ensemble averages
• It is difficult in practice to generate a large number of realisations of a random process
• =⇒ use time averages computed from a single realisation
• Mean

  µx(T) = (1/2T) ∫_{−T}^{+T} x(t) dt

• Autocorrelation

  Rx(τ, T) = (1/2T) ∫_{−T}^{+T} x(t) x(t + τ) dt
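These time averages are straightforward to approximate from a sampled realisation. The sketch below uses an illustrative random-phase sinusoid (not defined in the notes) and replaces the integrals with sample means over a uniform grid on [−T, +T].

```python
# Time-average mean and autocorrelation from a single realisation,
# approximating mu_x(T) = (1/2T) * integral of x(t) dt and
# R_x(tau, T) = (1/2T) * integral of x(t) x(t + tau) dt.
# Illustrative assumption: x(t) = cos(w0*t + theta0) on a uniform sampling grid.
import numpy as np

fs = 1000.0                          # sampling rate in Hz
T = 10.0                             # observation window is [-T, +T]
t = np.arange(-T, T, 1.0 / fs)
w0, theta0 = 2 * np.pi * 3.0, 1.2
x = np.cos(w0 * t + theta0)          # one realisation

mu_T = np.mean(x)                    # discrete stand-in for (1/2T) * integral of x(t) dt

def time_autocorrelation(x, lag_samples):
    """Discrete stand-in for (1/2T) * integral of x(t) x(t + tau) dt."""
    if lag_samples == 0:
        return np.mean(x * x)
    return np.mean(x[:-lag_samples] * x[lag_samples:])

tau = 0.05                           # lag in seconds
R_T = time_autocorrelation(x, int(round(tau * fs)))
print(f"mu_x(T)       ~ {mu_T:+.4f}")
print(f"R_x({tau}, T) ~ {R_T:+.4f}  (compare with 0.5*cos(w0*tau) = {0.5 * np.cos(w0 * tau):+.4f})")
```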

• Ergodicity A random process is called ergodic if


1. it is ergodic in mean:

   lim_{T→+∞} µx(T) = µX
   lim_{T→+∞} var[µx(T)] = 0

2. it is ergodic in autocorrelation:

   lim_{T→+∞} Rx(τ, T) = RX(τ)
   lim_{T→+∞} var[Rx(τ, T)] = 0

where µX and RX(τ) are, respectively, the ensemble mean and ensemble autocorrelation of the same random process.
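As an illustration of ergodicity in the mean, the sketch below uses the random-phase sinusoid X(t) = cos(ω0 t + Θ), Θ uniform on [0, 2π), a standard example that is not part of these notes. Its ensemble mean is µX = 0, and both conditions can be observed numerically: µx(T) clusters around 0 and its variance across realisations shrinks as T grows.

```python
# Numerical illustration of ergodicity in the mean for the random-phase
# sinusoid X(t) = cos(w0*t + Theta), Theta ~ Uniform(0, 2*pi).
# This example process is an assumption for illustration; its ensemble mean mu_X = 0.
import numpy as np

rng = np.random.default_rng(1)
w0 = 2 * np.pi * 3.0
fs = 500.0
n_realisations = 500

for T in (1.0, 10.0, 100.0):
    t = np.arange(-T, T, 1.0 / fs)
    thetas = rng.uniform(0.0, 2 * np.pi, n_realisations)
    # time-average mean of each realisation over [-T, +T]
    mu_T = np.array([np.mean(np.cos(w0 * t + th)) for th in thetas])
    print(f"T = {T:6.1f}   mean of mu_x(T) = {mu_T.mean():+.5f}   var[mu_x(T)] = {mu_T.var():.2e}")
```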
Random Processes and Linear Shift Invariant (LSI) Systems
• The communication channel can be thought of as a system
• The signal transmitted through the channel is a realisation of a random process
• It is therefore necessary to understand the behaviour of a random signal applied as input to a system
• For analysis purposes the system is assumed to be LSI
• Linear Shift Invariant (LSI) Systems

Figure 1: An LSI system
In Figure 1, h[n] is an LSI system if it satisfies the following two properties:
– Linearity The system is called linear if, for all signals x1[n] and x2[n] and any constants a and b,

  x1[n] → y1[n]
  x2[n] → y2[n]
  =⇒ a·x1[n] + b·x2[n] → a·y1[n] + b·y2[n]

– Shift Invariance The system is called shift invariant if, for any signal x[n] and any shift n0,

  x[n] → y[n]
  =⇒ x[n − n0] → y[n − n0]

∗ Linearity means that if the input is scaled, the output is scaled by the same factor.
∗ The system also supports superposition:
  · when two signals are added in the time domain, the output is the sum of the individual responses.
∗ Shift invariance means that if the input to the system is delayed by n0, the output is also delayed by n0.
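Both properties can be verified numerically for a concrete system. The sketch below takes y[n] to be the convolution of x[n] with a short impulse response h[n] (an arbitrary illustrative choice, not from the notes) and checks superposition and shift invariance on random test signals.

```python
# Minimal check that a convolution system y[n] = sum_k h[k] x[n-k] is LSI:
# it satisfies superposition (linearity) and commutes with shifts.
# The impulse response h[n] below is an arbitrary illustrative choice.
import numpy as np

rng = np.random.default_rng(2)
h = np.array([0.5, 1.0, -0.25])                  # impulse response of the system

def system(x):
    """y[n] = (h * x)[n], truncated to len(x) samples for easy comparison."""
    return np.convolve(x, h)[: len(x)]

x1 = rng.standard_normal(64)
x2 = rng.standard_normal(64)
a, b = 2.0, -3.0

# Linearity: a*x1[n] + b*x2[n] -> a*y1[n] + b*y2[n]
lhs = system(a * x1 + b * x2)
rhs = a * system(x1) + b * system(x2)
print("linearity holds:       ", np.allclose(lhs, rhs))

# Shift invariance: x[n - n0] -> y[n - n0] (shift realised by zero-padding at the front)
n0 = 5
x_shifted = np.concatenate([np.zeros(n0), x1[:-n0]])
y_shifted = np.concatenate([np.zeros(n0), system(x1)[:-n0]])
print("shift invariance holds:", np.allclose(system(x_shifted), y_shifted))
```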
