
EE320A

Tutorial 10
Date: 13th Oct. 2023

1. Consider a random process X(t) defined by

X(t) = sin(2πF t)

in which the frequency F is a random variable with the probability density
function

fF(f) = 1/W for 0 ≤ f ≤ W, and 0 elsewhere.

Is X(t) WSS?
2. A random process X(t) is defined by

X(t) = A cos(2πfc t)

where A is a Gaussian distributed random variable with zero mean and
variance σA². This random process is applied to an ideal integrator,
producing an output Y(t) defined by

Y(t) = ∫_{τ=0}^{t} X(τ) dτ.

(a) Determine the probability density function of the output Y(t) at a
particular time tk.
(b) Determine whether Y(t) is WSS.
(c) Determine whether Y(t) is ergodic in the mean and the autocorrelation.

[Figure: the square wave x(t), of pulse width T0/2, period T0, and delay td.]

Figure 1: A periodic waveform with a random timing phase.

3. The square wave x(t) of constant amplitude A, period T0 and delay td rep-
resents the sample function of a random process X(t). This is illustrated
in Figure 1. The delay is random and is described by the pdf

fTd(td) = 1/T0 for 0 ≤ td < T0, and 0 otherwise.

(a) Determine the mean and autocorrelation of X(t) using ensemble averaging.
(b) Determine the mean and autocorrelation of X(t) using time averaging.
(c) Is X(t) WSS?
(d) Is X(t) ergodic in the mean and the autocorrelation?

[Figure: SX(f) consists of a unit-strength impulse δ(f) at f = 0 plus a unit-height triangle spanning −f0 ≤ f ≤ f0.]

Figure 2: Power spectral density of a random process X(t).

4. The psd of a random process X(t) is shown in Figure 2.


(a) Determine RX (τ ).
(b) Determine the dc power in X(t).
(c) Determine the ac power in X(t).
(d) What sampling rates will give uncorrelated samples of X(t)? Are the
samples statistically independent?
5. Using ensemble averaging, find the mean and the autocorrelation of the
random process given by:

X(t) = Σ_{k=−∞}^{∞} Sk p(t − kT − α)

where Sk denotes a discrete random variable taking values ±A with equal
probability, p(·) denotes a real-valued waveform, 1/T denotes the bit rate
and α denotes a random timing phase uniformly distributed in [0, T).
Assume that Sk is independent of α and Sk is independent of Sj for k ≠ j.
EE320A
Solutions for Tutorial 10
Date: 13th Oct. 2023
1. Solution: Let us first compute the mean value of X(t).
E[X(t)] = (1/W) ∫_{f=0}^{W} sin(2πft) df
        = −(1/(2πWt)) (cos(2πWt) − 1)                                (1)
which is a function of time. Hence X(t) is not WSS.
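
As a quick numerical check (not part of the original tutorial), here is a
minimal Monte Carlo sketch in Python, assuming NumPy; the bandwidth W and
the time instants are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
W = 2.0                                  # assumed bandwidth of the pdf of F
F = rng.uniform(0.0, W, size=500_000)    # F ~ U[0, W]

for t in (0.1, 0.25, 0.7):
    est = np.mean(np.sin(2 * np.pi * F * t))                         # ensemble mean of X(t)
    closed = -(np.cos(2 * np.pi * W * t) - 1) / (2 * np.pi * W * t)  # eq. (1)
    print(f"t={t}: simulated {est:.4f}, eq.(1) {closed:.4f}")
```

The estimated mean visibly changes with t, consistent with the conclusion
that X(t) is not WSS.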
2. Solution: The random variable Y(tk) is given by:

Y(tk) = (A/(2πfc)) sin(2πfc tk).                                     (2)

Since all the terms in the above equation except A are constants for the
time instant tk, Y(tk) is also a Gaussian distributed random variable
with mean and variance given by

E[Y(tk)]  = (E[A]/(2πfc)) sin(2πfc tk) = 0
E[Y²(tk)] = (E[A²]/(4π²fc²)) sin²(2πfc tk)
          = (σA²/(4π²fc²)) sin²(2πfc tk).                            (3)
Since the variance is a function of time, Y(t) is not WSS. Hence it is not
ergodic in the autocorrelation. However, it can be shown that the
time-averaged mean is zero. Hence Y(t) is ergodic in the mean.
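
A short simulation sketch (my addition, assuming NumPy; σA, fc and the
sampling instants are arbitrary) confirming that the variance in (3)
depends on the time instant:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_A, fc = 1.5, 10.0                  # assumed std. deviation and carrier frequency
A = rng.normal(0.0, sigma_A, size=500_000)

for tk in (0.012, 0.03):                 # arbitrary sampling instants
    Y = A * np.sin(2 * np.pi * fc * tk) / (2 * np.pi * fc)           # eq. (2)
    theory = sigma_A**2 * np.sin(2 * np.pi * fc * tk)**2 / (2 * np.pi * fc)**2
    print(f"t_k={tk}: simulated var {Y.var():.3e}, eq.(3) {theory:.3e}")
```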
3. Solution: Instead of solving this particular problem we try to solve a
more general problem. Let p(t) denote an arbitrary pulse shape. Consider
a random process X(t) defined by:

X(t) = Σ_{k=−∞}^{∞} p(t − kT0 − td)                                  (4)

where td is a uniformly distributed random variable in the range [0, T0).


Clearly, in the absence of td , X(t) is no longer a random process and it
simply becomes a periodic waveform. We wish to find out the mean and
autocorrelation of the random process defined in (4). The mean of X(t)
is given by:
" ∞ #
X
E[X(t)] = E p(t − kT0 − td )
k=−∞
K Vasudevan Faculty of EE IIT Kanpur ([email protected]) 4

T0 ∞
1
Z X
= p(t − kT0 − td ) dtd . (5)
T0 td =0 k=−∞

Interchanging the order of integration and summation and substituting

t − kT0 − td = x (6)

we get
E[X(t)] = (1/T0) Σ_{k=−∞}^{∞} ∫_{x=t−kT0−T0}^{t−kT0} p(x) dx.        (7)

Combining the summation and the integral we get:


E[X(t)] = (1/T0) ∫_{x=−∞}^{∞} p(x) dx.                               (8)

For the given problem:

E[X(t)] = A/2.                                                       (9)
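
A one-line numerical confirmation of (8)-(9) (my addition, assuming NumPy
and the pulse of Figure 3(a); A and T0 are arbitrary):

```python
import numpy as np

A, T0 = 2.0, 1.0                                    # assumed amplitude and period
x = np.linspace(-T0, 2 * T0, 300_001)
pulse = np.where((x >= 0) & (x < T0 / 2), A, 0.0)   # single pulse p(x) of the square wave

mean = pulse.sum() * (x[1] - x[0]) / T0             # eq. (8): (1/T0) ∫ p(x) dx
print(mean)                                         # ≈ A/2 = 1.0, matching eq. (9)
```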
The autocorrelation can be computed as:
" ∞
X
E[X(t)X(t − τ )] = E p(t − kT0 − td )
k=−∞


X
p(t − τ − jT0 − td )
j=−∞
∞ ∞
1 X X
=
T0
k=−∞ j=−∞
Z T0
p(t − kT0 − td )p(t − τ − jT0 − td ) dtd .
td =0
(10)

Substituting

t − kT0 − td = x (11)

we get
E[X(t)X(t − τ)] = (1/T0) Σ_{k=−∞}^{∞} Σ_{j=−∞}^{∞} ∫_{x=t−kT0−T0}^{t−kT0} p(x) p(x + kT0 − τ − jT0) dx.     (12)

Let
kT0 − jT0 = mT0 (13)
Substituting for j we get
E[X(t)X(t − τ)] = (1/T0) Σ_{k=−∞}^{∞} Σ_{m=−∞}^{∞} ∫_{x=t−kT0−T0}^{t−kT0} p(x) p(x + mT0 − τ) dx.           (14)

Now we interchange the order of summation and combine the summation
over k and the integral to obtain

E[X(t)X(t − τ)] = (1/T0) Σ_{m=−∞}^{∞} ∫_{x=−∞}^{∞} p(x) p(x + mT0 − τ) dx
                = (1/T0) Σ_{m=−∞}^{∞} Rp(τ − mT0) = RX(τ)            (15)
where Rp(τ) is the autocorrelation of p(t). Thus, the autocorrelation of
X(t) is also periodic with period T0; hence X(t) is a cyclostationary
random process. This is illustrated in Figure 3(c).

[Figure: (a) the pulse p(t), of amplitude A over [0, T0/2]; (b) its autocorrelation Rp(τ), a triangle of height A²T0/2 on [−T0/2, T0/2]; (c) RX(τ), triangles of height A²/2 repeating with period T0.]

Figure 3: Computing the autocorrelation of a periodic wave with random timing
phase.

Since the random process is periodic, the time-averaged mean is given by:
<x(t)> = (1/T0) ∫_{t=td}^{td+T0} x(t) dt = A/2                       (16)
independent of td . Comparing (9) and (16) we find that X(t) is ergodic
in the mean.
The time-averaged autocorrelation is given by:
<x(t)x(t − τ)> = (1/T0) ∫_{t=−T0/2}^{T0/2} x(t) x(t − τ) dt
               = (1/T0) Σ_{m=−∞}^{∞} Rg(τ − mT0)                     (17)

where Rg(τ) is the autocorrelation of the generating function of x(t) (the
generating function has been discussed earlier in Chapter 2 of Haykin,
2nd ed.). The generating function can be conveniently taken to be

g(t) = p(t − td) for td ≤ t < td + T0, and 0 elsewhere               (18)

where p(t) is illustrated in Figure 3(a). Let P(f) be the Fourier transform
of p(t). We have

Rg(τ) = g(t) ⋆ g(−t) ⇌ |P(f)|² ⇌ p(t) ⋆ p(−t) = Rp(τ).               (19)

Therefore, comparing (15) and (17) we find that X(t) is ergodic in the
autocorrelation.
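
A simulation sketch (my addition, not part of the original solution),
assuming NumPy; the amplitude, period, step size, and observation instant
are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
A, T0 = 1.0, 1.0                          # assumed amplitude and period
dt = T0 / 200
t = np.arange(0.0, 2000 * T0, dt)         # long observation window

def square(t, td):
    # Square wave of amplitude A: high on the first half of each period, delayed by td.
    return np.where(((t - td) % T0) < T0 / 2, A, 0.0)

# Time averages over one sample function (one fixed random delay)
x = square(t, rng.uniform(0.0, T0))
print("time-averaged mean:", x.mean())                        # ≈ A/2, eq. (16)
lag = int(round((T0 / 4) / dt))
print("time-averaged R(T0/4):", np.mean(x[lag:] * x[:-lag]))  # ≈ A²/4, from eq. (15)

# Ensemble average at a fixed instant over many delays
td = rng.uniform(0.0, T0, size=200_000)
print("ensemble mean:", square(0.3, td).mean())               # ≈ A/2, eq. (9)
```

Both kinds of average agree with (9), (15), and (16), consistent with the
ergodicity conclusions above.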
4. Solution: The psd of X(t) can be written as:
 
SX(f) = δ(f) + (1 − |f|/f0) for |f| ≤ f0.                            (20)

We know that

A rect(t/T0) ⇌ A T0 sinc(f T0)
⇒ A rect(t/T0) ⋆ A rect(−t/T0) ⇌ A²T0² sinc²(f T0)
⇒ A²T0 (1 − |t|/T0) ⇌ A²T0² sinc²(f T0).                             (21)

Applying duality
 
A²f0 (1 − |f|/f0) ⇌ A²f0² sinc²(t f0).                               (22)

Given that A²f0 = 1, we have

(1 − |f|/f0) ⇌ f0 sinc²(t f0).                                       (23)

Thus

RX(τ) = 1 + f0 sinc²(f0 τ).                                          (24)
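
A small numerical check of the pair in (23) (my addition, assuming NumPy,
whose np.sinc uses the sin(πx)/(πx) convention matching the text); f0 and
the lags are arbitrary:

```python
import numpy as np

f0 = 4.0                                          # assumed triangle half-width
f = np.linspace(-50.0, 50.0, 400_001)
S = np.maximum(1.0 - np.abs(f) / f0, 0.0)         # triangular spectrum in eq. (23)
df = f[1] - f[0]

for tau in (0.0, 0.1, 1 / f0):
    R = np.sum(S * np.exp(2j * np.pi * f * tau)).real * df   # inverse FT at lag tau
    print(f"tau={tau:.3f}: numeric {R:.4f}, f0*sinc²(f0*tau) = {f0 * np.sinc(f0 * tau)**2:.4f}")
```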

Now, consider a real-valued random process Z(t) given by:

Z(t) = A + Y(t)                                                      (25)

where A is a constant and Y(t) is a zero-mean random process. Clearly

E[Z(t)] = A
RZ(τ) = E[Z(t)Z(t − τ)] = A² + RY(τ).                                (26)

Thus we conclude that if a random process has a dc component equal to A,
then the autocorrelation has a constant component equal to A². Hence
from (24) we get:

E[X(t)] = 1.                                                         (27)

Hence the dc power (contributed by the delta function of the psd) is unity.
The ac power (contributed by the triangular part of the psd) is f0.
The covariance of X(t) is

KX(τ) = cov(X(t), X(t − τ)) = E[(X(t) − 1)(X(t − τ) − 1)]
      = f0 sinc²(f0 τ).                                              (28)

It is clear that

KX(τ) = f0 for τ = 0, and KX(τ) = 0 for τ = n(k/f0), n, k ≠ 0        (29)

where n and k are positive integers. Thus, when X(t) is sampled at a rate
equal to f0/k, the samples are uncorrelated. However, the samples may not
be statistically independent.
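
A tiny check of (29) (my addition, assuming NumPy; f0 is arbitrary), showing
that the covariance vanishes at the sampling lags named above:

```python
import numpy as np

f0 = 5.0                                        # assumed psd parameter
K = lambda tau: f0 * np.sinc(f0 * tau) ** 2     # covariance, eq. (28)

print(K(0.0))            # f0 = 5.0 (the ac power)
for n in (1, 2, 3):
    print(K(n / f0))     # ≈ 0: lags that are nonzero multiples of 1/f0
print(K(0.4 / f0))       # nonzero at an intermediate lag
```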
5. Solution: Since Sk and α are independent:

E[X(t)] = Σ_{k=−∞}^{∞} E[Sk] E[p(t − kT − α)] = 0                    (30)

where, for the given binary phase shift keying (BPSK) constellation

E[Sk] = (1/2)A + (1/2)(−A) = 0.                                      (31)

The autocorrelation of X(t) is

RX(τ) = E[X(t)X(t − τ)]
      = E[ Σ_{k=−∞}^{∞} Sk p(t − kT − α) Σ_{j=−∞}^{∞} Sj p(t − τ − jT − α) ]
      = Σ_{k=−∞}^{∞} Σ_{j=−∞}^{∞} E[Sk Sj] E[p(t − kT − α) p(t − τ − jT − α)]
      = Σ_{k=−∞}^{∞} Σ_{j=−∞}^{∞} A² δK(k − j) (1/T) ∫_{α=0}^{T} p(t − kT − α) p(t − τ − jT − α) dα
      = (A²/T) Σ_{k=−∞}^{∞} ∫_{α=0}^{T} p(t − kT − α) p(t − τ − kT − α) dα   (32)

where we have assumed that Sk and α are independent and

δK(k − j) = 1 for k = j, and 0 for k ≠ j                             (33)

is the Kronecker delta function. Let

x = t − kT − α. (34)

Substituting (34) in (32) we obtain:

RX(τ) = (A²/T) Σ_{k=−∞}^{∞} ∫_{x=t−kT−T}^{t−kT} p(x) p(x − τ) dx.    (35)

Combining the summation and the integral, (35) becomes

RX(τ) = (A²/T) ∫_{x=−∞}^{∞} p(x) p(x − τ) dx
      = (A²/T) Rp(τ)                                                 (36)

where Rp(·) is the autocorrelation of p(·).
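
To close, a Monte Carlo sketch of (36) (my addition, assuming NumPy and a
rectangular p(t) of duration T, which is one possible choice since the
problem leaves p(·) arbitrary); the time average stands in for the ensemble
average:

```python
import numpy as np

rng = np.random.default_rng(3)
A, T, dt = 1.0, 1.0, 0.01                  # assumed amplitude, bit period, time step
spb = int(round(T / dt))                   # samples per bit
bits = rng.choice([-A, A], size=100_000)   # Sk = ±A with equal probability

x = np.repeat(bits, spb)                   # rectangular p(t) of duration T
x = np.roll(x, rng.integers(0, spb))       # random timing phase α in [0, T)

# Time-averaged autocorrelation vs. eq. (36); for a rectangular p(t),
# (A²/T) Rp(τ) = A² (1 − |τ|/T) for |τ| ≤ T and 0 beyond.
for tau in (0.0, 0.25, 0.5, 1.0):
    lag = int(round(tau / dt))
    est = np.mean(x[lag:] * x[: len(x) - lag])
    print(f"tau={tau}: simulated {est:.3f}, theory {A**2 * max(1 - tau / T, 0.0):.3f}")
```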
