
ECN-311: Principles of Digital Communication

Department of Electronics and Communication Engineering


Indian Institute of Technology Roorkee

Tutorial 2
By
Prof. Ekant Sharma

TAs: Abhilash Ranjan and Malay Chakraborty

Date: Aug 7, 2024 | Due date: Aug 16, 2024


Instructions for students:

• This tutorial consists of 12 questions.

• Attempt all questions for comprehensive practice.

• In case of any doubt, kindly contact only the TAs assigned to this tutorial.

Problem 1.
A random variable X is defined by

$f_X(x) = a e^{-b|x|}, \quad \forall x \in \mathbb{R}$

where a and b are positive constants.


i) Find the value of $a/b$ if $f_X(x)$ is the pdf of the random variable $X$.
ii) Compute the CDF of $X$ for $x \ge 0$.
iii) Find the mean and variance of the given double exponential random variable.

Answer 1.
i) $\int_{-\infty}^{\infty} f_X(x)\,dx = 1 \implies \frac{2a}{b} = 1 \implies \frac{a}{b} = \frac{1}{2}$
ii) $F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt = \frac{a}{b}\left(2 - e^{-bx}\right) = \frac{1}{2}\left(2 - e^{-bx}\right), \quad x \ge 0$
iii) Mean, $E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx = 0$, and $\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \frac{4a}{b^3}$
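As a quick numerical sanity check, here is a minimal SciPy sketch (the choice $b = 2$, $a = 1$ is an arbitrary illustration consistent with $a/b = 1/2$):

```python
import numpy as np
from scipy import integrate

# Arbitrary illustrative constants satisfying a/b = 1/2.
a, b = 1.0, 2.0
f = lambda x: a * np.exp(-b * np.abs(x))

total, _ = integrate.quad(f, -np.inf, np.inf)                    # ~1.0: valid pdf
var, _ = integrate.quad(lambda x: x**2 * f(x), -np.inf, np.inf)  # mean is 0
print(total, var, 4 * a / b**3)                                  # var ~ 4a/b^3 = 0.5
```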

Problem 2.
The outputs of two noise sources, each producing uniformly distributed noise $U(-a, a)$, are added. Obtain the pdf of the resultant noise, and compute its mean and variance.

Answer 2.
Concept: The pdf of the sum of two independent random variables is the convolution of the two individual pdfs.
The pdf of the resultant noise $Z = X + Y$ is
$$f_Z(z) = \begin{cases} \frac{1}{4a^2}z + \frac{1}{2a} & \text{for } -2a \le z \le 0,\\ -\frac{1}{4a^2}z + \frac{1}{2a} & \text{for } 0 \le z \le 2a. \end{cases}$$
Mean, $E(Z) = \int_{-\infty}^{\infty} z f_Z(z)\,dz = 0$, and $\mathrm{Var}(Z) = E(Z^2) - [E(Z)]^2 = \int_{-\infty}^{\infty} z^2 f_Z(z)\,dz = \frac{2a^2}{3}$

Figure 1: The pdf of Z

Shortcut:
For a triangular random variable with parameters $(a_1, a_2, a_3)$ (minimum, mode, maximum),
$$\text{Mean} = \frac{\sum_i a_i}{3}, \qquad \text{Variance} = \frac{\sum_i a_i^2 - \sum_{i<j} a_i a_j}{18}$$
Here $(a_1, a_2, a_3) = (-2a, 0, 2a)$, giving Mean $= 0$ and Variance $= \frac{2a^2}{3}$.
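A minimal Monte Carlo sketch of this result (NumPy assumed; $a = 1$ is an arbitrary illustrative value):

```python
import numpy as np

rng = np.random.default_rng(0)
a = 1.0
# Sum of two independent U(-a, a) samples.
z = rng.uniform(-a, a, 1_000_000) + rng.uniform(-a, a, 1_000_000)
print(z.mean(), z.var(), 2 * a**2 / 3)   # mean ~0, variance ~2a^2/3
```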

Problem 3.
Two independent random signals $X$ and $Y$ are known to be Gaussian with corresponding mean values $\mu_X$ and $\mu_Y$, and corresponding variances $\sigma_X^2$ and $\sigma_Y^2$. A signal $Z = X - 2Y$ is obtained from them.
i) Can you comment on the pdf, fZ (z), of the signal Z?
ii) Find the mean and variance of the random variable Z. Also write the expression of its pdf.

Answer 3.
i) Clearly, $Z$ is a linear combination of two independent Gaussian RVs. Hence, $Z$ is also Gaussian.
ii) Taking the expectation of both sides of $Z = X - 2Y$, we get
Mean of $Z$: $\mu_Z = \mu_X - 2\mu_Y$, and
$\mathrm{Var}(Z) = \mathrm{Var}(X - 2Y) = \mathrm{Var}(X) + \mathrm{Var}(2Y) = \mathrm{Var}(X) + 2^2\,\mathrm{Var}(Y) = \sigma_X^2 + 4\sigma_Y^2$
The pdf of $Z$ is given as
$$f_Z(z) = \frac{1}{\sqrt{2\pi(\sigma_X^2 + 4\sigma_Y^2)}}\, \exp\!\left(-\frac{(z - (\mu_X - 2\mu_Y))^2}{2(\sigma_X^2 + 4\sigma_Y^2)}\right), \quad -\infty < z < \infty$$
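A short simulation sketch of the moments of $Z = X - 2Y$ (the means and variances below are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
mu_x, mu_y, sig_x, sig_y = 1.0, -0.5, 2.0, 1.5
x = rng.normal(mu_x, sig_x, 1_000_000)
y = rng.normal(mu_y, sig_y, 1_000_000)
z = x - 2 * y
print(z.mean(), mu_x - 2 * mu_y)           # both ~2.0
print(z.var(), sig_x**2 + 4 * sig_y**2)    # both ~13.0
```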

Problem 4.
Let a random variable $Z = X + Y$, where $X$ and $Y$ are independent random variables, with $X \sim \mathcal{N}(0, 1)$ and $Y \sim \mathrm{Laplace}(1)$. Find $\mathrm{Cov}(X, Z)$ and $\mathrm{Var}(Z)$.

Answer 4.
$\mathrm{Cov}(X, Z) = \mathrm{Cov}(X, X + Y) = \mathrm{Var}(X) + \mathrm{Cov}(X, Y) = 1 + 0 = 1$, since $X$ and $Y$ are independent, and
$\mathrm{Var}(Z) = \mathrm{Var}(X) + \mathrm{Var}(Y) = 1 + 2 = 3$.
Note:
i) $Y \sim \mathrm{Laplace}(b) \implies f_Y(y) = \frac{1}{2b}\, e^{-|y|/b}$, for $y \in \mathbb{R}$.
ii) Mean of $Y = 0$ and Variance of $Y = 2b^2$.
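A quick simulation check of both answers (sketch, NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 1_000_000)
y = rng.laplace(loc=0.0, scale=1.0, size=1_000_000)   # Laplace(b = 1)
z = x + y
print(np.cov(x, z)[0, 1])   # ~1 = Cov(X, Z)
print(z.var())              # ~3 = Var(Z)
```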

Problem 5.
Let K be a discrete uniform random variable with parameters n = −2 and m = 3. Define
the random variable I = 5K + 2. Determine the following expectations by treating E[.] as
a linear operator:
i) E[K]
ii) Var[K]
iii) E[I]
iv) E[K + I]
v) E[KI]
vi) Var[I]

Answer 5.
For a continuous uniform RV, $X \sim U(a, b)$: Mean $= \frac{a+b}{2}$, Variance $= \frac{(b-a)^2}{12}$.
But for a discrete uniform RV on the integers $a, a+1, \ldots, b$: Mean $= \frac{a+b}{2}$, Variance $= \frac{(b-a+1)^2 - 1}{12}$.
Here $a = n = -2$ and $b = m = 3$.
i) $E[K] = \frac{a+b}{2} = \frac{1}{2}$
ii) $\mathrm{Var}[K] = \frac{(b-a+1)^2 - 1}{12} = \frac{35}{12}$
iii) $E[I] = E[5K + 2] = E[5K] + E[2] = 5E[K] + 2 = \frac{9}{2}$
iv) $E[K + I] = E[K + 5K + 2] = 6E[K] + 2 = 5$
v) $E[KI] = E[K(5K + 2)] = 5E[K^2] + 2E[K] = 5\left(\mathrm{Var}[K] + (E[K])^2\right) + 2E[K] = 5\left(\frac{35}{12} + \frac{1}{4}\right) + 1 = \frac{101}{6}$
vi) $\mathrm{Var}[I] = \mathrm{Var}[5K + 2] = \mathrm{Var}[5K] + \mathrm{Var}[2] = 5^2\,\mathrm{Var}[K] + 0 = \frac{875}{12}$
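Since the support of $K$ is the finite set $\{-2, \ldots, 3\}$, all six expectations can be verified exactly by enumeration (a sketch using Python's `fractions`):

```python
from fractions import Fraction

support = range(-2, 4)          # K is uniform on {-2, ..., 3}
p = Fraction(1, 6)              # each value equally likely
E = lambda g: sum(p * g(k) for k in support)

EK = E(lambda k: k)
VarK = E(lambda k: k * k) - EK**2
EI = E(lambda k: 5 * k + 2)
EKI = E(lambda k: k * (5 * k + 2))
VarI = E(lambda k: (5 * k + 2) ** 2) - EI**2
print(EK, VarK, EI, EKI, VarI)  # 1/2 35/12 9/2 101/6 875/12
```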

Problem 6.
Cauchy-Schwarz inequality:
For any random variables $X$ and $Y$,
$$E(XY) \le \sqrt{E(X^2)E(Y^2)},$$
where equality holds if and only if $P(cX = dY) = 1$ for some $c$ and $d$ ($c^2 + d^2 \ne 0$).
Use the Cauchy-Schwarz inequality for random variables to show that if the correlation coefficient between two random variables $X$ and $Y$ is unity, i.e., $\rho(X, Y) = 1$, then $Y = aX + b$ for some $a > 0$ and some $b$.

Answer 6.
Define two random variables $M$ and $N$ as
$$M = \frac{X - E[X]}{\sigma_X}, \qquad N = \frac{Y - E[Y]}{\sigma_Y}$$
Since $M$ and $N$ are the standardized forms of the random variables $X$ and $Y$, we have
$$E[M] = E[N] = 0, \qquad \mathrm{Var}(M) = \mathrm{Var}(N) = 1 = E[M^2] = E[N^2]$$
Applying the Cauchy-Schwarz inequality to $M$ and $N$, we get
$$\rho(X, Y) = E[MN] \le \sqrt{E[M^2]E[N^2]} = 1$$
Equality holds for $N = \alpha M$ with $\alpha > 0$; substituting and re-arranging, we get
$$Y = aX + b, \quad a > 0,$$
where $a = \frac{\alpha \sigma_Y}{\sigma_X}$ and $b = E[Y] - \frac{\alpha \sigma_Y}{\sigma_X}\, E[X]$.
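A numerical illustration of the result (sketch; the line $Y = 3X + 2$ is a hypothetical example with arbitrary slope and intercept):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 100_000)
y = 3 * x + 2                       # perfectly correlated: rho = 1

rho = np.corrcoef(x, y)[0, 1]
a = rho * y.std() / x.std()         # alpha = rho = 1 here
b = y.mean() - a * x.mean()
print(rho, a, b)                    # ~1.0, ~3.0, ~2.0
```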

Problem 7.
The joint pmf of random variables $X$ and $Y$ is given by
$$f_{XY}(x, y) = \begin{cases} \frac{x+y}{21} & \text{if } x = 1, 2, 3 \text{ and } y = 1, 2\\ 0 & \text{otherwise} \end{cases}$$
i) Find out the marginal distributions of X and Y .
ii) Check whether they are independent.
iii) Find out the conditional distribution fX|Y =y (x|y).
iv) Find the conditional mean, E(X|Y = y).
v) Find the covariance, Cov(X, Y ).

Answer 7.
i) Summing the joint pmf over $y$ gives $f_X(x) = \frac{2x+3}{21}$, and summing over $x$ gives $f_Y(y) = \frac{3y+6}{21}$:

x        1      2      3
f_X(x)   5/21   7/21   9/21

Table 1: The pmf of X

y        1      2
f_Y(y)   9/21   12/21

Table 2: The pmf of Y

ii) $f_{XY}(1, 1) \ne f_X(1) f_Y(1)$. Hence, they are not independent.
iii) Using $f_{X|Y=y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)}$, $x = 1, 2, 3$, the conditional pmf is given by Table 3 and Table 4.

x                 1      2      3
f_{X|Y=1}(x|y)    2/9    3/9    4/9

Table 3: The pmf of X|Y = 1

x                 1      2      3
f_{X|Y=2}(x|y)    3/12   4/12   5/12

Table 4: The pmf of X|Y = 2

iv) Using $E(X \mid Y = y) = \sum_x x f_{X|Y=y}(x \mid y)$, we get
$E(X \mid Y = 1) = \frac{20}{9}$, and $E(X \mid Y = 2) = \frac{26}{12}$
v) $\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = \sum_{(x,y)} xy\, f_{XY}(x, y) - \left(\sum_x x f_X(x)\right)\left(\sum_y y f_Y(y)\right) = -\frac{6}{441}$
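All of these quantities can be verified exactly by enumerating the six support points (a sketch using Python's `fractions`; note that `Fraction` auto-reduces, e.g. $-6/441 = -2/147$):

```python
from fractions import Fraction

f = {(x, y): Fraction(x + y, 21) for x in (1, 2, 3) for y in (1, 2)}
fX = {x: sum(f[x, y] for y in (1, 2)) for x in (1, 2, 3)}
fY = {y: sum(f[x, y] for x in (1, 2, 3)) for y in (1, 2)}

E_cond = {y: sum(x * f[x, y] / fY[y] for x in (1, 2, 3)) for y in (1, 2)}
EXY = sum(x * y * p for (x, y), p in f.items())
EX = sum(x * p for x, p in fX.items())
EY = sum(y * p for y, p in fY.items())

print(fX[1], fX[2], fX[3])     # 5/21 1/3 3/7   (= 7/21, 9/21)
print(E_cond[1], E_cond[2])    # 20/9 13/6      (= 26/12)
print(EXY - EX * EY)           # -2/147         (= -6/441)
```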

Problem 8.
The joint pdf of random variables $X$ and $Y$ is given by
$$f_{XY}(x, y) = \begin{cases} k(x + y) & \text{if } 0 < x < 2,\ 0 < y < 1\\ 0 & \text{otherwise} \end{cases}$$

i) Find the value of k.
ii) Give the marginal pdfs.
iii) Are they independent? Justify.
iv) Find the conditional pdf fY |X (y|x).
v) Find the conditional mean, E(Y |X = x).
vi) Obtain the covariance, Cov(X, Y ).

Answer 8.
i) $\iint_{(x,y) \in D} f_{XY}(x, y)\,dx\,dy = 1 \implies k = \frac{1}{3}$
ii) $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy = \begin{cases} \frac{2x+1}{6} & \text{if } 0 < x < 2\\ 0 & \text{otherwise} \end{cases}$
$f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx = \begin{cases} \frac{2}{3}(y + 1) & \text{if } 0 < y < 1\\ 0 & \text{otherwise} \end{cases}$
iii) Since $f_X(x) f_Y(y) \ne f_{XY}(x, y)$, $X$ and $Y$ are not independent.
iv) $f_{Y|X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)} = \begin{cases} \frac{2(x+y)}{2x+1} & \text{if } 0 < x < 2 \text{ and } 0 < y < 1\\ 0 & \text{otherwise} \end{cases}$
v) $E[Y \mid X = x] = \int_0^1 y\, f_{Y|X}(y \mid x)\,dy = \frac{1}{3} \cdot \frac{3x+2}{2x+1}$ for $0 < x < 2$
vi) $\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = \iint xy\, f_{XY}(x, y)\,dx\,dy - \left(\int x f_X(x)\,dx\right)\left(\int y f_Y(y)\,dy\right) = -\frac{1}{81}$
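The constant $k$ and the covariance can be checked numerically with a double integral (sketch; `scipy.integrate.dblquad` integrates `func(y, x)` over the given ranges):

```python
from scipy import integrate

f = lambda y, x: (x + y) / 3.0    # joint pdf with k = 1/3

total, _ = integrate.dblquad(f, 0, 2, 0, 1)    # x in (0, 2), y in (0, 1)
EXY, _ = integrate.dblquad(lambda y, x: x * y * f(y, x), 0, 2, 0, 1)
EX, _ = integrate.dblquad(lambda y, x: x * f(y, x), 0, 2, 0, 1)
EY, _ = integrate.dblquad(lambda y, x: y * f(y, x), 0, 2, 0, 1)

print(total)             # ~1.0, confirming k = 1/3
print(EXY - EX * EY)     # ~-0.0123457 = -1/81
```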


Problem 9.
The noise $X$ in a certain electronic system is a Gaussian random variable with mean $0$ and variance $\sigma^2$ such that
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{x^2}{2\sigma^2}}, \quad -\infty < x < \infty$$
The normalized instantaneous power of the noise is given by
$$Y = \frac{X^2}{\sigma^2},$$
where the parameter $\sigma^2$ can be interpreted as the average noise power. Obtain the pdf of the noise power.

Answer 9.
$$f_Y(y) = \frac{1}{\sqrt{2\pi y}}\, e^{-y/2}, \quad 0 \le y < \infty$$
Method 1: Given $Y = g(X)$ and $f_X(x)$, the pdf of $Y$ is
$$f_Y(y) = \sum_{\text{all } i} \frac{f_X(x_i)}{\left|\frac{dy}{dx}\right|_{x = x_i}}, \quad x_i = g^{-1}(y);$$
here the roots are $x_{1,2} = \pm\sigma\sqrt{y}$.
Method 2: Using the definition of the CDF: $F_Y(y) = P(Y \le y) = P(A \le X \le B)$ with $A = -\sigma\sqrt{y}$ and $B = \sigma\sqrt{y}$,
$$\implies F_Y(y) = F_X(B) - F_X(A).$$
Differentiating $F_Y(y)$ with respect to $y$, we get $f_Y(y)$.
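Note that $Y$ is a chi-square RV with one degree of freedom, which gives a quick simulation check (sketch; $\sigma = 2$ is an arbitrary illustrative value):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
sigma = 2.0
y = (rng.normal(0.0, sigma, 1_000_000) / sigma) ** 2   # Y = X^2 / sigma^2

# Compare an empirical probability with the chi-square(1) CDF.
print(np.mean(y <= 1.0), stats.chi2.cdf(1.0, df=1))    # both ~0.6827
```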

Problem 10.
Find the pdf of $Y = \sin^{-1}(X)$ if $X$ is uniformly distributed in $(-1, 1)$.

Answer 10.
$$f_Y(y) = \begin{cases} \frac{1}{2}\cos(y) & \text{for } -\frac{\pi}{2} \le y \le \frac{\pi}{2}\\ 0 & \text{otherwise} \end{cases}$$
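A short empirical check of the arcsine transformation (sketch; the evaluation point $0.5$ is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
y = np.arcsin(rng.uniform(-1, 1, 1_000_000))   # Y = arcsin(X)

# P(Y <= 0.5) should equal int_{-pi/2}^{0.5} cos(y)/2 dy = (sin(0.5) + 1)/2.
print(np.mean(y <= 0.5), (np.sin(0.5) + 1) / 2)   # both ~0.7397
```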

Problem 11.
The received voltage in a digital communication system is Z = X + Y , where X ∼
Bernoulli(p) is a random message, and Y ∼ N (0, 1) is a Gaussian noise voltage. Assume X
and Y are independent. Find:
i) the conditional CDF FZ|X (z|i) for i = 0, 1.
ii) the CDF FZ (z)
iii) the pdf fZ (z).

Answer 11.
i) $F_{Z|X}(z \mid i) = \Phi(z - i)$, $i = 0, 1$.
ii) $F_Z(z) = \Phi(z)(1 - p) + \Phi(z - 1)p$, where $\Phi(\cdot)$ is the CDF of the standard normal RV.
iii) $f_Z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}(1 - p) + \frac{1}{\sqrt{2\pi}}\, e^{-\frac{(z-1)^2}{2}}\, p$.
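The mixture CDF can be checked against simulation (sketch; $p = 0.3$ and the evaluation point are arbitrary illustrative choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
p, n = 0.3, 1_000_000
z = rng.binomial(1, p, n) + rng.normal(0.0, 1.0, n)   # Bernoulli(p) + N(0, 1)

z0 = 0.7
print(np.mean(z <= z0))                                           # empirical CDF
print((1 - p) * stats.norm.cdf(z0) + p * stats.norm.cdf(z0 - 1))  # formula
```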

Problem 12.
Let $X \sim U(0, 1)$, $Y \sim U(0, 1)$. Suppose $X$ and $Y$ are independent. Define $Z = X + Y$ and $W = X - Y$. Show that $Z$ and $W$ are not independent but uncorrelated random variables.

Answer 12.
i) $f_{ZW}(z, w) = |J|\, f_X(x) f_Y(y)$, where $x = \frac{z+w}{2}$, $y = \frac{z-w}{2}$, and $|J| = \left|\frac{\partial(x, y)}{\partial(z, w)}\right| = \frac{1}{2}$. Hence
$$f_{ZW}(z, w) = \begin{cases} \frac{1}{2} & \text{for } 0 \le z \le 2,\ -1 \le w \le 1,\ z + w \le 2,\ z - w \le 2,\ |w| \le z\\ 0 & \text{otherwise} \end{cases}$$
$$f_Z(z) = \int_{-\infty}^{\infty} f_{ZW}(z, w)\,dw = \begin{cases} \int_{-z}^{z} \frac{1}{2}\,dw = z & \text{for } 0 < z < 1\\ \int_{z-2}^{2-z} \frac{1}{2}\,dw = 2 - z & \text{for } 1 < z < 2\\ 0 & \text{otherwise} \end{cases}$$
$$f_W(w) = \int_{-\infty}^{\infty} f_{ZW}(z, w)\,dz = \begin{cases} \int_{|w|}^{2-|w|} \frac{1}{2}\,dz = 1 - |w| & \text{for } -1 < w < 1\\ 0 & \text{otherwise} \end{cases}$$
Since $f_Z(z) f_W(w) \ne f_{ZW}(z, w)$, $Z$ and $W$ are not independent.
ii) $E[ZW] = E[(X + Y)(X - Y)] = E[X^2 - Y^2] = E[X^2] - E[Y^2] = 0$.
$E[W] = E[X - Y] = E[X] - E[Y] = 0$.
Now, $\mathrm{Cov}(Z, W) = E[ZW] - E[Z]E[W] = 0$. Hence, $Z$ and $W$ are uncorrelated.
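A simulation sketch that exhibits both properties at once: the correlation of $Z$ and $W$ vanishes, yet a nonlinear pair such as $(Z^2, W^2)$ is correlated, which would be impossible if $Z$ and $W$ were independent:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(0, 1, 1_000_000)
y = rng.uniform(0, 1, 1_000_000)
z, w = x + y, x - y

print(np.cov(z, w)[0, 1])          # ~0: uncorrelated
print(np.cov(z**2, w**2)[0, 1])    # ~-1/60 != 0: hence dependent
```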
Note:
1. Two random variables are uncorrelated if and only if
$$\rho(X, Y) = 0 \iff \mathrm{Cov}(X, Y) = 0 \iff E[XY] = E[X]E[Y]$$
2. Independence of RVs $\Rightarrow$ uncorrelatedness of RVs.
3. In general, uncorrelatedness of RVs $\nRightarrow$ independence of RVs. However, for jointly normal RVs, uncorrelatedness of RVs $\Rightarrow$ independence of RVs.
