
EE677: MIMO Wireless Communications
EE Dept., IITK

Tutorial-1: Entropy/MI
Sep. 2020

1. Entropy of functions of a random variable. Let X be a discrete random variable and let g be a known (deterministic) function. Show that the entropy of g(X) is less than or equal to the entropy of X by justifying the following steps:

(a) H(X, g(X)) = H(X) + H(g(X)|X)
(b)            = H(X);
(c) H(X, g(X)) = H(g(X)) + H(X|g(X))
(d)            ≥ H(g(X)).

Thus H(g(X)) ≤ H(X).
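As an optional numerical sanity check (not a proof, and not part of the original problem), the Python sketch below uses a made-up pmf for X and the arbitrary choice g(x) = x mod 2; the helper H is my own.

    import numpy as np
    from collections import defaultdict

    def H(p):
        """Entropy in bits of a pmf given as an array; zero entries contribute nothing."""
        p = np.asarray(p, dtype=float).flatten()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # Made-up example: X takes values 0..3 with the pmf below, and g(x) = x mod 2.
    p_x = np.array([0.5, 0.25, 0.125, 0.125])
    g = lambda x: x % 2

    # pmf of g(X): add the probabilities of all x that map to the same value g(x).
    p_g = defaultdict(float)
    for x, px in enumerate(p_x):
        p_g[g(x)] += px

    print(H(p_x))                     # H(X) = 1.75 bits
    print(H(list(p_g.values())))      # H(g(X)) ≈ 0.954 bits, which is ≤ H(X)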

2. Example of joint entropy. Let p(x, y) be as shown in the table below.

Y\X      1       2       3       4
 1      1/8     1/16    1/32    1/32
 2      1/16    1/8     1/32    1/32
 3      1/16    1/16    1/16    1/16
 4      1/4     0       0       0

Find

(a) H(X), H(Y).
(b) H(X|Y), H(Y|X).
(c) H(X, Y).
(d) I(X; Y).
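As an optional cross-check of the hand calculations (not part of the original problem), here is a minimal Python sketch that tabulates the joint pmf from the table and computes the requested quantities; the array layout and the helper H are my own choices.

    import numpy as np

    # Joint pmf p(x, y) from the table above: rows are y = 1..4, columns are x = 1..4.
    p_xy = np.array([
        [1/8,  1/16, 1/32, 1/32],
        [1/16, 1/8,  1/32, 1/32],
        [1/16, 1/16, 1/16, 1/16],
        [1/4,  0,    0,    0   ],
    ])

    def H(p):
        """Entropy in bits of a pmf given as an array; zero entries contribute nothing."""
        p = np.asarray(p, dtype=float).flatten()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    p_x = p_xy.sum(axis=0)        # marginal of X: sum over y (rows)
    p_y = p_xy.sum(axis=1)        # marginal of Y: sum over x (columns)

    H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy)
    H_X_given_Y = H_XY - H_Y      # chain rule: H(X|Y) = H(X,Y) - H(Y)
    H_Y_given_X = H_XY - H_X      # chain rule: H(Y|X) = H(X,Y) - H(X)
    I_XY = H_X + H_Y - H_XY       # I(X;Y) = H(X) + H(Y) - H(X,Y)

    print(H_X, H_Y, H_X_given_Y, H_Y_given_X, H_XY, I_XY)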

3. A measure of correlation. Let X1 and X2 be identically distributed, but not necessarily independent. Let

       ρ = 1 − H(X2|X1) / H(X1).

(a) Show ρ = I(X1; X2) / H(X1).
(b) Show 0 ≤ ρ ≤ 1.
(c) When is ρ = 0?
(d) When is ρ = 1?
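Again as an optional numerical check (not part of the original problem), the Python sketch below uses a hypothetical symmetric joint pmf, so that X1 and X2 are identically distributed as required; the numbers are arbitrary.

    import numpy as np

    def H(p):
        """Entropy in bits of a pmf given as an array; zero entries contribute nothing."""
        p = np.asarray(p, dtype=float).flatten()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # Hypothetical joint pmf of (X1, X2); it is symmetric, so X1 and X2 are
    # identically distributed (both marginals equal (0.5, 0.5)).
    p12 = np.array([[0.4, 0.1],
                    [0.1, 0.4]])

    p1 = p12.sum(axis=1)               # marginal of X1 (rows)
    p2 = p12.sum(axis=0)               # marginal of X2 (columns)
    H1, H2, H12 = H(p1), H(p2), H(p12)
    H2_given_1 = H12 - H1              # H(X2|X1) via the chain rule
    rho = 1 - H2_given_1 / H1
    I12 = H1 + H2 - H12                # I(X1; X2)

    print(rho, I12 / H1)               # part (a): the two values should agree
    print(0 <= rho <= 1)               # part (b)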


4. Reading exercise: Please derive the results in Theorem 2.5.1 and Theorem 2.5.2 of the Cover & Thomas book. Also derive the result in the Corollary given in Eq. (2.21).

5. Let X1, X2, . . . , Xn be distributed according to the pmf p(x1, x2, . . . , xn). Then

       H(X1, X2, . . . , Xn) ≤ Σ_{i=1}^{n} H(Xi),

with equality if and only if the Xi are independent.
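One possible route (a sketch, not necessarily the intended solution): combine the chain rule for entropy (Theorem 2.5.1 referenced in Question 4) with the fact that conditioning does not increase entropy.

    \begin{align*}
    H(X_1, X_2, \dots, X_n) &= \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \dots, X_1) && \text{(chain rule)} \\
                            &\le \sum_{i=1}^{n} H(X_i) && \text{(conditioning does not increase entropy)}
    \end{align*}

Equality in the second step requires H(X_i | X_{i-1}, ..., X_1) = H(X_i) for every i, i.e., each X_i is independent of the preceding variables, which is equivalent to the X_i being mutually independent.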
