
EE6340: Information Theory

Problem Set 1

1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of
flips required.

(a) Find the entropy H(X) in bits. The following expressions may be useful:
    ∑_{n=1}^∞ r^n = r/(1 − r),        ∑_{n=1}^∞ n r^n = r/(1 − r)².

(b) A random variable X is drawn according to this distribution. Find an “efficient”
sequence of yes-no questions of the form, “Is X contained in the set S?” Compare
H(X) to the expected number of questions required to determine X.
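
A quick numerical check of part (a), and of the comparison in part (b), can be done in a few lines of Python (with numpy; the truncation point N is an arbitrary choice, since the geometric tail beyond it is negligible). The questioning strategy costed here, asking “Is X = 1?”, “Is X = 2?”, and so on, is one natural candidate, not necessarily the intended one.

import numpy as np

# P(X = n) = (1/2)^n for n = 1, 2, ...; truncate the support at N terms.
N = 60
n = np.arange(1, N + 1)
p = 0.5 ** n

# Entropy in bits.
H = -np.sum(p * np.log2(p))
print("H(X)  ~", H)

# Asking "Is X = 1?", "Is X = 2?", ... in order takes n questions when X = n,
# so the expected number of questions is E[X].
E_questions = np.sum(n * p)
print("E[#q] ~", E_questions)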

2. Entropy of functions of a random variable. Let X be a discrete random variable. Show
that the entropy of a function of X is less than or equal to the entropy of X by justifying
the following steps:

(a) H(X, g(X)) = H(X) + H(g(X)|X)
(b)            = H(X);
(c) H(X, g(X)) = H(g(X)) + H(X|g(X))
(d)            ≥ H(g(X)).

Thus H(g(X)) ≤ H(X).
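
The chain of steps can also be checked numerically for any particular X and g. The sketch below (Python with numpy; the uniform X on {0, 1, 2, 3} and g(x) = x mod 2 are illustrative choices, not part of the problem) compares H(X) with H(g(X)).

import numpy as np
from collections import defaultdict

def entropy(probs):
    p = np.asarray(list(probs), dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Illustrative choice: X uniform on {0, 1, 2, 3}, g(x) = x mod 2.
px = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
g = lambda x: x % 2

# Push the pmf of X forward through g to get the pmf of g(X).
pg = defaultdict(float)
for x, prob in px.items():
    pg[g(x)] += prob

print("H(X)    =", entropy(px.values()))   # 2 bits
print("H(g(X)) =", entropy(pg.values()))   # 1 bit, consistent with H(g(X)) <= H(X)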

3. Zero conditional entropy. Show that if H(Y|X) = 0, then Y is a function of X, i.e., for
all x with p(x) > 0, there is only one possible value of y with p(x, y) > 0.
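
The converse direction gives useful intuition and is easy to check numerically: if every x with p(x) > 0 puts all of its mass on a single y, then H(Y|X) = 0. A small sketch (Python with numpy; the joint pmf is an illustrative choice):

import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Each row (value of x) with positive probability has exactly one y
# with p(x, y) > 0, so Y is a function of X.
pxy = np.array([[0.5, 0.0],
                [0.0, 0.3],
                [0.0, 0.2]])

H_XY = entropy(pxy)                 # H(X, Y)
H_X  = entropy(pxy.sum(axis=1))     # H(X)
print("H(Y|X) =", H_XY - H_X)       # 0, as expected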

4. World Series. The World Series is a seven-game series that terminates as soon as either
team wins four games. Let X be the random variable that represents the outcome of a
World Series between teams A and B; possible values of X include AAAA, BABABAB, and
BBBAAAA. Let Y be the number of games played, which ranges from 4 to 7. Assuming
that A and B are equally matched and that the games are independent, calculate H(X),
H(Y), H(Y|X), and H(X|Y).
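
All four quantities can be checked by brute-force enumeration of the 70 possible series outcomes. A sketch (Python with numpy; not part of the problem statement) that builds the pmf of X directly and uses the fact that Y is determined by X:

import numpy as np
from itertools import product

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Enumerate every valid series: a length-n win sequence (4 <= n <= 7) in which
# the team winning the last game reaches exactly four wins at that game.
series = []
for n_games in range(4, 8):
    for seq in product("AB", repeat=n_games):
        winner = seq[-1]
        if seq.count(winner) == 4:   # winner reaches 4 wins on the final game
            series.append("".join(seq))

# Each game is an independent fair coin, so P(sequence) = 2^(-length).
p_x = np.array([0.5 ** len(s) for s in series])
assert abs(p_x.sum() - 1.0) < 1e-12          # sanity check

p_y = np.array([p_x[[len(s) == n for s in series]].sum() for n in range(4, 8)])

H_X, H_Y = entropy(p_x), entropy(p_y)
print("H(X)   =", H_X)
print("H(Y)   =", H_Y)
print("H(Y|X) =", 0.0)          # Y is a function of X
print("H(X|Y) =", H_X - H_Y)    # H(X|Y) = H(X, Y) - H(Y) = H(X) - H(Y)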

5. Example of joint entropy. Let p(x, y) be as shown in the table below.

   X \ Y     0      1
     0      1/3    1/3
     1       0     1/3

Find

(a) H(X), H(Y).
(b) H(X|Y), H(Y|X).


(c) H(X, Y).
(d) H(Y) − H(Y|X).
(e) I(X; Y).
(f) Draw a Venn diagram for the quantities in (a) through (e).
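
The table is small enough to evaluate everything directly. A minimal sketch (Python with numpy; not part of the original problem) computing parts (a) through (e) from the joint pmf:

import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint pmf from the table: rows indexed by x, columns by y.
p_xy = np.array([[1/3, 1/3],
                 [0.0, 1/3]])

H_XY = entropy(p_xy)
H_X  = entropy(p_xy.sum(axis=1))    # marginal of X
H_Y  = entropy(p_xy.sum(axis=0))    # marginal of Y

print("H(X)   =", H_X)
print("H(Y)   =", H_Y)
print("H(X|Y) =", H_XY - H_Y)
print("H(Y|X) =", H_XY - H_X)
print("I(X;Y) =", H_X + H_Y - H_XY)   # also equals H(Y) - H(Y|X)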

6. A measure of correlation. Let X1 and X2 be identically distributed, but not necessarily
independent. Let

    ρ = 1 − H(X2|X1)/H(X1).

(a) Show ρ = I(X1; X2)/H(X1).
(b) Show 0 ≤ ρ ≤ 1.
(c) When is ρ = 0?
(d) When is ρ = 1?
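
A numerical check of part (a) on a concrete example (Python with numpy; the joint pmf below is an illustrative choice with identical marginals, not taken from the problem):

import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Illustrative joint pmf for (X1, X2): both marginals are (1/2, 1/2),
# so X1 and X2 are identically distributed, but they are not independent.
p12 = np.array([[0.4, 0.1],
                [0.1, 0.4]])

H_1   = entropy(p12.sum(axis=1))                 # H(X1)
H_12  = entropy(p12)                             # H(X1, X2)
H_2g1 = H_12 - H_1                               # H(X2|X1)
I_12  = entropy(p12.sum(axis=0)) + H_1 - H_12    # I(X1; X2)

rho = 1 - H_2g1 / H_1
print("rho            =", rho)
print("I(X1;X2)/H(X1) =", I_12 / H_1)   # matches rho, as part (a) asserts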
