
Subject: Information Theory & Coding

ASSIGNMENT - I

Submission Date: Feb 13, 2020

1. Suppose that one has n coins, among which there may or may not be one counterfeit coin. If there
is a counterfeit coin, it may be either heavier or lighter than the other coins. The coins are to be
weighed by a balance. Find an upper bound on the number of coins n so that k weighings will find
the counterfeit coin (if any) and correctly declare it to be heavier or lighter.
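Hint (a counting-argument sketch under the standard three-outcome model, not a complete solution): each weighing has three possible outcomes, so k weighings can distinguish at most 3^k cases, while the unknown has 2n + 1 possible states (each of the n coins heavy or light, or no counterfeit). In LaTeX,

    2n + 1 \le 3^k \quad\Longrightarrow\quad n \le \frac{3^k - 1}{2}.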

2. Let p(x, y) be given by

       p(x, y)   y = 0   y = 1
       x = 0      1/3     1/3
       x = 1       0      1/3

Find:
(a) H(X), H(Y).
(b) H(X | Y), H(Y | X).
(c) H(X, Y).
(d) H(Y) − H(Y | X).
(e) I(X; Y).
(f) Draw a Venn diagram for the quantities in parts (a) through (e).
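For checking the answers numerically, a minimal Python sketch (assuming the joint table given above; the helper H and the variable names are illustrative, not part of the assignment):

    import math

    # Joint distribution p(x, y), taken from the table in question 2.
    p = {(0, 0): 1/3, (0, 1): 1/3, (1, 0): 0.0, (1, 1): 1/3}

    def H(dist):
        """Entropy in bits of an iterable of probabilities (0 log 0 := 0)."""
        return -sum(q * math.log2(q) for q in dist if q > 0)

    # Marginals: sum the joint table over the other coordinate.
    px = [sum(v for (x, _), v in p.items() if x == i) for i in (0, 1)]
    py = [sum(v for (_, y), v in p.items() if y == j) for j in (0, 1)]

    HX, HY, HXY = H(px), H(py), H(p.values())
    print(f"H(X)={HX:.3f}  H(Y)={HY:.3f}  H(X,Y)={HXY:.3f}")
    print(f"H(X|Y)={HXY - HY:.3f}  H(Y|X)={HXY - HX:.3f}  I(X;Y)={HX + HY - HXY:.3f}")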

3. Let X1 → X2 → X3 → · · · → Xn form a Markov chain in this order; that is, let
p(x1, x2, . . . , xn) = p(x1) p(x2 | x1) · · · p(xn | xn−1).
Reduce I(X1; X2, . . . , Xn) to its simplest form.
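Hint (one possible route): by the definition of mutual information and the Markov property (X1 is conditionally independent of X3, . . . , Xn given X2),

    I(X_1; X_2, \ldots, X_n)
      = H(X_1) - H(X_1 \mid X_2, \ldots, X_n)
      = H(X_1) - H(X_1 \mid X_2)
      = I(X_1; X_2).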

4. Consider a fair coin flip. What is the mutual information between the top and bottom sides of the
coin?

5. Consider a random variable 𝑋 that takes on four values with probabilities (1/3, 1/3, 1/4, 1/12).
(a) Construct a Huffman code for this random variable.
(b) Show that there exist two different sets of optimal lengths for the codewords; namely, show
that codeword length assignments (1, 2, 3, 3) and (2, 2, 2, 2) are both optimal.
(c) Conclude that there are optimal codes with codeword lengths for some symbols that exceed
the Shannon code length ⌈log(1/p(x))⌉.
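A compact Python sketch of the standard Huffman construction, useful for checking part (a); the tie-breaking order is an arbitrary implementation choice, and different tie-breaks produce the two optimal length assignments of part (b):

    import heapq

    def huffman_lengths(probs):
        """Optimal codeword lengths via Huffman's algorithm.

        Each heap entry is (probability, tiebreak id, symbols under this node);
        merging two nodes adds one bit to every symbol beneath them.
        """
        lengths = [0] * len(probs)
        heap = [(pr, i, [i]) for i, pr in enumerate(probs)]
        heapq.heapify(heap)
        next_id = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for s in s1 + s2:
                lengths[s] += 1
            heapq.heappush(heap, (p1 + p2, next_id, s1 + s2))
            next_id += 1
        return lengths

    lens = huffman_lengths([1/3, 1/3, 1/4, 1/12])
    print(lens)  # one optimal assignment; (1, 2, 3, 3) and (2, 2, 2, 2) both average 2 bits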

6. Let the random variable X have five possible outcomes {1, 2, 3, 4, 5}. Consider two
distributions p(x) and q(x) on this random variable, together with two codes C1 and C2:

       x    p(x)    q(x)    C1(x)   C2(x)
       1    1/2     1/2     0       0
       2    1/4     1/8     10      100
       3    1/8     1/8     110     101
       4    1/16    1/8     1110    110
       5    1/16    1/8     1111    111

(a) Calculate H(p), H(q), D(p||q), and D(q||p).
(b) The last two columns represent codes for the random variable. Verify that the average
length of C1 under p is equal to the entropy H(p); thus, C1 is optimal for p. Verify that C2 is
optimal for q.
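A numerical check sketch in Python, assuming the p, q, and codeword values tabulated above:

    import math

    p = [1/2, 1/4, 1/8, 1/16, 1/16]
    q = [1/2, 1/8, 1/8, 1/8, 1/8]
    len_c1 = [1, 2, 3, 4, 4]  # lengths of C1 = {0, 10, 110, 1110, 1111}
    len_c2 = [1, 3, 3, 3, 3]  # lengths of C2 = {0, 100, 101, 110, 111}

    def H(dist):
        """Entropy in bits."""
        return -sum(x * math.log2(x) for x in dist if x > 0)

    def D(a, b):
        """Relative entropy D(a||b) in bits."""
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)

    print(f"H(p)={H(p)}  H(q)={H(q)}  D(p||q)={D(p, q)}  D(q||p)={D(q, p)}")
    # Average length of C1 under p should equal H(p); similarly C2 under q.
    print(sum(pi * l for pi, l in zip(p, len_c1)),
          sum(qi * l for qi, l in zip(q, len_c2)))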
7. Consider the random variable

       x       x1      x2      x3      x4      x5      x6      x7
       p(x)    0.49    0.26    0.12    0.04    0.04    0.03    0.02

(a) Find a binary Huffman code for X.
(b) Find the expected code length for this encoding.

8. Which of these codes cannot be Huffman codes for any probability assignment?
(a) {0, 10, 11}
(b) {00, 01, 10, 110}
(c) {01, 10}
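Hint (a necessary condition, not a full argument): every binary Huffman code corresponds to a full binary tree, so its lengths satisfy the Kraft inequality with equality. A quick Python check:

    from fractions import Fraction

    def kraft_sum(code):
        """Kraft sum of a binary code; a Huffman code must give exactly 1."""
        return sum(Fraction(1, 2 ** len(w)) for w in code)

    for code in [("0", "10", "11"), ("00", "01", "10", "110"), ("01", "10")]:
        print(code, kraft_sum(code))  # any sum below 1 rules out a Huffman code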

9. Consider codes that satisfy the suffix condition, which says that no codeword is a suffix of any
other codeword. Show that a suffix condition code is uniquely decodable, and show that the
minimum average length over all codes satisfying the suffix condition is the same as the average
length of the Huffman code for that random variable.

10. Consider a random variable X that takes six values {𝐴, 𝐵, 𝐶, 𝐷, 𝐸, 𝐹} with probabilities
0.5, 0.25, 0.1, 0.05, 0.05, and 0.05, respectively.
(a) Construct a binary Huffman code for this random variable. What is its average length?
(b) Construct a quaternary Huffman code for this random variable [i.e., a code over an alphabet
of four symbols (call them 𝑎, 𝑏, 𝑐 and 𝑑)]. What is the average length of this code?
(c) One way to construct a binary code for the random variable is to start with a quaternary code
and convert the symbols into binary using the mapping a → 00, b → 01, c → 10, and d → 11.
What is the average length of the binary code for the random variable above constructed by this
process?
(d) For any random variable X, let LH be the average length of the binary Huffman code for the
random variable, and let LQB be the average length of the code constructed by first building a
quaternary Huffman code and converting it to binary. Show that LH ≤ LQB < LH + 2.
(e) The lower bound in part (d) is tight: give an example where the code constructed by
converting an optimal quaternary code is also an optimal binary code.
(f) The upper bound (i.e., LQB < LH + 2) is not tight. In fact, a better bound is LQB ≤ LH + 1.
Prove this bound, and provide an example where this bound is tight.
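For part (b), a Python sketch of d-ary Huffman coding under the usual dummy-symbol convention (pad with zero-probability symbols until the symbol count is congruent to 1 modulo d − 1, then repeatedly merge the d least likely nodes); the function and variable names are illustrative:

    import heapq

    def dary_huffman_lengths(probs, d=4):
        """Codeword lengths of a d-ary Huffman code for the given probabilities."""
        n = len(probs)
        pad = (1 - n) % (d - 1)  # dummy symbols so that the count is 1 mod (d-1)
        lengths = [0] * (n + pad)
        heap = [(pr, i, [i]) for i, pr in enumerate(list(probs) + [0.0] * pad)]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            merged, symbols = 0.0, []
            for _ in range(d):  # padding guarantees d nodes are available
                pr, _, s = heapq.heappop(heap)
                merged += pr
                symbols += s
            for s in symbols:
                lengths[s] += 1
            heapq.heappush(heap, (merged, next_id, symbols))
            next_id += 1
        return lengths[:n]  # drop the dummy symbols

    probs = [0.5, 0.25, 0.1, 0.05, 0.05, 0.05]
    lens = dary_huffman_lengths(probs, d=4)
    print(lens, sum(pr * l for pr, l in zip(probs, lens)))  # quaternary symbols per source symbol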
