Problems: Source Coding
1. What conditions have to be satisfied by K and the code-word length for the coding to be 100 percent? [Haykin 9.9b]

2. Consider the four codes listed below:

   Symbol   Code I   Code II   Code III   Code IV
   s0
   s1
   s2
   s3
   s4

   (a) Two of these four codes are prefix codes. Identify them, and construct their individual decision trees.

   (b) Apply the Kraft-McMillan inequality to codes I, II, III, and IV. Discuss your results in light of those obtained in part (a). [Haykin 9.10]

3. A discrete memoryless source has an alphabet of seven symbols whose probabilities of occurrence are as described here:

   Symbol   Probability
   s0       0.25
   s1       0.25
   s2       0.125
   s3       0.125
   s4       0.125
   s5       0.0625
   s6       0.0625
   Compute the Huffman code for this source, moving a "combined" symbol as high as possible. Explain why the computed source code has an efficiency of 100 percent. [Haykin 9.12]

4. Consider a discrete memoryless source with alphabet {s0, s1, s2} and statistics {0.7, 0.15, 0.15} for its output.

   (a) Apply the Huffman algorithm to this source. Hence, show that the average code-word length of the Huffman code equals 1.3 bits/symbol.

   (b) Let the source be extended to order two. Apply the Huffman algorithm to the resulting extended source, and show that the average code-word length of the new code equals 1.1975 bits/symbol.

   (c) Compare the average code-word length calculated in part (b) with the entropy of the original source. [Haykin 9.13]
5. A computer executes four instructions that are designated by the code words (00, 01, 10, 11). Assuming that the instructions are used independently with probabilities (1/2, 1/8, 1/8, 1/4), calculate the percentage by which the number of bits used for the instructions may be reduced by use of an optimum source code. Construct a Huffman code to realize the reduction. [Haykin 9.15]

6. Consider the following binary sequence:

   11101001100010110100

   Use the Lempel-Ziv algorithm to encode this sequence. Assume that the binary symbols 0 and 1 are already in the codebook.
Solutions

2. (a)
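Part (b) can be checked numerically. Below is a minimal sketch of the Kraft-McMillan test; since the code-word tables for Codes I-IV are not reproduced above, the two length sets in the example are illustrative assumptions, not the actual codes from the problem.

```python
def kraft_sum(lengths, r=2):
    """Kraft-McMillan sum for a set of code-word lengths: sum of r**(-l).
    Every uniquely decodable r-ary code must satisfy kraft_sum(...) <= 1."""
    return sum(r ** (-l) for l in lengths)

# Illustrative length sets (assumptions, not the codes from the problem):
prefix_lengths = [1, 2, 3, 4, 4]   # e.g. 0, 10, 110, 1110, 1111 (a prefix code)
bad_lengths = [1, 1, 2]            # two words of length 1 plus one of length 2

print(kraft_sum(prefix_lengths))   # 1.0  -> inequality satisfied with equality
print(kraft_sum(bad_lengths))      # 1.25 -> cannot be uniquely decodable
```

Note that the inequality is necessary for unique decodability but not sufficient: a set of lengths can satisfy it while a particular assignment of code words still fails to be uniquely decodable.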
3.
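The Huffman construction for the seven-symbol source can be verified with a short script. This is a sketch; the heap-based pairing below is one standard way to realize the algorithm, tracking only the code-word lengths rather than the code words themselves.

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Code-word lengths from the binary Huffman algorithm: repeatedly
    merge the two least probable entries, lengthening every symbol inside."""
    counter = itertools.count()           # tie-breaker for equal probabilities
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for i in a + b:
            lengths[i] += 1               # these symbols gain one more bit
        heapq.heappush(heap, (p1 + p2, next(counter), a + b))
    return lengths

probs = [0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625]
L = huffman_lengths(probs)                # lengths 2, 2, 3, 3, 3, 4, 4
avg = sum(p * l for p, l in zip(probs, L))
H = -sum(p * math.log2(p) for p in probs)
print(avg, H)   # both 2.625 bits/symbol: efficiency H/avg = 100 percent
```

The efficiency is 100 percent because every probability is a negative power of two, so each symbol can be given a code word of exactly length -log2(p).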
4. (a)
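Part (a) can be checked by direct arithmetic once the merge pattern is known; a minimal sketch (the code words shown in the comment are one valid assignment):

```python
# Huffman on {0.7, 0.15, 0.15}: merge the two 0.15s into 0.30, then merge
# 0.30 with 0.70. One resulting code: s0 -> 0, s1 -> 10, s2 -> 11.
probs = [0.70, 0.15, 0.15]
lengths = [1, 2, 2]
avg = sum(p * l for p, l in zip(probs, lengths))
print(round(avg, 10))   # 1.3 bits/symbol
```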
(b) For the extended source, we have

    p(s0 s0) = 0.7 × 0.7 = 0.49,
    p(s0 s1) = p(s0 s2) = p(s1 s0) = p(s2 s0) = 0.105,
    p(s1 s1) = p(s1 s2) = p(s2 s1) = p(s2 s2) = 0.0225.

    Applying the Huffman algorithm to the extended source, we obtain the following source code:
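The 2.395 bits/pair figure (1.1975 bits per original symbol) can be reproduced by running Huffman on the nine pair probabilities. The sketch below uses the fact that the average code-word length equals the sum of the probabilities of all internal (merged) nodes of the Huffman tree, so no code words need to be stored.

```python
import heapq
import itertools

def huffman_avg_length(probs):
    """Average code-word length of a binary Huffman code: it equals the
    sum of the probabilities of all internal (merged) nodes in the tree."""
    counter = itertools.count()           # tie-breaker for equal weights
    heap = [(p, next(counter)) for p in probs]
    heapq.heapify(heap)
    avg = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        avg += p1 + p2                    # one more internal node
        heapq.heappush(heap, (p1 + p2, next(counter)))
    return avg

base = [0.7, 0.15, 0.15]
pairs = [p * q for p in base for q in base]   # nine second-order probabilities
avg_pair = huffman_avg_length(pairs)
print(round(avg_pair, 4), round(avg_pair / 2, 6))   # 2.395 bits/pair, 1.1975 bits/symbol
```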
Therefore, H(X) = 0.7 log2(1/0.7) + 2(0.15) log2(1/0.15) ≈ 1.1813 bits/symbol. The average code-word length per original symbol found in part (b), 1.1975 bits, is closer to this entropy than the 1.3 bits/symbol of part (a).
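The entropy figure for part (c) can be checked numerically; a minimal sketch:

```python
import math

# Entropy of the original three-symbol source {0.7, 0.15, 0.15}:
probs = [0.7, 0.15, 0.15]
H = -sum(p * math.log2(p) for p in probs)
print(round(H, 4))   # 1.1813 bits/symbol, just below the 1.1975 of part (b)
```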
5.
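A quick numerical check of the reduction, assuming the Huffman lengths 1, 3, 3, 2 (e.g. the code 0, 110, 111, 10, which is one valid Huffman code for these probabilities):

```python
probs = [1/2, 1/8, 1/8, 1/4]
fixed = 2                          # every instruction uses 2 bits: 00, 01, 10, 11
lengths = [1, 3, 3, 2]             # one Huffman code: 0, 110, 111, 10
avg = sum(p * l for p, l in zip(probs, lengths))
print(avg)                                 # 1.75 bits/instruction
print(100 * (fixed - avg) / fixed)         # 12.5 percent reduction
```

Since the probabilities are all negative powers of two, 1.75 bits/instruction also equals the source entropy, so no further reduction is possible.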
6.
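One way to sketch the parse is an LZ78-style parser seeded with the phrases 0 and 1, as the problem specifies; the mapping of phrases to transmitted code words (codebook index plus innovation bit) is omitted here.

```python
def lz_parse(seq, seed=("0", "1")):
    """LZ78-style parse: start the codebook with the seed phrases, then
    repeatedly take the shortest prefix of the remainder that is new."""
    book = list(seed)
    phrases = []
    i = 0
    while i < len(seq):
        j = i + 1
        while j < len(seq) and seq[i:j] in book:
            j += 1                 # grow until the phrase is not in the book
        phrase = seq[i:j]
        if phrase not in book:     # the final phrase may repeat an entry
            book.append(phrase)
        phrases.append(phrase)
        i = j
    return phrases

print(lz_parse("11101001100010110100"))
# ['11', '10', '100', '110', '00', '101', '1010', '0']
```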