CH3 Source Coding

This document covers source coding techniques, including Huffman coding and Lempel-Ziv coding, and discusses entropy bounds for discrete memoryless sources and properties of mutual information. Worked examples compute entropy, average code length, and mutual information for specific probability distributions, and exercises ask the reader to prove properties of entropy, mutual information, and independence, and to design Huffman codes and apply Lempel-Ziv coding.

4th edition CH3 Source Coding (CH5)

3.5 The output of a DMS consists of the possible letters $x_1, x_2, \ldots, x_n$, which occur with probabilities $p_1, p_2, \ldots, p_n$, respectively. Prove that the entropy $H(X)$ of the source is at most $\log n$.
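
One standard route to this bound (a sketch of the usual argument, using the inequality $\ln z \le z - 1$):

$$H(X) - \log n = \sum_{i=1}^{n} p_i \log \frac{1}{n p_i} \le (\log e) \sum_{i=1}^{n} p_i \left( \frac{1}{n p_i} - 1 \right) = (\log e) \left( \sum_{i=1}^{n} \frac{1}{n} - \sum_{i=1}^{n} p_i \right) = 0,$$

with equality exactly when $p_i = 1/n$ for all $i$.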

3.7 A DMS has an alphabet of eight letters, $x_i$, $i = 1, 2, \ldots, 8$, with probabilities 0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05, and 0.05.
a) Use the Huffman encoding procedure to determine a binary code for the source output.
b) Determine the average number $R$ of binary digits per source letter.
c) Determine the entropy of the source and compare it with $R$.
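
For cross-checking parts (a) through (c), a minimal Huffman sketch in Python (my own helper, not the book's tableau procedure; tie-breaking may yield different code words, but the average length is still optimal):

import heapq
from math import log2

probs = {f"x{i+1}": p for i, p in
         enumerate([0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05, 0.05])}

# Heap entries: (probability, tie-breaker, {symbol: partial code word}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)        # two least probable subtrees
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c0.items()}
    merged.update({s: "1" + w for s, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, tie, merged))
    tie += 1

code = heap[0][2]
R = sum(probs[s] * len(w) for s, w in code.items())    # average code length
H = -sum(p * log2(p) for p in probs.values())          # source entropy
print(code)
print(f"R = {R:.2f} bits/letter, H = {H:.4f} bits/letter")

This should print R = 2.83 bits/letter against H ≈ 2.80 bits/letter, consistent with the bound $H \le R < H + 1$.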

3.9 Recall Equation 3.2-6:
$$I(x_i; y_j) = I(x_i) - I(x_i \mid y_j)$$
Prove that
a) $I(x_i; y_j) = I(y_j) - I(y_j \mid x_i)$.
b) $I(x_i; y_j) = I(x_i) + I(y_j) - I(x_i y_j)$, where $I(x_i y_j) = -\log P(x_i, y_j)$.
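
For part (a), a sketch of the usual symmetry argument, starting from the definitions $I(x_i) = -\log P(x_i)$ and $I(x_i \mid y_j) = -\log P(x_i \mid y_j)$:

$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{P(x_i, y_j)}{P(x_i) P(y_j)} = \log \frac{P(y_j \mid x_i)}{P(y_j)} = I(y_j) - I(y_j \mid x_i).$$

Expanding the middle ratio as $\log P(x_i, y_j) - \log P(x_i) - \log P(y_j)$ gives part (b) directly.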

3.11 Let X and Y denote two jointly distributed discrete-valued random variables.
a) Show that
$$H(X) = -\sum_{x,y} P(x,y) \log P(x)$$
and
$$H(Y) = -\sum_{x,y} P(x,y) \log P(y)$$
b) Use the above result to show that
$$H(X,Y) \le H(X) + H(Y)$$
When does equality hold?
c) Show that
$$H(X \mid Y) \le H(X)$$
with equality if and only if X and Y are independent.

3.12 Two binary random variables X and Y are distributed according to the joint distribution $P(X = Y = 0) = P(X = 0, Y = 1) = P(X = Y = 1) = \frac{1}{3}$. Compute $H(X)$, $H(Y)$, $H(X \mid Y)$, $H(Y \mid X)$, and $H(X, Y)$.
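
A quick numerical cross-check in Python (a sketch; it uses the chain rule $H(X, Y) = H(Y) + H(X \mid Y)$ to obtain the conditional entropies):

from math import log2

joint = {(0, 0): 1/3, (0, 1): 1/3, (1, 1): 1/3}    # P(X=1, Y=0) = 0

# Marginals of X and Y.
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

H_XY = -sum(p * log2(p) for p in joint.values())   # = log2(3)
H_X = -sum(p * log2(p) for p in px.values())
H_Y = -sum(p * log2(p) for p in py.values())
print(H_X, H_Y, H_XY - H_Y, H_XY - H_X, H_XY)      # H(X|Y) = H(X,Y) - H(Y)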

3.15 Show that $I(X; Y) = H(X) + H(Y) - H(XY)$.
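
A sketch of the expansion behind this identity:

$$I(X; Y) = \sum_{x,y} P(x,y) \log \frac{P(x,y)}{P(x) P(y)} = -\sum_{x,y} P(x,y) \log P(x) - \sum_{x,y} P(x,y) \log P(y) + \sum_{x,y} P(x,y) \log P(x,y),$$

where the three sums are $H(X)$, $H(Y)$, and $-H(XY)$ respectively, by the marginalization identities of Problem 3.11(a).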


3.16 Show that, for statistically independent events,
$$H(X_1 X_2 \cdots X_n) = \sum_{i=1}^{n} H(X_i)$$



3.18 Show that
$$I(X_3; X_2 \mid X_1) = H(X_3 \mid X_1) - H(X_3 \mid X_1 X_2)$$
and that
$$H(X_3 \mid X_1) \ge H(X_3 \mid X_1 X_2)$$

3.21 The optimum four-level nonuniform quantizer for a Gaussian-distributed signal amplitude results in the four levels $a_1, a_2, a_3$, and $a_4$, with corresponding probabilities of occurrence $p_1 = p_2 = 0.3365$ and $p_3 = p_4 = 0.1635$.
a) Design a Huffman code that encodes a single level at a time and determine the average bit rate.
b) Design a Huffman code that encodes two output levels at a time and determine the average bit rate.
c) What is the minimum rate obtained by encoding $J$ output levels at a time as $J \to \infty$?
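
A sketch of the rate computations in Python (my own helper; it accumulates the average Huffman length from the merged-node probabilities instead of building code words):

import heapq
from itertools import product
from math import log2

def huffman_avg_len(probs):
    # Each merge adds one bit to every leaf beneath it, i.e. the merged
    # probability, so summing merged probabilities over all merges
    # yields sum_i p_i * length_i.
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tie, total = len(heap), 0.0
    while len(heap) > 1:
        p0, _ = heapq.heappop(heap)
        p1, _ = heapq.heappop(heap)
        total += p0 + p1
        heapq.heappush(heap, (p0 + p1, tie))
        tie += 1
    return total

p = [0.3365, 0.3365, 0.1635, 0.1635]
R1 = huffman_avg_len(p)                           # part (a): bits per level
R2 = huffman_avg_len([a * b for a, b in
                      product(p, repeat=2)]) / 2  # part (b): bits per level
H = -sum(q * log2(q) for q in p)                  # part (c): entropy bound
print(f"R1 = {R1:.4f}, R2 = {R2:.4f}, H = {H:.4f} bits/level")

For part (c), blocking $J$ levels at a time gives a per-level rate $R_J$ with $H \le R_J < H + 1/J$, so the minimum rate as $J \to \infty$ is the source entropy $H$.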

3.25 Find the Lempel-Ziv source code for the binary source sequence
000100100000011000010000000100000010100001000000110100000001100
Recover the original sequence from the Lempel-Ziv source code.
[Hint: You require two passes of the binary sequence to decide on the size of the dictionary.]
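
Below is a sketch of the parsing step in Python, reading the hint as: pass 1 counts the phrases, which fixes the width of a dictionary index, and pass 2 emits fixed-width (prefix index, new bit) pairs. Variable names and output format are mine:

seq = ("0001001000000110000100000001000000101000"
       "01000000110100000001100")

# Pass 1: greedily parse into the shortest phrases not yet in the dictionary.
phrases, i = {}, 0
while i < len(seq):
    j = i + 1
    while j < len(seq) and seq[i:j] in phrases:
        j += 1
    phrases.setdefault(seq[i:j], len(phrases) + 1)  # the tail may repeat a phrase
    i = j

# Pass 2: encode each phrase as (index of its longest proper prefix, new bit).
bits = len(phrases).bit_length()                    # index field width
for phrase, idx in phrases.items():
    prefix = phrases.get(phrase[:-1], 0)            # 0 encodes the empty prefix
    print(f"{idx:2d}: {prefix:0{bits}b}{phrase[-1]}  ({phrase})")

Decoding rebuilds the same dictionary in order: each received pair appends its new bit to the phrase stored at the transmitted index, which recovers the original sequence.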
