
This document contains 6 questions related to information coding and theory. The questions cover topics such as: proving transinformation of a continuous system is non-negative, verifying an expression for conditional entropy, proving the amount of information carried by equally likely and independent messages, calculating the information rate of a telegraph source, finding the information rate and error-free transmission requirements of a sampled and quantized analog signal, and constructing and comparing Shannon-Fano and Huffman codes for a discrete source.

Uploaded by RAHUL
© Attribution Non-Commercial (BY-NC)

ASSIGNMENT 1

INFORMATION CODING AND THEORY


SEM : V / VII

1. Show that transinformation of a continuous system is non – negative.
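
Hint: one common route is Jensen's inequality applied to the concave log; a sketch (other proofs are possible):

```latex
I(X;Y) = \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy,
\quad\text{so}\quad
-I(X;Y) = \iint p(x,y)\,\log\frac{p(x)\,p(y)}{p(x,y)}\,dx\,dy
\le \log\iint p(x)\,p(y)\,dx\,dy = \log 1 = 0,
```

with equality iff p(x,y) = p(x)p(y), i.e. X and Y are independent. Hence I(X;Y) >= 0.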

2. Verify the following expression:


H(Y/X) = - Σ_{j=1}^{m} Σ_{k=1}^{n} p(x_j, y_k) log p(y_k / x_j)
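
As a quick numerical sanity check of this expression, a short Python sketch; the 2x2 joint distribution below is a made-up example, not part of the question:

```python
import math

# Hypothetical joint distribution p(x_j, y_k); rows index x, columns index y.
P = [[0.3, 0.1],
     [0.2, 0.4]]

px = [sum(row) for row in P]  # marginals p(x_j)

# H(Y/X) = -sum_j sum_k p(x_j, y_k) * log2 p(y_k / x_j)
H_Y_given_X = -sum(
    P[j][k] * math.log2(P[j][k] / px[j])
    for j in range(2) for k in range(2)
)

py = [P[0][k] + P[1][k] for k in range(2)]
H_Y = -sum(p * math.log2(p) for p in py)  # unconditional entropy H(Y)

print(round(H_Y_given_X, 4))  # conditioning cannot increase entropy
```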

3. If there are M = 2^N equally likely and independent messages, then prove that the
amount of information carried by each message is I = N bits.
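
In outline, assuming the M messages satisfy M = 2^N, so each message has probability 1/M:

```latex
I = \log_2\frac{1}{p} = \log_2 M = \log_2 2^{N} = N \ \text{bits per message.}
```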

4. Consider a telegraph source having two symbols, dot and dash. The dot duration
is 0.2 sec and the dash duration is twice that of the dot. The time between symbols
is 0.2 sec. Calculate the information rate of the telegraph source.
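
A minimal sketch of the computation; the question does not state the symbol probabilities, so an equiprobable source is assumed here purely for illustration:

```python
import math

# Assumed probabilities (not given in the question).
p = {"dot": 0.5, "dash": 0.5}

# Durations from the question: dot 0.2 s, dash twice the dot, 0.2 s gap after each symbol.
dur = {"dot": 0.2, "dash": 0.4}
gap = 0.2

H = -sum(pi * math.log2(pi) for pi in p.values())  # entropy, bits/symbol
T_avg = sum(p[s] * (dur[s] + gap) for s in p)      # average symbol time, s
R = H / T_avg                                      # information rate, bits/s
print(H, T_avg, R)
```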

5. An analog signal of 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate,
and each sample is quantized into one of 256 equally likely levels.
(i) Find the information rate of this source.
(ii) Can the output of this source be transmitted without error over an AWGN channel
with a bandwidth of 10 kHz and an S/N ratio of 20 dB? If not, find the S/N ratio
required for error-free transmission.
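
The arithmetic behind both parts can be sketched directly from the Shannon-Hartley capacity formula:

```python
import math

B_signal = 4000             # Hz, signal bandwidth
fs = 1.25 * 2 * B_signal    # sampling rate: 1.25 x Nyquist = 10,000 samples/s
bits_per_sample = math.log2(256)  # 8 bits for 256 equally likely levels
R = fs * bits_per_sample          # information rate, bits/s

B_ch = 10_000               # Hz, channel bandwidth
snr = 10 ** (20 / 10)       # 20 dB -> ratio of 100
C = B_ch * math.log2(1 + snr)     # channel capacity, bits/s

# Error-free transmission needs C >= R; solve B*log2(1 + S/N) = R for S/N.
snr_needed = 2 ** (R / B_ch) - 1
snr_needed_db = 10 * math.log10(snr_needed)
print(R, C, R <= C, round(snr_needed_db, 2))
```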

6. A discrete source has five symbols x1, x2, x3, x4 and x5 with probabilities 0.4, 0.19,
0.16, 0.15 and 0.1 respectively. Construct a Shannon-Fano code for the source
and calculate the code efficiency. Compare the results with those obtained using
Huffman coding.
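
A minimal sketch of the Huffman side of the comparison, assuming the five probabilities are 0.4, 0.19, 0.16, 0.15 and 0.1 (chosen so they sum to 1); it computes code lengths, average length, entropy, and efficiency:

```python
import heapq
import math

probs = [0.4, 0.19, 0.16, 0.15, 0.1]  # x1..x5

# Huffman code lengths: repeatedly merge the two least probable groups;
# every symbol inside a merged group gains one bit of code length.
lengths = [0] * len(probs)
heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (prob, tie-break, members)
heapq.heapify(heap)
counter = len(probs)
while len(heap) > 1:
    p1, _, m1 = heapq.heappop(heap)
    p2, _, m2 = heapq.heappop(heap)
    for s in m1 + m2:
        lengths[s] += 1
    heapq.heappush(heap, (p1 + p2, counter, m1 + m2))
    counter += 1

L_avg = sum(p * l for p, l in zip(probs, lengths))  # average code length
H = -sum(p * math.log2(p) for p in probs)           # source entropy
efficiency = H / L_avg
print(lengths, round(L_avg, 2), round(H, 3), round(efficiency, 3))
```

For these probabilities the Huffman lengths come out as 1, 3, 3, 3, 3, giving an average length of 2.2 bits/symbol against an entropy of about 2.15 bits/symbol.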
