
Model Question Paper

Second Semester M. Tech Degree Examination in


Electronics and Communication Engineering
Stream: Telecommunication Engineering (2013 Scheme)

TTE2008: Information Theory


Time: 3 hours

Max. Marks: 60

Instructions: Answer any two questions from each module (each carries 10 marks).
Module I
1. (a) Consider a source S emitting M messages S = {S1, S2, ..., SM} whose probabilities
of occurrence are given by P = {p1, p2, ..., pM}. Prove that the entropy of the source
H(S) is bounded within the limits 0 ≤ H(S) ≤ log2 M.

(4)

(b) Prove that the relative entropy between two probability mass functions p(x) and q(x)
is non-negative and is equal to zero only when p(x) = q(x).

(4)

(c) If the random variables X, Y and Z form a Markov chain, prove that:
p(x, z/y) = p(x/y) · p(z/y).

(2)
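For reference, a minimal numerical illustration of the bounds in 1(a) and 1(b), assuming
Python with NumPy (the helper names H and D are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)

    def H(p):
        """Entropy in bits of a probability vector p."""
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def D(p, q):
        """Relative entropy D(p||q) in bits; assumes p, q strictly positive."""
        return float((p * np.log2(p / q)).sum())

    M = 8
    for _ in range(5):
        p = rng.dirichlet(np.ones(M))   # random pmf on M symbols
        q = rng.dirichlet(np.ones(M))
        assert 0 <= H(p) <= np.log2(M) + 1e-9   # bound of question 1(a)
        assert D(p, q) >= 0                     # non-negativity of 1(b)
    print(D(p, p))                              # equality case of 1(b): 0.0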

2. (a) The channel matrix P(Y/X) between the source X and the destination Y in a
communication system is given by:

                X=1    X=2    X=3    X=4
        Y=1    1/4    1/4    1/4    1/4
        Y=2    1/8    1/2    1/4    1/4
        Y=3    1/8    1/4    1/2    1/2
        Y=4    1/2     0      0      0

(entry (y, x) is P(Y = y / X = x), so each column sums to 1). Draw the channel
diagram. If the marginal probability distributions of X and Y are given by
P(X) = {1/2, 1/4, 1/8, 1/8} and P(Y) = {1/4, 1/4, 1/4, 1/4}, find H(X), H(Y),
H(X,Y), H(X/Y), H(Y/X) and I(X; Y).

(6)
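For reference, a minimal numerical sketch of these quantities, assuming the channel
matrix as reconstructed above and Python with NumPy (the helper name H is illustrative):

    import numpy as np

    # Channel matrix P(y/x): rows indexed by y, columns by x.
    W = np.array([[1/4, 1/4, 1/4, 1/4],
                  [1/8, 1/2, 1/4, 1/4],
                  [1/8, 1/4, 1/2, 1/2],
                  [1/2, 0.0, 0.0, 0.0]])
    px = np.array([1/2, 1/4, 1/8, 1/8])

    pxy = W * px                  # joint p(x, y) = p(x) * p(y/x), shape (y, x)
    py = pxy.sum(axis=1)          # marginal of Y: uniform, 1/4 each

    def H(p):
        """Entropy in bits of a probability vector, ignoring zero entries."""
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    HX, HY, HXY = H(px), H(py), H(pxy.ravel())
    print(HX, HY, HXY)            # 1.75, 2.0, 3.375 bits
    print(HXY - HY, HXY - HX)     # H(X/Y) = 1.375, H(Y/X) = 1.625 bits
    print(HX + HY - HXY)          # I(X; Y) = 0.375 bits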


(b) What do you mean by relative entropy? Derive the formula for the relative entropy
between two probability mass functions p(x) and q(x).

(2)

(c) Prove that the mutual information between two random variables X and Y is non-negative using Jensen's inequality.

(2)

3. (a) Let X and Y be random variables that take on values x1, x2, ..., xr and y1, y2, ..., ys
respectively. Let Z = X + Y.
i) Show that H(Z/X) = H(Y/X).
ii) If X and Y are independent, prove that H(Y) ≤ H(Z) and H(X) ≤ H(Z).

(4)

(b) Let a source X emit messages with probability distribution X ~ p(x), where
X = {1, 2, 3, 4, 5, 6, 7, 8} and p(X) = {1/4, 1/4, 1/8, 1/8, 1/16, 1/16, 1/16, 1/16}.
Find a ternary Huffman code for the source. Also find its efficiency and redundancy.

(4)
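A sketch of the ternary Huffman construction for this source, assuming Python (the
function name, the dummy-symbol padding bookkeeping and the tie-breaking are
illustrative choices, not prescribed by the paper):

    import heapq
    from math import log2

    def ternary_huffman(probs):
        """Ternary (D = 3) Huffman code for a dict {symbol: probability}.

        Pads with zero-probability dummy symbols until the leaf count n
        satisfies (n - 1) % (D - 1) == 0, then repeatedly merges the three
        least probable nodes, prefixing digits 0/1/2 to their codewords.
        """
        D = 3
        items = list(probs.items())
        while (len(items) - 1) % (D - 1) != 0:
            items.append((("dummy", len(items)), 0.0))
        code = {s: "" for s, _ in items}
        heap = [(p, i, [s]) for i, (s, p) in enumerate(items)]
        heapq.heapify(heap)
        uid = len(heap)
        while len(heap) > 1:
            total, group = 0.0, []
            for digit in range(D):
                p, _, syms = heapq.heappop(heap)
                for s in syms:
                    code[s] = str(digit) + code[s]
                total += p
                group += syms
            heapq.heappush(heap, (total, uid, group))
            uid += 1
        return {s: c for s, c in code.items() if s in probs}

    p = {1: 1/4, 2: 1/4, 3: 1/8, 4: 1/8,
         5: 1/16, 6: 1/16, 7: 1/16, 8: 1/16}
    code = ternary_huffman(p)
    L = sum(p[s] * len(code[s]) for s in p)                # average length, trits
    H3 = -sum(q * log2(q) for q in p.values()) / log2(3)   # entropy, trits
    print(code)
    print(L, H3, H3 / L, 1 - H3 / L)  # length, entropy, efficiency, redundancy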

(c) Derive the chain rules for entropy, mutual information and relative entropy.

(2)

Module II
4. (a) A discrete memoryless source emits a sequence of statistically independent binary
digits with probabilities p(1) = 0.005 and p(0) = 0.995. The digits are taken 100 at a time
and a binary codeword is provided for every sequence of 100 digits containing three or
fewer ones.
i) Assuming that all codewords are of the same length, find the minimum length
required to provide codewords for all sequences with three or fewer ones.
ii) Calculate the probability of observing a source sequence for which no
codeword has been assigned.
iii) Let

    X = { (1, 1/2), (2, 1/4), (3, 1/4) }

If X1, X2, ..., Xn are drawn i.i.d. according to the above distribution, find
P(X1, X2, ..., Xn) for large values of n. Note: in each pair, the first element
represents the value X takes and the second element is its probability.
(6)
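A numerical sketch for parts (i)-(iii), assuming Python's math module:

    from math import comb, ceil, log2

    n, p1 = 100, 0.005
    # i) number of sequences with three or fewer ones, and the minimum
    #    fixed codeword length that covers all of them
    count = sum(comb(n, k) for k in range(4))      # 166751 sequences
    length = ceil(log2(count))                     # 18 bits
    # ii) probability that a sequence contains four or more ones,
    #     i.e. that no codeword has been assigned to it
    p_covered = sum(comb(n, k) * p1**k * (1 - p1)**(n - k) for k in range(4))
    print(count, length, 1 - p_covered)            # P(no codeword) ~ 0.0017
    # iii) by the AEP, for the source {(1, 1/2), (2, 1/4), (3, 1/4)} with
    #      H(X) = 1.5 bits, P(X1, ..., Xn) ~ 2**(-1.5 * n) for large n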

(b) Briefly explain the properties of the typical set.

(4)

5. (a)
i) Prove that D(f||g) ≥ 0, with equality iff f = g almost everywhere, where D(f||g)
represents the relative entropy between two probability density functions f and g.
ii) Derive the chain rule for differential entropy.
iii) Prove that h(aX) = h(X) + log|a|, where h(X) represents the differential
entropy.
iv) Derive the differential entropy of a normal distribution.

(5)
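For part (iv), a quick numerical cross-check of the closed form
h(X) = (1/2) log2(2πeσ²), assuming Python with NumPy (the variance and integration
grid are arbitrary choices):

    import numpy as np

    sigma = 2.0
    x = np.linspace(-40, 40, 400001)               # wide grid, fine spacing
    f = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    dx = x[1] - x[0]
    h_numeric = -(f * np.log2(f)).sum() * dx       # Riemann sum of -f log2 f
    h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
    print(h_numeric, h_closed)                     # both ~ 3.047 bits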

(b) Consider a typewriter with 26 keys.

i) If pushing a key results in printing the associated letter, what is the capacity in
bits?
ii) Now suppose that pushing a key results in printing that letter or the next with
equal probability; thus A prints A or B, ..., Z prints Z or A. What is the capacity?
iii) What is the highest rate code with block length one that achieves zero
probability of error for the above channel?
(5)
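For reference, a minimal Blahut-Arimoto sketch that recovers the noisy-typewriter
capacity numerically, assuming Python with NumPy (the solver is a standard capacity
algorithm used here only as a cross-check; the closed-form answers are log2 26 for (i)
and log2 13 for (ii) and (iii)):

    import numpy as np

    def blahut_arimoto(W, iters=2000):
        """Capacity in bits of a DMC with transition matrix W[x, y] = p(y/x)."""
        n = W.shape[0]
        p = np.full(n, 1.0 / n)          # input distribution, start uniform
        for _ in range(iters):
            q = p @ W                    # induced output distribution
            with np.errstate(divide="ignore", invalid="ignore"):
                d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
            p = p * np.exp2(d)           # multiplicative update
            p /= p.sum()
        q = p @ W
        d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
        return float(p @ d)

    # Noisy typewriter: key x prints letter x or x+1 (mod 26), each w.p. 1/2.
    n = 26
    W = np.zeros((n, n))
    for x in range(n):
        W[x, x] = W[x, (x + 1) % n] = 0.5
    print(blahut_arimoto(W))             # ~ 3.700 = log2(13) bits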
6. (a) State and prove Fano's inequality with respect to the channel coding theorem.

(6)

(b) Find the mutual information I(X; Y), where

    (X, Y) ~ N( 0, K ),    K = [ σ²    ρσ²
                                 ρσ²   σ²  ]

Evaluate I(X; Y) for ρ = 1, ρ = 0 and ρ = −1.

(4)
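For reference, a sketch of the standard evaluation, assuming the covariance matrix K
reconstructed above: h(X) = h(Y) = (1/2) log 2πeσ² and h(X, Y) = (1/2) log((2πe)² |K|)
with |K| = σ⁴(1 − ρ²), so

    I(X; Y) = h(X) + h(Y) − h(X, Y) = −(1/2) log(1 − ρ²),

which evaluates to ∞ for ρ = 1, 0 for ρ = 0, and ∞ for ρ = −1.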

Module III
7. (a) What do you mean by the rate distortion function? Derive the rate distortion
function for a Bernoulli source with Hamming distortion.

(10)

8. State and prove the converse to the rate distortion theorem.

(10)
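For question 7, a minimal numerical sketch of the resulting formula
R(D) = H(p) − H(D) for 0 ≤ D ≤ min(p, 1 − p), assuming Python with NumPy
(h2 and rd_bernoulli are illustrative helper names):

    import numpy as np

    def h2(x):
        """Binary entropy in bits, with h2(0) = h2(1) = 0."""
        x = np.clip(x, 1e-12, 1 - 1e-12)
        return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

    def rd_bernoulli(p, D):
        """R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p), else 0."""
        return float(np.where(D < min(p, 1 - p), h2(p) - h2(D), 0.0))

    p = 0.3
    for D in (0.0, 0.1, 0.2, 0.3):
        print(D, rd_bernoulli(p, D))   # R(0) = H(0.3) ~ 0.881, R(0.3) = 0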

9. (a) Consider a source X uniformly distributed on the set {1, 2, ..., m}. Find the rate
distortion function for this source with Hamming distortion, i.e.,

    d(x, x̂) = 0 if x = x̂,  and  d(x, x̂) = 1 if x ≠ x̂.

(5)
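For reference, the standard answer (a sketch, with H(D) the binary entropy function):
R(D) = log m − H(D) − D log(m − 1) for 0 ≤ D ≤ 1 − 1/m, and R(D) = 0 for larger D;
the lower bound follows from Fano's inequality and is achieved by a symmetric test
channel.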


(b) Briefly explain the reverse water-filling procedure for the rate distortion function
of a parallel Gaussian source.
(5)
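A minimal reverse water-filling sketch, assuming Python with NumPy (the component
variances, total distortion budget and bisection loop are illustrative assumptions):

    import numpy as np

    def reverse_waterfill(sigma2, D_total):
        """Rate distortion for independent Gaussians N(0, sigma_i^2).

        Bisect for the water level lam with sum_i min(lam, sigma_i^2) = D_total;
        then D_i = min(lam, sigma_i^2) and R = sum_i 0.5 * log2(sigma_i^2 / D_i).
        """
        sigma2 = np.asarray(sigma2, dtype=float)
        lo, hi = 0.0, sigma2.max()
        for _ in range(100):
            lam = 0.5 * (lo + hi)
            if np.minimum(lam, sigma2).sum() > D_total:
                hi = lam
            else:
                lo = lam
        D = np.minimum(lam, sigma2)
        R = float((0.5 * np.log2(sigma2 / D)).clip(min=0).sum())
        return lam, D, R

    lam, D, R = reverse_waterfill([4.0, 1.0, 0.25], D_total=1.0)
    print(lam, D, R)   # lam = 0.375, D = [0.375, 0.375, 0.25], R ~ 2.415 bits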
