15EC54
OR
2 a. Mention different properties of entropy and prove the extremal property. (07 Marks)
b. A source emits one of four symbols S1, S2, S3 and S4 with probabilities 7/16, 5/16, 1/8
and 1/8 respectively. Show that H(S^2) = 2H(S). (04 Marks)
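Part (b) can also be verified numerically: the second extension of a memoryless source consists of all ordered symbol pairs, and its entropy comes out exactly 2H(S). A minimal sketch (plain Python, probabilities from the question):

```python
import math
from itertools import product

p = [7/16, 5/16, 1/8, 1/8]                    # P(S1)..P(S4)
H = -sum(q * math.log2(q) for q in p)         # source entropy, bits/symbol

# second extension S^2: all ordered pairs of independent symbols
p2 = [a * b for a, b in product(p, repeat=2)]
H2 = -sum(q * math.log2(q) for q in p2)

print(f"H(S) = {H:.4f}, H(S^2) = {H2:.4f}")   # H(S^2) = 2 H(S)
```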
c. In a facsimile transmission of a picture, there are about 2.25 × 10^6 pixels/frame. For a good
reproduction at the receiver 12 brightness levels are necessary. Assume all these levels are
equally likely to occur. Find the rate of information if one picture is to be transmitted every
3 min. Also compute the source efficiency. (05 Marks)
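Part (c) reduces to multiplying the per-pixel entropy by the pixel count and dividing by the transmission time; a quick check with the question's numbers:

```python
import math

pixels = 2.25e6        # pixels per frame
levels = 12            # equiprobable brightness levels
t = 3 * 60             # 3 minutes, in seconds

h = math.log2(levels)      # bits per pixel (equally likely levels)
info = pixels * h          # bits per picture
rate = info / t            # information rate, bits per second
# with equally likely levels, H = log2(12) = H_max, so source efficiency is 100%
print(f"h = {h:.4f} bits/pixel, R = {rate:.1f} bits/s")
```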
Module-2
3 a. A discrete memoryless source has an alphabet of five symbols with their probabilities as
given below : (10 Marks)
Symbol S0 S1 S2 S3 S4
Probabilities 0.55 0.15 0.15 0.1 0.05
Compute the Huffman code by placing the composite symbol as high as possible and by
placing it as low as possible. Also find i) the average codeword length
ii) the variance of the codeword lengths for both cases.
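The average length and variance in Q3(a) can be cross-checked with a small Huffman routine. This is only a sketch: it tracks codeword lengths (which fix the average and variance), and its tie-breaking realises just one of the two required placements, so the hand construction is still needed for both cases:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Return Huffman codeword lengths; each merge adds one bit to its members."""
    tick = count()                       # tie-breaker: avoids comparing lists
    heap = [(p, next(tick), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tick), s1 + s2))
    return lengths

probs = [0.55, 0.15, 0.15, 0.1, 0.05]    # S0..S4
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
var = sum(p * (l - avg) ** 2 for p, l in zip(probs, L))
print(L, avg, var)
```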
b. Using Shannon–Fano coding, find code words for the probability distribution
P = {1/2, 1/4, 1/8, 1/8}. Find the average code word length and efficiency. (06 Marks)
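For a dyadic distribution like the one in part (b), Shannon–Fano assigns lengths l_i = −log2 p_i exactly, so the average length equals the entropy and the efficiency is 100%; a quick check:

```python
import math

probs = [1/2, 1/4, 1/8, 1/8]
lengths = [round(-math.log2(p)) for p in probs]    # dyadic: l_i = -log2 p_i
avg = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
eff = H / avg                                      # efficiency = H / avg length
print(lengths, avg, eff)
```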
OR
4 a. Write a short note on the Lempel–Ziv algorithm. (05 Marks)
b. Derive the source coding theorem. (05 Marks)
c. Apply Shannon’s encoding algorithm and generate binary codes for the set of messages
given below. Also find variance, code efficiency and redundancy. (06 Marks)
Message M1 M2 M3 M4 M5
Probability 1/8 1/16 3/16 1/4 3/8
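Shannon's encoding algorithm in Q4(c) sorts the probabilities in descending order, takes l_i = ⌈log2(1/p_i)⌉, and reads each codeword as the first l_i bits of the binary expansion of the cumulative probability. A hedged sketch of that procedure (function name is illustrative):

```python
import math

def shannon_codes(probs):
    """Shannon's algorithm: sort descending; l_i = ceil(log2 1/p_i); codeword =
    first l_i bits of the binary expansion of the cumulative probability."""
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    codes, cum = {}, 0.0
    for i in order:
        l = math.ceil(-math.log2(probs[i]))
        frac, bits = cum, ""
        for _ in range(l):                  # binary expansion of cum
            frac *= 2
            bit, frac = divmod(frac, 1)
            bits += str(int(bit))
        codes[i] = bits
        cum += probs[i]
    return codes

probs = [1/8, 1/16, 3/16, 1/4, 3/8]         # M1..M5
codes = shannon_codes(probs)
avg = sum(p * len(codes[i]) for i, p in enumerate(probs))
print(codes, avg)
```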
Module-3
5 a. Find the capacity of the discrete channel whose noise matrix is (04 Marks)
P(y/x) = | 0.8  0.2  0   |
         | 0.1  0.8  0.1 |
         | 0    0.2  0.8 |
b. Define Mutual Information. List the properties of Mutual Information and prove that
I(x; y) = H(x) + H(y) − H(x, y) bits/symbol. (06 Marks)
c. A channel has the following characteristics:
P(y/x) = | 1/3  1/3  1/6  1/6 |
         | 1/6  1/6  1/3  1/3 |
and P(x1) = P(x2) = 1/2. Find H(x), H(y), H(x, y) and the channel
capacity if r = 1000 symbols/sec. (06 Marks)
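Question 5(c) follows mechanically once the joint matrix P(x, y) = P(x)·P(y/x) is formed; a sketch with the question's numbers:

```python
import math

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

Pyx = [[1/3, 1/3, 1/6, 1/6],
       [1/6, 1/6, 1/3, 1/3]]
Px = [1/2, 1/2]

joint = [[Px[i] * Pyx[i][j] for j in range(4)] for i in range(2)]
Py = [joint[0][j] + joint[1][j] for j in range(4)]

Hx, Hy = H(Px), H(Py)
Hxy = H([p for row in joint for p in row])
I = Hx + Hy - Hxy               # mutual information, bits/symbol
# rows of P(y/x) are permutations of each other and columns sum equally, so the
# channel is symmetric and equiprobable inputs achieve capacity: C = I here
r = 1000                        # symbols/sec
print(Hx, Hy, Hxy, I, I * r)    # I*r = information rate in bits/sec
```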
OR
6 a. A binary symmetric channel has source probabilities P(x1) = 2/3 and P(x2) = 1/3,
and noise matrix (08 Marks)
P(y/x) = | 3/4  1/4 |
         | 1/4  3/4 |
i) Determine H(x), H(y), H(x, y), H(y/x), H(x/y) and I(x; y).
ii) Find channel capacity C. iii) Find channel efficiency and redundancy.
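The same joint-matrix bookkeeping handles Q6(a); here the channel is a BSC with crossover probability 1/4, whose capacity has the closed form C = 1 − H(1/4). A sketch (variable names are illustrative):

```python
import math

def Hb(p):
    """Binary entropy function, bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

Px = [2/3, 1/3]
Pyx = [[3/4, 1/4],
       [1/4, 3/4]]

joint = [[Px[i] * Pyx[i][j] for j in range(2)] for i in range(2)]
Py = [joint[0][j] + joint[1][j] for j in range(2)]

Hx, Hy = Hb(Px[0]), Hb(Py[0])
Hxy = -sum(p * math.log2(p) for row in joint for p in row)
I = Hx + Hy - Hxy               # mutual information with the given source
C = 1 - Hb(1/4)                 # BSC capacity, achieved by equiprobable inputs
eff = I / C                     # channel efficiency; redundancy = 1 - eff
print(I, C, eff)
```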
b. Derive an expression for channel efficiency for a Binary Erasure channel. (05 Marks)
c. Write a note on Differential Entropy. (03 Marks)
Module-4
7 a. For a systematic (6, 3) linear block code generated by C4 = d1 ⊕ d3, C5 = d2 ⊕ d3,
C6 = d1 ⊕ d2:
i) Find all possible code vectors ii) Draw encoder circuit and syndrome circuit
iii) Detect and correct the code word if the received code word is 110010.
iv) Find the Hamming weight of all code vectors, the minimum Hamming distance, and the
error detecting and correcting capability. (14 Marks)
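The codebook and distance properties in Q7(a) are small enough to enumerate; a sketch using the question's parity equations (the encoder/syndrome circuits and the correction of 110010 still have to be worked by hand):

```python
from itertools import product

def encode(d1, d2, d3):
    """Systematic (6, 3) code: C4 = d1^d3, C5 = d2^d3, C6 = d1^d2."""
    return (d1, d2, d3, d1 ^ d3, d2 ^ d3, d1 ^ d2)

codewords = [encode(*d) for d in product([0, 1], repeat=3)]
weights = [sum(c) for c in codewords]
dmin = min(w for w in weights if w > 0)   # min distance = min nonzero weight
print(codewords)
print("dmin =", dmin,
      "-> detects", dmin - 1, "errors, corrects", (dmin - 1) // 2)
```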
b. Define the following: i) Block code and convolutional code ii) Systematic and
non-systematic code. (02 Marks)
OR
8 a. A linear Hamming code for (7, 4) is described by a generator polynomial g(x) = 1 + x + x^3.
Determine Generator Matrix and Parity check matrix. (03 Marks)
b. A generator polynomial for a (15, 7) cyclic code is g(x) = 1 + x^4 + x^6 + x^7 + x^8.
i) Find the code vector for the message D(x) = x^2 + x^3 + x^4 using the cyclic encoder circuit.
ii) Draw the syndrome calculation circuit and find the syndrome of the received polynomial
Z(x) = 1 + x + x^3 + x^6 + x^8 + x^9 + x^11 + x^14. (13 Marks)
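The cyclic-code arithmetic in Q8(b) can be checked with GF(2) polynomial division, representing polynomials as integer bitmasks (bit i = coefficient of x^i). A sketch, with systematic encoding as the remainder of x^(n−k)·D(x) mod g(x):

```python
def gf2_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division (polys as int bitmasks, bit i = x^i)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

g = 0b111010001                     # g(x) = 1 + x^4 + x^6 + x^7 + x^8
d = 0b11100                         # D(x) = x^2 + x^3 + x^4
parity = gf2_mod(d << 8, g)         # n - k = 8 parity bits
code = (d << 8) | parity            # systematic code polynomial
# received polynomial Z(x) = 1 + x + x^3 + x^6 + x^8 + x^9 + x^11 + x^14
z = sum(1 << i for i in (0, 1, 3, 6, 8, 9, 11, 14))
syndrome = gf2_mod(z, g)            # nonzero syndrome => errors detected
print(bin(code), bin(syndrome))
```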
Module-5
9 a. Consider the (3, 1, 2) convolutional code with g1 = 110, g2 = 101, g3 = 111. (12 Marks)
i) Draw the encoder block diagram ii) Find the generator matrix
iii) Find the code word corresponding to the information sequence 11101 using time
domain and transform Domain approach.
b. Write a short note on BCH codes. (04 Marks)
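The time-domain encoding asked for in Q9 a(iii) is discrete convolution of the message with each generator sequence (mod 2); a minimal sketch, with the tail bits flushing the registers:

```python
def conv_encode(msg, gens):
    """(n,1,m) time-domain encoding: v_j(t) = sum_k g_j[k] * u[t-k] (mod 2)."""
    K = len(gens[0])                       # number of taps (constraint length)
    out = []
    for t in range(len(msg) + K - 1):      # K-1 tail steps flush the registers
        for g in gens:
            bit = 0
            for k in range(K):
                if 0 <= t - k < len(msg):
                    bit ^= g[k] & msg[t - k]
            out.append(bit)
    return out

msg = [1, 1, 1, 0, 1]
gens = [[1, 1, 0], [1, 0, 1], [1, 1, 1]]   # g1 = 110, g2 = 101, g3 = 111
code = conv_encode(msg, gens)
print(code)
```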
OR
10 For a (2, 1, 3) convolutional encoder with g1 = 1011, g2 = 1101: (16 Marks)
a. Draw the state diagram b. Draw the code tree.
c. Draw the trellis diagram and find the code word for the message 1 1 1 0 1.
d. Using the Viterbi decoding algorithm, decode the obtained code word assuming the first bit is erroneous.
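The state diagram for Q10 can be enumerated programmatically; a sketch assuming the state is the three most recent input bits, newest first (the labelling convention is an assumption, not necessarily the textbook's ordering):

```python
from itertools import product

g1, g2 = [1, 0, 1, 1], [1, 1, 0, 1]        # g1 = 1011, g2 = 1101

def step(state, u):
    """One encoder step: state = (s1, s2, s3), newest bit first."""
    regs = [u] + list(state)               # current input plus register contents
    v1 = sum(g1[k] & regs[k] for k in range(4)) % 2
    v2 = sum(g2[k] & regs[k] for k in range(4)) % 2
    return (v1, v2), tuple(regs[:3])       # outputs, next state

for state in product([0, 1], repeat=3):    # 2^3 = 8 states
    for u in (0, 1):
        out, nxt = step(state, u)
        print(state, u, "->", out, nxt)
```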
*****