Model Question Paper 17B

The document is a model question paper for an Electronics and Communication Engineering course on Information Theory and Coding. It contains 20 multiple choice or numerical questions covering topics such as entropy, Shannon's theorem, channel capacity, error detection and correction codes, Huffman coding and CIRC encoding. Some sample questions include calculating the entropy of a random variable, finding the channel capacity of an AWGN channel, constructing a Huffman code given message probabilities, and designing a linear block code by specifying its generator and parity matrices.


SREE NARAYANA GURUKULAM COLLEGE OF ENGINEERING
KADAYIRUPPU, KOLENCHERY
MODEL QUESTION PAPER
SEVENTH SEMESTER ELECTRONICS AND COMMUNICATION

INFORMATION THEORY AND CODING


Time: 3 hrs Maximum marks: 100

Part A
Answer all questions.
1. Define entropy for a discrete random variable and state its properties.
2. A source emits one of 4 possible symbols during each signalling interval. The symbols occur with probabilities p0 = 0.4, p1 = 0.3, p2 = 0.2, p3 = 0.1. Find the entropy of the source.
3. State Shannon's theorem and explain its significance.
4. Explain the trade-off between bandwidth and signal-to-noise ratio.
5. Write short notes on the different types of codes.
6. State and prove the source coding theorem.
7. Discuss the error detection and correction capabilities of linear block codes.
8. How is syndrome decoding performed in cyclic codes?
9. Write short notes on the various interleaving techniques.
10. Explain sequential decoding.
(10 x 4 = 40 marks)
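As a numeric cross-check for Q2, a short Python sketch (not part of the original paper) evaluates the discrete entropy formula H(X) = Σ p_i log2(1/p_i):

```python
import math

# Entropy of a discrete memoryless source in bits/symbol
def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Q2: symbol probabilities 0.4, 0.3, 0.2, 0.1
H = entropy([0.4, 0.3, 0.2, 0.1])
print(round(H, 4))  # 1.8464 bits/symbol
```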

Part B
11. Messages Q1 to Qm have probabilities P1 to Pm of occurring.
a. Write the expression for the entropy.
b. If m = 3, write H in terms of p1 and p2 using the constraint p1 + p2 + p3 = 1.
c. Find p1 and p2 for H = Hmax by setting ∂H/∂p1 = 0 and ∂H/∂p2 = 0.
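As a numerical cross-check for Q11(c), a small Python sketch (not part of the paper) confirms that the ternary entropy H(p1, p2, 1 - p1 - p2) peaks at p1 = p2 = p3 = 1/3:

```python
import math

def H(p1, p2):
    # Ternary entropy with p3 = 1 - p1 - p2 (Q11b form)
    p3 = 1 - p1 - p2
    return sum(p * math.log2(1 / p) for p in (p1, p2, p3) if p > 0)

# Coarse grid search for the maximiser; analytically p1 = p2 = p3 = 1/3,
# giving Hmax = log2(3) = 1.585 bits
best = max(
    ((H(a / 100, b / 100), a / 100, b / 100)
     for a in range(1, 99) for b in range(1, 100 - a)),
    key=lambda t: t[0],
)
print(best)  # maximum near (log2(3) = 1.585, 0.33, 0.33)
```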
OR

12. For the given channel matrix, calculate H(X), H(Y) and I(X;Y), given that p(x1) = 0.6, p(x2) = 0.3, p(x3) = 0.1.
P(Y/X) = 0 0 0

13. Derive an expression for the channel capacity of a band-limited AWGN channel.
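The result of the derivation in Q13 is the Shannon-Hartley formula, which can be evaluated with a short sketch (the 4 kHz / 20 dB figures below are an illustrative assumption, not from the paper):

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley: C = B * log2(1 + S/N), in bits per second
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: 4 kHz channel at 20 dB SNR (S/N = 100)
C = awgn_capacity(4000, 100)
print(round(C))  # about 26633 bits/s
```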

OR

14. A BSC has an error probability of 0.25, with source probabilities p(x1) = 2/3 and p(x2) = 1/3.
a. Determine H(X), H(Y), H(Y/X), H(X,Y) and I(X;Y).
b. Find the channel capacity and the efficiency.
15. Explain, with a suitable example, the procedure of Huffman coding.
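The BSC quantities asked for in Q14 follow from the binary entropy function; a Python sketch (not part of the paper) reproduces the answer-key values:

```python
import math

def h2(p):
    # Binary entropy function in bits
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

eps, p1 = 0.25, 2 / 3                   # crossover probability, P(x1)
py1 = p1 * (1 - eps) + (1 - p1) * eps   # P(y1) by total probability
HX, HY, HYX = h2(p1), h2(py1), h2(eps)  # for a BSC, H(Y/X) = h2(eps)
I = HY - HYX                            # mutual information
C = 1 - h2(eps)                         # BSC channel capacity
print(HX, HY, HYX, I, C)  # 0.9183, 0.9799, 0.8113, 0.1686, 0.1887
```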
OR

16. Given the messages x1, x2, x3, x4, x5 and x6 with probabilities 0.4, 0.2, 0.2, 0.1, 0.07 and 0.03, construct a binary code by applying the Shannon-Fano coding procedure. Determine the efficiency of the code.
17. Design a (4,2) linear block code.
a. Find the generator and parity-check matrices.
b. Find the minimum Hamming distance.
c. What are the error detection and correction capabilities of the code?
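For Q17, a Python sketch of one possible (4,2) systematic code (the particular G below is an assumption consistent with the answer key's codeword set; any generator whose rows span the same code is equivalent):

```python
from itertools import product

# Assumed systematic generator matrix for a (4,2) linear block code
G = [[1, 0, 1, 1],
     [0, 1, 1, 0]]

def encode(msg, G):
    # c = m . G over GF(2)
    return [sum(m * g for m, g in zip(msg, col)) % 2
            for col in zip(*G)]

codewords = [encode(m, G) for m in product([0, 1], repeat=2)]
# For a linear code, d_min equals the minimum nonzero codeword weight
d_min = min(sum(c) for c in codewords if any(c))
print(codewords, d_min)  # d_min = 2: detects single errors, corrects none
```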
OR

18. Consider the (31,15) Reed-Solomon code.
a. How many bits are there in a symbol of the code?
b. What is the block length?
c. What is the minimum distance of the code?
d. How many symbols in error can the code correct?
19. For a (2,1,2) convolutional encoder, the information sequence is d = 10011, with g(1) = [111] and g(2) = [101].
a. Draw the encoder diagram and find its output sequence.
b. Find the generator matrix.
c. Draw its state diagram representation.
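The output sequence asked for in Q19(a) can be verified with a short rate-1/2 convolutional encoder sketch (g(1) = 111, g(2) = 101; the input is flushed with two zeros to clear the register):

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    # Rate-1/2, constraint-length-3 convolutional encoder
    padded = list(bits) + [0, 0]  # flush bits
    state = [0, 0]                # shift register [d(t-1), d(t-2)]
    out = []
    for b in padded:
        window = [b] + state
        out.append(sum(x * g for x, g in zip(window, g1)) % 2)
        out.append(sum(x * g for x, g in zip(window, g2)) % 2)
        state = [b, state[0]]
    return out

c = conv_encode([1, 0, 0, 1, 1])
print("".join(map(str, c)))  # 11101111010111, i.e. 11 10 11 11 01 01 11
```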
OR

20. Explain CIRC encoding and decoding. (5 x 12 = 60 marks)

ANSWER KEY
1. Entropy H(X) = Σ p_i log2(1/p_i), i = 1 to m. Definition and formula (2 marks); four properties (2 marks).
2. H(X) = 1.8464 bits/symbol. (4 marks)
3. Shannon's theorem states that, given a source of M equally likely messages with M >> 1 generating information at a rate R, and a channel with capacity C, then if R <= C there exists a coding technique such that the output of the source may be transmitted over the channel with an arbitrarily small probability of error. (3 marks) Significance (1 mark)
4. As the bandwidth decreases, the noise power decreases while the required signal power increases. Explanation: refer notes. (4 marks)
5. Definition and one example each of four code types: non-singular codes, uniquely decodable codes, instantaneous codes and optimal codes. (4 marks)
6. Source coding theorem: statement and proof. (2 + 2 marks) (Refer notes)
7. Error detection and correction capabilities: 3 theorems. (4 marks) (Refer notes)
8. Syndrome decoding: C·H^T = 0; syndrome S = R·H^T = e·H^T. Formula derivation, explanation and (n-k)-stage shift-register decoder diagram. (2 + 1 + 1 marks)
9. Block and convolutional interleaving. (2 + 2 marks)
10. Sequential decoding algorithm and explanation. (2 + 2 marks)
11. a. H(X) = p1 log(1/p1) + p2 log(1/p2) + ... + pm log(1/pm) (3 marks)
b. H = p1 log(1/p1) + p2 log(1/p2) + (1 - (p1 + p2)) log(1/(1 - (p1 + p2))) (4 marks)
c. H = Hmax when p1 = p2 = p3 = 1/3. (5 marks)
12. H(X) = 1.295 bits/message, H(Y) = 1.459 bits/message, H(Y/X) = 0.999 bits/message, I(X;Y) = H(Y) - H(Y/X) = 0.459 bits/message. (4+4+4 marks)
13. Channel capacity of a band-limited AWGN channel: C = B log2(1 + S/N). (Refer notes) (12 marks)
14. a. H(X) = 0.9183 bits/message, H(Y) = 0.9799 bits/message, H(Y/X) = 0.8113 bits/message, H(X,Y) = H(X) + H(Y/X) = 1.7296 bits/message, I(X;Y) = H(Y) - H(Y/X) = 0.1686 bits/message. (6 marks)
b. Channel capacity C = 1 - H(Y/X) = 0.1887 bits/message (3 marks); efficiency = I(X;Y)/C = 89.35%. (3 marks)
15. Huffman coding definition, procedure (6 steps) and example. (3 + 4 + 4 marks)
16. Shannon-Fano coding:
x1  0.4   0 0
x2  0.2   0 1
x3  0.2   1 0
x4  0.1   1 1 0
x5  0.07  1 1 1 0
x6  0.03  1 1 1 1
(6 marks)
L = 2.3 bits/message, H(X) = 2.210 bits/message, efficiency = H(X)/L = 96.1%. (2+2+2 marks)
17. Linear block code:
a. [G] = [1 0 1 1; 0 1 1 0], [H] = [1 1 1 0; 1 0 0 1]. Codewords: C1 = 0000, C2 = 0110, C3 = 1011, C4 = 1101. (4+4 marks)
b. dmin = 2. (1 mark)
c. Error-correcting capability t = floor((dmin - 1)/2) = 0, so the code can detect single errors but is incapable of correcting errors. (3 marks)
18. Reed-Solomon code:
a. n = 2^m - 1 = 31, so m = 5 bits per symbol.
b. Block length = 31 x 5 = 155 bits.
c. Minimum distance of the code = n - k + 1 = 31 - 15 + 1 = 17.
d. Correctable symbol errors t = (n - k)/2 = 8. (3+3+3+3 marks)
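The Shannon-Fano result for Q16 can be cross-checked against a Huffman construction; a sketch (not part of the paper) using the standard fact that the average codeword length equals the sum of the merged-node probabilities:

```python
import heapq

def huffman_avg_length(probs):
    # Average Huffman codeword length = sum of probabilities of merged nodes
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

L = huffman_avg_length([0.4, 0.2, 0.2, 0.1, 0.07, 0.03])
print(round(L, 2))  # 2.3 bits/message, matching the Shannon-Fano length here
```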

19. a. Encoder diagram: one input line feeding a two-stage shift register (FF1, FF2) and two modulo-2 adders forming outputs C1 (g(1) = 111) and C2 (g(2) = 101). (Diagram: refer notes.)
Output sequence C = 11 10 11 11 01 01 11.
b. Generator matrix (each row is 11 10 11 shifted right by one output pair per input bit):
[G] =
11 10 11 00 00 00 00
00 11 10 11 00 00 00
00 00 11 10 11 00 00
00 00 00 11 10 11 00
00 00 00 00 11 10 11
c. State diagram: four states 00, 01, 10, 11 with the corresponding input/output transitions. (Diagram: refer notes.)
(4+4+4 marks)
20. CIRC encoding and decoding (refer "Digital Communications: Fundamentals and Applications" by Bernard Sklar). (12 marks)
