ECE ITC Model Paper 1

This document is a model paper for an Information Theory and Coding examination. It has three sections. Section A contains 10 short-answer questions on concepts such as uncertainty, information, mutual information, binary channels, code words, the Kraft inequality, syndromes, Hamming codes, entropy, and block codes. Section B contains long-answer questions (any five to be attempted) on the channel coding theorem, channel matrices, mutual information properties, AWGN channel capacity, Shannon-Fano coding, and cascaded BSCs. Section C contains long-answer questions (any two to be attempted) on Hamming codes, decoding, DMCs, and probabilities for a noisy binary channel.


Printed Pages-04 NEC-031

(Following Paper ID and Roll No. to be filled in your Answer Book)

PAPER ID: Roll No.

B.TECH

(SEM.VII) ODD SEMESTER THEORY EXAMINATION 2018-19


Model Paper 1
Information Theory And Coding
Time: 3 Hours Total Marks: 100

Note: (1) Attempt all questions.
(2) Marks allotted to each question are indicated on the right-hand side.

Section-A

Q1. Attempt all parts of the following. All parts carry equal marks; write the answer to each part briefly. (2 × 10 = 20)

(a) Define uncertainty and explain how it is related to information.
(b) Define information and state its properties.
(c) Explain the term mutual information and state its properties.
(d) What is a binary communication channel? Why is it called memoryless and symmetric?
(e) Define code word, block length, and code rate.
(f) Explain the Kraft inequality.
(g) What is the use of syndromes? Explain syndrome decoding.
(h) What are Hamming codes? Write the properties of Hamming codes.
(i) Define and explain entropy, conditional entropy, and joint entropy.
(j) What are Hadamard and extended block codes?
Section-B
Note: Attempt any five questions from the following. (10 × 5 = 50)

Q2. (i) Explain the channel coding theorem.
(ii) Discuss the importance of the channel coding theorem.
(iii) Discuss the limitations of the channel coding theorem.

Q3. For the binary channel shown in the figure:


(i) Find the channel matrix of the channel.
(ii) Find P(y1) and P(y2) when P(x1) = P(x2) = 0.5.
(iii) Find the joint probabilities P(x1, y2) and P(x2, y1) when P(x1) = P(x2) = 0.5.
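
Since the channel figure is not reproduced here, the following minimal sketch assumes hypothetical transition probabilities P(y1|x1) = 0.9 and P(y2|x2) = 0.8 purely for illustration; the same three steps apply to the matrix read off the actual figure.

    # Sketch for Q3, with hypothetical transition probabilities
    # (the figure is not reproduced): P(y1|x1) = 0.9, P(y2|x2) = 0.8.
    import numpy as np

    # Channel matrix: rows are inputs x1, x2; columns are outputs y1, y2.
    P_y_given_x = np.array([[0.9, 0.1],
                            [0.2, 0.8]])
    P_x = np.array([0.5, 0.5])          # P(x1) = P(x2) = 0.5

    P_y = P_x @ P_y_given_x             # output probabilities P(y1), P(y2)
    P_xy = P_x[:, None] * P_y_given_x   # joint probabilities P(xi, yj)

    print("P(y1), P(y2):", P_y)         # [0.55 0.45]
    print("P(x1, y2):", P_xy[0, 1])     # 0.05
    print("P(x2, y1):", P_xy[1, 0])     # 0.10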

Q4. Explain mutual information and its properties. Also verify the expression H(X, Y) = H(Y) + H(X|Y).

Q5. (i) Show that the channel capacity of an ideal AWGN channel with infinite bandwidth is given by C∞ = (1/ln 2)(S/N) ≅ 1.44 (S/N).
(ii) An AWGN channel has a bandwidth of 4 kHz and noise power spectral density η/2 = 10⁻¹² W/Hz. The signal power required at the receiver is 0.1 mW. Calculate the capacity of this channel.
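
As a quick numeric check on part (ii), the sketch below applies the Shannon-Hartley formula C = B log2(1 + S/N), taking the in-band noise power as N = ηB; all values are the ones given in the question.

    # Numeric check for Q5(ii) using the Shannon-Hartley theorem.
    import math

    B = 4e3            # bandwidth in Hz
    eta = 2e-12        # noise PSD eta, since eta/2 = 1e-12 W/Hz
    S = 0.1e-3         # signal power in W

    N = eta * B        # total noise power in the band: 8e-9 W
    C = B * math.log2(1 + S / N)
    print(f"S/N = {S/N:.3g}, C = {C:.0f} bit/s")  # S/N = 1.25e+04, C = 54439 bit/s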
Q6. Prove that I(X; Y) = H(X) + H(Y) − H(X, Y), where H(X, Y) is the joint entropy.
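
Both the identity in Q4 and the one in Q6 can be sanity-checked numerically before writing the proofs. The sketch below uses an arbitrary illustrative joint distribution (not one given in the paper), with entropies in bits.

    # Numeric check of H(X,Y) = H(Y) + H(X|Y) (Q4) and
    # I(X;Y) = H(X) + H(Y) - H(X,Y) (Q6) on an illustrative joint pmf.
    import math

    # Hypothetical joint distribution P(x, y); rows index x, columns index y.
    P = [[0.3, 0.2],
         [0.1, 0.4]]

    def H(probs):
        """Entropy in bits of a list of probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    Px = [sum(row) for row in P]                             # marginal of X
    Py = [sum(P[i][j] for i in range(2)) for j in range(2)]  # marginal of Y
    Hxy = H([p for row in P for p in row])                   # joint entropy H(X,Y)

    # Compute H(X|Y) directly from the conditional pmfs, so the chain rule
    # is actually being verified rather than assumed:
    Hx_given_y = sum(Py[j] * H([P[i][j] / Py[j] for i in range(2)])
                     for j in range(2))

    print(Hxy, H(Py) + Hx_given_y)      # both ~1.8464: chain rule holds (Q4)
    print(H(Px) + H(Py) - Hxy)          # I(X;Y) ~0.1245, non-negative (Q6)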
Q7. Two BSCs are connected in cascade, as shown in the figure.

(i) Find the channel matrix of the resultant channel.
(ii) Find P(z1) and P(z2) if P(x1) = 0.6 and P(x2) = 0.4.
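
The figure is not reproduced here, so the sketch below assumes each BSC has a hypothetical crossover probability p = 0.1. The key point is that the matrix of the cascade is the product of the individual channel matrices.

    # Sketch for Q7, assuming each BSC has hypothetical crossover
    # probability p = 0.1 (the figure is not reproduced).
    import numpy as np

    p = 0.1
    M = np.array([[1 - p, p],
                  [p, 1 - p]])        # channel matrix of one BSC

    M_cascade = M @ M                 # matrix of the two BSCs in cascade
    P_x = np.array([0.6, 0.4])        # P(x1) = 0.6, P(x2) = 0.4

    P_z = P_x @ M_cascade             # output probabilities P(z1), P(z2)
    print(M_cascade)                  # [[0.82 0.18], [0.18 0.82]]
    print(P_z)                        # [0.564 0.436]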
Q8. (i) Explain the Shannon-Fano algorithm.
(ii) A DMS X has five symbols x1, x2, x3, x4, and x5 with P(x1) = 0.4, P(x2) = 0.19, P(x3) = 0.16, P(x4) = 0.15, and P(x5) = 0.1. Construct a Shannon-Fano code for X and calculate the efficiency of the code.
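
A minimal sketch of the Shannon-Fano construction for part (ii), using the probabilities given in the question; at each step the list is split where the two halves have the most nearly equal total probability.

    # Shannon-Fano construction for Q8(ii) and the code efficiency.
    import math

    probs = {"x1": 0.4, "x2": 0.19, "x3": 0.16, "x4": 0.15, "x5": 0.1}

    def shannon_fano(items, prefix, codes):
        """Recursively split items (sorted by descending probability)
        into two groups of nearly equal total probability."""
        if len(items) == 1:
            codes[items[0][0]] = prefix
            return
        total = sum(p for _, p in items)
        run, split, best = 0.0, 1, float("inf")
        for i in range(1, len(items)):
            run += items[i - 1][1]
            if abs(2 * run - total) < best:
                best, split = abs(2 * run - total), i
        shannon_fano(items[:split], prefix + "0", codes)
        shannon_fano(items[split:], prefix + "1", codes)

    codes = {}
    shannon_fano(sorted(probs.items(), key=lambda kv: -kv[1]), "", codes)

    L = sum(probs[s] * len(c) for s, c in codes.items())   # average code length
    H = -sum(p * math.log2(p) for p in probs.values())     # source entropy
    print(codes)   # x1=00, x2=01, x3=10, x4=110, x5=111
    print(f"L = {L:.2f}, H = {H:.3f}, efficiency = {H/L:.1%}")  # 2.25, 2.150, 95.5%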

Q9. For two independent messages m1 and m2, prove that the total amount of information conveyed is the sum of the information associated with each message individually.

Section-C
Note: Attempt any two questions from this section. (15 × 2 = 30)

Q10. What is Hamming distance? Using the Hamming bound condition, explain Hamming codes.
A parity-check matrix is given below.

(a) Determine the generator matrix G.
(b) Find the code word that begins with 1010.
(c) If the received code word Y is 0111100, decode this received code word.
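
The parity-check matrix from the question did not survive reproduction here, so the sketch below assumes a standard systematic (7,4) Hamming code with G = [I4 | P] and H = [Pᵀ | I3], using a hypothetical P; substitute the matrix actually given in the paper. The steps (derive G, encode, syndrome-decode) are the same either way.

    # Sketch for Q10, assuming a systematic (7,4) Hamming code with a
    # hypothetical parity part P (the paper's H matrix is not reproduced).
    import numpy as np

    P = np.array([[1, 1, 0],
                  [0, 1, 1],
                  [1, 1, 1],
                  [1, 0, 1]])
    G = np.hstack([np.eye(4, dtype=int), P])      # generator matrix [I4 | P]
    H = np.hstack([P.T, np.eye(3, dtype=int)])    # parity-check matrix [P^T | I3]

    m = np.array([1, 0, 1, 0])                    # message beginning 1010
    c = m @ G % 2                                 # codeword
    print("codeword:", c)                         # [1 0 1 0 0 0 1]

    y = np.array([0, 1, 1, 1, 1, 0, 0])           # received word 0111100
    s = H @ y % 2                                 # syndrome
    # For a single-bit error, the syndrome equals the column of H at the
    # error position; find that column and flip the corresponding bit.
    err = next(j for j in range(7) if np.array_equal(H[:, j], s))
    y_corrected = y.copy()
    y_corrected[err] ^= 1
    print("syndrome:", s, "error at bit", err + 1)
    print("decoded message:", y_corrected[:4])    # first 4 bits, since G is systematic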

Q11. Consider the DMC shown in the figure:


(i) Find the output probabilities if P(x1) = 1/2 and P(x2) = P(x3) = 1/4.
(ii) Find the output entropy H(Y).
Q12. A binary source produces 0s and 1s independently with probabilities P(0) = 0.2 and P(1) = 0.8. The binary data is then transmitted over a noisy channel. The probability of correct reception when a "0" is transmitted is 0.9, and the probability of receiving a "0" when a "1" has been transmitted is 0.2.

(a) Find the probability of receiving a "1" when a "0" is transmitted, and the probability of receiving a "1" when a "1" is transmitted.
(b) Find the overall probability of receiving a "0" and of receiving a "1".
(c) If a "1" is received, what is the probability that a "0" was transmitted?
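
A quick numeric check of all three parts using the law of total probability and Bayes' rule, with the values given in the question:

    # Numeric check for Q12.
    p0, p1 = 0.2, 0.8          # source probabilities P(0), P(1)
    p_r0_s0 = 0.9              # P(receive 0 | send 0)
    p_r0_s1 = 0.2              # P(receive 0 | send 1)

    p_r1_s0 = 1 - p_r0_s0      # (a) P(receive 1 | send 0) = 0.1
    p_r1_s1 = 1 - p_r0_s1      #     P(receive 1 | send 1) = 0.8

    p_r0 = p0 * p_r0_s0 + p1 * p_r0_s1   # (b) P(receive 0) = 0.34
    p_r1 = 1 - p_r0                      #     P(receive 1) = 0.66

    p_s0_r1 = p0 * p_r1_s0 / p_r1        # (c) Bayes: P(send 0 | receive 1)
    print(p_r1_s0, p_r1_s1, p_r0, p_r1, round(p_s0_r1, 4))  # ... 0.0303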
