
AE73 INFORMATION THEORY & CODING DEC 2014

Q.2 a. Show that if events A and B are independent, then


P(A ∩ B) = P(A) P(B)
Answer:
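A minimal derivation sketch, assuming independence is characterized by P(A | B) = P(A) with P(B) > 0:

```latex
\begin{align*}
P(A \cap B) &= P(A \mid B)\, P(B) && \text{(definition of conditional probability)} \\
            &= P(A)\, P(B)        && \text{(independence: } P(A \mid B) = P(A)\text{)}
\end{align*}
```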

b. In a binary communication system (Fig.1), a 0 or 1 is transmitted. Because of channel noise, a 0 can be received as 1 and vice versa. Let m0 and m1 denote the events of transmitting 0 and 1, respectively. Let r0 and r1 denote the events of receiving 0 and 1, respectively. Let P(m0) = 0.5, P(r1 | m0) = p = 0.1, and P(r0 | m1) = q = 0.2.

(i) Find P(r0) and P(r1).
(ii) If a 0 was received, what is the probability that a 0 was sent?
(iii) If a 1 was received, what is the probability that a 1 was sent?
(iv) Calculate the probability of error Pe.
(v) Calculate the probability that the transmitted signal is correctly read at the receiver.

Answer:
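A numerical check of parts (i)-(v) by total probability and Bayes' rule (an illustrative sketch, not the official IETE solution):

```python
# Sketch: numerical check of Q.2(b) using total probability and Bayes' rule.
P_m0, p, q = 0.5, 0.1, 0.2          # P(m0), P(r1|m0), P(r0|m1) from the question
P_m1 = 1 - P_m0

# (i) total probability
P_r0 = (1 - p) * P_m0 + q * P_m1    # 0.9*0.5 + 0.2*0.5 = 0.55
P_r1 = 1 - P_r0                     # 0.45

# (ii), (iii) Bayes' rule
P_m0_given_r0 = (1 - p) * P_m0 / P_r0   # ≈ 0.818
P_m1_given_r1 = (1 - q) * P_m1 / P_r1   # ≈ 0.889

# (iv), (v) error and correct-decision probabilities
P_e = p * P_m0 + q * P_m1               # 0.05 + 0.10 = 0.15
P_c = 1 - P_e                           # 0.85

print(P_r0, P_r1, P_m0_given_r0, P_m1_given_r1, P_e, P_c)
```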


Q.3 a. Let X and Y be defined by


X = cos θ and Y = sin θ
where θ is a random variable uniformly distributed over [0, 2π].
(i) Show that X and Y are uncorrelated.
(ii) Show that X and Y are not independent.

Answer:
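A sketch of both parts:

```latex
% (i) Uncorrelated: with \theta uniform on [0, 2\pi],
%     E[X] = E[\cos\theta] = 0 and E[Y] = E[\sin\theta] = 0, while
\mathbb{E}[XY] = \frac{1}{2\pi}\int_0^{2\pi}\cos\theta\,\sin\theta\,d\theta
               = \frac{1}{4\pi}\int_0^{2\pi}\sin 2\theta\,d\theta = 0,
% so Cov(X, Y) = E[XY] - E[X]E[Y] = 0.
%
% (ii) Not independent: X^2 + Y^2 = 1 always, so knowing X fixes |Y|.
%     A concrete check: E[X^2 Y^2] = E[\tfrac14\sin^2 2\theta] = 1/8,
%     whereas E[X^2]\,E[Y^2] = (1/2)(1/2) = 1/4 \neq 1/8.
```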


b. Consider a random process Y(t) defined by
Y(t) = ∫₀ᵗ X(τ) dτ,
where X(t) = A cos ωt, ω is a constant, and A ~ N(0, σ²).
(i) Determine the pdf of Y(t) at t = t_k.
(ii) Is Y(t) WSS?

Answer:
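A sketch of the derivation:

```latex
% Carrying out the integral:
Y(t) = \int_0^t A\cos\omega\tau\, d\tau = \frac{A}{\omega}\sin\omega t .
% A linear function of a Gaussian is Gaussian, so at t = t_k
% (assuming \sin\omega t_k \neq 0; otherwise Y(t_k) = 0 with probability 1):
Y(t_k) \sim N\!\left(0,\; \frac{\sigma^2\sin^2\omega t_k}{\omega^2}\right),
\qquad
f_{Y(t_k)}(y) = \frac{\omega}{\sqrt{2\pi}\,\sigma\,|\sin\omega t_k|}
  \exp\!\left(-\frac{\omega^2 y^2}{2\sigma^2\sin^2\omega t_k}\right).
% (ii) The mean is constant (zero) but the variance depends on t,
%      so Y(t) is not WSS.
```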

Q.4 a. Explain the average information content of symbols in long dependent sequences.

Answer: See Article 4.2.3, Page 145, of Text Book-I.
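For quick reference, a standard result (the cited article gives the full treatment):

```latex
% Average information per symbol of a stationary source with
% dependent symbols (the entropy rate):
H = \lim_{N \to \infty} \frac{1}{N}\, H(X_1, X_2, \ldots, X_N)
  \;\le\; H(X_1),
% with equality only for independent symbols; dependence between
% symbols reduces the average information content per symbol.
```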

b. Calculate the conditional entropy of an M-ary discrete memoryless channel.

Answer: Refer to Pages 165-166 of Text Book-I.
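For quick reference, the quantity to be calculated (notation assumed; the cited pages give the derivation):

```latex
% Conditional entropy (equivocation) of an M-ary DMC:
H(X \mid Y) = -\sum_{i=1}^{M}\sum_{j=1}^{M} P(x_i, y_j)\,\log_2 P(x_i \mid y_j)
\;\;\text{bits/symbol},
% the average uncertainty about the channel input X that remains
% after the channel output Y has been observed.
```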


Q.5 a. A message source generates one of four messages randomly every microsecond. The probabilities of these messages are 0.4, 0.3, 0.2, and 0.1. Each emitted message is independent of the other messages in the sequence.
(i) What is the source entropy?
(ii) What is the rate of information generated by this source (in bits per second)?

Answer:
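A numerical sketch, assuming the rate of one message per microsecond stated in the question:

```python
from math import log2

# Sketch: source entropy and information rate for Q.5(a).
probs = [0.4, 0.3, 0.2, 0.1]
H = -sum(p * log2(p) for p in probs)   # (i) ≈ 1.846 bits/message
r = 1e6                                 # messages per second (1 per microsecond)
R = r * H                               # (ii) ≈ 1.846 Mbps
print(f"H = {H:.3f} bits/message, R = {R/1e6:.3f} Mbps")
```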

b. An analog signal having 4-kHz bandwidth is sampled at 1.25 times the Nyquist
rate, and each sample is quantized into one of 256 equally likely levels.
Assume that the successive samples are statistically independent.
(i) What is the information rate of this source?
(ii) Can the output of this source be transmitted without error over an AWGN channel
with a bandwidth of 10 kHz and S/N ratio of 20 dB?
(iii) Find the S/N ratio required for error-free transmission in part (ii).

Answer:
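A numerical sketch of the three parts (20 dB is taken as a power ratio of 100):

```python
from math import log2, log10

# Sketch: Q.5(b) numbers.
W = 4e3                          # signal bandwidth, Hz
fs = 1.25 * 2 * W                # 1.25 x Nyquist rate = 10 kHz sampling rate
R = fs * log2(256)               # (i) 10e3 * 8 bits/sample = 80 kbps

B, snr = 10e3, 100               # AWGN channel: 10 kHz, 20 dB
C = B * log2(1 + snr)            # ≈ 66.6 kbps (Shannon-Hartley)
print(R <= C)                    # (ii) False -> error-free transmission impossible

snr_req = 2 ** (R / B) - 1       # (iii) solve R = B log2(1 + S/N): S/N = 255
print(f"{10 * log10(snr_req):.1f} dB")   # ≈ 24.1 dB
```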


Q.6 a. Determine the capacity of a channel of infinite bandwidth.

Answer:
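A sketch of the standard limiting argument from the Shannon-Hartley formula:

```latex
% With two-sided noise PSD \eta/2, the noise power over bandwidth B
% is N = \eta B, so
C = B \log_2\!\left(1 + \frac{S}{\eta B}\right).
% Substituting x = S/(\eta B), which tends to 0 as B \to \infty:
C_\infty = \lim_{B \to \infty} C
         = \frac{S}{\eta}\,\lim_{x \to 0}\frac{\log_2(1+x)}{x}
         = \frac{S}{\eta}\log_2 e \;\approx\; 1.44\,\frac{S}{\eta}.
% Capacity remains finite even as the bandwidth grows without bound.
```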

b. Consider a binary memoryless source X with two symbols x1 and x2. Show that H(X) is maximum when both x1 and x2 are equiprobable.

Answer:
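A sketch of the maximization:

```latex
% Let P(x_1) = p and P(x_2) = 1 - p. Then
H(X) = -p\log_2 p - (1-p)\log_2(1-p),
\qquad
\frac{dH}{dp} = \log_2\frac{1-p}{p} = 0 \;\Rightarrow\; p = \tfrac12 .
% The second derivative, -1/\bigl(p(1-p)\ln 2\bigr), is negative,
% so p = 1/2 is a maximum, giving H_{max}(X) = 1 bit/symbol
% for equiprobable symbols.
```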


Q.7 a. Consider an AWGN channel with 4-kHz bandwidth and noise power spectral density η/2 = 10⁻¹² W/Hz. The signal power required at the receiver is 0.1 mW. Calculate the capacity of this channel.

Answer:
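A numerical sketch, taking the noise power as N = ηB for the stated two-sided PSD η/2:

```python
from math import log2

# Sketch: Q.7(a) channel capacity.
B = 4e3                      # bandwidth, Hz
eta = 2e-12                  # W/Hz, since eta/2 = 1e-12 W/Hz
S = 0.1e-3                   # received signal power, W

N = eta * B                  # noise power = 8e-9 W
C = B * log2(1 + S / N)      # Shannon-Hartley
print(f"S/N = {S/N:.3g}, C ≈ {C/1e3:.1f} kbps")   # S/N = 1.25e4, C ≈ 54.4 kbps
```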

b. Draw the bandwidth-efficiency diagram and explain the observations that follow from it.

Answer: Refer Pages 48-49 of Text Book-II


Q.8 a. For a (6, 3) systematic linear block code, the three parity-check bits c4, c5, and c6 are formed from the following equations:
c4 = d1 ⊕ d3
c5 = d1 ⊕ d2 ⊕ d3
c6 = d1 ⊕ d2
(i) Write down the generator matrix G.
(ii) Construct all possible code words.
(iii) Suppose that the received word is 010111. Decode this received word by finding the location of the error and the transmitted data bits.

Answer:
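A numerical sketch of all three parts using NumPy. The layout G = [I | P] follows from the systematic form stated in the question; the decoding loop assumes at most a single error:

```python
import numpy as np

# Sketch: Q.8(a). Parity equations: c4 = d1+d3, c5 = d1+d2+d3, c6 = d1+d2 (mod 2).
P = np.array([[1, 1, 1],   # parity contributions of d1 to (c4, c5, c6)
              [0, 1, 1],   # parity contributions of d2
              [1, 1, 0]])  # parity contributions of d3
G = np.hstack([np.eye(3, dtype=int), P])    # (i) G = [I | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix H = [P^T | I]

# (ii) all 8 codewords
for i in range(8):
    d = np.array([(i >> 2) & 1, (i >> 1) & 1, i & 1])
    print(d, (d @ G) % 2)

# (iii) syndrome decoding of r = 010111
r = np.array([0, 1, 0, 1, 1, 1])
s = (H @ r) % 2                    # syndrome = [1, 0, 0]
for j in range(6):                 # single-error assumption: syndrome
    if np.array_equal(s, H[:, j]): # matches the H column at the error position
        r[j] ^= 1                  # flip the erroneous bit (here: c4)
print("syndrome:", s, "corrected:", r, "data:", r[:3])   # data = 010
```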


b. Given the generator matrix G = [1 1 1 1 1], construct a (5, 1) code. How many errors can this code correct? Find the codewords for the data vectors d = 0 and d = 1. Comment on the result.

Answer:
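A sketch of the result:

```latex
% The two codewords are c = d \cdot G:
d = 0 \;\Rightarrow\; c = 00000, \qquad d = 1 \;\Rightarrow\; c = 11111,
% i.e. the length-5 repetition code with d_{min} = 5, which corrects
t = \left\lfloor \frac{d_{min} - 1}{2} \right\rfloor = 2 \;\text{errors.}
% Comment: the rate-1/5 repetition code buys 2-error correction at
% the cost of a five-fold bandwidth expansion.
```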

Q.9 a. Draw the state diagram, tree diagram, and trellis diagram for the K = 3, rate-1/3 code generated by
g1(X) = X + X²
g2(X) = 1 + X
g3(X) = 1 + X + X²

Answer:
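The diagrams themselves must be drawn by hand, but the branch labels can be tabulated first. A sketch that enumerates every trellis transition, assuming the common convention that the encoder state is the two most recent input bits:

```python
# Sketch: branch table for the K=3, rate-1/3 convolutional code.
# Generator coefficients ordered as (1, X, X^2), i.e. (current, delay-1, delay-2).
GENS = [(0, 1, 1),   # g1(X) = X + X^2
        (1, 1, 0),   # g2(X) = 1 + X
        (1, 1, 1)]   # g3(X) = 1 + X + X^2

def step(state, u):
    """Return (next_state, output bits) for input bit u from 2-bit state."""
    reg = (u,) + state                     # [u, s1, s2] aligns with (1, X, X^2)
    out = tuple(sum(g * b for g, b in zip(gen, reg)) % 2 for gen in GENS)
    return (u, state[0]), out              # shift the register by one bit

# Enumerate every trellis branch: 4 states x 2 inputs = 8 transitions.
for s in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for u in (0, 1):
        ns, out = step(s, u)
        print(f"state {s} --input {u}/output {out}--> {ns}")
```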


b. Factorize the polynomial x³ + x² + x + 1.


Answer:
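A sketch of the factorization, assuming (as is usual in coding problems) arithmetic over GF(2); the real-coefficient factorization is noted as well:

```latex
% Factor by grouping:
x^3 + x^2 + x + 1 = x^2(x + 1) + (x + 1) = (x + 1)(x^2 + 1).
% Over GF(2), x^2 + 1 = (x + 1)^2, so the complete factorization is
x^3 + x^2 + x + 1 = (x + 1)^3 .
% Over the reals the factorization stops at (x + 1)(x^2 + 1),
% since x^2 + 1 is then irreducible.
```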

TEXT BOOKS

I Digital and Analog Communication Systems by K. Sam Shanmugam, John Wiley India
Edition, 2007 reprint.

II Digital Communications by Simon Haykin, John Wiley & Sons, Student Edition.
