Itc

This document contains 8 questions related to information theory and coding. The questions cover topics such as: 1) Deriving the expression for channel capacity and finding the capacity of a binary symmetric channel 2) Calculating entropies for a discrete source transmitting messages through a channel 3) Applying Shannon-Fano coding and finding efficiency and redundancy 4) Drawing channel matrices and calculating information and entropy 5) Describing syndrome testing, error detection, and error correction 6) Finding convolutional code codewords and cyclic code codewords 7) Explaining how Reed-Solomon codes perform well in bursty noise and writing short notes on trellis codes 8) Writing short notes on rate distortion theory for Gaussian sources with memory.

Uploaded by

sushant sahoo

1. A) Describe the Shannon theorem for channel capacity. Derive the expression for channel capacity.
B) Find the channel capacity for probability p = 0.6 for the given binary symmetric channel.
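As a quick numerical check for part B: the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p), where H is the binary entropy function. A minimal sketch:

```python
from math import log2

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.6))  # ~0.029 bits per channel use
```

Note that the capacity is symmetric in p, so p = 0.6 gives the same value as p = 0.4, and the capacity vanishes at p = 0.5.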

2. A) A discrete source transmits messages X1, X2 and X3 with probabilities 0.3, 0.4 and 0.3. The source is connected to the channel shown in the given figure. Calculate all the entropies.

OR
B) Explain mutual information and prove that the average mutual information is
I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
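The channel figure for part A is not reproduced here, so the sketch below assumes a hypothetical transition matrix P(Y|X) purely for illustration; it shows how the source entropies and mutual information in part B are computed from P(X) and P(Y|X):

```python
from math import log2

px = [0.3, 0.4, 0.3]            # P(X) from the problem statement
pyx = [[0.8, 0.2, 0.0],         # assumed P(Y|X), one row per Xi (hypothetical)
       [0.1, 0.8, 0.1],
       [0.0, 0.2, 0.8]]

# Joint distribution P(X,Y) and output marginal P(Y)
pxy = [[px[i] * pyx[i][j] for j in range(3)] for i in range(3)]
py = [sum(pxy[i][j] for i in range(3)) for j in range(3)]

def H(dist):
    # Shannon entropy in bits, skipping zero-probability terms
    return -sum(p * log2(p) for p in dist if p > 0)

HX, HY = H(px), H(py)
HXY = H([p for row in pxy for p in row])   # joint entropy H(X,Y)
HY_given_X = HXY - HX                      # conditional entropy H(Y|X)
HX_given_Y = HXY - HY                      # conditional entropy H(X|Y)
I = HX - HX_given_Y                        # mutual information I(X;Y)
print(HX, HY, HXY, I)
```

With these assumed numbers, both forms of the identity, H(X) - H(X|Y) and H(Y) - H(Y|X), give the same value of I(X; Y).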

3. A) Apply the Shannon-Fano coding procedure for the following message ensemble.
[X] = [x1 x2 x3 x4 x5 x6 x7 x8]
[P] = [1/16 1/16 1/16 1/16 1/8 1/8 1/4 1/4]
Find the efficiency and redundancy.
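The Shannon-Fano procedure (sort by descending probability, split into two nearly equiprobable halves, assign 0/1, recurse) can be sketched as follows; since all the probabilities here are powers of 1/2, the average code length equals the entropy, so the efficiency is 100% and the redundancy is zero:

```python
from math import log2

def shannon_fano(symbols):
    # symbols: list of (name, prob) sorted by descending probability
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # find the split where the two halves are closest to equal probability
    acc, best_i, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(2 * acc - total)
        if diff < best_diff:
            best_diff, best_i = diff, i
    codes = {}
    for s, c in shannon_fano(symbols[:best_i]).items():
        codes[s] = "0" + c
    for s, c in shannon_fano(symbols[best_i:]).items():
        codes[s] = "1" + c
    return codes

probs = {"x1": 1/16, "x2": 1/16, "x3": 1/16, "x4": 1/16,
         "x5": 1/8, "x6": 1/8, "x7": 1/4, "x8": 1/4}
ordered = sorted(probs.items(), key=lambda kv: -kv[1])
codes = shannon_fano(ordered)

H = -sum(p * log2(p) for p in probs.values())        # source entropy (bits)
L = sum(probs[s] * len(c) for s, c in codes.items()) # average code length
print(codes)
print(H, L, H / L)   # H = L = 2.75, efficiency = 1.0, redundancy = 0
```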
B) Explain entropy. Find the maximum entropy for a source with M equally probable symbols.
4. A) Draw the channel matrix for a lossless channel, a deterministic channel, a noiseless channel and a binary symmetric channel.

B) For the given probabilities P(Xi) = , , , . Calculate the amount of information of each message and the entropy.
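The probability values did not survive the scan, so the sketch below uses a hypothetical set P(Xi) = 1/2, 1/4, 1/8, 1/8 purely to illustrate the two quantities asked for: the self-information I(xi) = -log2 P(xi) of each message and the entropy H = sum of P(xi) I(xi):

```python
from math import log2

# Assumed probabilities for illustration only; not from the original paper.
probs = [1/2, 1/4, 1/8, 1/8]

info = [-log2(p) for p in probs]               # self-information, in bits
entropy = sum(p * i for p, i in zip(probs, info))
print(info)     # [1.0, 2.0, 3.0, 3.0]
print(entropy)  # 1.75 bits/message
```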
5. A) Describe the process of syndrome testing, error detection and error correction.
B) Explain Hamming distance and the Hamming code for error detection.
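Syndrome testing for a (7,4) Hamming code can be sketched as follows: the syndrome s = H·r^T (mod 2) is all-zero for a valid received word, and for a single-bit error it equals the column of H at the error position, which both detects and locates the error. The parity-check matrix below is one common systematic choice (codeword layout [d1 d2 d3 d4 p1 p2 p3] is an assumption for illustration):

```python
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def syndrome(r):
    # s = H r^T over GF(2); (0,0,0) means no error detected
    return tuple(sum(h * b for h, b in zip(row, r)) % 2 for row in H)

def correct(r):
    s = syndrome(r)
    if s == (0, 0, 0):
        return list(r)
    for i in range(7):                 # locate the column of H equal to s
        if tuple(row[i] for row in H) == s:
            return [b ^ (j == i) for j, b in enumerate(r)]
    return list(r)                     # not a single-bit error pattern

codeword = [1, 0, 1, 1, 0, 1, 0]       # valid codeword for message 1011
received = codeword[:]
received[2] ^= 1                       # inject a single-bit error
print(syndrome(received))              # nonzero syndrome flags the error
print(correct(received) == codeword)   # True: error located and corrected
```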

6. A) The encoder for a convolutional code is shown below:
Find all the codewords for a 4-bit input data.
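The encoder figure is not reproduced here, so the sketch below assumes a common textbook encoder: rate 1/2, constraint length 3, with generator sequences (111) and (101). It enumerates the codewords for all 16 possible 4-bit inputs:

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    # Two-stage shift register, initially zero; two output bits per input bit.
    state = [0, 0]
    out = []
    for b in bits:
        window = (b, state[0], state[1])
        out.append(sum(g * w for g, w in zip(g1, window)) % 2)
        out.append(sum(g * w for g, w in zip(g2, window)) % 2)
        state = [b, state[0]]
    return out

# All 16 codewords for 4-bit inputs (tail bits to flush the register omitted):
for m in range(16):
    bits = [(m >> k) & 1 for k in (3, 2, 1, 0)]   # MSB-first input word
    print(bits, conv_encode(bits))
```

The response to input 1000 interleaves the two generator sequences, which is a handy sanity check on the register wiring.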


B) The generator polynomial of a (7,4) cyclic code is g(x) = 1 + x + x^3. Find the 16 codewords of this code. Consider a message vector D = 0101.
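Non-systematic cyclic encoding computes c(x) = d(x)·g(x) over GF(2). The sketch below enumerates all 16 codewords for g(x) = 1 + x + x^3 (coefficient lists are lowest-degree first; reading D = 0101 as d0..d3 = 0,1,0,1 is an interpretation assumption):

```python
def polymul_gf2(a, b):
    # Multiply two GF(2) polynomials given as coefficient lists.
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

g = [1, 1, 0, 1]                        # g(x) = 1 + x + x^3

codewords = {}
for m in range(16):
    d = [(m >> k) & 1 for k in range(4)]    # message bits d0..d3
    c = polymul_gf2(d, g)
    c += [0] * (7 - len(c))                 # pad to code length n = 7
    codewords[tuple(d)] = c

print(codewords[(0, 1, 0, 1)])   # codeword for D = 0101
print(len(set(map(tuple, codewords.values()))))  # 16 distinct codewords
```

For D = 0101, d(x) = x + x^3 and c(x) = (x + x^3)(1 + x + x^3) = x + x^2 + x^3 + x^6, i.e. the coefficient vector 0111001.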

7. A) Explain why Reed-Solomon codes perform so well in a bursty-noise environment.
B) Write short notes on trellis codes.
8. A) Write short notes on rate distortion theory for a Gaussian source with memory.
B) How do we describe the performance analysis of a coding system? Give an example.
