This assignment contains five problems: joint and conditional entropy and mutual information for a joint probability distribution (Problem 1), output probabilities and entropy of a discrete memoryless channel (Problem 2), frequency translation of a bandpass signal (Problem 3), and Huffman coding of discrete memoryless sources (Problems 4 and 5).

Uploaded by Deeksha Mishra

PDCS-ASSIGNMENT 1

1) The following table shows the joint probability distribution of two random
variables X and Y, with X taking values x1, x2 and Y taking values y1, y2, y3.
Calculate H(X), H(Y), H(X,Y), H(X|Y), H(Y|X) and I(X;Y).

        y1    y2    y3
  x1   1/2   1/8   1/8
  x2   1/8   1/8    0
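As a numerical check, the requested quantities follow directly from the joint table; a minimal Python sketch (the symbol labels x1, x2, y1, y2, y3 are generic placeholders):

```python
import math

# Joint pmf from the table (rows = values of X, columns = values of Y)
p_xy = [[1/2, 1/8, 1/8],
        [1/8, 1/8, 0.0]]

def H(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = [sum(row) for row in p_xy]          # marginal of X
p_y = [sum(col) for col in zip(*p_xy)]    # marginal of Y
flat = [p for row in p_xy for p in row]   # all joint entries

H_X, H_Y, H_XY = H(p_x), H(p_y), H(flat)
H_X_given_Y = H_XY - H_Y                  # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X                  # chain rule: H(Y|X) = H(X,Y) - H(X)
I_XY = H_X + H_Y - H_XY                   # mutual information
```

The chain rule turns the joint entropy into both conditional entropies without any further summation over the table.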

2) Consider the DMC shown in the figure below, with inputs x1, x2, x3 and outputs
y1, y2, y3. The transition probabilities p(yj | xi), read row-by-row from the
figure, are:

        y1    y2    y3
  x1   1/3   1/3   1/3
  x2   1/4   1/2   1/4
  x3   1/4   1/4   1/2

i) Find the output probabilities if p(x1) = 1/2 and p(x2) = p(x3) = 1/4.
ii) Find the output entropy H(Y).
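Assuming the transition probabilities read row-by-row from the figure are (1/3, 1/3, 1/3), (1/4, 1/2, 1/4) and (1/4, 1/4, 1/2), both parts can be checked with a short sketch:

```python
import math

# Input distribution and transition matrix p(y_j | x_i), as assumed from the figure
p_x = [1/2, 1/4, 1/4]
P = [[1/3, 1/3, 1/3],
     [1/4, 1/2, 1/4],
     [1/4, 1/4, 1/2]]

# Part (i): output probabilities p(y_j) = sum_i p(x_i) * p(y_j | x_i)
p_y = [sum(p_x[i] * P[i][j] for i in range(3)) for j in range(3)]

# Part (ii): output entropy H(Y) in bits
H_Y = -sum(p * math.log2(p) for p in p_y if p > 0)
```

Each row of P sums to 1, and the resulting p_y must also sum to 1, which is a quick sanity check on the matrix read from the figure.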

3) A bandpass signal x(t) = Re[x_l(t) e^{j2πf0 t}] has centre frequency f0 and
bandwidth B << f0. It is desired to mix the signal up to a frequency f1 >> f0 and
generate the signal y(t) = Re[x_l(t) e^{j2πf1 t}].
Hint: First generate the lowpass complex representation of x(t), (x_c(t), x_s(t)),
and then generate y(t) = x_c(t)cos(2πf1 t) - x_s(t)sin(2πf1 t).
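The hint rests on the identity Re[(x_c + j x_s) e^{j2πf1 t}] = x_c cos(2πf1 t) - x_s sin(2πf1 t). A toy numerical check of that identity (the envelope and the value of f1 below are made-up, purely illustrative choices):

```python
import math
import cmath

f1 = 100.0  # assumed target centre frequency, illustrative only

def x_l(t):
    """Toy lowpass complex envelope x_c(t) + j*x_s(t) (hypothetical example)."""
    return math.cos(2 * math.pi * 2 * t) + 1j * math.sin(2 * math.pi * 3 * t)

def y_direct(t):
    """y(t) as the real part of the complex-envelope form."""
    return (x_l(t) * cmath.exp(1j * 2 * math.pi * f1 * t)).real

def y_quadrature(t):
    """y(t) built from the in-phase/quadrature form in the hint."""
    xc, xs = x_l(t).real, x_l(t).imag
    return xc * math.cos(2 * math.pi * f1 * t) - xs * math.sin(2 * math.pi * f1 * t)
```

The two functions agree at every t, which is exactly why the hinted two-step construction reproduces the desired upconverted signal.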

4) A DMS has three output symbols with probabilities {0.5, 0.4, 0.1}.
i) Determine the Huffman code for this source and find the efficiency η.
ii) Determine the Huffman code for this source taking two symbols at a time and
find the efficiency η.
iii) Determine the Huffman code for this source taking three symbols at a time
and find the efficiency η.
Compare the above three efficiencies and comment.
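One way to check part (i): a generic Huffman builder over the given probabilities (the symbol names a, b, c are placeholders, not from the assignment):

```python
import heapq
import itertools
import math

def huffman(probs):
    """Return a binary prefix code {symbol: bitstring} for a dict of probabilities."""
    counter = itertools.count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable groups
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c0.items()}
        merged.update({s: "1" + c for s, c in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.4, "c": 0.1}
code = huffman(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)       # average codeword length
H = -sum(p * math.log2(p) for p in probs.values())          # source entropy, bits
efficiency = H / avg_len                                    # η = H / L
```

Parts (ii) and (iii) amount to calling the same builder on the product source, i.e. on all symbol pairs (probabilities multiplied) and triples, then dividing the average length by 2 or 3 before forming η.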

5) Consider a DMS with source probabilities {0.20, 0.20, 0.15, 0.15, 0.10, 0.10,
0.05, 0.05}.
i) Determine an efficient fixed-length code for the source.
ii) Determine the Huffman code for this source.
iii) Compare the two codes and comment.
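For part (i), eight symbols need ceil(log2 8) = 3 bits each under a fixed-length code; a quick sketch comparing that rate to the source entropy, the lower bound any code must respect:

```python
import math

# Source probabilities from Problem 5
probs = [0.20, 0.20, 0.15, 0.15, 0.10, 0.10, 0.05, 0.05]

# Fixed-length code: 8 symbols need ceil(log2 8) = 3 bits per symbol
fixed_len = math.ceil(math.log2(len(probs)))

# Source entropy in bits, the lower bound on any code's average length
H = -sum(p * math.log2(p) for p in probs)

fixed_efficiency = H / fixed_len
```

The gap between H and 3 bits is what the Huffman code in part (ii) recovers by assigning shorter words to the 0.20 symbols and longer ones to the 0.05 symbols.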
