
J. N. N. College of Engineering, Shimoga 577204


Department of Electronics & Communication Engineering

Semester: V    Scheme: OBE & CBCS 2022    Date: 22-11-2024


Subject: Digital Communication (BEC503)
Faculty: Dr. Manjunatha P & Mr. Prashanth G S

Information Theory Question Bank

1. Define self-information. Why is a logarithmic expression chosen for measuring information?

2. A discrete source emits one of four symbols S = {S0, S1, S2, S3} with probabilities P =
{1/3, 1/6, 1/4, 1/4} respectively. Successive symbols emitted by the source are statistically
independent. Calculate i) the entropy H(S), ii) the maximum entropy H(S)max, and iii) the
information content of each symbol.
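
As a quick numerical check for this question (and Q4, by swapping in its probabilities), a minimal sketch in plain Python; the probabilities are copied from the question, while the symbol labels and printout format are only illustrative:

    from math import log2

    P = [1/3, 1/6, 1/4, 1/4]   # P(S0)..P(S3), statistically independent symbols

    # Self-information of each symbol: I(Sk) = log2(1/pk) bits
    I = [log2(1/p) for p in P]

    # Source entropy: H(S) = sum over k of pk * log2(1/pk)
    H = sum(p * log2(1/p) for p in P)

    # Maximum entropy, attained for equiprobable symbols: H(S)max = log2(n)
    H_max = log2(len(P))

    print(f"H(S)    = {H:.4f} bits/symbol")      # ~1.9591
    print(f"H(S)max = {H_max:.4f} bits/symbol")  # 2.0
    for k, (p, i) in enumerate(zip(P, I)):
        print(f"I(S{k}) = {i:.4f} bits")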

3. Define the following:

(i) Entropy
(ii) Information rate
(iii) Self-information

4. A DMS emits symbols from the source alphabet S = {S1, S2, S3, S4, S5, S6, S7} with probabilities
P = {0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625}. Compute i) the entropy H(S), ii) the
maximum entropy H(S)max, and iii) the information content of each symbol.

5. A zero-memory source has a source alphabet S = {s1, s2, s3} with respective probabilities P =
{1/2, 1/4, 1/4}. Calculate i) the entropy of the source, and ii) all symbols and the corresponding
probabilities of the second-order extension of the source. Find the entropy of the second-order
extension and show that H(S^2) = 2H(S).
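
A sketch of the same check in Python, assuming an independent (zero-memory) source so that the pair probabilities factor as P(si sj) = P(si)P(sj); the same script verifies Q7 if its probabilities are substituted:

    from math import log2
    from itertools import product

    P = {"s1": 1/2, "s2": 1/4, "s3": 1/4}

    def entropy(probs):
        # H = sum p * log2(1/p), skipping zero-probability entries
        return sum(p * log2(1/p) for p in probs if p > 0)

    # Second-order extension: all 9 ordered pairs, P(si sj) = P(si) * P(sj)
    P2 = {a + b: P[a] * P[b] for a, b in product(P, repeat=2)}

    H1, H2 = entropy(P.values()), entropy(P2.values())
    print(f"H(S)   = {H1:.4f} bits/symbol")  # 1.5
    print(f"H(S^2) = {H2:.4f} bits/pair")    # 3.0, i.e. H(S^2) = 2 H(S)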

6. A binary source emits an independent sequence of '0's and '1's with probabilities p and 1 − p
respectively. Plot the entropy of the source versus p.
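
A plotting sketch, assuming numpy and matplotlib are available; it evaluates the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p) on a grid, giving the familiar curve that peaks at 1 bit for p = 1/2:

    import numpy as np
    import matplotlib.pyplot as plt

    p = np.linspace(1e-6, 1 - 1e-6, 500)   # open interval: avoids log2(0)
    H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    plt.plot(p, H)
    plt.xlabel("p (probability of '0')")
    plt.ylabel("H(p) [bits/symbol]")
    plt.title("Binary entropy function: maximum of 1 bit at p = 1/2")
    plt.grid(True)
    plt.show()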

7. Consider a zero-memory source emitting three symbols x, y, z with respective probabilities P =
{0.6, 0.3, 0.1}. Calculate i) the entropy of the source, and ii) all symbols and the corresponding
probabilities of the second-order extension of the source. Find the entropy of the second-order
extension and show that H(S^2) = 2H(S).

8. A discrete memoryless source (DMS) has an alphabet X = {x1, x2, x3, x4, x5, x6} with symbol
probabilities P = {0.3, 0.25, 0.2, 0.12, 0.08, 0.05}. Construct the binary Huffman code. Also find
the code efficiency and the redundancy of the coding.

9. Construct a binary code by applying the Huffman coding procedure to the message ensemble X =
{x1, x2, x3, x4, x5, x6} with respective probabilities P = {0.4, 0.2, 0.2, 0.1, 0.07, 0.03}. Also
determine the code efficiency and the redundancy of the code.
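
For checking the binary Huffman questions (Q8, Q9 and Q11(i)), a minimal heap-based sketch in Python, shown here with the Q9 probabilities. The tie-break counter makes later-created (combined) nodes pop last among equals, which approximates the "combined symbols as high as possible" rule, though a hand construction may still resolve ties differently:

    import heapq
    from math import log2

    def huffman(probs):
        """probs: {symbol: probability} -> {symbol: binary codeword}."""
        # Heap entries: (probability, tie-break counter, {symbol: partial code})
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # two least-probable nodes
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, count, merged))
            count += 1                        # combined nodes rank later on ties
        return heap[0][2]

    P = {"x1": 0.4, "x2": 0.2, "x3": 0.2, "x4": 0.1, "x5": 0.07, "x6": 0.03}
    code = huffman(P)

    H    = sum(p * log2(1/p) for p in P.values())   # source entropy
    Lbar = sum(P[s] * len(code[s]) for s in P)      # average codeword length
    var  = sum(P[s] * (len(code[s]) - Lbar) ** 2 for s in P)

    print(code)
    print(f"efficiency = {H/Lbar:.4f}, redundancy = {1 - H/Lbar:.4f}, "
          f"variance = {var:.4f}")

Every Huffman code for a given source has the same average length; tie-breaking only changes the individual codeword lengths and hence the variance, which is exactly what Q11(i) probes.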

10. Design a ternary source code for the source shown below using Huffman's coding procedure:
S = {s1, s2, s3, s4, s5, s6}, P = {1/3, 1/4, 1/8, 1/8, 1/12, 1/12}, code alphabet X = {0, 1, 2}.
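
A companion sketch for the ternary case (Q10, and Q11(ii) with that table's probabilities), with code alphabet X = {0, 1, 2}. For an r-ary Huffman code the number of source symbols must satisfy n ≡ 1 (mod r-1), so with six symbols one dummy symbol of probability zero is added (an assumption made explicit here) before grouping the three least-probable nodes at each step:

    import heapq
    from math import log2

    def huffman_ternary(probs):
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        # Pad with dummy symbols until n = 1 (mod 2), as required for r = 3
        while (len(heap) - 1) % 2 != 0:
            heap.append((0.0, len(heap), {None: ""}))
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            merged, total = {}, 0.0
            for digit in "012":               # three least-probable nodes
                p, _, c = heapq.heappop(heap)
                total += p
                merged.update({s: digit + w for s, w in c.items()})
            heapq.heappush(heap, (total, count, merged))
            count += 1
        code = heap[0][2]
        code.pop(None, None)                  # drop the dummy symbol
        return code

    P = {"s1": 1/3, "s2": 1/4, "s3": 1/8, "s4": 1/8, "s5": 1/12, "s6": 1/12}
    code = huffman_ternary(P)
    Lbar = sum(P[s] * len(code[s]) for s in P)
    H3   = sum(p * log2(1/p) for p in P.values()) / log2(3)  # entropy in ternary units
    print(code)
    print(f"Lbar = {Lbar:.4f} ternary digits/symbol, efficiency = {H3/Lbar:.4f}")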

11. A discrete memoryless source has an alphabet of six symbols with the symbol probabilities given below.

Message A B C D E F
Probability 0.3 0.25 0.20 0.12 0.08 0.05

(i) Construct the binary Huffman code by moving combined symbols as high as possible. Compute the
code efficiency and the variance of the codeword lengths.
(ii) Construct the ternary Huffman code by moving combined symbols as high as possible.

