ASSIGNMENT-I (ITC 2023)
1. Prove that source entropy is maximum when all messages are equally probable.
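One standard route to the proof, sketched in LaTeX for an M-message source (Jensen's inequality applied to the concave logarithm):

    H(X) - \log_2 M = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i} - \log_2 M
                    = \sum_{i=1}^{M} p_i \log_2 \frac{1}{M p_i}
                    \le \log_2 \sum_{i=1}^{M} p_i \cdot \frac{1}{M p_i}
                    = \log_2 1 = 0,

with equality iff p_i = 1/M for every i; hence H(X) <= log2 M, attained exactly at equal probabilities.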
2. Consider a source that produces four symbols with probabilities {0.5, 0.25, 0.125, 0.125}. Calculate the source entropy.
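A one-line check in Python (base-2 logarithms assumed):

    import math

    p = [0.5, 0.25, 0.125, 0.125]
    H = -sum(pi * math.log2(pi) for pi in p)   # entropy in bits/symbol
    print(H)                                   # 1.75 bits/symbol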
3. Given an AWGN channel with 4 kHz bandwidth and a noise power spectral density of 10^-12 W/Hz. The signal power required at the receiver is 0.1 mW. Determine the channel capacity.
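A minimal sketch of the computation, reading 10^-12 W/Hz as the one-sided noise power spectral density:

    import math

    B  = 4e3       # bandwidth, Hz
    N0 = 1e-12     # noise power spectral density, W/Hz
    S  = 0.1e-3    # received signal power, W

    N = N0 * B                     # in-band noise power = 4e-9 W
    C = B * math.log2(1 + S / N)   # Shannon capacity
    print(C)                       # ~5.84e4 bits/s, about 58.4 kbps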
4. For the given message probabilities, construct the Shannon-Fano code tree and evaluate the coding efficiency.
X     X1    X2    X3    X4    X5    X6    X7
P(x)  0.03  0.1   0.07  0.4   0.15  0.15  0.1
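A minimal Python sketch of one common Shannon-Fano splitting rule (ties can be split either way, so the tree is not unique; base-2 logs assumed):

    import math

    def shannon_fano(symbols):
        # symbols: list of (name, prob) sorted by descending probability
        if len(symbols) == 1:
            return {symbols[0][0]: ""}
        total = sum(p for _, p in symbols)
        best, split = float("inf"), 1
        for i in range(1, len(symbols)):
            head = sum(p for _, p in symbols[:i])
            if abs(total - 2 * head) < best:     # most balanced split point
                best, split = abs(total - 2 * head), i
        codes = {}
        for group, prefix in ((symbols[:split], "0"), (symbols[split:], "1")):
            for sym, word in shannon_fano(group).items():
                codes[sym] = prefix + word
        return codes

    probs = {"X1": 0.03, "X2": 0.1, "X3": 0.07, "X4": 0.4, "X5": 0.15, "X6": 0.15, "X7": 0.1}
    table = shannon_fano(sorted(probs.items(), key=lambda kv: -kv[1]))
    H = -sum(p * math.log2(p) for p in probs.values())
    L = sum(probs[s] * len(w) for s, w in table.items())
    print(table)
    print("H =", H, "L =", L, "efficiency =", H / L)   # ~2.43 / 2.55, about 95.5%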
5. Repeat question 4 using the Huffman coding technique and compare the efficiency of the two encoding schemes.
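A matching Huffman sketch over the same probabilities, using a heap of (probability, tiebreak, partial code table) entries:

    import heapq, math

    def huffman(probs):
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        n = len(heap)
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)      # two least probable nodes
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            n += 1
            heapq.heappush(heap, (p0 + p1, n, merged))
        return heap[0][2]

    probs = {"X1": 0.03, "X2": 0.1, "X3": 0.07, "X4": 0.4, "X5": 0.15, "X6": 0.15, "X7": 0.1}
    code = huffman(probs)
    H = -sum(p * math.log2(p) for p in probs.values())
    L = sum(probs[s] * len(code[s]) for s in probs)
    print(code)
    print("efficiency =", H / L)   # ~2.43 / 2.5, about 97.4%, beating Shannon-Fano here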
6. A high resolution black and white TV picture consists of 2×10^6 picture elements and 16
different brightness levels. Pictures are repeated at the rate of 32 per second. All picture
elements are assumed to be independent and all levels have equal likelihood of occurrence.
Calculate the average rate of information conveyed by the TV picture source.
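A back-of-envelope check, assuming independent, equiprobable levels as stated:

    import math

    elements = 2e6                 # picture elements per frame
    levels = 16                    # equiprobable brightness levels
    frames = 32                    # pictures per second

    bits_per_element = math.log2(levels)           # 4 bits/element
    rate = elements * bits_per_element * frames    # average information rate
    print(rate)                                    # 2.56e8 bits/s = 256 Mb/s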
7. A telegraph source has two symbols, dot and dash. The dot duration is 0.2 s, and the dash duration is three times the dot duration. The probability of the dot occurring is twice that of the dash, and the time between symbols is 0.2 s. Calculate the information rate of the telegraph source.
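A minimal sketch, assuming the inter-symbol gap is charged to each symbol's duration:

    import math

    p_dot, p_dash = 2/3, 1/3             # dot twice as likely as dash
    t_dot  = 0.2 + 0.2                   # dot duration + gap, s
    t_dash = 3 * 0.2 + 0.2               # dash duration + gap, s

    H = -(p_dot * math.log2(p_dot) + p_dash * math.log2(p_dash))   # ~0.918 bits/symbol
    T = p_dot * t_dot + p_dash * t_dash                            # ~0.533 s/symbol
    print(H / T)                                                   # ~1.72 bits/s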
8. Prove that the Shannon capacity can be expressed as C = B log2(1 + SNR), where B indicates the channel bandwidth.
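A sketch of the usual derivation, assuming Gaussian noise of power N = N0 B, a Gaussian input of power S, and Nyquist signalling at 2B independent samples per second (h(.) is differential entropy):

    \max_{p(x)} I(X;Y) = h(Y) - h(N)
                       = \tfrac{1}{2}\log_2 2\pi e (S+N) - \tfrac{1}{2}\log_2 2\pi e N
                       = \tfrac{1}{2}\log_2\Bigl(1 + \frac{S}{N}\Bigr) \text{ bits/sample},

    C = 2B \times \tfrac{1}{2}\log_2\Bigl(1 + \frac{S}{N}\Bigr) = B \log_2(1 + \mathrm{SNR}) \text{ bits/s}.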
9. For the given message probabilities, design a Shannon-Fano encoding table and calculate the coding efficiency.
X     X1    X2    X3    X4    X5    X6    X7
10. Repeat question 9 using a non-binary Huffman coding technique and compare the efficiency of the two encoding schemes.
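The probability row for question 9 did not survive in this copy of the sheet, so the sketch below uses hypothetical probabilities (an assumption, purely for illustration) to show the mechanics of non-binary (here ternary, D = 3) Huffman coding, including the dummy-symbol padding that makes the symbol count satisfy (n - 1) mod (D - 1) = 0:

    import heapq

    def huffman_dary(probs, D=3):
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        i = len(heap)
        while (len(heap) - 1) % (D - 1) != 0:     # pad with zero-probability dummies
            heap.append((0.0, i, {"dummy%d" % i: ""}))
            i += 1
        heapq.heapify(heap)
        while len(heap) > 1:
            merged, total = {}, 0.0
            for digit in range(D):                # merge the D least probable nodes
                p, _, c = heapq.heappop(heap)
                total += p
                merged.update({s: str(digit) + w for s, w in c.items()})
            i += 1
            heapq.heappush(heap, (total, i, merged))
        return {s: w for s, w in heap[0][2].items() if s in probs}

    # hypothetical probabilities -- the originals are missing from this copy
    probs = {"X1": 0.3, "X2": 0.25, "X3": 0.15, "X4": 0.12, "X5": 0.08, "X6": 0.06, "X7": 0.04}
    print(huffman_dary(probs))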
11. A discrete memoryless source produces four symbols {x1, x2, x3, x4} with probabilities {0.4, 0.3, 0.2, 0.1}. Calculate the source entropy. Also determine the amount of information contained in the messages (x1 x2 x1 x3) and (x4 x3 x3 x2).
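A minimal check of the entropy and of both message informations, assuming i.i.d. symbols:

    import math

    p = {"x1": 0.4, "x2": 0.3, "x3": 0.2, "x4": 0.1}
    print(-sum(pi * math.log2(pi) for pi in p.values()))   # H ~1.846 bits/symbol

    def info(msg):
        # self-information of a message with independent symbols
        return sum(-math.log2(p[s]) for s in msg)

    print(info(["x1", "x2", "x1", "x3"]))   # ~6.70 bits
    print(info(["x4", "x3", "x3", "x2"]))   # ~9.70 bits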
12. Consider a DMS X that was encoded in four different ways, as follows:
xi    code-I    code-II    code-III    code-IV
x1    00        0          0           0
x2    01        10         11          100
x3    10        11         100         110
x4    11        110        110         111
(i) Examine whether each of the above codes satisfies the Kraft inequality.
(ii) Determine which of the above codes are prefix-free, which are uniquely decodable, and which are instantaneous.
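A quick mechanical check of the Kraft inequality and the prefix condition (deciding unique decodability for a non-prefix code still needs an argument by hand, e.g. the Sardinas-Patterson test):

    codes = {
        "code-I":   ["00", "01", "10", "11"],
        "code-II":  ["0", "10", "11", "110"],
        "code-III": ["0", "11", "100", "110"],
        "code-IV":  ["0", "100", "110", "111"],
    }

    def kraft(words):
        # necessary condition for unique decodability: sum of 2^-l(i) <= 1
        return sum(2.0 ** -len(w) for w in words)

    def prefix_free(words):
        # no codeword is a prefix of another; prefix-free <=> instantaneous
        return not any(a != b and b.startswith(a) for a in words for b in words)

    for name, words in codes.items():
        print(name, "Kraft sum:", kraft(words), "prefix-free:", prefix_free(words))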
13. Encode the following binary sequence using the Lempel-Ziv algorithm: 11110000110010101001100001111000. Identify the drawback of the LZ algorithm and discuss its remedy.
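A minimal LZ78-style parsing sketch (one common variant of Lempel-Ziv; the dictionary index width is left implicit, which is exactly where the algorithm's overhead on short sequences shows up):

    def lz78_parse(bits):
        # each phrase = shortest prefix not yet in the dictionary,
        # emitted as (index of its longest known prefix, next bit)
        dictionary, phrases, current = {"": 0}, [], ""
        for b in bits:
            if current + b in dictionary:
                current += b
            else:
                phrases.append((dictionary[current], b))
                dictionary[current + b] = len(dictionary)
                current = ""
        if current:
            phrases.append((dictionary[current], None))   # incomplete final phrase
        return phrases

    print(lz78_parse("11110000110010101001100001111000"))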
16. Compare the following types of channel with examples: (a) Lossless Channel, (b) Deterministic Channel, (c) Noiseless Channel, (d) Binary Symmetric Channel and (e) Binary Erasure Channel.
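Representative transition matrices for each type; the numeric values are illustrative assumptions, not part of the assignment:

    # transition matrices P(y|x): rows = inputs x, columns = outputs y
    channels = {
        "lossless":      [[0.5, 0.5, 0.0, 0.0],   # one nonzero per column -> H(X/Y) = 0
                          [0.0, 0.0, 0.7, 0.3]],
        "deterministic": [[1.0, 0.0],              # exactly one 1 per row -> H(Y/X) = 0
                          [1.0, 0.0],
                          [0.0, 1.0]],
        "noiseless":     [[1.0, 0.0],              # lossless and deterministic (identity)
                          [0.0, 1.0]],
        "BSC, p=0.1":    [[0.9, 0.1],              # bit flipped with crossover probability p
                          [0.1, 0.9]],
        "BEC, p=0.1":    [[0.9, 0.1, 0.0],         # middle column is the erasure output 'e'
                          [0.0, 0.1, 0.9]],
    }
    for name, P in channels.items():
        print(name, P)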
17. Determine H(X), H(Y), H(X/Y), H(Y/X) and H(XY) for the following channel matrix,
considering P(x1)=0.3, P(x2)=0.4 and P(x3)=0.3.
      y1   y2   y3
x1 [ 0.8  0.2  0   ]
x2 [ 0    1    0   ]
x3 [ 0    0.3  0.7 ]
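A minimal numpy sketch that evaluates all five entropies from the matrix above (assuming this reconstruction of the transition diagram):

    import numpy as np

    Px  = np.array([0.3, 0.4, 0.3])
    PyX = np.array([[0.8, 0.2, 0.0],     # rows x1..x3, columns y1..y3
                    [0.0, 1.0, 0.0],
                    [0.0, 0.3, 0.7]])

    Pxy = Px[:, None] * PyX              # joint distribution P(x, y)
    Py  = Pxy.sum(axis=0)

    def H(p):
        p = p[p > 0]                     # drop zero entries before taking logs
        return -(p * np.log2(p)).sum()

    HX, HY, HXY = H(Px), H(Py), H(Pxy.ravel())
    print("H(X)   =", HX)                # ~1.571 bits
    print("H(Y)   =", HY)
    print("H(X/Y) =", HXY - HY)          # chain rule: H(XY) = H(Y) + H(X/Y)
    print("H(Y/X) =", HXY - HX)
    print("H(XY)  =", HXY)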
18. An input alphabet (source) consists of 100 characters. (a) If the code words are encoded with a fixed-length code, determine the required number of bits for encoding. (b) Assume that 10 of the characters occur with probability 0.05 each, and that the remaining 90 characters are equally likely. Determine the average number of bits required to encode this alphabet using a variable-length Huffman code.
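A sketch covering both parts, using the fact that the average Huffman codeword length equals the sum of the probabilities of all merge nodes:

    import heapq, math

    # (a) fixed-length code for 100 characters
    print(math.ceil(math.log2(100)))          # 7 bits per character

    # (b) 10 characters at p = 0.05 each; the other 90 share the remaining 0.5
    probs = [0.05] * 10 + [0.5 / 90] * 90

    heap = probs[:]
    heapq.heapify(heap)
    L = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        L += a + b                            # each merge adds 1 bit under that node
        heapq.heappush(heap, a + b)

    H = -sum(p * math.log2(p) for p in probs)
    print(L, H)                               # L sits a little above H ~= 5.91 bits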
19. A speech signal is sampled at 8 kHz and coded with differential PCM, the output of which belongs to a set of eight symbols. The symbols have the following probabilities: {0.4, 0.25, 0.15, 0.1, 0.05, 0.03, 0.01, 0.01}. Determine the source entropy. Also calculate the source entropy if all the symbols are equally probable.
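A two-line check of both entropies (base-2 logs assumed):

    import math

    p = [0.4, 0.25, 0.15, 0.1, 0.05, 0.03, 0.01, 0.01]
    print(-sum(pi * math.log2(pi) for pi in p))   # ~2.27 bits/symbol
    print(math.log2(len(p)))                      # 3 bits/symbol if equiprobable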
20. A designer wishes to transmit voice at 32 kbps through a communication channel of bandwidth 3 kHz. The transmitted power is such that the received SNR is 1000. Determine whether the data can be transmitted error-free through this channel. If not, find the minimum SNR required for error-free transmission. Also calculate Eb/N0 for the received data in this case.
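A minimal sketch of the capacity check, reading Eb/N0 at the minimum-SNR operating point (one common reading of "in this case"):

    import math

    R, B, SNR = 32e3, 3e3, 1000.0        # bit rate, bandwidth, received SNR

    C = B * math.log2(1 + SNR)           # ~29.9 kbps < 32 kbps: not error-free
    print(C)

    SNR_min = 2 ** (R / B) - 1           # solve R = B log2(1 + SNR) for SNR
    print(SNR_min)                       # ~1624.5 (~32.1 dB)

    EbN0 = SNR_min * B / R               # Eb/N0 = SNR * B/R, since S = Eb*R and N = N0*B
    print(EbN0)                          # ~152.3 (~21.8 dB)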