Assignment 2
1. Consider a DMS with source probabilities 0.3, 0.25, 0.2, 0.15, 0.1. Find the source entropy H(X). (Numerical sketches for this and the other computational problems follow the list.)
2. Prove that the entropy of a discrete source is a maximum when the output symbols are equiprobable.
3. A source X has an infinitely large set of outputs, with probabilities of occurrence given by 2^(-i), i = 1, 2, 3, ... What is the average self-information H(X) of this source?
4. For an AWGN channel of transmission bandwidth 4 kHz and noise power spectral density 10^(-12) W/Hz, find the channel capacity if the signal power at the receiver is 0.1 mW.
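For Problem 1, a minimal Python check, assuming base-2 logarithms so that entropy comes out in bits; the same routine also illustrates the claim in Problem 2, since the uniform distribution over five symbols attains the bound log2(5):

```python
# Minimal sketch for Problem 1, with a numerical sanity check for Problem 2.
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

probs = [0.3, 0.25, 0.2, 0.15, 0.1]
print(f"H(X)      = {entropy(probs):.4f} bits/symbol")       # ~2.2282
# Problem 2 check: for a 5-symbol alphabet, H(X) <= log2(5),
# with equality only when the symbols are equiprobable.
print(f"log2(5)   = {log2(5):.4f} bits/symbol")              # ~2.3219
print(f"uniform H = {entropy([0.2] * 5):.4f} bits/symbol")   # equals log2(5)
```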
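For Problem 3, a sketch that truncates the infinite sum; with p_i = 2^(-i), each term -p_i log2(p_i) equals i * 2^(-i), and the series converges to exactly 2 bits:

```python
# Minimal sketch for Problem 3, truncating the infinite alphabet at i = 59;
# the tail beyond that point is negligible in double precision.
from math import log2

H = -sum((2.0 ** -i) * log2(2.0 ** -i) for i in range(1, 60))
print(f"H(X) ~= {H:.6f} bits")  # -> 2.000000 (the closed form is 2 bits)
```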
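For Problem 4, a sketch using the Shannon-Hartley capacity C = B * log2(1 + S/N), taking the noise power as N = N0 * B over the transmission bandwidth:

```python
# Minimal sketch for Problem 4 (Shannon-Hartley law for an AWGN channel).
from math import log2

B  = 4e3     # transmission bandwidth, Hz
N0 = 1e-12   # noise power spectral density, W/Hz
S  = 0.1e-3  # received signal power, W

N   = N0 * B             # in-band noise power: 4e-9 W
snr = S / N              # 25000
C   = B * log2(1 + snr)
print(f"SNR = {snr:.0f}, C = {C:.0f} bit/s")  # ~58,440 bit/s (~58.4 kbit/s)
```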
5. Compute the message uncertainty in bits per character for a textual transmission using 7-bit ASCII coding. Assume that each character is equiprobable and that channel noise results in a bit error probability of 0.01.
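For Problem 5, a sketch under one common reading of the question: each of the 7 ASCII bits crosses a binary symmetric channel with crossover probability p = 0.01, so the uncertainty the noise adds per character is 7 * H_b(p), where H_b is the binary entropy function; the sketch also prints the information that survives per character:

```python
# Minimal sketch for Problem 5, modelling each bit as a binary symmetric
# channel with crossover probability p = 0.01 (one common interpretation
# of "message uncertainty").
from math import log2

def h_bin(p):
    """Binary entropy function H_b(p) in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.01
bits_per_char = 7
equivocation  = bits_per_char * h_bin(p)         # uncertainty added by noise
info_received = bits_per_char * (1 - h_bin(p))   # information that gets through
print(f"H_b(p)            = {h_bin(p):.4f} bits/bit")  # ~0.0808
print(f"equivocation/char = {equivocation:.4f} bits")  # ~0.566
print(f"info/char         = {info_received:.4f} bits")  # ~6.43
```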