Assignment 2

The document contains questions about information theory and coding concepts. It asks the reader to: 1) calculate the source entropy for a discrete memoryless source (DMS) with specified output probabilities; 2) prove that source entropy is maximized when the output symbols are equiprobable; 3) determine the average self-information for a source whose outputs follow a geometric distribution; 4) compute the channel capacity of an additive white Gaussian noise (AWGN) channel given the transmission bandwidth and power spectral densities; and 5) calculate the message uncertainty per character for a text transmission over a binary symmetric channel with a given bit error probability.

National University of Sciences and Technology (NUST)
Pakistan Navy Engineering College
MS COMM 2012
EE852 - Information Theory and Coding - Fall 2012
Assignment 2 (Due: 11th October 2012)

1. Consider a DMS with source probabilities 0.3, 0.25, 0.2, 0.15, 0.1. Find the source entropy H(X).

2. Prove that the entropy of a discrete source is maximized when the output symbols are equiprobable.

3. A source X has an infinitely large set of outputs with probabilities of occurrence given by 2^-i, i = 1, 2, 3, ... What is the average self-information H(X) of this source?

4. For an AWGN channel of transmission bandwidth 4 kHz and noise power spectral density equal to 10^-12 W/Hz, find the channel capacity if the signal power at the receiver is 0.1 mW.

5. Compute the message uncertainty in bits per character for a textual transmission using 7-bit ASCII coding. Assume that each character is equiprobable and that the channel noise results in a bit error probability of 0.01.
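
As a quick numerical check for questions 1-3, the short Python sketch below evaluates the entropies directly. It is illustrative only: the equiprobable comparison shown for question 2 is a numeric sanity check, not the requested proof, and the infinite sum in question 3 is simply truncated.

import math

def entropy(probs):
    # Shannon entropy in bits of a discrete distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Question 1: entropy of the DMS with the given output probabilities.
p1 = [0.3, 0.25, 0.2, 0.15, 0.1]
print("Q1: H(X) = %.4f bits/symbol" % entropy(p1))

# Question 2 (numeric illustration only): for the same alphabet size of 5,
# the equiprobable source attains the larger entropy log2(5).
print("Q2 check: log2(5) = %.4f bits >= %.4f bits" % (math.log2(5), entropy(p1)))

# Question 3: p_i = 2**-i for i = 1, 2, 3, ...  The self-information of symbol i
# is exactly i bits, so H(X) = sum_i i * 2**-i; truncating at 60 terms suffices.
h3 = sum(i * 2.0 ** -i for i in range(1, 61))
print("Q3: H(X) = %.4f bits (closed form: 2 bits)" % h3)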
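
For questions 4 and 5, a similar sketch evaluates the AWGN capacity and the per-character uncertainty. It assumes the 10^-12 W/Hz figure is the one-sided noise PSD and reads "message uncertainty" as the equivocation H(X|Y) of a binary symmetric channel; both are interpretations, not statements from the assignment.

import math

def binary_entropy(p):
    # Binary entropy function H_b(p) in bits.
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Question 4: AWGN capacity C = B * log2(1 + S/N). The noise power is taken as
# N = eta * B, i.e. the 1e-12 W/Hz figure is treated as the one-sided PSD
# (an assumption; halve N if it is meant as the two-sided PSD N0/2).
B = 4e3          # transmission bandwidth, Hz
eta = 1e-12      # noise power spectral density, W/Hz
S = 0.1e-3       # received signal power, W
N = eta * B
C = B * math.log2(1 + S / N)
print("Q4: C = %.0f bits/s (S/N = %.0f)" % (C, S / N))

# Question 5: 7-bit ASCII over a binary symmetric channel with crossover
# probability 0.01. Reading "message uncertainty" as the equivocation H(X|Y),
# it equals H_b(p) per transmitted bit, i.e. 7 * H_b(p) per character
# (this interpretation is an assumption, not stated in the assignment).
p_err = 0.01
print("Q5: uncertainty = %.4f bits/character" % (7 * binary_entropy(p_err)))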
