Com2 Lesson 2

This document covers key concepts in information theory: (1) information measure and entropy, which quantify the uncertainty in a message or source (entropy is highest when all outcomes are equally likely); (2) channel capacity, the maximum rate at which information can be reliably transmitted over a channel given its bandwidth and signal-to-noise ratio; (3) the Shannon-Hartley theorem, which relates the capacity of a bandpass channel to its bandwidth and signal-to-noise ratio; and (4) sample problems demonstrating how to calculate information rates, entropy, channel capacity, and error rates for various communication systems and codes.

Lesson 2: Information Theory

A. Information Measure

$I_i = \log_2\left(\frac{1}{P_i}\right)$ bits

Where: $P_i$ = probability of the i-th message
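As a quick numeric illustration of the information measure, here is a minimal sketch in Python; the probability value 0.25 is an assumed example, not taken from the lesson.

```python
import math

def information(p):
    """Information content I = log2(1/p) of a message with probability p, in bits."""
    return math.log2(1 / p)

# A message that occurs with probability 0.25 carries 2 bits of information.
print(information(0.25))  # 2.0
```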

B. Average Information (Entropy)

$H = \sum_{i=1}^{N} P_i \log_2\left(\frac{1}{P_i}\right)$

Note: units could be in bits/symbol or bits/key.

Note: maximum entropy for N symbols having the same probability: $H_{max} = \log_2 N$

C. Relative Entropy: $H_r = \dfrac{H}{H_{max}}$

D. Redundancy: $R = 1 - \dfrac{H}{H_{max}} = 1 - H_r$
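A minimal sketch tying formulas B, C, and D together; the four-symbol probability distribution below is an assumed example, not part of the lesson.

```python
import math

def entropy(probs):
    """Average information H = sum(Pi * log2(1/Pi)) in bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]       # assumed example distribution
H = entropy(probs)                       # 1.75 bits/symbol
H_max = math.log2(len(probs))            # 2.0 bits/symbol (equally likely case)
relative_entropy = H / H_max             # 0.875
redundancy = 1 - relative_entropy        # 0.125
print(H, H_max, relative_entropy, redundancy)
```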

E. Rate of Information

$R = rH = \dfrac{H}{\bar{T}}$ bits/second

where r = symbol (message) rate in symbols per second and $\bar{T} = \sum_{i=1}^{N} P_i T_i$ = average time per symbol
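A short sketch of the information rate formula; the symbol rate of 1000 symbols/s and the reuse of the distribution from the sketch above are assumptions for illustration.

```python
# Rate of information R = r * H, where r is the symbol rate in symbols/second.
r = 1000                 # assumed symbol rate (symbols/s)
H = 1.75                 # entropy in bits/symbol (from the entropy sketch above)
R = r * H                # 1750 bits/second
print(R)
```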

F. Code Word Length

$L_i$ = number of code digits in the code word for the i-th message (for N equally likely messages coded in base b, at least $\log_b N$ digits are needed)

G. Average Code Word Length

$\bar{L} = \sum_{i=1}^{N} P_i L_i$

H. Coding Efficiency

$\eta = \dfrac{H}{\bar{L}} \times 100\%$

I. Coding Redundancy

$= (1 - \eta) \times 100\%$
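A sketch of formulas G through I using an assumed binary code; the probabilities and code word lengths below are illustrative values, not from the lesson.

```python
import math

probs   = [0.5, 0.25, 0.125, 0.125]   # assumed source probabilities
lengths = [1, 2, 3, 3]                # assumed code word lengths (binary digits)

H     = sum(p * math.log2(1 / p) for p in probs)       # 1.75 bits/symbol
L_avg = sum(p * n for p, n in zip(probs, lengths))      # 1.75 digits/symbol
efficiency = H / L_avg * 100                            # 100 %
coding_redundancy = 100 - efficiency                    # 0 %
print(L_avg, efficiency, coding_redundancy)
```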

Sample problems:

1. A binary source sends binary 1 with a probability of 0.3. Evaluate the average information of the source. Then, for a binary source, find the probabilities of sending a binary 1 and a binary 0 such that the average source information is maximized (a worked sketch follows this list).
2. A numeric keypad has the digits 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9. Assume that the probability of sending any one digit is the same as that for sending any of the other digits. Calculate how often the buttons must be pressed in order to send out information at the rate of 2 bits per second.
3. The international Morse code uses a sequence of dots and dashes to transmit letters of the English alphabet. The dash is represented by a current pulse with a duration of 3 units, and the dot has a duration of 1 unit. The probability of occurrence of a dash is 1/3 of the probability of occurrence of the dot.
   A. Calculate the information content of the dot.
   B. Calculate the average information in the dot-dash code.
   C. Assume that the dot lasts 1 ms, which is the same interval as the pause between symbols. Find the average rate (bps) of information transmission.
4. Calculate the average information content of the English language, assuming that each of the 26 characters in the alphabet occurs with equal probability. Also compute the coding efficiency in bits and dits.
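As a worked sketch of sample problem 1 (binary source with P(1) = 0.3), using the entropy formula above; the numeric results are our own computation, not an answer key from the document.

```python
import math

def binary_entropy(p1):
    """Entropy of a binary source with P(1) = p1, in bits/symbol."""
    p0 = 1 - p1
    return p1 * math.log2(1 / p1) + p0 * math.log2(1 / p0)

print(binary_entropy(0.3))   # about 0.881 bits/symbol
# The average source information is maximized when both symbols are equally likely:
print(binary_entropy(0.5))   # 1.0 bit/symbol
```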

Lecture III. Channel Capacity


The maximum rate at which information can be transmitted through a channel.

1. Lossless Channel - a channel described by a channel matrix with only one non-zero element in each column. Its capacity is set by the source entropy: $C = \log_2 N$
2. Deterministic Channel - a channel described by a channel matrix with only one non-zero element in each row. Its capacity is set by the destination entropy: $C = \log_2 M$
3. Noiseless Channel - a channel that is both lossless and deterministic: $C = \log_2 N = \log_2 M$

4. Shannon Limit for Information Capacity

$C = BW \log_2\left(1 + \frac{S}{N}\right)$

5. Shannon-Hartley Theorem

$C = 2 \, BW \log_2 M$

Note:

M = number of output symbols (signal levels), N = number of input symbols, C = channel capacity in bps, BW = bandwidth in Hz, S/N = signal-to-noise ratio as a power ratio (not in dB). A numeric sketch of these formulas is shown below.
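A minimal sketch of the channel-capacity formulas above; the bandwidth, S/N, and symbol counts are assumed example values, not taken from the lesson.

```python
import math

# Discrete channels (N input symbols, M output symbols)
N, M = 4, 4
C_lossless      = math.log2(N)           # lossless channel
C_deterministic = math.log2(M)           # deterministic channel
C_noiseless     = math.log2(N)           # noiseless: log2 N = log2 M

# Shannon limit: C = BW * log2(1 + S/N), with S/N as a power ratio
BW   = 3400                               # assumed bandwidth in Hz
SNdB = 30                                 # assumed S/N in dB
SN   = 10 ** (SNdB / 10)                  # convert dB to power ratio
C_shannon = BW * math.log2(1 + SN)        # bits per second

# Shannon-Hartley: C = 2 * BW * log2(M), with M signal levels
levels = 4
C_hartley = 2 * BW * math.log2(levels)    # bits per second

print(C_lossless, C_deterministic, C_noiseless, C_shannon, C_hartley)
```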

Sample problems:

1. A telephone line has a bandwidth of 3.2 kHz and an SNR of 36 dB. A signal is transmitted down the line using a three-level code. Calculate the maximum data rate, taking the presence of noise into account (a worked sketch follows below).
2. A telephone system has an input analog signal sampled at 1.5 times the Nyquist rate, and each sample is quantized into one of 256 equally likely levels.
   a. Calculate the information rate of the source.
   b. Calculate the BER of the source if it has BW = 10 kHz and an S/N ratio of 20 dB, transmitted over an AWGN channel.
   c. Find the S/N in dB required for error-free transmission if BW = 10 kHz.
   d. Find the BW required for error-free transmission if S/N is 25 dB.
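A worked sketch of sample problem 1 (3.2 kHz line, 36 dB SNR, three-level code). Taking the smaller of the Shannon limit and the Shannon-Hartley rate is a common way to handle this kind of problem; the numbers are our own computation rather than a provided answer key.

```python
import math

BW   = 3200                      # bandwidth in Hz
SNdB = 36                        # signal-to-noise ratio in dB
M    = 3                         # three-level code

SN = 10 ** (SNdB / 10)                      # about 3981 as a power ratio
C_shannon = BW * math.log2(1 + SN)          # about 38.3 kbps (noise-imposed limit)
C_hartley = 2 * BW * math.log2(M)           # about 10.1 kbps (signaling-imposed limit)

max_rate = min(C_shannon, C_hartley)        # the achievable rate is the smaller of the two
print(round(C_shannon), round(C_hartley), round(max_rate))
```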
