Fundamental Limits and Basic Information Theory: Computer Communications
Lecture 4
Fundamental Limits and Basic Information Theory
Supplementary Reading
Andrew Tanenbaum, Computer Networks (4/e), Pearson Education, 2003
Section 2.1
Bandwidth-Limited Signals
Maximum Data Rate of a Channel
Information Theory
Shannon's Theorem
Shannon (1948) extended Nyquist's work to include the effect of noise
Theorem states:
C = B log2 (1 + S/N)
Where
C = channel capacity (bits / second)
B = channel bandwidth (Hz)
S = average signal power
N = average noise power
Often simply represent S/N as the signal-to-noise ratio (SNR)
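As a quick numeric sanity check, here is a minimal Python sketch of the formula (the function name and the example values are mine, not from the lecture):

```python
import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Illustrative values: with S/N = 63, log2(1 + 63) = 6,
# so a 1 MHz channel can carry at most 6 Mb/s.
print(channel_capacity(1_000_000, 63, 1))  # 6000000.0
```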
SNR backgrounder
Signal to Noise Ratio = Signal power / Noise power
Normally quoted in decibels (dB)
[Figure: dB as a Function of S/N Ratio — dB (0 to 45) plotted against S/N ratio (0 to 10000), with noise power = 1.]
E.g. halving the signal power gives a -3 dB change with respect to the original power:
10 log10(1/2) = -3.01 dB
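The dB arithmetic is easy to check with a two-line conversion sketch (the function names are mine):

```python
import math

def snr_to_db(snr):
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(snr)

def db_to_snr(db):
    """Convert decibels back to a linear power ratio."""
    return 10 ** (db / 10)

print(snr_to_db(0.5))  # -3.010...: halving the power loses ~3 dB
print(db_to_snr(30))   # 1000.0: 30 dB means signal power is 1000x noise power
```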
Now let each Pi be distinct, and let sum of all Pi = 1, for symbols 1..M
Let the surprisal of each symbol be defined by ui = -log2(Pi)
Arrival of an improbable symbol is very surprising: as Pi → 0, ui → ∞
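A short sketch of the surprisal definition (the name and example probabilities are mine); note how ui grows without bound as Pi shrinks:

```python
import math

def surprisal(p):
    """Surprisal u_i = -log2(P_i), in bits."""
    return -math.log2(p)

for p in (0.5, 0.25, 0.01, 1e-9):
    print(p, surprisal(p))
# 0.5 -> 1.0 bit, 0.25 -> 2.0 bits, 0.01 -> ~6.64 bits, 1e-9 -> ~29.9 bits
```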
Example
[Figure: transmitter sending a stream of two symbols, G and R, to a receiver; PG = 0.75, PR = 0.25.]
(Base of logs = 2, so uncertainty is measured in bits)
Average uncertainty per symbol = PG uG + PR uR
Before any symbol is received, uncertainty = -(0.75 log2 0.75 + 0.25 log2 0.25) = 0.811 b/sym
In general, therefore: receiver uncertainty H = -Σ Pi log2 Pi
Quantity of information transmitted = quantity of uncertainty removed at the receiver
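The general formula is easy to check numerically; this sketch (helper name mine) reproduces the 0.811 b/sym of the two-symbol example:

```python
import math

def entropy(probs):
    """Receiver uncertainty H = -sum(P_i * log2(P_i)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.75, 0.25]))  # 0.8112... bits/symbol, matching the example
```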
[Figure: entropy H(x) plotted against symbol probability x, for x from 0 to 1.]
Hbefore - Hafter = R
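The curve in the figure can be regenerated from the entropy formula; a minimal sketch for a two-symbol source (pure Python, no plotting, names mine):

```python
import math

def binary_entropy(x):
    """H(x) for a two-symbol source with P(first symbol) = x."""
    if x in (0.0, 1.0):
        return 0.0  # no uncertainty when one symbol is certain
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

for x in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"H({x}) = {binary_entropy(x):.3f}")
# H peaks at 1 bit when x = 0.5, i.e. when both symbols are equally likely
```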
Entropy - results
For compression: Source Coding Theorem
If the entropy of a data type is e, then e transmitted bits, but no fewer, are enough to communicate each value.
Limits compression in error-free transmission.
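To make the bound concrete, this sketch uses an assumed four-symbol source (probabilities mine, not from the lecture) and compares its entropy with a fixed-length code:

```python
import math

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # assumed example source

h = -sum(p * math.log2(p) for p in probs.values())
fixed = math.ceil(math.log2(len(probs)))  # bits/symbol for a fixed-length code

print(f"entropy      = {h:.3f} bits/symbol")  # 1.750
print(f"fixed-length = {fixed} bits/symbol")  # 2
# No lossless code can average under 1.75 bits/symbol for this source, but
# the variable-length code a=0, b=10, c=110, d=111 achieves exactly 1.75.
```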
Entropy - results
For error correction: Channel Coding Theorem
Based on Shannon's concept of Channel Capacity (C)
C = B log2 (1 + SNR) bits / second
B = channel bandwidth
SNR = Signal-to-Noise Ratio
Assumes line noise has a Gaussian distribution
Theorem:
If the channel capacity is c, then each value transmitted can communicate up to c bits of information with arbitrarily few errors, but no more.
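Putting the two formulas together, a worked example with the classic voice-grade telephone-line figures (assumed here for illustration, not taken from the lecture):

```python
import math

bandwidth_hz = 3000        # assumed: usable bandwidth of a phone line
snr_db = 30                # assumed: a commonly quoted line SNR
snr = 10 ** (snr_db / 10)  # 30 dB -> linear power ratio of 1000

capacity = bandwidth_hz * math.log2(1 + snr)
print(f"C = {capacity:.0f} bits/second")  # ~30 kb/s, whatever coding is used
```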