Channel Capacity, Shannon-Hartley Law and Shannon's Limit
Presentation by
Dr. R. Hemalatha, Assoc. Prof. / ECE
SSN College of Engineering
Objectives
• To discuss:
– Channel capacity
– Channel coding theorem
– Shannon-Hartley theorem
Channel Capacity
• Channel capacity is concerned with the information handling capacity
of a given channel. It is affected by:
– The attenuation of a channel which varies with frequency as well
as channel length.
– The noise induced into the channel which increases with distance.
– Non-linear effects such as clipping on the signal.
• Some of the effects may change with time e.g. the frequency response
of a copper cable changes with temperature and age.
• Hence, modelling a channel is essential to estimate how much
information can be passed through it.
• Non-linear effects and attenuation can be compensated, but it is
extremely difficult to remove noise.
• The highest rate at which information can be transmitted through a
channel with an arbitrarily small probability of error is called the
channel capacity, C.
Channel Capacity
• Channel capacity of a discrete memoryless channel is the
maximum mutual information I(X;Y) in any single use of the
channel, where the maximization is over all possible input
probability distributions.
$$C = \max_{p(x_j)} I(X;Y)$$
• C is expressed in bits / channel use or bits / transmission.
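As an illustrative sketch of this maximization (not part of the original slides' derivation), the following Python snippet numerically maximizes I(X;Y) over the input distribution of a binary symmetric channel; the crossover probability p = 0.1 is an assumed value. The maximum occurs at the uniform input and equals 1 − H(p):

```python
import numpy as np

def binary_entropy(t):
    """H(t) = -t*log2(t) - (1-t)*log2(1-t), with H(0) = H(1) = 0."""
    if t in (0.0, 1.0):
        return 0.0
    return -t * np.log2(t) - (1 - t) * np.log2(1 - t)

def mutual_information(q, p):
    """I(X;Y) for a BSC with P(X=1) = q and crossover probability p.
    I(X;Y) = H(Y) - H(Y|X), and for a BSC H(Y|X) = H(p)."""
    py1 = q * (1 - p) + (1 - q) * p        # P(Y = 1)
    return binary_entropy(py1) - binary_entropy(p)

p = 0.1                                    # assumed crossover probability
qs = np.linspace(0.01, 0.99, 99)           # candidate input distributions
C = max(mutual_information(q, p) for q in qs)
print(f"C = {C:.4f} bits/channel use")     # matches 1 - H(0.1) = 0.5310
```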
Channel Coding Theorem
• Let a discrete memoryless source with an alphabet S have entropy H(S) and
produce symbols every Ts seconds. Let a discrete memoryless channel have
capacity C and be used every Tc seconds. Then if,
$$\frac{H(S)}{T_s} \le \frac{C}{T_c}$$
• There exists a coding scheme for which the source output can be transmitted
over the channel and can be reconstructed with an arbitrarily small
probability of error. Conversely, if,
$$\frac{H(S)}{T_s} > \frac{C}{T_c}$$
• It is not possible to transmit information over the channel and reconstruct it
with an arbitrarily small probability of error.
Application of Channel Coding Theorem to
Binary Symmetric Channels
• Consider a discrete memoryless source that emits equally likely
symbols 0 and 1 every Ts seconds.
• Source Entropy, H(S) = 1 bit / source symbol.
• Therefore, the condition for reliable transmission becomes
$$\frac{1}{T_s} \le \frac{C}{T_c}$$
• Equivalently, the code rate $r = T_c/T_s$ must satisfy $r \le C$.
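A hedged numeric check of this condition (the crossover probability and the two symbol periods below are assumed for illustration, not taken from the slides):

```python
import numpy as np

def Hb(p):
    """Binary entropy function in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Assumed values: BSC crossover p, source period Ts, channel-use period Tc
p, Ts, Tc = 0.05, 1e-3, 0.5e-3
C = 1 - Hb(p)                      # BSC capacity, bits/channel use
r = Tc / Ts                        # code rate implied by the two clocks
print(f"C = {C:.3f} bits/use, code rate r = {r:.2f}")
print("Reliable transmission possible" if r <= C else "Not possible (r > C)")
```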
Shannon-Hartley / Information Capacity Theorem
[Block diagram: a source X(t) (zero mean, bandwidth W, $E[X_k^2] = P$) is
sampled at the Nyquist rate to give $X_k$; white Gaussian noise $N_k$ (zero
mean, PSD $N_0/2$, bandwidth W, variance $N_0 W$) is added in the channel;
the output is $Y_k = X_k + N_k$.]
Shannon-Hartley / Information Capacity Theorem
• Consider a band-limited, power-limited Gaussian channel.
Input:
• X(t): zero-mean, stationary random process, band-limited to W Hz.
• $X_k$: random variable obtained by sampling X(t) at the Nyquist rate of
2W Hz.
• The input is power-limited, with average power $P = E[X_k^2]$.
Channel:
• The symbols are transmitted over a noisy channel that is band-limited
to W Hz.
• The channel output is affected by AWGN with zero mean, PSD $N_0/2$ and
bandwidth W.
Output:
• $Y_k = X_k + N_k$, $k = 1, 2, \dots, K$
• The noise samples are statistically independent, with zero mean and
variance $\sigma^2 = N_0 W$.
Shannon-Hartley Theorem
• The capacity of the channel is given by
$$C = \max_{f_X(x)} \left\{ I(X_k; Y_k) : E[X_k^2] = P \right\}$$
• where
$$I(X_k; Y_k) = H(Y_k) - H(Y_k \mid X_k) = H(Y_k) - H(N_k)$$
• When two independent Gaussian random variables are added, the variance
of the sum is the sum of the individual variances. Hence:
– Variance of the received sample $Y_k$: $P + N_0 W$
– Variance of the noise sample $N_k$: $N_0 W$
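A minimal simulation sketch of this channel model (the values of P, N0 and W are assumed for illustration), confirming that the signal and noise variances add:

```python
import numpy as np

rng = np.random.default_rng(0)
P, N0, W = 1.0, 1e-6, 1e6          # assumed signal power, noise PSD, bandwidth
K = 200_000                        # number of Nyquist-rate samples

Xk = rng.normal(0.0, np.sqrt(P), K)        # Gaussian input, E[Xk^2] = P
Nk = rng.normal(0.0, np.sqrt(N0 * W), K)   # noise, variance N0*W
Yk = Xk + Nk                               # channel output

print(f"Var(Yk) = {Yk.var():.4f}  (theory: P + N0*W = {P + N0*W:.4f})")
print(f"Var(Nk) = {Nk.var():.4f}  (theory: N0*W = {N0*W:.4f})")
```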
Shannon-Hartley Theorem
• The differential entropy of a Gaussian random variable with
variance $\sigma^2$ is given by
$$\frac{1}{2}\log_2\!\left(2\pi e \sigma^2\right)$$
• The differential entropy of the output sample:
$$H(Y_k) = \frac{1}{2}\log_2\!\big(2\pi e (P + N_0 W)\big)$$
• The differential entropy of the noise sample:
$$H(N_k) = \frac{1}{2}\log_2\!\big(2\pi e N_0 W\big)$$
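As a quick numerical cross-check of the closed form (assuming NumPy and SciPy are available; the variance value is illustrative), a Monte Carlo estimate of the differential entropy agrees with the formula:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
sigma2 = 2.0                                   # assumed variance
samples = rng.normal(0.0, np.sqrt(sigma2), 100_000)

h_analytic = 0.5 * np.log2(2 * np.pi * np.e * sigma2)
# Monte Carlo estimate: h = E[-log2 f(X)]
h_mc = np.mean(-np.log2(norm.pdf(samples, scale=np.sqrt(sigma2))))
print(f"analytic: {h_analytic:.4f} bits, Monte Carlo: {h_mc:.4f} bits")
```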
Shannon-Hartley Theorem
• The capacity of the channel is therefore
$$C = H(Y_k) - H(N_k) = \frac{1}{2}\log_2\!\big(2\pi e (P + N_0 W)\big) - \frac{1}{2}\log_2\!\big(2\pi e N_0 W\big)$$
$$= \frac{1}{2}\log_2\frac{2\pi e (P + N_0 W)}{2\pi e N_0 W}$$
$$C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N_0 W}\right)\ \text{bits/channel use}$$
Shannon-Hartley Theorem
$$C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N_0 W}\right)\ \text{bits/sample}$$
• Since the channel is used at the Nyquist rate of $2W$ samples per second,
$$C = W\log_2\!\left(1 + \frac{P}{N_0 W}\right)\ \text{bits/second}$$
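A small sketch of the final formula in Python (the bandwidth, noise PSD and SNR values are assumptions for illustration), also previewing the slow, logarithmic effect of doubling the power:

```python
import numpy as np

def capacity_bits_per_second(W, P, N0):
    """Shannon-Hartley capacity C = W * log2(1 + P / (N0 * W))."""
    return W * np.log2(1 + P / (N0 * W))

# Assumed numbers: 1 MHz bandwidth, SNR P/(N0*W) = 100 (20 dB)
W, N0 = 1e6, 1e-9
P = 100 * N0 * W
print(f"C     = {capacity_bits_per_second(W, P, N0) / 1e6:.3f} Mbit/s")
# Doubling the power adds only ~W bits/s at high SNR (logarithmic growth)
print(f"C(2P) = {capacity_bits_per_second(W, 2 * P, N0) / 1e6:.3f} Mbit/s")
```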
Shannon-Hartley Theorem
• Consider a band-limited channel operating in the presence of
additive white Gaussian noise. The Shannon-Hartley theorem
states that the channel capacity is given by
$$C = W\log_2\!\left(1 + \frac{P}{N_0 W}\right)$$
• As the signal power P increases, the channel capacity increases: more
information bits can be carried per transmission.
• However, the increase in capacity as a function of power is
logarithmic, and hence slow.
• As the bandwidth W increases, more samples can be transmitted per
second, increasing the transmission rate.
• On the other hand, the noise power $N_0 W$ at the receiver also
increases, which degrades performance.
Effect of BW on the capacity
• To analyze the effect of increasing the bandwidth, let W tend
to infinity:
$$\lim_{W\to\infty} W\log_2\!\left(1 + \frac{P}{N_0 W}\right)$$
• Using L'Hospital's rule (equivalently, the standard limit
$\lim_{x\to 0} \ln(1+x)/x = 1$):
$$\lim_{W\to\infty} \frac{P}{N_0}\cdot\frac{N_0 W}{P}\log_2\!\left(1 + \frac{P}{N_0 W}\right) = \frac{P}{N_0}\log_2 e \,\lim_{W\to\infty} \frac{N_0 W}{P}\ln\!\left(1 + \frac{P}{N_0 W}\right)$$
$$C_\infty = \frac{P}{N_0}\log_2 e \approx 1.44\,\frac{P}{N_0}\ \text{bits/sec}$$
• Merely increasing the bandwidth cannot produce a corresponding
increase in capacity.
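A numerical illustration of this saturation (P and N0 are assumed values): as W grows, the capacity approaches the asymptote 1.44 P/N0 rather than growing without bound.

```python
import numpy as np

P, N0 = 1.0, 1e-3                  # assumed power and noise PSD
for W in [1e2, 1e3, 1e4, 1e5, 1e6]:
    C = W * np.log2(1 + P / (N0 * W))
    print(f"W = {W:8.0f} Hz -> C = {C:8.1f} bit/s")
print(f"asymptote 1.44*P/N0 = {(P / N0) * np.log2(np.e):.1f} bit/s")
```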
Effect of BW on the capacity
[Figure: effect of bandwidth on the channel capacity]
Shannon Limit
• In a practical communication system, R < C. For an AWGN channel,
$$R \le W\log_2\!\left(1 + \frac{P}{N_0 W}\right)$$
• Rewriting in terms of the spectral bit rate (bandwidth efficiency)
$r = R/W$:
$$r \le \log_2\!\left(1 + \frac{P}{N_0 W}\right)$$
• Let the energy per bit be $E_b = P/R$, so that $P = R E_b$:
$$r \le \log_2\!\left(1 + \frac{R E_b}{N_0 W}\right) = \log_2\!\left(1 + \frac{r E_b}{N_0}\right)$$
Shannon Limit
• At the capacity limit,
$$r = \log_2\!\left(1 + \frac{r E_b}{N_0}\right)$$
• Rewriting,
$$\frac{E_b}{N_0} = \frac{2^r - 1}{r}$$
• As $W \to \infty$ (i.e., $r \to 0$),
$$\frac{E_b}{N_0} \to \ln 2 = 0.693 = -1.6\ \text{dB}$$
• Hence, for reliable communication it is essential that
$$\frac{E_b}{N_0} \ge 0.693 \quad \text{(Shannon's Limit / Shannon's Power Efficiency Limit)}$$
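A short sketch tabulating $E_b/N_0 = (2^r - 1)/r$ for decreasing spectral efficiency (the r values are illustrative), showing the approach to the −1.6 dB limit:

```python
import numpy as np

# Eb/N0 = (2^r - 1) / r approaches ln(2) = -1.6 dB as r -> 0
for r in [4, 2, 1, 0.5, 0.1, 0.01]:
    ebno = (2**r - 1) / r
    print(f"r = {r:5.2f} bit/s/Hz -> Eb/N0 = {10 * np.log10(ebno):6.2f} dB")
print(f"limit: 10*log10(ln 2) = {10 * np.log10(np.log(2)):.2f} dB")
```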
Summary
• The following topics were discussed in detail:
– Channel capacity
– Channel coding theorem
– Shannon-Hartley theorem
Test your understanding
1. A telephone network has a bandwidth of 3.4 kHz.
(a) Calculate the capacity of the channel for a signal-to-noise ratio of 30
dB.
(b) Calculate the minimum signal-to-noise ratio required for information
transmission through the channel at the rate of 9600 bits/s.
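A worked-solution sketch in Python; the arithmetic follows directly from the Shannon-Hartley formula:

```python
import numpy as np

W = 3400.0                                   # telephone channel bandwidth, Hz

# (a) capacity at SNR = 30 dB
snr = 10 ** (30 / 10)                        # 30 dB -> 1000
C = W * np.log2(1 + snr)
print(f"(a) C = {C:.0f} bit/s")              # ~33.9 kbit/s

# (b) minimum SNR for R = 9600 bit/s
R = 9600.0
snr_min = 2 ** (R / W) - 1
print(f"(b) SNR >= {snr_min:.2f} ({10 * np.log10(snr_min):.2f} dB)")
```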