
Channel Capacity, Shannon Hartley

law and Shannon’s limit

Presentation by
Dr. R. Hemalatha, Assoc. Prof. / ECE
SSN College of Engineering
Objectives
• To discuss:
– Channel capacity
– Channel coding theorem
– Shannon Hartley theorem
Channel Capacity
• Channel capacity is concerned with the information handling capacity
of a given channel. It is affected by:
– The attenuation of a channel which varies with frequency as well
as channel length.
– The noise induced into the channel which increases with distance.
– Non-linear effects such as clipping on the signal.
• Some of the effects may change with time e.g. the frequency response
of a copper cable changes with temperature and age.
• Hence, modelling a channel is essential to estimate how much
information can be passed through it.
• Non-linear effects and attenuation can be compensated for, but it is
extremely difficult to remove noise.
• The highest rate at which information can be transmitted through a
channel with an arbitrarily low error probability is called the channel capacity, C.
Channel Capacity
• Channel capacity of a discrete memoryless channel is the
maximum mutual information I(X;Y) in any single use of the
channel, where the maximization is over all possible input
probability distributions.

$$C = \max_{p(x_j)} I(X;Y)$$
• C is expressed in bits / channel use (or bits / transmission).
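As an illustration (not part of the original slides), the following minimal Python sketch evaluates the mutual information of a binary symmetric channel over a grid of input distributions and takes the maximum; for a BSC with crossover probability p the result should match the closed form 1 − H(p). The numbers are assumed for illustration.

```python
import numpy as np

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits for an input distribution p_x and channel matrix P(y|x)."""
    p_xy = p_x[:, None] * P_y_given_x            # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                       # output distribution p(y)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])))

# Binary symmetric channel with crossover probability p (illustrative value)
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

# Capacity = maximum of I(X;Y) over all input distributions; brute-force grid search
grid = np.linspace(1e-6, 1 - 1e-6, 1001)
C = max(mutual_information(np.array([q, 1 - q]), bsc) for q in grid)

H_p = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
print(f"C ≈ {C:.4f} bits/channel use   (closed form 1 - H(p) = {1 - H_p:.4f})")
```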
Channel Coding Theorem
• Let a discrete memoryless source with an alphabet S have entropy H(S) and
produce symbols every Ts seconds. Let a discrete memoryless channel have
capacity C and be used every Tc seconds. Then if,

$$\frac{H(S)}{T_s} \le \frac{C}{T_c}$$
• There exists a coding scheme for which the source output can be transmitted
over the channel and can be reconstructed with an arbitrarily small
probability of error. Conversely, if,

$$\frac{H(S)}{T_s} > \frac{C}{T_c}$$
• It is not possible to transmit information over the channel and reconstruct it
with an arbitrarily small probability of error.
Application of Channel Coding Theorem to
Binary Symmetric Channels
• Consider a discrete memoryless source that emits equally likely
symbols 0 and 1 every Ts seconds.
• Source Entropy, H(S) = 1 bit / source symbol.
• Therefore, the condition for reliable transmission becomes
$$\frac{1}{T_s} \le \frac{C}{T_c}$$
• The ratio $T_c / T_s$ equals the code rate of the channel encoder, denoted by R. Hence,
$$\frac{1}{T_s} \le \frac{C}{T_c} \;\Rightarrow\; R \le C$$
A minimal numeric check of this condition is sketched below.
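The sketch below (with assumed numbers, not taken from the slides) computes the BSC capacity C = 1 − H(p) and the code rate R = Tc/Ts, and checks whether R ≤ C.

```python
import numpy as np

# Illustrative numbers (assumptions, not from the slides)
Ts = 1e-3        # the source emits one equiprobable bit every Ts seconds
Tc = 0.5e-3      # the channel is used once every Tc seconds
p = 0.05         # BSC crossover probability

C = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)   # BSC capacity in bits/channel use
R = Tc / Ts                                          # code rate of the channel encoder

print(f"R = {R:.3f}, C = {C:.3f} ->",
      "a reliable coding scheme exists (R <= C)" if R <= C
      else "reliable coding is impossible (R > C)")
```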
Shannon’s Channel Coding Theorem
• Shannon’s Channel Coding Theorem states that if the
information rate R (bits/s) is less than or equal to the channel
capacity C (i.e. R ≤ C), then there is, in principle, a coding
technique which enables transmission over the noisy channel
with an arbitrarily small probability of error.
• The converse is that if R > C, reliable transmission is not
possible: the probability of error cannot be made arbitrarily small.
• Thus the channel capacity defines:
“the maximum rate of reliable (error-free) information
transmission through the channel”
Shannon Hartley/Information Capacity Theorem
[Block diagram: a source produces X(t) (zero mean, band-limited to W Hz, E[Xk²] = P); sampling at the Nyquist rate 2W gives the samples Xk; white Gaussian noise Nk (mean 0, PSD No/2, bandwidth W, variance NoW) is added in the channel; the output is Yk = Xk + Nk.]
Shannon Hartley/Information Capacity
Theorem
• Consider a band-limited and power-limited Gaussian channel.
Input:
• X(t): zero-mean, stationary random process, band-limited to W Hz.
• Xk: random variable obtained by sampling X(t) at the Nyquist rate 2W Hz.
• The input is power limited, with average power P = E[Xk²].
Channel:
• The samples are transmitted over a noisy channel that is band-limited to W Hz.
• The channel output is affected by AWGN with zero mean, PSD No/2 and bandwidth W.
Output:
• Yk = Xk + Nk,  k = 1, 2, ..., K
• The noise samples are statistically independent of the input, with zero mean and
variance σ² = NoW.
Shannon Hartley Theorem
• The capacity of the channel is given by,


$$C = \max_{f_X(x)} I(X_k;Y_k) \quad \text{subject to } E[X_k^2] = P$$
• where,
I X k ;Yk  H (Yk )  H (Yk X k )
 H (Yk )  H (N k )
• When two independent Gaussian random variables are added, the variance of
the sum is the sum of the individual variances (verified numerically in the sketch below).
• Variance of the received sample Yk = P + NoW
• Variance of the noise sample Nk = NoW
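A small Monte Carlo sketch, with assumed values for P, No and W, checking that the variance of Yk = Xk + Nk equals P + NoW when Xk and Nk are independent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters
P = 2.0           # signal power E[Xk^2]
N0 = 1e-3         # two-sided noise PSD is N0/2
W = 1000.0        # bandwidth in Hz
sigma2 = N0 * W   # noise variance N0*W

Xk = rng.normal(0.0, np.sqrt(P), 1_000_000)       # Gaussian signal samples
Nk = rng.normal(0.0, np.sqrt(sigma2), 1_000_000)  # independent Gaussian noise samples
Yk = Xk + Nk

print(f"Var(Yk) ≈ {Yk.var():.4f},  P + N0*W = {P + sigma2:.4f}")
```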
Shannon Hartley Theorem
• The differential entropy of a Gaussian random variable with variance σ² is given by
$$\frac{1}{2}\log_2(2\pi e \sigma^2)$$
• The differential entropy of the output sample:
$$H(Y_k) = \frac{1}{2}\log_2\!\big(2\pi e (P + N_0W)\big)$$
• The differential entropy of the noise sample:
$$H(N_k) = \frac{1}{2}\log_2(2\pi e N_0W)$$
Shannon Hartley Theorem
• The capacity of the channel is given by,

$$C = H(Y_k) - H(N_k) = \frac{1}{2}\log_2\!\big(2\pi e (P + N_0W)\big) - \frac{1}{2}\log_2(2\pi e N_0W)$$
$$= \frac{1}{2}\log_2\!\left(\frac{2\pi e (P + N_0W)}{2\pi e N_0W}\right)$$
$$C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N_0W}\right) \ \text{bits/channel use}$$
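A quick numeric check (with assumed example values) that H(Yk) − H(Nk), computed from the Gaussian differential-entropy formula, reproduces (1/2)log₂(1 + P/NoW):

```python
import numpy as np

P, N0, W = 2.0, 1e-3, 1000.0            # assumed example values
noise_var = N0 * W

H_Y = 0.5 * np.log2(2 * np.pi * np.e * (P + noise_var))   # differential entropy of Yk
H_N = 0.5 * np.log2(2 * np.pi * np.e * noise_var)         # differential entropy of Nk
C_per_use = 0.5 * np.log2(1 + P / noise_var)

print(f"H(Y) - H(N) = {H_Y - H_N:.4f} bits/use,  (1/2)log2(1 + P/N0W) = {C_per_use:.4f}")
```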
Shannon Hartley Theorem
$$C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N_0W}\right) \ \text{bits/sample}$$
• 2W samples are transmitted per second. Hence the capacity is given by
$$C = W\log_2\!\left(1 + \frac{P}{N_0W}\right) \ \text{bits/second}$$
Shannon Hartley Theorem
• Consider a band-limited Channel operating in the presence of
additive white Gaussian noise. The Shannon-Hartley theorem
states that the channel capacity is given by,

$$C = W\log_2(1 + \mathrm{SNR})$$
• where,
C – Channel capacity in bits per second
W – Bandwidth of the channel in Hz.
SNR – Signal to Noise power ratio.
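As an illustration, a small helper (the function name is mine, purely illustrative) that evaluates C = W log₂(1 + SNR), accepting the SNR either as a linear ratio or in dB:

```python
import math

def shannon_capacity(bandwidth_hz, snr, snr_in_db=False):
    """Channel capacity C = W * log2(1 + SNR) in bits per second."""
    if snr_in_db:
        snr = 10 ** (snr / 10)          # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr)

# Example: a 3.1 kHz telephone-type channel with 20 dB SNR (illustrative values)
print(f"C ≈ {shannon_capacity(3100, 20, snr_in_db=True):,.0f} bit/s")
```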
Shannon Hartley Theorem
• The channel capacity C increases as the available bandwidth
increases and as the signal-to-noise ratio improves.
• The theorem applies to both analogue and digital communications,
but is most commonly used in data communications.
• The channel capacity theorem is one of the most important
results of information theory. It highlights the interplay between
3 key system parameters:
– channel bandwidth,
– average transmitted or received signal power,
– noise power at the channel output.
Shannon Hartley Theorem
• For a given average transmitted power P and channel bandwidth W,
we can transmit information at rates up to C bits/s with arbitrarily
small error by employing sufficiently complex coding schemes.
• It is not possible to transmit at a rate higher than C bits/s by any
coding system without a definite probability of error.
• Hence the channel capacity theorem defines the fundamental
limit on the rate of error-free transmission for a power-limited,
band-limited channel.
Bounds on Communication

$$C = W\log_2\!\left(1 + \frac{P}{N_0W}\right)$$
• P ↑: the channel capacity increases, so more information bits can be
carried per transmission.
• However, the increase in capacity as a function of power is
logarithmic and therefore slow.
• W ↑: as the bandwidth increases, more samples can be transmitted
per second, increasing the transmission rate.
• On the other hand, the noise power NoW seen by the receiver also
increases, which limits the gain.
Effect of BW on the capacity
• To analyse the effect of increasing the bandwidth, let W tend to infinity:
$$C_\infty = \lim_{W\to\infty} W\log_2\!\left(1 + \frac{P}{N_0W}\right)$$
• Rewriting (and using $\ln(1+x)\to x$ as $x\to 0$, equivalently L'Hospital's rule),
$$C_\infty = \lim_{W\to\infty} \frac{P}{N_0}\cdot\frac{N_0W}{P}\,\log_2\!\left(1 + \frac{P}{N_0W}\right)
          = \lim_{W\to\infty} \frac{P}{N_0}\cdot\frac{N_0W}{P}\,\ln\!\left(1 + \frac{P}{N_0W}\right)\log_2 e$$
$$C_\infty = \frac{P}{N_0}\log_2 e \approx 1.44\,\frac{P}{N_0}\ \text{bits/sec}$$
• Merely increasing the bandwidth does not give a corresponding unbounded increase
in capacity; C saturates at about 1.44 P/N₀ (see the sketch below).
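The saturation can be seen numerically. The sketch below (with assumed P and No) evaluates W log₂(1 + P/NoW) for increasing W and compares it with the limit 1.44 P/No:

```python
import numpy as np

P, N0 = 1.0, 1e-3                      # assumed example values
W = np.array([1e2, 1e3, 1e4, 1e5, 1e6])

C = W * np.log2(1 + P / (N0 * W))      # capacity in bits/s for each bandwidth
C_inf = (P / N0) * np.log2(np.e)       # limiting value, about 1.44 * P/N0

for w, c in zip(W, C):
    print(f"W = {w:>9.0f} Hz -> C = {c:10.1f} bit/s")
print(f"Limit as W -> infinity: {C_inf:.1f} bit/s")
```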
Effect of BW on the capacity
[Figure: plot of channel capacity C versus bandwidth W, levelling off at 1.44 P/N₀ as W increases.]
Shannon Limit
• In a practical communication system, R ≤ C. For an AWGN channel,
$$R \le W\log_2\!\left(1 + \frac{P}{N_0W}\right)$$
• Rewriting in terms of the spectral bit rate r (bandwidth efficiency R/W),
$$r \le \log_2\!\left(1 + \frac{P}{N_0W}\right)$$
• Let the energy per bit be $E_b = P/R$, so that $P = E_b R$. Then
$$r \le \log_2\!\left(1 + \frac{R E_b}{N_0 W}\right) = \log_2\!\left(1 + \frac{r E_b}{N_0}\right)$$
Shannon Limit
$$r \le \log_2\!\left(1 + \frac{r E_b}{N_0}\right)$$
• Rewriting,
$$\frac{E_b}{N_0} \ge \frac{2^r - 1}{r}$$
• When the bandwidth tends to infinity (i.e. r → 0),
$$\frac{E_b}{N_0} \to \ln 2 = 0.693 \;\;(-1.6\ \text{dB})$$
• Hence for reliable communication it is essential that
$$\frac{E_b}{N_0} \ge 0.693 \qquad \text{(Shannon's Limit / Shannon's Power Efficiency Limit)}$$
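The bound (2^r − 1)/r can be tabulated for decreasing spectral efficiency r to see it approach ln 2 ≈ 0.693 (−1.6 dB); a short sketch with illustrative values of r:

```python
import numpy as np

r = np.array([4.0, 2.0, 1.0, 0.5, 0.1, 0.01])   # spectral efficiency R/W in bit/s/Hz
ebno_min = (2 ** r - 1) / r                     # minimum Eb/N0 for reliable communication

for ri, e in zip(r, ebno_min):
    print(f"r = {ri:5.2f} -> Eb/N0 >= {e:.4f} ({10 * np.log10(e):+.2f} dB)")
print(f"r -> 0 limit: ln(2) = {np.log(2):.4f} ({10 * np.log10(np.log(2)):+.2f} dB)")
```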
Summary
• The following topics were discussed in detail:
– Channel capacity
– Channel coding theorem
– Shannon Hartley theorem
Test your understanding
1. A telephone network has a bandwidth of 3.4 kHz.
(a) Calculate the capacity of the channel for a signal-to-noise ratio of 30
dB.
(b) Calculate the minimum signal-to-noise ratio required for information
transmission through the channel at the rate of 9600 bits/s.

2. A communications channel with a bandwidth of 4 kHz has a
signal-power-to-noise ratio of 7. The bandwidth is reduced by
25%. How much should the signal power be increased to
maintain the same channel capacity?
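A worked check of these exercises is sketched below (not part of the slides). For Problem 2 it assumes the noise power is NoW, so reducing the bandwidth also reduces the in-band noise power; a different noise assumption would change the answer.

```python
import math

# Problem 1: 3.4 kHz telephone channel
W = 3400.0
snr_30dB = 10 ** (30 / 10)                      # 30 dB -> 1000 (linear ratio)
C = W * math.log2(1 + snr_30dB)
snr_min = 2 ** (9600 / W) - 1                   # invert C = W log2(1+SNR) for R = 9600 bit/s
print(f"1(a) C ≈ {C:,.0f} bit/s")
print(f"1(b) SNR >= {snr_min:.2f} ({10 * math.log10(snr_min):.2f} dB)")

# Problem 2: 4 kHz channel, SNR = 7, bandwidth reduced by 25%
W1, snr1 = 4000.0, 7.0
C1 = W1 * math.log2(1 + snr1)                   # 12,000 bit/s
W2 = 0.75 * W1
snr2 = 2 ** (C1 / W2) - 1                       # SNR needed to keep C unchanged -> 15
# Assuming noise power = N0*W, reducing W by 25% also reduces the noise power by 25%:
power_ratio = (snr2 / snr1) * (W2 / W1)
print(f"2    SNR must rise from {snr1:.0f} to {snr2:.0f}; "
      f"signal power x {power_ratio:.2f} ({10 * math.log10(power_ratio):.2f} dB)")
```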
Thank you !
