EC523 Advanced Comm. Lecture 1
https://www.mediafire.com/#obp2z3cy03m3u
Dr. Hussein ELAttar
Course Contents
1. What is information theory?
– Measurement of Information
• Information Uncertainty
• Shannon's Entropy
• Information Rate
2. Source Coding: Huffman Coding
3. Shannon Channel Capacity, Shannon Limit
4. Channel Coding
a) Linear Block Codes
– General Steps for Construction of a Code
– Decoding the Received Code Words
– Error Correction (Syndrome)
– Hamming Codes
b) Convolutional Coding
– Convolutional Encoder
» State Diagram for the Encoder
» Trellis Diagram for the Encoder
– Viterbi Decoder
Course Contents (cont.)
5. Introduction to Multiplexing and Multiple-Access Techniques in Communication Systems
6. FDM and FDMA techniques; performance and capacity of FDM systems
7. TDM and TDMA techniques; performance and capacity of TDM systems
8. Spread Spectrum Techniques
9. Spreading Codes: M-Sequences, Gold Codes
10. CDM and CDMA techniques; performance and capacity
11. Examples of CDMA communication systems: 3G mobile systems
12. Signal Propagation and Path Loss Models, with emphasis on the multipath fading channel
13. Rayleigh fading channel (flat and frequency-selective fading, coherence bandwidth, RMS delay spread, coherence time, Doppler frequency)
14. Multicarrier, OFDM, and OFDMA Techniques
15. Overview of receiver diversity (system model, selection combining, threshold combining, equal-gain combining, and maximal-ratio combining)
16. 4G Systems (LTE, WiMAX)
Introduction
Shannon Wants to…
• Shannon wants to find a way to transmit data "reliably" through the channel at the "maximal" possible rate.
[Block diagram: Information Source → Source Encoding → Channel Encoding → Communication Channel → Channel Decoding → Source Decoding → Destination/User]
Shannon’s Information Theory and Coding
[Block diagram: Source encoder (source coding) → Channel encoder (channel coding) → channel → Channel decoder (channel decoding) → Source decoder (source decoding) → receiver; binary stream 0110101001110…]
(b) The phone may ring in the next hour. The probability of this event is less than the probability of the event in statement (a), so statement (b) carries more information than statement (a): more surprise means lower probability.
H = \sum_{k=1}^{n} P_k \log_2 \frac{1}{P_k} = -\sum_{k=1}^{n} P_k \log_2 P_k
Code efficiency: \eta = H / \bar{L}, where \bar{L} = \sum_{k=1}^{n} P_k l_k is the average codeword length (\eta \le 1).
Code variance: \sigma^2 = \sum_{k=1}^{n} P_k (l_k - \bar{L})^2; among optimal codes, a smaller variance is preferred.
Information rate: R = rH (see the sketch below), where
R is the information rate in bits/second,
r is the rate at which messages are generated (messages/second), and
H is the entropy, or average information per message.
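For illustration, here is a minimal Python sketch of the entropy and information-rate formulas above; the four probabilities and the message rate r are assumed example values, not figures from the lecture:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(Pk * log2(Pk)), in bits per message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed example source: 4 messages with these probabilities
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)        # 1.75 bits/message
r = 1000                  # assumed message rate: 1000 messages/second
R = r * H                 # information rate R = r * H, in bits/second
print(f"H = {H} bits/message, R = {R:.0f} bits/second")
```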
Huffman Coding
With an equal-length code, each of the 5 symbols would need a 3-bit codeword, for a total of 3 × 5 = 15 bits; the Huffman code here uses only 14 bits.
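As a sketch, here is a compact Python implementation of the Huffman algorithm (repeatedly merge the two least-probable nodes); the five symbols and their probabilities are assumed example values, not the slide's table:

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    ticket = count()  # unique tiebreaker so the heap never compares dicts
    heap = [(p, next(ticket), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # least probable subtree
        p1, _, c1 = heapq.heappop(heap)   # second least probable subtree
        merged = {s: "0" + w for s, w in c0.items()}   # prefix 0 to one side
        merged.update({s: "1" + w for s, w in c1.items()})  # prefix 1 to the other
        heapq.heappush(heap, (p0 + p1, next(ticket), merged))
    return heap[0][2]

# Assumed example: five symbols, so a fixed-length code needs 3 bits each
probs = {"a": 0.4, "b": 0.2, "c": 0.2, "d": 0.1, "e": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code)
print(f"average codeword length = {avg_len:.2f} bits vs 3 bits fixed-length")
```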
Channel Capacity (Shannon)
• Channel capacity is the maximum rate at which data can be transmitted over a given communication channel under given conditions.
• We would like to get as high a data rate as possible at a given error-rate limit for a given bandwidth.
• Problem: given a bandwidth, what data rate can we achieve?
– Nyquist formula (assumes a noise-free channel; see the sketch below):
C = 2B log2(M)
– C is the channel capacity in bps,
– B is the given transmission bandwidth,
– M is the number of discrete signal levels.
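A minimal Python sketch of the Nyquist formula; the 1 MHz bandwidth is an assumed example value:

```python
import math

def nyquist_capacity(bandwidth_hz, levels):
    """Noise-free capacity C = 2 * B * log2(M), in bits per second."""
    return 2 * bandwidth_hz * math.log2(levels)

B = 1e6  # assumed example bandwidth: 1 MHz
for M in (2, 4, 8, 16):
    # each doubling of M adds one bit per signaling interval
    print(f"M = {M:2d}: C = {nyquist_capacity(B, M) / 1e6:.0f} Mbps")
```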
Channel Capacity: Shannon Capacity Formula
• Now consider the relationship among data rate, noise, and error rate.
• A faster data rate shortens each bit, so a burst of noise affects more bits:
– at a given noise level, a higher data rate results in a higher error rate.
• All of these concepts can be tied together neatly in a formula developed by Claude Shannon:
C = B log2(1 + SNR)
– Only white noise is assumed, so this represents the theoretical maximum that can be achieved; it is referred to as the error-free capacity.
• Some remarks:
– Given a level of noise, the channel capacity, and hence the data rate, could be increased by increasing either the signal strength or the bandwidth.
– We would expect that a greater signal strength would improve the ability to receive data correctly.
– But as the signal strength increases, so do the effects of nonlinearities in the system, which leads to an increase in intermodulation noise.
– Because the noise is assumed to be white, the wider the bandwidth, the more noise is admitted to the system. Thus, as B increases, SNR decreases.
Example
• Consider an example that relates the Nyquist and Shannon formulations. Suppose the spectrum of a channel is between 3 MHz and 4 MHz, and SNR_dB = 24 dB. So,
B = 4 MHz – 3 MHz = 1 MHz
SNR_dB = 24 dB = 10 log10(SNR), so SNR ≈ 251.
• Using Shannon's formula, the capacity limit C is:
C = 10^6 × log2(1 + 251) ≈ 8 Mbps.
• If we want to achieve this limit, how many signaling levels are required at least?
By Nyquist's formula, C = 2B log2(M):
8 × 10^6 = 2 × 10^6 × log2(M), so M = 16. (A quick numerical check follows below.)
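The arithmetic in this example can be verified with a short Python sketch using the slide's own numbers:

```python
import math

B = 4e6 - 3e6                  # bandwidth: 4 MHz - 3 MHz = 1 MHz
snr = 10 ** (24 / 10)          # 24 dB -> SNR ~ 251
C = B * math.log2(1 + snr)     # Shannon capacity, ~ 8 Mbps
M = 2 ** (C / (2 * B))         # levels from Nyquist: C = 2B log2(M)
print(f"SNR = {snr:.0f}, C = {C / 1e6:.2f} Mbps, M = {M:.1f} -> 16 levels")
```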
Channel Capacity
Note: B and W are used interchangeably for the channel bandwidth, and S and P are used interchangeably for the signal power.
• A noiseless channel has infinite capacity: as SNR → ∞, C = B log2(1 + SNR) → ∞.
Shannon Limit
• The Shannon theorem puts a limit on the transmission data rate, not on the error probability:
– It is theoretically possible to transmit information at any rate Rb ≤ C with an arbitrarily small error probability by using a sufficiently complicated coding scheme.
– For Rb > C, reliable transmission is not possible regardless of the coding scheme.
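As background (a standard textbook result stated here for context, not taken from the slide itself), the ultimate "Shannon limit" on Eb/N0 follows from the capacity formula when the bandwidth is allowed to grow without bound:

```latex
% With SNR = (E_b/N_0)(R_b/B) and R_b = C, C = B \log_2(1 + SNR) gives
\[
  \frac{E_b}{N_0} = \frac{2^{C/B} - 1}{C/B}
  \;\longrightarrow\; \ln 2 \approx 0.693 \;(-1.59\ \text{dB})
  \quad \text{as } B \to \infty,
\]
% so no coding scheme achieves arbitrarily reliable transmission
% below E_b/N_0 = -1.59 dB.
```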
Notes on the Tradeoff