
(EC523) Advanced Communication Systems

https://www.mediafire.com/#obp2z3cy03m3u
Dr. Hussein ELAttar

Course Contents
1. What is information theory?
– Measurement of Information
• Information Uncertainty
• Shannon's Entropy.
• Information rate.
2. Source coding: Huffman Coding.
3. Shannon Channel Capacity, Shannon Limit.
4. Channel Coding
a) Linear Block Codes
– General Steps for Construction of Code
– Decoding the Received Code Words
– Error Correction (syndrome)
– Hamming codes
b) Convolutional Coding
– Convolutional encoder
» State diagram for encoder
» Trellis diagram for encoder
– Viterbi Decoder
Course Contents
5. Introduction to Multiplexing and Multiple-access techniques in
Communication Systems
6. FDM and FDMA techniques, performance and capacity of FDM Systems
7. TDM and TDMA techniques, performance and capacity of TDM Systems
8. Spread Spectrum Techniques
9. Spreading Codes: M-Sequences, Gold Codes.
10. CDM and CDMA Techniques, Performance and capacity
11. Examples of CDMA Communication Systems: 3G Mobile Systems
12. Signal Propagation and Path Loss Models, with emphasis on the Multipath fading
channel
13. Rayleigh fading channel (Flat and frequency-selective fading, Coherence
bandwidth, RMS delay spread, Coherence time, Doppler frequency).
14. Multicarrier, OFDM and OFDMA Techniques.
15. Overview of receiver diversity (System model, selection combining, threshold
combining, equal gain combining, and maximal ratio combining).
16. 4G Systems (LTE, WiMAX)
Introduction
Shannon Wants to…
• Shannon wants to find a way to "reliably"
transmit data through the channel at the
"maximal" possible rate.

[Block diagram: Information Source → Coding → Communication Channel → Decoding → Destination]

For example, maximizing the speed of ADSL @ your home.
Shannon’s Vision

[Block diagram: Data → Source Encoding → Channel Encoding → Channel → Channel Decoding → Source Decoding → User]
Shannon’s Information Theory and Coding

[Block diagram: Sent messages → Source encoder (compression, e.g. Huffman coding) → Channel encoder (error detection and correction: Linear Block Codes such as Hamming codes, and Convolutional codes) → Channel (symbols, e.g. 0110101001110…; capacity given by Shannon) → Channel decoder → Source decoder (decompression) → Received messages at the receiver]
1. Measurement of Information: Information Uncertainty
How much information does a message carry
from the sender to the receiver?
• The basic model of Information Theory is:

[Block diagram: Sender → Channel → Receiver]

• Here, information is being transmitted from the sender to
the receiver. Prior to the transmission, the receiver has
no idea what the content of the information is.

• This implies the concept of information as a random
variable and 'Information Uncertainty' (i.e. the receiver is
uncertain of the content of the information until after it
has received it over the transmission channel).
Information Uncertainty (surprise) Vs Probability

• Consider the following statements:


(a) Tomorrow, the sun will rise from the East.
(b) The phone will ring in the next 1 hour.
(c) It will snow in Cairo this summer.
Everybody knows the sun always rises from the East. The
probability of this event is almost 1. Thus, this statement
carries no information (no surprise): probability is 1 (sure).

The phone may or may not ring in the next hour. The
probability of this event is less than the probability of the event in
statement (a). Statement (b) carries more information than
statement (a) (more surprise): lower probability.

It has never snowed in Cairo in summer. The probability of this
event is very small. Statement (c) carries the most information
of all (the most surprise): lowest probability.
1 
I k = log2   = − log2 Pk
 Pk 
(Average Information)

n
1  n
H = ∑ Pk log2   = − ∑ Pk log2 Pk
x =1  Pk  x =1

Entropy is defined as the average information
of a random variable.

If an event has probability 1, we already knew with certainty what was going to happen in advance,
so there is no potential gain in information after learning the outcome.
A fair die always results in p = 1/6 for all 6 possible outcomes:

$$H(X) = -\sum_{i=1}^{n} P(x_i)\,\log_2 P(x_i) = -(6)\left(\tfrac{1}{6}\right)\log_2\!\left(\tfrac{1}{6}\right) = \log_2 6 \approx 2.585 \text{ bits}$$
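As a quick check of the entropy formula, here is a minimal Python sketch (not part of the original slides) that computes H(X) for an arbitrary distribution and reproduces the fair-die value:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    h = 0.0
    for p in probabilities:
        if p > 0:
            h -= p * math.log2(p)
    return h

# Fair six-sided die: six equally likely outcomes
print(entropy([1 / 6] * 6))   # 2.584962... bits (= log2 6)

# A certain event (probability 1) carries no information
print(entropy([1.0]))         # 0.0 bits
```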
Course Contents
1. What is information theory?
– Measurement of Information
• Information Uncertainty
• Shannon's Entropy.
• Information rate.
2. Source coding: Huffman Coding.
3. Shannon Channel Capacity, Shannon Limit.
4. Channel Coding
a) Linear Block Codes
– General Steps for Construction of Code
– Decoding the Received Code Words
– Error Correction (syndrome)
– Hamming codes
b) Convolutional Coding
– Convolutional encoder
» State diagram for encoder
» Trellis diagram for encoder
– Viterbi Decoder
Fixed-length vs. Variable-length Encoding
• Fixed-length Encoding
• The highest entropy occurs when the
symbols have equal probabilities, and in this
case the best thing to do is to allocate each
one an equal-length code.
• The ASCII code is fixed-length. The extended ASCII
standard uses 8 bits per character.
Fixed-length vs. Variable-length Encoding
Variable-length Encoding
• Character code lengths vary.
• Huffman encoding uses shorter
bit patterns for more common
characters, and longer bit
patterns for less common
characters.
• Average codeword length: L̄ = Σk Pk·Lk
• Code efficiency: η = H / L̄
• Code variance: σ² = Σk Pk·(Lk − L̄)²

Information rate: R = r·H, where
R is the information rate (bits per second),
r is the rate at which messages are generated (messages per second),
H is the entropy or average information (bits per message).
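A minimal Python sketch of these definitions (the probabilities, codeword lengths, and message rate below are illustrative assumptions, not values from the slides):

```python
import math

# Illustrative source: 5 symbols with assumed probabilities and prefix-code lengths
probs   = [0.4, 0.2, 0.2, 0.1, 0.1]
lengths = [1, 2, 3, 4, 4]            # codeword lengths in bits

H        = -sum(p * math.log2(p) for p in probs)                       # entropy, bits/symbol
L_avg    = sum(p * L for p, L in zip(probs, lengths))                  # average codeword length
eta      = H / L_avg                                                    # code efficiency
variance = sum(p * (L - L_avg) ** 2 for p, L in zip(probs, lengths))   # code variance

r = 1000            # assumed message (symbol) rate, symbols/second
R = r * H           # information rate, bits/second

print(f"H = {H:.3f} bits, L_avg = {L_avg:.2f} bits, efficiency = {eta:.3f}, variance = {variance:.3f}")
print(f"R = {R:.0f} bits/second")
```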
Huffman Coding
[Worked Huffman coding example on the slides: a source of 5 symbols encoded with a variable-length prefix code; the tree construction is not reproduced here.]
If we used an equal-length code, we would need 3 bits per codeword, thus 3 × 5 = 15 bits in total.
Here we used only 14 bits.
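A sketch of the standard Huffman construction in Python, using a min-heap to repeatedly merge the two least probable subtrees (the example probabilities are assumed, not taken from the slide):

```python
import heapq

def huffman_code(prob_by_symbol):
    """Build a Huffman code: repeatedly merge the two least probable subtrees."""
    # Each heap entry: (probability, tie-breaker, {symbol: codeword-so-far})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(prob_by_symbol.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        # Prefix '0' to one subtree's codewords and '1' to the other's
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Assumed example: 5 symbols
probs = {"a": 0.4, "b": 0.2, "c": 0.2, "d": 0.1, "e": 0.1}
print(huffman_code(probs))   # one optimal prefix code; exact codewords depend on tie-breaking
```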
Course Contents (part 1)
1. What is information theory?
– Measurement of Information
• Information Uncertainty
• Shannon's Entropy.
2. Source coding: Huffman Coding.
3. Shannon Channel Capacity, Shannon Limit.
4. Channel Coding
a) Linear Block Codes
– General Steps for Construction of Code
– Decoding the Received Code Words
– Error Correction (syndrome)
– Hamming codes
b) Convolutional Coding
– Convolutional encoder
» State diagram for encoder
» Trellis diagram for encoder
– Viterbi Decoder
Channel Capacity (Shannon)
Channel Capacity
• Channel Capacity is the maximum rate at which data can
be transmitted over a given communication channel,
under given conditions.
• We would like to get as high a data rate as possible, at a
particular limit on the error rate, for a given bandwidth.
• Problem: Given a bandwidth, what data rate can we
achieve?

– Nyquist Formula
• Assumes a noise-free channel

– Shannon Capacity Formula
• Assumes white noise
Channel Capacity
Nyquist Formula
Assume the channel is noise-free.
Nyquist formulation:
• Given a signal bandwidth B, the highest signaling rate is 2B.
• With multilevel signaling, the Nyquist formula becomes:

C = 2B log2(M)
– C is the channel capacity in bps.
– B is the given transmission bandwidth.
– M is the number of discrete signal levels.
Channel Capacity (Shannon Capacity Formula)
Channel Capacity
Shannon Capacity Formula
• Now consider the relationship among data rate, noise, and error rate.
• A faster data rate shortens each bit, so a burst of noise affects more bits.
– At a given noise level, a higher data rate results in a higher error rate.
• All of these concepts can be tied together neatly in a formula developed by Claude Shannon:

Capacity = BW log2(1 + SNR)
– Only white noise is assumed. Therefore it represents the theoretical maximum that can be
achieved. This is referred to as the error-free capacity.
• Some remarks:
– Given a level of noise, the channel capacity, and hence the data rate, can be increased
by increasing either the signal strength or the bandwidth.
– We would expect that a greater signal strength would improve the ability to receive
data correctly.
– But as the signal strength increases, so do the effects of nonlinearities in the system,
which lead to an increase in intermodulation noise.
– Because noise is assumed to be white, the wider the bandwidth, the more noise is
admitted to the system. Thus, as B increases, SNR decreases.
Example
• Consider an example that relates the Nyquist and Shannon
formulations. Suppose the spectrum of a channel is between 3
MHz and 4 MHz, and SNRdB = 24 dB. So,
B = 4 MHz – 3 MHz = 1 MHz
SNRdB = 24 dB = 10 log10(SNR) → SNR ≈ 251
• Using Shannon's formula, the capacity limit C is:
C = 10^6 × log2(1 + 251) ≈ 8 Mbps.
• If we want to achieve this limit, how many signaling levels are
required at least?
By Nyquist's formula: C = 2B log2(M)
We have 8 × 10^6 = 2 × 10^6 × log2(M) → M = 16.
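A small Python check of this worked example, using the same numbers:

```python
import math

B = 1e6                       # bandwidth in Hz (4 MHz - 3 MHz)
snr_db = 24
snr = 10 ** (snr_db / 10)     # ≈ 251 (linear)

# Shannon capacity: C = B * log2(1 + SNR)
C = B * math.log2(1 + snr)
print(f"Shannon capacity ≈ {C / 1e6:.2f} Mbps")   # ≈ 7.98 Mbps

# Nyquist: C = 2B * log2(M)  ->  levels needed to reach the Shannon limit
M = 2 ** (C / (2 * B))
print(f"Signal levels required: M ≈ {M:.1f}")      # ≈ 16
```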

Channel Capacity

Tradeoff between Bandwidth and Signal-to-Noise Ratio

Note:
B and W are used interchangeably for the channel bandwidth.
S and P are used interchangeably for the signal power.
Channel Capacity
• A noiseless channel has infinite capacity: as the noise power N → 0, SNR → ∞ and
C = B log2(1 + SNR) → ∞.

• An infinite-bandwidth channel has limited capacity: with white noise, N = N0·B grows with
the bandwidth, and as B → ∞ the capacity approaches C → (S/N0) log2(e) ≈ 1.44·S/N0.
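The limited capacity at infinite bandwidth can be illustrated with a short Python sketch (the signal power and noise density values are assumptions chosen for illustration):

```python
import math

S = 1e-3          # signal power in watts (assumed)
N0 = 1e-9         # noise power spectral density in W/Hz (assumed)

for B in [1e3, 1e4, 1e5, 1e6, 1e7]:          # bandwidth in Hz
    snr = S / (N0 * B)                        # SNR falls as bandwidth grows
    C = B * math.log2(1 + snr)
    print(f"B = {B:>10.0f} Hz  ->  C = {C / 1e3:8.1f} kbps")

# Limiting value as B -> infinity: C -> (S / N0) * log2(e)
print(f"Limit: {(S / N0) * math.log2(math.e) / 1e3:.1f} kbps")
```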


Channel Capacity
Rate/Bandwidth and SNR, Eb/N0 Trade-off
Shannon limit …
• The Shannon theorem puts a limit on the
transmission data rate, not on the error
probability:
– It is theoretically possible to transmit information at
any rate Rb ≤ C with an arbitrarily small error
probability by using a sufficiently complicated
coding scheme.

– For an information rate Rb > C, it is not possible
to find a code that can achieve an arbitrarily
small error probability.
Notes on the Tradeoff
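The tradeoff figure from this slide is not reproduced here; as a standard summary (not taken from the slide itself), the capacity formula can be rewritten in terms of spectral efficiency and Eb/N0, which yields the Shannon limit:

```latex
% Spectral efficiency form of the Shannon capacity (standard result).
% At the limit R_b = C, the signal power is S = E_b R_b and the noise power is N = N_0 B:
\frac{C}{B} \;=\; \log_2\!\left(1 + \frac{E_b}{N_0}\cdot\frac{C}{B}\right)
\quad\Longrightarrow\quad
\frac{E_b}{N_0} \;=\; \frac{2^{C/B} - 1}{C/B}
% As C/B \to 0 (bandwidth unlimited), E_b/N_0 \to \ln 2 \approx 0.693, i.e. -1.59 dB:
% the minimum E_b/N_0 for which reliable (arbitrarily small error) communication is possible.
```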
