Lecture 1

Introduction

• Shannon demonstrated that, by proper encoding of information, errors introduced by a noisy environment can be reduced to any desired level without sacrificing transmission rate, as long as the transmission rate is below the capacity of the channel.
• Since Shannon's work, much research has been done to find efficient encoding and decoding methods.
Introduction (2)

• Transmission and storage of digital information are two processes that transfer data from an information source to a destination.
[Block diagram: digital source → source coder → channel coder → modulator/writer → channel/memory → demodulator/reader → channel decoder → source decoder → sink (destination). The channel coder maps message m to codeword c; the demodulator/reader output c' is passed to the channel decoder, which produces the message estimate m'. The modulator, channel, and demodulator together form the equivalent channel seen by the coder.]
Types of Codes

• Two structurally different types of codes are typically used:
  • Block codes
    • Hamming
    • BCH, RS
    • LDPC
  • Convolutional codes
    • Turbo codes typically use convolutional codes as constituent codes
    • TCM based on convolutional codes
Block Codes
• A block of k digital symbols is input to the encoder, from which n digital symbols are output (typically n > k).

[Diagram: block encoder with k input symbols and n output symbols.]

• Each k-symbol message sequence is mapped to one of the M^k possible codewords. Since there are M^n possible M-ary sequences of length n, errors can be detected if an invalid codeword is received. For the binary Hamming (7,4) code, for example, only 2^4 = 16 of the 2^7 = 128 length-7 sequences are codewords.
Block Codes (2)

• Code rate R = k/n.
• The message sequence carries k symbols of information.
• The codeword, which carries k symbols of information, is made up of n symbols.
• There are (n − k) redundant symbols.
Convolutional Codes

• A convolutional code also produces n symbols for k input symbols.
• However, the output depends not only on the current k inputs but also on the km previous inputs, where m is the encoder memory.
• The encoder is implemented by a sequential logic circuit, as in the sketch below.
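
For concreteness, here is a minimal Python sketch of such a sequential encoder. The rate 1/2 (k = 1, n = 2), memory m = 2, and octal generators (7, 5) are illustrative assumptions, not parameters from the lecture.

```python
# A minimal sketch of a rate-1/2 convolutional encoder (k = 1, n = 2).
# The memory m = 2 and the generator polynomials (7, 5) in octal are
# illustrative assumptions, not taken from the lecture.

def conv_encode(bits, g1=0b111, g2=0b101, m=2):
    """Encode a bit list; each input bit produces two output bits."""
    state = 0  # the m previous input bits, held in a shift register
    out = []
    for b in bits:
        reg = (b << m) | state                    # current bit + m previous bits
        out.append(bin(reg & g1).count("1") % 2)  # parity against g1
        out.append(bin(reg & g2).count("1") % 2)  # parity against g2
        state = reg >> 1                          # shift: drop the oldest bit
    return out

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```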
Modulation and Coding

• Symbol rate = R_s; signaling interval T = 1/R_s.
• For each symbol, the modulator selects a waveform of duration T to represent the symbol to be transmitted.
• Example (BPSK):

s_0(t) = \sqrt{2E_b/T} \cos(2\pi f_c t + \pi), \quad 0 \le t \le T
s_1(t) = \sqrt{2E_b/T} \cos(2\pi f_c t), \quad 0 \le t \le T
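
A small numerical sketch of these two waveforms (the values Eb = 1, T = 1, fc = 4 Hz are illustrative assumptions):

```python
# Sketch: generate the two BPSK waveforms s0(t), s1(t) from the slide.
# Eb = 1, T = 1, and fc = 4 Hz are illustrative assumptions.
import numpy as np

Eb, T, fc = 1.0, 1.0, 4.0
t = np.linspace(0.0, T, 500, endpoint=False)

s0 = np.sqrt(2 * Eb / T) * np.cos(2 * np.pi * fc * t + np.pi)  # symbol 0
s1 = np.sqrt(2 * Eb / T) * np.cos(2 * np.pi * fc * t)          # symbol 1

# The pi phase offset makes the waveforms antipodal: s0 = -s1.
assert np.allclose(s0, -s1)
```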
Modulation and Coding (2)
• The transmitted signal is:

s(t) = \sum_{n=0}^{N} s_i(t - nT)

where i = 0, 1, …, M-1 and is random (i = 0 or 1 in the binary case).
• The received signal is:

r(t) = a(t) \, s(t - \tau(t)) + n(t)

where a(t) is the time-varying channel gain, \tau(t) is the delay introduced by the channel, and n(t) is additive noise.
Modulation and Coding (3)

• AWGN channel:
  • a(t) = a and \tau(t) = \tau (both constant).
• Flat Rayleigh fading:
  • a(t) is a time-varying Rayleigh envelope.
  • \tau(t) introduces a time-varying phase shift.
• Noise introduces detection errors at the receiver; a simulation sketch follows below.
• The error rate is a function of E_s/N_0.
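
A minimal Monte Carlo sketch of this effect for BPSK over AWGN (the symbol count and the Es/No grid are illustrative assumptions):

```python
# Sketch: BPSK over AWGN, showing that the detection error rate falls
# as Es/No grows. Symbol count and Es/No grid are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_sym = 200_000

for esno_db in (0, 3, 6, 9):
    esno = 10 ** (esno_db / 10)
    bits = rng.integers(0, 2, n_sym)
    x = 2 * bits - 1                     # BPSK mapping: 0 -> -1, 1 -> +1, Es = 1
    noise = rng.normal(0, np.sqrt(1 / (2 * esno)), n_sym)  # variance No/2
    ber = np.mean((x + noise > 0).astype(int) != bits)
    print(f"Es/No = {esno_db} dB: simulated BER = {ber:.4f}")
```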
Modulation and Coding (4)

• The BER for BPSK in AWGN is:

P_b = Q\left(\sqrt{2E_b/N_0}\right)

• The BER for BPSK in slow, flat Rayleigh fading with ideal channel phase compensation is:

P_b = \frac{1}{2}\left[1 - \sqrt{\frac{\bar{\gamma}_b}{1 + \bar{\gamma}_b}}\right]

where \bar{\gamma}_b is the average received SNR per bit.
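
Both formulas are easy to evaluate numerically; a sketch (the Eb/No grid is an illustrative assumption):

```python
# Sketch: evaluate the two BER formulas above.
import math

def Q(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

for ebno_db in (0, 5, 10, 15):
    g = 10 ** (ebno_db / 10)                    # Eb/No as a linear ratio
    p_awgn = Q(math.sqrt(2 * g))                # BPSK in AWGN
    p_ray = 0.5 * (1 - math.sqrt(g / (1 + g)))  # slow flat Rayleigh fading
    print(f"Eb/No = {ebno_db:2d} dB: AWGN {p_awgn:.2e}, Rayleigh {p_ray:.2e}")
```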
Modulation and Coding (5)

• Coding increases the symbol rate (k information bits without coding, n code bits after coding).
• For the same transmitted power, the code bit energy is less than the uncoded bit energy: E_c = R E_b = (k/n) E_b.
• Therefore, the probability that a code bit is incorrectly detected is higher than the probability that an uncoded bit is incorrectly detected.
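
As a worked example with the Hamming (7,4) code used below:

E_c = \frac{4}{7} E_b, \quad 10\log_{10}\left(\frac{7}{4}\right) \approx 2.4 \text{ dB}

i.e., each transmitted code bit carries about 2.4 dB less energy than an uncoded bit.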
Modulation and Coding (6)

• Coded data streams provide improved bit error rates after decoding, due to the error correction capabilities of the code.
Example Hamming (7,4)

• Message-to-codeword mapping:

message  codeword     message  codeword
0000     0000000      1000     1000110
0001     0001101      1001     1001011
0010     0010111      1010     1010001
0011     0011010      1011     1011100
0100     0100011      1100     1100101
0101     0101110      1101     1101000
0110     0110100      1110     1110010
0111     0111001      1111     1111111
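
The table is a systematic linear code (each codeword is the message followed by three parity bits), so it can be reproduced from a generator matrix. A minimal sketch; G is inferred from the table's own rows, not given on the slides:

```python
# Sketch: reproduce the Hamming (7,4) table from its generator matrix
# G = [I4 | P]. The rows of G are the codewords for messages 1000,
# 0100, 0010, 0001 read directly off the table.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

for m in range(16):
    msg = np.array([(m >> i) & 1 for i in (3, 2, 1, 0)])  # MSB first
    cw = msg @ G % 2                                      # encoding over GF(2)
    print("".join(map(str, msg)), "".join(map(str, cw)))
```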
Example Hamming (7,4)

• Assume that we transmit 0000 in the uncoded case.
  • If the first bit is incorrectly detected, we receive 1000, which is a valid message: the error goes unnoticed.
• Assume that we transmit 0000000 in the coded case.
  • If the first bit is detected in error, we receive 1000000, which is not a valid codeword.
  • The error has been detected.
  • Codeword 0000000 differs from 1000000 in only one bit position, while all other codewords differ from 1000000 in at least two positions, so a minimum-distance decoder corrects the received word back to 0000000 (see the syndrome-decoding sketch below).
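
A minimal sketch of this single-error correction using syndrome decoding. The parity-check matrix H = [P^T | I_3] is derived from the generator matrix inferred above; it is not given on the slides:

```python
# Sketch: syndrome decoding for the Hamming (7,4) code above.
# H = [P^T | I3] satisfies G H^T = 0 for the G inferred from the table.
import numpy as np

H = np.array([[1, 0, 1, 1, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

r = np.array([1, 0, 0, 0, 0, 0, 0])       # received word: 1000000
s = H @ r % 2                              # syndrome (zero iff r is a codeword)
if s.any():
    # A single-bit error produces the column of H at the error position.
    pos = next(j for j in range(7) if (H[:, j] == s).all())
    r[pos] ^= 1                            # flip the erroneous bit
print("decoded codeword:", "".join(map(str, r)))   # -> 0000000
```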
Example Hamming (7,4)

• Assuming independent errors:
  • P(uncoded word error) = 1 - (1 - P_u)^4
  • P(coded word error) = 1 - (1 - P_c)^7 - 7 P_c (1 - P_c)^6
• In an AWGN channel:

P_u = Q\left(\sqrt{2E_b/N_0}\right), \quad P_c = Q\left(\sqrt{2(4/7)E_b/N_0}\right)
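
A sketch evaluating these formulas, which reproduces the comparison plotted on the next slide (the Eb/No grid is an illustrative assumption):

```python
# Sketch: word error rates for uncoded vs. Hamming (7,4) coded BPSK.
import math

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

for ebno_db in (0, 3, 6, 9):
    g = 10 ** (ebno_db / 10)
    pu = Q(math.sqrt(2 * g))                 # uncoded bit error probability
    pc = Q(math.sqrt(2 * (4 / 7) * g))       # code bit error probability
    wer_u = 1 - (1 - pu) ** 4
    wer_c = 1 - (1 - pc) ** 7 - 7 * pc * (1 - pc) ** 6
    print(f"Eb/No = {ebno_db} dB: uncoded WER {wer_u:.2e}, coded WER {wer_c:.2e}")
```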

Example Hamming (7,4) WER
[Figure: Word Error Rate vs. Eb/No (dB) from 0 to 9 dB; WER axis from 1.00E+00 down to 1.00E-04; curves for uncoded and coded transmission.]
Example Hamming (7,4) BER

• BER:
  • Uncoded: P_b = P_u.
  • Coded:

P_b = 9 P_c^2 (1 - P_c)^5 + 19 P_c^3 (1 - P_c)^4 + 16 P_c^4 (1 - P_c)^3 + 12 P_c^5 (1 - P_c)^2 + 7 P_c^6 (1 - P_c) + P_c^7
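
A sketch evaluating the coded BER polynomial against the uncoded BER (the Eb/No grid is an illustrative assumption):

```python
# Sketch: bit error rates for uncoded vs. Hamming (7,4) coded BPSK,
# using the coefficients of the polynomial on this slide.
import math

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

coeffs = {2: 9, 3: 19, 4: 16, 5: 12, 6: 7, 7: 1}   # from the slide

for ebno_db in (3, 6, 9):
    g = 10 ** (ebno_db / 10)
    pu = Q(math.sqrt(2 * g))
    pc = Q(math.sqrt(2 * (4 / 7) * g))
    pb = sum(c * pc ** i * (1 - pc) ** (7 - i) for i, c in coeffs.items())
    print(f"Eb/No = {ebno_db} dB: uncoded BER {pu:.2e}, coded BER {pb:.2e}")
```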
Example Hamming (7,4) BER
[Figure: Bit Error Rate vs. Eb/No (dB) from 0 to 9 dB; BER axis from 1.00E+00 down to 1.00E-04; curves for uncoded and coded transmission.]
