16.548 Notes 15: Concatenated Codes, Turbo Codes and Iterative Processing

This document discusses iterative processing and its application to error correction codes like turbo codes. It begins by introducing the concept of iterative decoding and its ability to approach the theoretical channel capacity limit. It then provides background on concatenated codes and how turbo codes are a type of parallel concatenated code that uses recursive systematic convolutional encoders and pseudo-random interleaving. The rest of the document discusses details of turbo code encoding, iterative decoding, performance improvements from additional iterations, and tradeoffs between complexity, latency, and performance.


Outline
! Introduction
" Pushing the Bounds on Channel Capacity
" Theory of Iterative Decoding
" Recursive Convolutional Coding
" Theory of Concatenated codes
! Turbo codes
" Encoding
" Decoding
" Performance analysis
" Applications
! Other applications of iterative processing
" Joint equalization/FEC
" Joint multiuser detection/FEC
Shannon Capacity Theorem
Capacity as a function of Code
rate
Motivation: Performance of Turbo Codes
! Comparison:
" Rate 1/2 codes.
" K=5 turbo code.
" K=14 convolutional code.
! Plot is from:
L. Perez, "Turbo Codes", chapter 8
of Trellis Coding by C. Schlegel.
IEEE Press, 1997.
[Plot: BER vs. Eb/No for the rate 1/2 codes, showing the turbo code approaching the theoretical (capacity) limit and gaining almost 2 dB over the convolutional code.]


Power Efficiency of Existing
Standards
Error Correction Coding
! Channel coding adds structured redundancy to a
transmission.
[Block diagram: message m → Channel Encoder → code word x]

" The input message m is composed of K symbols.


" The output code word x is composed of N symbols.
" Since N > K there is redundancy in the output.
" The code rate is r = K/N.
! Coding can be used to:
" Detect errors: ARQ
" Correct errors: FEC
Traditional Coding Techniques
The Turbo-Principle/Iterative
Decoding
! Turbo codes get their name because the decoder uses
feedback, like a turbo engine.
Theory of Iterative Coding
Theory of Iterative Coding (2)
Theory of Iterative Coding (3)
Theory of Iterative Coding (4)
Log Likelihood Algebra
New Operator Modulo 2 Addition
Example: Product Code
Iterative Product Decoding
RSC vs NSC
Recursive/Systematic Convolutional Coding
Concatenated Coding
! A single error correction code does not always provide
enough error protection with reasonable complexity.
! Solution: Concatenate two (or more) codes
" This creates a much more powerful code.
! Serial Concatenation (Forney, 1966)

[Block diagram: Outer Encoder → Block Interleaver → Inner Encoder → Channel → Inner Decoder → De-interleaver → Outer Decoder]
Concatenated Codes (2)
Alternative to Concatenated
Coding
Interleaver/Deinterleaver
Concatenated interleaver and De-
interleaver
Structure of Concatenated
Interleaver system
Iterative concatenated decoding
Turbo Codes
! Background
" Turbo codes were proposed by Berrou and Glavieux at the
1993 International Conference on Communications (ICC '93).
" Performance within 0.5 dB of the channel capacity limit for
BPSK was demonstrated.
! Features of turbo codes
" Parallel concatenated coding
" Recursive convolutional encoders
" Pseudo-random interleaving
" Iterative decoding
The building blocks of turbo
codes
! Recursive systematic codes
! Parallel Concatenation plus puncturing
! Interleaving
Recursive Systematic
Convolutional Encoding
! An RSC encoder can be constructed from a standard
convolutional encoder by feeding back one of the outputs.
! An RSC encoder has an infinite impulse response.
! An arbitrary input will cause a "good" (high weight)
output with high probability.
! Some inputs will cause "bad" (low weight) outputs.
[Figure: a standard (non-recursive) constraint length K=3 convolutional encoder with input m_i, two delay elements D, and outputs x_i^(0), x_i^(1); and the corresponding RSC encoder, in which the feedback r_i makes x_i^(0) = m_i the systematic output and x_i^(1) the parity output.]
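
A minimal software sketch of a K=3 RSC encoder; the generator polynomials used (feedback 7, feedforward 5, in octal) are a common textbook choice and may differ from the taps drawn in the figure:

```python
def rsc_encode(message_bits):
    """Constraint length K=3 recursive systematic convolutional encoder.
    Assumed generators: feedback 1 + D + D^2, feedforward 1 + D^2
    ((1, 5/7) in octal); the figure's exact taps may differ."""
    s1 = s2 = 0                       # two memory elements (D, D^2)
    systematic, parity = [], []
    for m in message_bits:
        a = m ^ s1 ^ s2               # recursive feedback bit
        systematic.append(m)          # systematic output: the input bit itself
        parity.append(a ^ s2)         # parity output from the feedforward taps
        s1, s2 = a, s1                # shift the register
    return systematic, parity
```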
Parallel Concatenated Codes
! Instead of concatenating in serial, codes can also be
concatenated in parallel.
! The original turbo code is a parallel concatenation of two
recursive systematic convolutional (RSC) codes.
" systematic: one of the outputs is the input.

[Block diagram: the input feeds RSC Encoder #1 directly and RSC Encoder #2 through an interleaver; the systematic output and the two parity outputs are multiplexed into the code word.]
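
A structural sketch of this parallel concatenation, building on the rsc_encode sketch above; the unpunctured rate is 1/3, and the alternating puncture pattern shown is one common way to reach rate 1/2, not necessarily the notes' exact pattern:

```python
def turbo_encode(message_bits, interleaver, puncture=True):
    """Parallel concatenation of two RSC encoders through an interleaver.
    `interleaver` is a permutation of range(len(message_bits))."""
    systematic, parity1 = rsc_encode(message_bits)
    permuted = [message_bits[i] for i in interleaver]
    _, parity2 = rsc_encode(permuted)      # second encoder sees interleaved data
    if puncture:
        # Keep parity bits alternately from each encoder -> overall rate 1/2.
        parity1 = [p for i, p in enumerate(parity1) if i % 2 == 0]
        parity2 = [p for i, p in enumerate(parity2) if i % 2 == 1]
    return systematic, parity1, parity2
```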
Parallel Concatenation of RSC codes
Pseudo-random Interleaving
! The coding dilemma:
" Shannon showed that large block-length random codes achieve
channel capacity.
" However, codes must have structure that permits decoding with
reasonable complexity.
" Codes with structure don’t perform as well as random codes.
" “Almost all codes are good, except those that we can think of.”
! Solution:
" Make the code appear random, while maintaining enough
structure to permit decoding.
" This is the purpose of the pseudo-random interleaver.
" Turbo codes possess random-like properties.
" However, since the interleaving pattern is known, decoding is
possible.
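
A minimal sketch of a fixed pseudo-random interleaver and its inverse; the seed and the use of NumPy are illustrative choices, not from the notes:

```python
import numpy as np

def make_interleaver(length, seed=2023):
    """Pseudo-random permutation shared by encoder and decoder.
    Because the pattern is fixed and known, deinterleaving is possible."""
    pi = np.random.default_rng(seed).permutation(length)
    inverse = np.argsort(pi)          # inverse permutation: inverse[pi[i]] == i
    return pi, inverse
```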
Why Interleaving and
Recursive Encoding?
! In coded systems:
" Performance is dominated by low weight code words.
! A “good” code:
" will produce low weight outputs with very low probability.
! An RSC code:
" Produces low weight outputs with fairly low probability.
" However, some inputs still cause low weight outputs.
! Because of the interleaver:
" The probability that both encoders have inputs that cause
low weight outputs is very low.
" Therefore the parallel concatenation of both encoders will
produce a “good” code.
Theory of Turbo-decoding
Turbo decoding (2)
Iterative Decoding
[Block diagram: the received systematic data and parity data are demultiplexed and fed to APP Decoder #1 and (through an interleaver) APP Decoder #2; the two decoders exchange soft information through an interleaver/deinterleaver pair, and hard bit decisions are produced after the final iteration.]
! There is one decoder for each elementary encoder.


! Each decoder estimates the a posteriori probability (APP) of
each data bit.
! The APP’s are used as a priori information by the other
decoder.
! Decoding continues for a set number of iterations.
" Performance generally improves from iteration to iteration, but
follows a law of diminishing returns.
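
A structural sketch of this exchange (my own skeleton, not code from the notes); `app_decode` stands for a soft-in/soft-out constituent decoder such as log-MAP/BCJR, which is assumed rather than implemented here:

```python
import numpy as np

def turbo_decode(llr_sys, llr_par1, llr_par2, app_decode, pi, inv, n_iterations=8):
    """Iterative exchange of extrinsic information between two APP decoders.
    `app_decode(sys_llr, par_llr, apriori_llr)` is assumed to return a-posteriori
    LLRs for the data bits of one constituent RSC code."""
    apriori = np.zeros_like(llr_sys)
    ext1 = np.zeros_like(llr_sys)
    for _ in range(n_iterations):
        # Decoder 1: natural order; its extrinsic output becomes decoder 2's a-priori.
        post1 = app_decode(llr_sys, llr_par1, apriori)
        ext1 = post1 - llr_sys - apriori
        # Decoder 2: interleaved order; its extrinsic output is deinterleaved
        # and fed back to decoder 1 on the next iteration.
        post2 = app_decode(llr_sys[pi], llr_par2, ext1[pi])
        ext2 = post2 - llr_sys[pi] - ext1[pi]
        apriori = ext2[inv]
    # Hard decisions from the combined LLRs after the set number of iterations.
    return (llr_sys + ext1 + apriori) > 0
```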
The log-MAP algorithm
[Trellis diagram: four states S0–S3 over stages i = 0, ..., 6, with branches labeled input/output bits (e.g. 0/00, 1/10, 1/11); the algorithm computes forward metrics α(s_i), branch metrics γ(s_i → s_{i+1}), and backward metrics β(s_{i+1}) over this trellis.]

The log-MAP algorithm:
! Performs arithmetic in the log domain.
! Multiplications become additions.
! Additions use the Jacobian logarithm:
ln(e^x + e^y) = max(x, y) + ln(1 + e^(−|y − x|))
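
A one-line implementation of this "max-star" operation (a sketch; the notes do not give code):

```python
import numpy as np

def max_star(x, y):
    """Jacobian logarithm: ln(e^x + e^y) = max(x, y) + ln(1 + e^-|y - x|),
    evaluated stably in the log domain; the correction term is small
    when |x - y| is large."""
    return np.maximum(x, y) + np.log1p(np.exp(-np.abs(x - y)))
```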


Decoding with a feedback loop
Iterative Turbo Decoder
Performance as a Function of Number of Iterations
! K=5
! r=1/2
! L=65,536
[Plot: BER vs. Eb/No (0.5 to 2 dB) after 1, 2, 3, 6, 10, and 18 iterations; each additional iteration lowers the BER curve, with diminishing returns.]
Another Example
Performance Factors
and Tradeoffs
! Complexity vs. performance
" Decoding algorithm.
" Number of iterations.
" Encoder constraint length
! Latency vs. performance
" Frame size.
! Spectral efficiency vs. performance
" Overall code rate
! Other factors
" Interleaver design.
" Puncture pattern.
" Trellis termination.
Performance Bounds for Linear Block Codes
! Union bound for soft-decision decoding:
P_b ≤ Σ_{i=1}^{2^N} (w_i / N) · Q( √( d_i · 2rE_b/N_o ) )
! For convolutional and turbo codes this becomes:
P_b ≤ Σ_{d=d_free}^{n(m+N)} ( N_d · w̃_d / N ) · Q( √( d · 2rE_b/N_o ) )
! The free-distance asymptote is the first term of the sum:
P_b ≈ ( N_free · w̃_free / N ) · Q( √( d_free · 2rE_b/N_o ) )
! For convolutional codes N is unbounded and:
P_b ≈ W_{d_0} · Q( √( d_free · 2rE_b/N_o ) )
Free-distance Asymptotes
! For the convolutional code:
" dfree = 18
" Wd0 = 187
" P_b ≈ 187 · Q( √( 18 E_b/N_o ) )
! For the turbo code:
" dfree = 6
" Nfree = 3
" wfree = 2
" P_b ≈ (3 · 2 / 65536) · Q( √( 6 E_b/N_o ) )
[Plot: BER vs. Eb/No (0.5 to 4 dB) comparing the simulated convolutional code and turbo code with their respective free-distance asymptotes.]
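
These asymptotes are easy to evaluate numerically; a sketch using the parameter values above (SciPy's erfc is an assumed dependency, and the plugged-in values come from the slide):

```python
import numpy as np
from scipy.special import erfc

def Q(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2))

def cc_asymptote(ebno_db, r=0.5, d_free=18, W=187):
    """Free-distance asymptote of the K=14, r=1/2 convolutional code."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return W * Q(np.sqrt(d_free * 2 * r * ebno))

def tc_asymptote(ebno_db, r=0.5, d_free=6, N_free=3, w_free=2, N=65536):
    """Free-distance asymptote of the K=5, L=65,536 turbo code
    (it dominates the turbo code's error-floor region)."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return (N_free * w_free / N) * Q(np.sqrt(d_free * 2 * r * ebno))
```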
Application: Turbo Codes for
Wireless Multimedia
! Multimedia systems require varying quality of service (QoS):
" Latency
# Low latency for voice, teleconferencing
" Bit/frame error rate (BER, FER)
# Low BER for data transmission.
! The tradeoffs inherent in turbo codes match with the
tradeoffs required by multimedia systems.
" Data: use large frame sizes
# Low BER, but long latency
" Voice: use small frame sizes
# Short latency, but higher BER
Influence of Interleaver Size
! Constraint Length 5.
! Rate r = 1/2.
! Log-MAP decoding.
! 18 iterations.
! AWGN Channel.
! Interleaver sizes L = 1,024; 4,096; 16,384; 65,536.
[Plot: BER vs. Eb/No (0.5 to 2.5 dB) for the four interleaver sizes; larger interleavers give lower BER, and the curves are annotated with application operating points: voice, video conferencing, replayed video, and data.]
Application: Turbo Codes for
Fading Channels
! The turbo decoding algorithm requires accurate
estimates of channel parameters:
" Branch metric:
γ ( s i → s i + 1 ) = ln P [ m i ] + z is x is + z ip x ip
 4 a i* E s  2
z i =   ri = 2 ri a i*
 No  σ
" Average signal-to-noise ratio (SNR).
" Fading amplitude.
" Phase.
! Because turbo codes operate at low SNR, conventional
methods for channel estimation often fail.
" Therefore channel estimation and tracking is a critical issue
with turbo codes.
Fading Channel Model
! Antipodal modulation: s_k ∈ {−1, +1}
! Gaussian noise: P_n = N_o / (2E_s)
! Complex fading: a_k = (α + X_k) + j·Y_k
" α is a constant.
# α = 0 for Rayleigh fading
# α > 0 for Rician fading
" X and Y are Gaussian random processes with autocorrelation
R(k) = J_0( 2π · f_d · T_s · k )
[Block diagram: Turbo Encoder → Interleaver → BPSK Modulator → channel (multiplicative fading a_k plus AWGN n_k) → BPSK Demod → De-interleaver → Turbo Decoder]
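
A small sketch generating fading samples of this form; the i.i.d. draws below ignore the J_0 Doppler autocorrelation, which would require an additional Doppler-shaping filter, and the parameter defaults are illustrative:

```python
import numpy as np

def fading_samples(n, alpha=0.0, sigma=np.sqrt(0.5), seed=1):
    """Complex fading a_k = (alpha + X_k) + j*Y_k.
    alpha = 0 gives Rayleigh fading, alpha > 0 gives Rician fading.
    X_k, Y_k are drawn i.i.d. here for simplicity; reproducing the slide's
    autocorrelation R(k) = J0(2*pi*fd*Ts*k) would require Doppler filtering."""
    rng = np.random.default_rng(seed)
    x = sigma * rng.standard_normal(n)
    y = sigma * rng.standard_normal(n)
    return (alpha + x) + 1j * y
```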
Pilot Symbol Assisted
Modulation
! Pilot symbols:
" Known values that are periodically inserted into the transmitted
code stream.
" Used to assist the operation of a channel estimator at the
receiver.
" Allow for coherent detection over channels that are unknown and
time varying.
[Figure: the transmitted code stream is divided into segments of M_p symbols, and a known pilot symbol is inserted into each segment.]
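
A minimal sketch of pilot insertion; the pilot value, the default M_p, and the placement at the start of each segment are illustrative assumptions:

```python
import numpy as np

def insert_pilots(data_symbols, Mp=16, pilot=1.0 + 0j):
    """Insert one known pilot symbol per block of Mp data symbols so the
    receiver can estimate an unknown, time-varying channel coherently."""
    blocks = [data_symbols[i:i + Mp] for i in range(0, len(data_symbols), Mp)]
    return np.concatenate([np.r_[pilot, block] for block in blocks])
```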


Pilot Symbol Assisted
Turbo Decoding
! Desired statistic: Re{ (2/σ²) · r_k · a_k* }
! Initial estimates â_k are found using pilot symbols only.
! Estimates for later iterations also use data decoded with high reliability.
! "Decision directed" estimation.
[Block diagram: transmitter chain Turbo Encoder → Channel Interleaver → Insert Pilot Symbols → fading a_k and noise n_k; the receiver filters the pilot observations to form channel estimates â_k^(q), computes Re{ (2/σ²) · r_k · â_k^(q)* }, removes the pilot symbols, channel-deinterleaves, and turbo decodes; hard decisions x̂_i^(q), obtained by thresholding the decoder outputs Λ_i^(q), are channel-interleaved, combined with the pilot symbols, and used to refine the channel estimate for iteration q+1.]
Performance of Pilot Symbol Assisted Decoding
! Simulation parameters:
" Rayleigh flat-fading.
" r = 1/2, K = 3.
" 1,024 bit random interleaver.
" 8 iterations of log-MAP.
" fdTs = 0.005
" Mp = 16
! Receivers compared: DPSK with differential detection; BPSK with estimation prior to decoding; BPSK with refined estimation; BPSK with perfect channel estimates.
! Estimation prior to decoding degrades performance by 2.5 dB.
! Estimation during decoding only degrades performance by 1.5 dB.
! Noncoherent reception degrades performance by 5 dB.
[Plot: BER vs. Eb/No (0 to 10 dB) for the four receivers.]
Other Applications
of Turbo Decoding
! The turbo-principle is more general than merely its
application to the decoding of turbo codes.
! The “Turbo Principle” can be described as:
" “Never discard information prematurely that may be useful
in making a decision until all decisions related to that
information have been completed.”
-Andrew Viterbi
" “It is a capital mistake to theorize before you have all the
evidence. It biases the judgement.”
-Sir Arthur Conan Doyle
! Can be used to improve the interface in systems that
employ multiple trellis-based algorithms.
Applications of the
Turbo Principle
! Other applications of the turbo principle include:
" Decoding serially concatenated codes.
" Combined equalization and error correction decoding.
" Combined multiuser detection and error correction
decoding.
" (Spatial) diversity combining for coded systems in the
presence of MAI or ISI.
Serial Concatenated Codes
! The turbo decoder can also be used to decode serially
concatenated codes.
" Typically two convolutional codes.

[Block diagram: Data → Outer Convolutional Encoder → Interleaver → Inner Convolutional Encoder → AWGN channel (+ n(t)); the turbo decoder iterates between the inner APP decoder and the outer decoder through an interleaver/deinterleaver pair to produce the estimated data.]
Performance of Serial
Concatenated Turbo Code
! Plot is from:
S. Benedetto et al., "Serial Concatenation
of Interleaved Codes: Performance
Analysis, Design, and Iterative Decoding,"
Proc. Int. Symp. on Info. Theory, 1997.

! Rate r=1/3.
! Interleaver size L = 16,384.
! K = 3 encoders.
! Serial concatenated codes
do not seem to have a bit
error rate floor.
Turbo Equalization
! The “inner code” of a serial concatenation could be an
Intersymbol Interference (ISI) channel.
" ISI channel can be interpreted as a rate 1 code defined
over the field of real numbers.
[Block diagram: Data → (Outer) Convolutional Encoder → Interleaver → ISI Channel (+ AWGN n(t)); the turbo equalizer iterates between a SISO equalizer and the outer SISO decoder through an interleaver/deinterleaver pair to produce the estimated data.]
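
To make the "rate-1 code over the reals" view concrete, here is a tiny sketch of an ISI channel as a real-valued convolution plus noise; the tap values and noise level are illustrative, not those of the cited paper:

```python
import numpy as np

def isi_channel(symbols, taps=(0.407, 0.815, 0.407), noise_std=0.1, seed=5):
    """ISI channel viewed as a rate-1 'encoder' over the reals: every output
    sample is a linear combination of the current and past input symbols,
    plus AWGN. A SISO equalizer can then be treated as the inner decoder."""
    rng = np.random.default_rng(seed)
    clean = np.convolve(symbols, taps)[: len(symbols)]   # causal filtered output
    return clean + noise_std * rng.standard_normal(len(clean))
```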
Performance of Turbo Equalizer
! Plot is from:
C. Douillard et al., "Iterative Correction of
Intersymbol Interference: Turbo-
Equalization," European Transactions on
Telecommunications, Sept./Oct. 1997.

! M=5 independent
multipaths.
" Symbol spaced paths
" Stationary channel.
" Perfectly known channel.
! (2,1,5) convolutional code.
Turbo Multiuser Detection
! The “inner code” of a serial concatenation could be a
multiple-access interference (MAI) channel.
" MAI channel describes the interaction between K
nonorthogonal users sharing the same channel.
" MAI channel can be thought of as a time varying ISI
channel.
" MAI channel is a rate 1 code with time-varying coefficients
over the field of real numbers.
" The input to the MAI channel consists of the encoded and
interleaved sequences of all K users in the system.
! MAI channel can be:
" CDMA: Code Division Multiple Access
" TDMA: Time Division Multiple Access
System Diagram
[Block diagram: each of the K users' data d_1, ..., d_K is encoded by its own convolutional encoder and interleaver (the "multiuser interleaver"), and the parallel streams b_1, ..., b_K enter the MAI channel with AWGN n(t); the receiver iterates between an APP SISO multiuser detector (MUD) and a bank of K SISO decoders, exchanging soft information Λ^(q), Ψ^(q) through the multiuser interleaver/deinterleaver, to produce the estimated data d̂^(q).]
Simulation Results:
MAI Channel w/ AWGN
! From:
" M. Moher, “An iterative algorithm
for asynchronous coded multiuser
detection,” IEEE Comm. Letters,
Aug. 1998.
! Generic MA system
" K=3 asynchronous users.
" Identical pulse shapes.
" Each user has its own interleaver.
! Convolutionally coded.
" Constraint length 3.
" Code rate 1/2.
! Iterative decoder.
Conclusion
! Turbo code advantages:
" Remarkable power efficiency in AWGN and flat-fading
channels for moderately low BER.
" Deign tradeoffs suitable for delivery of multimedia services.
! Turbo code disadvantages:
" Long latency.
" Poor performance at very low BER.
" Because turbo codes operate at very low SNR, channel
estimation and tracking is a critical issue.
! The principle of iterative or “turbo” processing can be
applied to other problems.
" Turbo-multiuser detection can improve performance of
coded multiple-access systems.
