Unit IV Convolutional Codes
• Introduction to convolutional codes: R1, 7.1 to 7.3 (pages 381 to 408)
• Polynomial description of a convolutional code: T2, Chapter 6
• Generator matrix of a convolutional code
• State diagram
• Tree diagram
• Trellis diagram
• Sequential decoding and Viterbi decoding: R1 (page 418)
• Known good convolutional codes: T2, Chapter 7 (pages 277 to 278)
• Introduction to LDPC codes
• Turbo codes: T2, Chapter 7 (pages 209 to 212)

T2: J. C. Moreira, P. G. Farrell, "Essentials of Error-Control Coding", Wiley Student Edition.
Introduction of Convolutional Codes
[Figure: an (n, k, m) convolutional encoder with k input bits and n output bits.]
• The output depends not only on the current set of k input bits, but also on the m past inputs.
• The encoder is a linear sequential circuit with input memory m.
Convolutional encoder representation
• Connection representation
• Connection vector or Polynomial representation
• The State diagram
• The tree diagram
• The trellis diagram
Connection Representation
[Figure: a single input bit (k = 1) is shifted through the registers C, B, A; the two outputs (n = 2) are X = C + B + A and Y = C + A (modulo-2 additions). The example input stream is .....010100101, of length L.]

CODE RATE r:
An L-bit message sequence produces a coded output sequence of length n(L + M) bits, so
r = L / (n(L + M)) = number of message bits / number of bits transmitted.
For L >> M, r ≈ L / (nL) = 1/n bits per symbol.
For example, with n = 2, L = 100 and M = 2, r = 100/204 ≈ 0.49, close to 1/2.
Example: encoder with outputs X = C + B + A and Y = C + A (mod 2)
Message sequence: m = (1 0 1)

Time | Register contents (C B A) | Output branch word (v1 v2)
t1   | 1 0 0                     | 1 1
t2   | 0 1 0                     | 1 0
t3   | 1 0 1                     | 0 0
t4   | 0 1 0                     | 1 0
t5   | 0 0 1                     | 1 1
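The encoding steps above can be reproduced with a short Python sketch of this rate-1/2 encoder; the function name, the tap-vector arguments and the two appended flush bits are illustrative choices, not from the slides:

```python
def conv_encode(msg_bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder with constraint length 3.

    g1, g2 are the tap vectors acting on (current bit, previous bit,
    bit before that), i.e. the registers C, B, A of the figure.
    Two zero flush bits are appended to return the encoder to state 00.
    """
    state = [0, 0]                      # registers B, A (the two past inputs)
    out = []
    for c in list(msg_bits) + [0, 0]:   # message followed by flush bits
        window = [c] + state            # C, B, A
        x = sum(w * t for w, t in zip(window, g1)) % 2   # X = C + B + A (mod 2)
        y = sum(w * t for w, t in zip(window, g2)) % 2   # Y = C + A     (mod 2)
        out += [x, y]
        state = [c, state[0]]           # shift: new B = C, new A = old B
    return out

print(conv_encode([1, 0, 1]))   # -> [1,1, 1,0, 0,0, 1,0, 1,1], i.e. U = 11 10 00 10 11
```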
EXAMPLE (transform-domain approach):
Encoder: X = C + B + A, Y = C + A (mod 2); input k = 1, output n = 2.
Let the data be 101, so m(X) = 1 + X^2.
Generator polynomials: g1(X) = 1 + X + X^2, g2(X) = 1 + X^2.

m(X) g1(X) = (1 + X^2)(1 + X + X^2) = 1 + X + 0X^2 + X^3 + X^4
m(X) g2(X) = (1 + X^2)(1 + X^2)     = 1 + 0X + 0X^2 + 0X^3 + X^4

Interlacing the two output streams:
U(X) = (1,1) + (1,0)X + (0,0)X^2 + (1,0)X^3 + (1,1)X^4
U = 11 10 00 10 11
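The same result can be checked numerically in the transform domain by multiplying the message polynomial with each generator polynomial over GF(2). A minimal sketch, with coefficient lists in ascending powers of X and a helper name of my own choosing:

```python
def gf2_polymul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (ascending powers)."""
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] ^= ai & bj
    return prod

m  = [1, 0, 1]        # m(X)  = 1 + X^2
g1 = [1, 1, 1]        # g1(X) = 1 + X + X^2
g2 = [1, 0, 1]        # g2(X) = 1 + X^2

u1 = gf2_polymul(m, g1)   # [1, 1, 0, 1, 1]  ->  1 + X + X^3 + X^4
u2 = gf2_polymul(m, g2)   # [1, 0, 0, 0, 1]  ->  1 + X^4

# Interlace the two streams to obtain U = 11 10 00 10 11
U = [bit for pair in zip(u1, u2) for bit in pair]
print(u1, u2, U)
```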
6.2 and 6.3: Time-domain and transform-domain approaches to convolutional codes
1. State diagram
[Encoder: X = C + B + A, Y = C + A (mod 2).]
Number of states = 2^2 = 4: 00, 01, 10, 11.
Only two transitions initiate from each state.
Only two transitions end at each state.
State transition table (for all possible input combinations):

Present state (B A) | Input C | Output (X Y) | Next state (B A)
00                  | 0       | 00           | 00
00                  | 1       | 11           | 10
01                  | 0       | 11           | 00
01                  | 1       | 00           | 10
10                  | 0       | 10           | 01
10                  | 1       | 01           | 11
11                  | 0       | 01           | 01
11                  | 1       | 10           | 11

[State diagram: the four states 00, 01, 10, 11 drawn as nodes, with each branch labelled input/output (0/00, 1/11, 0/11, 1/00, 0/10, 1/01, 0/01, 1/10) for the encoder X = C + B + A, Y = C + A.]
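The transition table can also be generated programmatically. A small sketch for this encoder, where the state is the register pair (B, A); the helper name is mine:

```python
def next_state_and_output(state, c):
    """Given the current state (B, A) and input bit c, return the output
    branch word (X, Y) and the next state for the encoder
    X = C + B + A, Y = C + A (mod 2)."""
    b, a = state
    x = (c + b + a) % 2
    y = (c + a) % 2
    return (x, y), (c, b)          # next state: new B = c, new A = old B

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for c in (0, 1):
        out, nxt = next_state_and_output(state, c)
        print(f"BA={state[0]}{state[1]}  input={c}  "
              f"output XY={out[0]}{out[1]}  next BA={nxt[0]}{nxt[1]}")
```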
2. Tree diagram
[Tree diagram residue: the code tree of this encoder, built from the same state transition table, branching one way for input 0 and the other way for input 1 at each node.]

3. Trellis diagram
[Trellis diagram: one stage of the trellis from time ti to ti+1. The four states 00, 01, 10, 11 appear as nodes in each column, and each branch is labelled input/output: 0/00 and 1/11 from state 00, 0/11 and 1/00 from state 01, 0/10 and 1/01 from state 10, 0/01 and 1/10 from state 11.]
[Trellis diagram extended over successive time steps ti, ti+1, ...: every stage repeats the same branches and input/output labels between the states 00, 01, 10, 11.]

Example: input sequence 1 0 0 1 1 1 0 1 1
[Trellis figure residue: the path traced by this input sequence through the trellis from t0 to t9; the branch outputs along the path give the encoded sequence.]
[Trellis figure residue: the full trellis for the input sequence 1 0 0 1 1 1 0 1 1.]
Encoded output sequence along the path (including the two flush bits): 11 10 11 11 01 10 01 00 01 01 11
Minimum Free Distance of a Convolutional Code
[Figure residue: the paths that diverge from and remerge with the all-zero path have Hamming weights 5, 6, 6, ..., so the minimum free distance of this code is dfree = 5.]

Decoding: channel noise corrupts the transmitted sequence into the received sequence. The idea of decoding is to select the code sequence that is most alike the received sequence; this is the maximum likelihood criterion.
For a code sequence of length L bits there are 2^(Rc·L) possible code sequences, where Rc is the rate of the code. The maximum likelihood decoder selects the sequence c', from the set of all these possible sequences, which has the maximum similarity to the received sequence.
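For a short message the maximum likelihood criterion can be illustrated by brute force: enumerate all 2^(Rc·L) candidate code sequences and pick the one closest in Hamming distance to the received sequence. A sketch reusing the same rate-1/2 encoder; the function names and the injected single-bit error are my own illustrative choices:

```python
from itertools import product

def conv_encode(msg, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Same rate-1/2 encoder as before, with two flush bits appended."""
    state, out = [0, 0], []
    for c in list(msg) + [0, 0]:
        w = [c] + state
        out += [sum(a * b for a, b in zip(w, g1)) % 2,
                sum(a * b for a, b in zip(w, g2)) % 2]
        state = [c, state[0]]
    return out

def ml_decode(received, msg_len):
    """Exhaustive maximum-likelihood (minimum Hamming distance) decoding."""
    best = None
    for msg in product([0, 1], repeat=msg_len):      # all 2^(Rc*L) candidates
        dist = sum(r != u for r, u in zip(received, conv_encode(msg)))
        if best is None or dist < best[0]:
            best = (dist, msg)
    return best

rx = conv_encode([1, 0, 1])
rx[2] ^= 1                      # flip one bit to simulate channel noise
print(ml_decode(rx, 3))         # -> (1, (1, 0, 1)): the original message is recovered
```

The exhaustive search grows exponentially with the message length, which is why the Viterbi algorithm below is used in practice.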
6.12 Decoding of Convolutional Codes: The Viterbi Algorithm
Viterbi Decoding: errorless case
Received bits: 11 10 11 11 01 10 01 00 01 01 11
[Trellis figure residue: at each time step T1 to T11 the decoder computes, for every state, the cumulative Hamming distance between the received pair and the branch outputs (the metric pairs such as 2,3 3,4 4,3 5,0 in the figure) and keeps only the surviving path with the smaller metric.]
Since the received sequence is error-free, the correct path accumulates a metric of 0 and the decoded output sequence is the transmitted message 1 0 0 1 1 1 0 1 1.
Viterbi Decoding: received sequence containing channel errors
[Trellis figure residue: the same received sequence with some bit pairs corrupted by channel noise. The decoder repeats the survivor-selection procedure; every path now accumulates a nonzero Hamming metric, and the message along the surviving path with the smallest cumulative metric is selected as the decoded output.]
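A compact hard-decision Viterbi decoder for this encoder can be sketched as follows; the structure and names are my own, and the received pairs are those of the errorless example:

```python
def viterbi_decode(received_pairs):
    """Hard-decision Viterbi decoding for the rate-1/2, K = 3 encoder
    with X = C + B + A, Y = C + A (mod 2). States are the register pairs (B, A)."""
    def branch(state, c):
        b, a = state
        return ((c + b + a) % 2, (c + a) % 2), (c, b)

    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    metrics = {s: (0 if s == (0, 0) else float("inf")) for s in states}
    paths = {s: [] for s in states}

    for rx in received_pairs:
        new_metrics, new_paths = {}, {}
        for s in states:
            best = None
            for prev in states:                 # which predecessors reach s?
                for c in (0, 1):
                    out, nxt = branch(prev, c)
                    if nxt != s:
                        continue
                    dist = (out[0] != rx[0]) + (out[1] != rx[1])   # Hamming metric
                    cand = metrics[prev] + dist
                    if best is None or cand < best[0]:
                        best = (cand, paths[prev] + [c])
            new_metrics[s], new_paths[s] = best
        metrics, paths = new_metrics, new_paths

    final = min(states, key=lambda s: metrics[s])   # best surviving path
    return paths[final]

rx = [(1,1),(1,0),(1,1),(1,1),(0,1),(1,0),(0,1),(0,0),(0,1),(0,1),(1,1)]
print(viterbi_decode(rx))   # expected: 1 0 0 1 1 1 0 1 1 followed by the two flush zeros
```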
Known good convolution codes:
• Avoid 'catastrophic convolutional codes'. The state diagram of a catastrophic convolutional code includes at least one loop of output weight 0 in which a nonzero information sequence corresponds to an all-zero output sequence; a finite number of channel errors can then produce an unlimited number of decoded errors.
• A good code achieves the maximum free distance for the given rate and constraint length.
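A practical test for the first point is the Massey-Sain criterion (not stated on these slides): a rate-1/n code is catastrophic exactly when its generator polynomials share a common GF(2) factor other than a power of X. A minimal sketch, with helper names of my own choosing:

```python
def gf2_degree(p):
    """Degree of a GF(2) polynomial given as an ascending coefficient list."""
    for i in range(len(p) - 1, -1, -1):
        if p[i]:
            return i
    return -1          # zero polynomial

def gf2_polymod(a, b):
    """Remainder of a divided by b over GF(2)."""
    a = a[:]
    da, db = gf2_degree(a), gf2_degree(b)
    while da >= db and da >= 0:
        shift = da - db
        for i in range(db + 1):
            a[i + shift] ^= b[i]
        da = gf2_degree(a)
    return a

def gf2_gcd(a, b):
    while gf2_degree(b) >= 0:
        a, b = b, gf2_polymod(a, b)
    return a

def is_catastrophic(*gens):
    """True if the generator polynomials share a factor other than X^l."""
    g = gens[0]
    for h in gens[1:]:
        g = gf2_gcd(g, h)
    nonzero = [i for i, c in enumerate(g) if c]   # gcd = X^l has one nonzero term
    return len(nonzero) != 1

print(is_catastrophic([1, 1, 1], [1, 0, 1]))  # g1 = 1+X+X^2, g2 = 1+X^2 -> False (good code)
print(is_catastrophic([1, 1, 0], [1, 0, 1]))  # g1 = 1+X, g2 = 1+X^2 = (1+X)^2 -> True
```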
LDPC Codes: Overview
• Code rate: the ratio of information bits to the total number of bits in the codeword.
• LDPC codes are represented by Tanner graphs, which have two types of vertices: bit vertices and check vertices.
• The performance of an LDPC code is affected by the presence of cycles in its Tanner graph.
Example parity-check matrix (columns = bit vertices 0 to 6, rows = check vertices 0 to 3):

        bit:  0 1 2 3 4 5 6
    H =     [ 1 1 0 1 0 0 0 ]   check 0
            [ 0 1 1 0 1 0 0 ]   check 1
            [ 0 0 1 1 0 1 0 ]   check 2
            [ 0 0 0 1 1 0 1 ]   check 3
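A minimal sketch that builds this H matrix, lists the Tanner-graph edges as (check vertex, bit vertex) pairs, and verifies the parity checks for one vector; the test vector c is an example of my own choosing that happens to satisfy all four checks:

```python
import numpy as np

# Parity-check matrix from the slide: rows = check vertices 0-3, columns = bit vertices 0-6
H = np.array([
    [1, 1, 0, 1, 0, 0, 0],
    [0, 1, 1, 0, 1, 0, 0],
    [0, 0, 1, 1, 0, 1, 0],
    [0, 0, 0, 1, 1, 0, 1],
])

# Tanner-graph edges: check vertex i is connected to bit vertex j wherever H[i, j] = 1
edges = [(i, j) for i in range(H.shape[0]) for j in range(H.shape[1]) if H[i, j]]
print(edges)

# A vector satisfies the code iff all parity checks are zero: H c^T = 0 (mod 2)
c = np.array([1, 1, 0, 0, 1, 0, 1])        # example vector satisfying every check
print((H @ c) % 2)                          # -> [0 0 0 0]
```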
LDPC Codes and Their Applications
• Low Density Parity Check (LDPC) codes have superior error performance: roughly a 4 dB coding gain over convolutional codes.
[Figure residue: bit-error-rate versus Signal to Noise Ratio (dB) curves comparing the codes.]
Applications:
• WiFi (802.11n)
• Hard disks
• Deep-space satellite missions
Turbo Codes
• A Turbo code encoder [figure residue]