
Convolutional Codes

Cheng-Chi Wong
National Chiao-Tung University

May, 2012

CCWONG (NCTU)

Convolutional Codes

May, 2012

1 / 25

Outline
1. Introduction

2. Convolutional Code Representation

3. Encoding of Convolutional Code

4. Decoding of Convolutional Code


Introduction

(n, k, m) binary convolutional code
  k-tuple to n-tuple stream-oriented encoding procedure
  m-stage memory cells

[Figure omitted: encoder block diagram. At time t, the input k-tuple
(u_t^(0), u_t^(1), ..., u_t^(k-1)) enters a register holding the m previous
input tuples u_{t-1}, ..., u_{t-m}; the output n-tuple
(c_t^(0), c_t^(1), ..., c_t^(n-1)) is formed from the current and stored
inputs.]

Code structure
1. Algebraic description
2. Graphical representation

Decoding algorithm
1. Maximum likelihood algorithm
2. Maximum a posteriori probability algorithm

Introduction: Example

(2, 1, 2) convolutional code

[Figure omitted: two-stage shift-register encoder producing c_t^(0) and
c_t^(1) from u_t, u_{t-1}, u_{t-2}.]

1. c_t^(0) = u_t ⊕ u_{t-1} ⊕ u_{t-2}
2. c_t^(1) = u_t ⊕ u_{t-2}

  u = (1, 0, 0, 1, 1, 0)  →  c = (11, 10, 11, 11, 01, 01)
  u = (1, 1, 1, 1, 1, 1)  →  c = (11, 01, 10, 10, 10, 10)
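A short script can reproduce both codewords above (a sketch; the function name is mine, not from the slides):

```python
def encode_212(u):
    """Encode a bit sequence with the (2,1,2) code:
    c_t^(0) = u_t ^ u_{t-1} ^ u_{t-2},  c_t^(1) = u_t ^ u_{t-2}."""
    s1 = s2 = 0  # delay elements u_{t-1}, u_{t-2}, initially zero
    c = []
    for ut in u:
        c.append((ut ^ s1 ^ s2, ut ^ s2))  # (c^(0), c^(1))
        s1, s2 = ut, s1                    # shift the register
    return c

# Reproduce the two examples from the slide
print(encode_212([1, 0, 0, 1, 1, 0]))  # [(1,1), (1,0), (1,1), (1,1), (0,1), (0,1)]
print(encode_212([1, 1, 1, 1, 1, 1]))  # [(1,1), (0,1), (1,0), (1,0), (1,0), (1,0)]
```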


Polynomial Form

Laurent series a(D) = Σ_{t≥z} a_t D^t over GF(2)
  D: discrete-time delay operator
  a_t ∈ GF(2) and z ∈ Z

Information:  u(D) = [u^(0)(D)  u^(1)(D)  ...  u^(k-1)(D)]
Codeword:     c(D) = [c^(0)(D)  c^(1)(D)  ...  c^(n-1)(D)]

Generator matrix: G(D) is the k × n matrix whose i-th row is
  g_i(D) = [g_i^(0)(D)  g_i^(1)(D)  ...  g_i^(n-1)(D)]

  c(D) = Σ_{i=0}^{k-1} u^(i)(D) g_i(D) = u(D) G(D)

Graphical Form

State diagram:
1. States: contents of delay elements
2. Branches: state transitions and the input/output mapping

Example: G(D) = [1 + D + D^2   1 + D^2]

[Figure omitted: four-state diagram with branches labeled input/output,
e.g. 0/00 and 1/11 leaving state 00, 1/10 closing the loop at state 11.]

  Current State        Input   Output               Next State
  (u_{t-1} u_{t-2})    u_t     (c_t^(0) c_t^(1))    (u_t u_{t-1})
  00                   0       00                   00
  00                   1       11                   10
  01                   0       11                   00
  01                   1       00                   10
  10                   0       10                   01
  10                   1       01                   11
  11                   0       01                   01
  11                   1       10                   11
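The table above can be generated mechanically from the two output equations (a sketch; names are mine):

```python
def state_table():
    """Enumerate (current state, input) -> (output, next state) for
    G(D) = [1+D+D^2  1+D^2]; a state is the pair (u_{t-1}, u_{t-2})."""
    rows = []
    for s1 in (0, 1):
        for s2 in (0, 1):
            for u in (0, 1):
                out = (u ^ s1 ^ s2, u ^ s2)          # (c^(0), c^(1))
                rows.append(((s1, s2), u, out, (u, s1)))
    return rows

for cur, u, out, nxt in state_table():
    print(cur, u, out, nxt)
```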


Graphical Form (cont.)

Trellis diagram: state transitions with the dimension of time

Example: G(D) = [1 + D + D^2   1 + D^2]

[Figure omitted: trellis over time steps t-2, ..., t+2. Each column holds
the four states 00, 01, 10, 11; the branches carry the output labels from
the state table (00/11 from state 00, 11/00 from 01, 10/01 from 10, and
01/10 from 11, for inputs 0/1).]

Encoder Realization

Generator polynomial:
  g_i^(j)(D) = (a_0 + a_1 D + ... + a_m D^m) / (1 + b_1 D + ... + b_m D^m)

Type-I realization
[Figure omitted: the input, combined with the feedback taps b_1, ..., b_m,
drives a chain of m delay elements; the output is the sum of the
feedforward taps a_0, a_1, ..., a_m.]

Type-II realization
[Figure omitted: the input is scaled by the taps a_m, a_{m-1}, ..., a_0 and
injected between the delay elements, with feedback taps b_m, ..., b_1; the
output is taken at the end of the chain.]

Encoder Classification

Encoder structure
1. Recursive or non-recursive
2. Systematic or non-systematic

Multiple encoders for the same code:

  G(D)   = [ 1 + D + D^2    1 + D^2 ]
  G'(D)  = [ 1 + D^3        1 + D + D^2 + D^3 ]
  G''(D) = [ 1              (1 + D^2)/(1 + D + D^2) ]

Catastrophic error propagation
  Infinite-weight information → finite-weight codeword
  Systematic encoder → non-catastrophic

Number of delay elements
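The catastrophic case can be checked numerically: the entries of G'(D) share the common factor 1 + D, so the infinite-weight all-ones input maps to a codeword of finite weight. A sketch (names are mine):

```python
from functools import reduce

def encode(u, gens):
    """Feedforward convolutional encoder over GF(2).
    gens: tap lists, index = power of D (e.g. [1,0,0,1] = 1 + D^3)."""
    m = max(len(g) for g in gens) - 1
    state = [0] * m                        # u_{t-1}, ..., u_{t-m}
    out = []
    for ut in u:
        window = [ut] + state              # u_t, u_{t-1}, ..., u_{t-m}
        out.append(tuple(
            reduce(lambda a, b: a ^ b, (g[i] & window[i] for i in range(len(g))))
            for g in gens))
        state = [ut] + state[:-1]
    return out

ones = [1] * 12                                   # truncated all-ones input
c = encode(ones, [[1, 0, 0, 1], [1, 1, 1, 1]])    # G'(D) = [1+D^3  1+D+D^2+D^3]
print(c[3:])                                      # all-zero pairs from t = 3 on
print(sum(a + b for a, b in c))                   # codeword weight stays at 5
```

With a Viterbi decoder, a few channel errors near those nonzero positions could therefore flip an unbounded number of information bits, which is why catastrophic encoders are avoided.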



Encoding Process

Starting from the all-zero state

Example: trellis diagram for G(D) = [1 + D + D^2   1 + D^2]

[Figure omitted: trellis trace for the input u = (u_0, ..., u_4) = (1, 0, 1, 1, 0).]

  t   u_t   state (u_{t-1} u_{t-2})   (c_t^(0), c_t^(1))
  0   1     00                        (1, 1)
  1   0     10                        (1, 0)
  2   1     01                        (0, 0)
  3   1     10                        (0, 1)
  4   0     11                        (0, 1)

Encoding Process (cont.)

Ending at the all-zero state

Example: trellis diagram for G(D) = [1 + D + D^2   1 + D^2]

[Figure omitted: trellis trace for the tail of a length-N message. After
the last information bits u_{N-3}, u_{N-2}, u_{N-1}, the m = 2 tail bits
u_N = u_{N+1} = 0 drive the encoder back to the all-zero state, and the
corresponding outputs (c_N^(0), c_N^(1)) and (c_{N+1}^(0), c_{N+1}^(1))
are appended to the codeword.]

Encoding Process: Exercise

  G(D) = [1 + D^3   1 + D + D^2 + D^3]

[Figure omitted: encoder block diagram mapping u(D) to c^(0)(D) and c^(1)(D).]


Maximum Likelihood Algorithm

Channel with noise e = (e_0, e_1, ..., e_{r-1})
  Input:  x_i = (x_i^(0), x_i^(1), ..., x_i^(r-1)) ∈ {x_0, x_1, x_2, ...}
  Output: y = x_i + e = (y_0, y_1, ..., y_{r-1})

Maximum Likelihood (ML) Algorithm
  Choose x' to maximize P(y | x') = Π_{j=0}^{r-1} P(y_j | x'^(j))

Logarithm computation
  log P(y | x') = Σ_{j=0}^{r-1} log P(y_j | x'^(j))

Metric (distance) between x' and y:
  μ(x'^(j), y_j) = −log P(y_j | x'^(j))
  M(x', y) = Σ_{j=0}^{r-1} μ(x'^(j), y_j)

  max_{x'} P(y | x')  ⇔  min_{x'} M(x', y)


Maximum Likelihood Algorithm (cont.)

Additive white Gaussian noise (AWGN) channel

  P(e_j) = (1/√(2πσ²)) exp(−e_j² / (2σ²))
  P(y_j | x'^(j)) = (1/√(2πσ²)) exp(−(y_j − x'^(j))² / (2σ²))

  M(x', y) = Σ_{j=0}^{r-1} (y_j − x'^(j))²
  or, with the same decisions for ±1 inputs, Σ_{j=0}^{r-1} |y_j − x'^(j)|

Binary input data: x^(j) = ±1 (= (−1)^{u_j})
  P(y_j | x^(j) = +1) > P(y_j | x^(j) = −1)  ⇔  |y_j − 1| < |y_j + 1|
  P(y_j | x^(j) = +1) < P(y_j | x^(j) = −1)  ⇔  |y_j − 1| > |y_j + 1|
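Both AWGN metrics lead to the same per-symbol decision, which reduces to comparing the distances from y_j to +1 and −1 (a sketch; names are mine):

```python
def ml_bit(y):
    """Per-symbol ML decision for x in {+1, -1} under AWGN:
    pick the symbol closer to y; ties broken toward +1."""
    return +1 if abs(y - 1) <= abs(y + 1) else -1

for y in (0.9, 0.1, -0.3, -2.0):
    sq = {x: (y - x) ** 2 for x in (+1, -1)}   # squared-distance metric
    ab = {x: abs(y - x) for x in (+1, -1)}     # absolute-distance metric
    # Both metrics select the same symbol
    assert min(sq, key=sq.get) == min(ab, key=ab.get) == ml_bit(y)
    print(y, ml_bit(y))
```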


Viterbi Algorithm

[Figure omitted: two states S'_{t-1} and S''_{t-1}, with path metrics
M(S'_{t-1}) and M(S''_{t-1}), each connected to S_t by a branch labeled
c_0 c_1 ... c_{n-1} and c'_0 c'_1 ... c'_{n-1} respectively.]

Definition
  Branch metric: distance computation
    α = Σ_{j=0}^{n-1} |r_j − c_j|   and   β = Σ_{j=0}^{n-1} |r_j − c'_j|
  Path metric: add-compare-select (ACS) operation
    M(S_t) = min{ M(S'_{t-1}) + α, M(S''_{t-1}) + β }

Viterbi Algorithm (cont.)

Step 1  Initialization
          M(S_0) = 0 if S_0 = 0;  ∞ if S_0 ∈ {1, ..., 2^m − 1}

Step 2  Metric computation, for each state S_t^(ℓ) (ℓ = 0, ..., 2^m − 1)
          1. Find the branch metrics of the 2^k branches connecting to S_t^(ℓ):
               β(S_{t-1}^(i), S_t^(ℓ)) = Σ_{j=0}^{n-1} |r_{t-1}^(j) − c_{i,ℓ}^(j)|
          2. Summarize M(S_{t-1}^(i)) and β(S_{t-1}^(i), S_t^(ℓ))
          3. Compare all 2^k paths and select the best one
          4. Update the metric and the path to each S_t^(ℓ)

Step 3  Set t ← t + 1, and repeat Step 2 until termination
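One pass of Step 2 — branch metrics followed by the ACS update — can be sketched as follows, using hard-decision Hamming distance and the (2,1,2) trellis from the earlier slides (state labels and names are mine):

```python
def viterbi_step(metrics, preds, r):
    """One add-compare-select update.
    metrics: dict state -> path metric at time t-1
    preds:   dict state -> list of (prev_state, branch_output_bits)
    r:       received n-bit tuple
    Returns the new metrics and the chosen predecessor per state."""
    new_metrics, choice = {}, {}
    for s, branches in preds.items():
        best = None
        for prev, c in branches:
            # Add: path metric + Hamming branch metric
            m = metrics[prev] + sum(rj != cj for rj, cj in zip(r, c))
            if best is None or m < best:          # Compare
                best, choice[s] = m, prev         # Select
        new_metrics[s] = best
    return new_metrics, choice

INF = float("inf")
preds = {  # (2,1,2) trellis; a state is the string "u_{t-1}u_{t-2}"
    "00": [("00", (0, 0)), ("01", (1, 1))],
    "10": [("00", (1, 1)), ("01", (0, 0))],
    "01": [("10", (1, 0)), ("11", (0, 1))],
    "11": [("10", (0, 1)), ("11", (1, 0))],
}
m, ch = viterbi_step({"00": 0, "01": INF, "10": INF, "11": INF}, preds, (1, 1))
print(m["10"], ch["10"])  # 0 00: the input-1 branch from state 00 matches r exactly
```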



Viterbi Algorithm: Example 1

(3, 1, 2) convolutional code with G(D) = [1 + D   1 + D^2   1 + D + D^2]
Received data r = (110, 110, 110, 111, 010, 101, 101)

[Figures omitted: the four-state trellis (S^(0) = 00, S^(1) = 01,
S^(2) = 10, S^(3) = 11) expanded over the seven received blocks. Branch
outputs are 000/111 from state 00, 011/100 from 01, 101/010 from 10, and
110/001 from 11 (for inputs 0/1). The slides step through t = 0, ..., 6,
performing the ACS operation with Hamming-distance branch metrics.]

Path metrics after each step, for states (00, 01, 10, 11):

  after r_0:  2, ∞, 1, ∞
  after r_1:  4, 3, 3, 2
  after r_2:  5, 2, 4, 4
  after r_3:  3, 5, 4, 6
  after r_4:  4, 7, 5, 4
  after r_5:  6, 5, 5, 5
  after r_6:  7 at state 00 (the terminated path)

Surviving path: u = (1 1 0 0 1), followed by the two tail zeros.
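The whole example can be replayed with a compact hard-decision Viterbi decoder (a sketch; function and variable names are mine). It rebuilds the (3,1,2) trellis from the generators and reads the survivor out of the terminated all-zero state:

```python
def viterbi_312(r_blocks):
    """Hard-decision Viterbi decoding for G(D) = [1+D  1+D^2  1+D+D^2].
    A state is (u_{t-1}, u_{t-2}); the path starts and ends in (0, 0)."""
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    INF = float("inf")
    metric = {s: (0 if s == (0, 0) else INF) for s in states}
    paths = {s: [] for s in states}
    for r in r_blocks:
        new_metric, new_paths = {}, {}
        for s1, s2 in states:
            for u in (0, 1):
                out = (u ^ s1, u ^ s2, u ^ s1 ^ s2)      # branch output
                d = sum(a != b for a, b in zip(out, r))  # Hamming distance
                nxt = (u, s1)
                m = metric[(s1, s2)] + d
                if nxt not in new_metric or m < new_metric[nxt]:  # ACS
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[(s1, s2)] + [u]
        metric, paths = new_metric, new_paths
    return paths[(0, 0)], metric[(0, 0)]

r = [(1,1,0), (1,1,0), (1,1,0), (1,1,1), (0,1,0), (1,0,1), (1,0,1)]
u_hat, m = viterbi_312(r)
print(u_hat, m)  # [1, 1, 0, 0, 1, 0, 0] with final metric 7
```

The decoded sequence reproduces the surviving path u = (1 1 0 0 1) plus the two tail zeros, with the final path metric 7 from the slide.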


Viterbi Algorithm: Example 2

Soft-decision decoding on the same style of four-state trellis, with the
branch labels written as ±1 symbols (e.g. +++ for output 000):

1. Inner product as the metric: the maximum-metric path survives.
2. Absolute difference as the metric: the minimum-metric path survives.

[Figures omitted: two five-step trellis traces with received vectors
r_0 = (0.7, 0.5, 0.1), r_1 = (1.1, 0.4, 0.2), r_2 = (0.0, 0.8, 1.7),
r_3 = (0.2, 0.1, 0.1), r_4 = (0.6, 0.3, 1.2), showing the updated state
metrics at each step under both metric choices.]


Viterbi Algorithm: Exercise

[Figures omitted: two soft-decision decoding exercises on the four-state
trellis, each running over five steps from S_t to S_{t+5}. The first uses
received vectors r_t = (1.1, 0.3, 0.6), r_{t+1} = (0.5, 0.2, 1.7),
r_{t+2} = (1.0, 0.7, 0.3), r_{t+3} = (0.4, 0.1, 1.6),
r_{t+4} = (1.0, 0.2, 0.5); the second uses r_t = (0.7, 0.5, 0.8),
r_{t+1} = (0.6, 1.1, 0.4), r_{t+2} = (0.9, 0.8, 0.2),
r_{t+3} = (0.7, 1.0, 0.3), r_{t+4} = (0.6, 0.6, 0.7).]

Sliding Window Technique

[Figure omitted: trellis window covering u_{t-L+1}, u_{t-L+2}, ..., u_t;
the decoder outputs the decision û_{t-L+1} from the oldest stage of the
window.]

Survivor truncation with length L
  Survivor updating and management
  Output decision
  Truncation length
  Quantization
  Metric normalization
  Termination

Thank you for your attention.
