
Convolutional Codes

COS 463: Wireless Networks


Lecture 9
Kyle Jamieson

[Parts adapted from H. Balakrishnan]


Convolutional Coding: Motivation
• So far, we’ve seen block codes

• Convolutional Codes:
– Simple design, especially at the transmitter

– Very powerful error correction capability (used in NASA
Pioneer mission deep space communications)

[Credit: NASA Ames Research Center]
Convolutional Coding: Applications
• Wi-Fi (802.11 standard) and cellular networks (3G, 4G, LTE
standards)

• Deep space satellite communications

• Digital Video Broadcasting (Digital TV)

• Building block in more advanced codes (Turbo Codes), which are
in turn used in the above settings
Today
1. Encoding data using convolutional codes
– How the encoder works
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm

Convolutional Encoding
• Don’t send message bits, send only parity bits

• Use a sliding window to select which message bits may
participate in the parity calculations

[Figure: a window of width K (the constraint length) slides over the
message bits 1 0 1 1 0 1 0 0 1 0 1]
Sliding Parity Bit Calculation

K = 4

[Figure: the K = 4 window starts over the zero-padded positions
x[−3] … x[0] of the message bits and slides right one bit per step;
the windowed bits feed a mod-2 adder that emits one parity bit P[n]
per step.]

• Successive window positions give P[0] = 0, P[1] = 1, P[2] = 0,
P[3] = 1
• Output so far: 0101
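
The sliding-window arithmetic above can be sketched in a few lines of
Python. A hedged illustration: the adder's exact taps are not
recoverable from the figure, so this assumes the parity bit is the
mod-2 sum of all K bits in the window, and its outputs may therefore
differ from the worked values above.

    # Sketch of the sliding-window parity computation (assumed taps:
    # all K window bits are summed mod 2).
    def sliding_parity(msg_bits, K=4):
        padded = [0] * (K - 1) + list(msg_bits)   # window starts over zeros
        return [sum(padded[n:n + K]) % 2 for n in range(len(msg_bits))]

    print(sliding_parity([1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1]))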
Multiple Parity Bits

[Figure: a second mod-2 adder with its own taps is attached to the
same K = 4 window, so each window position now produces two parity
bits, P1[n] and P2[n].]

• e.g. P1[3] = 1, P2[3] = 1; P1[4] = 0, P2[4] = 0; P1[5] = 0,
P2[5] = 1
• Interleaved output so far: ….110001
Encoder State
• Input bit and K−1 bits of current state determine state on next
clock cycle
– Number of states: 2^(K−1)

[Figure: within the sliding window, the newest bit is the input bit
and the previous K−1 message bits are the current state.]
Constraint Length
• K is the constraint length of the code

• Larger K:
– Greater redundancy
– Better error correction possibilities (usually, not always)

Transmitting Parity Bits
• Transmit the parity sequences, not the message itself
– Each message bit is “spread across” K bits of the output
parity bit sequence

– If using multiple generators, interleave the bits of each
generator
• e.g. (two generators):

p0[0], p1[0], p0[1], p1[1], p0[2], p1[2], …
Transmitting Parity Bits
• Code rate is 1 / #_of_generators
– e.g., 2 generators → rate = ½

• Engineering tradeoff:
– More generators improves bit-error correction
• But decreases rate of the code (the number of message
bits/s that can be transmitted)

Shift Register View

• One message bit x[n] in, two parity bits out
– Each timestep: message bits shift right by one; the
incoming bit moves into the left-most register

[Figure: x[n] enters a chain of registers holding x[n−1] and x[n−2];
mod-2 adders tap the registers to form the two parity bit streams.]
Equation View

0th stream: p0[n] = x[n] + x[n − 1] + x[n − 2] (mod 2)

1st stream: p1[n] = x[n] + x[n − 2] (mod 2)

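These two equations translate directly into code. A minimal Python
sketch (mine, not the lecture's) that keeps the shift register as two
variables and emits the interleaved stream p0[0] p1[0] p0[1] p1[1] …:

    def conv_encode(msg):
        """Rate-1/2, K = 3 encoder for the two parity equations above."""
        x1 = x2 = 0                         # shift register: x[n-1], x[n-2]
        out = []
        for x in msg:
            out.append((x + x1 + x2) % 2)   # p0[n]
            out.append((x + x2) % 2)        # p1[n]
            x1, x2 = x, x1                  # shift the register
        return out

    # Encoding 101100 yields 11 10 00 01 01 11, matching the
    # state-machine walk-through a few slides ahead.
    print(conv_encode([1, 0, 1, 1, 0, 0]))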
Today
1. Encoding data using convolutional codes
– Encoder state machine
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm

State Machine View
• Example: K = 3, code rate = ½, convolutional code
– There are 2^(K−1) = 4 states
– States labeled with (x[n−1], x[n−2])
– Arcs labeled with x[n]/p0[n]p1[n]
– Generators: g0 = 111, g1 = 101
– msg = 101100

• Transitions, from the starting state 00 (arc label: input/outputs):
State 00: 0/00 → 00; 1/11 → 10
State 01: 0/11 → 00; 1/00 → 10
State 10: 0/10 → 01; 1/01 → 11
State 11: 0/01 → 01; 1/10 → 11
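
The table above can also be generated mechanically from the
generators. A small sketch (not from the lecture) that derives each
arc:

    # Derive the K = 3 state machine from g0 = 111, g1 = 101.
    # State is (x[n-1], x[n-2]); arcs are labeled x[n]/p0[n]p1[n].
    G = [(1, 1, 1), (1, 0, 1)]              # taps on (x[n], x[n-1], x[n-2])

    def arc(state, bit):
        window = (bit,) + state             # (x[n], x[n-1], x[n-2])
        outs = [sum(g * w for g, w in zip(gen, window)) % 2 for gen in G]
        return outs, (bit, state[0])        # parity bits, next state

    for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        for bit in (0, 1):
            outs, nxt = arc(state, bit)
            print(f"State {state[0]}{state[1]}: "
                  f"{bit}/{outs[0]}{outs[1]} -> {nxt[0]}{nxt[1]}")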
State Machine View: encoding msg = 101100

• P0[n] = (1·x[n] + 1·x[n−1] + 1·x[n−2]) mod 2
• P1[n] = (1·x[n] + 0·x[n−1] + 1·x[n−2]) mod 2
• Generators: g0 = 111, g1 = 101

• Stepping through the state diagram from the starting state 00:
x[n] = 1: P0 = 1+0+0 = 1, P1 = 1+0 = 1 → transmit 11
x[n] = 0: P0 = 0+1+0 = 1, P1 = 0+0 = 0 → transmit 10
x[n] = 1: P0 = 1+0+1 = 0, P1 = 1+1 = 0 → transmit 00
x[n] = 1: P0 = 1+1+0 = 0, P1 = 1+0 = 1 → transmit 01
x[n] = 0: P0 = 0+1+1 = 0, P1 = 0+1 = 1 → transmit 01
x[n] = 0: P0 = 0+0+1 = 1, P1 = 0+1 = 1 → transmit 11

• Transmit: 11 10 00 01 01 11
Today
1. Encoding data using convolutional codes
– How the encoder works
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm

Varying the Code Rate
• How to increase the rate of a convolutional code?

• Transmitter and receiver agree on coded bits to omit
– Puncturing table indicates which bits to include (1) and
which to omit (0)
• Contains p rows (one per parity equation), N columns

[Figure: an example p × N puncturing table; 0 entries mark the
omitted coded bits.]
Punctured convolutional codes: example

• Coded bits (one row per parity stream):
p0 = 0 0 1 0 1
p1 = 0 0 1 1 1

• Puncturing matrix, applied cyclically with one column per time step:
P1 = 1 1 1 0
     1 0 0 1
– 5 out of every 8 bits are retained

• Punctured, coded bits (– marks an omitted bit):
p0 = 0 0 1 – 1
p1 = 0 – – 1 1

• Punctured rate is increased to: R = (1/2) / (5/8) = 4/5
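
As a sketch of the mechanics (assuming, per the example above, that
the matrix columns are applied cyclically, one per time step, and that
retained bits are sent interleaved per step):

    def puncture(streams, table):
        """streams: one list per parity stream; table: one 0/1 row per stream."""
        N = len(table[0])
        kept = []
        for n in range(len(streams[0])):    # one matrix column per time step
            for row, stream in zip(table, streams):
                if row[n % N]:              # 1 = transmit, 0 = omit
                    kept.append(stream[n])
        return kept

    p0 = [0, 0, 1, 0, 1]
    p1 = [0, 0, 1, 1, 1]
    P1 = [[1, 1, 1, 0],
          [1, 0, 0, 1]]
    print(puncture([p0, p1], P1))   # [0, 0, 0, 1, 1, 1, 1]: 7 of 10 bits sent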
Stretch Break and Question [MIT 6.02 Chp. 8, #1]

• Consider a convolutional code whose parity equations are:


!" = $ % + $ % − 1 + $[% − 3]
!, = $ % + $ % − 1 + $[% − 2]
!. = $ % + $ % − 2 + $[% − 3]

1. What’s the rate of this code? How many states are in the state
machine representation of this code?

2. To increase the rate of the given code, 463 student


Lem E. Tweakit punctures it with the following puncture matrix:
1 0 1 1 0
1 1 0 1 1 . What’s the rate of the resulting code?
1 1 1 1 1

38
Today
1. Encoding data using convolutional codes

2. Decoding convolutional codes: Viterbi Algorithm


– Hard decision decoding
– Soft decision decoding

Motivation: The Decoding Problem

• Received bits: 000101100110

• Some errors have occurred

• What's the 4-bit message?

• Most likely: 0111
– The message whose coded bits are closest to the received bits
in Hamming distance

Message   Coded bits      Hamming distance
0000      000000000000    5
0001      000000111011    6
0010      000011101100    4
0011      000011010111    …
0100      001110110000
0101      001110001011
0110      001101011100
0111      001101100111    2
1000      111011000000
1001      111011111011
1010      111000101100
1011      111000010111
1100      110101110000
1101      110101001011
1110      110110011100
1111      110110100111
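
A table like this can be built by brute force: encode every candidate
message and keep the nearest codeword. A sketch assuming the lecture's
K = 3 code with two 0 tail bits appended (which reproduces the coded
bits listed above); the point is that this search grows exponentially
with message length, which is exactly what Viterbi avoids:

    from itertools import product

    def conv_encode(msg):               # same rate-1/2, K = 3 code as above
        x1 = x2 = 0
        out = []
        for x in msg:
            out += [(x + x1 + x2) % 2, (x + x2) % 2]
            x1, x2 = x, x1
        return out

    def hamming(a, b):
        return sum(u != v for u, v in zip(a, b))

    received = [int(b) for b in "000101100110"]
    best = min(product([0, 1], repeat=4),
               key=lambda m: hamming(conv_encode(list(m) + [0, 0]), received))
    print(best)                         # (0, 1, 1, 1), at distance 2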
The Trellis

• Vertically, lists encoder states

• Horizontally, tracks time steps

• Branches connect states in successive time steps

[Figure: the K = 3 state machine unrolled in time. Each column holds
the four states (x[n−1] x[n−2]) = 00, 01, 10, 11; each branch is
labeled x[n]/p0[n]p1[n], e.g. 0/00 on the 00→00 branch and 1/11 on
the 00→10 branch.]
The Trellis: Sender's View
• At the sender, transmitted bits trace a unique, single path of
branches through the trellis
– e.g. transmitted data bits 1 0 1 1

• Recover transmitted bits ⟺ Recover path

[Figure: the path 00 → 10 → 01 → 10 → 11 highlighted in the trellis
for data bits 1 0 1 1, producing coded bits 11 10 00 01.]
Viterbi algorithm
• Want: Most likely sent bit sequence

• Calculates most likely path through trellis

[Photo: Andrew Viterbi (USC)]

1. Hard input Viterbi algorithm: Have possibly-corrupted
encoded bits, after reception

2. Soft input Viterbi algorithm: Have possibly-corrupted
likelihoods of each bit, after reception
– e.g.: "this bit is 90% likely to be a 1."
Viterbi algorithm: Summary
• Branch metrics score likelihood of each trellis branch

• At any given time there are 2^(K−1) most likely messages we're
tracking (one for each state)
– One message ⟷ one trellis path
– Path metrics score likelihood of each trellis path

• Most likely message is the one that produces the smallest
path metric
Today
1. Encoding data using convolutional codes

2. Decoding convolutional codes: Viterbi Algorithm


– Hard input decoding
– Soft input decoding

Hard-input branch metric
• Hard input → input is bits

• Label every branch of trellis with branch metrics
– Hard input branch metric: Hamming distance between
received and transmitted bits

[Figure: one trellis stage with received bits 00 and branch metrics
0/00 → 0, 1/11 → 2, 0/11 → 2, 1/00 → 0, 0/10 → 1, 1/01 → 1,
0/01 → 1, 1/10 → 1.]
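
In code the hard branch metric is a one-liner (helper name mine):

    def branch_metric_hard(received, expected):
        """Hamming distance between received and expected parity bits."""
        return sum(r != e for r, e in zip(received, expected))

    print(branch_metric_hard((0, 0), (1, 1)))   # branch 1/11, received 00 -> 2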
Hard-input branch metric

• Suppose we know encoder is in state 00, and receive bits 00
– The branches leaving state 00 have metrics 0 (for 0/00)
and 2 (for 1/11)

[Figure: first trellis stage; only the two branches out of state 00
are drawn, with branch metrics 0 and 2.]
Hard-input path metric
• Hard-input path metric: sum of Hamming distances between sent
and received bits along the path

• Encoder is initially in state 00, receive bits: 00
– Reaching state 00 costs path metric 0; reaching state 10
costs path metric 2

[Figure: first trellis stage with the two surviving paths and their
path metrics.]
Hard-input path metric
• Right now, each state has a unique predecessor state

• Path metric: Total bit errors along path ending at state
– Path metric of predecessor + branch metric

[Figure: received 00 11. After two stages the path metrics are
state 00: 2, state 01: 3, state 10: 0, state 11: 3.]
Hard-input path metric
• Each state has two predecessor states, two predecessor
paths (which to use?)

• Winning branch has lower path metric (fewer bit errors): Prune
losing branch

[Figure: received 00 11 01. At the third stage, state 00's two
incoming paths have metrics 2 + 1 and 3 + 1; the metric-3 path wins
and the other branch is pruned.]
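
This compare-and-prune step is often called add-compare-select. A
sketch using the numbers from the figure above, where state 00's two
incoming paths cost 2 + 1 and 3 + 1:

    # Add-compare-select for one state: each incoming path contributes
    # (predecessor path metric + branch metric); keep the smaller sum.
    candidates = [(2 + 1, "from state 00 via 0/00"),
                  (3 + 1, "from state 01 via 0/11")]
    metric, winner = min(candidates)
    print(metric, winner)   # 3, from state 00 via 0/00; the other is pruned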
Hard-input path metric

• Prune losing branch for each state in trellis

[Figure: received 00 11 01; after the compare step, each of the four
states in the third stage keeps exactly one incoming branch.]
Pruning non-surviving branches
• Survivor path begins at each state, traces unique path back to
beginning of trellis
– Correct path is one of four survivor paths

• Some branches are not part of any survivor: prune them

[Figure: received 00 11 01; the four survivor paths are highlighted
and all branches not on any survivor are removed.]
Making bit decisions

• When only one branch remains at a stage, the Viterbi
algorithm decides that branch's input bits:

[Figure: received 00 11 01; only the 0/00 branch remains in the first
stage, so the decoder decides the first data bit is 0.]
End of received data
• Trace back the survivor with minimal path metric

• Later stages don't get the benefit of the future error correction
they would have had, had the data not ended

[Figure: received 00 11 01 10. The survivor ending in state 11 has
the minimal path metric (0); tracing it back decides the data bits
0 1 1 1.]
Terminating the code
• Sender transmits two 0 data bits at end of data

• Receiver uses a trellis containing only the input-0 branches
(0/00, 0/11, 0/01, 0/10) for the final two stages

• After termination only one trellis survivor path remains
– Can make better bit decisions at end of data based on this
sole survivor
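
Putting branch metrics, path metrics, pruning, and traceback together:
a compact hard-input Viterbi decoder for the lecture's K = 3, rate-½
code. A sketch, not the lecture's reference code; with the two-zero
termination above you would trace back from state 00 rather than the
overall minimum.

    def viterbi_decode(received):
        """received: list of (p0, p1) hard-bit pairs, one per time step."""
        states = [(0, 0), (0, 1), (1, 0), (1, 1)]   # (x[n-1], x[n-2])
        INF = float("inf")
        pm = {s: (0 if s == (0, 0) else INF) for s in states}  # start in 00
        history = []                                # survivor back-pointers

        for rx in received:
            new_pm = {s: INF for s in states}
            back = {}
            for s in states:
                if pm[s] == INF:
                    continue
                x1, x2 = s
                for bit in (0, 1):
                    out = ((bit + x1 + x2) % 2, (bit + x2) % 2)  # p0, p1
                    nxt = (bit, x1)
                    metric = pm[s] + sum(r != o for r, o in zip(rx, out))
                    if metric < new_pm[nxt]:        # compare-select
                        new_pm[nxt] = metric
                        back[nxt] = (s, bit)
            pm = new_pm
            history.append(back)

        state = min(pm, key=pm.get)                 # smallest final path metric
        bits = []
        for back in reversed(history):              # trace back the survivor
            state, bit = back[state]
            bits.append(bit)
        return bits[::-1]

    # The running example: received 00 11 01 10 decodes to data bits 0 1 1 1
    print(viterbi_decode([(0, 0), (1, 1), (0, 1), (1, 0)]))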
Viterbi with a Punctured Code
• Punctured bits are never transmitted

• Branch metric measures dissimilarity only between received
and transmitted unpunctured bits
– Same path metric, same Viterbi algorithm
– Lose some error correction capability

[Figure: one trellis stage with received bits 0– (second bit
punctured); each branch metric counts only the first bit, e.g.
0/00 → 0, 1/11 → 1, 1/00 → 0, 0/11 → 1.]
Today
1. Encoding data using convolutional codes

2. Decoding convolutional codes: Viterbi Algorithm


– Hard input decoding
• Error correcting capability
– Soft input decoding

How many bit errors can we correct?
• Think back to the encoder; linearity property:
– Message m1 → Coded bits c1
– Message m2 → Coded bits c2
– Message m1 ⨁ m2 → Coded bits c1 ⨁ c2

• So, dmin = minimum distance between the 000...000 codeword and
the codeword with fewest 1s
!"!#!"!% $"!#!"!%

Calculating dmin for the convolutional code


Figure 8-5: Branch metric for soft decision decoding.

• Find path with smallest non-zero path metric going from first
00 state to a future 00 state

• Here, dmin = 4, so can correct 1 error in 8 bits:


x[n] 0 0 0 0 0 0

00 00 00 00
00 0/00 0/00 0/00 0/00 4 0/00 0/00
1/11 1/11 1
1/11 1/11 1/11 1
1/11

0/10 0/10 0/10


0 0/10 0
0/10 0
0/10
01 3
1/01 1/01 1/01 1/01 /0
1/01 1/01

0/11 0
0/11 0/11
0/ 0/11 0/
0/11 0/11
10 1/00 2 1/00 1/ 0
1/00 1/00
0 1/00
1/0
1/
1/00 1/00

0/01 0/01 0/01 0/01


0 /01 0/01 0/01
11 1/10 1/10 2 1/10 1/10 1/10 1/10

x[n-1]x[n-2]

t time
59
Today
1. Encoding data using convolutional codes
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm


– Hard input decoding
– Soft input decoding

Model for Today
• Coded bits are actually continuously-valued “voltages”
between 0.0 V and 1.0 V:

[Figure: the received-voltage axis from 0.0 V to 1.0 V. Near 1.0 V is
a strong "1", just above the middle a weak "1", just below it a weak
"0", and near 0.0 V a strong "0".]
On Hard Decisions
• Hard decisions digitize each voltage to "0" or "1" by
comparison against a threshold voltage of 0.5 V
– Lose information about how "good" the bit is
• Strong "1" (0.99 V) treated equally to weak "1" (0.51 V)

• Hamming distance is then used for branch metric computation

• But throwing away information is almost never a good
idea when making decisions
– Can we find a better branch metric that retains information
about the received voltages?
Soft-input decoding
• Idea: Pass received voltages to decoder before digitizing
– Problem: Hard branch metric was Hamming distance

• "Soft" branch metric
– Euclidean distance between received voltages and the voltages
of the expected bits

[Figure: the unit square of received voltage pairs (Vp0, Vp1), with
corners (0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0). The soft
metric is the distance from the received point to the corner of the
expected parity bits, here (0, 1).]
Soft-input decoding
• Different branch metric, hence different path metric
– Same path metric computation

• Same Viterbi algorithm

• Result: Choose the path that minimizes the sum of squared
Euclidean distances between the received and expected voltages
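
In a sketch, the only change from hard decoding is the branch metric
line; everything else in the decoder stays the same:

    def branch_metric_soft(received_volts, expected_bits):
        """Squared Euclidean distance to the branch's nominal 0/1 V outputs."""
        return sum((v - b) ** 2 for v, b in zip(received_volts, expected_bits))

    # A weak "1" (0.51 V) now contributes an almost-neutral metric instead
    # of being treated like a confident 1:
    print(branch_metric_soft((0.51, 0.99), (0, 1)))   # 0.2602
    print(branch_metric_soft((0.51, 0.99), (1, 1)))   # 0.2402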
Putting it together:
Convolutional coding in Wi-Fi

[Diagram, transmitter: Data bits → Convolutional encoder →
Coded bits → Modulation (BPSK, QPSK, …) → channel]

[Diagram, receiver: Demodulation → Coded bits (hard-input decoding)
or voltage levels (soft-input decoding) → Viterbi Decoder →
Data bits]
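
A toy end-to-end run of this pipeline, reusing the conv_encode and
viterbi_decode sketches from earlier slides, with BPSK idealized as
mapping bits straight to 0.0/1.0 V and a hypothetical Gaussian-noise
channel:

    import random

    msg = [1, 0, 1, 1, 0, 0]
    coded = conv_encode(msg + [0, 0])                  # two 0 tail bits
    volts = [b + random.gauss(0, 0.2) for b in coded]  # modulation + noise
    pairs = list(zip(volts[0::2], volts[1::2]))        # (p0, p1) per step
    hard = [tuple(int(v > 0.5) for v in p) for p in pairs]  # slice at 0.5 V
    print(viterbi_decode(hard)[:-2])                   # usually recovers msg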
Thursday Topic:
Rateless Codes

Next week's Precepts:
Lab 2
