Unit IV Convolutional Codes

The document discusses convolutional codes including: 1. It introduces convolutional codes, their encoder representation, generator matrices, and trellis diagrams. 2. It explains the connection representation, polynomial representation, state diagram, and tree diagram used to describe convolutional codes. 3. It provides an example of a (2,1,2) convolutional code to illustrate the encoder output, polynomial representation, and state diagram.


Unit IV: Convolutional Codes

• Introduction of convolutional codes (R1: 7.1 to 7.3, pages 381 to 408; T2: Chapter 6)
• Polynomial description of convolutional codes
• Generator matrix of convolutional codes
• State diagram
• Tree diagram
• Trellis diagram
• Sequential decoding and Viterbi decoding (R1: page 418)
• Known good convolutional codes (T2: Chapter 7, pages 277 to 278)
• Introduction to LDPC codes
• Turbo codes (T2: Chapter 7, pages 209 to 212)

T2: J C Moreira, P G Farrell, "Essentials of Error-Control Coding", Wiley Student Edition.
R1: Bernard Sklar, "Digital Communications: Fundamentals and Applications", Pearson Education, Second Edition.
Introduction
• Convolutional codes were first introduced by Elias in 1955
as an alternative to block codes.
• Wozencraft proposed sequential decoding as an efficient
decoding scheme for convolutional codes.
• In 1963, Massey proposed a less efficient but simpler-to-implement decoding method called threshold decoding.
• In 1967, Viterbi proposed a maximum likelihood decoding scheme that was relatively easy to implement for codes with small memory orders.
• Viterbi decoding, together with improved versions of sequential decoding, led to the application of convolutional codes to deep-space and satellite communication in the early 1970s.
• Convolutional codes also provide reliable data transfer in digital video and radio communication.
Introduction of Convolutional Codes

[Block diagram: an (n, k, m) encoder takes 'k' input bits and produces 'n' output bits.]

• Output depends not only on the current set of 'k' input bits, but also on 'm' past inputs.
• A convolutional encoder is a linear sequential circuit with input memory m.
• Useful for low-latency communications.
• Achieves good performance by expanding the memory depth.
• Applied in applications that require good performance with low implementation cost.
Convolutional encoder representation

• Connection representation
• Connection vector or Polynomial representation
• The State diagram
• The tree diagram
• The trellis diagram

Connection Representation

[Encoder diagram: input k = 1 (current bit C), shift-register stages B and A, output n = 2.]

X = C ⊕ B ⊕ A
Y = C ⊕ A

Constraint length K = m + 1 = 2 + 1 = 3; (n, k, m) = (2, 1, 2)
Code rate r = 1/n = 1/2; (r, K) = (1/2, 3)

CODE RATE (r):
An L-bit message sequence produces a coded output sequence of length n(L + m) bits, so
r = L / n(L + m) = number of message bits / number of transmitted bits.
For L >> m, r → L/(nL) = 1/n bits per symbol.

CONSTRAINT LENGTH (K):
The number of shifts over which a single message bit can influence the encoder output.
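The encoder just described can be sketched in a few lines. This is a minimal illustration (not from the slides): registers B and A hold the two past input bits, each new bit C produces the branch word (X, Y), and m = 2 tail zeros flush the register so an L-bit message yields n(L + m) output bits.

```python
def conv_encode(bits):
    """Rate-1/2, K=3 encoder: X = C^B^A, Y = C^A (a sketch of the slide's circuit)."""
    B = A = 0                      # shift-register stages (past two inputs)
    out = []
    for C in bits + [0, 0]:        # append m = 2 tail zeros to flush the register
        out += [C ^ B ^ A, C ^ A]  # branch word (X, Y); XOR is addition mod 2
        B, A = C, B                # shift: B <- C, A <- B
    return out

print(conv_encode([1, 0, 1]))      # -> [1, 1, 1, 0, 0, 0, 1, 0, 1, 1], i.e. 11 10 00 10 11
```

This reproduces the worked example on the following slides: data 101 gives the 10-bit sequence 11 10 00 10 11.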
[Figure: encoder shift-register contents (C B A) and the resulting branch words (v1 v2) at times t1 to t4 for the message sequence m = (1 0 1), with v1 = C ⊕ B ⊕ A and v2 = C ⊕ A.]
EXAMPLE:

Encoder: X = C ⊕ B ⊕ A, Y = C ⊕ A; input k = 1, output n = 2. Let data = 101.

Coded output sequence will be of length n(L + m) = 2(3 + 2) = 10 bits.

input C | B A | X Y
   1    | 0 0 | 1 1
   0    | 1 0 | 1 0
   1    | 0 1 | 0 0
   0    | 1 0 | 1 0
   0    | 0 1 | 1 1

Input: 1 0 1 → Output: 11 10 00 10 11
g1 X   1  X  X
2
Polynomial Representation

U ( X )  m( X )g 1 X  interlaced with
C B A X 101 then m (X )  1 X
m( X )g 2m
m (X)g1 X   (1 2X
2 2
)(1 X  X 3 )
m ( X )g 2 X   (1  X 4 )
1
(1  X  X X2
) 2 1X
4

g 2 X   1  X 2

m (X )g1 X  1 X  0X 2  X 3  X
4

m (X )g2 X 
1 0X  0X 2  0X 3 
 U(X )  (1,X 1)
4  (1, 0)X (0, 0)X 2  (1, 0)X 3  (1,

1)X 4
U 11 10 0 0 10 1
1
13-Dec-2014
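The interlacing above can be checked by multiplying the polynomials over GF(2). A small sketch (the function name gf2_polymul is my own; coefficients are listed lowest power first):

```python
def gf2_polymul(a, b):
    """Multiply two binary polynomials; coefficient addition mod 2 is XOR."""
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] ^= ai & bj

    return prod

m  = [1, 0, 1]   # m(X)  = 1 + X^2  (message 101)
g1 = [1, 1, 1]   # g1(X) = 1 + X + X^2
g2 = [1, 0, 1]   # g2(X) = 1 + X^2

u1 = gf2_polymul(m, g1)                   # 1 + X + X^3 + X^4
u2 = gf2_polymul(m, g2)                   # 1 + X^4
U  = [bit for pair in zip(u1, u2) for bit in pair]
print(U)         # -> [1, 1, 1, 0, 0, 0, 1, 0, 1, 1], i.e. U = 11 10 00 10 11
```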
6.2 and 6.3: Time-Domain and Transform-Domain Approaches to Convolutional Codes

T2: J C Moreira, P G Farrell, "Essentials of Error-Control Coding", Wiley Student Edition.
State diagram

Encoder: X = C ⊕ B ⊕ A, Y = C ⊕ A.

• States: 2² = 4 → 00, 01, 10, 11
• Only two transitions initiate from each state
• Only two transitions end up in each state

State table, for all possible combinations (this state BA, input C, next state BA, output XY):

This state | Input C | Next state | Output XY
    00     |    0    |     00     |    00
    00     |    1    |     10     |    11
    01     |    0    |     00     |    11
    01     |    1    |     10     |    00
    10     |    0    |     01     |    10
    10     |    1    |     11     |    01
    11     |    0    |     01     |    01
    11     |    1    |     11     |    10

[State diagram: the four states with transitions labeled C/XY: 0/00 and 1/11 from 00; 0/11 and 1/00 from 01; 0/10 and 1/01 from 10; 0/01 and 1/10 from 11.]

Prof. Sharada N. Ohatkar
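The state table can also be generated mechanically from the encoder equations, which is how a trellis is usually built in software. A small sketch (the helper name step is my own):

```python
def step(state, C):
    """From state (B, A) on input C: return (next state, branch word (X, Y))."""
    B, A = state
    return (C, B), (C ^ B ^ A, C ^ A)   # the shift register advances to (C, B)

# Enumerate all states and inputs to reproduce the table above.
for B in (0, 1):
    for A in (0, 1):
        for C in (0, 1):
            (nB, nA), (X, Y) = step((B, A), C)
            print(f"BA={B}{A}  C={C}  ->  BA'={nB}{nA}  XY={X}{Y}")
```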
2. Tree diagram

[Tree diagram for the (2,1,2) encoder: the code tree branches on each input bit, with every branch labeled by the output XY from the state table.]

Trellis diagram

[Trellis diagram: the four states 00, 01, 10, 11 are drawn at each time step ti, with transitions labeled C/XY: 0/00 and 1/11 from 00; 0/11 and 1/00 from 01; 0/10 and 1/01 from 10; 0/01 and 1/10 from 11.]

Example: tracing the input sequence 1 0 0 1 1 1 0 1 1 through the trellis from t0 gives the output sequence 11 10 11 11 01 10 01 00 01 01.

T.E.(ETC/Elex) Faculty Orientation Workshop on ITCT, 13-Dec-2014
Minimum Free Distance of a Convolutional Code

[Trellis paths at distances 5, 6 and 6 from the all-zero sequence; the smallest of these, d_free = 5, is the minimum free distance of this code.]

The error-correction capability of the code is then defined as the number t of errors that can be corrected:

t = ⌊(d_free − 1) / 2⌋
Decoding of Convolutional Codes

Sequential Decoding (Wozencraft, 1957):

• Explores just one path at a time.
• Allows both forward and backward movement through the trellis.
• The decoder keeps track of its decisions and tallies the discrepancy with the received sequence.
• If the tally increases faster than a threshold value, the decoder gives up that path and retraces it back to the last fork where the tally was below the threshold.
• Used with long-constraint-length codes where the SNR is also low, but has variable decoding time.
• Example: NASA planetary mission links.
Maximum Likelihood Detection

An optimal decoder is one that compares the conditional probabilities P(sr|c′) that the received sequence sr corresponds to a possible code sequence c′, and then decides upon the code sequence with the highest conditional probability.

[Figure: the code sequence c generated by the encoder passes through channel noise to produce the received sequence sr.]

This is the maximum likelihood criterion: the idea of decoding by selecting the code sequence that is most alike the received sequence.

For a code sequence of length L bits there are 2^(Rc·L) possible sequences, where Rc is the rate of the code. The maximum likelihood decoder selects the sequence c′, from the set of all these possible sequences, which has the maximum similarity to the received sequence.
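The criterion can be made concrete with a brute-force sketch: enumerate every possible message, encode each one (the rate-1/2 encoder from the earlier slides is re-included so the snippet is self-contained), and pick the code sequence nearest in Hamming distance to the received sequence. This is only feasible for very short L; the Viterbi algorithm performs the same search efficiently.

```python
from itertools import product

def conv_encode(bits):
    """The (2,1,2) encoder from the earlier slides: X = C^B^A, Y = C^A, plus 2 tail zeros."""
    B = A = 0
    out = []
    for C in list(bits) + [0, 0]:
        out += [C ^ B ^ A, C ^ A]
        B, A = C, B
    return out

def ml_decode(received, L):
    """Try all 2^L messages; return the one whose codeword is closest to `received`."""
    return min((list(msg) for msg in product([0, 1], repeat=L)),
               key=lambda m: sum(r != c for r, c in zip(received, conv_encode(m))))

# 11 10 00 10 11 with its first bit flipped still decodes to the message 101:
print(ml_decode([0, 1, 1, 0, 0, 0, 1, 0, 1, 1], L=3))   # -> [1, 0, 1]
```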
6.12 Decoding of Convolutional Codes: The Viterbi Algorithm

T2: J C Moreira, P G Farrell, "Essentials of Error-Control Coding", Wiley Student Edition.
Viterbi Decoding: Errorless Reception

Received bits: 11 10 11 11 01 10 01 00 01 01 11

[Trellis diagram for times T1 to T11 with accumulated Hamming path metrics written at each state; where two paths merge, their metrics (e.g. 5,0 or 3,4) are compared and the larger-metric path is discarded.]

Decoded output sequence: 1 0 0 1 1 1 0 1 1 (followed by the two flushing zeros), recovering the message traced through the trellis earlier.
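The survivor-path bookkeeping in the trellis above can be sketched as a compact Viterbi decoder. This is an illustrative implementation, not the slides' own: for each state it keeps only the smallest accumulated Hamming metric and the path that achieved it, discarding the larger-metric path at every merge.

```python
def viterbi_decode(received):
    """Hard-decision Viterbi decoding for the (2,1,2) encoder (X = C^B^A, Y = C^A)."""
    pairs = [tuple(received[i:i + 2]) for i in range(0, len(received), 2)]
    survivors = {(0, 0): (0, [])}            # state -> (path metric, decoded bits)
    for rx in pairs:
        nxt = {}
        for (B, A), (metric, path) in survivors.items():
            for C in (0, 1):
                out = (C ^ B ^ A, C ^ A)     # branch word for this transition
                d = metric + (out[0] != rx[0]) + (out[1] != rx[1])
                s = (C, B)                   # next state after shifting
                if s not in nxt or d < nxt[s][0]:
                    nxt[s] = (d, path + [C]) # keep the smaller-metric survivor
        survivors = nxt
    return survivors[(0, 0)][1]              # tail zeros force the final state to 00

# The errorless received sequence from the slide: 11 10 11 11 01 10 01 00 01 01 11.
rx = [1,1, 1,0, 1,1, 1,1, 0,1, 1,0, 0,1, 0,0, 0,1, 0,1, 1,1]
print(viterbi_decode(rx))   # -> [1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 0]
```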
Known good convolutional codes:

• Avoid 'catastrophic convolutional codes'. The state diagram of a catastrophic convolutional code includes at least one loop in which a nonzero information sequence corresponds to an all-zero output sequence.
• Choose the maximum free distance for the given rate and constraint length.
• The number of data-bit errors the paths represent should be minimized.
LDPC Codes: Overview

• LDPC codes are linear block codes decoded by efficient iterative decoding.
• An LDPC parity-check matrix H represents the parity equations in a linear form:
  • a codeword c satisfies the set of parity equations H · c = 0;
  • each column in the matrix represents a codeword bit;
  • each row represents a parity-check equation.

      [1 1 0 1 0 0 0]        c0 ⊕ c1 ⊕ c3 = 0
  H = [0 1 1 0 1 0 0]        c1 ⊕ c2 ⊕ c4 = 0
      [0 0 1 1 0 1 0]        c2 ⊕ c3 ⊕ c5 = 0
      [0 0 0 1 1 0 1]        c3 ⊕ c4 ⊕ c6 = 0
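The parity-check equations above can be verified numerically: c is a codeword if and only if H·c = 0 (mod 2). A small sketch (the example word below is my own choice that satisfies all four equations):

```python
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0, 0],
              [0, 1, 1, 0, 1, 0, 0],
              [0, 0, 1, 1, 0, 1, 0],
              [0, 0, 0, 1, 1, 0, 1]])

def syndrome(c):
    """All-zero syndrome <=> c satisfies every parity-check equation."""
    return H.dot(c) % 2

c = np.array([1, 1, 1, 0, 0, 1, 0])                 # satisfies all four checks
print(syndrome(c))                                  # -> [0 0 0 0]
print(syndrome((c + np.eye(7, dtype=int)[0]) % 2))  # flip c0: -> [1 0 0 0]
```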
LDPC Codes: Overview

• Code rate: the ratio of information bits to the total number of bits in a codeword.
• LDPC codes are represented by Tanner graphs, which have two types of vertices: bit vertices and check vertices.
• The performance of an LDPC code is affected by the presence of cycles in its Tanner graph.

[Tanner graph for the matrix H above: bit vertices 0 to 6 connected to check vertices 0 to 3, one edge per 1 in H.]
LDPC Codes and Their Applications

• Low Density Parity Check (LDPC) codes have superior error performance: about 4 dB coding gain over convolutional codes.
• Standards and applications:
  • 10 Gigabit Ethernet (10GBASE-T)
  • Digital Video Broadcasting (DVB-S2, DVB-T2, DVB-C2)
  • Next-gen wired home networking (G.hn)
  • WiMAX (802.16e)
  • WiFi (802.11n)
  • Hard disks
  • Deep-space satellite missions

[Plot: bit error probability versus signal-to-noise ratio (dB) for uncoded, convolutional-coded, and LDPC-coded transmission, showing the 4 dB gain of the LDPC code.]
Turbo Codes

• A Turbo Code Encoder

T2: J C Moreira, P G Farrell, "Essentials of Error-Control Coding", Wiley Student Edition.
THANK YOU
ALL