LDPC Tutorial: LDPC Decoding and Implementations
Ned Varnica
[email protected]
Marvell Semiconductor Inc.
Module 1: LDPC Decoding
Overview
Error Correction Codes (ECC)
Intro to Low-density parity-check (LDPC) Codes
ECC Decoders Classification
Soft vs Hard Information
Message Passing Decoding of LDPC Codes
Iterative Code Performance Characteristics
Error Correction Codes (ECC)
Error Correcting Codes (ECC)
The 4-bit message space maps into the 7-bit codeword space; each codeword consists of the user message bits plus parity bits (redundancy)

    4-bit message   7-bit codeword
    0000            0000000
    1000            1101000
    0100            0110100
    1100            1011100
    0010            1110010
    1010            0011010
    0110            1000110
    1110            0101110
    0001            1010001
    1001            0111001
    0101            1100101
    1101            0001101
    0011            0100011
    1011            1001011
    0111            0010111
    1111            1111111

    Rate = 4 / 7

Only 16 of the 128 words in the 7-bit word space are codewords; e.g. the word 0010000 is not a codeword, while 0000000 is
Linear Block Codes
Block Codes
User data is divided into blocks (units) of length K bits/symbols
Each K bit/symbol user block is mapped (encoded) into an N bit/symbol codeword,
where N > K
Example:
in Flash devices, a user block length of K = 2 Kbytes or 4 Kbytes is typical
the code rate R = K / N is usually ~0.9 and higher
Important Linear Block Codes
Reed-Solomon Codes (non-binary)
Bose, Chaudhuri, Hocquenghem (BCH) Codes (binary)
Low Density Parity Check (LDPC) Codes
Turbo-Codes
Iterative (ITR) Codes
[Figure: iterative code structure; user bits u1 ... u5 are encoded into the user bits u1 ... u5 plus parity bits p1 ... p5]
Generator Matrix and Parity Check Matrix
A linear block code can be defined by a generator matrix G:

    v = u G        (encoding: user message u, codeword v)

    G = [ g_0,0      g_0,1      ...  g_0,N-1
          g_1,0      g_1,1      ...  g_1,N-1
          ...        ...        ...  ...
          g_K-1,0    g_K-1,1    ...  g_K-1,N-1 ]     (K x N)

The matrix associated with G is the parity check matrix H, s.t. G H^T = 0
A vector v is a codeword if H v^T = 0
A non-codeword (codeword + noise) will generate a non-zero vector s = H v^T, which is called the syndrome
The syndrome can be used in decoding

Example:

    G = [ 1 0 0 0 1 0 1          H = [ 1 1 1 0 1 0 0
          0 1 0 0 1 1 1                0 1 1 1 0 1 0
          0 0 1 0 1 1 0                1 1 0 1 0 0 1 ]
          0 0 0 1 0 1 1 ]

Encoding: u = (1 0 1 1)  ->  v = u G = (1 0 1 1 0 0 0),  and H v^T = 0
Decoding: received v = (0 0 1 1 0 0 0)  ->  s = H v^T = (1 0 1) != 0 (errors detected)
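The encoding and syndrome computations above fit in a few lines. A minimal sketch in plain Python (GF(2) arithmetic done with mod-2 sums; the helper names `encode` and `syndrome` are illustrative, not from the slides):

```python
# (7,4) code from the example: G generates codewords, H checks them.
G = [
    [1,0,0,0,1,0,1],
    [0,1,0,0,1,1,1],
    [0,0,1,0,1,1,0],
    [0,0,0,1,0,1,1],
]
H = [
    [1,1,1,0,1,0,0],
    [0,1,1,1,0,1,0],
    [1,1,0,1,0,0,1],
]

def encode(u):
    """v = u G (mod 2): each codeword bit is a mod-2 sum of message bits."""
    return [sum(u[k] * G[k][n] for k in range(len(u))) % 2 for n in range(len(G[0]))]

def syndrome(v):
    """s = H v^T (mod 2): all-zero exactly when v is a codeword."""
    return [sum(row[n] * v[n] for n in range(len(v))) % 2 for row in H]

v = encode([1, 0, 1, 1])       # -> [1, 0, 1, 1, 0, 0, 0]
s = syndrome([0, 0, 1, 1, 0, 0, 0])  # non-codeword -> non-zero syndrome [1, 0, 1]
```

Running the decoding example through `syndrome` reproduces the non-zero syndrome (1 0 1) shown above.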
Low-Density Parity-Check (LDPC) Codes
David MacKay
LDPC Codes
An LDPC code is often defined by its parity check matrix H
The parity check matrix H of an LDPC code with practical length has low density (most entries are 0s, and only a few are 1s), thus the name Low-Density Parity-Check Code
Each bit of an LDPC codeword corresponds to a column of the parity check matrix
Each row of H corresponds to a single parity check
For example, the first row indicates that for any codeword the sum (modulo 2) of bits 0, 1, and N-1 must be 0

         bit 0  bit 1   ...   bit N-1
    H = [ 1 1 0 0 ... 0 1      <- parity check equation: bit 0 + bit 1 + bit N-1 = 0
          0 0 0 1 ... 0 0
          0 0 1 0 ... 0 1
          ...
          1 0 0 0 ... 1 0 ]
ECC Decoder Classification:
Hard vs Soft Decision Decoding
Hard vs. Soft Decoder Classification
Hard decoders only take hard decisions (bits) as the input
E.g. the standard BCH and RS ECC decoding algorithm (Berlekamp-Massey algorithm) is a hard decision decoder
A hard decoder algorithm can be used if one read is available

[Figure: write path EDC encoder -> BCH encoder -> channel; read path Front End / Detection -> hard decisions {1,0} -> BCH decoder -> EDC decoder; e.g. the read word 010101110 is corrected to 010101010]
Hard vs. Soft Decoder Classification
An error-and-erasure decoder is a variant of a soft information decoder: in addition to hard decisions, it takes erasure flags as an input
An error-and-erasure decoder algorithm can be used if two reads are available

[Figure: write path EDC encoder -> encoder -> channel; read path Front End / Detection -> decisions {1,0,*} -> error-and-erasure decoder -> EDC decoder; e.g. the read word 010101**0 is corrected to 010101010]
Hard vs. Soft Decoder Classification
Erasure flag is an example of soft information (though very primitive)
Erasure flag points to symbol locations that are deemed unreliable by
the channel
Normally, for each erroneous symbol, the decoder has to determine that the symbol is in error and find the correct symbol value. However, if the erasure flag identifies the error location, then only the error value is unknown
Therefore, the erasure flag effectively reduces the number of unknowns that the decoder needs to resolve
Hard vs. Soft Decoder Classification
Example. Rate 10/11 Single parity check (SPC) code
Each valid 11-bit SPC codeword c = (c0, c1, ..., c10) has the sum (mod 2) of all the bits equal to 0
Assume that (0,0,0,0,0,0,0,0,0,0,0) is transmitted, and (0,0,0,0,1,0,0,0,0,0,0) is received by the decoder
The received vector does not satisfy the SPC code constraint, indicating to the decoder that there are errors present in the codeword
Furthermore, assume that the channel detector provides a bit-level reliability metric in the form of a probability (confidence) that the received value is correct
Assume that the soft information corresponding to the received codeword is given by (0.9, 0.8, 0.86, 0.7, 0.55, 1, 1, 0.8, 0.98, 0.68, 0.99)
From the soft information it follows that bit c4 is least reliable and should be flipped to bring the received codeword into compliance with the code constraint
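The SPC example above can be sketched directly: check the parity, and if it fails, flip the single least reliable bit. A minimal sketch (the helper name `spc_decode` is illustrative):

```python
def spc_decode(bits, confidence):
    """Soft-decision SPC decoding: if the mod-2 parity fails,
    flip the bit with the lowest confidence value."""
    if sum(bits) % 2 == 0:     # parity satisfied: accept the word as-is
        return list(bits)
    corrected = list(bits)
    weakest = min(range(len(bits)), key=lambda i: confidence[i])
    corrected[weakest] ^= 1    # flip the least reliable bit
    return corrected

received   = [0,0,0,0,1,0,0,0,0,0,0]
confidence = [0.9,0.8,0.86,0.7,0.55,1,1,0.8,0.98,0.68,0.99]
decoded = spc_decode(received, confidence)  # bit c4 (confidence 0.55) is flipped back to 0
```

With the confidences from the slide, bit c4 is the minimum (0.55), so the decoder recovers the all-zero codeword.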
Obtaining Hard or Soft Information
from Flash Devices
One Read: Hard Information
Obtaining hard information (decisions) via one read
One V_REF threshold is available: the threshold value should be selected so that the average raw bit-error-rate (BER) is minimized
In each bit location, the hard decision hd = 0 or hd = 1 is made
This information can be used as input into the decoder

[Figure: two overlapping cell distributions separated at V_REF; the left decision bin gives hd = 1, the right decision bin gives hd = 0; the shaded overlap area denotes the probability that a bit error is made]
Multiple Reads: Soft Information
Obtaining soft information via multiple reads
Create bins
Bins can be optimized in terms of their sizes / distribution of V_REF values given the number of available reads (e.g. 5 reads)
These bins can be mapped into probabilities
Typically, the closer the bin to the middle point, the lower the confidence that the bit value (hard-read value) is actually correct

[Figure: thresholds V_REF1 ... V_REF5 partition the axis into decision bins A1, B1, C1, C0, B0, A0; example mappings: Pr(bit=1) = 90%, 65%, 55% for bins A1, B1, C1 and Pr(bit=0) = 55%, 65%, 90% for bins C0, B0, A0]
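The bin-to-probability mapping can be sketched as a simple lookup. The threshold voltages and per-bin probabilities below are hypothetical stand-ins (real devices calibrate both); the probabilities follow the figure's pattern of confidence falling toward the middle bins:

```python
import bisect

# Hypothetical read thresholds (volts), V_REF1 ... V_REF5, in increasing order
V_REFS = [1.0, 1.5, 2.0, 2.5, 3.0]

# One Pr(bit = 1) entry per decision bin (6 bins for 5 thresholds):
# 0.45/0.35/0.10 on the right correspond to Pr(bit = 0) = 0.55/0.65/0.90
PR_BIT1 = [0.90, 0.65, 0.55, 0.45, 0.35, 0.10]

def soft_read(voltage):
    """Map a sensed cell voltage to Pr(bit = 1) via its decision bin."""
    bin_index = bisect.bisect(V_REFS, voltage)  # number of thresholds below voltage
    return PR_BIT1[bin_index]
```

A cell read far below V_REF1 maps to a confident Pr(bit=1) = 0.90, while one just past the middle threshold maps to a weak 0.45.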
ITR Decoders with Soft Information
[Figure: SFR (sector failure rate, 10^0 down to 10^-4) vs. SNR [dB] / raw BER; an RS code (t = 20) with a hard, single-pass decoder is compared against an LDPC code (same parity size) with a hard-input soft ITR decoder and a soft-input soft ITR decoder]

Soft ITR decoders significantly outperform hard decision counterparts
Decoding LDPC Codes
Representation on Bi-Partite (Tanner) Graphs
variable nodes: encoded bits
check nodes: parity check constraints
Each 1 in the parity check matrix is represented by an edge between the corresponding variable node (column) and check node (row)

    H = [ 1 1 1 1 0 0
          1 0 0 0 0 1
          0 1 0 1 1 0
          0 0 1 0 1 1
          0 0 1 0 1 0 ]

[Figure: Tanner graph of H; an edge connects variable node n to check node m wherever H(m, n) = 1]
Hard Decision Decoding: Bit-Flipping Decoder
Decision to flip a bit is made based on the number of unsatisfied checks
connected to the bit
[Figure: bit-flipping decoder steps on the example graph]
First step: examine the number of unsatisfied check neighbors for each bit; the left-most bit is the only bit that has 2 unsatisfied check neighbors; flip the left-most bit
Second step: examine the number of unsatisfied check neighbors for each bit; the second bit from the left is the only bit that has 2 unsatisfied check neighbors; flip the second bit from the left
Result: valid codeword
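The flipping rule above can be sketched as a simple parallel bit-flipping decoder: each iteration recomputes the syndrome and flips every bit tied for the largest number of unsatisfied check neighbors. This is a minimal sketch (practical decoders refine the flipping criterion); it is shown on the small (7,4) parity check matrix used earlier in this module:

```python
def bit_flip_decode(H, bits, max_iters=20):
    """Parallel bit-flipping: flip all bits with the most unsatisfied checks."""
    m, n = len(H), len(H[0])
    bits = list(bits)
    for _ in range(max_iters):
        # Syndrome: one mod-2 value per check node (row of H)
        synd = [sum(H[j][i] * bits[i] for i in range(n)) % 2 for j in range(m)]
        if not any(synd):
            return bits                    # all checks satisfied: valid codeword
        # Count unsatisfied checks touching each bit
        unsat = [sum(H[j][i] for j in range(m) if synd[j]) for i in range(n)]
        worst = max(unsat)
        for i in range(n):                 # flip the worst offenders
            if unsat[i] == worst:
                bits[i] ^= 1
    return bits

H = [[1,1,1,0,1,0,0],
     [0,1,1,1,0,1,0],
     [1,1,0,1,0,0,1]]
# One bit error in codeword (1 0 1 1 0 0 0) is corrected in a few iterations:
fixed = bit_flip_decode(H, [0,0,1,1,0,0,0])
```

Note that flipping ties in parallel can temporarily introduce new errors (as happens on the first iteration here) before the decoder settles on the codeword.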
Bit-Flipping Decoder Progress on a Large
LDPC Code
Decoder starts with a relatively large number of errors
As decoder progresses, some bits are flipped to their correct values
Syndrome weight improves
As this happens, it becomes easier to identify the bits that are erroneous and
to flip the remaining error bits to actual (i.e. written / transmitted) values
    Iteration:        0    1    2    3    4    5    6    7
    Syndrome weight:  310  136  60   26   14   10   6    0
    Bit errors:       91   50   21   10   6    4    2    0
Soft Information Representation
The information used in soft LDPC decoder represents bit reliability metric,
LLR (log-likelihood-ratio)
The choice to represent reliability information in terms of LLR as opposed
to probability metric is driven by HW implementation consideration
The following chart shows how to convert LLRs to probabilities (and vice
versa)
    LLR(b_i) = log( P(b_i = 0) / P(b_i = 1) )
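The conversion between probabilities and LLRs under this convention (LLR > 0 means bit = 0 is more likely) can be sketched as:

```python
import math

def prob_to_llr(p0):
    """LLR(b) = log(P(b=0) / P(b=1)), with P(b=1) = 1 - P(b=0)."""
    return math.log(p0 / (1.0 - p0))

def llr_to_prob0(llr):
    """Inverse mapping: P(b=0) = 1 / (1 + exp(-LLR))."""
    return 1.0 / (1.0 + math.exp(-llr))

# P(b=0) = 0.9 gives a positive LLR (~+2.2); P(b=0) = 0.5 gives LLR = 0
```

The sigmoid form of `llr_to_prob0` is exactly the curve plotted on the next slide.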
Soft Information Representation
Bit LLR>0 implies bit=0 is more likely, while LLR<0 implies bit=1 is more likely
[Figure: P(b_i = 1) as a function of LLR, a sigmoid decreasing from 1 at large negative LLR through 0.5 at LLR = 0 to 0 at large positive LLR; P(b_i = 0) is its mirror image]
Soft Message Passing Decoder
LDPC decoding is carried out via a message passing algorithm on the graph corresponding to a parity check matrix H
The messages are passed along the edges of the graph
First from the bit nodes to check nodes
And then from check nodes back to bit nodes

[Figure: a 3 x 7 parity check matrix H and its graph; check nodes m = 0, 1, 2 are the rows of H, bit nodes n = 0 ... 6 are the columns of H]
Soft LDPC Decoder
There are four types of messages
    L_n          : message from the channel to the n-th bit node
    Q_{n->m}^(i) : message from the n-th bit node to the m-th check node
    R_{m->n}^(i) : message from the m-th check node to the n-th bit node
    P_n^(i)      : overall reliability information for the n-th bit node at the end of iteration i

[Figure: channel detectors feed L_n into the bit nodes n = 0 ... 6; shown on the graph are R_{2->3}^(i) and R_{0->3}^(i) arriving at bit node 3, Q_{3->1}^(i) sent from bit node 3 to check node m = 1, and the output reliability P_6^(i)]
Soft LDPC Decoder (cont.)
Message passing algorithms are iterative in nature
One iteration consists of
upward pass (bit node processing/variable node processing): bit nodes
pass the information to the check nodes
downward pass (check node processing): check nodes send the
updates back to bit nodes
The process then repeats itself for several iterations
Soft LDPC Decoder (cont.)
Bits-to-checks pass: the n-th bit node sums up all the information it has received at the end of the last iteration, except the message that came from the m-th check node, and sends the result to the m-th check node:

    Q_{n->m}^(i) = L_n + sum over m' != m of R_{m'->n}^(i-1)

At the beginning of iterative decoding all R messages are initialized to zero

[Figure: bit node n = 3 forms Q_{3->1}^(i) = L_3 + R_{0->3}^(i-1) + R_{2->3}^(i-1) and sends it to check node m = 1]
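The bits-to-checks update is a one-liner per edge. A minimal sketch (the dict-based message layout and the helper name `bit_to_check` are illustrative choices, not from the slides):

```python
def bit_to_check(L_n, R_incoming, exclude_m):
    """Q_{n->m} = L_n + sum of R_{m'->n} over all check neighbors m' except m.

    R_incoming: dict mapping check index m' -> R_{m'->n} from the last iteration.
    """
    return L_n + sum(r for m, r in R_incoming.items() if m != exclude_m)

# Bit node n = 3 with channel LLR L_3 = 2.0 and messages from checks 0 and 2:
Q_3_to_1 = bit_to_check(2.0, {0: -1.0, 2: 0.5}, exclude_m=1)  # 2.0 - 1.0 + 0.5
```

Excluding the message that came from the destination check (the "extrinsic" principle) prevents a check node from hearing its own information echoed back.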
Soft LDPC Decoder (cont.)
Checks-to-bits pass:
A check node has to receive the messages from all participating bit nodes before it can start sending messages back
The least reliable of the incoming extrinsic messages determines the magnitude of a check-to-bit message; the sign is determined so that the modulo 2 sum is satisfied

Example: check node m receives Q_{n1->m}^(i) = 10, Q_{n2->m}^(i) = 5, Q_{n3->m}^(i) = -13 and returns

    R_{m->n1}^(i) = -5     (magnitude min(|5|, |-13|), sign of Q_{n2} * Q_{n3})
    R_{m->n2}^(i) = -10    (magnitude min(|10|, |-13|), sign of Q_{n1} * Q_{n3})
    R_{m->n3}^(i) = +5     (magnitude min(|10|, |5|), sign of Q_{n1} * Q_{n2})
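The check-node rule described above (smallest extrinsic magnitude, parity-consistent sign) can be sketched as:

```python
def check_to_bit(Q_incoming):
    """For each edge, return the min magnitude and product sign of the
    OTHER incoming bit-to-check messages."""
    R = []
    for i in range(len(Q_incoming)):
        others = Q_incoming[:i] + Q_incoming[i+1:]   # extrinsic messages only
        sign = 1
        for q in others:
            sign *= 1 if q >= 0 else -1
        R.append(sign * min(abs(q) for q in others))
    return R

# The example from this slide: Q = (10, 5, -13)
R = check_to_bit([10, 5, -13])   # (-5, -10, +5)
```

This min/sign rule is the min-sum approximation of the exact check-node update; it keeps the hardware to comparisons and sign logic instead of transcendental functions.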
Soft LDPC Decoder (cont.)
At the end of each iteration, the bit node computes overall reliability information by summing up ALL the incoming messages:

    P_n^(i) = L_n + sum over m of R_{m->n}^(i)

The P_n^(i) values are then quantized to obtain hard decision values for each bit:

    hd_n = 1 if P_n^(i) < 0, else hd_n = 0

Stopping criterion for an LDPC decoder:
Maximum number of iterations has been processed, OR
All parity check equations are satisfied
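The end-of-iteration step for one bit node follows directly from the two formulas above. A minimal sketch (list-based message layout and the helper name are illustrative):

```python
def app_and_decision(L_n, R_incoming):
    """APP for one bit node: P_n = L_n + sum of ALL check-to-bit messages,
    then quantize to a hard decision (negative LLR -> bit = 1)."""
    P_n = L_n + sum(R_incoming)
    hard = 1 if P_n < 0 else 0
    return P_n, hard

# A bit with channel LLR +4 outvoted by two negative check messages:
P, hd = app_and_decision(4.0, [-4.0, -7.0])   # P = -7.0 -> hd = 1
```

Unlike the Q message, the APP sum deliberately includes every incoming message, since it is a final estimate rather than an extrinsic message.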
LDPC Decoder Error Correction: Example 1
1st iteration:
Channel LLRs L_n = (+4, -12, +7, -9, +7, +10, -11) for bits n = 0 ... 6

[Figure: first iteration on a graph with check nodes m = 0, 1, 2 and bit nodes n = 0 ... 6; the bits-to-checks messages equal the channel LLRs (e.g. bit 0 sends +4 to each of its check neighbors), and the returned checks-to-bits messages include values such as -4, -7, -10, +4]
LDPC Decoder Error Correction: Example 1
APP messages and hard decisions after the 1st iteration:

    Channel LLRs:  +4  -12  +7   -9  +7  +10  -11
    P:             -5   +3  -8  -20  +3   +6   -7
    HD:             1    0   1    1   0    0    1
                  (n = 0 ... 6)

Valid codeword (syndrome = 0)