LDPC Decoder Help Doc

This document describes Low Density Parity Check (LDPC) decoding algorithms. It introduces the parity check matrix and Tanner graph representation of LDPC codes. It then describes message passing (belief propagation) decoding algorithms, including the sum-product algorithm, min-sum algorithm, normalized min-sum algorithm, and offset min-sum algorithm. The layered decoding algorithm is also introduced to improve convergence speed. The performance of these algorithms, from worst to best, is: min-sum, normalized min-sum, offset min-sum, and then BP and layered BP (equal performance, with layered BP converging faster).


LDPC Decoding

Parity check matrix & Tanner graph example

Notations:

Left side: variable nodes V_i, i ∈ {0, 1, 2, ⋯, 7}
Right side: check nodes C_j, j ∈ {0, 1, 2, 3}
Γ_i denotes the set of all check nodes connected to variable node V_i
Υ_j denotes the set of all variable nodes connected to check node C_j
L(c_i) denotes the LLR of encoded bit c_i
V_{i→j} is the message from variable node i to check node j
C_{j→i} is the message from check node j to variable node i
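
As a concrete illustration, a short Python sketch of how the neighborhoods Γ_i and Υ_j can be read off a parity check matrix; the 4×8 matrix H below is a hypothetical example chosen only to match the node counts above, not necessarily the matrix from the original figure:

```python
import numpy as np

# Hypothetical 4x8 parity check matrix: 4 check nodes C0..C3, 8 variable nodes V0..V7.
# Any binary H works; this one is only for illustration.
H = np.array([
    [1, 1, 0, 1, 1, 0, 0, 1],
    [1, 0, 1, 0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0, 1, 1, 0],
    [0, 0, 1, 1, 1, 0, 1, 1],
])

# Gamma_i: check nodes connected to variable node V_i  (support of column i)
Gamma = [np.flatnonzero(H[:, i]) for i in range(H.shape[1])]
# Upsilon_j: variable nodes connected to check node C_j (support of row j)
Upsilon = [np.flatnonzero(H[j, :]) for j in range(H.shape[0])]

print(Gamma[0])    # check nodes connected to V0
print(Upsilon[0])  # variable nodes connected to C0
```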
Message (belief) passing from variable node to check node in the nth round:

$$V_{i \to j}^{(n)} = L(c_i) + \sum_{k \in \Gamma_i \setminus \{j\}} C_{k \to i}^{(n-1)}$$
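
A minimal sketch of this update for a single edge (i, j), assuming the incoming check messages are stored in a dictionary keyed by (check, variable) pairs; the names are illustrative only:

```python
def variable_to_check(i, j, channel_llr, C_msgs, Gamma):
    # V_{i->j}^{(n)} = L(c_i) + sum of C_{k->i}^{(n-1)} over k in Gamma_i \ {j}
    return channel_llr[i] + sum(C_msgs[(k, i)] for k in Gamma[i] if k != j)
```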

Message passing from check node to variable node in the nth round:

$$C_{j \to i}^{(n)} = 2 \tanh^{-1}\left( \prod_{k \in \Upsilon_j \setminus \{i\}} \tanh\left( \frac{V_{k \to j}^{(n-1)}}{2} \right) \right)$$

with its min-sum approximation (used by the min-sum algorithms below):

$$C_{j \to i}^{(n)} \approx \prod_{k \in \Upsilon_j \setminus \{i\}} \operatorname{sgn}\left( V_{k \to j}^{(n-1)} \right) \cdot \min_{k \in \Upsilon_j \setminus \{i\}} \left| V_{k \to j}^{(n-1)} \right|$$
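
A sketch of the exact (tanh-based) check node update, assuming the same dictionary-based message storage as above; the clipping of the product away from ±1 is a numerical safeguard added here, not part of the formula:

```python
import numpy as np

def check_to_variable(j, i, V_msgs, Upsilon):
    # C_{j->i}^{(n)} = 2 * atanh( product over k in Upsilon_j \ {i} of tanh(V_{k->j}/2) )
    prod = 1.0
    for k in Upsilon[j]:
        if k != i:
            prod *= np.tanh(V_msgs[(k, j)] / 2.0)
    prod = np.clip(prod, -0.999999, 0.999999)  # keep atanh finite
    return 2.0 * np.arctanh(prod)
```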

Message passing decoding algorithm (belief propagation decoding) [2]

For a transmitted encoded codeword c = [c_0, c_1, ⋯] with channel output y = [y_0, y_1, ⋯], the input to the LDPC decoder is the log-likelihood ratio (LLR) value:

$$L(c_i) = \log \left\{ \frac{\Pr(c_i = 0 \mid y_i)}{\Pr(c_i = 1 \mid y_i)} \right\}$$
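
The document does not fix a channel model; as one common case, assuming BPSK mapping (bit 0 → +1, bit 1 → −1) over an AWGN channel with noise variance σ², this LLR reduces to 2·y_i/σ²:

```python
import numpy as np

def channel_llr(y, noise_var):
    # L(c_i) = log[ Pr(c_i=0 | y_i) / Pr(c_i=1 | y_i) ] = 2 * y_i / sigma^2
    # for BPSK (0 -> +1, 1 -> -1) over AWGN; this mapping is an assumption.
    return 2.0 * np.asarray(y, dtype=float) / noise_var
```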
Initialization: $V_{i \to j}^{(0)} = 0$, $C_{j \to i}^{(0)} = 0$, $n = 0$
for each iteration n

for each variable node i: use the messages received from check nodes in the previous round, together with the channel LLR, to calculate and send messages to the connected check nodes:

$$V_{i \to j}^{(n)} = L(c_i)^{(0)} + \sum_{k \in \Gamma_i \setminus \{j\}} C_{k \to i}^{(n-1)}$$

each check node j receives the messages from its connected variable nodes, then calculates and sends messages back:

$$C_{j \to i}^{(n)} = 2 \tanh^{-1}\left( \prod_{k \in \Upsilon_j \setminus \{i\}} \tanh\left( \frac{V_{k \to j}^{(n-1)}}{2} \right) \right)$$

update LLR:

$$L(c_i)^{(n)} = L(c_i)^{(0)} + \sum_{k \in \Gamma_i} C_{k \to i}^{(n)}$$
Make the hard decision for each bit based on the final L(c_i).
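
Putting the steps together, a minimal flooding BP decoder sketch; the dense (m × n) message arrays, the early exit on a zero syndrome, and all names are implementation choices for illustration, not prescribed by the document:

```python
import numpy as np

def decode_bp(H, llr, max_iter=50):
    """Flooding sum-product (BP) decoder sketch.

    H   : (m, n) binary parity check matrix
    llr : length-n channel LLRs L(c_i)
    Returns (hard-decision bits, posterior LLRs).
    """
    m, n = H.shape
    rows = [np.flatnonzero(H[j]) for j in range(m)]     # Upsilon_j
    cols = [np.flatnonzero(H[:, i]) for i in range(n)]  # Gamma_i
    llr = np.asarray(llr, dtype=float)
    V = np.zeros((m, n))   # V_{i->j}, stored at [j, i]
    C = np.zeros((m, n))   # C_{j->i}, stored at [j, i]
    L = llr.copy()
    bits = (L < 0).astype(int)

    for _ in range(max_iter):
        # Variable node -> check node: exclude the message coming from check j itself.
        for i in range(n):
            total = llr[i] + C[cols[i], i].sum()
            for j in cols[i]:
                V[j, i] = total - C[j, i]

        # Check node -> variable node: exact tanh rule.
        for j in range(m):
            t = np.tanh(V[j, rows[j]] / 2.0)
            for a, i in enumerate(rows[j]):
                prod = np.prod(np.delete(t, a))
                prod = np.clip(prod, -0.999999, 0.999999)  # keep atanh finite
                C[j, i] = 2.0 * np.arctanh(prod)

        # Posterior LLRs and tentative hard decisions.
        L = llr + C.sum(axis=0)
        bits = (L < 0).astype(int)
        if not np.any((H @ bits) % 2):  # all parity checks satisfied: stop early
            break
    return bits, L
```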

Layered decoding algorithm (layered BP) [3]

Same error-rate performance as BP, but converges faster.


Initialization: $V_{i \to j}^{(0)} = 0$, $C_{j \to i}^{(0)} = 0$, $n = 0$
for each iteration n

for each layer j (each layer corresponds to a check node, or a row of the exponent matrix)

for each variable node i connected to check node j: use the accumulated LLR L(c_i) and the message received from this layer in the previous iteration to calculate and send a message to check node j:

$$L(y_i) = L(c_i) - C_{j \to i}^{(n-1)}$$

$$V_{i \to j}^{(n)} = L(y_i)$$

check node j receives the messages from its connected variable nodes, then calculates and sends messages back:

$$C_{j \to i}^{(n)} = 2 \tanh^{-1}\left( \prod_{k \in \Upsilon_j \setminus \{i\}} \tanh\left( \frac{V_{k \to j}^{(n-1)}}{2} \right) \right)$$
$$L(c_i) = L(y_i) + C_{j \to i}^{(n)}$$

Make the hard decision for each bit based on the final L(c_i).
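
A corresponding layered BP sketch, processing one row of H per layer and updating the posterior LLRs immediately; as above, the array layout and the early exit are illustrative implementation choices, not part of the document's pseudocode:

```python
import numpy as np

def decode_layered_bp(H, llr, max_iter=50):
    """Layered BP decoder sketch: one layer = one check node (row of H)."""
    m, n = H.shape
    rows = [np.flatnonzero(H[j]) for j in range(m)]
    C = np.zeros((m, n))                     # C_{j->i}, stored at [j, i]
    L = np.asarray(llr, dtype=float).copy()  # running posterior LLRs L(c_i)

    for _ in range(max_iter):
        for j in range(m):                   # process layer j
            idx = rows[j]
            V = L[idx] - C[j, idx]           # L(y_i): strip this layer's old message
            t = np.tanh(V / 2.0)
            for a, i in enumerate(idx):
                prod = np.prod(np.delete(t, a))
                prod = np.clip(prod, -0.999999, 0.999999)  # keep atanh finite
                C[j, i] = 2.0 * np.arctanh(prod)
            L[idx] = V + C[j, idx]           # add the refreshed message back
        bits = (L < 0).astype(int)
        if not np.any((H @ bits) % 2):       # stop once all checks are satisfied
            break
    return bits, L
```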

Min-sum decoding algorithm [4]

Replace the C_{j→i}^{(n)} update above in the layered decoding algorithm with its min-sum approximation:

$$C_{j \to i}^{(n)} = \prod_{k \in \Upsilon_j \setminus \{i\}} \operatorname{sgn}\left( V_{k \to j}^{(n-1)} \right) \cdot \min_{k \in \Upsilon_j \setminus \{i\}} \left| V_{k \to j}^{(n-1)} \right|$$
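
A sketch of the min-sum check node update, computing all outgoing messages of one check node from its incoming messages; the straightforward loop is for clarity (practical implementations usually track only the two smallest magnitudes):

```python
import numpy as np

def check_messages_min_sum(V_in):
    """Min-sum replacement for the tanh-based check node update.

    V_in : array of incoming messages V_{k->j} for all k in Upsilon_j.
    Returns the outgoing C_{j->i} for every i, each computed from the
    other incoming messages (k != i).
    """
    V_in = np.asarray(V_in, dtype=float)
    out = np.empty_like(V_in)
    for a in range(len(V_in)):
        others = np.delete(V_in, a)
        sign = np.prod(np.where(others < 0.0, -1.0, 1.0))
        out[a] = sign * np.min(np.abs(others))
    return out
```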

Normalized min-sum decoding algorithm [4]

Scale the min-sum approximation by a positive factor less than 1:

$$C_{j \to i}^{(n)} \leftarrow \alpha \, C_{j \to i}^{(n)}, \quad 0 < \alpha < 1$$

Offset min-sum decoding algorithm [4]

Subtract a positive offset β from the magnitude of the min-sum approximation; if the magnitude is less than β, set it to 0:

$$C_{j \to i}^{(n)} \leftarrow \operatorname{sgn}\left( C_{j \to i}^{(n)} \right) \cdot \max\left( \left| C_{j \to i}^{(n)} \right| - \beta,\; 0 \right), \quad \beta > 0$$
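
Both variants are one-line modifications of the min-sum message; a sketch follows, where the α and β defaults are placeholder values chosen for illustration, not taken from the document or from [4]:

```python
import numpy as np

def normalized_min_sum(C_ms, alpha=0.8):
    # Normalized min-sum: scale the min-sum message by 0 < alpha < 1.
    return alpha * np.asarray(C_ms, dtype=float)

def offset_min_sum(C_ms, beta=0.15):
    # Offset min-sum: shrink the message magnitude by beta > 0, clamping at zero.
    C_ms = np.asarray(C_ms, dtype=float)
    return np.sign(C_ms) * np.maximum(np.abs(C_ms) - beta, 0.0)
```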

Performance
min-sum < normalized min-sum < offset min-sum < layered BP = BP (ordered from worst to best error-rate performance)

[2] Gallager, Robert G. Low-Density Parity-Check Codes. Cambridge, MA: MIT Press, 1963.

[3] Hocevar, D. E. "A Reduced Complexity Decoder Architecture via Layered Decoding of LDPC Codes." IEEE Workshop on Signal Processing Systems (SIPS 2004), 2004. doi: 10.1109/SIPS.2004.1363033

[4] Chen, Jinghu, R. M. Tanner, C. Jones, and Yan Li. "Improved Min-Sum Decoding Algorithms for Irregular LDPC Codes." IEEE International Symposium on Information Theory (ISIT 2005), 2005. doi: 10.1109/ISIT.2005.1523374

