Green Codes: Energy-Efficient Short-Range Communication

This document presents a talk on modeling communication systems that account for both information theory and processing power. It motivates studying systems with fixed transmission rates and fixed message sizes, since processing power can be substantial for short-distance communication, unlike long-distance communication. The talk covers modeling decoding power via decoding complexity, complexity-performance tradeoffs through bounds for iterative decoding, and lower bounds on total power for fixed-rate systems and on minimum energy for fixed-message-size systems. It also discusses how tight these bounds are by relating them to the coding theory literature.


Green Codes:
Energy-efficient short-range communication

Pulkit Grover
Department of Electrical Engineering and Computer Sciences
University of California at Berkeley

Joint work with Prof. Anant Sahai


Motivation: Understand processing power consumed in communicating

Fixed rate (processor with heat sink):
• Moore's law: decreasing implementation complexity
  - significant power consumed in computations
• total power for communicating

Fixed message size (small sensors):
• small battery-operated wireless sensors
  - energy at a premium
  - flexibility in rate
• total energy per bit
Promise of Shannon Theory

Fixed rate: the Shannon waterfall. Fixed message size: Verdu, "On channel capacity per unit cost."

[Figure: log10(Pe) vs. transmit power for R = 1/3, comparing uncoded transmission to the Shannon waterfall.]

• Long distance communication
  - processing power ≪ transmit power -- Shannon theory works!
• Short distance communication
  - processing power can be substantial [Agarwal '98, Kravertz et al. '98, Goldsmith et al. '02, Cui et al. '05]
Information theory + processing power = ?

Fixed rate:
[Shannon waterfall figure, log10(Pe) vs. power] + processor with heat sink = ?

Fixed message size:
… + small sensors = ?
Talk Outline

• Motivation: power consumption
  - Fixed rate and fixed message size problems.
• Decoding power, modeled using decoding complexity.
• Complexity-performance tradeoffs.
  - Our bounds for iterative decoding.
• Fixed rate: lower bounds on total power.
• Fixed message size (Green codes): lower bounds on minimum energy.
• How tight are our bounds? Related coding-theoretic literature.
Modeling processing power through decoding complexity

Encoder → Channel → Decoder

• Power consumed in decoding: modeled using the decoding complexity.
  - Decoding complexity: number of operations performed at the decoder.
  - Constant amount of energy per operation.
• The common currency: power.
Understanding decoding complexity:
complexity-performance tradeoffs

• Complexity-performance tradeoffs:
  - required complexity to attain error probability Pe and rate R.
  - lower bounds: abstract away from details of code structure.
  - upper bounds: code constructions.
• e.g. block codes:

    Pe ≈ exp(−m·Er(R))

• e.g. convolutional codes:
  - error exponents with constraint length [Viterbi '67]
  - cut-off rate for sequential decoding [Jacobs and Berlekamp '67]
• Want a similar analysis for iterative decoding.
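For intuition, the block-code tradeoff above can be inverted to estimate the block length needed for a target error probability. A minimal sketch; the exponent value Er(R) used in the example is a hypothetical stand-in, not a number from the talk:

```python
import math

def required_blocklength(pe: float, er: float) -> int:
    """Smallest block length m with exp(-m * er) <= pe, inverting the
    block-code estimate Pe ~ exp(-m * Er(R)); er = Er(R) must be > 0."""
    return math.ceil(math.log(1.0 / pe) / er)

# A hypothetical exponent Er(R) = 0.1 at target Pe = 1e-9 gives
# m = ceil(ln(1e9) / 0.1) = 208 channel uses.
```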
Iterative decoding:
Decoding by passing messages

[Figure: decoder implementation graph -- output nodes Y1, ..., Y9 connected through helper nodes to information nodes B1, ..., B7.]

• Each node consumes γ Joules of energy per iteration.
• After l iterations, the energy consumed is γ × l × (# of nodes).
• Each node is connected to at most α other nodes -- an implementation constraint.

Suffices now to find l.
Lower bound on l: Key Idea

[Figure: the decoding neighborhood of an information node B_i, growing by a factor of at most α with each message-passing iteration.]

After l iterations, the size of the decoding neighborhood is < α^(l+1).

The channel needs to behave atypically only in the decoding neighborhood to cause an error.
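The counting behind this idea takes only a few lines. A toy calculation, with α the maximum node degree from the decoder model:

```python
def max_neighborhood_size(alpha: int, l: int) -> int:
    """Upper bound on how many nodes a single bit estimate can depend on
    after l iterations of message passing, when every node has at most
    alpha neighbors: the neighborhood grows at most alpha-fold per step."""
    return alpha ** (l + 1)

# With degree bound alpha = 4 and l = 5 iterations, a bit decision
# depends on fewer than 4**6 = 4096 channel outputs, so the channel
# only needs to misbehave on those symbols to cause an error.
```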
Lower bound on decoding complexity

Result [Sahai, Grover, submitted to IT Trans. '07]

In the limit of small Pe,

    l ≳ (1/log α) · log( log(1/Pe) / (C − R)² )

• C = channel capacity
• R = rate
• Pe = error probability
• α = maximum node degree
Lower bound on decoding complexity

    l ≳ (1/log α) · log( log(1/Pe) / (C − R)² )

• A general lower bound
  - applies to all (possible) codes with decoding based on passing messages.
  - applies regardless of the presence of cycles.
  - applies to all decoding algorithms based on passing messages.
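The bound is easy to evaluate numerically. A sketch, order-of-growth only: the talk's bound hides constants, so treat the output as a scaling, not an exact iteration count:

```python
import math

def min_iterations(pe: float, capacity: float, rate: float, alpha: int) -> float:
    """Lower bound on the number of message-passing iterations l needed
    to reach error probability pe at rate `rate` over a channel of
    capacity `capacity`, with maximum node degree alpha."""
    gap = capacity - rate          # gap to capacity, C - R
    return math.log(math.log(1.0 / pe) / gap ** 2) / math.log(alpha)

# The bound grows very slowly in 1/Pe (doubly logarithmic) but
# sharply as R approaches C.
```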
Fixed Rate:
Total power consumption

Encoder (k bits → m channel uses) → Decoder (m channel uses → k bits)

    Ptotal = PT + γ × l × (# of nodes) / m
           ≥ PT + γ × l
           ≥ PT + (γ/log α) · log( log(1/Pe) / (C(PT) − R)² )

Minimize Ptotal by optimizing over PT.

• l = number of iterations
• γ = energy consumed per node per iteration
• PT = transmit power
• m = block length
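To see the transmit-power/decoding-power tradeoff, the bound can be minimized numerically. A sketch for an AWGN channel, where I assume C(P) = 0.5·log2(1 + P/σ²); the values of γ, α, and σ² passed in are illustrative placeholders, not numbers from the talk:

```python
import math

def total_power_bound(pt, pe, rate, gamma, alpha, sigma2=1.0):
    """Lower bound on transmit + decoding power at transmit power pt,
    assuming AWGN capacity C(P) = 0.5 * log2(1 + P / sigma2)."""
    cap = 0.5 * math.log2(1.0 + pt / sigma2)
    if cap <= rate:
        return math.inf               # below the Shannon limit: unreachable
    gap = cap - rate
    return pt + (gamma / math.log(alpha)) * math.log(math.log(1.0 / pe) / gap ** 2)

def optimal_transmit_power(pe, rate, gamma, alpha, sigma2=1.0):
    """Crude grid search for the transmit power minimizing the bound."""
    grid = [0.01 * i for i in range(1, 2001)]
    return min(grid, key=lambda p: total_power_bound(p, pe, rate, gamma, alpha, sigma2))
```

The grid search makes the summary point concrete: the minimizing transmit power always lands strictly above the Shannon-limit power, since operating too close to capacity blows up the decoding term.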
Fixed Rate:
Total Power Curves

    Ptotal ≥ PT + (γ/log α) · log( log(1/Pe) / (C(PT) − R)² )

[Figure: log10(Pe) vs. power for R = 1/3 -- the Shannon waterfall alongside the total-power "waterslide" curve; the optimal transmit power lies strictly to the right of the waterfall.]
Fixed Rate:
Summary

• Total power increases unboundedly as Pe → 0.
• The optimal transmit power is strictly larger than the Shannon limit (a transmit-power vs. decoding-power tradeoff).
Fixed message size: Green Codes
Minimum energy per bit

Encoder (k bits → m channel uses) → Decoder (m channel uses → k bits)

    Etotal = m × Ptotal
           = m × PT + γ × l × (# of nodes)

    Eper-bit = Etotal / k
             = PT / R + γ × l × (# of nodes) / k
             ≥ PT / R + γ × l × max{k, m} / k
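Combining the pieces gives a computable sketch of the energy-per-bit bound. As before I assume an AWGN capacity C(P) = 0.5·log2(1 + P/σ²), and γ, α, σ² are illustrative placeholders rather than values from the talk:

```python
import math

def energy_per_bit_bound(pt, pe, rate, k, gamma, alpha, sigma2=1.0):
    """Lower bound on total energy per information bit for a k-bit
    message sent at rate R = k/m with transmit power pt."""
    cap = 0.5 * math.log2(1.0 + pt / sigma2)
    if cap <= rate:
        return math.inf                 # below the Shannon limit
    m = k / rate                        # block length from R = k/m
    l = math.log(math.log(1.0 / pe) / (cap - rate) ** 2) / math.log(alpha)
    return pt / rate + gamma * max(l, 0.0) * max(k, m) / k
```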
Fixed message size:
Minimum energy per bit curves

[Figure: log10(Pe) vs. energy per bit -- the Shannon limit, the black-box bounds, and our bounds.]

Black-box bounds: based on [Massaad, Medard and Zheng].
Fixed message size:
Optimal rate curves

[Figure: optimizing rate vs. log10(Pe); the optimal rate approaches 1 as Pe → 0.]
Fixed message size:
Summary

• Minimum energy per bit increases to infinity as Pe → 0.
  - Compare with a constant, ln(4), in classical information theory.
• The optimizing rate converges to 1.
  - Compare with zero in classical information theory.
Lower bounds on complexity:
how tight are they?

    l ≳ (1/log α) · log( log(1/Pe) / (C − R)² )

• Optimal behavior with respect to Pe
  - regular LDPCs achieve this! [Lentmaier et al.]
• What about behavior with the gap = C − R?
Complexity behavior with gap = C − R

• [Gallager, Burshtein et al., Sason-Urbanke] Lower bounds on density for LDPCs.
• [Pfister-Sason, Hsu-Anastasopoulos] Upper bounds.
• Khandekar-McEliece conjecture: l ≥ Ω( 1/(C − R) ).
• [Sason, Weichman] For LDPCs, IRAs, ARAs, if there is a non-zero fraction of degree-2 nodes and the graph is a tree, the conjecture holds.
  - but with degree-2 nodes, l ≈ log(1/Pe)
  - and it seems that degree-2 nodes are needed to approach capacity.
  - from an energy perspective, is it worth approaching capacity?
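To contrast the two scalings, here is a toy comparison of our logarithmic lower bound against the conjectured 1/(C − R) growth. The constants (Pe, α) are arbitrary placeholders; only the growth rates matter:

```python
import math

def our_bound(gap, pe=1e-10, alpha=4):
    """Our lower bound on iterations: grows only like log(1/gap^2)."""
    return math.log(math.log(1.0 / pe) / gap ** 2) / math.log(alpha)

def conjectured_scaling(gap):
    """Khandekar-McEliece conjectured growth, up to an unknown constant."""
    return 1.0 / gap

# As the gap to capacity shrinks, the conjectured 1/gap growth
# eventually dwarfs the logarithmic bound.
```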
Thank you

• Full paper on arXiv:
  - "The price of certainty: 'Waterslide curves' and the gap to capacity," Anant Sahai and Pulkit Grover.
