Network Information Theory: Multiple Access Channels Broadcasting Channel Capacity Region
Broadcasting channel
Relay channel: there is one source and one destination, but one or more intermediate sender-receiver pairs act as relays to facilitate the communication between the source and the destination.
Gaussian Multiple User Channels
The single-user channel with input power P and additive white Gaussian noise Z ~ N(0, N) is modeled by
Y = X + Z
and has capacity
C = (1/2) log(1 + P/N).
Throughout, write C(x) = (1/2) log(1 + x) for the Gaussian capacity function.
The Gaussian multiple access channel with m users is
Y = Σ_{i=1}^{m} X_i + Z,
where C(P/N) is the capacity of a single user.
The Gaussian relay channel: without the relay, the capacity of the direct link is C(P/(N1 + N2)); with the relay, it can be increased up to C(P/N1).
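These closed-form capacities are easy to evaluate numerically. A minimal sketch (the power and noise values below are assumed for illustration, not taken from the text):

```python
import math

def C(x):
    """Gaussian capacity function C(x) = 1/2 * log2(1 + x), in bits per transmission."""
    return 0.5 * math.log2(1 + x)

# Assumed illustrative values: transmit power P, noise variances N1 and N2.
P, N1, N2 = 10.0, 1.0, 4.0

print(C(P / (N1 + N2)))   # relay channel: rate of the direct link alone
print(C(P / N1))          # relay channel: rate achievable with a strong relay
```

With these numbers the relay lifts the achievable rate from 0.5·log2(3) ≈ 0.79 to 0.5·log2(11) ≈ 1.73 bits per transmission.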
The Gaussian interference channel
The interference channel has two senders and two receivers:
Y1 = X1 + αX2 + Z1
Y2 = X2 + αX1 + Z2
where Z1 and Z2 are independent N(0, N) random variables.
It is not quite a broadcast channel, nor is it a multiple access channel.
This channel has not been solved in general, even in the Gaussian case. But remarkably, in the case of high interference, it can be shown that the capacity region of this channel is the same as if there were no interference whatsoever.
If the interference α satisfies C(α²P/(P + N)) > C(P/N), the first receiver can perfectly decode the codeword of the second transmitter, subtract it, and then decode its own message as if there were no interference.
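The very-strong-interference condition above can be checked numerically; in this sketch the values of α, P, and N are assumed for illustration.

```python
import math

def C(x):
    return 0.5 * math.log2(1 + x)

def very_strong_interference(alpha, P, N):
    """True if C(alpha^2 * P / (P + N)) >= C(P / N), i.e. each receiver can
    decode the interfering codeword (treating its own signal as noise)
    and subtract it before decoding its own message."""
    return C(alpha**2 * P / (P + N)) >= C(P / N)

P, N = 1.0, 1.0
print(very_strong_interference(0.5, P, N))  # weak interference: False
print(very_strong_interference(2.0, P, N))  # very strong interference: True
```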
The Gaussian Two-way Channel
[Figure: terminal 1 encodes W1 into X1 and decodes Ŵ2 from Y1; terminal 2 encodes W2 into X2 and decodes Ŵ1 from Y2; the terminals are connected by the channel p(y1, y2 | x1, x2).]
For an i.i.d. source, the law of large numbers gives
-(1/n) log p(S1, S2, ..., Sn) = -(1/n) Σ_{i=1}^{n} log p(Si) → H(S).

Definition: The set A_ε^(n) of ε-typical n-sequences (x1, x2, ..., xk) is defined by
A_ε^(n)(X1, X2, ..., Xk) = A_ε^(n)
= { (x1, x2, ..., xk) : | -(1/n) log p(s) - H(S) | < ε, for all S ⊆ {X1, X2, ..., Xk} },
where s denotes the subsequence of (x1, x2, ..., xk) corresponding to the subset S.
Jointly Typical Sequences
If S = (X1, X2), we have
A_ε^(n)(X1, X2) = { (x1, x2) : | -(1/n) log p(x1, x2) - H(X1, X2) | < ε,
  | -(1/n) log p(x1) - H(X1) | < ε,  | -(1/n) log p(x2) - H(X2) | < ε }
Notation: a_n ≐ 2^{n(b ± ε)} means | (1/n) log a_n - b | < ε for n sufficiently large.
Theorem: For any ε > 0 and sufficiently large n:
1. P(A_ε^(n)(S)) ≥ 1 - ε, for all S ⊆ {X1, X2, ..., Xk}
2. s ∈ A_ε^(n)(S) ⇒ p(s) ≐ 2^{-n(H(S) ± ε)}
3. |A_ε^(n)(S)| ≐ 2^{n(H(S) ± 2ε)}
4. Let S1, S2 ⊆ {X1, X2, ..., Xk}. If (s1, s2) ∈ A_ε^(n)(S1, S2), then
   p(s1 | s2) ≐ 2^{-n(H(S1|S2) ± 2ε)}
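Property 1 can be illustrated empirically. The sketch below (the source and parameters are arbitrary illustrative choices) draws i.i.d. Bernoulli sequences and estimates the probability that -(1/n) log p(s) falls within ε of H(S):

```python
import math
import random

def H(q):
    """Entropy of a Bernoulli(q) source, in bits."""
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# Empirical check of the AEP: for an i.i.d. Bernoulli(q) source,
# -(1/n) log p(S1..Sn) concentrates around H(q), so with high
# probability a drawn sequence is eps-typical.
random.seed(0)
q, n, eps, trials = 0.3, 2000, 0.05, 500
hits = 0
for _ in range(trials):
    s = [1 if random.random() < q else 0 for _ in range(n)]
    k = sum(s)  # number of ones
    neg_log_p = -(k * math.log2(q) + (n - k) * math.log2(1 - q)) / n
    if abs(neg_log_p - H(q)) < eps:
        hits += 1
print(hits / trials)  # fraction of eps-typical sequences, close to 1
```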
Jointly Typical Sequences
Theorem: Let S1, S2 be two subsets of X1, X2, ..., Xk. For any ε > 0, define A_ε^(n)(S1 | s2) to be the set of s1 sequences that are jointly ε-typical with a particular s2 sequence. If s2 ∈ A_ε^(n)(S2), then for sufficiently large n, we have
|A_ε^(n)(S1 | s2)| ≤ 2^{n(H(S1|S2) + 2ε)}
and
(1 - ε) 2^{n(H(S1|S2) - 2ε)} ≤ Σ_{s2} p(s2) |A_ε^(n)(S1 | s2)|.
A code for the multiple access channel consists of two encoding functions,
X1 : W1 → X1^n,   X2 : W2 → X2^n,
and a decoding function
g : Y^n → W1 × W2.
The average probability of error is
P_e^(n) = (1 / 2^{n(R1 + R2)}) Σ_{(w1, w2) ∈ W1 × W2} P{ g(Y^n) ≠ (w1, w2) | (w1, w2) sent }.

[Figure: sender 1 maps W1 to X1 and sender 2 maps W2 to X2; the channel p(y | x1, x2) produces Y, from which the receiver decodes (Ŵ1, Ŵ2).]
The multiple access channel
The capacity region of the multiple access channel is the closure of the set
of achievable (R1, R2).
Theorem: The capacity region of a multiple access channel (X1 × X2, p(y | x1, x2), Y) is the closure of the convex hull of all (R1, R2) satisfying
R1 < I(X1; Y | X2)
R2 < I(X2; Y | X1)
R1 + R2 < I(X1, X2; Y)
for some product distribution p1(x1) p2(x2).

[Figure: the pentagonal capacity region in the (R1, R2) plane, with C1 = I(X1; Y | X2), C2 = I(X2; Y | X1), and corner points involving I(X1; Y) and I(X2; Y).]
Independent binary symmetric channels
Sender 1 reaches the receiver through a BSC with crossover probability p1, and sender 2 through a separate BSC with crossover probability p2; the two channels do not interact.

[Figure: the two BSC transition diagrams and the rectangular capacity region.]

Since the channels are independent, the capacity region is the rectangle
R1 ≤ C1 = 1 - H(p1),  R2 ≤ C2 = 1 - H(p2).
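The corner points C1 and C2 follow directly from the binary entropy function; a small sketch with assumed crossover probabilities:

```python
import math

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Assumed illustrative crossover probabilities.
p1, p2 = 0.1, 0.2
C1, C2 = 1 - Hb(p1), 1 - Hb(p2)
print(C1, C2)  # the region is the rectangle R1 <= C1, R2 <= C2
```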
Binary multiplier channel: Y = X1·X2.
Setting X2 = 1, we can send at a rate of 1 bit per transmission from sender 1 to the receiver, so R1 = 1 is achievable. Similarly, setting X1 = 1, we can achieve R2 = 1. Since Y is binary, H(Y) ≤ 1 bounds the sum rate, and time sharing between the two strategies achieves the boundary R1 + R2 = 1.

[Figure: triangular capacity region with C1 = C2 = 1 and boundary line R1 + R2 = 1.]
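The sum-rate bound can be verified by brute force: since Y = X1·X2 is a deterministic binary output, I(X1, X2; Y) = H(Y), and maximizing H(Y) over independent input distributions never exceeds 1 bit. A sketch (the grid resolution is an arbitrary choice):

```python
import math

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Y = X1 * X2 is deterministic, so I(X1,X2;Y) = H(Y) = Hb(q1*q2)
# for independent inputs X1 ~ Bern(q1), X2 ~ Bern(q2).
best = max(Hb(q1 / 100 * q2 / 100)
           for q1 in range(101) for q2 in range(101))
print(best)  # the sum rate never exceeds 1 bit
```

The maximum of 1 bit is attained, e.g., at q1 = 1 and q2 = 1/2, matching the corner argument above.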
Binary erasure multiple access channel: Y = X1 + X2, where the sum is ordinary (real) addition, so Y ∈ {0, 1, 2}.

[Figure: channel diagram and capacity region with C1 = C2 = 1 and sum-rate boundary R1 + R2 = 3/2.]

When sender 1 transmits at its full rate R1 = 1, the channel seen by sender 2 behaves like a binary erasure channel: the ambiguous output Y = 1 acts as an erasure occurring with probability p = 1/2, so sender 2 can still achieve
R2 = C_BEC = 1 - p = 1/2 bit per transmission.
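The 3/2-bit sum rate can be confirmed directly: with independent uniform inputs, the deterministic output Y = X1 + X2 has H(Y) = 3/2 bits, and I(X1, X2; Y) = H(Y). A minimal sketch:

```python
import math

# Binary erasure MAC: Y = X1 + X2 (arithmetic sum). With independent
# uniform inputs, Y takes values 0, 1, 2 with probabilities 1/4, 1/2, 1/4,
# and since the channel is deterministic the sum rate equals H(Y).
p = {0: 0.25, 1: 0.5, 2: 0.25}
sum_rate = -sum(q * math.log2(q) for q in p.values())
print(sum_rate)  # R1 + R2 bound; achieved e.g. by R1 = 1, R2 = 1/2
```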
The capacity region of the multiple access channel
The closure of the set of achievable (R1, R2).
Theorem: The capacity region of a multiple access channel (X1 × X2, p(y | x1, x2), Y) is the closure of the convex hull of all (R1, R2) satisfying
R1 < I(X1; Y | X2),  R2 < I(X2; Y | X1),  R1 + R2 < I(X1, X2; Y).
For the Gaussian MAC, Y = X1 + X2 + Z with Z ~ N(0, N), and the first bound evaluates to
I(X1; Y | X2) = h(Y | X2) - h(Y | X1, X2)
  = h(X1 + X2 + Z | X2) - h(X1 + X2 + Z | X1, X2)
  = h(X1 + Z | X2) - h(Z | X1, X2)
  = h(X1 + Z) - h(Z)
  ≤ (1/2) log 2πe(P1 + N) - (1/2) log 2πe N
  = (1/2) log(1 + P1/N),
since h(X1 + Z) ≤ (1/2) log 2πe(P1 + N) by the Gaussian maximum-entropy bound, while h(Z) = (1/2) log 2πe N.
Gaussian multiple access channel – capacity region
With C(x) = (1/2) log(1 + x):
R1 ≤ C(P1/N),  R2 ≤ C(P2/N),  R1 + R2 ≤ C((P1 + P2)/N)
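Membership in the Gaussian MAC region is a matter of checking three inequalities; the sketch below uses assumed powers and noise variance.

```python
import math

def C(x):
    return 0.5 * math.log2(1 + x)

def in_gaussian_mac_region(R1, R2, P1, P2, N):
    """Check whether (R1, R2) lies in the Gaussian MAC capacity region."""
    return (R1 <= C(P1 / N) and R2 <= C(P2 / N)
            and R1 + R2 <= C((P1 + P2) / N))

# Assumed illustrative powers and noise variance.
P1, P2, N = 3.0, 3.0, 1.0
print(C(P1 / N), C(P2 / N), C((P1 + P2) / N))   # individual and sum-rate bounds
print(in_gaussian_mac_region(1.0, 0.4, P1, P2, N))
```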
Distributed Source Coding
How to encode a source X: a rate R > H(X) is sufficient.
Two sources (X, Y) ~ p(x, y): a rate R > H(X, Y) is sufficient if the two sources are encoded together, so R = H(X, Y) is the joint limit. What rates are needed if X and Y must be encoded separately?
Slepian-Wolf Theorem
Theorem: For the distributed source coding problem for the source (X, Y) drawn i.i.d. ~ p(x, y), the achievable rate region is given by
R1 ≥ H(X | Y),  R2 ≥ H(Y | X),  R1 + R2 ≥ H(X, Y).
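The Slepian-Wolf corner rates can be computed from any joint pmf; the sketch below uses an assumed toy distribution for a doubly symmetric binary source.

```python
import math

def H(probs):
    """Entropy in bits of a finite distribution given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed joint pmf p(x, y) for a toy binary source pair.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: sum(v for (a, _), v in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in pxy.items() if b == y) for y in (0, 1)}

Hxy = H(pxy.values())
Hx_given_y = Hxy - H(py.values())   # H(X|Y) = H(X,Y) - H(Y)
Hy_given_x = Hxy - H(px.values())   # H(Y|X) = H(X,Y) - H(X)
print(Hx_given_y, Hy_given_x, Hxy)  # Slepian-Wolf bounds on R1, R2, R1+R2
```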
Broadcast Channel
• One-to-many channel
• Downlink of cellular or satellite channels
• TV, radio broadcasting, DMB, DVB
Broadcast capacity region
• General capacity region unknown
• Capacity region known for degraded broadcast channel
• Physically degraded: X → Y1 → Y2 forms a Markov chain
• Stochastically degraded
• Same conditional marginal distributions as a physically
degraded channel
• Broadcast capacity depends only on conditional
marginal distributions since users do not cooperate
• Superposition coding is optimal
• Example: Gaussian broadcast channel
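For the degraded Gaussian broadcast channel with noise variances N1 < N2, superposition coding splits the power P as αP for the good user and (1 − α)P for the bad user, giving the standard rate pair R1 = C(αP/N1), R2 = C((1 − α)P/(αP + N2)). A sketch with assumed values:

```python
import math

def C(x):
    return 0.5 * math.log2(1 + x)

def gaussian_bc_rates(alpha, P, N1, N2):
    """Superposition-coding rate pair for a degraded Gaussian BC with N1 < N2.
    The good receiver (noise N1) decodes and subtracts the cloud center;
    the bad receiver (noise N2) treats the fraction alpha*P as noise."""
    R1 = C(alpha * P / N1)
    R2 = C((1 - alpha) * P / (alpha * P + N2))
    return R1, R2

# Assumed illustrative values: total power P split by alpha.
print(gaussian_bc_rates(0.5, 10.0, 1.0, 4.0))
```

Sweeping α from 0 to 1 traces out the boundary of the capacity region.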