Network Information Theory: Multiple Access Channels, Broadcast Channel, Capacity Region

This document discusses several topics in network information theory: 1) multiple access channels, in which multiple senders transmit independent messages to a single receiver over a common channel; 2) the broadcast channel, in which a single sender transmits independent messages to multiple receivers over a common channel; 3) Gaussian multiple-user channels, including the Gaussian multiple access channel, whose output is the sum of the inputs plus noise, and the capacity region characterizing the achievable transmission rates.


Network Information Theory

Multiple access channels
Broadcast channel
Capacity region
Network Information Theory
Communication problems: interference, cooperation and feedback.
Many senders and receivers share a common channel, described by a channel transition matrix.
Two basic problems: distributed source coding (data compression) and distributed communication (capacity region).
Broadcast channel
Multiple access channels

Network Information Theory

Examples of large communication networks: computer networks, satellite networks and the phone system.
Other channels: the relay channel, the interference channel and the two-way channel.

Relay channel: there is one source and one destination, but one or more intermediate sender-receiver pairs act as relays to facilitate the communication between the source and the destination.
Gaussian Multiple User Channels
The additive white Gaussian noise channel with input power constraint P and noise variance N is modeled by

Y_i = X_i + Z_i,  i = 1, 2, 3, …

where the Z_i are i.i.d. Gaussian random variables with mean zero and variance N.
The signal X = (X_1, X_2, …, X_n) satisfies the power constraint

\frac{1}{n} \sum_{i=1}^{n} X_i^2 \le P

The capacity is

C = \max_{p(x): E[X^2] \le P} I(X; Y) = \frac{1}{2} \log\left(1 + \frac{P}{N}\right)  bits per transmission

For a single-user Gaussian channel Y = X + Z, any rate

R < \frac{1}{2} \log\left(1 + \frac{P}{N}\right)

is achievable.
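As a quick numerical check, here is a minimal Python sketch of this formula (the helper name gaussian_capacity is ours, not from the slides):

```python
import math

def gaussian_capacity(snr: float) -> float:
    """C = 0.5 * log2(1 + P/N), capacity in bits per transmission."""
    return 0.5 * math.log2(1.0 + snr)

# Example: P/N = 15 gives C = 0.5 * log2(16) = 2 bits per transmission.
print(gaussian_capacity(15.0))  # 2.0
```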
The Gaussian multiple access channel with m users:

Y = \sum_{i=1}^{m} X_i + Z,  with  C(x) = \frac{1}{2} \log(1 + x)

the capacity of a single-user channel with signal-to-noise ratio x.
The achievable rate region for the Gaussian MAC is given by

R_i < C\left(\frac{P}{N}\right)
R_i + R_j < C\left(\frac{2P}{N}\right)
R_i + R_j + R_k < C\left(\frac{3P}{N}\right)
⋮
\sum_{i=1}^{m} R_i < C\left(\frac{mP}{N}\right)

Here we have m codebooks, the i-th codebook having 2^{nR_i} codewords of power P. Each of the independent transmitters chooses an arbitrary codeword from its own codebook, and all of them send these vectors simultaneously; the Gaussian noise is added, Y = \sum_i X_i + Z. The receiver searches jointly for all m transmitted codewords. If (R_1, R_2, …, R_m) is in the capacity region, then the probability of error goes to 0 as n tends to infinity.
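A minimal sketch of the region test (helper names ours): a rate vector lies in the m-user Gaussian MAC region exactly when every nonempty subset of users satisfies its sum-rate constraint:

```python
import math
from itertools import combinations

def C(x: float) -> float:
    """C(x) = 0.5 * log2(1 + x), in bits per transmission."""
    return 0.5 * math.log2(1.0 + x)

def in_gaussian_mac_region(rates, P, N):
    """True if every nonempty subset S satisfies sum(R_i, i in S) < C(|S|*P/N)."""
    for k in range(1, len(rates) + 1):
        for S in combinations(range(len(rates)), k):
            if sum(rates[i] for i in S) >= C(k * P / N):
                return False
    return True

# Two users with P/N = 1: C(1) = 0.5, C(2) ~ 0.79.
print(in_gaussian_mac_region([0.25, 0.25], P=1.0, N=1.0))  # True
print(in_gaussian_mac_region([0.50, 0.50], P=1.0, N=1.0))  # False (R1 = C(1))
```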
The Gaussian broadcast channel
The model of the channel: Y_1 = X + Z_1, Y_2 = X + Z_2,
where Z_1 and Z_2 are arbitrarily correlated Gaussian random variables with variances N_1 and N_2; assume N_1 < N_2, so Y_1 is the better receiver. The sender wishes to send independent messages at rates R_1 and R_2 to receivers Y_1 and Y_2.
The capacity region:

R_1 < C\left(\frac{\alpha P}{N_1}\right)  and  R_2 < C\left(\frac{(1-\alpha) P}{\alpha P + N_2}\right),  0 \le \alpha \le 1

(the worse receiver Y_2 sees the power \alpha P devoted to the first message as additional noise).
The transmitter generates two codebooks, one of power \alpha P at rate R_1 and one of power (1-\alpha)P at rate R_2, and sends the sum of one codeword from each, X(i) + X(j), with

i \in \{1, 2, …, 2^{nR_1}\}  and  j \in \{1, 2, …, 2^{nR_2}\}

Each receiver decodes its own message; the better receiver Y_1 first decodes and subtracts the codeword intended for Y_2.
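A minimal sketch (our own code, assuming N1 < N2 as above) that traces the boundary of this region by sweeping the power split alpha:

```python
import math

def C(x: float) -> float:
    return 0.5 * math.log2(1.0 + x)

def gaussian_bc_boundary(P, N1, N2, steps=5):
    """Boundary of the Gaussian broadcast channel region (N1 < N2),
    sweeping the power split alpha between the two codebooks."""
    for k in range(steps + 1):
        a = k / steps                        # fraction of power for receiver 1
        R1 = C(a * P / N1)
        R2 = C((1 - a) * P / (a * P + N2))   # receiver 2 sees a*P as noise
        print(f"alpha={a:.2f}  R1={R1:.3f}  R2={R2:.3f}")

gaussian_bc_boundary(P=10.0, N1=1.0, N2=4.0)
```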
The Gaussian relay channel
For the relay channel, we have a sender X and an ultimate intended receiver Y. Also present is a relay, intended solely to help the receiver:

Y_1 = X + Z_1
Y = X + Z_1 + X_1 + Z_2

where Z_1 and Z_2 are independent zero-mean Gaussian random variables with variances N_1 and N_2, respectively. The relay observes Y_1 and sends

X_{1i} = f_i(Y_{11}, Y_{12}, …, Y_{1(i-1)})

X has power P and X_1 has power P_1. The capacity is

C = \max_{0 \le \alpha \le 1} \min\left\{ C\left(\frac{P + P_1 + 2\sqrt{(1-\alpha) P P_1}}{N_1 + N_2}\right),\ C\left(\frac{\alpha P}{N_1}\right) \right\}

If \frac{P_1}{N_2} \ge \frac{P}{N_1}, then C = C\left(\frac{P}{N_1}\right), achieved with \alpha = 1.
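A minimal numerical sketch of this formula (our own code): a grid search over alpha, illustrating that when P1/N2 >= P/N1 the maximum equals C(P/N1):

```python
import math

def C(x: float) -> float:
    return 0.5 * math.log2(1.0 + x)

def relay_capacity(P, P1, N1, N2, steps=10000):
    """Grid search over alpha in the relay capacity formula above."""
    best = 0.0
    for k in range(steps + 1):
        a = k / steps
        coherent = C((P + P1 + 2 * math.sqrt((1 - a) * P * P1)) / (N1 + N2))
        to_relay = C(a * P / N1)   # source-to-relay bound
        best = max(best, min(coherent, to_relay))
    return best

# Here P1/N2 >= P/N1, so the relay is strong enough that C = C(P/N1):
print(relay_capacity(P=10.0, P1=20.0, N1=1.0, N2=1.0))  # ~1.7297
print(C(10.0))                                          # 1.7297...
```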
The Gaussian relay channel
The channel appears to be noise-free after the relay, and the capacity C(P/N_1) from X to the relay can be achieved. Thus the rate C(P/(N_1 + N_2)) without the relay is increased by the presence of the relay to C(P/N_1). For large N_2, and for

\frac{P_1}{N_2} \ge \frac{P}{N_1}

we see that the increment in rate is from C(P/(N_1 + N_2)) \approx 0 to C(P/N_1).
The Gaussian interference channel
The interference channel has two senders and two receivers:

Y_1 = X_1 + \alpha X_2 + Z_1
Y_2 = X_2 + \alpha X_1 + Z_2

where Z_1 and Z_2 are independent N(0, N) random variables.
It is not quite a broadcast channel, nor is it a multiple access channel. This channel has not been solved in general, even in the Gaussian case. But remarkably, in the case of high interference, it can be shown that the capacity region of this channel is the same as if there were no interference whatsoever.
If the interference \alpha satisfies C(\alpha^2 P / (P + N)) > C(P / N), each receiver can perfectly decode the codeword of the other sender, subtract it from its received signal, and then decode its own message as if there were no interference.
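A one-line check of this high-interference condition (helper names ours):

```python
import math

def C(x: float) -> float:
    return 0.5 * math.log2(1.0 + x)

def high_interference(alpha, P, N):
    """True if C(alpha^2 * P / (P + N)) > C(P / N)."""
    return C(alpha**2 * P / (P + N)) > C(P / N)

# With P = N = 1 the condition reduces to alpha^2 / 2 > 1, i.e. alpha > sqrt(2):
print(high_interference(1.5, P=1.0, N=1.0))  # True
print(high_interference(1.0, P=1.0, N=1.0))  # False
```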
The Gaussian Two-way Channel
Very similar to the interference channel, with the additional provision that sender 1 is attached to receiver 2 and sender 2 is attached to receiver 1. The channel is p(y_1, y_2 | x_1, x_2):

Terminal 1 sends X_1 (message W_1), receives Y_1 and estimates Ŵ_2.
Terminal 2 sends X_2 (message W_2), receives Y_2 and estimates Ŵ_1.

Let P_1 and P_2 be the powers of transmitters 1 and 2, and N_1 and N_2 the noise variances of the two channels. Because each sender knows its own transmitted signal and can subtract it from its received signal, the rates

R_1 < C(P_1/N_1)  and  R_2 < C(P_2/N_2)

can be achieved.


Jointly Typical Sequences
Let (X_1, X_2, …, X_k) be a finite collection of discrete random variables with some fixed joint distribution p(x_1, x_2, …, x_k). Let S be a subset of these random variables and consider n independent copies of S. Thus

P(S = s) = \prod_{i=1}^{n} P(S_i = s_i)

For example, if S = (X_i, X_j), then

P(S = s) = P[(X_i, X_j) = (x_i, x_j)] = \prod_{k=1}^{n} p(x_{ik}, x_{jk})

By the law of large numbers, for any subset S of the random variables,

-\frac{1}{n} \log p(S_1, S_2, …, S_n) = -\frac{1}{n} \sum_{i=1}^{n} \log p(S_i) \to H(S)

Definition: The set A_\varepsilon^{(n)} of \varepsilon-typical n-sequences (x_1, x_2, …, x_k) is defined by

A_\varepsilon^{(n)}(X_1, X_2, …, X_k) = A_\varepsilon^{(n)} = \left\{ (x_1, x_2, …, x_k) : \left| -\frac{1}{n} \log p(s) - H(S) \right| < \varepsilon,\ \forall S \subseteq \{X_1, X_2, …, X_k\} \right\}
Jointly Typical Sequences
If S = (X_1, X_2), we have

A_\varepsilon^{(n)}(X_1, X_2) = \left\{ (x_1, x_2) : \left| -\frac{1}{n} \log p(x_1, x_2) - H(X_1, X_2) \right| < \varepsilon,\ \left| -\frac{1}{n} \log p(x_1) - H(X_1) \right| < \varepsilon,\ \left| -\frac{1}{n} \log p(x_2) - H(X_2) \right| < \varepsilon \right\}

Notation: a_n \doteq 2^{n(b \pm \varepsilon)} means \left| \frac{1}{n} \log a_n - b \right| < \varepsilon for n sufficiently large.

Theorem: For any \varepsilon > 0 and sufficiently large n:
1. P[A_\varepsilon^{(n)}(S)] \ge 1 - \varepsilon,\ \forall S \subseteq \{X_1, X_2, …, X_k\}
2. s \in A_\varepsilon^{(n)}(S) \Rightarrow p(s) \doteq 2^{-n(H(S) \pm \varepsilon)}
3. |A_\varepsilon^{(n)}(S)| \doteq 2^{n(H(S) \pm 2\varepsilon)}
4. Let S_1, S_2 \subseteq \{X_1, X_2, …, X_k\}. If (s_1, s_2) \in A_\varepsilon^{(n)}(S_1, S_2), then p(s_1 | s_2) \doteq 2^{-n(H(S_1|S_2) \pm 2\varepsilon)}
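A minimal simulation (our own code) illustrating property 1 for a single set S = {X} with X i.i.d. Bernoulli(0.3): the fraction of sequences whose empirical value of -(1/n) log2 p is within epsilon of H(X) is close to 1:

```python
import math
import random

random.seed(0)
p, n, eps, trials = 0.3, 1000, 0.05, 1000
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # H(X) ~ 0.881 bits

def empirical_rate(x, p):
    """-(1/n) * log2 p(x) for an i.i.d. Bernoulli(p) sequence x."""
    ones = sum(x)
    return -(ones * math.log2(p) + (len(x) - ones) * math.log2(1 - p)) / len(x)

typical = 0
for _ in range(trials):
    x = [1 if random.random() < p else 0 for _ in range(n)]
    if abs(empirical_rate(x, p) - H) < eps:
        typical += 1

print(f"H(X) = {H:.3f}, fraction of typical sequences ~ {typical / trials:.3f}")
```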
Jointly Typical Sequences

Theorem: Let S_1, S_2 be two subsets of X_1, X_2, …, X_k. For any \varepsilon > 0, define A_\varepsilon^{(n)}(S_1 | s_2) to be the set of s_1 sequences that are jointly \varepsilon-typical with a particular s_2 sequence. If s_2 \in A_\varepsilon^{(n)}(S_2), then for sufficiently large n we have

|A_\varepsilon^{(n)}(S_1 | s_2)| \le 2^{n(H(S_1|S_2) + 2\varepsilon)}  and  (1 - \varepsilon) 2^{n(H(S_1|S_2) - 2\varepsilon)} \le \sum_{s_2} p(s_2) |A_\varepsilon^{(n)}(S_1 | s_2)|

Let A_\varepsilon^{(n)} denote the typical set for the probability mass function p(s_1, s_2, s_3), and let

P'(S_1 = s_1, S_2 = s_2, S_3 = s_3) = \prod_{i=1}^{n} p(s_{1i} | s_{3i}) p(s_{2i} | s_{3i}) p(s_{3i})

Then

P'\{(S_1, S_2, S_3) \in A_\varepsilon^{(n)}\} \doteq 2^{-n(I(S_1; S_2 | S_3) \pm 6\varepsilon)}
The multiple access channel
Definition: A discrete memoryless multiple access channel consists of three alphabets, X_1, X_2 and Y, and a probability transition matrix p(y | x_1, x_2).
A (2^{nR_1}, 2^{nR_2}, n) code for the multiple access channel consists of two sets of integers W_1 = \{1, 2, …, 2^{nR_1}\} and W_2 = \{1, 2, …, 2^{nR_2}\}, called the message sets, two encoding functions

X_1 : W_1 \to X_1^n ;  X_2 : W_2 \to X_2^n

and a decoding function g : Y^n \to W_1 \times W_2.
The average probability of error:

P_e^{(n)} = \frac{1}{2^{n(R_1 + R_2)}} \sum_{(w_1, w_2) \in W_1 \times W_2} P\{ g(Y^n) \ne (w_1, w_2) \mid (w_1, w_2) \text{ sent} \}

W_1 \to X_1 and W_2 \to X_2 enter the channel p(y | x_1, x_2), which outputs Y, from which the decoder forms (\hat{W}_1, \hat{W}_2).
The multiple access channel
The capacity region of the multiple access channel is the closure of the set of achievable rate pairs (R_1, R_2).
Theorem: The capacity region of a multiple access channel (X_1 \times X_2, p(y | x_1, x_2), Y) is the closure of the convex hull of all (R_1, R_2) satisfying

R_1 < I(X_1; Y | X_2) ;  R_2 < I(X_2; Y | X_1) ;  R_1 + R_2 < I((X_1, X_2); Y)

for some product distribution p_1(x_1) p_2(x_2) on X_1 \times X_2.

[Figure: example of the capacity region for a multiple access channel, a region in the (R_1, R_2) plane with intercepts C_1 and C_2 on the axes and the points I(X_1; Y) and I(X_2; Y) marking the corners.]
Independent binary symmetric channels
X_1 \to Y_1 through a BSC with crossover probability p_1, and X_2 \to Y_2 through an independent BSC with crossover probability p_2. Since each sender has its own channel to the receiver, the capacity region is the rectangle

R_1 \le C_1 = 1 - H(p_1),  R_2 \le C_2 = 1 - H(p_2)
Binary multiplier channel: Y = X_1 X_2
Setting X_2 = 1, we can send at a rate of 1 bit per transmission from sender 1 to the receiver, so C_1 = 1; similarly, with X_1 = 1 we can achieve R_2 = C_2 = 1. Time-sharing between these two points gives the capacity region, bounded by R_1 + R_2 \le 1.
Binary erasure multiple access channel
Binary inputs X_1 = X_2 = \{0, 1\} and a ternary output Y = X_1 + X_2 (the output Y = 1 acts as an "erasure", since it can be produced by either (0, 1) or (1, 0)).
Capacity region: R_1 \le C_1 = 1, R_2 \le C_2 = 1, R_1 + R_2 \le 3/2, with corner points (1, 1/2) and (1/2, 1).
When sender 1 transmits at rate R_1 = 1, sender 2 effectively sees a binary erasure channel with erasure probability p = 1/2, so it can still achieve

C_{BEC} = 1 - p = 1/2 bit per transmission
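A minimal sketch (our own code) evaluating the mutual-information bounds for this channel with uniform inputs; since the channel is deterministic, H(Y | X_1, X_2) = 0 and each bound reduces to an output entropy:

```python
import math

def Hb(p: float) -> float:
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def erasure_mac_bounds(p1=0.5, p2=0.5):
    """For Y = X1 + X2 with independent Xi ~ Bernoulli(pi):
    I(X1;Y|X2) = H(Y|X2) = Hb(p1), I(X2;Y|X1) = Hb(p2), I(X1,X2;Y) = H(Y)."""
    py = [(1 - p1) * (1 - p2),            # Y = 0
          p1 * (1 - p2) + (1 - p1) * p2,  # Y = 1 (the "erasure" output)
          p1 * p2]                        # Y = 2
    HY = -sum(q * math.log2(q) for q in py if q > 0)
    return Hb(p1), Hb(p2), HY

print(erasure_mac_bounds())  # (1.0, 1.0, 1.5): sum rate bounded by 3/2
```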
The capacity region of the multiple access channel
The capacity region is the closure of the set of achievable (R_1, R_2): by the theorem above, the closure of the convex hull of all (R_1, R_2) satisfying R_1 < I(X_1; Y | X_2), R_2 < I(X_2; Y | X_1) and R_1 + R_2 < I((X_1, X_2); Y) for some product distribution p_1(x_1) p_2(x_2).

[Figure: achievable region of the multiple access channel for a fixed input distribution, a pentagon in the (R_1, R_2) plane with corner points A, B, C, D, marked by I(X_1; Y) and I(X_1; Y | X_2) on the R_1 axis and I(X_2; Y) and I(X_2; Y | X_1) on the R_2 axis.]

The point A corresponds to the maximum rate achievable from sender 1 to the receiver when sender 2 is not sending any information. This is

\max R_1 = \max_{p_1(x_1) p_2(x_2)} I(X_1; Y | X_2)

For any product distribution p_1(x_1) p_2(x_2),

I(X_1; Y | X_2) = \sum_{x_2} p_2(x_2) I(X_1; Y | X_2 = x_2) \le \max_{x_2} I(X_1; Y | X_2 = x_2)

so the maximum over p_2 is attained by concentrating p_2(x_2) on the best symbol x_2.
Multiple access channel: Gaussian MAC
Independent Gaussian input distributions achieve the capacity region:

I(X_1; Y | X_2) = h(Y | X_2) - h(Y | X_1, X_2)
= h(X_1 + X_2 + Z | X_2) - h(X_1 + X_2 + Z | X_1, X_2)
= h(X_1 + Z | X_2) - h(Z | X_1, X_2) = h(X_1 + Z) - h(Z)
= \frac{1}{2} \log(2\pi e)(P_1 + N) - \frac{1}{2} \log(2\pi e) N = \frac{1}{2} \log\left(1 + \frac{P_1}{N}\right)

Multiple access channel: Gaussian capacity region
With C(x) = \frac{1}{2} \log(1 + x):

R_1 < C\left(\frac{P_1}{N}\right) ;  R_2 < C\left(\frac{P_2}{N}\right) ;  R_1 + R_2 < C\left(\frac{P_1 + P_2}{N}\right)
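A quick numeric instance of this region (the power values and helper name are ours):

```python
import math

def C(x: float) -> float:
    return 0.5 * math.log2(1.0 + x)

P1, P2, N = 10.0, 5.0, 1.0
print("R1      <", C(P1 / N))          # 1.730
print("R2      <", C(P2 / N))          # 1.292
print("R1 + R2 <", C((P1 + P2) / N))   # 2.000
```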
Distributed Source Coding
How to encode a source X: a rate R > H(X) is sufficient.
Two sources (X, Y) ~ p(x, y), encoded together: a rate R > H(X, Y) is sufficient.
If the X-source and the Y-source are described separately, a rate R = R_X + R_Y > H(X) + H(Y) is sufficient.
The Slepian-Wolf theorem below shows that separate encoders can in fact achieve the joint rate R = H(X, Y).
Slepian-Wolf Theorem
Theorem: For the distributed source coding problem for the source (X, Y) drawn i.i.d. ~ p(x, y), the achievable rate region is given by

R_1 \ge H(X | Y),  R_2 \ge H(Y | X),  R_1 + R_2 \ge H(X, Y)
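A minimal sketch (our own code) computing the corner quantities of the Slepian-Wolf region for a given joint pmf:

```python
import math

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def slepian_wolf_region(pxy):
    """Corner quantities (H(X|Y), H(Y|X), H(X,Y)) for a joint pmf pxy[x][y]."""
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    Hxy = H(p for row in pxy for p in row)
    return Hxy - H(py), Hxy - H(px), Hxy

# Doubly symmetric binary source: X ~ Bernoulli(1/2), Y = X with probability 0.9.
pxy = [[0.45, 0.05],
       [0.05, 0.45]]
print(slepian_wolf_region(pxy))  # (~0.469, ~0.469, ~1.469)
```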
Broadcast Channel

• One-to-many channel
• Downlink of cellular or satellite channels
• TV, radio broadcasting, DMB, DVB
Broadcast capacity region
• General capacity region unknown
• Capacity region known for the degraded broadcast channel
• Physically degraded: X → Y_1 → Y_2
• Stochastically degraded: same conditional marginal distributions as a physically degraded channel
• Broadcast capacity depends only on the conditional marginal distributions, since the users do not cooperate
• Superposition coding is optimal
• Example: Gaussian broadcast channel

• Capacity region of the degraded broadcast channel: convex hull of the closure of all (R_1, R_2) such that

R_1 \le I(X; Y_1 | U),  R_2 \le I(U; Y_2)

for some joint distribution p(u) p(x | u) p(y_1, y_2 | x).
For the binary symmetric broadcast channel (crossover probabilities p_1 < p_2), the second bound evaluates to

R_2 \le I(U; Y_2) = H(Y_2) - H(Y_2 | U) = 1 - H(\beta * p_2),  where  \beta * p_2 = \beta(1 - p_2) + (1 - \beta) p_2

and the first to R_1 \le H(\beta * p_1) - H(p_1).
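A minimal sketch (our own code) tracing the superposition-coding boundary of the binary symmetric broadcast channel by sweeping the cloud parameter beta:

```python
import math

def Hb(p: float) -> float:
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conv(a: float, p: float) -> float:
    """Binary convolution a * p = a(1-p) + (1-a)p."""
    return a * (1 - p) + (1 - a) * p

def bsc_bc_boundary(p1, p2, steps=5):
    """Boundary of the binary symmetric BC region (p1 < p2), beta in [0, 1/2]."""
    for k in range(steps + 1):
        b = 0.5 * k / steps
        R1 = Hb(conv(b, p1)) - Hb(p1)  # I(X; Y1 | U)
        R2 = 1.0 - Hb(conv(b, p2))     # I(U; Y2)
        print(f"beta={b:.2f}  R1={R1:.3f}  R2={R2:.3f}")

bsc_bc_boundary(p1=0.1, p2=0.2)
```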
Gaussian Broadcast Channel
Gaussian Vector Broadcast Channel
