
Problem Set-2

1. Each time an input symbol is transmitted over channel 1 it is repeated over channel 2 (see Figure), so that the output may be considered to be a pair of symbols $(b_j, c_k)$. Furthermore, we assume that the repetition is performed independently of the results of the original transmission, so that

$$P(c_k \mid a_i, b_j) = P(c_k \mid a_i).$$

Note that this does not mean that $c_k$ and $b_j$ are statistically independent; in general

$$P(c_k \mid b_j) \neq P(c_k).$$

(a) Show that
$$I(A; B, C) = I(A; B) + I(A; C) - I(B; C)$$

(b) Generalize part (a) to the case of $n$ channels.
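The identity in (a) can be checked numerically by building a random joint distribution that satisfies the repetition assumption $P(c_k \mid a_i, b_j) = P(c_k \mid a_i)$. Below is a minimal Python sketch (the helper names and alphabet sizes are our own choices, not part of the problem set):

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_dist(*shape):
    """Random probability table, normalized over the last axis."""
    p = rng.random(shape)
    return p / p.sum(axis=-1, keepdims=True)

# Joint P(a, b, c) = P(a) P(b|a) P(c|a): the repetition over channel 2
# depends only on a, matching P(c_k | a_i, b_j) = P(c_k | a_i).
nA, nB, nC = 3, 3, 3
pA = rand_dist(nA)            # source distribution P(a)
pB_A = rand_dist(nA, nB)      # channel 1, P(b|a)
pC_A = rand_dist(nA, nC)      # channel 2, P(c|a)
pABC = pA[:, None, None] * pB_A[:, :, None] * pC_A[:, None, :]

def H(p):
    """Entropy in bits of a (joint) probability table."""
    p = p.ravel()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def I2(pXY):
    """Mutual information from a 2-D joint table."""
    return H(pXY.sum(1)) + H(pXY.sum(0)) - H(pXY)

# I(A; B,C) versus I(A;B) + I(A;C) - I(B;C): the two should agree.
I_A_BC = H(pABC.sum((1, 2))) + H(pABC.sum(0)) - H(pABC)
print(I_A_BC, I2(pABC.sum(2)) + I2(pABC.sum(1)) - I2(pABC.sum(0)))
```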

2. Two BSCs, each with probability of error $p$, are cascaded, as shown in the sketch. The inputs 0 and 1 of $A$ are chosen with equal probability. Find the following:

(a) H(A)
(b) H(B)
(c) H(C)
(d) H(A, B)
(e) H(B, C)
(f) H(A, C)
(g) H(A, B, C)
(h) I(A; B; C)
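For checking answers, here is a short Python sketch (our own construction, with an example crossover probability) that enumerates the joint distribution $P(a, b, c) = P(a)P(b \mid a)P(c \mid b)$ of the cascade and evaluates the requested quantities:

```python
import numpy as np
from itertools import product

p = 0.1                      # example BSC crossover probability
pA = np.array([0.5, 0.5])    # equiprobable inputs
bsc = np.array([[1 - p, p],
                [p, 1 - p]])  # P(out|in) for one BSC

# Joint P(a, b, c) for the cascade A -> B -> C.
pABC = np.zeros((2, 2, 2))
for a, b, c in product(range(2), repeat=3):
    pABC[a, b, c] = pA[a] * bsc[a, b] * bsc[b, c]

def H(p):
    """Entropy in bits of any (joint) probability table."""
    p = p.ravel()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

HA = H(pABC.sum((1, 2)))
HB = H(pABC.sum((0, 2)))
HC = H(pABC.sum((0, 1)))
HAB, HBC, HAC = H(pABC.sum(2)), H(pABC.sum(0)), H(pABC.sum(1))
HABC = H(pABC)
# Triple mutual information, one common convention:
# I(A;B;C) = I(A;B) - I(A;B|C).
IAB = HA + HB - HAB
IAB_given_C = HAC + HBC - HC - HABC
print(HA, HB, HC, HAB, HBC, HAC, HABC, IAB - IAB_given_C)
```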
 
3. Find the capacity of the channel with transition matrix
$$\begin{pmatrix} 1-p-q & q & p \\ p & q & 1-p-q \end{pmatrix}$$
 
4. Find the capacity of the channel with transition matrix
$$\begin{pmatrix} p & p & 0 & 0 \\ p & p & 0 & 0 \\ 0 & 0 & p & p \\ 0 & 0 & p & p \end{pmatrix}$$
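Both capacities in Problems 3 and 4 can be sanity-checked numerically with the Blahut-Arimoto algorithm. The sketch below is our own minimal implementation for an arbitrary row-stochastic transition matrix $W[x, y] = P(y \mid x)$, with example parameter values:

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Capacity in bits of a DMC with row-stochastic matrix W[x, y]."""
    nx = W.shape[0]
    r = np.full(nx, 1.0 / nx)                    # input dist., start uniform
    for _ in range(iters):
        q = r[:, None] * W                       # joint P(x, y)
        q = q / q.sum(axis=0, keepdims=True)     # posterior P(x | y)
        logq = np.where(W > 0, np.log(np.where(q > 0, q, 1.0)), 0.0)
        r = np.exp((W * logq).sum(axis=1))       # unnormalized update
        r = r / r.sum()
    py = r @ W                                   # output distribution
    mask = W > 0
    ratio = np.where(mask, W, 1.0) / py
    return (r[:, None] * W * np.log2(ratio))[mask].sum()

# Problem 3 with example values p = 0.1, q = 0.2:
p, q = 0.1, 0.2
W3 = np.array([[1 - p - q, q, p],
               [p, q, 1 - p - q]])
print(blahut_arimoto(W3))

# Problem 4 (rows must sum to 1, forcing p = 0.5):
W4 = 0.5 * np.array([[1, 1, 0, 0],
                     [1, 1, 0, 0],
                     [0, 0, 1, 1],
                     [0, 0, 1, 1]])
print(blahut_arimoto(W4))  # expect 1 bit: only the output pair is informative
```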
5. A telephone channel has a bandwidth of 3000 Hz and an SNR of 20 dB.

(a) Determine the channel capacity.

(b) If the SNR is increased to 25 dB, determine the increased capacity.
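The relevant formula is the Shannon-Hartley capacity $C = B \log_2(1 + \mathrm{SNR})$, with the SNR converted from dB to a linear ratio. A quick Python check (the helper name is ours):

```python
import math

def capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + SNR_linear) in bits/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

print(capacity_bps(3000, 20))  # part (a): about 19.97 kb/s
print(capacity_bps(3000, 25))  # part (b): about 24.93 kb/s
```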

6. Consider a channel consisting of two parallel AWGN channels with inputs $X_1, X_2$ and outputs

$$Y_1 = X_1 + Z_1$$
$$Y_2 = X_2 + Z_2$$

The noises $Z_1$ and $Z_2$ are independent and have variances $\sigma_1^2$ and $\sigma_2^2$ with $\sigma_1^2 < \sigma_2^2$. However, we are constrained to use the same symbol on both channels, i.e. $X_1 = X_2 = X$, where $X$ is constrained to have power $E[X^2] = P$. Suppose that at the receiver we combine the outputs to produce $Y = Y_1 + Y_2$. What is the capacity $C_1$ of the channel with input $X$ and output $Y$?
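One way to sanity-check an answer: with $X_1 = X_2 = X$, the combined output is $Y = 2X + (Z_1 + Z_2)$, again an AWGN channel, so the standard Gaussian capacity formula $C = \tfrac{1}{2}\log_2(1 + S/N)$ applies. A hedged sketch with example parameter values of our choosing:

```python
import math

def awgn_capacity(signal_power, noise_var):
    """C = 0.5 * log2(1 + S/N) bits per channel use."""
    return 0.5 * math.log2(1 + signal_power / noise_var)

# Y = 2X + (Z1 + Z2): received signal power E[(2X)^2] = 4P,
# noise variance var(Z1) + var(Z2) since Z1, Z2 are independent.
P, s1, s2 = 1.0, 0.5, 2.0   # example values
print(awgn_capacity(4 * P, s1 + s2))
```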

7. Consider the channels A, B and the cascaded channel AB shown in Figure. (i) Find $C_A$, the capacity of channel A. (ii) Find $C_B$, the capacity of channel B. (iii) Next, cascade the two channels and determine the combined capacity $C_{AB}$. (iv) Explain the relation between $C_A$, $C_B$ and $C_{AB}$.
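The figure is not reproduced here, but for any concrete matrices the cascade is just a matrix product: if $W_A[x, y]$ and $W_B[y, z]$ are the transition matrices, then $W_{AB} = W_A W_B$, and the `blahut_arimoto` helper sketched under Problems 3-4 applies to all three. The matrices below are illustrative placeholders, not the ones in the figure:

```python
import numpy as np
# blahut_arimoto as defined in the sketch under Problems 3-4.

WA = np.array([[0.9, 0.1],
               [0.2, 0.8]])   # placeholder for channel A
WB = np.array([[0.7, 0.3],
               [0.0, 1.0]])   # placeholder for channel B
WAB = WA @ WB                 # cascade: P(z|x) = sum_y P(y|x) P(z|y)

for name, W in [("CA", WA), ("CB", WB), ("CAB", WAB)]:
    print(name, blahut_arimoto(W))
# Data processing suggests CAB <= min(CA, CB).
```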

8. Consider the discrete memoryless channel with input alphabet $X = \{0, 1\}$, output $Y = X + Z$, and an independent on-off jammer $Z$ such that $P(Z = 0) = P(Z = a) = 0.5$. Find the capacity of this channel.
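Reading the jammer model as additive, $Y = X + Z$, the transition matrix can be built for any $a$ and fed to the `blahut_arimoto` sketch from Problems 3-4. Note how the answer changes when $a$ collides with the signal alphabet (e.g. $a = 1$) versus when it does not:

```python
import numpy as np
# blahut_arimoto as defined in the sketch under Problems 3-4.

def jammer_matrix(a):
    """Row-stochastic P(y|x) for Y = X + Z, Z uniform on {0, a}."""
    ys = sorted({x + z for x in (0, 1) for z in (0, a)})
    W = np.zeros((2, len(ys)))
    for x in (0, 1):
        for z in (0, a):
            W[x, ys.index(x + z)] += 0.5
    return W

for a in (0.5, 1):
    print(a, blahut_arimoto(jammer_matrix(a)))
# a = 0.5: the output sets for x = 0 and x = 1 never collide, so C = 1 bit;
# a = 1: the output y = 1 is ambiguous and the capacity drops below 1 bit.
```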

9. Suppose that $(X, Y, Z)$ are jointly Gaussian and that $X \to Y \to Z$ forms a Markov chain. Let $X$ and $Y$ have correlation coefficient $\rho_1$, and let $Y$ and $Z$ have correlation coefficient $\rho_2$. Find $I(X; Z)$.
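A simulation can check a closed-form answer: generate the chain with the given correlations, estimate the end-to-end correlation of $X$ and $Z$ empirically, and plug it into the Gaussian mutual-information formula $I = -\tfrac{1}{2}\log_2(1 - \rho^2)$. This sketch assumes unit-variance variables (our choice) and example values of $\rho_1, \rho_2$:

```python
import numpy as np

rng = np.random.default_rng(1)
rho1, rho2, n = 0.8, 0.6, 1_000_000

# Unit-variance Markov chain X -> Y -> Z with the given correlations.
X = rng.standard_normal(n)
Y = rho1 * X + np.sqrt(1 - rho1**2) * rng.standard_normal(n)
Z = rho2 * Y + np.sqrt(1 - rho2**2) * rng.standard_normal(n)

rho_xz = np.corrcoef(X, Z)[0, 1]         # empirical end-to-end correlation
I_xz = -0.5 * np.log2(1 - rho_xz**2)     # Gaussian MI from a correlation
print(rho_xz, I_xz)                      # compare with the analytic answer
```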

10. Let the input random variable $X$ to a channel be uniformly distributed over the interval $-\frac{1}{2} \le x \le \frac{1}{2}$. Let the output of the channel be $Y = X + Z$, where the noise random variable $Z$ is uniformly distributed over the interval $-\frac{a}{2} \le z \le \frac{a}{2}$.

(a) Find $I(X; Y)$ as a function of $a$.


(b) For $a = 1$, find the capacity of the channel when the input $X$ is peak-limited; that is, the range of $X$ is limited to $-\frac{1}{2} \le x \le \frac{1}{2}$. What probability distribution on $X$ maximizes the mutual information $I(X; Y)$?
(c) Find the capacity of the channel for all values of $a$, again assuming that the range of $X$ is limited to $-\frac{1}{2} \le x \le \frac{1}{2}$.
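Part (a) reduces to $I(X; Y) = h(Y) - h(Y \mid X) = h(Y) - h(Z)$, and $h(Y)$ can be estimated numerically by convolving the two uniform densities on a grid. A rough sketch (the grid construction is our own):

```python
import numpy as np

def mi_uniform_uniform(a, n=4001, span=3.0):
    """I(X;Y) in bits for X ~ U[-1/2, 1/2], Y = X + Z, Z ~ U[-a/2, a/2]."""
    t = np.linspace(-span, span, n)
    dt = t[1] - t[0]
    fx = (np.abs(t) <= 0.5).astype(float)         # density of X (height 1)
    fz = (np.abs(t) <= a / 2).astype(float) / a   # density of Z (height 1/a)
    fy = np.convolve(fx, fz, mode="same") * dt    # density of Y = X + Z
    fy = fy / (fy.sum() * dt)                     # correct small grid error
    pos = fy > 0
    h_y = -(fy[pos] * np.log2(fy[pos])).sum() * dt  # differential h(Y)
    return h_y - np.log2(a)                       # I = h(Y) - h(Z), h(Z) = log2 a

for a in (0.5, 1.0, 2.0):
    print(a, mi_uniform_uniform(a))
```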
