
Tutorial 11

(Corresponding to Lecture 51-55)

September 21, 2017

1. Calculate the mutual information of a binary symmetric channel (BSC) with probability of error p.
Solution
Let us assume that the source X generates a symbol x1 with probability α and x2 with probability 1 − α. We know the mutual information between the input X and the output Y of a channel is given as

I(X; Y) = H(Y) − H(Y|X) (1)

where the conditional entropy H(Y|X) is

H(Y|X) = −Σ_y Σ_x p(x, y) log2 p(y|x)
= −p(x1, y1) log2 p(y1|x1) − p(x1, y2) log2 p(y2|x1) − p(x2, y1) log2 p(y1|x2) − p(x2, y2) log2 p(y2|x2)
= −α(1 − p) log2(1 − p) − αp log2 p − (1 − α)p log2 p − (1 − α)(1 − p) log2(1 − p)
= −(1 − p) log2(1 − p)[α + (1 − α)] − p log2 p[α + (1 − α)]
= −(1 − p) log2(1 − p) − p log2 p
= H(p) (2)

So the mutual information I(X; Y) is given as

I(X; Y) = H(Y) − H(p) (3)
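As a numerical sanity check (a sketch, not part of the original solution; the probability values below are arbitrary), the identity I(X; Y) = H(Y) − H(p) in (3) can be verified by computing I(X; Y) directly from the joint distribution of the BSC:

```python
import math

def H(q):
    """Binary entropy in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def mutual_information_bsc(alpha, p):
    """I(X;Y) for a BSC, computed directly as sum p(x,y) log2[p(x,y)/(p(x)p(y))]."""
    # Joint probabilities p(x, y) for inputs x1, x2 and outputs y1, y2.
    joint = {
        ("x1", "y1"): alpha * (1 - p),
        ("x1", "y2"): alpha * p,
        ("x2", "y1"): (1 - alpha) * p,
        ("x2", "y2"): (1 - alpha) * (1 - p),
    }
    px = {"x1": alpha, "x2": 1 - alpha}
    py = {"y1": joint[("x1", "y1")] + joint[("x2", "y1")],
          "y2": joint[("x1", "y2")] + joint[("x2", "y2")]}
    return sum(pxy * math.log2(pxy / (px[x] * py[y]))
               for (x, y), pxy in joint.items() if pxy > 0)

alpha, p = 0.3, 0.1
py1 = alpha * (1 - p) + (1 - alpha) * p   # p(y = y1), so H(Y) = H(py1)
assert abs(mutual_information_bsc(alpha, p) - (H(py1) - H(p))) < 1e-12
```

The direct computation and the closed form H(Y) − H(p) agree for any choice of α and p.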

2. Calculate the capacity of the BSC described in problem 1.


Solution
Capacity is the maximum possible value of mutual information. It has already been shown that the mutual information of a BSC is given as I(X; Y) = H(Y) − H(p). Clearly, to maximize I(X; Y) we need to maximize H(Y); we cannot do anything about H(p), since it depends only on p, and p is a property of the channel that is not under the designer's control. Let us assume p(y = y1) = α and p(y = y2) = 1 − α. Then

H(Y) = −α log2 α − (1 − α) log2(1 − α) = H(α) (4)

which is maximized when

dH(α)/dα = 0
or, −d[α log2 α + (1 − α) log2(1 − α)]/dα = 0
or, −(log2 e) d[α ln α + (1 − α) ln(1 − α)]/dα = 0
or, −(log2 e)[ln α − ln(1 − α)] = 0
or, ln α = ln(1 − α)
or, α = 1/2

So the capacity is given as

C = H(1/2) − H(p) = 1 − H(p) bits per channel use (5)
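As a check on the maximization (a sketch, not part of the original solution), the capacity 1 − H(p) can be recovered by brute-force search over the input probability α:

```python
import math

def H(q):
    """Binary entropy in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def bsc_capacity_numeric(p, steps=100000):
    """Maximize I(X;Y) = H(Y) - H(p) over the input probability alpha on a grid."""
    best = 0.0
    for i in range(steps + 1):
        alpha = i / steps
        py1 = alpha * (1 - p) + (1 - alpha) * p   # p(y = y1)
        best = max(best, H(py1) - H(p))
    return best

p = 0.11
assert abs(bsc_capacity_numeric(p) - (1 - H(p))) < 1e-6
```

The maximum is attained at α = 1/2, where H(Y) = H(1/2) = 1 bit, as derived above.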
3. Show that the capacity of an ideal additive white Gaussian noise (AWGN) channel with
infinite bandwidth is given by P/(N0 ln 2), where P is the received signal power and N0/2 is the noise power spectral density.
Solution:
Let us assume bandwidth to be W Hz (W → ∞).
So capacity is given by
C∞ = lim_{W→∞} W log2(1 + P/(N0 W))
= (log2 e) lim_{W→∞} W ln(1 + P/(N0 W))
= (log2 e) lim_{W→∞} W × P/(N0 W)
= (log2 e) P/N0 = P/(N0 ln 2) (6)

The third step follows because ln(1 + x) → x as x → 0, and here x = P/(N0 W) → 0 as W → ∞.
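The limit in (6) can be illustrated numerically (a sketch, not part of the original solution; the values of P and N0 below are arbitrary): as W grows, the bandlimited capacity approaches P/(N0 ln 2) from below.

```python
import math

P, N0 = 2.0, 1e-3   # illustrative signal power (W) and noise density (W/Hz)

def awgn_capacity(W):
    """Shannon capacity W*log2(1 + P/(N0*W)) of a bandlimited AWGN channel, bits/s."""
    return W * math.log2(1 + P / (N0 * W))

c_inf = P / (N0 * math.log(2))   # the infinite-bandwidth limit P/(N0 ln 2)

# Capacity grows with W and approaches c_inf from below.
assert awgn_capacity(1e6) < awgn_capacity(1e8) < c_inf
assert abs(awgn_capacity(1e9) - c_inf) / c_inf < 1e-3
```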
4. A communication channel of bandwidth 75 kHz is required to transmit binary data at a symbol rate of 0.1 Mbaud using raised-cosine pulses. Determine the roll-off factor α.
Solution:

Ts = 1/(0.1 × 10^6) = 10^−5 s,  fB = 75 kHz (7)

Now, we know 1 + α = 2 fB Ts = 2 × (75 × 10^3) × 10^−5 = 1.5.
So, α = 0.5
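The arithmetic above can be checked directly (a sketch; variable names are illustrative):

```python
# Raised-cosine pulses: occupied bandwidth fB = (1 + alpha) / (2 * Ts)
Rs = 0.1e6          # symbol rate, 0.1 Mbaud
Ts = 1 / Rs         # symbol duration: 1e-5 s
fB = 75e3           # available channel bandwidth, Hz

alpha = 2 * fB * Ts - 1   # solve 1 + alpha = 2 * fB * Ts for the roll-off factor

assert abs(Ts - 1e-5) < 1e-18
assert abs(alpha - 0.5) < 1e-12
```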

5. The bit duration of a BPSK signal is 0.647 µs. Find the minimum transmission bandwidth for roll-off factors α = 0 and α = 1.
Solution:
Nyquist bandwidth W0 = 1/(2 Tb) = 1/(2 × 0.647 × 10^−6) ≈ 772 kHz.
Transmission bandwidth for α = 0 is W = (1 + α) W0 = 772 kHz,
and for α = 1 it is W = (1 + α) W0 = 1.544 MHz.
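Checking the arithmetic (a sketch; exact evaluation gives about 772.8 kHz, which the solution rounds to 772 kHz):

```python
Tb = 0.647e-6                  # bit duration, s
W0 = 1 / (2 * Tb)              # Nyquist bandwidth 1/(2*Tb), Hz (~772.8 kHz)
W_alpha0 = (1 + 0) * W0        # roll-off 0: minimum bandwidth
W_alpha1 = (1 + 1) * W0        # roll-off 1: bandwidth doubles

assert abs(W_alpha0 - 772e3) < 1e3     # matches the rounded 772 kHz answer
assert abs(W_alpha1 - 1.544e6) < 2e3   # matches the rounded 1.544 MHz answer
```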
