ECE458 Lecture 7
Channel Capacity
TODAY’S TOPICS
Differential Entropy
Mutual Information for Continuous R.V.
Gaussian Channel
REMEMBER
Mutual information is a quantity that measures a relationship
between two random variables that are sampled simultaneously.
In particular, it measures how much information is
communicated, on average, in one random variable about
another.
DIFFERENTIAL ENTROPY
We introduce the concept of differential entropy, which is the
entropy of a continuous random variable.
Definition Let X be a random variable with cumulative distribution
function F(x) = \Pr(X \le x). If F(x) is continuous, the random
variable is said to be continuous. Let f(x) = F'(x) when the
derivative is defined. If \int_{-\infty}^{\infty} f(x)\,dx = 1, then
f(x) is called the probability density function for X. The set where
f(x) > 0 is called the support set of X.
Differential Entropy
The differential entropy h(X) of a continuous random variable X with density f(x) is

h(X) = -\int_S f(x)\,\log f(x)\,dx

where S is the support set of X.
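As an illustration (not from the slides), h(X) can be checked numerically. The sketch below assumes scipy is available; diff_entropy is a hypothetical helper that integrates -f(x) log2 f(x) over a finite support:

```python
import numpy as np
from scipy.integrate import quad

def diff_entropy(pdf, lo, hi):
    """Numerically evaluate h(X) = -integral of f(x) * log2 f(x) over [lo, hi], in bits."""
    integrand = lambda x: -pdf(x) * np.log2(pdf(x)) if pdf(x) > 0 else 0.0
    h, _ = quad(integrand, lo, hi)
    return h
```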
Example: Uniform Distribution
Let X be uniformly distributed on [0, a], so that f(x) = 1/a on the support. Then

h(X) = -\int_0^a \frac{1}{a}\,\log\frac{1}{a}\,dx = \log a
Note: For a < 1, \log a < 0, and the differential entropy is negative.
Hence, unlike discrete entropy, differential entropy can be
negative. However, 2^{h(X)} = 2^{\log a} = a is the volume of the support
set, which is always nonnegative, as we expect.
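Reusing the diff_entropy sketch above, the uniform case with a = 1/2 gives the negative value the note describes (a hypothetical check, assuming the helper defined earlier):

```python
a = 0.5                                        # support width less than 1
h = diff_entropy(lambda x: 1.0 / a, 0.0, a)    # uniform density f(x) = 1/a on [0, a]
print(h)                                       # log2(0.5) = -1.0 bit: negative, as noted
```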
Example: Normal Distribution
Let

X \sim \phi(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}
Then calculating the differential entropy in nats, we obtain
h(\phi) = -\int \phi(x)\,\ln\phi(x)\,dx
        = -\int \phi(x)\left[-\frac{x^2}{2\sigma^2} - \ln\sqrt{2\pi\sigma^2}\right]dx
        = \frac{E[X^2]}{2\sigma^2} + \frac{1}{2}\ln 2\pi\sigma^2
        = \frac{1}{2} + \frac{1}{2}\ln 2\pi\sigma^2
        = \frac{1}{2}\ln e + \frac{1}{2}\ln 2\pi\sigma^2
        = \frac{1}{2}\ln 2\pi e\sigma^2 \text{ nats}
Changing the base of the logarithm:

h(\phi) = \frac{1}{2}\log 2\pi e\sigma^2 \text{ bits}
Note: The Gaussian distribution maximizes the differential entropy over all
distributions with the same variance.
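A numerical cross-check of the closed form (a sketch assuming scipy is available; scipy.stats reports differential entropy in nats, matching the derivation above):

```python
import numpy as np
from scipy.stats import norm

sigma = 2.0
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # (1/2) ln(2*pi*e*sigma^2), nats
print(closed_form)                  # ≈ 2.112 nats
print(norm(scale=sigma).entropy())  # same value from scipy's built-in entropy
```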
MUTUAL INFORMATION FOR CONTINUOUS R.V.
Definition The mutual information I (X; Y) between two random
variables with joint density f (x, y) is defined as:
I(X;Y) = \int f(x,y)\,\log\frac{f(x,y)}{f(x)\,f(y)}\,dx\,dy
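For jointly Gaussian X and Y with correlation coefficient \rho, this integral has the closed form I(X;Y) = -\frac{1}{2}\log(1-\rho^2). A minimal sketch (the function name gaussian_mi_bits is hypothetical):

```python
import numpy as np

def gaussian_mi_bits(rho):
    """I(X;Y) = -0.5 * log2(1 - rho^2) for jointly Gaussian X, Y with correlation rho."""
    return -0.5 * np.log2(1.0 - rho**2)

print(gaussian_mi_bits(0.0))  # 0.0 bits: independent variables share no information
print(gaussian_mi_bits(0.9))  # ≈ 1.2 bits: strong correlation
```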
GAUSSIAN CHANNEL
The most important continuous-alphabet channel: the additive white Gaussian noise (AWGN) channel.
CAPACITY OF GAUSSIAN CHANNEL
The channel is Y_i = X_i + Z_i, where the noise Z_i \sim N(0, N) is
independent of the input and the codewords satisfy the average power
constraint \frac{1}{n}\sum_{i=1}^{n} x_i^2 \le P. The information capacity is

C = \max_{f(x):\, E X^2 \le P} I(X;Y)

Since I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z), and the normal distribution
maximizes differential entropy for a given variance,

I(X;Y) \le \frac{1}{2}\log 2\pi e(P+N) - \frac{1}{2}\log 2\pi e N = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)
Equality holds when Y is normal; since Z is normal, this occurs when the
input is normal, so the optimizing input distribution is X \sim N(0, P).
GAUSSIAN CHANNEL CAPACITY THEOREM
Theorem. The capacity of a Gaussian channel with power constraint P and noise variance N is

C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right) \text{ bits per transmission}
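A one-line numeric reading of the theorem (a sketch; awgn_capacity is a hypothetical helper):

```python
import numpy as np

def awgn_capacity(P, N):
    """C = 0.5 * log2(1 + P/N) bits per channel use."""
    return 0.5 * np.log2(1.0 + P / N)

print(awgn_capacity(P=1.0, N=1.0))    # 0.5 bit/use at 0 dB SNR
print(awgn_capacity(P=100.0, N=1.0))  # ≈ 3.33 bits/use at 20 dB SNR
```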
BAND LIMITED GAUSSIAN CHANNEL
For a channel band-limited to W Hz, with noise power spectral density N_0/2
and power constraint P, sampling at the Nyquist rate of 2W samples per second
turns the waveform channel into the discrete-time Gaussian channel above, and
the capacity becomes

C = W\log\left(1 + \frac{P}{N_0 W}\right) \text{ bits per second}
EXAMPLE: TELEPHONE LINE
Telephone signals are band-limited to 3300 Hz.
Capacity ≈ 36 kb/s
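The 36 kb/s figure follows from the band-limited formula. The SNR is not stated on the slide, so the 33 dB value below is an assumption (the value Cover and Thomas use for this example):

```python
import numpy as np

def bandlimited_capacity_bps(W_hz, snr_linear):
    """C = W * log2(1 + SNR) bits per second for a band-limited Gaussian channel."""
    return W_hz * np.log2(1.0 + snr_linear)

snr = 10 ** (33 / 10)  # assumed SNR of 33 dB (not given on the slide)
print(bandlimited_capacity_bps(3300, snr))  # ≈ 36,200 b/s ≈ 36 kb/s
```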
IMPLICATIONS OF THE INFORMATION
CAPACITY THEOREM