
INFORMATION THEORY & CODING

6th Semester, 2020-21

ANIRBAN BHATTACHARJEE
ASSISTANT PROFESSOR
DEPARTMENT OF ECE, NIT AGARTALA



DIFFERENTIAL ENTROPY

❑ Differential entropy (also referred to as continuous entropy) is a concept in information theory that extends the idea of entropy, a measure of the average uncertainty of a random variable, to continuous probability distributions.

Consider a continuous random variable $X$ with pdf $f_X(x)$. Then the differential entropy is defined as

$$h(X) = \int_{-\infty}^{\infty} f_X(x)\,\log_2\frac{1}{f_X(x)}\,dx$$

(a numerical evaluation of this integral is sketched below).

❑ The term differential entropy is used to distinguish it from ordinary entropy.
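
As an illustration of the defining integral (this example is an addition to the slides; it assumes NumPy and SciPy are available, and the helper name differential_entropy_bits is just an illustrative choice), the sketch below evaluates $h(X)$ numerically for an exponential pdf $f_X(x) = \lambda e^{-\lambda x}$, $x \ge 0$, whose differential entropy has the closed form $\log_2(e/\lambda)$ bits.

# A minimal sketch (assumes SciPy): evaluate the defining integral
#   h(X) = ∫ f_X(x) log2(1 / f_X(x)) dx
# by numerical quadrature for an example pdf.
import numpy as np
from scipy.integrate import quad

def differential_entropy_bits(pdf, lower, upper):
    # Integrand -f(x) log2 f(x); contributes 0 where the pdf vanishes.
    integrand = lambda x: -pdf(x) * np.log2(pdf(x)) if pdf(x) > 0 else 0.0
    value, _ = quad(integrand, lower, upper)
    return value

lam = 2.0
exp_pdf = lambda x: lam * np.exp(-lam * x)              # exponential pdf on x >= 0

print(differential_entropy_bits(exp_pdf, 0.0, 50.0))    # numerical estimate
print(np.log2(np.e / lam))                              # closed form: log2(e/λ) ≈ 0.4427 bits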

❑ We consider the continuous random variable $X$ as the limiting form of a discrete random variable that assumes the values $x_k = k\,\Delta x$, where $k = 0, \pm 1, \pm 2, \ldots$ and $\Delta x$ approaches zero (i.e. the values are very closely spaced).

❑ For the interval $[x_k, x_k + \Delta x]$, the probability of occurrence of $X$ in that interval is $f_X(x_k)\,\Delta x$. With $\Delta x \to 0$, the ordinary entropy of the continuous random variable $X$ can be expressed as

$$H(X) = \lim_{\Delta x \to 0} \sum_{k=-\infty}^{\infty} f_X(x_k)\,\Delta x \,\log_2\frac{1}{f_X(x_k)\,\Delta x}$$

$$= \lim_{\Delta x \to 0}\left[\sum_{k=-\infty}^{\infty} f_X(x_k)\,\log_2\frac{1}{f_X(x_k)}\,\Delta x \;-\; \log_2\Delta x \sum_{k=-\infty}^{\infty} f_X(x_k)\,\Delta x\right]$$

$$= \int_{-\infty}^{\infty} f_X(x)\,\log_2\frac{1}{f_X(x)}\,dx \;-\; \lim_{\Delta x \to 0}\log_2\Delta x \int_{-\infty}^{\infty} f_X(x)\,dx$$

Since $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$, this gives

$$H(X) = h(X) - \lim_{\Delta x \to 0}\log_2\Delta x$$

❑ From this relation we observe that as $\Delta x \to 0$, $-\log_2\Delta x$ approaches $\infty$, which means that the ordinary (absolute) entropy of a continuous random variable is infinite.

❑ Thus the uncertainty associated with a continuous random variable is infinite.

❑ To avoid this problem we adopt $h(X)$ as the entropy measure for a continuous random variable, taken with respect to the reference $-\log_2\Delta x$ (a numerical illustration of this reference term is sketched below).

❑ The amount of information transmitted over a channel is the difference of the corresponding differential entropies; since both terms share the same reference $-\log_2\Delta x$, the reference cancels and the difference is finite and well defined.
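
To illustrate the role of the reference term $-\log_2\Delta x$ (this numerical sketch is an addition to the slides and assumes NumPy is available), the code below quantizes a Gaussian random variable with bin width $\Delta x$ and shows that the ordinary entropy of the quantized variable grows like $h(X) - \log_2\Delta x$ as $\Delta x \to 0$, while $h(X)$ itself stays finite.

# A minimal sketch (assumes NumPy): quantize a Gaussian with bin width dx and
# compare the ordinary (discrete) entropy with h(X) - log2(dx).
import numpy as np

mu, sigma = 0.0, 1.0
h_X = np.log2(sigma * np.sqrt(2 * np.pi * np.e))        # differential entropy in bits

for dx in (0.5, 0.1, 0.01, 0.001):
    xk = np.arange(-10.0, 10.0, dx)                      # sample points x_k = k*dx
    pk = np.exp(-(xk - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi)) * dx
    pk = pk[pk > 0]
    H_discrete = -np.sum(pk * np.log2(pk))               # entropy of the quantized variable
    print(f"dx = {dx:6.3f}  H = {H_discrete:7.3f}  h(X) - log2(dx) = {h_X - np.log2(dx):7.3f}")

As $\Delta x$ shrinks, the two printed columns agree, which is exactly the relation $H(X) = h(X) - \lim_{\Delta x \to 0}\log_2\Delta x$ derived above.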

Uniform Distribution:
Determine the differential entropy of a random variable $X$ uniformly distributed over the interval $(0, a)$.

Here $f_X(x) = \frac{1}{a}$ for $0 < x < a$ and zero elsewhere, so

$$h(X) = \int_0^a \frac{1}{a}\,\log_2 a\,dx = \log_2 a$$

For $a < 1$, $h(X)$ is negative, which shows that, unlike the entropy of a discrete random variable, differential entropy can be negative.
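
As a quick numerical cross-check (an illustrative addition, assuming NumPy), the sketch below estimates $h(X)$ for $X$ uniform on $(0, a)$ from a histogram of random samples and compares it with the closed form $\log_2 a$, including a case with $a < 1$ where the estimate comes out negative.

# A minimal sketch (assumes NumPy): estimate h(X) for X ~ Uniform(0, a) from a
# histogram of samples and compare with the closed form log2(a).
import numpy as np

rng = np.random.default_rng(0)
for a in (4.0, 1.0, 0.25):
    samples = rng.uniform(0.0, a, size=1_000_000)
    counts, edges = np.histogram(samples, bins=200, range=(0.0, a))
    dx = edges[1] - edges[0]
    f_hat = counts / (counts.sum() * dx)                 # estimated pdf in each bin
    f_hat = f_hat[f_hat > 0]
    h_est = -np.sum(f_hat * np.log2(f_hat) * dx)
    print(f"a = {a:5.2f}  estimate = {h_est:7.3f}  log2(a) = {np.log2(a):7.3f}")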


Gaussian Distribution:

$$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$

Let us consider any two continuous random variables $X$ and $Y$, each with mean $\mu$ and variance $\sigma^2$, with corresponding pdfs $f_X(x)$ and $f_Y(y)$ respectively, where $X$ is the Gaussian random variable above.

We can write

$$h(Y) = \int_{-\infty}^{\infty} f_Y(y)\,\log_2\frac{1}{f_Y(y)}\,dy \;\le\; \int_{-\infty}^{\infty} f_Y(x)\,\log_2\frac{1}{f_X(x)}\,dx$$

(where $x$ is a dummy variable; the inequality follows from the fundamental inequality $\int_{-\infty}^{\infty} f_Y(x)\,\log_2\frac{f_X(x)}{f_Y(x)}\,dx \le 0$).


Substituting the Gaussian pdf $f_X(x)$ into the right-hand side and using $E[(Y-\mu)^2] = \sigma^2$, we can write

$$h(Y) \le \log_2 e\left[\tfrac{1}{2}\log_e e + \log_e\!\left(\sigma\sqrt{2\pi}\right)\right]$$

$$h(Y) \le \log_2 e\left[\log_e\sqrt{e} + \log_e\!\left(\sigma\sqrt{2\pi}\right)\right]$$

$$h(Y) \le \log_2 e\,\log_e\!\left(\sigma\sqrt{2\pi e}\right)$$

$$h(Y) \le \log_2\!\left(\sigma\sqrt{2\pi e}\right) = h(X)$$

where $h(X) = \log_2\!\left(\sigma\sqrt{2\pi e}\right)$. Thus, for a given variance $\sigma^2$, the Gaussian random variable has the largest differential entropy.
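
As a numerical check of this maximum-entropy property (an illustrative addition, assuming NumPy and SciPy), the sketch below compares the closed-form $h(X)$ of a Gaussian with the differential entropy of a Laplace random variable of the same variance, computed by quadrature; the Gaussian value is the larger of the two, as the inequality predicts.

# A minimal sketch (assumes NumPy and SciPy): among densities with the same
# variance sigma^2, the Gaussian attains the largest differential entropy.
import numpy as np
from scipy.integrate import quad

sigma = 1.5
h_gauss = np.log2(sigma * np.sqrt(2 * np.pi * np.e))     # closed form log2(sigma*sqrt(2*pi*e))

b = sigma / np.sqrt(2)                                   # Laplace scale: variance 2*b**2 = sigma**2
laplace_pdf = lambda x: np.exp(-abs(x) / b) / (2 * b)

# Integrate -f(x) log2 f(x) over x >= 0 and double it (the integrand is even).
half, _ = quad(lambda x: -laplace_pdf(x) * np.log2(laplace_pdf(x)), 0.0, 60.0)
h_laplace = 2 * half

print(f"h(Gaussian, sigma={sigma}) = {h_gauss:.4f} bits")
print(f"h(Laplace,  same variance) = {h_laplace:.4f} bits")   # strictly smaller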
