ITC-Post Mid1
ANIRBAN BHATTACHARJEE
ASSISTANT PROFESSOR
DEPARTMENT OF ECE, NIT AGARTALA
❑ Differential entropy (also referred to as continuous entropy) is a concept in information theory that
extends the idea of entropy, a measure of average uncertainty of a random variable, to
continuous probability distributions.
Consider a continuous random variable X with pdf f_X(x). Then the differential entropy is defined as follows:
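With logarithms taken to base 2 (so that h(X) is measured in bits), the defining integral is the same expression that appears as the first term in the limiting derivation below:
\[
h(X) = \int_{-\infty}^{\infty} f_X(x)\,\log_2\frac{1}{f_X(x)}\,dx
\]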
❑ We consider the continuous random variable X as the limiting form of a discrete random variable that assumes the values x_k = kΔx, where k = 0, ±1, ±2, …, and Δx approaches zero (i.e. the values are very closely spaced).
❑ For the interval [x_k, x_k + Δx], the probability of occurrence of X in that interval is f_X(x_k)Δx. With Δx → 0, the ordinary entropy of the continuous random variable X can be expressed as
\[
\begin{aligned}
H(X) &= \lim_{\Delta x \to 0} \sum_{k=-\infty}^{\infty} f_X(x_k)\,\Delta x \,\log_2 \frac{1}{f_X(x_k)\,\Delta x} \\
     &= \lim_{\Delta x \to 0} \left[\, \sum_{k=-\infty}^{\infty} f_X(x_k)\,\log_2\frac{1}{f_X(x_k)}\,\Delta x \;-\; \log_2 \Delta x \sum_{k=-\infty}^{\infty} f_X(x_k)\,\Delta x \,\right] \\
     &= \int_{-\infty}^{\infty} f_X(x)\,\log_2\frac{1}{f_X(x)}\,dx \;-\; \lim_{\Delta x \to 0} \log_2 \Delta x \int_{-\infty}^{\infty} f_X(x)\,dx
\end{aligned}
\]
❑ Since the total area under the pdf is one, ∫ f_X(x) dx = 1, the relation reduces to H(X) = h(X) − lim_{Δx→0} log_2 Δx. As Δx → 0, −log_2 Δx approaches ∞, which means that the ordinary (absolute) entropy of a continuous random variable is infinite.
❑ To avoid this problem, we adopt h(X) as the entropy of a continuous random variable, measured with respect to the reference −log_2 Δx, which is dropped (see the numerical sketch below).
❑ The amount of information transmitted over a channel is the difference of two such differential entropies; since both carry the same reference −log_2 Δx, the infinite reference term cancels in the difference.
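A minimal numerical sketch of the limiting argument (not from the slides; the uniform pdf on (0, 2) and the chosen bin widths are illustrative assumptions): quantizing X into bins of width Δx gives an ordinary entropy H that grows like −log_2 Δx, while H + log_2 Δx stays fixed at the differential entropy h(X) = log_2 2 = 1 bit.

```python
import numpy as np

# Illustrative assumption: X uniform on (0, a) with a = 2, so h(X) = log2(a) = 1 bit.
a = 2.0
for dx in [0.1, 0.01, 0.001]:
    n_bins = int(round(a / dx))      # number of quantization intervals of width dx
    pk = np.full(n_bins, dx / a)     # P(x_k <= X < x_k + dx) = f_X(x_k) * dx = dx / a
    H = -np.sum(pk * np.log2(pk))    # ordinary entropy of the quantized variable
    print(f"dx = {dx}: H = {H:.4f} bits, H + log2(dx) = {H + np.log2(dx):.4f} bits")
```

The printed H diverges as dx shrinks, while the corrected value H + log_2(dx) remains 1 bit, which is exactly the differential entropy of the assumed uniform pdf.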
Uniform Distribution:
Determine the differential entropy of a random variable 𝑋 uniformly distributed over the interval (0, 𝑎).
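One way to work this out from the definition above (the closing remark on the sign is a standard observation, not stated on the slide):
\[
f_X(x) = \frac{1}{a}, \qquad 0 < x < a
\]
\[
h(X) = \int_0^a \frac{1}{a}\,\log_2 a \, dx = \log_2 a \ \text{bits}
\]
Note that h(X) = log_2 a is negative for a < 1; unlike the entropy of a discrete random variable, differential entropy can be negative.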
Let us consider any two continuous random variables X and Y, each with mean μ and variance σ², and with corresponding pdfs f_X(x) and f_Y(y), respectively.
We can write