Lec3 Entropy 1
Presented By:
Vibha Jain
Assistant Professor
Manipal University, Jaipur
Entropy
▪ Entropy is a measure of the average information content per source symbol.
▪ Entropy is the average information provided by the source.
▪ Let a source transmit messages M = {m1, m2, …, mq} with probabilities of occurrence P = {p1, p2, …, pq}.
▪ The total probability of all messages must be 1: $p_1 + p_2 + \dots + p_q = \sum_{i=1}^{q} p_i = 1$.
▪ Information carried by a single message $m_k$ is $I_k = \log_2\!\left(\tfrac{1}{p_k}\right)$ bits.
▪ If message $m_k$ is transmitted $n_k$ times, the information it contributes is $n_k \cdot \log_2\!\left(\tfrac{1}{p_k}\right)$ bits.
. . .
▪ The total information contributed by $m_q$ (transmitted $n_q$ times) is $n_q \cdot \log_2\!\left(\tfrac{1}{p_q}\right)$ bits.
Thus, the total information transmitted by the source is:

$$\text{Total information} = n_1 \log_2\!\left(\tfrac{1}{p_1}\right) + n_2 \log_2\!\left(\tfrac{1}{p_2}\right) + \dots + n_q \log_2\!\left(\tfrac{1}{p_q}\right) = \sum_{i=1}^{q} n_i \log_2\!\left(\tfrac{1}{p_i}\right)$$

where $n_i$ is the number of times symbol $m_i$ is transmitted and $n = n_1 + n_2 + \dots + n_q$ is the total number of messages.

Therefore,

$$\text{Entropy} = \frac{\text{Total information}}{\text{Number of messages}} = \frac{1}{n}\sum_{i=1}^{q} n_i \log_2\!\left(\tfrac{1}{p_i}\right) = \sum_{i=1}^{q} \frac{n_i}{n}\,\log_2\!\left(\tfrac{1}{p_i}\right)$$

Since $\frac{n_i}{n}$ is the relative frequency of symbol $m_i$, it equals the probability $p_i$.

Thus,

$$\text{Entropy } H(S) = \sum_{i=1}^{q} p_i \log_2\!\left(\tfrac{1}{p_i}\right) \text{ bits/symbol}$$
Entropy is the average amount of information obtained from the source.
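To make the formula concrete (this sketch is not from the original slides), here is a minimal Python version of the entropy computation; the function name `entropy` is my own choice, and only the standard-library `math.log2` is used:

```python
import math

def entropy(probs):
    """Shannon entropy H(S) = sum of p_i * log2(1/p_i), in bits/symbol."""
    # Terms with p_i == 0 are skipped, since p * log2(1/p) -> 0 as p -> 0.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Four equally likely symbols: H(S) = log2(4) = 2 bits/symbol
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```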
Example 1 Entropy…
Find the entropy of the following:
1) S = {s1, s2, s3, s4} ; P= {1/4, 1/4, 1/4, 1/4}
2) S = {s1, s2, s3, s4} ; P= {49/100, 49/100, 1/100, 1/100}
3) S = {s1, s2, s3, s4} ; P = {297/300, 1/300, 1/300, 1/300}
Solution: Entropy $H(S) = \sum_{i=1}^{q} p_i \log_2\!\left(\tfrac{1}{p_i}\right)$
1) H(S) = 4 × (1/4 · log2 4)
        = 2 bits/symbol
2) H(S) = 2 × (49/100 · log2 (100/49)) + 2 × (1/100 · log2 100)
        = 0.98 log2 (100/49) + 0.02 log2 100
        ≈ 1.141 bits/symbol
3) H(S) = 297/300 · log2 (300/297) + 3 × (1/300 · log2 300)
        = 0.99 log2 (300/297) + 0.01 log2 300
        ≈ 0.097 bits/symbol
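As a quick numerical check (not part of the original slides), the three results can be reproduced with the hypothetical `entropy` helper sketched earlier:

```python
print(round(entropy([1/4, 1/4, 1/4, 1/4]), 3))            # 2.0
print(round(entropy([49/100, 49/100, 1/100, 1/100]), 3))  # 1.141
print(round(entropy([297/300, 1/300, 1/300, 1/300]), 3))  # 0.097
```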
Example 2 Entropy…
Find the entropy H(S) for the probabilities {1/2, 1/4, 1/4}.
Solution: Entropy $H(S) = \sum_{i=1}^{q} p_i \log_2\!\left(\tfrac{1}{p_i}\right)$
H(S) = 1/2 · log2 2 + 2 × (1/4 · log2 4)
     = 0.5 + 1
     = 1.5 bits/symbol
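The same hypothetical `entropy` helper from earlier confirms this value:

```python
print(entropy([1/2, 1/4, 1/4]))  # 1.5
```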
Example 3 Entropy…
Find the entropy H(S) for the probabilities {1/2, 1/4, 1/8, 1/8}.
Solution: Entropy $H(S) = \sum_{i=1}^{q} p_i \log_2\!\left(\tfrac{1}{p_i}\right)$
H(S) = 1/2 · log2 2 + 1/4 · log2 4 + 2 × (1/8 · log2 8)
     = 0.5 + 0.5 + 0.75
     = 1.75 bits/symbol
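Again checking with the hypothetical `entropy` helper from earlier:

```python
print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75
```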
Properties of Entropy
2) $H(S) = \log_2 M$ if $p_k = \frac{1}{M}$ for all $M$ symbols, i.e., the symbols are equally probable. This upper bound on entropy corresponds to maximum uncertainty.
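As an illustration of this upper bound (my own sketch, reusing the hypothetical `entropy` helper defined earlier), a uniform distribution over M symbols reaches log2(M), while a skewed distribution over the same symbols stays below it:

```python
import math

M = 8
uniform = [1.0 / M] * M
print(entropy(uniform), math.log2(M))       # 3.0 3.0 -- maximum reached

skewed = [0.5] + [0.5 / (M - 1)] * (M - 1)  # still sums to 1
print(entropy(skewed) < math.log2(M))       # True -- below the upper bound
```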
Properties of Entropy… LOWER BOUND
▪ Marginal Entropy
▪ Joint Entropy
▪ Conditional Entropy