Lecture Notes: July 2020 (Be Safe and Stay at Home)
Course: Information Theory and Coding | Course Code: BCSE3087
Dr. Saurabh Kumar Srivastava, School of Computing Science and Engineering
17.07.2020 Assignment
1- Hartley's Assumptions-
The amount of information contained in two messages taken together should be the sum of the
information contained in the individual messages: I(M) = I(M1) + I(M2).
Means-
The amount of information in L messages of a given length should be L times the amount
of information in one message of that length: I(L messages) = L · I(M).
Note: Hartley’s assumptions were based on –
1. Each symbol has an equal probability of occurrence.
2. All symbols are independent of each other.
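A brief aside (standard Hartley result, added here for completeness): under these two assumptions, a message of L symbols drawn from an alphabet of S equally likely symbols can be any one of S^L sequences, so its information is I = log(S^L) = L · log S, which satisfies exactly the additivity stated above.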
Examples-
i. Tomorrow the sun will rise in the east. (Certain event: surprise/uncertainty ≈ 0, least information)
ii. India wins the cricket match series against Australia by 5-0. (Less likely event: more surprise, more information)
iii. There is snowfall in Mumbai. (Highly improbable event: maximum surprise/uncertainty, maximum information)
According to Shannon-
Information ∝ Uncertainty / Surprise (directly proportional)
Information is inversely proportional to the probability of occurrence of the event:
as P → 1, I → 0
as P → 0, I → ∞
Note- log2(X) = log10(X) / log10(2) (conversion formula; when base-2 logarithms are used, the unit of information is bits). For example, log2(8) = log10(8)/log10(2) = 3.
Data Compression-
Ik = log2 (1/Pk)
Properties of information
Solution-
Ik = log2(1/Pk)
a. For Pk = 1/4:
Ik = log2(2^2)
Ik = 2 bits
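A minimal Python sketch (added for illustration; not part of the original notes) that evaluates the self-information formula Ik = log2(1/Pk). The probability 1/4 is the one implied by part (a):

import math

def self_information(p):
    # Self-information, in bits, of an event that occurs with probability p.
    return math.log2(1.0 / p)

print(self_information(1 / 4))  # 2.0 bits, matching the worked answer above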
Question 2-
A card is selected at random from a deck of 52 playing cards. You are told
that the card is red (there are 26 red cards). Find out-
i. How much information have you received?
ii. How much more information do you need to completely specify the card?
Solution-
i. We know that 26 of the 52 cards are red.
Probability of the card being red: P(red) = 26/52 = 1/2
Information received: Ired = log2(1/P(red)) = log2(2) = 1 bit. Answer
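ii. The notes stop here; completing the calculation in the same way: once the card is known to be red, it is one of 26 equally likely cards, so the additional information needed to specify it exactly is log2(26) ≈ 4.7 bits. (Equivalently, identifying one card out of 52 needs log2(52) ≈ 5.7 bits, of which 1 bit has already been received.)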
So, suppose a long sequence of L messages is transmitted. Then:
L·p1 messages of type m1 are transmitted,
L·p2 messages of type m2 are transmitted,
...
L·pM messages of type mM are transmitted.
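Completing the step this counting argument leads to (a standard derivation, added here since the notes jump straight to the properties): the total information in the whole sequence is approximately
L·p1·log2(1/p1) + L·p2·log2(1/p2) + ... + L·pM·log2(1/pM),
and dividing by L gives the average information per message, i.e. the source entropy
H = Σ (k = 1 to M) pk · log2(1/pk) bits/message.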
Properties of Entropy-
1. Entropy is zero if the event is sure.
2. When Pk = 1/M for all M messages (i.e., the messages are equally
likely), then H = log2 M.
3. Upper bound on entropy is given as
Hmax=log2M
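For example (illustration added here): a source with M = 4 equally likely messages attains the upper bound, H = Hmax = log2 4 = 2 bits/message; any unequal distribution over the same four messages gives H < 2 bits/message.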
Question 1-
For a discrete memoryless source there are three symbols (m1, m2, m3)
with probabilities p1 = x and p2 = p3. Find the entropy of the source.
Solution-
p1 = x
p2 = p3 = ?
p1 + p2 + p3 = 1
x + p2 + p3 = 1
2p2 = 1 - x
p2 = (1 - x)/2 = p3
H=?
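Completing the step left as "H = ?" (using the entropy formula above):
H = x·log2(1/x) + 2·((1 - x)/2)·log2(2/(1 - x))
  = x·log2(1/x) + (1 - x)·log2(2/(1 - x)) bits/symbol.
As a quick check, x = 1/3 makes all three symbols equally likely and gives H = log2 3 ≈ 1.585 bits/symbol.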
Question 2
Si:  S1    S2    S3    ...   Sj      ...   Sn-1        Sn
Pi:  1/2   1/4   1/8   ...   1/2^j   ...   1/2^(n-1)   1/2^n
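The task statement is missing from the notes; assuming, as in Question 1, that the source entropy is required, a minimal Python sketch (illustrative only; the value of n below is an assumption) evaluates H = Σ pi·log2(1/pi) for the listed probabilities:

import math

def entropy(probs):
    # Entropy in bits: sum of p * log2(1/p) over the distribution.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

n = 8  # example value of n (not fixed in the notes)
# Probabilities exactly as listed in the table: 1/2, 1/4, ..., 1/2^(n-1), 1/2^n.
# As written they sum to 1 - 1/2^n; a common variant assigns the last symbol
# probability 1/2^(n-1) so that the total is exactly 1.
probs = [1 / 2 ** j for j in range(1, n)] + [1 / 2 ** n]
print(sum(probs), entropy(probs))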
Question 3:
The source emits three messages with probabilities, p1=0.7, p2=0.2 and
p3=0.1. Calculate-
i. Source Entropy
ii. Max Entropy
iii. Source Efficiency
iv. Redundancy
Try this--
1. Source Entropy (H) = Σ (k = 1 to 3) pk · log2(1/pk)
Answer: 1.1568 bits/message
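2. Max Entropy: Hmax = log2 M = log2 3 ≈ 1.585 bits/message (this intermediate step is not shown in the notes but is needed for the efficiency below).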
3. Source Efficiency: ηsource = H / Hmax
Answer: ≈ 0.73
4. Redundancy: Redundancysource = 1 - ηsource
Answer: ≈ 0.27
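A small Python check of all four quantities (added for illustration):

import math

p = [0.7, 0.2, 0.1]
H = sum(pk * math.log2(1 / pk) for pk in p)  # source entropy ≈ 1.1568 bits/message
Hmax = math.log2(len(p))                     # maximum entropy = log2 3 ≈ 1.585
eta = H / Hmax                               # source efficiency ≈ 0.73
redundancy = 1 - eta                         # ≈ 0.27
print(H, Hmax, eta, redundancy)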
Question 4- A discrete source emits one of six symbols once every millisecond
(r = 1000 symbols/second). The symbol probabilities are 1/2, 1/4, 1/8,
1/16, 1/32 and 1/32. Find the source entropy and the information rate.
R=r*H
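A worked check (added; the notes leave the numbers to the reader):
H = (1/2)·1 + (1/4)·2 + (1/8)·3 + (1/16)·4 + (1/32)·5 + (1/32)·5 = 1.9375 bits/symbol
R = r·H = 1000 × 1.9375 = 1937.5 bits/second
The same in Python:

import math

p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
H = sum(pk * math.log2(1 / pk) for pk in p)  # 1.9375 bits/symbol
r = 1000                                     # symbols per second
R = r * H                                    # 1937.5 bits/second
print(H, R)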