Module 1

COURSE CONTENTS:

Prepared by
Prof. MERCHANT S N
Electrical Engineering
IIT Bombay
Email: [email protected]

SUBJECT EXPERT:
Prof. KUMARASWAMY H V
Telecommunication Department
R V College of Engineering
Bangalore
Email: [email protected]
Contents

Sl. No.  Section  Topic                          Page No.
1        1.1      Module 1 NPTEL Video Links     4
2        1.2      Questions                      5
3        1.3      Quiz                           12
4        1.4      True or False                  13
5        1.5      FAQ                            14
6        1.6      Assignments                    17
7        1.7      Additional links               19
8        1.8      Test your skill questions      22
9        2.1      Module 2 NPTEL Video Links     25
10       2.2      Questions                      26
11       2.3      Quiz                           33
12       2.4      True or False                  35
13       2.5      FAQ                            37
14       2.6      Assignments                    39
15       2.7      Additional links               43
16       2.8      Test your skill                46
17       3.1      Module 3 NPTEL Video Links     48
18       3.2      Questions                      49
19       3.3      Quiz                           55
20       3.4      True or False                  56
21       3.5      FAQ                            58
22       3.6      Assignments                    60
23       3.7      Additional links               62
24       3.8      Test your skill questions      65
25       4.1      Module 4 NPTEL Video Links     68
26       4.2      Questions                      69
27       4.3      Quiz                           77
28       4.4      True or False                  78
29       4.5      FAQ                            80
30       4.6      Assignments                    84
31       4.7      Additional links               87
32       4.8      Test your skill questions      91
33       5        Reference Books                92
Module-1
1.1) NPTEL Video Links: Module-1, Lectures 1 to 9

1. Mod 01 Lec-01  Introduction to Information Theory and Coding (52:54)
   http://nptel.ac.in/courses/117101053/1
2. Mod 01 Lec-02  Definition of Information Measure and Entropy (53:16)
   http://nptel.ac.in/courses/117101053/2
3. Mod 01 Lec-03  Extension of an Information Source and Markov Source (56:07)
   http://nptel.ac.in/courses/117101053/3
4. Mod 01 Lec-04  Adjoint of an Information Source, Joint and Conditional Information Measures (56:47)
   http://nptel.ac.in/courses/117101053/4
5. Mod 01 Lec-05  Properties of Joint and Conditional Information Measures and a Markov Source (49:56)
   http://nptel.ac.in/courses/117101053/5
6. Mod 01 Lec-06  Asymptotic Properties of Entropy and Problem Solving in Entropy (56:21)
   http://nptel.ac.in/courses/117101053/6
7. Mod 01 Lec-07  Block Code and Its Properties (51:03)
   http://nptel.ac.in/courses/117101053/7
8. Mod 01 Lec-08  Instantaneous Code and Its Properties (52:31)
   http://nptel.ac.in/courses/117101053/8
9. Mod 01 Lec-09  Kraft-McMillan Equality and Compact Codes (52:18)
   http://nptel.ac.in/courses/117101053/9
1.2) Questions
Sl. No.  Questions  Video Number  Time in Minutes
1 With suitable examples, discuss the following types of information: 1 7
i) Syntactic
ii) Semantic
iii) Pragmatic
2 What are the roles of the source and channel encoder and decoder in a 1 13
communication system?
3 Which of the following statements conveys the most information, and why? 2 4
i) Tomorrow may be holiday.
ii) Our college students won the state level football match.
iii) Rose is red.
4 In a source, the probability of symbol A is higher than the probability of symbol B. 2
Which symbol gives more information, and why?
5 Is it possible to transmit signals in a noisy channel without error? Justify your 2
answer.
6 If the rate of transmission Rb is less than the channel capacity C, can we transmit 2
without error? Justify your answer.
7 What are zero memory sources? 2 20
8 A source has two symbols with probabilities P and (1-P). Plot the entropy as a 2 40
function of P. For what value of P is the entropy maximum?
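One way to see the answer to question 8 is to evaluate the binary entropy function H(P) = -P log2 P - (1-P) log2(1-P) numerically. Below is a minimal Python sketch (the helper name binary_entropy is only illustrative, not part of the course material):

import math

def binary_entropy(p):
    # H(P) = -P*log2(P) - (1-P)*log2(1-P), with H(0) = H(1) = 0 by convention
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Evaluate on a grid of P values and locate the peak
grid = [i / 100 for i in range(101)]
p_max, h_max = max(((p, binary_entropy(p)) for p in grid), key=lambda pv: pv[1])
print(p_max, h_max)   # expect P = 0.5 and H = 1 bit/symbol

The curve is symmetric about P = 0.5, where the entropy reaches its maximum of 1 bit/symbol.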
9 If a source has eight symbols, what will be the maximum entropy of the source, and why? 2
10 Why is information expressed in logarithmic form? 1 40
2 10
11 A source has four symbols S1, S2, S3 and S4 with probabilities 1/2, 1/4, 1/8 2
and 1/8 respectively.
i) Find the information conveyed by each symbol.
ii) Find the entropy of the system
12 A source has three symbols S1, S2 and S3 with probabilities 0.7, 0.2 and 0.1 2
respectively.
i) Find the information conveyed by each symbol and comment on the
result.
ii) Also find the entropy of the system
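The arithmetic in questions 11 and 12 can be checked with a short script that computes the self-information I(si) = log2(1/pi) of each symbol and the source entropy H(S) = Σ pi log2(1/pi). A minimal sketch (function names are only illustrative):

import math

def self_information(p):
    # I(s) = log2(1/p) bits
    return math.log2(1.0 / p)

def entropy(probs):
    # H(S) = sum of p*log2(1/p) over all symbols
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

probs_q11 = [0.5, 0.25, 0.125, 0.125]              # question 11
print([self_information(p) for p in probs_q11])    # 1, 2, 3, 3 bits
print(entropy(probs_q11))                          # 1.75 bits/symbol

probs_q12 = [0.7, 0.2, 0.1]                        # question 12
print([round(self_information(p), 3) for p in probs_q12])
print(round(entropy(probs_q12), 3))                # about 1.157 bits/symbol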
13 What is source extension? Why is it needed in communication? 3 18
14 What are minimum and maximum values of entropy? 2 25
15 Define the following terms 2
i) Self information
ii) Entropy
iii) Rate of transmission
17 A TV system has 525 x 600 pixels. If each pixel can take one of 10 gray 2 13
levels, what is the information conveyed by each frame?
18 Consider a zero memory information source with a q-symbol alphabet 2 38
S = {s1, s2, s3, ..., sq} with associated probabilities P = {p1, p2, p3, ..., pq}
i) Show that H(S) ≤ log q
ii) H(S) is maximum when pi = 1/q for all i
19 Show that the entropy of the nth extension, S^n, of a zero memory source is 'n' times the 3 25
entropy of the original source, i.e.,
H(S^n) = nH(S)
20 What is redundancy in a source? How is redundancy related to the entropy of a 3 4
zero memory source?
21 A source has three symbols S1, S2 and S3 with probabilities 0.8, 0.15 and 3 13
0.05 respectively.
i) Find the entropy of the system
ii) Also find the entropies of the 2nd and 3rd extensions
22 What is the significance of the Markov source model? Mention the different methods 3 38
used to represent a Markov source model.
23 If the source alphabet is S = {0, 1}, draw the state diagram of a 2nd order Markov 3 41
model.
24 A 2nd order Markov model has the following probabilities 3 42
P(0/00)=P(1/11)=0.85
P(1/00)=P(0/11)=0.15
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5. Write state and trellis diagram
25 A 2nd order Markov model has the following probabilities 3 48
P(0/00)=P(1/11)=1
P(1/00)=P(0/11)=0
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5
Write state and trellis diagram and comment on the result.
26 What is Homogeneous Markov model? What is its significance? 3 50
27 What is Ergodic Markov model? Explain with example. 3 50
28 Derive an expression for Entropy of a 2nd order Markov source. 3
29 Briefly discuss all the properties of entropy. 3
30 If E1 and E2 are two events which give information I1 and I2, what is the 2 10
information conveyed by the joint event E1E2, i.e. I(E1E2)?
31 What is the meaning of Markov Source of Mth order? 3 33
32 A 2nd order Markov model has the following probabilities 4 2
P(0/00)=P(1/11)=0.8
P(1/00)=P(0/11)=0.2
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5
i) Write state and trellis diagram
ii) Find the probabilities of each state
iii) Find the Entropy of the source
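Parts (ii) and (iii) of question 32 can be checked numerically by treating the four bit pairs 00, 01, 10, 11 as states, iterating the state distribution to its stationary values, and then evaluating H = Σ P(state) Σ P(symbol/state) log2(1/P(symbol/state)). A minimal Python sketch assuming only the probabilities given in the question:

import math

# Conditional symbol probabilities P(symbol / previous two symbols) from question 32
p_sym = {
    "00": {"0": 0.8, "1": 0.2},
    "01": {"0": 0.5, "1": 0.5},
    "10": {"0": 0.5, "1": 0.5},
    "11": {"0": 0.2, "1": 0.8},
}

def next_state(state, symbol):
    # Emitting a symbol shifts the two-symbol history
    return state[1] + symbol

# Power-iterate the state distribution until it settles at the stationary values
dist = {s: 0.25 for s in p_sym}
for _ in range(1000):
    new = {s: 0.0 for s in p_sym}
    for state, p_state in dist.items():
        for symbol, p in p_sym[state].items():
            new[next_state(state, symbol)] += p_state * p
    dist = new

entropy = sum(
    p_state * sum(p * math.log2(1.0 / p) for p in p_sym[state].values())
    for state, p_state in dist.items()
)
print(dist)               # expect P(00) = P(11) = 5/14 and P(01) = P(10) = 1/7
print(round(entropy, 3))  # about 0.801 bits/symbol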
33 How many states does a 2nd order Markov source have? 4
34 Derive an expression for Entropy of Mth order Markov source 4 7
35 What is zero memory source? 4 22
36 Show that in Markov model 4 30
37 How is a state diagram different from a trellis diagram in Markov modeling? 3 39
38 Obtain the relation between in a Markov source 4 24
39 Define marginal probabilities and illustrate with an example. 4 48
41 Define the following terms 4 50
i) Marginal information measure
ii) Joint information measure
iii) Conditional information measure
44 For a stationary and ergodic Markov model, show that H(S1, S2) ≤ 2H(S) 5 22
45 Prove that the conditional amount of information FN(s) = H(sN/sN-1, ..., s2, s1) of the 5 32
Nth symbol in the case where the preceding N-1 symbols are known is a
monotonically decreasing function of N.
46 Derive an expression for the amount of information of a discrete information 5 37
source with memory.
47 Prove that if H(V) is the amount of information for a message of length N, then 6 5
the amount of information per symbol defined by HN(s) is a monotonically
decreasing function of N.
48 Show that if H(V) is the amount of information for a message of length N then 6 6
49 A 2nd order Markov model of a source s = {0, 1} has the following probabilities 6 15
P(0/00)=0.8 P(0/01)=0.5
P(0/10)=0.5 P(0/11)=0.2
Write state diagram
50 A 2nd order Markov model of a source s = {0, 1} has the following probabilities 6 16
P(0/00)=0.8 P(0/01)=0.5
P(0/10)=0.5 P(0/11)=0.2
i) How large is the amount of information of a trigram originating from this
information source?
ii) Determine the amount of information per symbol, denoted by H3(s)
51 A 2nd order Markov model of a source s = {0, 1} has the following probabilities 6 24
P(0/00)=0.8 P(0/01)=0.5
P(0/10)=0.5 P(0/11)=0.2
i) How large is the amount of information of a bigram originating from this
information source?
ii) Determine the amount of information per symbol, denoted by H2(s)
52 A 2nd order Markov model of a source s = {0, 1} has the following probabilities 6 28
P(0/00)=0.8 P(0/01)=0.5
P(0/10)=0.5 P(0/11)=0.2
i) How large is H1(s)?
ii) Find H3(s), H2(s) and H1(s)
iii) Find F3(s), F2(s) and F1(s)
53 Consider a 1st order Markov model of a source s = {s1, s2, s3} in which the transition 6 38
probabilities from si to sj with i ≠ j are all equal to p/2.
Draw the state diagram.
54 Consider a 1st order Markov model of a source s = {s1, s2, s3} in which the transition 6 40
probabilities from si to sj with i ≠ j are all equal to p/2.
I. Determine the probabilities of symbols s1, s2 and s3
II. Find H(s2/s1) and discuss the result with a suitable graph.
55 What is the function of source code? 7 2
56 What are the differences between source coding and channel coding? 7
57 What is the minimum achievable average codeword length for a source? 7
58 How to design a source code which gives minimum length? 7
59 How does a practical source code differ from an optimum source code? 7
60 Define block code and give an example for the same. 7 10
61 Define code word. 7 10
62 What is a binary code? 7 15
63 What is a nonsingular block code? Give one example. 7 15
64 A source has four symbols with codes as follows 7
SYMBOLS CODES
S1 0
S2 10
S3 110
S4 111
Is it uniquely decodable code? Justify your answer.
65 A source has four symbols with codes as follows 7 28
SYMBOLS CODES
S1 0
S2 11
S3 00
S4 01
Is it non singular code? Justify your answer.
66 A source has four symbols with codes as follows 7 40
SYMBOLS CODES
S1 0
S2 01
S3 011
S4 0111
i) Is it non singular code?
ii) Is it instantaneous code? Justify your answer.
67 What is the condition for a code to be uniquely decodable? 7 30
68 Define instantaneous code and give one example for both instantaneous and 7 44
non instantaneous code.
69 What is prefix of a code? What is its significance in coding? 7 46
70 What is the necessary and sufficient condition for a code to be instantaneous? 7 48
8 4
71 A source has four symbols with codes as follows 8 8
SYMBOLS CODES
S1 0
S2 010
S3 01
S4 10
80 A source has TEN symbols, and with code alphabet X={0, 1, 2} the code lengths are 9 18
as follows
SYMBOLS CODE
LENGTHS
S1 1
S2 2
S3 2
S4 2
S5 2
S6 2
S7 3
S8 3
S9 3
S10 3
Will this code satisfy Kraft's inequality?
81 A source has NINE symbols, and with code alphabet X={0, 1, 2} the code lengths are 9 20
as follows
SYMBOLS CODE
LENGTHS
S1 1
S2 2
S3 2
S4 2
S5 2
S6 2
S7 3
S8 3
S9 3
Will this code satisfy Kraft's inequality?
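Questions 80 and 81 can be checked mechanically: for an r-symbol code alphabet and codeword lengths l1, ..., lq, Kraft's inequality requires Σ r^(-li) ≤ 1. A minimal Python sketch, applied here to the nine ternary code lengths of question 81 (the function name is only illustrative):

from fractions import Fraction

def kraft_sum(lengths, r):
    # Exact sum of r^(-l) over all codeword lengths
    return sum(Fraction(1, r ** l) for l in lengths)

lengths_q81 = [1, 2, 2, 2, 2, 2, 3, 3, 3]
total = kraft_sum(lengths_q81, r=3)
print(total)                                     # 1, i.e. satisfied with equality
print("satisfied" if total <= 1 else "violated")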
82 A source has NINE symbols, and with code alphabet X={0, 1, 2} the codes are as 9 21
follows
SYMBOLS CODES
S1 0
S2 10
S3 11
S4 12
S5 20
S6 21
S7 220
S8 221
S9 222
Will this code satisfy Kraft's inequality? And is it an instantaneous
and uniquely decodable code?
83 Define the average length of a code. What is its significance? 9 25
84 What is compact code? 9 30
85 Show that for an r-ary source Hr(S) ≤ L 9 33
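Question 85 compares the r-ary entropy Hr(S) with the average codeword length L = Σ pi li. As a purely illustrative check (reusing the probabilities of question 11 and the code of question 64, so the numbers are an example rather than part of question 85):

import math

probs = [0.5, 0.25, 0.125, 0.125]     # probabilities from question 11
codes = ["0", "10", "110", "111"]     # code from question 64
r = 2                                 # binary code alphabet

avg_length = sum(p * len(c) for p, c in zip(probs, codes))
entropy_r = sum(p * math.log(1.0 / p, r) for p in probs if p > 0)
print(avg_length)   # 1.75 code symbols per source symbol
print(entropy_r)    # 1.75, so Hr(S) <= L holds with equality for this compact code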
1.3) Quiz Questions
5 A symbol has a probability of occurrence of 0.4; the I(S) = -log2(0.4)
information conveyed by the symbol is _________
6 If source consists of 4 symbols then the maximum entropy of 2 bits
the source is __________
7 If the symbols emitted by the source are independent, then the Memoryless
source is referred to as _____
8 The relation between rate of transmission and entropy of a Rs=H(s)r
source is ______
9 Entropy of the extension, Sn, of the Zero memory is _________ n
times the Entropy of the original source
10 If the first extension entropy of a source is H(s) = 2.2 bits/symbol, 4.4 bits/symbol
then the entropies of the 2nd and 3rd extensions are ___ and 6.6 bits/symbol
____
11 The expression for redundancy in a source is _____________ R = 1 - η
12 A Markov source of Mth order means that each output symbol M
depends on the previous ______ symbols
13 For a zero memory Markov source _____ H(s)
14 If X and Y are independent events then H(X,Y)=________ H(X)+H(Y)
15 A source has 128 symbols and all symbols are equally likely then 7 bits
the entropy of the system is __________
16 In a code set, if any codeword is a prefix of some other codeword, then we Instantaneous code
cannot decode it as an __________
17 In a singular code, all the codewords for the symbols should be of _______    same length
18 A source code has an efficiency of 98%, then the redundancy in 2%
that code is _________
19 1 Nat =____________bits 1.44
20 From a deck a card is selected at random. You are told that it is an I(s) = log2[1/(1/13)] bits
Ace; the amount of information conveyed is __________
1.4) True or False
1 It is not possible to transmit signals over a noisy channel without error. T/F T
2 If the rate of transmission Rb is less than the channel capacity C, then we can T/F F
transmit without error.
3 In a source, if the probability of symbol A is higher than the probability of T/F F
symbol B, then symbol A conveys more information than symbol B
4 If a source has eight symbols, then the maximum entropy of the source is 3 bits T/F T
5 Entropy of any source is always greater than 1 T/F F
6 In Markov source, relation between is T/F T
7 H(X,Y)=H(X)+H(Y) if X and Y are independent events. T/F T
8 In a code set, if any codeword is a prefix of some other codeword, then we cannot T/F T
decode it as an instantaneous code
9 Uniquely decodable code is a subset of non singular code. T/F T
10 A block code is said to be uniquely decodable iff the nth extension of the T/F T
code is non singular for every finite n.
11 In a non singular code all the codewords for the symbols should be of T/F F
same length
12 A non singular code is a sub set of Instantaneous code T/F F
13 In a singular code all codes should be same for all symbols T/F T
14 The Kraft inequality and the Kraft-McMillan inequality are the same. T/F T
15 The condition Σ r^(-li) ≤ 1 T/F F
holds only for a uniquely decodable code but not for an instantaneous code.
16 In any code the average length of the code should be maximum T/F F
17 In an r-ary source L ≤ Hr(S) T/F F
18 In a code set, if any codeword is a prefix of some other codeword, then we cannot T/F T
decode it as an instantaneous code
19 If the Markov model is of zero order then H(S1, S2) ≤ 2H(S) T/F F
20 The relation between Hartleys and bits is 1 Hartley = 3.32 bits T/F T
1.5) FAQ
Sl. No.  FAQ  Video Number  Time in Minutes
1 A source has two symbols with probabilities P and (1-P). Plot the entropy 2 40
as a function of P. For what value of P is the entropy maximum?
2 Why is information expressed in logarithmic form? 1 40
2 10
3 A source has three symbols S1, S2 and S3 with probabilities 0.7, 0.2 2 15
and 0.1 respectively.
i) Find the information conveyed by each symbol and comment on
the result.
ii) Also find the entropy of the system
4 A TV system has 525 x 600 pixels. If each pixel can take one of 10 2 13
gray levels, what is the information conveyed by each frame?
5 Consider a zero memory information source with a q-symbol alphabet 2 38
S = {s1, s2, s3, ..., sq} with associated probabilities
P = {p1, p2, p3, ..., pq}
i) Show that H(S) ≤ log q
ii) H(S) is maximum when pi = 1/q for all i
6 Show that the entropy of the nth extension, S^n, of a zero memory source is 'n' 3 25
times the entropy of the original source,
i.e. H(S^n) = nH(S)
7 A source has three symbols S1, S2 and S3 with probabilities 0.8, 0.15 3 13
and 0.05 respectively.
i) Find the entropy of the system
ii) Also find the entropies of the 2nd and 3rd extensions
8 Derive an expression for the entropy of an Mth order Markov source 3
9 Briefly discuss all the properties of entropy 3
10 A 2nd order Markov model has the following probabilities 3 48
P(0/00)=P(1/11)=1
P(1/00)=P(0/11)=0
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5
Write state and trellis diagram and comment on the result
11 A 2nd order Markov model has the following probabilities 4 3
P(0/00)=P(1/11)=0.8
P(1/00)=P(0/11)=0.2
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5
i) Write state and trellis diagram
ii) Find the probabilities of each state
iii) Find the Entropy of the source
12 Derive an expression for Entropy of Mth order Markov source 4 7
13 Show that in Markov model 4 30
14 Obtain the relation between in a Markov source 4 24
15 Define marginal probabilities and illustrate with an example. 4 48
16 Prove the following identities of information measure 5 3
i) H(Y/X) ≥ 0
ii) H(Y/X) ≤ H(Y)
17 Prove the following identities of information measure 5 12
i) H(X,Y) = H(X) + H(Y/X)
ii) H(X,Y) = H(Y) + H(X/Y)
22 A 2nd order Markov model of a source s = {0, 1} has the following 6 15
probabilities
P(0/00)=0.8 P(0/01)=0.5
P(0/10)=0.5 P(0/11)=0.2
I) Write state diagram
II) How large is the amount of information of a trigram originating
from this information source?
III) Determine the amount of information per symbol, denoted by
H3(s)
23 A 2nd order Markov model of a source s = {0, 1} has the following 6 24
probabilities
P(0/00)=0.9 P(0/01)=0.5
P(0/10)=0.5 P(0/11)=0.1
i) How large is the amount of information of a bigram originating
from this information source?
ii) Determine the amount of information per symbol, denoted by
H2(s)
24 What is the condition for a code to be uniquely decodable? 7 30
25 Define instantaneous code and give one example for both instantaneous 7 44
and non instantaneous code.
26 What is prefix of a code? What is its significance in coding? 7 46
27 What is the necessary and sufficient condition for a code to be 7 48
instantaneous? 8 4
28 A source has four symbols 8 25
S={S1, S2 ,S3, S4}
with code alphabet X = {0, 1}
Find which of the codes satisfy Kraft's inequality and which of the
codes are instantaneous codes.
SYMBOLS  CODE A  CODE B  CODE C  CODE D  CODE E
S1       00      0       0       0       0
S2       01      100     10      10      10
S3       10      110     110     110     110
S4       11      111     111     11      11
29 State and prove sufficient condition for Kraft’s inequality 8 40
30 Prove that for a code to be an instantaneous code the condition is Σ r^(-li) ≤ 1. 9 1
Will this condition hold good for a uniquely decodable
code also? Justify your answer
31 State and prove Kraft-McMillan inequality 9 7
32 Prove that for a code to be uniquely decodable the condition is Σ r^(-li) ≤ 1. 9 8
33 A source has TEN symbols, and with code alphabet X={0, 1, 2} the code 9 18
lengths are as follows
SYMBOLS CODE
LENGTHS
S1 1
S2 2
S3 2
S4 2
S5 2
S6 2
S7 3
S8 3
S9 3
S10 3
Will this code satisfy Kraft's inequality?
34 Define the average length of a code. What is its significance? 9 25
35 What is compact code? 9 30
36 Show that for an r-ary source Hr(S) ≤ L 9 33
1.6) Assignment Questions
Sl. No.  Questions
1 A zero memory source has a source alphabet
S = {s1, s2, s3, s4, s5, s6, s7, s8} with
P = {0.2, 0.2, 0.15, 0.15, 0.1, 0.1, 0.05, 0.05}
Find the entropy of the source
2 A 2nd order Markov model has the following probabilities
P(0/00)=P(1/11)=0.95
P(1/00)=P(0/11)=0.05
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5
Write state and trellis diagram
3 A source has three symbols S1, S2 and S3 with probabilities 0.7, 0.2 and 0.1
respectively.
i) Find the entropy of the system
ii) Also find the entropies of the 2nd and 3rd extensions
4 A source has TEN symbols, and with code alphabet X={0, 1, 2} the code lengths are as follows
SYMBOLS CODE LENGTHS
S1 1
S2 2
S3 2
S4 2
S5 2
S6 2
S7 3
S8 3
S9 4
S10 4
Will this code satisfy Kraft's inequality?
5 A source has four symbols
S={S1, S2 ,S3, S4}
with code alphabet X = {0, 1}
Find which of the codes satisfy Kraft's inequality and which of the codes are
instantaneous codes. (A short prefix-check sketch follows the table below.)
SYMBOLS  CODE A  CODE B  CODE C  CODE D  CODE E
S1 00 0 0 0 0
S2 01 100 10 10 10
S3 10 110 11 110 110
S4 11 111 111 100 11
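Assignment question 5 (and FAQ 28) can be settled directly in code: a code is instantaneous exactly when no codeword is a prefix of another, and Kraft's inequality needs only the codeword lengths. A minimal Python sketch applied to the code sets in the table above (helper names are only illustrative):

def is_prefix_free(codewords):
    # Instantaneous (prefix) condition: no codeword is a prefix of a different codeword
    for i, a in enumerate(codewords):
        for j, b in enumerate(codewords):
            if i != j and b.startswith(a):
                return False
    return True

code_sets = {
    "CODE A": ["00", "01", "10", "11"],
    "CODE B": ["0", "100", "110", "111"],
    "CODE C": ["0", "10", "11", "111"],
    "CODE D": ["0", "10", "110", "100"],
    "CODE E": ["0", "10", "110", "11"],
}
for name, codes in code_sets.items():
    kraft = sum(2 ** (-len(c)) for c in codes)
    print(name, "Kraft sum =", kraft, "prefix-free:", is_prefix_free(codes))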
6 A 2nd order Markov model has the following probabilities
P(0/00)=P(1/11)=0.9
P(1/00)=P(0/11)=0.1
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5
i) Write state and trellis diagram
ii) Find the probabilities of each state
iii) Find the Entropy of the source
7 In a facsimile transmission of a picture, there are about 3 x 10^6 picture elements per
frame. For good reproduction, 10 brightness levels are necessary. Assume all these
levels are equally likely to occur. Find the rate of information transmission if 1
picture is to be transmitted every 2 minutes.
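The arithmetic in assignment question 7 can be laid out in a few lines: with all 10 levels equally likely, each picture element carries log2(10) bits, a frame carries 3 x 10^6 times that, and the rate follows from dividing by the transmission time. A short Python sketch using only the numbers given in the question:

import math

elements_per_frame = 3e6
levels = 10
seconds_per_picture = 2 * 60          # one picture every 2 minutes

bits_per_element = math.log2(levels)                      # about 3.32 bits
bits_per_frame = elements_per_frame * bits_per_element    # about 9.97e6 bits
rate_bits_per_second = bits_per_frame / seconds_per_picture
print(round(bits_per_frame), round(rate_bits_per_second)) # about 9.97e6 and 8.3e4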
10 In conventional telegraphy we use two symbols, the dot (.) and the dash (-). Assume the
dash is twice as long as the dot and half as probable. Find the average symbol rate and
the entropy rate.
1.7) Additional links
Module-1 General Links
http://www.youtube.com/watch?v=C-o2jcLFxyk&list=PLWMqMAYxtBM-IeOSmNkT-KEcgru8EkzCs
http://www.youtube.com/watch?v=R4OlXb9aTvQ
http://www.youtube.com/watch?v=JnJq3Py0dyM
http://www.yovisto.com/video/20224
http://www.youtube.com/watch?v=UrefKMSEuAI&list=PLE125425EC837021F
5 If a source has eight symbols, what will be the maximum entropy of the source, and why?
  http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
  http://web.eecs.utk.edu/~mclennan/Classes/420-594-F07/handouts/Lecture-04.pdf
  http://www.princeton.edu/~achaney/tmve/wiki100k/docs/Entropy_%28information_theory%29.html
6 What is source extension? What is the need of this in communication?
  https://www.cs.auckland.ac.nz/courses/compsci314s2c/resources/InfoTheory.pdf
  http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
  http://meru.cecs.missouri.edu/courses/cecs401/dict2.pdf
7 What is redundancy in a source? How is redundancy related to the entropy of a zero memory source?
  http://www.youtube.com/watch?v=JnJq3Py0dyM
  http://en.wikipedia.org/wiki/Redundancy_%28information_theory%29
  http://www.ifi.uzh.ch/ee/fileadmin/user_upload/teaching/hs09/L2_InformationTheory.pdf
8 What is the significance of the Markov source model? Mention different methods used to represent the Markov source model.
  http://www.youtube.com/watch?v=Pce7KKeUf5w
  http://www.sm.luth.se/csee/courses/sms/047/2006/lectures/Lecture4_4.pdf
  http://www.mathpages.com/home/kmath232/part2/part2.htm
  http://www.myoops.org/cocw/mit/NR/rdonlyres/Electrical-Engineering-and-Computer-Science/6-450Principles-of-Digital-Communication---IFall2002/7BFBEC46-0F71-48A6-8D2A-EA73F0C0BD03/0/L502.pdf
  http://en.wikipedia.org/wiki/Markov_information_source
9 What is a homogeneous Markov model? What is its significance?
  http://www.maths.uq.edu.au/MASCOS/Markov05/Sirl.pdf
  http://www.math.ku.dk/~susanne/kursusstokproc/ContinuousTime.pdf
  http://www.ifp.illinois.edu/~srikant/ECE567/Fall09/DTMC.pdf
10 What is an ergodic Markov model? Explain with an example.
  http://www.youtube.com/watch?v=ZjrJpkD3o1w
  http://www.math.dartmouth.edu/archive/m20x06/public_html/Lecture15.pdf
  http://sites.stat.psu.edu/~jiali/course/stat416/notes/chap4.pdf
  http://www.math.wisc.edu/~valko/courses/331/MC2.pdf
11 Briefly discuss all the properties of entropy.
  http://www.youtube.com/watch?v=LodZWzrbayY
  http://www.renyi.hu/~revesz/epy.pdf
  http://cgm.cs.mcgill.ca/~soss/cs644/projects/simon/Entropy.html
  http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
12 What is a discrete memoryless channel? What is its significance?
  https://course.ie.cuhk.edu.hk/~ierg5154/Ch7.pdf
  http://en.wikipedia.org/wiki/Channel_%28communications%29
1.8) Test your skill
Sl. No. Questions
1 Before the beginning of a horse race, a bookmaker believes that several horses entered
in the race have the following probabilities of winning.
Horse A B C D E
P(winning) 0.01 0.2 0.26 0.29 0.24
He then receives a message that one of the horses is not participating in the race. Explain
how you would assess, from an information theory point of view, the information value of
the message
i) if the horse in question is known
ii) if it is not known
2 A data source has 8 symbols that are produced in blocks of four at a rate of 600 blocks/sec.
The first symbol in each block is always the same. The remaining three are filled by any of
the 8 symbols with equal probability. Find the entropy rate of this source.
3 A 2nd order Markov model has the following probabilities
P(0/00)=P(1/11)=0.8
P(1/00)=P(0/11)=0.2
P(0/01)=P(0/10)= P(1/01)=P(1/10)=0.5 .
Write state and trellis diagram
4 For the first order Markov source with source alphabet S = {A, B, C} shown in the figure:
i) Compute the probabilities of states
ii) Find H(S) and H(S2).
5 Consider a binary block code with 2^n code words, each of the same length n. Show that the Kraft
inequality is satisfied for such a code.
6 The state diagram of a stationary Markoff source is shown in the figure. Assume all the
states of the source are equally likely.
(i) Find the entropy of each state Hi (i = 1, 2, 3).
(ii) Find the entropy of the source H.
(iii) Find G1, G2, and G3 and verify that G1 ≥ G2 ≥ G3 ≥ H.
7 An information source is modeled by a discrete ergodic Markoff random process whose
graph is shown in the figure. Find the source entropy H and the average information content per
symbol in messages containing one, two, and three symbols, that is, find G1, G2, and G3.
8 Design a system to report the heading of a collection of 500 vans. The heading is to be
quantized into three levels: heading straight (S), turning left (L), and turning right (R). This
information is to be transmitted every second. Based on the test data given below,
construct a model for the source and calculate the source entropy.
1. On the average, during a given reporting interval, 250 vans were heading straight,
150 were turning left, and 100 vans were turning right.
2. Out of 250 vans that reported heading straight during a reporting period, 150 of
them (on the average) reported going straight during the next reporting period, 50
of them reported turning left during the next period, and 50 of them reported
turning right during the next period.
3. On the average, out of 150 vans that reported as turning during a signaling period,
50 of them continued their turn during the next period and the remaining headed
straight during the next reporting period.
4. The dynamics of the vans did not allow them to change their heading from left to
right or right to left during subsequent reporting periods.