ITC Module 1 (CSE2013)
ITC: Applications
Claude Shannon, the founder of information theory
Design of Coding Schemes: the binary memoryless source (BMS) aids in developing source and channel coding
strategies (e.g., Huffman coding), minimizing redundancy and maximizing reliability.
Benchmark for Complex Systems: Acts as a baseline to study and compare more
advanced information sources with memory or multi-symbol outputs.
Measure of Information
• For a source alphabet $S = \{s_1, s_2, \ldots, s_M\}$, where $M$ is the number of symbols, each with probability distribution $P = \{p_1, p_2, \ldots, p_M\}$, the information gained from the outcome $s_i$ is
$$I(s_i) = \log_2 \frac{1}{p_i} = -\log_2 p_i \ \text{bits}$$
• $I(s_i) = 0$ for $p_i = 1$, since a certain event conveys no information.
Example on Information content
Q: In a binary channel, ‘0’ occurs with probability ¼ and ‘1’ occurs with probability
¾. Calculate the amount of information carried by each binit.
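A short worked solution, applying $I(s_i) = -\log_2 p_i$ from the definition above:
$$I(0) = \log_2 \frac{1}{1/4} = \log_2 4 = 2 \ \text{bits}, \qquad I(1) = \log_2 \frac{1}{3/4} = \log_2 \frac{4}{3} \approx 0.415 \ \text{bits}$$
The less probable symbol, ‘0’, carries more information, as expected.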
Example on Information content
Q: If there are $M$ equally likely and independent symbols, prove that the amount of
information carried by each symbol is $I(s_i) = N$ bits,
where $M = 2^N$ and $N$ is an integer.
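A sketch of the proof, using only the definition of self-information:
$$p_i = \frac{1}{M} \ \text{(equally likely)} \implies I(s_i) = \log_2 \frac{1}{p_i} = \log_2 M = \log_2 2^N = N \ \text{bits}$$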
Entropy (Average Information per Symbol)
• In practical communication, we transmit long sequences of symbols.
• Let $P = \{p_1, p_2, \ldots, p_M\}$ be the probability distribution of symbols defined over the source alphabet $S = \{s_1, s_2, \ldots, s_M\}$. The average information (entropy) is defined as
$$H(P) = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i} = -\sum_{i=1}^{M} p_i \log_2 p_i \ \text{bits/symbol}$$
• A block of $n$ independent and identically distributed (i.i.d.) random variables, each following the same probability distribution $P$, has entropy $H(P^n) = nH(P)$. For example, with $n = 5$ and $H(P) = 1$ bit, $H(P^5) = 5H(P) = 5 \times 1 = 5$ bits.
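As a quick numerical check, here is a minimal Python sketch of the entropy formula (the function name `entropy` and the sample distributions are illustrative, not from the slides):

```python
import math

def entropy(probs):
    """Shannon entropy H(P) = -sum(p * log2 p) in bits/symbol.

    Zero-probability terms are skipped, since p*log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source carries 1 bit per symbol:
print(entropy([0.5, 0.5]))       # 1.0
# A block of 5 i.i.d. symbols carries 5 * H(P) bits in total:
print(5 * entropy([0.5, 0.5]))   # 5.0
```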
Examples
• An analog signal is band-limited to $f_m = 8$ kHz and sampled at the Nyquist
rate, i.e., at $r = 2f_m = 16000$ samples per second. The samples are quantized
into 4 levels, each level representing one symbol, so there are 4 symbols with
given probabilities of occurrence. Obtain the information rate.
$$R = r \cdot H(X) = 16000 \, H(X) \ \text{bits/s}$$
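The level probabilities are not specified above, so purely as an assumed illustration: if all four levels were equally likely, then
$$H(X) = \log_2 4 = 2 \ \text{bits/symbol}, \qquad R = 16000 \times 2 = 32000 \ \text{bits/s}$$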
Examples
• A high-resolution black-and-white TV picture consists of $n$ picture
elements, each taking one of $L$ different brightness levels. Pictures are
repeated at a rate of $F$ frames per second. All picture elements are assumed
to be independent and all levels have equal likelihood of occurrence.
Calculate the average rate of information conveyed by this TV picture
source (see the general solution below).
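Here $n$, $L$ and $F$ are placeholders, since the numeric values are not given above. With equally likely levels, each picture element carries $\log_2 L$ bits, each frame carries $n \log_2 L$ bits, and $F$ frames are sent per second, so
$$R = F \cdot n \cdot \log_2 L \ \text{bits/s}$$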
• Given a telegraph source having two symbols, dot and dash. The dot
duration is $t$ s; the dash duration is $k$ times the dot duration. The
probability of the dot occurring is twice that of the dash, and the
time between symbols is $t_g$ s. Calculate the average information per
symbol of the telegraph source (worked out below).
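The durations ($t$, $k$ and $t_g$ are placeholders for values not given above) do not affect the entropy per symbol; only the probabilities matter. From $p_{\text{dot}} = 2p_{\text{dash}}$ and $p_{\text{dot}} + p_{\text{dash}} = 1$:
$$p_{\text{dot}} = \frac{2}{3}, \quad p_{\text{dash}} = \frac{1}{3}, \qquad H = \frac{2}{3}\log_2 \frac{3}{2} + \frac{1}{3}\log_2 3 \approx 0.918 \ \text{bits/symbol}$$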
• Efficiency of source: $\eta = \dfrac{H}{H_{\max}}$, where $H_{\max} = \log_2 M$ for an $M$-symbol source.
• Redundancy of source: $\gamma = 1 - \eta$
Q: A discrete source emits one of five symbols once every $T_s$ milliseconds with
probabilities ½, ¼, 1/8, 1/16, 1/16. Determine the source entropy, information
rate, source efficiency and source redundancy (see the worked solution below).
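A worked solution; $T_s$ is a placeholder for the symbol interval, whose numeric value is not given above:
$$H = \frac{1}{2}(1) + \frac{1}{4}(2) + \frac{1}{8}(3) + \frac{1}{16}(4) + \frac{1}{16}(4) = 1.875 \ \text{bits/symbol}$$
$$H_{\max} = \log_2 5 \approx 2.322 \ \text{bits/symbol}, \qquad \eta = \frac{1.875}{2.322} \approx 0.807, \qquad \gamma \approx 0.193$$
$$R = \frac{H}{T_s \times 10^{-3}} = \frac{1875}{T_s} \ \text{bits/s for } T_s \text{ in milliseconds}$$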
Q: A DMS has four symbols x1, x2, x3, x4 with probabilities P(x1) = 0.4, P(x2) = 0.3,
P(x3) = 0.2, P(x4) = 0.1.
i) Calculate H(X).
ii) Find the amount of information contained in the messages x1 x2 x1 x3 and
x4 x3 x3 x2.
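A worked solution:
$$H(X) = -(0.4\log_2 0.4 + 0.3\log_2 0.3 + 0.2\log_2 0.2 + 0.1\log_2 0.1) \approx 1.846 \ \text{bits/symbol}$$
With $I(x_1) \approx 1.322$, $I(x_2) \approx 1.737$, $I(x_3) \approx 2.322$ and $I(x_4) \approx 3.322$ bits, the messages carry
$$I(x_1 x_2 x_1 x_3) = 2(1.322) + 1.737 + 2.322 \approx 6.70 \ \text{bits}, \qquad I(x_4 x_3 x_3 x_2) = 3.322 + 2(2.322) + 1.737 \approx 9.70 \ \text{bits}$$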
Q: A source emits three messages with probabilities P1 = 0.7, P2 = 0.2, P3 = 0.1.
Calculate: source entropy, maximum entropy, source efficiency and
redundancy.
Ans: H ≈ 1.157 bits/message, Hmax ≈ 1.585 bits/message, η = 0.73, γ = 0.27
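The entropy follows from the definition, and is consistent with the quoted efficiency since $1.157 / 1.585 \approx 0.73$:
$$H = 0.7\log_2\frac{1}{0.7} + 0.2\log_2\frac{1}{0.2} + 0.1\log_2\frac{1}{0.1} \approx 0.360 + 0.464 + 0.332 \approx 1.157 \ \text{bits/message}$$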