
Information Theory and Coding

(CSE2013)
Module 1
ITC: Applications

Claude Shannon

• The capacity can be computed simply from the noise characteristics of the channel.
• Shannon further argued that random processes such as music and speech have an irreducible complexity below which the signal cannot be compressed.
DATA and Information
Uncertainty and Information
• Suppose an experiment involves observing an output (e.g., tossing a fair die). Every time we throw the die, the outcome is one of the symbols {1, 2, 3, 4, 5, 6}.
• Consider the event S = sk. Before the event, there is an amount of uncertainty.
• During the event, there is an amount of surprise.
• After the occurrence of the event, there is a gain in the amount of information (which may be viewed as the resolution of uncertainty).
Information Sources
• In a communication system, an information source is a device that produces messages; it can be continuous or discrete.
• Continuous sources can be converted to discrete sources by sampling
and quantization.
• Therefore, a discrete information source is a source which has only a finite set of symbols as its possible outputs.
• Ex: Morse Code (a small timing sketch follows this list):
• A dot lasts for one unit of time.
• A dash lasts for three units of time.
• Spaces between dots and dashes in the same letter last for one unit.
• Spaces between letters last for three units.
• Spaces between words last for seven units.
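As a quick illustration of these timing rules, here is a minimal Python sketch that counts the time units needed to send a short word; the two-letter MORSE table is a small illustrative subset, not part of the original slides.

```python
# Count Morse-code time units for a word using the timing rules listed above.
MORSE = {"S": "...", "O": "---"}   # illustrative subset of the Morse alphabet

def word_duration(word):
    total = 0
    for i, letter in enumerate(word):
        code = MORSE[letter]
        for j, mark in enumerate(code):
            total += 1 if mark == "." else 3   # dot = 1 unit, dash = 3 units
            if j < len(code) - 1:
                total += 1                     # gap between marks within a letter = 1 unit
        if i < len(word) - 1:
            total += 3                         # gap between letters = 3 units
    return total

print(word_duration("SOS"))   # 5 + 3 + 11 + 3 + 5 = 27 units
```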
Memoryless Sources
We assume that the symbols emitted by the source during successive intervals are statistically independent, i.e., each symbol produced is independent of the previous symbols. Such a source is called a Discrete Memoryless Source (DMS).
Ex:
When the symbols are binary 1's and 0's, the source is called a binary memoryless source (BMS).
Binary Memoryless Sources (BMS)
Simplicity and Foundation for Digital Communication: BMS provides a basic model for
binary systems, representing the transmission of 0s and 1s in digital communication
systems.
Entropy and Data Compression: BMS helps measure information content (entropy) and
forms the basis for designing efficient data compression algorithms.

Modeling Error Probabilities: Used in error analysis of communication systems, such as in the Binary Symmetric Channel (BSC), enabling the design of error-correcting codes.

Design of Coding Schemes: BMS aids in developing source and channel coding
strategies (e.g., Huffman coding), minimizing redundancy and maximizing reliability.

Benchmark for Complex Systems: Acts as a baseline to study and compare more
advanced information sources with memory or multi-symbol outputs.
Measure of Information
• For a source alphabet S = {s1, s2, …, sK}, where K is the number of symbols, with probability distribution P = {p1, p2, …, pK}, the information gained from the outcome S = si is

I(S = si) = −log_b p(si)

• I(S = si) = 0 for p(si) = 1: a certain outcome carries no information.

• I(S = si, S = sj) = I(S = si) + I(S = sj) if the random variables si and sj are statistically independent.


Measure of Information
• Unit: the unit of I(S = si) = −log_b p(si) depends on the base b:
• bits if b = 2,
• hartleys or decits if b = 10,
• nats (natural units) if b = e.
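A minimal Python sketch of this definition, with the logarithm base selecting the unit:

```python
import math

def information(p, base=2):
    """Self-information I = -log_b(p) of an outcome that occurs with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

print(information(0.5))               # 1.0 bit
print(information(0.5, base=10))      # ~0.301 hartleys (decits)
print(information(0.5, base=math.e))  # ~0.693 nats
```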
Example on Information content
Q: A source produces one of four possible symbols, having probabilities P(x1) = ½, P(x2) = ¼, P(x3) = P(x4) = ⅛. Obtain the information content of each of these symbols.

Ans: I(x1) = −log2(½) = 1 bit, I(x2) = −log2(¼) = 2 bits, I(x3) = I(x4) = −log2(⅛) = 3 bits.
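The same values checked with a short Python snippet:

```python
import math

probs = {"x1": 1/2, "x2": 1/4, "x3": 1/8, "x4": 1/8}
for sym, p in probs.items():
    print(sym, -math.log2(p), "bits")   # x1: 1.0, x2: 2.0, x3: 3.0, x4: 3.0
```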
Example on Information content
Q: In a binary channel, '0' occurs with probability ¼ and '1' occurs with probability ¾. Calculate the amount of information carried by each binit.
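Worked answer, following directly from I = −log2 p: I('0') = −log2(¼) = 2 bits and I('1') = −log2(¾) ≈ 0.415 bits; the less likely binit carries more information.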
Example on Information content
Q: If there are M equally likely and independent symbols, then prove that the amount of information carried by each symbol is I(si) = N bits, where M = 2^N and N is an integer.
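Proof sketch: since the M symbols are equally likely, p(si) = 1/M = 2^(−N), so I(si) = −log2(2^(−N)) = N bits.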
Entropy (Average Information per
symbol)
• In practical communication, we transmit long sequences of symbols.
• Let P = {p1, p2, …, pK} be the probability distribution of the symbols defined over the source alphabet S = {s1, s2, …, sK}. The average information is defined as

H(P) = −Σ pi log2 pi (sum over i = 1 … K)

The units of H(P) are bits/symbol.

• H(P) ≥ 0 and H(P) ≤ log2 K.
• H(P) = log2 K (maximum entropy) if and only if all symbols have equal probability pi = 1/K.

Note: H(P) = 0 if and only if pi = 1 for some symbol si (a certain outcome carries no information).
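A minimal Python sketch of this definition (assuming the probabilities sum to 1; terms with pi = 0 are skipped, using the convention 0·log 0 = 0):

```python
import math

def entropy(probs, base=2):
    """Average information H(P) = -sum(p_i * log_b(p_i)); bits/symbol for base 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair die: six equally likely outcomes, so H reaches its maximum log2(6).
print(entropy([1/6] * 6))   # ~2.585 bits/symbol
print(math.log2(6))         # ~2.585, confirming H = log2(K) for the uniform distribution
```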


Entropy (Average Information per
symbol)
The inequality 0 ≤ H(P) ≤ log2 K holds for the entropy H(P) of any probability distribution P = (p1, p2, …, pK) over K possible outcomes.
Entropy of Binary Memoryless
Sources (BMS)
• Let S be a BMS with probability distribution P(1) = p and P(0) = 1 − p. Then the entropy of the BMS is:

H(p) = −p log2 p − (1 − p) log2(1 − p) bits/symbol
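A minimal sketch of this binary entropy function in Python (by convention H = 0 at p = 0 or p = 1):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p); zero at p = 0 or 1, maximum 1 bit at p = 0.5."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0 bit/symbol: the most uncertain binary source
print(binary_entropy(0.1))   # ~0.469 bits/symbol: a heavily biased source carries less information
```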
Information Rate
• If the time rate at which the source emits symbols is r (symbols/sec), then the information rate of the source is given by
R = r · H(P) bits/second
• H(P) is the entropy, or average information per symbol.
• r is the rate at which symbols are generated.
• Consider a mobile communication system: A 4G LTE network transmits at
a symbol rate of 15,000 symbols per second.
• Each symbol can represent 6 bits of information (due to advanced
modulation schemes like 64-QAM).
• The information rate is: Information Rate = Symbol Rate × Bits per Symbol
• Information Rate = 15,000 symbols/sec × 6 bits/symbol = 90,000 bps (90 kbps)
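The same arithmetic as a short Python sketch (here the 6 bits per 64-QAM symbol plays the role of H):

```python
import math

symbol_rate = 15_000                 # r: symbols per second
bits_per_symbol = math.log2(64)      # 64-QAM carries log2(64) = 6 bits per symbol
info_rate = symbol_rate * bits_per_symbol
print(info_rate)                     # 90000.0 bps = 90 kbps
```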
5G New Radio (5G NR) Information Rate
Discrete Memoryless Source (DMS)
• Consider blocks of symbols rather than individual symbols with each
block consisting of n successive source symbols.

• refers to the entropy of a sequence of 𝑛 independent and identically

probability distribution 𝑃.
distributed (i.i.d.) random variables, each following the same

• If a single variable has an entropy 𝐻(𝑃)=1 bit, then a sequence of 5

entropy:𝐻(𝑃^5)=5H(P)=5x1=5⋅.
independent and identically distributed variables has
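A small numeric check of H(Pⁿ) = n·H(P) in Python, enumerating all 2⁵ equally likely blocks of five i.i.d. fair bits:

```python
import math
from itertools import product

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_single = [0.5, 0.5]                        # one fair bit: H(P) = 1 bit
blocks = list(product(p_single, repeat=5))   # all 2^5 = 32 blocks of 5 i.i.d. bits
p_blocks = [math.prod(b) for b in blocks]    # each block has probability (1/2)^5 = 1/32

print(entropy(p_single))   # 1.0
print(entropy(p_blocks))   # 5.0 = 5 * H(P)
```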
Examples
• An analog signal is band-limited to fm = 8 kHz and sampled at the Nyquist rate (2 × 8 kHz = 16,000 samples/sec). The samples are quantized into 4 levels, each level representing one symbol; thus there are 4 symbols. The probabilities of occurrence of these 4 levels are p1, p2, p3 and p4 respectively. Obtain the information rate.
R = H × 16,000 bits/second
Examples
• A high-resolution black-and-white TV picture consists of about n picture elements and L different brightness levels. Pictures are repeated at a rate of f frames per second. All picture elements are assumed to be independent, and all levels have equal likelihood of occurrence. Calculate the average rate of information conveyed by this TV picture source.
• Given a telegraph source having two symbols, dot and dash. The dot duration is t seconds. The dash duration is k times the dot duration. The probability of the dot occurring is twice that of the dash, and the time between symbols is ts seconds. Calculate the average information per symbol of the telegraph source.
• Efficiency of Source: η = H(P) / Hmax, where Hmax = log2 K.
• Redundancy of Source: γ = 1 − η
Q: A discrete source emits one of five symbols once every T milliseconds with probabilities ½, ¼, ⅛, 1/16, 1/16. Determine the source entropy, information rate, source efficiency and source redundancy.
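A worked sketch for this exercise in Python; the symbol interval is kept as a parameter T, and the value shown (1 ms) is only an assumed placeholder, not a figure from the slide:

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/16]
H = -sum(p * math.log2(p) for p in probs)   # source entropy: 1.875 bits/symbol
H_max = math.log2(len(probs))               # log2(5) ~ 2.322 bits/symbol
efficiency = H / H_max                      # ~0.807
redundancy = 1 - efficiency                 # ~0.193

T = 1e-3          # assumed symbol interval in seconds (placeholder, not from the slide)
R = H / T         # information rate in bits/second for that assumed interval

print(H, H_max, efficiency, redundancy, R)
```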
Q: A DMS has four symbols x1, x2, x3, x4 with probability P(x1)=0.4, P(x2)=0.3,
P(x3)= 0.2, P(x4)=0.1.
i) Calculate H(x).
ii) Find the amount of information contained in the messages x1 x2 x1 x3 and x4 x3 x3 x2.
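A short Python sketch for this exercise; for a memoryless source, the information in a message is the sum of the per-symbol information:

```python
import math

P = {"x1": 0.4, "x2": 0.3, "x3": 0.2, "x4": 0.1}
I = {s: -math.log2(p) for s, p in P.items()}    # per-symbol information in bits

H = sum(p * I[s] for s, p in P.items())         # i)  H(x) ~ 1.846 bits/symbol
msg1 = ["x1", "x2", "x1", "x3"]
msg2 = ["x4", "x3", "x3", "x2"]

print(H)                         # ~1.846
print(sum(I[s] for s in msg1))   # ii) ~6.70 bits
print(sum(I[s] for s in msg2))   #     ~9.70 bits
```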
Q: A source emits three messages with probabilities P1 = 0.7, P2 = 0.2, P3 = 0.1.
Calculate: source entropy, maximum entropy, source efficiency and redundancy.
Ans: H ≈ 1.157 bits/message, Hmax = 1.585 bits/message, η ≈ 0.73, γ ≈ 0.27
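A quick numeric check of this answer in Python (the entropy works out to about 1.157 bits/message, which matches the stated efficiency of 0.73):

```python
import math

P = [0.7, 0.2, 0.1]
H = -sum(p * math.log2(p) for p in P)        # ~1.157 bits/message
H_max = math.log2(len(P))                    # ~1.585 bits/message
print(H, H_max, H / H_max, 1 - H / H_max)    # ~1.157, ~1.585, ~0.73, ~0.27
```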
