EC401 Information Theory & Coding: Module 2


Syllabus: Module 2
 Noiseless Coding Theorem
 Construction of Basic Source Codes
 Shannon-Fano Algorithm
 Huffman Coding
 Channel Capacity
 Redundancy and Efficiency of a Channel
 Binary Symmetric Channel (BSC), Binary Erasure Channel (BEC)
 Capacity of Band-Limited Gaussian Channels

Construction of Basic Source Codes
 Huffman Coding
 Shannon-Fano Algorithm


Huffman Coding
1. The source symbols are listed in order of decreasing probability. The two source symbols of lowest probability are assigned a 0 and a 1. This part of the step is referred to as the splitting stage.
2. These two source symbols are regarded as being combined into a new source symbol with probability equal to the sum of the two original probabilities. (The list of source symbols, and therefore the source statistics, is thereby reduced in size by one.) The probability of the new symbol is placed in the list in accordance with its value.
3. The procedure is repeated until we are left with a final list of only two source symbols, to which a 0 and a 1 are assigned.
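As an illustration of the procedure above, here is a minimal Python sketch (not part of the original slides). The function name and the 5-symbol probability table in the usage example are assumptions chosen for demonstration, not the example from the following slides.

import heapq
import math
from itertools import count

def huffman_code(probabilities):
    """Build Huffman code words for a {symbol: probability} table by
    repeatedly merging the two least probable entries (the procedure above)."""
    tiebreak = count()  # unique counter so entries with equal probability still compare cleanly
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, group0 = heapq.heappop(heap)   # two symbols (or groups) of lowest probability
        p1, _, group1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in group0.items()}          # assign 0 to one group ...
        merged.update({s: "1" + c for s, c in group1.items()})    # ... and 1 to the other
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))   # combined symbol re-enters the list
    return heap[0][2]

# Usage with an assumed probability table (illustration only)
probs = {"s0": 0.4, "s1": 0.2, "s2": 0.2, "s3": 0.1, "s4": 0.1}
codes = huffman_code(probs)
avg_length = sum(probs[s] * len(codes[s]) for s in probs)       # average code word length L
entropy = -sum(p * math.log2(p) for p in probs.values())        # source entropy H(S)
print(codes, avg_length, entropy)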



 Q) The five symbols of the alphabet of a discrete memoryless source and their probabilities are shown below (the probability table appears as a figure in the original slides). Using the Huffman algorithm, derive the code words and hence calculate the average code word length and the entropy of the source.



 Solution:
(The Huffman tree construction and the resulting code-word table appear as figures in the original slides.)
 Average code word length: L = Σ pk lk bits/symbol, where lk is the length of the code word assigned to symbol sk.
 Entropy: H(S) = −Σ pk log2(pk) bits/symbol.



Limitations of Huffman Coding
 The Huffman encoding process (i.e., the Huffman tree) is not unique.
 First, at each splitting stage in the construction of a Huffman code, there is arbitrariness in the way a 0 and a 1 are assigned to the last two source symbols.
 Second, ambiguity arises when the probability of a combined symbol (obtained by adding the last two probabilities pertinent to a particular step) is found to equal another probability in the list.



 As a measure of the variability in code-word lengths of a source code, we define the variance of the code-word lengths over the ensemble of source symbols as σ² = Σ pk (lk − L)², where pk and lk are the probability and code-word length of the k-th symbol and L is the average code-word length.
 It is usually found that when a combined symbol is moved as high as possible in the list, the resulting Huffman code has a significantly smaller variance than when it is moved as low as possible. On this basis, it is reasonable to choose the former Huffman code over the latter.



Limitations of Huffman Coding
 A drawback of the Huffman code is that it requires knowledge of the symbol probabilities. For real-time applications, Huffman encoding becomes impractical because the source statistics are not always known in advance.



Shannon-Fano Algorithm
 List the source symbols in order of decreasing probability.
 Partition this ensemble into two almost equiprobable groups. Assign a '0' to one group and a '1' to the other. These form the starting symbols of the code words.
 Repeat step 2 on each subgroup until every subgroup contains only one source symbol; this determines the succeeding symbols of the code words.
 Read the code words directly.
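The following is a minimal Python sketch of these steps (not part of the original slides); the function name shannon_fano is an assumption. It expects the symbols already listed in decreasing order of probability.

def shannon_fano(symbols):
    """Assign Shannon-Fano code words to a list of (symbol, probability)
    pairs that is already sorted in decreasing order of probability."""
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Choose the split point that makes the two subgroups as equiprobable as possible
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_i, best_diff = i, diff
        upper, lower = group[:best_i], group[best_i:]
        for s, _ in upper:       # assign '0' to one group ...
            codes[s] += "0"
        for s, _ in lower:       # ... and '1' to the other
            codes[s] += "1"
        split(upper)             # repeat on each subgroup
        split(lower)

    split(list(symbols))
    return codes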
Shannon-Fano Algorithm
 Consider the message ensemble S = {S1, S2, S3, S4, S5, S6, S7, S8} with P = {1/4, 1/4, 1/8, 1/8, 1/16, 1/16, 1/16, 1/16}.
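Applying the sketch above to this ensemble (or carrying out the partitioning by hand) gives code-word lengths 2, 2, 3, 3, 4, 4, 4, 4; one consistent assignment is:

codes = shannon_fano([("S1", 1/4), ("S2", 1/4), ("S3", 1/8), ("S4", 1/8),
                      ("S5", 1/16), ("S6", 1/16), ("S7", 1/16), ("S8", 1/16)])
# {'S1': '00', 'S2': '01', 'S3': '100', 'S4': '101',
#  'S5': '1100', 'S6': '1101', 'S7': '1110', 'S8': '1111'}
# Average length = 2(1/4)(2) + 2(1/8)(3) + 4(1/16)(4) = 2.75 bits/symbol,
# which equals H(S), so the code is optimal for this dyadic source.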


Shannon-Fano Coding
 Consider the message ensemble S = {S1, S2, S3, S4, S5, S6, S7} with P = {0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04}. Construct the Shannon-Fano code.
 This is a case where different valid solutions exist, as illustrated below.
 Refer to the notes.
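For example, the first partition can be made either as {S1} versus {S2, ..., S7} (0.4 against 0.6) or as {S1, S2} versus {S3, ..., S7} (0.6 against 0.4); both splits are equally close to equiprobable, so more than one valid Shannon-Fano code exists for this source.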



Noiseless Coding Theorem
 Shannon's First Theorem / Source Coding Theorem
 Theorem:
 Let S be a DMS with S = {s1, s2, ..., sq} and P = {p1, p2, ..., pq}. It is possible to construct a binary code that satisfies the prefix property and whose average length L satisfies the inequality H(S) ≤ L < H(S) + 1.
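For instance, the dyadic ensemble used in the Shannon-Fano illustration above has H(S) = 2.75 bits/symbol, and the prefix code constructed for it achieves L = 2.75 bits/symbol exactly, the lower end of this bound.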



Noiseless Coding Theorem
 Refer to the notes (the proof appears as figures in the original slides).



Discrete Memoryless Channels
 A discrete memoryless channel is a statistical model with an input X and an output Y that is a noisy version of X; both X and Y are random variables.
 Every unit of time, the channel accepts an input symbol X selected from a finite input alphabet and, in response, emits an output symbol Y from a finite output alphabet.
 The channel is discrete and memoryless: the output at any time depends only on the input at that time, not on earlier inputs or outputs.



 The channel is described in terms of:
 Input alphabet X = {x1, x2, ..., xJ}
 Output alphabet Y = {y1, y2, ..., yK}
 Set of transition probabilities p(yk/xj) = P(Y = yk given X = xj), for all j and k
 Transition probability matrix (channel matrix) P, whose (j, k) entry is p(yk/xj)



Discrete Memoryless Channels
 The sum of the elements along any row of the channel matrix equals 1: Σk p(yk/xj) = 1 for every input xj (some output symbol must occur for each input).



Channel Capacity
 Shannon defines the channel capacity C of a communication channel as the maximum value of the transinformation (mutual information) I(X,Y), the maximum being taken over all possible input probability distributions.
 C = Max I(X,Y) = Max [H(Y) − H(Y/X)]
 For a symmetric channel, H(Y/X) does not depend on the input distribution, so
 C = Max [H(Y)] − H(Y/X) = log2 K − h'
 where K is the number of output symbols and h' = H(Y/X) is the entropy of any one row of the channel matrix.
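To make the definition concrete, here is a minimal numerical sketch (not part of the original slides) that approximates C by sweeping the input distribution of a two-input discrete memoryless channel; the function names and the example values p = 0.1 and a = 0.2 are assumptions chosen for illustration.

import numpy as np

def mutual_information(p_x, channel):
    """Return I(X,Y) in bits for input distribution p_x and channel matrix
    channel[j, k] = P(Y = y_k | X = x_j)."""
    p_xy = p_x[:, None] * channel              # joint distribution P(x_j, y_k)
    p_y = p_xy.sum(axis=0)                     # output distribution P(y_k)
    mask = p_xy > 0                            # skip zero-probability terms, which contribute nothing
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

def capacity_two_input(channel, steps=10001):
    """Approximate C = Max I(X,Y) for a two-input DMC by sweeping P(x_0)."""
    best = 0.0
    for p0 in np.linspace(0.0, 1.0, steps):
        best = max(best, mutual_information(np.array([p0, 1.0 - p0]), channel))
    return best

# Binary symmetric channel with assumed crossover probability p = 0.1
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])
print(capacity_two_input(bsc))   # about 0.531 bit/use, i.e. 1 - H(0.1)

# Binary erasure channel with assumed erasure probability a = 0.2 (outputs: 0, erasure, 1)
a = 0.2
bec = np.array([[1 - a, a, 0.0],
                [0.0, a, 1 - a]])
print(capacity_two_input(bec))   # about 0.8 bit/use, i.e. 1 - a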



Channel Capacity of Binary Symmetric Channel (BSC) and Binary Erasure Channel (BEC)
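The detailed derivations appear as figures in the original slides. For completeness, the standard results (stated here, not taken from the slides) are:
 For a BSC with crossover probability p: C = 1 − H(p) = 1 + p log2 p + (1 − p) log2(1 − p) bits per channel use, achieved with equiprobable inputs.
 For a BEC with erasure probability α: C = 1 − α bits per channel use, also achieved with equiprobable inputs.
Both agree with the numerical sketch given under Channel Capacity above.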




Channel Efficiency and Channel Redundancy
 Channel efficiency is defined as the ratio of the actual rate of mutual information to the maximum possible rate (the channel capacity).
 It is represented by η:
 η = I(X,Y) / C
 where I(X,Y) is the mutual information and C is the channel capacity.
 Redundancy: E = 1 − η
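As a purely illustrative example (numbers not from the slides): if a channel conveys I(X,Y) = 0.8 bit per symbol while its capacity is C = 1 bit per symbol, then η = 0.8/1 = 0.8 and the redundancy is E = 1 − 0.8 = 0.2.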



Channel Encoding Theorem
 Given a source of M equally likely messages, with M >> 1, generating information at a rate R, and a channel with capacity C: if R ≤ C, then there exists a coding technique such that the output of the source may be transmitted over the channel with a probability of error in the received message that can be made arbitrarily small.
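As a concrete illustration (not from the slides): a binary symmetric channel with crossover probability p = 0.1 has C = 1 − H(0.1) ≈ 0.531 bit per channel use, so any source rate R ≤ 0.531 bit per channel use can, with suitable coding, be received with arbitrarily small probability of error.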



Previous Year Questions
 Explain source coding and its properties.
 State Shannon's channel capacity theorem. What is its significance in digital communication?
 Explain the Huffman coding algorithm.
 Explain the construction of the code tree in Huffman coding.
 State and explain the noiseless coding theorem. Narrate its limitations.
 Explain different source codes. Write the condition for a code to be optimal.
 Compare the different channels used in communication.



Previous Year Questions
 State and prove Shannon's first theorem. Discuss its limitations.
 Define channel capacity. Write the expression.
 State and prove the source encoding theorem.
 Find the capacity of the binary symmetric channel.
 Derive the capacity of the BEC.


