
UEC 1404

PRINCIPLES OF COMMUNICATION
SYSTEMS
UNIT - V
INFORMATION THEORY
Source Coding
Objective

1. To understand source coding techniques.

2. To solve problems related to source coding techniques.


Source coding

• Source encoder: an algorithm responsible for removing the redundant portion of the information.
• Source coding techniques
The various source coding techniques are:
i. Shannon Fano coding
ii. Huffman coding
iii. Lempel Ziv coding
iv. Prefix coding
Shannon Fano coding
• In Shannon Fano coding, fewer bits are assigned to the more probable symbols and more bits are assigned to the less probable symbols.
• Steps involved in Shannon Fano coding:
i) The source symbols are written in order of decreasing probability.
ii) The message set is partitioned into two nearly equi-probable groups [A1] and [A2], i.e. the total probability of group [A1] should be as close as possible to that of group [A2].
iii) After partitioning, “0” is assigned to each message contained in
group [A1] and “1” to each message contained in group [A2].
iv) The same procedure is repeated for groups [A1] and [A2]: group [A1] is divided into two nearly equi-probable groups [A11] and [A12], and group [A2] into [A21] and [A22]. The code words in [A11] then start with 00, those in [A12] with 01, those in [A21] with 10 and those in [A22] with 11.
v) The same procedure is repeated until each group contains only
one message.
vi) The coding efficiency η for Shannon Fano coding is given by the expression

        η = H(S) / L̄

    where H(S) is the entropy of the source S,

        H(S) = Σ p_k log2(1/p_k)    (sum over k = 0, 1, …, K−1)

    and L̄ is the average code word length,

        L̄ = Σ p_k l_k    (sum over k = 0, 1, …, K−1)

    where l_k is the length of the binary code word assigned to the k-th symbol.
vii) The redundancy can then be calculated as

        Redundancy = 1 − η
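
The partitioning procedure in steps i)–v) can be sketched in a few lines of Python. This is only an illustrative sketch, not code from the slides: the function name shannon_fano and the {symbol: probability} input format are our own choices, and ties between equally balanced splits are broken in favour of the larger first group so that the output matches the worked examples that follow.

# Illustrative sketch of the Shannon Fano partitioning steps (not from the slides).
def shannon_fano(probabilities):
    """Return a {symbol: code word} dict for a {symbol: probability} dict."""
    # Step i): write the source symbols in order of decreasing probability.
    symbols = sorted(probabilities, key=probabilities.get, reverse=True)
    codes = {s: "" for s in symbols}

    def partition(group):
        # Step v): stop once a group contains only one message.
        if len(group) <= 1:
            return
        # Step ii): choose the split that makes the two sub-groups most nearly
        # equi-probable.  Ties go to the larger first group (this reproduces the
        # splits in the worked examples); rounding guards against float noise.
        total = sum(probabilities[s] for s in group)
        running, best_k, best_diff = 0.0, 1, float("inf")
        for k in range(1, len(group)):
            running += probabilities[group[k - 1]]
            diff = round(abs(total - 2.0 * running), 9)
            if diff <= best_diff:
                best_diff, best_k = diff, k
        group_a1, group_a2 = group[:best_k], group[best_k:]
        # Step iii): assign "0" to the first sub-group and "1" to the second.
        for s in group_a1:
            codes[s] += "0"
        for s in group_a2:
            codes[s] += "1"
        # Step iv): repeat the same procedure inside each sub-group.
        partition(group_a1)
        partition(group_a2)

    partition(symbols)
    return codes

For instance, shannon_fano({"s0": 0.4, "s1": 0.2, "s2": 0.2, "s3": 0.1, "s4": 0.1}) returns the code words 00, 01, 10, 110 and 111 that are derived step by step in problem 1 below.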
Solved Problems

• 1. Find the Shannon Fano code for a discrete memoryless source with symbol probabilities 0.1, 0.1, 0.2, 0.2, 0.4.
Answer
• Step 1:
Arrange the probabilities in descending order:
0.4, 0.2, 0.2, 0.1, 0.1
• Step 2:
i) Divide the symbols into two nearly equi-probable groups A1 and A2:

   Group A1: 0.4, 0.2           (total 0.6)
   Group A2: 0.2, 0.1, 0.1      (total 0.4)

ii) Assign "0" to the group A1 symbols and "1" to the group A2 symbols:

   Group A1: 0.4 → 0
             0.2 → 0
   Group A2: 0.2 → 1
             0.1 → 1
             0.1 → 1

iii) Divide group A1 into two equi-probable groups A11 and A12:

   Group A11: 0.4 → 0
   Group A12: 0.2 → 0
   Group A2:  0.2 → 1
              0.1 → 1
              0.1 → 1

iv) Assign "0" to the symbols in group A11 and "1" to the symbols in group A12:

   Group A11: 0.4 → 0 0
   Group A12: 0.2 → 0 1
   Group A2:  0.2 → 1
              0.1 → 1
              0.1 → 1

v) Further division of groups A11 and A12 is not possible. Hence divide group A2 into two equi-probable groups A21 and A22:

   Group A11: 0.4 → 0 0
   Group A12: 0.2 → 0 1
   Group A21: 0.2 → 1
   Group A22: 0.1 → 1
              0.1 → 1

vi) Assign "0" to the symbols in group A21 and "1" to the symbols in group A22:

   Group A11: 0.4 → 0 0
   Group A12: 0.2 → 0 1
   Group A21: 0.2 → 1 0
   Group A22: 0.1 → 1 1
              0.1 → 1 1

vii) Further division of group A21 is not possible. Hence divide group A22 into two equi-probable groups A221 and A222:

   Group A11:  0.4 → 0 0
   Group A12:  0.2 → 0 1
   Group A21:  0.2 → 1 0
   Group A221: 0.1 → 1 1
   Group A222: 0.1 → 1 1

viii) Assign "0" to the symbol in group A221 and "1" to the symbol in group A222:

   Group A11:  0.4 → 0 0
   Group A12:  0.2 → 0 1
   Group A21:  0.2 → 1 0
   Group A221: 0.1 → 1 1 0
   Group A222: 0.1 → 1 1 1

• Step 3:
Reading off the bits assigned in steps ii) to viii), the code words are:

   0.4 → 00,   0.2 → 01,   0.2 → 10,   0.1 → 110,   0.1 → 111

• Step 4:
To find the efficiency η, we must calculate the average code word length L̄ and the entropy H(S):

        η = H(S) / L̄

where

        L̄ = Σ p_k l_k    (sum over k = 0, 1, …, 4)
           = p_0 l_0 + p_1 l_1 + p_2 l_2 + p_3 l_3 + p_4 l_4
           = (0.4)(2) + (0.2)(2) + (0.2)(2) + (0.1)(3) + (0.1)(3)
           = 0.8 + 0.4 + 0.4 + 0.3 + 0.3

        L̄ = 2.2 bits/symbol

        H(S) = Σ p_k log2(1/p_k)    (sum over k = 0, 1, …, 4)
             = p_0 log2(1/p_0) + p_1 log2(1/p_1) + p_2 log2(1/p_2) + p_3 log2(1/p_3) + p_4 log2(1/p_4)
             = 0.4 log2(1/0.4) + 0.2 log2(1/0.2) + 0.2 log2(1/0.2) + 0.1 log2(1/0.1) + 0.1 log2(1/0.1)
             = (0.4)(1.3219) + (0.2)(2.3219) + (0.2)(2.3219) + (0.1)(3.3219) + (0.1)(3.3219)
               [evaluated using log2 x = log10 x / log10 2]
             = 0.5288 + 0.4644 + 0.4644 + 0.3322 + 0.3322

        H(S) ≈ 2.12 bits/symbol

Therefore

        η = H(S) / L̄ = 2.12 / 2.2 ≈ 0.96

        η ≈ 96%
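
Step 4 can be double-checked numerically with a short Python snippet that uses only the standard library; the variable names below are our own.

# Verify the average length, entropy and efficiency of the code 00, 01, 10, 110, 111.
from math import log2

probs   = [0.4, 0.2, 0.2, 0.1, 0.1]
lengths = [2, 2, 2, 3, 3]                            # code word lengths

L_bar = sum(p * l for p, l in zip(probs, lengths))   # average code word length
H_S   = sum(p * log2(1 / p) for p in probs)          # source entropy
eta   = H_S / L_bar                                  # coding efficiency

print(round(L_bar, 2), round(H_S, 2), round(eta, 2)) # prints: 2.2 2.12 0.96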
2. A discrete memoryless source has five messages S1, S2, S3, S4 and S5 with probabilities

        p(S1) = 0.4,  p(S2) = 0.19,  p(S3) = 0.16,  p(S4) = 0.15,  p(S5) = 0.1.

Construct the Shannon-Fano code and calculate the code efficiency.
Step 1:
Arrange the probabilities in decreasing order: 0.4, 0.19, 0.16, 0.15, 0.1

Applying the same partitioning procedure as in problem 1:

   Group A11:  0.4  → 0 0
   Group A12:  0.19 → 0 1
   Group A21:  0.16 → 1 0
   Group A221: 0.15 → 1 1 0
   Group A222: 0.1  → 1 1 1

L̄ = 2.25 bits/symbol
H(S) ≈ 2.15 bits/symbol

η = H(S) / L̄ = 2.15 / 2.25 ≈ 0.955
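
Assuming the shannon_fano sketch given after the coding steps, this result can be reproduced in the same way (the symbol labels s1–s5 are our own):

from math import log2

p = {"s1": 0.4, "s2": 0.19, "s3": 0.16, "s4": 0.15, "s5": 0.1}
codes = shannon_fano(p)                           # gives 00, 01, 10, 110, 111 as in the table above
L_bar = sum(p[s] * len(codes[s]) for s in p)      # 2.25 bits/symbol
H_S   = sum(q * log2(1 / q) for q in p.values())  # about 2.15 bits/symbol
print(round(H_S / L_bar, 3))                      # about 0.955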

• 3) A discrete memoryless source emits 6 symbols with probabilities 0.3, 0.25, 0.05, 0.12, 0.08 and 0.2 respectively. Construct the Shannon Fano code and compute its efficiency.
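
As a starting point for this exercise, the same shannon_fano sketch can be applied directly to the given probabilities (symbol labels s1–s6 are our own); the resulting code words and efficiency are left for the reader to verify.

from math import log2

p3 = {"s1": 0.3, "s2": 0.25, "s3": 0.05, "s4": 0.12, "s5": 0.08, "s6": 0.2}
codes3 = shannon_fano(p3)                           # reuses the sketch from the coding steps
L_bar3 = sum(p3[s] * len(codes3[s]) for s in p3)
H_3    = sum(q * log2(1 / q) for q in p3.values())
print(codes3, round(L_bar3, 2), round(H_3, 2), round(H_3 / L_bar3, 2))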
Summary

The following topics were discussed:

• Source coding techniques
• Problems related to source coding techniques
Test Your Understanding

State the source coding theorem.


How do you evaluate the efficiency of a source coder?
Thank you!
