Shannon-Fano Source Coding

This document compares the code efficiency of the Shannon-Fano and Huffman coding algorithms. It provides Scilab programs that compute the average code length, entropy, and efficiency of each method on a sample set of four messages with given probabilities. On this data set both codes achieve 100% efficiency, with an average code length of 1.75 bits equal to the source entropy of 1.75 bits.


Aim: To compare code efficiency of Shannon-Fano and Huffman coding
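Both programs evaluate the same three quantities; for reference (the formulas are standard and are not spelled out in the original listings), with message probabilities $p_i$ and codeword lengths $n_i$:

$$L = \sum_{i=1}^{m} p_i\,n_i, \qquad H = \sum_{i=1}^{m} p_i \log_2 \frac{1}{p_i}, \qquad \eta = \frac{H}{L} \times 100\%,$$

where $L$ is the average code length in bits/message, $H$ is the source entropy, and $\eta$ is the code efficiency; the redundancy is $100\% - \eta$.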

Shannon-Fano Code
clc;
clear;
//close all;
m=input('Enter the no. of message ensembles : ');
z=[];
h=0;l=0;
disp('Enter the probabilities for Shannon-Fano coding');
for i=1:m
printf('Ensemble %d\n',i);
p(i)=input('');
end
p=gsort(p);
//Sorting the probabilities in descending order
//Finding each alpha value (cumulative probability)
a(1)=0;
for j=2:m
a(j)=a(j-1)+p(j-1);
end
printf('\n Alpha Matrix');
disp(a);
//Finding each code length: n(i)=ceil(-log2(p(i)))
for i=1:m
n(i)=ceil(-1*(log2(p(i))));
end
printf('\n Code length matrix');
disp(n);
//Computing each codeword: first n(i) bits of the binary expansion of a(i)
for i=1:m
frac=a(i);
for j=1:n(i)
frac=frac*2;
c=floor(frac);
//c is the next binary digit of the expansion
frac=frac-c;
z=[z c];
end
printf('Codeword %d',i);
disp(z);
z=[];
end
//Computing Avg. Code Length & Entropy
printf('Avg. Code Length');
for i=1:m
x=p(i)*n(i);
l=l+x;
x=p(i)*log2(1/p(i));
h=h+x;
end
disp(l);
printf('Entropy');
disp(h);
//Computing Efficiency
printf('Efficiency');
disp(100*h/l);
printf('Redundancy');
disp(100-(100*h/l));
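The codeword loop implements Shannon's original construction (commonly called the Shannon-Fano code): codeword $i$ is the first $n_i$ bits of the binary expansion of the cumulative probability $\alpha_i$. As a quick check against the output below: $\alpha_2 = 0.5 = (0.10)_2$ and $n_2 = 2$, giving the codeword 10.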
Scilab Output
Enter the no. of message ensembles : 4
Enter the probabilities for Shannon-Fano coding
Ensemble 1
0.125
Ensemble 2
0.5
Ensemble 3
0.125
Ensemble 4
0.25
Alpha Matrix
0.
0.5
0.75
0.875
Code length matrix
1.
2.
3.
3.
Codeword 1
0.
Codeword 2
1. 0.
Codeword 3
1. 1. 0.
Codeword 4
1. 1. 1.
Avg. Code Length
1.75
Entropy
1.75
Efficiency
100.
Redundancy
0.
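The 100% efficiency is a property of this particular data set, not of the algorithm: every probability is a negative power of two (a dyadic distribution), so $n_i = \lceil -\log_2 p_i \rceil = -\log_2 p_i$ exactly and $L = H$. Concretely, $L = 0.5(1) + 0.25(2) + 0.125(3) + 0.125(3) = 1.75$ bits $= H$, so $\eta = 100\%$ and the redundancy is zero. For non-dyadic probabilities the ceiling rounds up, $L > H$, and the efficiency falls below 100%.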

Huffman Code
clc
m=input('Enter the no. of message ensembles : ');
h=0;l=0;
disp('Enter the probabilities for Huffman coding');
for i=1:m
printf('Ensemble %d\n',i);
p(i)=input('');
end
p=gsort(p);
//Sorting the array in descending order
disp('Enter the code word lengths for Huffman coding');
for i=1:m
printf('Codeword %d\n',i);
n(i)=input('');
end
printf('Avg. Code Length');
//Computing Avg. Code Length & Entropy
for i=1:m
x=p(i)*n(i);
l=l+x;
x=p(i)*log2(1/p(i));
h=h+x;
end
disp(l);
printf('Entropy');
disp(h);
printf('Efficiency');
//Computing Efficiency
disp(100*h/l);
Scilab Output
Enter the no. of message ensembles : 4
Enter the probabilities for Huffman coding
Ensemble 1
0.5
Ensemble 2
0.25
Ensemble 3
0.125
Ensemble 4
0.125
Enter the code word lengths for Huffman coding
Codeword 1
1
Codeword 2
2
Codeword 3
3
Codeword 4
3
Avg. Code Length
1.75
Entropy
1.75
Efficiency
100.
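Note that the Huffman program above does not construct the Huffman tree: the codeword lengths are entered by hand, and only the averages are computed from them. As a minimal sketch of how the lengths could be derived automatically (this function is our addition, not part of the original listings; the name huffman_lengths and the list-based bookkeeping are assumptions), the two lightest nodes are merged repeatedly and every symbol inside a merged node gains one bit:

function n = huffman_lengths(p)
    //Sketch (assumption, not the original program): Huffman codeword
    //lengths from a probability vector p, by repeated merging.
    m = length(p);
    n = zeros(m, 1);           //codeword length of each source symbol
    w = p(:);                  //current node weights
    groups = list();           //groups(k): symbol indices inside node k
    for i = 1:m
        groups(i) = i;
    end
    while length(w) > 1
        [w, idx] = gsort(w, 'g', 'i');   //sort nodes by ascending weight
        tmp = list();
        for k = 1:length(idx)            //reorder groups to match w
            tmp(k) = groups(idx(k));
        end
        groups = tmp;
        merged = [groups(1), groups(2)]; //merge the two lightest nodes
        n(merged) = n(merged) + 1;       //their symbols gain one bit each
        w = [w(1) + w(2); w(3:$)];
        tmp = list(merged);
        for k = 3:length(groups)
            tmp($+1) = groups(k);
        end
        groups = tmp;
    end
endfunction

For the data set above, huffman_lengths([0.5; 0.25; 0.125; 0.125]) returns the lengths 1, 2, 3, 3 that were entered manually in the transcript.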
