GUI TO GENERATE SOURCE CODES
PRESENTED BY
AMIT NEMAGOUD 1BM17EC009
AMITH P 1BM17EC010
MANU SHREYAS C H 1BM17EC057
SOURCE CODING
Source coding (source compression coding) is a technique of using variable-length codes to represent the different symbols of the information. The number of bits in the coded message is reduced in order to reduce the size of the code that has to be transmitted.
In signal processing, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation.
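As a quick worked illustration (a minimal Python sketch; the symbol probabilities are taken from the example used later in these slides, and the variable names are my own), the snippet below compares a 2-bit fixed-length code with a variable-length prefix code and also computes the entropy lower bound:

import math

# Assumed example distribution (same as the later worked example) and a
# variable-length prefix code for it.
probs = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}
var_code = {"A": "0", "B": "10", "C": "110", "D": "111"}

fixed_bits = math.ceil(math.log2(len(probs)))                    # 2 bits per symbol
avg_bits = sum(p * len(var_code[s]) for s, p in probs.items())   # about 1.9 bits per symbol
entropy = -sum(p * math.log2(p) for p in probs.values())         # about 1.846 bits per symbol

print(fixed_bits, round(avg_bits, 3), round(entropy, 3))         # prints: 2 1.9 1.846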
HISTORY
In the field of data compression,
Shannon–Fano coding, named after
Claude Shannon (1948) and Robert
Fano (1949), is a name given to two
different but related techniques for
constructing a prefix code.
Codes are generated based on a set
of symbols and their probabilities
(estimated or measured).
HISTORY
In 1951, David A. Huffman and his MIT information theory classmates were given the choice of a term paper or a final exam. Huffman chose the term paper, whose problem was to find the most efficient binary code, and solved it by building a binary tree from the least probable symbols upward.
In doing so, Huffman outdid Fano, who had worked with information theory inventor Claude Shannon to develop a similar code. Building the tree from the bottom up guaranteed optimality, unlike top-down Shannon–Fano coding.
SHANNON FANO CODING
1. Create a list of probabilities or frequency counts for the given set of symbols, so that the relative frequency of occurrence of each symbol is known.
2. Sort the list of symbols in decreasing order of probability, with the most probable on the left and the least probable on the right.
3. Split the list into two parts, with the total probabilities of the two parts as close to each other as possible.
4. Assign the value 0 to the left part and 1 to the right part.
5. Repeat steps 3 and 4 for each part until every symbol is in its own subgroup (a Python sketch of these steps follows the worked example below).
SHANNON FANO CODING
Example: encode the message “AAAABBBCCD”.
MESSAGE   PROBABILITY   STAGE 1   STAGE 2   STAGE 3   CODE
A         0.4           0                             0
B         0.3           1         0                   10
C         0.2           1         1         0         110
D         0.1           1         1         1         111
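A minimal Python sketch of the procedure above (the function and variable names are my own, not from the slides); for “AAAABBBCCD” it reproduces the code table:

def shannon_fano(symbols, codes=None, prefix=""):
    # symbols: list of (symbol, probability) pairs sorted by decreasing probability
    if codes is None:
        codes = {}
    if len(symbols) == 1:
        codes[symbols[0][0]] = prefix or "0"
        return codes
    # Find the split point that makes the two parts' total probabilities closest.
    total = sum(p for _, p in symbols)
    running, split, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_diff, split = diff, i
    shannon_fano(symbols[:split], codes, prefix + "0")   # left part gets 0
    shannon_fano(symbols[split:], codes, prefix + "1")   # right part gets 1
    return codes

message = "AAAABBBCCD"
freq = {s: message.count(s) / len(message) for s in set(message)}
symbols = sorted(freq.items(), key=lambda kv: kv[1], reverse=True)
print(shannon_fano(symbols))   # {'A': '0', 'B': '10', 'C': '110', 'D': '111'}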
HUFFMAN CODING
1. Write the characters in decreasing order of their frequency or probability in a table.
2. Take the two characters with the minimum frequencies.
3. Create a new intermediate symbol with a frequency or probability equal to the sum of the two characters' frequencies. Assign bit ‘0’ to the first extracted character and bit ‘1’ to the other.
4. Repeat steps 2 and 3 until the table contains only one entry.
5. Read the assigned bits back from the final merge to each original character; that group of bits is the code word for that particular character (a Python sketch follows the code table below).
HUFFMAN CODING
MESSAGE   CODE
A         1
B         01
C         000
D         001
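A minimal Python sketch of the Huffman procedure (the use of heapq and all names are my own choices, not from the slides). The branch labelling here gives bit ‘1’ to the less probable of the two merged nodes so that the output matches the table above; swapping the 0/1 labels gives an equally valid code with the same lengths:

import heapq

def huffman(probs):
    # probs: dict symbol -> probability; returns dict symbol -> code word
    # Heap entries are (probability, tie-breaker, {symbol: partial code}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p_lo, _, lo = heapq.heappop(heap)    # least probable node
        p_hi, _, hi = heapq.heappop(heap)    # next least probable node
        # Merge into one intermediate node, prepending one bit on each branch.
        merged = {s: "1" + c for s, c in lo.items()}
        merged.update({s: "0" + c for s, c in hi.items()})
        heapq.heappush(heap, (p_lo + p_hi, counter, merged))
        counter += 1
    return heap[0][2]

print(huffman({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}))
# prints {'A': '1', 'B': '01', 'D': '001', 'C': '000'}, the same codes as in the table above

The average length is 0.4*1 + 0.3*2 + 0.2*3 + 0.1*3 = 1.9 bits per symbol, the same as the Shannon–Fano code gives for this example.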