ECE4007 Information Theory and Coding: Dr. Sangeetha R.G

The document discusses source coding and various entropy coding techniques, including Huffman coding, Shannon-Fano coding, and Shannon-Fano-Elias coding. It details how Huffman coding works by repeatedly combining the two lowest probabilities and assigning binary codewords, and explains how Shannon-Fano coding partitions the source symbols into nearly equiprobable sets, assigning a binary digit at each step.


ECE4007
INFORMATION THEORY AND CODING
Dr. Sangeetha R.G
Associate Professor Senior, SENSE
Module 4: Source Coding I (6 hours, CO: 4)

Source coding theorem - Huffman coding - Non-binary Huffman codes - Adaptive Huffman coding - Shannon-Fano-Elias coding - Non-binary Shannon-Fano codes
Model of a Digital Communication System

Information Source -> Encoder -> Communication Channel -> Decoder -> Destination
• Information source: produces the message, e.g. English symbols
• Encoder: coding, e.g. English to a 0,1 sequence
• Communication channel: can have noise or distortion
• Decoder: decoding, e.g. 0,1 sequence back to English
• Destination: receives the reconstructed message
Shannon’s Vision

Data -> Source Encoding -> Channel Encoding -> Channel -> Channel Decoding -> Source Decoding -> User
Source Coding:
Definition: The conversion of the output of a discrete memoryless source (DMS) into a sequence of binary symbols, i.e. a binary codeword, is called source coding. The device that performs this conversion is called the source encoder.
Objective of source coding: to minimize the average bit rate required to represent the source, by reducing the redundancy of the information source.
Types of Code
• Fixed-length code
• Variable-length code
• Example: suppose we wish to represent the 26 letters of the English alphabet using bits. We observe that 2^5 = 32 > 26, so each letter can be uniquely represented using 5 bits.
• Each letter then has a corresponding 5-bit codeword.
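The 5-bit fixed-length code above can be sketched as follows; a minimal illustration, where the helper function name is not from the slides:

```python
import math
import string

def fixed_length_code(symbols):
    """Map each symbol to a binary codeword of fixed width ceil(log2(n))."""
    n = len(symbols)
    width = math.ceil(math.log2(n))  # for 26 letters: ceil(log2(26)) = 5
    return {s: format(i, f"0{width}b") for i, s in enumerate(symbols)}

code = fixed_length_code(string.ascii_uppercase)
print(code["A"])  # 00000
print(code["Z"])  # 11001 (25 in binary, padded to 5 bits)
```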
1. Fixed-Length Codes:
A fixed-length code is one whose codeword length is fixed. Code 1 and Code 2 of the above table are fixed-length codes.
2. Variable-Length Codes:
A variable-length code is one whose codeword length is not fixed. All codes of the above table except Code 1 and Code 2 are variable-length codes.
A distinct code is uniquely decodable if the original source sequence can be reconstructed perfectly from the encoded binary sequence. A sufficient condition for a code to be uniquely decodable is that no codeword is a prefix of another.
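The prefix condition can be checked mechanically; a minimal sketch (the function name is illustrative):

```python
def is_prefix_free(codewords):
    """Return True if no codeword is a prefix of another."""
    words = sorted(codewords)  # in sorted order, any prefix sorts adjacently
    return all(not words[i + 1].startswith(words[i])
               for i in range(len(words) - 1))

print(is_prefix_free(["0", "10", "110", "111"]))  # True: a prefix code
print(is_prefix_free(["0", "01", "11"]))          # False: "0" is a prefix of "01"
```

Sorting works here because if one codeword is a prefix of another, the two sort next to each other (possibly with other words sharing that prefix in between, which also fail the check).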
Entropy Coding
The design of a variable-length code such that its average codeword length approaches the entropy of the DMS is often referred to as entropy coding.
There are basically two types of entropy coding:
1) Huffman coding
2) Shannon-Fano coding
Huffman coding
 Huffman coding results in an optimal code: it is the code with the highest efficiency (smallest average codeword length among uniquely decodable codes).
 The Huffman coding procedure is as follows:
1) List the source symbols in order of decreasing probability.
2) Combine the probabilities of the two symbols having the lowest probabilities and reorder the resultant probabilities; this step is called reduction 1. Repeat the procedure until only two ordered probabilities remain.
3) Start encoding with the last reduction, which consists of exactly two ordered probabilities. Assign 0 as the first digit in the codewords for all source symbols associated with the first probability; assign 1 to the second probability.
4) Now go back and assign 0 and 1 to the second digit for the two probabilities that were combined in the previous reduction step, retaining all assignments made in the earlier steps.
5) Keep regressing this way until the first column is reached.
6) The codeword for each symbol is obtained by tracing back from right to left.
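The reduction procedure above can be sketched with a min-heap standing in for the manual reduction table; the symbols and probabilities below are illustrative, not from the slides:

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a {symbol: probability} dict."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # lowest probability  -> prepend 0
        p1, _, c1 = heapq.heappop(heap)  # next lowest         -> prepend 1
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

probs = {"x1": 0.4, "x2": 0.2, "x3": 0.2, "x4": 0.1, "x5": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)
print(avg_len)  # 2.2 bits/symbol, vs. entropy H(X) ~ 2.12 bits
```

The exact 0/1 assignment depends on how ties are broken, but the set of codeword lengths (and hence the average length) is the same for any valid Huffman code of this source.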
Shannon-Fano Coding
• List the source symbols in order of decreasing probability.
• Partition the set into two subsets that are as close to equiprobable as possible, and assign 0 to the upper set and 1 to the lower set.
• Continue this process, each time partitioning the sets into as nearly equal probabilities as possible, until further partitioning is not possible (each subset contains a single symbol).
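The partitioning steps above can be sketched as follows; the probabilities are illustrative, and the split rule simply picks the point that minimizes the probability difference between the two sets:

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability); returns {symbol: codeword}."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    code = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        # Find the split point where the two parts are closest to equiprobable.
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_i, best_diff = i, diff
        upper, lower = group[:best_i], group[best_i:]
        for s, _ in upper:
            code[s] += "0"   # upper set gets 0
        for s, _ in lower:
            code[s] += "1"   # lower set gets 1
        split(upper)
        split(lower)

    split(symbols)
    return code

probs = [("x1", 0.4), ("x2", 0.2), ("x3", 0.2), ("x4", 0.1), ("x5", 0.1)]
print(shannon_fano(probs))
# {'x1': '0', 'x2': '10', 'x3': '110', 'x4': '1110', 'x5': '1111'}
```

When two split points are equally balanced (as happens above), either choice yields a valid Shannon-Fano code; this sketch takes the first.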
Shannon-Fano-Elias Code
• Codeword length: l(x) = ceil(log2(1/p(x))) + 1
• Expected length of the code: L = sum over x of p(x) l(x) < H(X) + 2
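A sketch of the Shannon-Fano-Elias construction, assuming the standard formulation in which the codeword for x is the first l(x) = ceil(log2(1/p(x))) + 1 bits of the binary expansion of the midpoint of x's cumulative-probability interval (the probabilities below are illustrative):

```python
import math

def sfe_code(probs):
    """probs: list of (symbol, probability) in a fixed order."""
    code, cum = {}, 0.0
    for s, p in probs:
        fbar = cum + p / 2                        # midpoint of symbol's interval
        length = math.ceil(math.log2(1 / p)) + 1  # l(x) = ceil(log2(1/p)) + 1
        # Take the first `length` bits of the binary expansion of fbar.
        bits, frac = "", fbar
        for _ in range(length):
            frac *= 2
            bit, frac = divmod(frac, 1)
            bits += str(int(bit))
        code[s] = bits
        cum += p
    return code

probs = [("x1", 0.25), ("x2", 0.5), ("x3", 0.125), ("x4", 0.125)]
print(sfe_code(probs))
# {'x1': '001', 'x2': '10', 'x3': '1101', 'x4': '1111'}
```

For this source, L = 2.75 bits while H(X) = 1.75 bits, consistent with the bound L < H(X) + 2; the extra bit in l(x) guarantees the resulting code is prefix-free.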