
Digital Image Processing

Huffman Coding Example


Huffman Coding Example
• Suppose X is a source producing symbols; the symbols come from the alphabet A = {a1, a2, a3, a4, a5}.
• Suppose that the probability of each symbol is as follows: {0.4, 0.2, 0.2, 0.15, 0.05}.
• Form the Huffman tree:
[Huffman tree: the two least probable nodes are merged repeatedly (0.05 + 0.15 = 0.2, 0.2 + 0.2 = 0.4, 0.4 + 0.2 = 0.6, 0.6 + 0.4 = 1.0), and each branch is labeled 0 or 1 to give the codewords below.]

Symbol | Probability | Codeword
a1     | 0.40        | 0
a2     | 0.20        | 10
a3     | 0.20        | 110
a4     | 0.15        | 1110
a5     | 0.05        | 1111

Average codeword length = 0.4*1 + 0.2*2 + 0.2*3 + 0.15*4 + 0.05*4 = 2.2 bits per symbol

Entropy = H(X) = Σ_i P_i(x) log2(1 / P_i(x)) = 2.08 bits per symbol
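As an illustration (not part of the original slides), the following Python sketch builds a Huffman code for this source using the standard heapq module and checks the average codeword length and the entropy. Ties between equal probabilities may be broken differently than on the slide, so individual codewords can differ, but the average length still comes out to 2.2 bits per symbol.

import heapq
import math

# Source alphabet and symbol probabilities from the slide
probs = {"a1": 0.40, "a2": 0.20, "a3": 0.20, "a4": 0.15, "a5": 0.05}

def huffman_codes(probs):
    # Each heap entry: (probability, tie-breaker, {symbol: codeword so far}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees; prefix their codewords with 0 and 1.
        p0, _, codes0 = heapq.heappop(heap)
        p1, _, codes1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes0.items()}
        merged.update({s: "1" + c for s, c in codes1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes(probs)
avg_len = sum(probs[s] * len(c) for s, c in codes.items())
entropy = sum(p * math.log2(1 / p) for p in probs.values())
print(codes)                                   # codewords may differ from the slide
print(f"average length = {avg_len:.2f} bits")  # 2.20
print(f"entropy        = {entropy:.2f} bits")  # 2.08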
Huffman Coding Example
• Another possible tree for the same source is:

[Alternative Huffman tree: a4 and a5 merge to 0.2, a2 and a3 merge to 0.4, those merge to 0.6, which finally joins a1 to reach 1.0; labeling the branches 0/1 gives the codewords below.]

Symbol | Probability | Codeword
a1     | 0.40        | 0
a2     | 0.20        | 100
a3     | 0.20        | 101
a4     | 0.15        | 110
a5     | 0.05        | 111

Average codeword length = 0.4*1 + 0.2*3 + 0.2*3 + 0.15*3 + 0.05*3 = 2.2 bits per symbol
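As a quick check (plain Python, with the two codeword tables transcribed from the slides), both assignments are prefix-free and both average 2.2 bits per symbol, so neither tree is better than the other:

probs = {"a1": 0.40, "a2": 0.20, "a3": 0.20, "a4": 0.15, "a5": 0.05}

# The two codeword tables from the slides
code_tree1 = {"a1": "0", "a2": "10", "a3": "110", "a4": "1110", "a5": "1111"}
code_tree2 = {"a1": "0", "a2": "100", "a3": "101", "a4": "110", "a5": "111"}

def avg_length(code):
    return sum(probs[s] * len(w) for s, w in code.items())

def is_prefix_free(code):
    # No codeword may be a prefix of another codeword.
    words = list(code.values())
    return not any(u != v and v.startswith(u) for u in words for v in words)

for name, code in [("tree 1", code_tree1), ("tree 2", code_tree2)]:
    print(name, f"{avg_length(code):.2f}", is_prefix_free(code))
# tree 1 2.20 True
# tree 2 2.20 True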

Bahadir K. Gunturk
