03 Compression
The source symbol probabilities sum to one:

∑_{i=0}^{L−1} p(a_i) = 1
Entropy
Average information per source output:

H = −∑_{i=0}^{L−1} p(a_i) log₂ p(a_i)
H is called the uncertainty or the entropy of
the source
If all the source symbols are equally probable,
then the source has maximum entropy
H gives the lower bound on the number of
bits/symbol required to code a signal
Shannon Theorem
It is possible to code, without loss of
information, a source signal with
entropy H bits/symbol, using H+e
bits/symbol,
where e is an arbitrarily small quantity
e can be made arbitrarily small by
considering increasingly larger blocks of
symbols to be coded
Example:
Entropy of the following histogram
[Histogram: symbol counts for symbols 1–6 (y-axis: Count, 0–45)]
Symbol  p(a_i)  −log₂ p(a_i)  p(a_i) · (−log₂ p(a_i))
a1      0.1     3.322         0.3322
a2      0.4     1.322         0.5288
a3      0.06    4.059         0.2435
a4      0.1     3.322         0.3322
a5      0.04    4.644         0.1857
a6      0.3     1.736         0.5208
Total sum = 2.1432
Entropy of this histogram = 2.14
Minimum average bits/symbol = 2.14
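The table's arithmetic can be checked with a short script (a minimal sketch; the probabilities are taken from the histogram example above):

```python
import math

# Symbol probabilities p(a_i) from the histogram example (a1..a6)
probs = [0.1, 0.4, 0.06, 0.1, 0.04, 0.3]

# Entropy: H = -sum over i of p(a_i) * log2 p(a_i)
H = -sum(p * math.log2(p) for p in probs)
print(round(H, 2))  # 2.14 bits/symbol
```

The slide's total of 2.1432 comes from summing the rounded per-symbol terms; the unrounded entropy is about 2.1435, i.e. 2.14 bits/symbol either way.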
Coding
Huffman coding
Yields the smallest average number of bits per
source symbol
Lossless code
Uniquely decodable
Instantaneous (no future referencing is
needed)
Run-length coding
Huffman coding
Arrange symbol probabilities p_i in decreasing order.
While there is more than one node
{
Merge the two nodes with the smallest probabilities to form a new
node with probability equal to their sum
Arbitrarily assign 1 and 0 to each pair of branches merging into a
node
}
Read sequentially from the root node to the leaf node where the
symbol is located.
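The merge procedure above can be sketched with a heap (a minimal illustration using the probabilities of the running example; tie-breaking among equal probabilities may yield a different but equally optimal set of codes, so only the average length is guaranteed to match):

```python
import heapq

def huffman_codes(probs):
    """Huffman codes for {symbol: probability}, following the merge
    procedure above: repeatedly merge the two least-probable nodes,
    prefixing 0 and 1 onto the codes in the two merged branches."""
    # Heap entries: (probability, tie-breaker, {symbol: partial code})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Each merged branch gets one more leading bit
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

probs = {"a1": 0.1, "a2": 0.4, "a3": 0.06, "a4": 0.1,
         "a5": 0.04, "a6": 0.3}
codes = huffman_codes(probs)
avg_len = sum(p * len(codes[s]) for s, p in probs.items())
print(codes)
print(round(avg_len, 2))  # 2.2 bits/symbol, matching the slide
```

Note that any valid resolution of ties produces the same average code length (2.2 bits/symbol here), even though individual codeword lengths can differ.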
Huffman codes
Average Code Length
(0.4)(1) + (0.3)(2) + (0.1)(3) + (0.1)(4) + (0.06 + 0.04)(5)
= 2.2 bits/symbol
Entropy of this histogram = 2.14
Predictive Coding
Interpixel / Interframe
Neighboring pixels have similar values
We can predict the values of these
neighboring pixels during the encoding
process
We can use the same prediction algorithm
during the decoding process
Predictive Coding
Predictive Coding
Error image
Low entropy
Fewer bits are required to encode this
image
Lossless
Compression can be achieved
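The idea can be sketched on a single scanline (a hypothetical smooth row of pixel values, with a simple previous-pixel predictor; real predictive coders may use more elaborate predictors):

```python
import math
from collections import Counter

def entropy(values):
    """Empirical entropy in bits/symbol of a sequence of values."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A hypothetical smooth scanline: neighboring pixels have similar values
row = [100, 101, 103, 104, 104, 106, 108, 109, 111, 112, 114, 115]

# Predict each pixel as the previous one; encode only the error
errors = [row[0]] + [row[i] - row[i - 1] for i in range(1, len(row))]

# The decoder reverses the prediction with the same rule: lossless
decoded = [errors[0]]
for e in errors[1:]:
    decoded.append(decoded[-1] + e)
assert decoded == row

# The error sequence has far fewer distinct values, so lower entropy
print(round(entropy(row), 2), round(entropy(errors), 2))  # 3.42 1.65
```

The reconstruction is exact (lossless), yet the error sequence needs fewer bits/symbol, which is where the compression comes from.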
Another Example of Predictive Coding
Psychovisual Redundancy
Quantization: Lossy
Remove information that the human visual system
cannot perceive
Example: convert from 8 bits to 4 bits
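The 8-bit to 4-bit conversion can be sketched as uniform quantization by bit shifting (a minimal illustration; the mid-bin reconstruction rule is one common choice, not the only one):

```python
# Uniform quantization from 8 bits to 4 bits: keep only the top 4 bits.
# Lossy: the low 4 bits are discarded and cannot be recovered.
def quantize_8_to_4(pixel):
    return pixel >> 4            # 0..255 -> 0..15

def dequantize_4_to_8(level):
    return (level << 4) | 0x08   # reconstruct to the middle of the bin

pixels = [0, 37, 128, 200, 255]
levels = [quantize_8_to_4(p) for p in pixels]
recon  = [dequantize_4_to_8(q) for q in levels]
print(levels)  # [0, 2, 8, 12, 15]
print(recon)   # [8, 40, 136, 200, 248]
```

Each reconstructed pixel is within 8 gray levels of the original, an error the human visual system often cannot perceive in smooth regions.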