03 Compression

1. The document discusses techniques for digital image processing and compression, including Fourier transforms, JPEG, MPEG, and image databases.
2. It describes the types of redundancy in digital images (coding, interpixel, and psychovisual redundancy) that compression techniques aim to reduce. Lossy techniques such as quantization remove information that is imperceptible to the human eye.
3. Key concepts covered are lossless versus lossy compression, image histograms, entropy, Huffman coding, and predictive coding. Predictive coding exploits interpixel redundancy by encoding prediction errors between neighboring pixels rather than raw pixel values.

Advanced Digital Image Processing
Dr. Ash Pahwa
DV Studio, Inc.
5751 Encina Road, Suite 101
Goleta, CA 93117
[email protected]
Schedule
May 5, 2006: Fourier Transformation and Compression
May 12, 2006: JPEG, JPEG 2000, MPEG 1 and MPEG 2
May 26, 2006: Image Databases; MPEG 7
Course web site
http://www.dv-studio.com/Files/DIP
Text Book
Digital Image Processing (Second Edition), Gonzalez and Woods, 2002, Pearson/Prentice Hall
Digital Image Processing Using MATLAB, Gonzalez, Woods, and Eddins, 2004
Web site for both books: http://www.imageprocessingplace.com
Image Compression Model
Original image -> Encoder -> Compressed image -> Decoder -> Reconstructed image
Image Compression Strategies
Compression is possible only when the image contains redundant information. Image compression targets three kinds of redundancy:
- Coding redundancy: use fewer bits to represent frequently occurring symbols (Huffman coding: lossless)
- Interpixel redundancy: neighboring pixels have similar values (predictive coding: lossless)
- Psychovisual redundancy: remove information that the human visual system cannot perceive (quantization, removal of high-frequency data: lossy)
Lossless vs Lossy Compression
- Lossy: information is lost during compression (consumer TV signals; consumer images on the web and from digital cameras)
- Lossless: information is preserved (medical imaging)
Coding Redundancy
Use fewer bits to represent frequently occurring symbols.
Image Histogram
[Histogram figure: pixel values 0-7 on the x-axis, count (0-30) on the y-axis]
Image size = 10x10
Total pixels = 100
Bits per pixel = 3
Pixel values: 0-7
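A minimal sketch of computing such a histogram, assuming NumPy and using a randomly generated stand-in for the 10x10 image on the slide (the actual pixel data is not reproduced here):

```python
import numpy as np

# Hypothetical 10x10 image with 3 bits/pixel (values 0-7),
# standing in for the image behind the histogram above.
rng = np.random.default_rng(0)
image = rng.integers(0, 8, size=(10, 10))

# Count how many of the 100 pixels take each of the 8 possible values.
counts, _ = np.histogram(image, bins=np.arange(9))
for value, n in enumerate(counts):
    print(f"pixel value {value}: {n} pixels")
```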
Coding: Fixed vs Variable Length
PDF and Code Length
Fixed-length coding
Average = 3 bits/pixel
Total bits = 100 pixels * 3 bits/pixel = 300 bits
Variable-length coding
Average = 2*0.19 + 2*0.25 + 2*0.21 + 3*0.16 + 4*0.08 + 5*0.06 + 6*0.03 + 6*0.02 = 2.7 bits/pixel
Total bits = 100 pixels * 2.7 bits/pixel = 270 bits
The variable-length encoding uses 270/300 = 90% of the fixed-length bits (compression ratio = 300/270 ≈ 1.11)
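A short sketch of the calculation above (the probabilities and code lengths are the ones from this example; the variable names are illustrative):

```python
# Probabilities of the 8 pixel values and the lengths of their
# variable-length codes, as in the example above.
probs = [0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02]
lengths = [2, 2, 2, 3, 4, 5, 6, 6]

avg_bits = sum(p * n for p, n in zip(probs, lengths))
total_pixels = 100
print(f"average: {avg_bits:.2f} bits/pixel")         # 2.70
print(f"total: {avg_bits * total_pixels:.0f} bits")  # 270
print(f"compression ratio: {3 / avg_bits:.2f}")      # ~1.11
```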
Information Theory
An event E occurs with probability P(E).
Information: I(E) = -log P(E)
If the log is base 2, the unit of information is the bit: I(E) = -log2 P(E)
Example
Toss a fair coin: P(E) = 0.5
Information = -log2 P(E) = -log2 0.5 = 1 bit
One bit conveys the result of this experiment.
Information Theory
Assume an information source that generates the symbols {a_0, a_1, a_2, ..., a_{L-1}}
Probability of a_i = p(a_i), where $\sum_{i=0}^{L-1} p(a_i) = 1$
I(a_i) = -log2 p(a_i)
Entropy
Average information per source output:
$H = -\sum_{i=0}^{L-1} p(a_i) \log_2 p(a_i)$ bits/symbol
H is called the uncertainty or the entropy of the source.
If all the source symbols are equally probable, the source has maximum entropy.
H gives the lower bound on the number of bits required to code a signal.
Shannon's Theorem
A source signal with entropy H bits/symbol can be coded without loss of information using H + e bits/symbol, where e is an arbitrarily small quantity.
e can be made arbitrarily small by considering increasingly larger blocks of symbols to be coded.
Example: Entropy of the Following Histogram
[Histogram figure: symbols a1-a6 on the x-axis, count (0-45) on the y-axis]

Symbol   p(a_i)   -log2 p(a_i)   p(a_i) * (-log2 p(a_i))
a1       0.10     3.322          0.3322
a2       0.40     1.322          0.5288
a3       0.06     4.059          0.2435
a4       0.10     3.322          0.3322
a5       0.04     4.644          0.1857
a6       0.30     1.736          0.5208
Total sum = 2.1432

Entropy of this histogram = 2.14
Minimum average bits/symbol = 2.14
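A minimal sketch of this entropy calculation (the probabilities are taken from the table above):

```python
import math

# Symbol probabilities p(a1)..p(a6) from the histogram above.
probs = [0.1, 0.4, 0.06, 0.1, 0.04, 0.3]

# H = -sum of p(a_i) * log2 p(a_i), in bits/symbol.
entropy = -sum(p * math.log2(p) for p in probs)
print(f"entropy = {entropy:.4f} bits/symbol")  # ~2.1435
```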
Coding
Huffman coding
- Yields the smallest possible average number of bits per source symbol
- Lossless
- Uniquely decodable
- Instantaneous (no future referencing is needed)
Run-length coding
Huffman Coding
Arrange the symbol probabilities p_i in decreasing order.
While there is more than one node:
  Merge the two nodes with the smallest probabilities into a new node whose probability is their sum.
  Arbitrarily assign 1 and 0 to each pair of branches merging into a node.
Read each code sequentially from the root node to the leaf node where the symbol is located.
(A code sketch follows the average code length example below.)
Huffman Codes
[Figure: Huffman tree and resulting codes for the example histogram]

Average Code Length
(0.4)(1) + (0.3)(2) + (0.1)(3) + (0.1)(4) + (0.06 + 0.04)(5) = 2.2 bits/symbol
Entropy of this histogram = 2.14
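A compact sketch of the algorithm using Python's heapq module (the probabilities are the ones from the entropy example; tie-breaking among equal probabilities can change individual code lengths, but any Huffman tree gives the same average):

```python
import heapq
from itertools import count

def huffman_codes(probs):
    """Build Huffman codes for a dict mapping symbol -> probability."""
    tiebreak = count()  # keeps heap entries comparable when probabilities tie
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Merge the two nodes with the smallest probabilities.
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        # Arbitrarily assign 0 to one branch and 1 to the other.
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = {"a1": 0.1, "a2": 0.4, "a3": 0.06, "a4": 0.1, "a5": 0.04, "a6": 0.3}
codes = huffman_codes(probs)
avg = sum(probs[s] * len(c) for s, c in codes.items())
print(codes)
print(f"average code length = {avg:.2f} bits/symbol")  # 2.20 (entropy is 2.14)
```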
Predictive Coding
Interpixel / interframe redundancy: neighboring pixels (or frames) have similar values.
We can predict the value of each pixel from its neighbors during the encoding process, and use the same prediction algorithm during the decoding process.
[Figures: predictive coding example showing the original image, the predicted image, and the error image]

Predictive Coding
The error image has low entropy, so fewer bits are required to encode it. Because the decoder applies the same predictor, the compression is lossless.
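A minimal sketch of this idea using a previous-pixel predictor (assuming NumPy; predicting each pixel from its left neighbor is just one simple choice of predictor):

```python
import numpy as np

def encode(image):
    """Store each pixel as its prediction error from the pixel to its left."""
    img = image.astype(np.int16)  # allow negative prediction errors
    errors = img.copy()
    errors[:, 1:] = img[:, 1:] - img[:, :-1]  # first column stored as-is
    return errors

def decode(errors):
    """Invert the predictor: a cumulative sum along each row."""
    return np.cumsum(errors, axis=1).astype(np.uint8)

# Smooth synthetic image: neighboring pixels have similar values, so
# the prediction errors cluster near zero and have low entropy.
rng = np.random.default_rng(0)
image = (np.cumsum(rng.integers(-2, 3, size=(8, 8)), axis=1) + 128).astype(np.uint8)
errors = encode(image)
assert np.array_equal(decode(errors), image)  # lossless round trip
```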
Another Example of Predictive Coding
[Figures: a second predictive coding example]
Psychovisual Redundancy
Quantization: lossy
Remove information that the human visual system cannot perceive, for example by converting pixels from 8 bits to 4 bits.
[Figures: the same image converted from 8 bits to 4 bits]
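A minimal sketch of this 8-bit to 4-bit conversion (assuming NumPy; discarding the four least significant bits is one straightforward way to quantize):

```python
import numpy as np

def quantize_8_to_4(image):
    """Keep only the 4 most significant bits of each 8-bit pixel (lossy)."""
    return (image >> 4) << 4  # 16 gray levels instead of 256

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
print(quantize_8_to_4(image))  # every pixel is now a multiple of 16
```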