Lecture 3 Compression in Multimedia
ITU-07319
Introduction to Compression
OUTLINE
Introduction
What is Compression?
Why is Compression Important in Multimedia Systems?
Coding
Basic Types of Data Compression
Classification of Data Compression Types
Performance Metrics
Lossless Compression
INTRODUCTION
The number of bytes of data stored, processed, and transmitted keeps soaring, and in the process keeps transforming our world. Examples: the ever-present, ever-growing Internet; the explosive development of mobile communications; and the ever-increasing importance of video communication.
Common magnitude prefixes: Mega = 10^6, Giga = 10^9, Tera = 10^12, Peta = 10^15, Exa = 10^18, Zetta = 10^21, Yotta = 10^24.
INTRODUCTION
[Diagram: data volume and data speed, each compared before and after compression.]
WHAT IS DATA COMPRESSION?
Data compression (source coding) reduces the number of bits needed to represent information at the sender, so that the receiver can reconstruct it from the compressed form.
[Diagram: source coding, with data compressed before sending and reconstructed at the receiver.]
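As a quick illustration of the idea (our own example, not from the slides), Python's standard zlib module shows how redundant data shrinks when compressed:

    import zlib

    # Highly redundant input: the same pattern repeated over and over.
    text = b"AAAA BBBB AAAA BBBB AAAA BBBB AAAA BBBB"
    packed = zlib.compress(text)
    print(len(text), "bytes before,", len(packed), "bytes after compression")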
COMPRESSION TECHNIQUES
• There are two kinds of compression: lossless and lossy.
• Compression techniques take advantage of redundancy in digital images.
• Types of redundancies
• Spatial redundancy: due to the correlation between neighbouring pixel values (see the sketch after this list).
• Spectral redundancy: due to the correlation between different color planes or spectral bands.
• Lossy techniques, in addition, take advantage of HVS (Human Visual System) properties.
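To make spatial redundancy concrete, here is a tiny Python sketch (our own illustration, not from the slides): neighbouring pixel values in a smooth region are close together, so storing differences instead of raw values yields mostly small numbers, which later coding stages can represent compactly:

    row = [100, 101, 101, 102, 104, 104, 105, 107]  # neighbouring pixels correlate
    # Keep the first value, then store only each pixel's difference to its neighbour.
    diffs = [row[0]] + [b - a for a, b in zip(row, row[1:])]
    print(diffs)  # [100, 1, 0, 1, 2, 0, 1, 2] -- small residuals cluster near zero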
CLASSIFICATION OF IMAGE COMPRESSION
Lossless:
• Repetitive Sequence Encoding: run-length encoding (RLE)
• Statistical Encoding: Huffman, Arithmetic, LZW
• Bitplane Encoding
• Predictive Coding: differential pulse-code modulation (DPCM)
LOSSLESS COMPRESSION
• In lossless compression, data are reconstructed after compression without errors, i.e. no information is lost.
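"Lossless" means the round trip is bit-for-bit exact. A quick check with Python's standard zlib module (our own illustration, not from the slides):

    import zlib

    original = b"Do not send money"
    restored = zlib.decompress(zlib.compress(original))
    assert restored == original  # lossless: the reconstruction is exact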
1. TEXT INFORMATION
Even a one-character error in reconstructed text can reverse its meaning, so text must be compressed losslessly:
BEFORE COMPRESSION: "Do not send money" / AFTER COMPRESSION: "Do now send money"
BEFORE COMPRESSION: "TZS 1,000,000,000" / AFTER COMPRESSION: "TZS 100,000,000"
LOSSLESS COMPRESSION
2. MEDICAL RECORDS
For example, suppose we compressed a radiological image in a lossy fashion, and the difference between the reconstruction Y and the original X was visually undetectable. If the image were later enhanced, the previously undetectable differences might produce artifacts that could seriously mislead the radiologist and result in a wrong diagnosis and wrong treatment.
BEFORE COMPRESSION: A SIGNIFICANT LUMP ON THE PATIENT'S BRAIN
AFTER COMPRESSION: THE BRAIN OF THE PATIENT LOOKS NORMAL
3. GEOGRAPHICAL DATA
BEFORE COMPRESSION: PRESENCE OF LAND DEGRADATION
AFTER COMPRESSION: LAND LOOKS NORMAL
LOSSLESS COMPRESSION
3. GEOGRAPHICAL DATA
Data obtained from satellites are often processed later to obtain different numerical indicators of vegetation, deforestation, and so on. If the reconstructed data are not identical to the original data, processing may result in "enhancement" of the differences, and it may not be possible to go back and obtain the same data again. Therefore, it is not advisable to allow any differences to be introduced by the compression process.
LOSSY COMPRESSION
• Many situations require compression where we want the reconstruction Y to be identical to the original X.
• In a number of situations it is possible to relax this requirement in order to get more compression.
• In these situations, we look to lossy compression techniques.
LOSSY COMPRESSION
• In addition to the trade-off between coding efficiency, coder complexity, and coding delay, the additional aspect of compression quality arises with the use of lossy methods (a toy sketch follows the list below).
Lossy:
• Transform Coding
• Subband Coding
• Fractal Coding
• Vector Quantization
• Block Truncation Coding
• Predictive Coding
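A toy illustration of the lossy idea in Python (our own example, not from the slides): uniform scalar quantization discards precision to save bits, so the reconstruction Y differs slightly from the original X:

    step = 16  # quantization step: a larger step means more compression but more error
    samples = [23, 98, 141, 187, 240]                # original values X
    quantized = [round(s / step) for s in samples]   # far fewer distinct levels to code
    restored = [q * step for q in quantized]         # reconstruction Y != X
    print(list(zip(samples, restored)))  # [(23, 16), (98, 96), (141, 144), (187, 192), (240, 240)]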
HUFFMAN CODING
1. Initialization: put all symbols on a list sorted according to their frequency counts (the OPEN list).
2. Repeat until the OPEN list has only one node left:
   (a) From OPEN pick the two nodes having the lowest frequency counts, and create a parent node for them.
   (b) Assign the sum of the children's frequency counts to the parent node and insert it into OPEN.
   (c) Assign code 0 and 1 to the two branches of the tree, and delete the children from OPEN.

Example symbol counts:
Symbol  A   B   C   D   E
Count   15  7   6   6   5
HUFFMAN CODING contd..
One valid set of codes for the tree built from these counts (the exact 0/1 assignment depends on how branches are labelled, but the code lengths are fixed):
A = 0
B = 100
C = 101
D = 110
E = 111
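A minimal Huffman-coder sketch in Python (our own illustration, not from the slides; it builds the tree with a min-heap and reproduces the code lengths above, though the exact bit patterns may differ):

    import heapq

    def huffman_codes(freqs):
        # Each heap entry: (count, tiebreaker, [(symbol, code-so-far), ...]).
        heap = [(n, i, [(sym, "")]) for i, (sym, n) in enumerate(freqs.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            # Pick the two lowest-count nodes and merge them under a new parent,
            # prefixing 0 to codes in one subtree and 1 to codes in the other.
            n1, _, lo = heapq.heappop(heap)
            n2, _, hi = heapq.heappop(heap)
            merged = [(s, "0" + c) for s, c in lo] + [(s, "1" + c) for s, c in hi]
            heapq.heappush(heap, (n1 + n2, tie, merged))
            tie += 1
        return dict(heap[0][2])

    print(huffman_codes({"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}))
    # e.g. {'A': '0', 'E': '100', 'C': '101', 'D': '110', 'B': '111'}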
HUFFMAN CODING contd..
Problems: the encoder must know the symbol statistics ahead of time, and the code table has to be shared with the decoder.
Solution: find a way to build the dictionary adaptively.
LEMPEL-ZIV-WELCH (LZW) ALGORITHM

w = NIL;
while ( read a character k )
{
    if wk exists in the dictionary
        w = wk;
    else
    {
        add wk to the dictionary;
        output the code for w;
        w = k;
    }
}
output the code for w;    /* flush the final pattern */
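A runnable version of this compressor in Python (our own sketch; the dictionary is seeded with all single byte characters, which the pseudocode takes for granted):

    def lzw_compress(text):
        dictionary = {chr(i): i for i in range(256)}  # seed with single characters
        w, out = "", []
        for k in text:
            wk = w + k
            if wk in dictionary:
                w = wk                            # keep extending the current match
            else:
                out.append(dictionary[w])         # emit the code for the longest match
                dictionary[wk] = len(dictionary)  # register the new pattern
                w = k
        if w:
            out.append(dictionary[w])             # flush the final pattern
        return out

    print(lzw_compress("ABABBABCABABBA"))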
THE LZW DECOMPRESSION ALGORITHM IS AS FOLLOWS:

read a character k;
output k;
w = k;
while ( read a character k )    /* k could be a character or a code */
{
    entry = dictionary entry for k;
    output entry;
    add w + entry[0] to the dictionary;
    w = entry;
}

Note that this simplified version assumes every received code is already in the dictionary; a full implementation must also handle the corner case where a code refers to the entry still being built.
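A matching decompressor sketch in Python (our own illustration; it pairs with lzw_compress above and includes the corner case just noted):

    def lzw_decompress(codes):
        dictionary = {i: chr(i) for i in range(256)}  # same seeding as the encoder
        w = dictionary[codes[0]]
        out = [w]
        for k in codes[1:]:
            if k in dictionary:
                entry = dictionary[k]
            else:
                entry = w + w[0]                      # corner case: code not defined yet
            out.append(entry)
            dictionary[len(dictionary)] = w + entry[0]  # mirror the encoder's dictionary
            w = entry
        return "".join(out)

    data = "ABABBABCABABBA"
    assert lzw_decompress(lzw_compress(data)) == data  # exact round trip: lossless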
ENTROPY ENCODING SUMMARY
BASICS OF INFORMATION THEORY
The entropy $\eta$ of an information source with alphabet $S = \{s_1, s_2, \ldots, s_n\}$ is:

$$\eta = H(S) = \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i} = -\sum_{i=1}^{n} p_i \log_2 p_i$$

where $p_i$ is the probability that symbol $s_i$ in $S$ will occur.
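To make the formula concrete, here is a small Python sketch (our own example) computing the entropy of the string "HELLO" used in the Shannon-Fano example below:

    from collections import Counter
    from math import log2

    def entropy(text):
        # H(S) = -sum(p_i * log2(p_i)) over the symbol probabilities.
        counts = Counter(text)
        total = len(text)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    print(round(entropy("HELLO"), 2))  # 1.92 bits per symbol on average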
RUN-LENGTH CODING
Run-length coding replaces each run of consecutive identical symbols with a (symbol, run-length) pair, which pays off whenever the data contain long repetitive sequences (a sketch follows below).
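A minimal run-length coder in Python (our own illustration, not from the slides):

    from itertools import groupby

    def rle_encode(text):
        # Each run of identical symbols becomes one (symbol, count) pair.
        return [(sym, len(list(run))) for sym, run in groupby(text)]

    def rle_decode(pairs):
        # Expand the (symbol, count) pairs back into the original string.
        return "".join(sym * n for sym, n in pairs)

    pairs = rle_encode("AAAABBBCCD")
    print(pairs)                          # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
    assert rle_decode(pairs) == "AAAABBBCCD"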
SHANNON–FANO ALGORITHM
Example: frequency counts of the symbols in "HELLO":
Symbol  H  E  L  O
Count   1  1  2  1
SHANNON–FANO ALGORITHM
A top-down approach:
1. Sort the symbols according to their frequency counts.
2. Recursively divide the symbols into two parts, each with approximately the same total count, until each part contains only one symbol.
[Figure: a first coding tree for HELLO by Shannon-Fano.]
Figure 5: Another coding tree for HELLO by Shannon-Fano

Table 2: Another result of performing Shannon-Fano on HELLO (see Figure 5)

Symbol  Count  log2(1/p_i)  Code  Number of bits
L       2      1.32         00    4
H       1      2.32         01    2
E       1      2.32         10    2
O       1      2.32         11    2
TOTAL number of bits: 10
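A compact recursive sketch of the Shannon-Fano split in Python (our own illustration; ties can be split either way, so the exact bit patterns may differ from Table 2 while the code lengths match):

    from collections import Counter

    def shannon_fano(items, prefix=""):
        # items: list of (symbol, count) pairs sorted by descending count.
        if len(items) == 1:
            return {items[0][0]: prefix or "0"}
        total = sum(n for _, n in items)
        acc, split = 0, 1
        for i, (_, n) in enumerate(items):
            acc += n
            if acc >= total / 2:            # first point where the halves balance
                split = i + 1
                break
        codes = {}
        codes.update(shannon_fano(items[:split], prefix + "0"))
        codes.update(shannon_fano(items[split:], prefix + "1"))
        return codes

    freqs = sorted(Counter("HELLO").items(), key=lambda kv: -kv[1])
    print(shannon_fano(freqs))  # e.g. {'L': '00', 'H': '01', 'E': '10', 'O': '11'}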
DICTIONARY-BASED CODING
Dictionary-based methods such as LZW (above) replace recurring strings with short codes that index into a dictionary built adaptively from the data itself.