Lossless Compression: Lesson 1
Multimedia Systems (Module 2 Lesson 1)

- Lesson 1: Lossless Compression
- Lesson 2: Dictionary-based Coding (LZW)
- Lesson 3:
Summary:
- Compression
  - With loss
  - Without loss
- Shannon: Information Theory
- Shannon-Fano Coding Algorithm
- Huffman Coding Algorithm

Sources:
- The Data Compression Book
Compression
- Why Compression?
- What is Redundancy?

Lossless Compression
[Diagram: a message M is passed through Lossless Compress and then Uncompress, and the output is M again; the original is recovered exactly.]
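To make the round trip concrete, here is a small Python sketch (not part of the lesson) that compresses and decompresses a message with zlib, a lossless DEFLATE-based compressor from the standard library, and checks that the original M comes back unchanged. The 39-symbol message mirrors the counts used in the worked examples later in this lesson.

    import zlib

    # A 39-symbol message with the counts used later: A x15, B x7, C x6, D x6, E x5
    message = b"A" * 15 + b"B" * 7 + b"C" * 6 + b"D" * 6 + b"E" * 5

    compressed = zlib.compress(message)       # "Lossless Compress"
    recovered = zlib.decompress(compressed)   # "Uncompress"

    assert recovered == message               # M in, M out: nothing is lost
    print(len(message), "bytes ->", len(compressed), "bytes")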
Information Theory
According to Shannon, the entropy of an information source S is defined as:

    H(S) = Σ_i p_i log2(1/p_i)

where p_i is the probability that symbol s_i occurs in S.
Information Theory
Discussion:
- H(S) is the minimum average number of bits needed to encode each symbol of S; no lossless code can do better on average.
- The units (in coding theory) of entropy are bits per symbol.
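As a quick check of the definition, the following Python sketch (mine, not from the slides) evaluates H(S) for the symbol counts used in the worked examples below; the symbol names A to E and the counts 15, 7, 6, 6, 5 are taken from those examples.

    from math import log2

    counts = {"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}    # counts from the examples below
    total = sum(counts.values())                          # 39 symbols in all

    # H(S) = sum over i of p_i * log2(1 / p_i)
    H = sum((c / total) * log2(total / c) for c in counts.values())

    print(f"H(S) = {H:.3f} bits/symbol")                  # about 2.19 bits/symbol
    print(f"total information = {H * total:.2f} bits")    # about 85.25 bits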
[Diagram: Model -> Probabilities -> Encoder -> Codes -> Output Stream. The model supplies symbol probabilities; the encoder uses them to choose the codes written to the output stream.]
Shannon-Fano Coding Algorithm
1. List the symbols together with their frequency counts.
2. Sort the symbols in decreasing order of frequency.
3. Divide the list into two parts whose total counts are as nearly equal as possible.
4. Append 0 to the codes of the symbols in the first part and 1 to those in the second part.
5. Apply steps 3 and 4 recursively to each part until every part contains a single symbol.
Example 1 (Shannon-Fano coding)

Recursive division of the symbols: first {A, B} (total 22) against {C, D, E} (total 17); then {A} against {B}, and {C} against {D, E}.

Symbol   Count   Info. -log2(pi)   Code   Subtotal (# of bits)
A        15      1.38              00     30
B        7       2.48              01     14
C        6       2.70              10     12
D        6       2.70              110    18
E        5       2.96              111    15

Total information content: Σ (count x info) = 85.25 bits.
Total bits used by the Shannon-Fano code: 89.
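The following is a minimal Python sketch of the recursive splitting procedure, written for this example rather than taken from the source; the function name shannon_fano and the list-of-pairs input format are my own choices. Run on the counts above, it reproduces the codes 00, 01, 10, 110, 111 and the 89-bit total.

    def shannon_fano(symbols):
        """symbols: list of (symbol, count), sorted by decreasing count.
        Returns {symbol: code} via recursive equal-count splitting."""
        if len(symbols) == 1:
            return {symbols[0][0]: ""}
        # Find the split point that makes the two parts' totals as equal as possible
        total = sum(c for _, c in symbols)
        best_i, best_diff, running = 1, float("inf"), 0
        for i in range(1, len(symbols)):
            running += symbols[i - 1][1]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_i, best_diff = i, diff
        codes = {}
        for prefix, part in (("0", symbols[:best_i]), ("1", symbols[best_i:])):
            for sym, code in shannon_fano(part).items():
                codes[sym] = prefix + code
        return codes

    counts = [("A", 15), ("B", 7), ("C", 6), ("D", 6), ("E", 5)]
    codes = shannon_fano(counts)
    print(codes)                                      # A=00, B=01, C=10, D=110, E=111
    print(sum(len(codes[s]) * c for s, c in counts))  # 89 bits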
Example 2 (Huffman coding)

Huffman coding builds the code bottom-up: repeatedly merge the two subtrees with the smallest total counts until a single tree remains, then read each symbol's code along the path from the root to its leaf.
[Huffman tree: D (6) and E (5) merge into a node of weight 11; B (7) and C (6) merge into 13; 11 and 13 merge into 24; finally 24 and A (15) merge into the root, 39. Each edge is labelled 0 or 1, and a symbol's code is the sequence of labels from the root down to its leaf.]

Symbol   Count   Info. -log2(pi)   Code   Subtotal (# of bits)
A        15      1.38              1      15
B        7       2.48              000    21
C        6       2.70              001    18
D        6       2.70              010    18
E        5       2.96              011    15

Total information content: 85.25 bits.
Total bits used by the Huffman code: 87 (compared with 89 for Shannon-Fano).
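To round off the example, here is a short Python sketch of Huffman's algorithm using the standard-library heapq module; it is my own illustration, not code from the source. For the counts above it produces code lengths 1, 3, 3, 3, 3 and the same 87-bit total, though the particular 0/1 assignments can differ from the table because ties and branch labelling are arbitrary.

    import heapq

    def huffman_codes(freqs):
        """freqs: dict of symbol -> count; returns dict of symbol -> bit string."""
        # Heap entry: [total count, tie-breaker, [symbol, code], [symbol, code], ...]
        heap = [[cnt, i, [sym, ""]] for i, (sym, cnt) in enumerate(freqs.items())]
        heapq.heapify(heap)
        i = len(heap)
        while len(heap) > 1:
            lo = heapq.heappop(heap)   # two subtrees with the smallest total counts
            hi = heapq.heappop(heap)
            for pair in lo[2:]:        # prepend a bit to every code in each subtree
                pair[1] = "0" + pair[1]
            for pair in hi[2:]:
                pair[1] = "1" + pair[1]
            heapq.heappush(heap, [lo[0] + hi[0], i] + lo[2:] + hi[2:])
            i += 1
        return {sym: code for sym, code in heap[0][2:]}

    counts = {"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}
    codes = huffman_codes(counts)
    print(codes)
    print(sum(len(codes[s]) * c for s, c in counts.items()))  # 87 bits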