3 Source Coding
Source Coding
• Speech compression
• Example:
• If I toss a die 1,000,000 times and record the value from each trial:
1, 3, 4, 6, 2, 5, 2, 4, 5, 2, 4, 5, 6, 1, …
• Shannon showed that only log₂(6) ≈ 2.585 bits are needed to store each outcome.
• So the file can be compressed to a size of about 2,585,000 bits.
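To make the arithmetic concrete, here is a minimal Python sketch of the dice example. It assumes i.i.d. uniform rolls; the one-byte-per-roll baseline is an illustrative assumption, not something from the slides.

```python
import math

# Dice example: 1,000,000 i.i.d. rolls, each uniform over {1, ..., 6}.
n_trials = 1_000_000
bits_per_outcome = math.log2(6)              # ≈ 2.585 bits per roll
naive_bits = n_trials * 8                    # assumed baseline: one byte per roll
compressed_bits = n_trials * bits_per_outcome

print(f"bits per outcome: {bits_per_outcome:.3f}")       # 2.585
print(f"compressed size : {compressed_bits:,.0f} bits")  # ~2,585,000 bits
```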
Shannon’s Source Coding Theorem
• A source with entropy H cannot be losslessly encoded at an average rate below H bits per symbol, and rates arbitrarily close to H are achievable.
Types of Information Source
• Code types
• Blocking and non-blocking
• If Code(A) = Code(C), we will not be able to distinguish between A and C: such a code is singular.
Code Types
• Sol: a non-singular code, in which every symbol maps to a distinct codeword.
Code Types
• Sol: an instantaneous (prefix-free) code, which can be decoded symbol by symbol as the bits arrive; see the sketch below.
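The prefix condition is mechanical to check; here is a minimal sketch, with illustrative codewords that are not taken from the slides.

```python
def is_prefix_free(codewords):
    """Return True if no codeword is a prefix of another (instantaneous code)."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False    # 'a' is a prefix of 'b': decoding must wait
    return True

print(is_prefix_free(["0", "10", "110", "111"]))   # True: instantaneous
print(is_prefix_free(["0", "01", "011", "111"]))   # False: '0' prefixes '01'
```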
Code Types
Optimum Codes
• Non-singular
• Uniquely decodable
• Instantaneous
• Every instantaneous code is uniquely decodable, and every uniquely decodable code is non-singular.
Fixed Length Codes
• For a fixed-length binary code over N symbols, L = ⌈log₂(N)⌉.
• Since H ≤ log₂(N) ≤ L, it follows that H ≤ L.
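As a quick numeric check of the fixed-length bound, a short sketch for the six-sided die from the earlier example (N = 6 is the only assumption):

```python
import math

N = 6                           # number of source symbols (die faces)
L = math.ceil(math.log2(N))     # fixed-length code: 3 bits per symbol
H_max = math.log2(N)            # entropy is at most log2(N) ≈ 2.585
print(L, H_max)                 # 3 2.585..., so H <= L as claimed
```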
• Example:
• For code 1
• The coding rate, or average number of bits per symbol, is given by:
L_avg = Σᵢ Lᵢ Pᵢ
• Example:
• For code 2
• This code is uniquely decodable, and we can decode instantaneously (no backtracking is required): once we have the bits of an encoded symbol, we can decode it without waiting for more bits.
• This code satisfies the prefix condition: no codeword is a prefix (the same leading bit pattern) of a longer codeword.
L_avg = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 7/4 bits/symbol
• The coding rate is equal to the entropy (coding rate = H = 7/4).
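The code-2 table itself did not survive extraction. The sketch below assumes the standard prefix code 0, 10, 110, 111 for probabilities 1/2, 1/4, 1/8, 1/8, which has exactly the lengths 1, 2, 3, 3 used above.

```python
import math

p    = [1/2, 1/4, 1/8, 1/8]        # symbol probabilities
code = ["0", "10", "110", "111"]   # assumed prefix code with lengths 1, 2, 3, 3

L_avg = sum(pi * len(c) for pi, c in zip(p, code))
H     = -sum(pi * math.log2(pi) for pi in p)
print(L_avg, H)                    # 1.75 1.75 -> coding rate equals entropy (7/4)
```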
Variable Length Codes
• Example:
• For code 3
• This code is uniquely decodable.
• The code does not have the prefix property and is not an instantaneous code, so decoding may require lookahead (a decoding sketch follows below).
L_avg = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 7/4 bits/symbol
• The coding rate is equal to the entropy (coding rate = H = 7/4).
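Code 3's table was also lost in extraction. The sketch below uses the hypothetical code 0, 01, 011, 111 (the same lengths 1, 2, 3, 3, uniquely decodable, but 0 is a prefix of 01 and 011) to show why decoding such a code needs lookahead:

```python
def parses(bits, codewords):
    """All ways to split `bits` into a sequence of codewords (brute force)."""
    if not bits:
        return [[]]
    out = []
    for c in codewords:
        if bits.startswith(c):
            out += [[c] + rest for rest in parses(bits[len(c):], codewords)]
    return out

code3 = ["0", "01", "011", "111"]   # hypothetical stand-in for code 3
# After reading '01' the decoder cannot yet tell '01' from the start of
# '011'; the split is resolved only once the third bit arrives.
print(parses("011", code3))         # [['011']]: unique, but found late
```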
Variable Length Codes
• Note:
• A code is called optimally efficient when L_avg = H (the coding rate equals the entropy of the source).
• Example:
• Is this code optimally efficient or not?
• Sol:
• H = 1/2 + 2/4 + 3/8 + 4/16 + 4/16 = 15/8 = 1.875 bits/symbol
• With codeword lengths Lᵢ = 1, 2, 3, 4, 4 (matching −log₂(Pᵢ)), L_avg = 15/8 = H, so the code is optimally efficient.
1. Does this code satisfy the Kraft-McMillan inequality?
• Sol: Σᵢ 2^(−Lᵢ) = 1/2 + 1/4 + 1/8 + 1/16 + 1/16 = 1 ≤ 1, so the inequality is satisfied.
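A one-line check of the Kraft-McMillan sum, assuming the codeword lengths 1, 2, 3, 4, 4 read off from the entropy terms above:

```python
lengths = [1, 2, 3, 4, 4]                  # assumed from -log2(Pi) above
kraft_sum = sum(2 ** -l for l in lengths)  # Kraft-McMillan: sum of 2^(-Li)
print(kraft_sum, kraft_sum <= 1)           # 1.0 True -> inequality satisfied
```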