
Digital Communication System
Bernd Girod: EE368b Image and Video Compression (Lossless Compression)

This document discusses lossless compression of digital signals. It begins by explaining how compression works by exploiting redundancy and introducing acceptable deviations. It then covers the basics of lossless compression, including entropy coding, Huffman codes, arithmetic coding, and prediction coding. Statistical dependencies between color components of video signals are discussed, showing that joint coding of correlated components can achieve coding gains. Markov models are also introduced as a way to account for memory in video sources.


Digital communication system

- Representation of the source signal by a stream of (binary) symbols
- Adaptation to the properties of the transmission channel

[Block diagram: information source → source coder → (binary symbols) → channel coder → modulation → digital channel (subject to channel noise) → demodulation → channel decoder → source decoder → information sink. We will be looking at the source coder and source decoder.]



Shannon's separation principle


Assume:
1. Point-to-point communication
2. Ergodic channel
3. Unconstrained delay

[Block diagram: the source coder/decoder pair adapts to the source statistics (and the distortion measure); the channel coder/decoder pair adapts to the channel statistics. Under these assumptions, source coding and channel coding can be separated without loss of optimality.]

How does compression work?


- Exploit redundancy.
  - Take advantage of patterns in the signal.
  - Describe frequently occurring events efficiently.
  - Lossless coding: completely reversible.
- Introduce acceptable deviations.
  - Remove information that humans cannot perceive.
  - Match the signal resolution (in space, time, amplitude) to the application.
  - Lossy coding: irreversible distortion of the signal.


Lossless compression in lossy compression systems


- Almost every lossy compression system contains a lossless compression system.

[Block diagram of a lossy compression system: quantizer encoder → lossless encoder → lossless decoder → quantizer decoder; the embedded lossless encoder/decoder pair forms a complete lossless compression system.]

We will discuss the basics of lossless compression first, then move on to lossy compression.

Topics in lossless compression


- Binary decision trees and variable length coding
- Entropy and bit-rate
- Huffman codes
- Statistical dependencies in image signals
- Sources with memory
- Arithmetic coding
- Redundancy reduction by prediction


Example: 20 Questions
- Alice thinks of an outcome (from a finite set), but does not disclose her selection.
- Bob asks a series of yes-no questions to uniquely determine the outcome chosen.
- The goal of the game is to ask as few questions as possible on average.
- Our goal: design the best strategy for Bob. (A sketch of such a strategy follows below.)

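A minimal sketch of Bob's halving strategy for equally likely outcomes (our illustration, not course code; the helper name ask_questions is hypothetical). Each yes/no question splits the remaining candidates in half, so about log2(K) questions identify one of K outcomes:

```python
def ask_questions(outcomes, target):
    """Bob's bisection strategy: each yes/no question halves the
    candidate set, so about log2(K) questions identify the outcome."""
    candidates = list(outcomes)
    questions = 0
    while len(candidates) > 1:
        half = candidates[:len(candidates) // 2]
        questions += 1                       # "Is the outcome in this half?"
        if target in half:                   # Alice's yes/no answer
            candidates = half
        else:
            candidates = candidates[len(candidates) // 2:]
    return candidates[0], questions

print(ask_questions(list("ABCDEFGH"), "F"))  # ('F', 3): log2(8) = 3 questions
```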

Example: 20 Questions (cont.)


- Observation: the collection of questions and answers yields a binary code for each outcome.

[Figure: two binary code trees over the outcomes, with branch labels 0/1; reading the labels from root to leaf gives each outcome its codeword.]
- Which strategy (= code) is better?



Fixed length codes


[Figure: balanced binary code tree assigning a fixed-length codeword to each of the outcomes A through G.]

- Average description length for K outcomes: $l_{av} = \log_2 K$
- Optimum for equally likely outcomes
- Verify by modifying the tree


Variable length codes


- If outcomes are NOT equally probable:
  - Use shorter descriptions for likely outcomes.
  - Use longer descriptions for less likely outcomes.
- Intuition:
  - Optimum balanced code trees, i.e., trees with equally likely outcomes, can be pruned to yield unbalanced trees with unequal probabilities.
  - The unbalanced code trees thus obtained are also optimum.
  - Hence, an outcome of probability $p$ should require about
    $$\log_2 \frac{1}{p} \text{ bits}$$


Entropy of a memoryless source


- Let a memoryless source be characterized by an ensemble $U_0$ with:
  - Alphabet $\{a_0, a_1, a_2, \ldots, a_{K-1}\}$
  - Probabilities $\{P(a_0), P(a_1), P(a_2), \ldots, P(a_{K-1})\}$
- Shannon: the information conveyed by message $a_k$ is
  $$I(a_k) = -\log(P(a_k))$$
- The entropy of the source is the average information content:
  $$H(U_0) = E\{I(a_k)\} = -\sum_{k=0}^{K-1} P(a_k)\log(P(a_k)) = -\sum_{u_0} P(u_0)\log(P(u_0))$$
- For $\log = \log_2$, the unit is bits/symbol.


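The definition translates directly into code. A short sketch of the entropy formula (our own illustration; the function name entropy is not from the course):

```python
import math

def entropy(probs):
    """Average information content H = -sum p*log2(p), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits/symbol = log2(4), uniform case
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol (example used below)
```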

Entropy and bit-rate


- Properties of entropy:
  $$H(U_0) \ge 0$$
  $$\max\{H(U_0)\} = \log(K), \text{ with } P(a_j) = P(a_k)\ \forall j, k$$
- The entropy $H(U_0)$ is a lower bound for the average word length $l_{av}$ of a decodable variable-length code for the symbols $u_0$.
- Conversely, the average word length $l_{av}$ can approach $H(U_0)$ if sufficiently large blocks of symbols are encoded jointly.
- Redundancy of a code:
  $$R = l_{av} - H(U_0)$$
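To see how block coding lets $l_{av}$ approach $H(U_0)$, consider a worked example of our own (not from the course): a binary memoryless source with $P(0) = 0.9$, $P(1) = 0.1$ has entropy $H(U_0) \approx 0.469$ bits/symbol, yet any code for single symbols needs $l_{av} \ge 1$ bit/symbol. Encoding pairs of symbols, with probabilities $\{0.81, 0.09, 0.09, 0.01\}$ and codeword lengths $\{1, 2, 3, 3\}$, gives $l_{av} = 1.29$ bits per pair $= 0.645$ bits/symbol, already much closer to the entropy bound; longer blocks close the gap further.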

Encoding with variable word length


- A code without redundancy, i.e., $l_{av} = H(U_0)$, is achieved if all individual code word lengths satisfy
  $$l_{cw}(a_k) = -\log(P(a_k))$$
- For binary code words, all probabilities would have to be binary fractions:
  $$P(a_k) = 2^{-l_{cw}(a_k)}$$

Example:

  a_k   P(a_k)   redundant code   optimum code
  a0    0.500    00               0
  a1    0.250    01               10
  a2    0.125    10               110
  a3    0.125    11               111

  $$H(U_0) = 1.75 \text{ bits/symbol}, \quad l_{av} = 1.75 \text{ bits/symbol}, \quad R = 0$$

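A quick numerical check of this example (our own sketch; the variable names are hypothetical):

```python
import math

probs = {"a0": 0.5, "a1": 0.25, "a2": 0.125, "a3": 0.125}
code  = {"a0": "0", "a1": "10", "a2": "110", "a3": "111"}  # optimum code from the table

H    = -sum(p * math.log2(p) for p in probs.values())      # entropy: 1.75 bits/symbol
l_av = sum(probs[a] * len(code[a]) for a in probs)         # average word length: 1.75
print(H, l_av, l_av - H)                                   # redundancy R = 0.0
```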

Huffman-Code
- The design algorithm for variable length codes proposed by Huffman (1952) always finds a code with minimum redundancy.
- Obtain the code tree as follows (see the sketch below):
  1. Pick the two symbols with the lowest probabilities and merge them into a new auxiliary symbol.
  2. Calculate the probability of the auxiliary symbol.
  3. If more than one symbol remains, repeat steps 1 and 2 for the new auxiliary alphabet.
  4. Convert the code tree into a prefix code.
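A minimal Python sketch of these four steps (our illustration, not course code; huffman_code is a hypothetical name), applied to the distribution from the previous example:

```python
import heapq
import itertools

def huffman_code(probs):
    """Huffman's construction: repeatedly merge the two least probable
    (auxiliary) symbols, then read prefix codewords off the tree."""
    counter = itertools.count()          # tie-breaker so heapq never compares trees
    heap = [(p, next(counter), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)  # steps 1-2: merge two lowest probabilities
        p2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(counter), (t1, t2)))
    code = {}
    def assign(tree, prefix=""):         # step 4: convert tree into a prefix code
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"
    assign(heap[0][2])
    return code

print(huffman_code({"a0": 0.5, "a1": 0.25, "a2": 0.125, "a3": 0.125}))
# codeword lengths {1, 2, 3, 3}, matching the optimum code above
# (the exact 0/1 labels may differ, but the lengths are optimal)
```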

Huffman-Code - Example

Fixed length coding: $l = 3.00$ bits/symbol
Huffman code: $l_{av} = 2.71$ bits/symbol
Entropy: $H(U_0) = 2.68$ bits/symbol
Redundancy of the Huffman code: $R = 0.03$ bits/symbol


Probability density function of the luminance signal Y


- Images: 3 EBU test slides, 3 SMPTE test slides, uniform quantization with 256 levels (8 bits/pixel)
- $H(U_Y) = 7.34$ bits/pixel
- Image with lowest entropy: $H_L(U_Y) = 6.97$ bits/pixel
- Image with highest entropy: $H_H(U_Y) = 7.35$ bits/pixel

Probability density function of the color difference signals R-Y and B-Y

- $H(U_{R-Y}) = 5.57$ bits/pixel; $H_L(U_{R-Y}) = 4.65$ bits/pixel; $H_H(U_{R-Y}) = 5.72$ bits/pixel
- $H(U_{B-Y}) = 5.24$ bits/pixel; $H_L(U_{B-Y}) = 4.00$ bits/pixel; $H_H(U_{B-Y}) = 5.34$ bits/pixel

Joint sources
- Joint sources generate N symbols simultaneously. A coding gain can be achieved by encoding those symbols jointly.
- The lower bound for the average code word length is the joint entropy:
  $$H(U_1, U_2, \ldots, U_N) = -\sum_{u_1}\sum_{u_2}\cdots\sum_{u_N} P(u_1, u_2, \ldots, u_N)\,\log(P(u_1, u_2, \ldots, u_N))$$
- It generally holds that
  $$H(U_1, U_2, \ldots, U_N) \le H(U_1) + H(U_2) + \cdots + H(U_N)$$
  with equality if $U_1, U_2, \ldots, U_N$ are statistically independent.
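A small sketch of the inequality on a toy joint source of our own (not course data): two perfectly correlated components, where joint coding would save one full bit per sample. The helper entropy_of estimates entropy from relative frequencies:

```python
import math
from collections import Counter

def entropy_of(samples):
    """Estimate H from relative frequencies of the observed values
    (tuples for a joint source, scalars for a single component)."""
    counts, n = Counter(samples), len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

pairs = [(0, 0), (1, 1)] * 500                 # two perfectly correlated bits
H_joint = entropy_of(pairs)                    # 1.0 bit/sample
H_sum = entropy_of([a for a, _ in pairs]) + \
        entropy_of([b for _, b in pairs])      # 1.0 + 1.0 = 2.0 bits/sample
print(H_joint, H_sum)                          # joint entropy <= sum of marginals
```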

Statistical dependencies between video signal components Y, R-Y, B-Y


- Data: 3 EBU, 3 SMPTE test slides; each component Y, R-Y, B-Y uniformly quantized to 64 levels:
  $$H_0 = 3 \times 6 = 18 \text{ bits/sample}$$
- Measured entropies:
  $$H(U_Y, U_{R-Y}, U_{B-Y}) = 9.044 \text{ bits/sample}$$
  $$H(U_Y) + H(U_{R-Y}) + H(U_{B-Y}) = 11.218 \text{ bits/sample}$$
  $$\Delta H = 2.174 \text{ bits/sample}$$
- The statistical dependency between R, G, B is much stronger.
- If the joint source Y, R-Y, B-Y is treated as a source with memory, the possible gain by joint coding is much smaller.


Markov process
- Neighboring samples of the video signal are not statistically independent: a source with memory
  $$P(u_T) \ne P(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N})$$
- A source with memory can be modeled by a Markov random process. Conditional probabilities of the source symbols $u_T$ of a Markov source of order N:
  $$P(u_T \mid Z_T) = P(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N})$$
  where $Z_T = (u_{T-1}, u_{T-2}, \ldots, u_{T-N})$ is the state of the Markov source at time T.

Entropy of source with memory


- Markov source of order N: conditional entropy (estimated numerically in the sketch below)
  $$H(U_T \mid Z_T) = H(U_T \mid U_{T-1}, U_{T-2}, \ldots, U_{T-N}) = E\{-\log(p(U_T \mid U_{T-1}, U_{T-2}, \ldots, U_{T-N}))\}$$
  $$= -\sum_{u_T} \cdots \sum_{u_{T-N}} p(u_T, u_{T-1}, u_{T-2}, \ldots, u_{T-N})\,\log(p(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N}))$$
- It holds that
  $$H(U_T \mid Z_T) \le H(U_T)$$
  (equality for memoryless sources)
- The average code word length can approach $H(U_T \mid Z_T)$, e.g., with a switched Huffman code.
- Number of states for an 8-bit video signal:
  N=1: 256 states
  N=2: 65536 states
  N=3: 16777216 states


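A sketch of the conditional entropy estimate on a toy source with memory (our own example, not course code; conditional_entropy is a hypothetical helper). Each symbol repeats its predecessor with probability 0.9, so conditioning on the state $Z_T$ lowers the entropy well below the memoryless value:

```python
import math
import random
from collections import Counter

def conditional_entropy(samples, order):
    """Estimate H(U_T | U_{T-1},...,U_{T-N}) from sample counts:
    -sum p(state, symbol) * log2 p(symbol | state)."""
    joint, state = Counter(), Counter()
    for t in range(order, len(samples)):
        z = tuple(samples[t - order:t])       # state Z_T: previous N symbols
        joint[(z, samples[t])] += 1
        state[z] += 1
    total = sum(joint.values())
    return -sum((c / total) * math.log2(c / state[z])
                for (z, _), c in joint.items())

random.seed(0)
u = [0]
for _ in range(200_000):                      # symbol repeats with probability 0.9
    u.append(u[-1] if random.random() < 0.9 else 1 - u[-1])

print(conditional_entropy(u, order=0))        # ~1.0 bit: memoryless view, H(U_T)
print(conditional_entropy(u, order=1))        # ~0.47 bit: H(U_T | Z_T) <= H(U_T)
```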