ITC Lecture 01
Information Theory and Coding
Lecture 1
➢ Reference Books
❑ Principles of Digital Communication and Coding: A. J. Viterbi and J. K. Omura, McGraw-Hill.
❑ Principles of Communication Engineering: J. M. Wozencraft and I. M. Jacobs, John Wiley.
❑ Information Theory and Reliable Communication: R. G. Gallager, John Wiley.
Tentative Grading Criteria
➢ Exams
❑ Final exam: 40%
❑ Mid-term exam: 20%
❑ Research Paper: 20%
➢ Sessional
❑ Assignments: 20%
➢ No mobile phone usage during class
Course contents
➢ Introduction to Mutual Information and Entropy
❑ Measuring Information
❑ Mutual Information
❑ Entropy, Relative Entropy
❑ Conditional Entropy
❑ Markov Chain
❑ Properties of Markov Chain
❑ Discrete Markov Chain
❑ Steady State
➢ Source Coding
❑ Source coding theorem
❑ Huffman coding
❑ Discrete memoryless channels
❑ Channel Capacity
➢ Channel Coding
❑ Error Correction Codes
❑ Convolutional codes
❑ Turbo codes
❑ Concatenated codes
Course contents
➢ Introduction to IT++
❑ Installation of IT++
❑ Linking with IT++
❑ IT++ Examples
❑ Implementation of different forward error-correcting codes (FECC) using IT++
➢ http://itpp.sourceforge.net/4.3.1/index.html
Source Coding
• Source coding is the efficient compression of data for transmission or storage.
• Compression is achieved by exploiting redundancy.
• Example: uncompressed 1080p high-definition (HD) video at 24 frames/second (worked out in the sketch below):
– Pixels per frame: 1920 x 1080
– Bits per pixel: 8 bits x 3 (RGB)
– Bit rate: 1.2 Gb/s
– 1.5 hours of video: 806 GB
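The figures above can be verified with a few lines of arithmetic; this is a minimal sketch, not part of the original slides:

```python
# Back-of-the-envelope check of the uncompressed 1080p figures above.
pixels_per_frame = 1920 * 1080
bits_per_pixel = 8 * 3          # 8 bits per channel, RGB
fps = 24

bit_rate = pixels_per_frame * bits_per_pixel * fps   # bits per second
print(f"Bit rate: {bit_rate / 1e9:.2f} Gb/s")        # ~1.19 Gb/s

seconds = 1.5 * 3600
total_bytes = bit_rate * seconds / 8
print(f"1.5 hours: {total_bytes / 1e9:.0f} GB")      # ~806 GB
```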
Year | Standard                | Bit rate       | Application / feature
1990 | H.261 (ITU-T)           | 384 Kb/s       | ISDN networks (data, voice, image, video)
1993 | MPEG-1 (ISO)            | 1.5 Mb/s       | Storage (CD, HD)
1996 | MPEG-2 (ISO, ITU-T)     | 4-9 Mb/s       | Intra- and inter-frame compression
1998 | MPEG-4                  | Low and medium | Scalability and error resilience
2003 | H.264/AVC (ITU-T, ISO)  | Wide range     | Low bit-rate video streaming to HDTV
Objectives
➢ The main objective is to introduce the fundamental limits of
communication, together with practical techniques to realize the limits
specified by information theory.
The course emphasizes:
o A deep understanding of the mathematics of information theory
o Understanding source coding techniques
o Discussing various channel coding techniques
o Applying this knowledge to real problems in communication applications
Course Outline
Introduction
➢ There are 5.0 billion active internet users worldwide.
➢ Online video makes up more than 82% of all internet traffic.
➢ The effect of video-capable devices on data traffic is increasingly pronounced.
➢ Video transmission requires a high data rate, and hence
❑ High transmission power
❑ Bandwidth
Popular Streaming Platforms
Platform     | Users
YouTube      | 2.0 billion
Netflix      | 150 million
Vimeo        | 130 million
Yahoo Screen | 120 million
DailyMotion  | 100 million

Typical video bit rates
UHD | 15-17 Mbps
HD  | 5-7 Mbps
SD  | 2 Mbps
Information Theory
➢ The purpose of a communication system is to carry information-bearing
baseband signals from one place to another over a communication channel.
Information Theory
➢ What is the ultimate limit of reliable communication over a noisy
channel? For example, how many bits can be sent in one second over a
telephone line?
➢ Information theory is a branch of probability theory that may be applied
to the study of communication systems. It deals with the mathematical
modelling and analysis of a communication system rather than with
physical sources and physical channels.
➢ Two important elements presented in this theory are the Binary Source
(BS) and the Binary Symmetric Channel (BSC).
➢ A binary source is a device that generates one of the two possible
symbols '0' and '1' at a given rate r, measured in symbols per second.
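To make these two elements concrete, here is a minimal simulation sketch, not from the slides; the crossover probability p and the seed are illustrative choices. A binary source feeds a binary symmetric channel, which flips each transmitted bit independently with probability p:

```python
import random

random.seed(1)            # illustrative seed, for reproducibility
p = 0.1                   # assumed crossover (bit-flip) probability

# Binary source: emits '0' or '1' equiprobably.
bits = [random.randint(0, 1) for _ in range(10_000)]

# Binary symmetric channel: flips each bit independently with probability p.
received = [b ^ (random.random() < p) for b in bits]

error_rate = sum(b != r for b, r in zip(bits, received)) / len(bits)
print(f"Empirical flip rate: {error_rate:.3f}")   # close to p = 0.1
```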
Information Theory
➢ These symbols are called bits (binary digits) and are generated randomly.
Channel Coding
➢ Source information, channel capacity, and channel coding are related by
one of Shannon's theorems, the channel coding theorem, which states: "If
the information rate of a given source does not exceed the capacity of a
given channel, then there exists a coding technique that makes possible
transmission through this unreliable channel with an arbitrarily low
error rate."
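In symbols (a compact restatement, with $R$ denoting the source information rate and $C$ the channel capacity):

```latex
% Channel coding theorem (Shannon): with information rate R and capacity C,
R < C \;\Longrightarrow\; \text{for every } \varepsilon > 0
\text{ there exists a code with error probability } P_e < \varepsilon .
```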
Information Theory and Coding
➢ There are three main concepts in this theory:
❑ The first is the definition of a quantity that can be a valid measurement of
information which should be consistent with a physical understanding of its
properties.
❑ The second concept deals with the relationship between the information and
the source that generates it. This concept will be referred to as the source
information. Compression and encryption are related to this concept.
❑ The third concept deals with the relationship between the information and
the unreliable channel through which it is going to be transmitted. This
concept leads to the definition of a very important parameter called the
channel capacity. Error-correction coding is closely related to this concept.
Digital Communication System
➢ Communication:
❑ Communication explicitly involves the transmission of information from one
point to another through a succession of processes.
➢ Basic elements of every communication system
❑ Transmitter
❑ Channel
❑ Receiver
Digital Communication System
➢ Source Encoder
❑ Sampling
❑ Quantization
❑ Coding
➢ Channel Encoder
❑ Controls the errors
❑ Detects and corrects errors
❑ Adds redundant bits to the source bitstream
➢ Modulator
❑ Converts the sequences into electrical signals for transmission
❑ Maps the binary sequences to signal waveforms
➢ Channel
❑ Physical medium
❑ Range of frequencies (channel bandwidth)
Digital Communication System
➢ Demodulator
❑ Removes the carrier signal and converts it back to a sequence of numbers
❑ Output consists of data bits + redundant bits
➢ Channel Decoder
❑ Removes channel-imposed errors
❑ Reconstructs the information signal using knowledge of the channel encoder
➢ Source Decoder
❑ Decodes the sequence using the encoding algorithm and regenerates the source data
Information
➢ In information theory, information is the intelligence, ideas, or
message.
➢ A message may be an electrical signal (voltage, current), speech/voice,
image, or video.
➢ The source of this message is called the information source.
❑ Watching a video on a YouTube channel:
o The information is the voice and video
o The information source is youtube.com
➢ In a communication system, information is transmitted from source to
destination.
➢ Information is tied to uncertainty (randomness or surprise):
➢ X = {x0, x1, …, xn}
➢ P = {p0, p1, …, pn}
➢ When p0 = 0 or p0 = 1, there is no uncertainty (the outcome is certain).
Information
➢ The measure of information is the information content of the message.
❑ Consider a source emitting independent messages
❑ M = {m0, m1, …}
❑ P = {p0, p1, …, pn}
❑ The amount of information is given by
o $I_k = \log_2\!\left(\frac{1}{P_k}\right)$
o $\log_2 x = \frac{\log_{10} x}{\log_{10} 2}$ (change of base)
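As a quick sketch (not part of the original slides; the helper name self_information is my own choice), this formula in code:

```python
from math import log2

def self_information(p: float) -> float:
    """Information content I = log2(1/p) of an event with probability p, in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return log2(1 / p)

print(self_information(1.0))   # 0.0   -> a certain event carries no information
print(self_information(0.25))  # 2.0   bits
print(self_information(0.75))  # ~0.415 bits
```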
Information
➢ Properties of Information
1. The more uncertain a message is, the more information it carries.
2. If the receiver already knows the message being transmitted, the
information is zero.
3. If I1 is the information of message m1 and I2 that of message m2, then the
combined information of the two (independent) messages is I1 + I2.
4. If there are $M = 2^N$ equally likely messages, then the amount of
information carried by each message is N bits.
Information
➢ Properties of Information
1. The more uncertain a message is, the more information it carries.
Proof
Let $P(m_1) = \frac{1}{4}$ and $P(m_2) = \frac{3}{4}$; then $m_1$ is more uncertain than
$m_2$ $(u_{m_1} > u_{m_2})$, so we expect $I_{m_1} > I_{m_2}$.

$I_{m_1} = \log_2\!\left(\frac{1}{1/4}\right) = \log_2 4 = 2\log_2 2 = 2$ bits

$I_{m_2} = \log_2\!\left(\frac{1}{3/4}\right) = \log_2\frac{4}{3} = \frac{\log_{10}(4/3)}{\log_{10} 2} \approx 0.415$ bits

Since $2 > 0.415$, the more uncertain message indeed carries more information.
Information
➢ Properties of Information
2. If the receiver already knows the message being transmitted, the
information is zero.
Proof
If the occurrence of an event is known with certainty, then $P_k = 1$:

$I_k = \log_2\!\left(\frac{1}{P_k}\right) = \log_2(1) = \frac{\log_{10}(1)}{\log_{10} 2} = 0$ bits

Hence the information is zero.
Information
➢ Properties of Information
3. If I1 is the information of message m1 and I2 that of message m2, then the
combined information of the two (independent) messages is I1 + I2.
Proof
Message $m_1$ occurs with probability $P_1$ and message $m_2$ with probability $P_2$:

$I_1 = \log_2\!\left(\frac{1}{P_1}\right)$ and $I_2 = \log_2\!\left(\frac{1}{P_2}\right)$

Since $m_1$ and $m_2$ are independent, the composite probability is $P_1 P_2$:

$I_{1,2} = \log_2\!\left(\frac{1}{P_1 P_2}\right) = \log_2\!\left(\frac{1}{P_1}\cdot\frac{1}{P_2}\right) = \log_2\!\left(\frac{1}{P_1}\right) + \log_2\!\left(\frac{1}{P_2}\right)$

$I_{1,2} = I_1 + I_2$
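A quick numeric check of this additivity (a sketch, not from the slides; the probabilities 1/4 and 3/4 are reused from the earlier example):

```python
from math import log2, isclose

p1, p2 = 0.25, 0.75
i1 = log2(1 / p1)              # information of m1 alone
i2 = log2(1 / p2)              # information of m2 alone
i_joint = log2(1 / (p1 * p2))  # information of both, assuming independence

print(i1 + i2, i_joint)        # both ~2.415 bits
assert isclose(i1 + i2, i_joint)
```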
Information
➢ Properties of Information
4. If there are $M = 2^N$ equally likely messages, then the amount of
information carried by each message is N bits.
Proof
With $M = 2^N$ equally likely messages, the probability of each message is
$P_k = \frac{1}{M}$, so

$I_k = \log_2\!\left(\frac{1}{P_k}\right) = \log_2(M) = \log_2(2^N) = N\log_2 2 = N$ bits

Hence proved.
Information
Problem: Calculate the amount of information if a) $P_k = \frac{1}{4}$ and b) $P_k = \frac{3}{4}$.
Solution
$I_k = \log_2\!\left(\frac{1}{P_k}\right)$

a) $I_k = \log_2\!\left(\frac{1}{1/4}\right) = \log_2 4 = 2\log_2 2 = 2$ bits

b) $I_k = \log_2\!\left(\frac{1}{3/4}\right) = \log_2\frac{4}{3} = \frac{\log_{10}(4/3)}{\log_{10} 2} \approx 0.415$ bits
Information
A card is selected at random from a deck of playing cards. If you have been
told that it is red in colour, how much information have you received? How
much more information do you need to completely specify the card?

Information received (the card is red):
Total cards = 52, red cards = 26, so $P(R) = \frac{26}{52} = \frac{1}{2}$
$I_R = \log_2\!\left(\frac{1}{1/2}\right) = \log_2 2 = 1$ bit

Information needed to completely specify the card:
Probability of a unique card $= \frac{1}{52}$
$I = \log_2\!\left(\frac{1}{1/52}\right) = \log_2 52 \approx 5.7$ bits

Extra information needed $= 5.7 - 1 = 4.7$ bits
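The same decomposition checked in code (a sketch, not from the slides): once the colour is known, 26 equally likely cards remain, so the extra information equals $\log_2 26$:

```python
from math import log2

info_red = log2(52 / 26)       # learning the colour: 1 bit
info_full = log2(52)           # specifying one card out of 52: ~5.70 bits
extra = info_full - info_red   # ~4.70 bits, equal to log2(26)

print(round(info_red, 2), round(info_full, 2), round(extra, 2), round(log2(26), 2))
```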
Information
➢ Consider a discrete memoryless source C that outputs two bits at a time.
This source comprises two binary sources A and B whose outputs are equally
likely to occur, each source contributing one bit. Suppose that the two
sources within source C are independent. What is the information content
of each output from source C?
➢ $P_A = P_B = \frac{1}{2}$
➢ Since A and B are independent, $P_C = P_A \times P_B = \frac{1}{4}$
➢ Sample space of C = {00, 01, 10, 11}, each outcome with $P_C = \frac{1}{4}$
➢ $I_C = \log_2\!\left(\frac{1}{1/4}\right) = \log_2(2^2) = 2$ bits
Next Lecture
➢ Entropy
➢ Relative Entropy
➢ Joint Entropy
➢ Conditional Entropy