Nikesh Bajaj: Information Theory and Coding
Overview
Introduction to Course
Information Theory and Coding: Syllabus & Books
Course Logistics
Prerequisite Knowledge

ECE533: Information Theory and Coding
Nikesh Bajaj
[email protected]
Digital Signal Processing, SECE
Lovely Professional University
Introduction to Course
Expectations and Aim
What are your Expectations?
What is this Course?
What are the Course's Expectations?
What comes to your mind when you hear the word Coding?

Communication System
Purpose: transmitting information to a destination through some medium or channel.

Typical Block Diagram of a Comm. System:
Information Source -> Tx -> Channel -> Rx -> Information User
(other sources also share the channel)

Examples:
FM Radio
Telephone
Mobile Comm.
Television
Storage Channel: CD, DVD, Magnetic Tape

Tx/Rx blocks: Channel CODING, Synchronization, Freq. Spreading, Multiple Access, Demultiplexing, Demodulation

Three goals of coding:
1-Efficiency
2-Reliability
3-Security
9/17/2014
Philosophy
The basic philosophy of the course:
Most of the ideas in modern coding are very intuitive and natural.
If someone had not invented them a few years ago, you could invent them yourself.

Syllabus Units
1. Information Theory
2. Source Coding
3. Foundation of Error Correcting Codes
4. Linear Block Codes
5. Cyclic Codes
6. Convolutional Codes
Syllabus: Overview of Subject
Part 1: Information Theory and Source Coding
Information Theory
Source Coding
Channel Capacity and Coding
Part 2: Channel Coding
Linear Block Codes
Cyclic Codes
Convolutional Codes
##Check updated IP

Books:
Text Book
Todd K. Moon, Error Correction Coding, Wiley, 1st Edition (2005). ISBN: 978-0471648000
Other Specific Books
Ranjan Bose, Information Theory, Coding and Cryptography, TMH Publication, 2005.
Cover, Thomas, and Joy Thomas, Elements of Information Theory, 2nd ed. New York, NY: Wiley-Interscience, 2006. ISBN: 9780471241959
Andre Neubauer, Jurgen Freudenberger, Volker Kuhn, Coding Theory: Algorithms, Architectures and Applications. John Wiley & Sons, Ltd.

Programming: MATLAB, Maple, Python, Euler, others
Course Logistics
HW2: 30, Programming Assignment + Test 2 (Open Book*)
HW3: 30, Design Problem (unique to each student)

Readings & Exercises
Readings: to go through (for better understanding)
Exercises: Numerical, Problems, Programming
QTT! Question to Think!
Challenge Problem
Open Problem of State-of-the-Art
Contest for Code Designing (*maybe)

Online Group/Forum for discussion and sharing:
https://groups.google.com/forum/?fromgroups#!forum/codynyk
http://tinyurl.com/CodyNyk

Update yourselves with:
IEEE Information Theory Society: http://www.itsoc.org
IEEE Communication Society: http://www.comsoc.org
Other online links:
http://www.usna.edu/Users/math/wdj/research.php
Outcomes
You will learn:
Source coding techniques
Channel coding techniques
How to implement these techniques in MATLAB, LabVIEW, or any other language
How to develop your own coding techniques
Research work

After the course, you:
Can analyze the performance of a Comm. System
Can understand the need for any Comm. System
Will be aware of the State-of-the-Art in the field
Will be able to contribute to the field
Typical Block Diagram of a Comm. System

Examples:
FM Radio
Telephone
Mobile Comm.
Television
Storage Channel: CD, DVD, Magnetic Tape

Transmitter chain:
Information Source -> Formatting -> Source Encoding -> Encryption -> Channel Encoding -> Multiplexing -> Modulation -> Freq. Spreading -> Multiple Access -> Tx -> Channel
(with Synchronization; other sources also share the channel)

Communication blocks:
Information Source/Sink
Tx/Rx
Channel
Formatting
Modulation/Demodulation
Coding/Decoding

Coding/Decoding:
Source Coding
Block Coding
Introduction: Information Theory
Claude Elwood Shannon (April 30, 1916 - February 24, 2001)
Father of Information Theory
1948: A Mathematical Theory of Communication, Bell Labs
University of Michigan, MIT
Other key figures: Sir Isaac Newton (4 January 1643 - 31 March 1727), Jean Baptiste Joseph Fourier (21 March 1768 - 16 May 1830)

Introduction
Communication theory deals with systems for transmitting information from one point to another.
Information theory was born with the discovery of the fundamental laws of data compression and transmission.
The fundamental limits on these key aspects have their roots in information theory (or the mathematical theory of communication):
The minimum rate at which a source can be compressed and still fully represent the source. Answer: the Entropy H.
The maximum rate at which reliable communication can take place over the channel. Answer: the Channel Capacity C.
Information Sources
Analog sources: speech, temperature variation, natural vision
Discrete sources: English alphabets, computer files/data, digitized voice or songs or video
Source output is Random. WHY?
DMS: Discrete Memoryless Source

Example: "A group of aliens arrived on the earth this morning."
An unlikely statement like this one carries a lot of information: information content and probability are inversely related.
Self Information
Information content and probability are inversely related.
The self-information of an event X = xi, having probability P(xi), is defined as:

I(xi) = log2( 1 / P(xi) ) = -log2 P(xi)  bits
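As a quick numerical check of this definition, here is a minimal Python sketch (the function name is mine, not from the slides):

```python
import math

def self_information(p):
    """Self-information I(x) = -log2 P(x), in bits, of an event with probability p."""
    return -math.log2(p)

# The rarer the event, the more information it carries.
certain = self_information(1.0)    # 0.0 bits: a sure event tells us nothing
coin = self_information(0.5)       # 1.0 bit: a fair coin flip
rare = self_information(0.125)     # 3.0 bits: a 1-in-8 event
```

Note how halving the probability of an event adds exactly one bit of self-information, which is why the base-2 logarithm is the natural choice here.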
Example (grades)
b) How much more information do you need to know your exact grade?
c) How much information does this statement give you, if you are quite sure that you will at least pass the same course examination?
d) If your friend is an average student (normally does not score an A or F grade), how much information does he need to know his grade exactly?

Mutual Information
Extreme cases:
If X and Y are independent, no information about x can be obtained from y, or vice versa.
If X and Y are dependent, then information about x can be determined from y.

The Mutual Information between events x = xi and y = yj is defined as:

I(xi; yj) = log2( P(xi | yj) / P(xi) )

Averaging over all pairs then gives the average mutual information I(X;Y).
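The two extreme cases above can be illustrated with a short Python sketch (the function name and the probabilities are my own illustration):

```python
import math

def event_mutual_info(p_x, p_x_given_y):
    """I(x; y) = log2( P(x|y) / P(x) ): bits of information y provides about x."""
    return math.log2(p_x_given_y / p_x)

# Independent: observing y leaves P(x) unchanged, so y carries no information about x.
independent = event_mutual_info(0.25, 0.25)   # 0.0 bits
# Dependent: observing y doubles the probability of x, yielding exactly 1 bit.
dependent = event_mutual_info(0.25, 0.5)      # 1.0 bit
```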
Entropy
The average self-information of a source X:

H(X) = sum_i P(xi) log2( 1 / P(xi) )  bits/symbol

This is called the Entropy of X.
The name comes from statistical mechanics, where entropy measures disorder.

For a DMS with K symbols, the entropy is bounded as 0 <= H(X) <= log2 K.

Entropy: Problems
Q. Calculate the average information in bits/character in English, assuming each letter is equally likely:

H = sum_{i=1}^{26} (1/26) log2 26 = 4.7 bits/char

Q. Now assume P = 0.10 for a, e, o, t; P = 0.07 for h, i, n, r, s; P = 0.02 for c, d, f, l, m, p, u, y; P = 0.01 for b, g, j, k, q, v, w, x, z. Entropy?

Q. Determine the entropy of an English letter using the letter frequencies given at Wikipedia (http://en.wikipedia.org/wiki/Letter_frequency).
Q. Consider the practical case: compute the entropy of any given image.
Q. Compute the entropy of any given signal.
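The two letter-entropy problems can be checked numerically; a sketch in Python, using the frequency table from the problem statement:

```python
import math

# Uniform model: 26 equally likely letters -> H = log2(26), about 4.70 bits/char.
H_uniform = sum((1 / 26) * math.log2(26) for _ in range(26))

# Frequency model from the table above (probabilities sum to 1).
freqs = ([0.10] * 4    # a, e, o, t
       + [0.07] * 5    # h, i, n, r, s
       + [0.02] * 8    # c, d, f, l, m, p, u, y
       + [0.01] * 9)   # b, g, j, k, q, v, w, x, z
H_freq = -sum(p * math.log2(p) for p in freqs)   # about 4.17 bits/char
```

The non-uniform model gives roughly 4.17 bits/char, below the 4.70-bit uniform value: unequal probabilities always reduce entropy below the log2 K upper bound.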
Conditional Entropy
The average conditional self-information, or the conditional entropy, is defined as:

H(X|Y) = sum_{i,j} P(xi, yj) log2( 1 / P(xi | yj) )

It is interpreted as the average amount of uncertainty in X after Y is observed.

Example: Binary Symmetric Channel (BSC)
Tx sends 0 or 1; Rx receives 0 or 1.
0 -> 0 with probability 1 - p0; 0 -> 1 with probability p0
1 -> 1 with probability 1 - p1; 1 -> 0 with probability p1
Entropy of X? Conditional entropy H(X|Y)?

Therefore, the information can be given as I(X;Y) = H(X) - H(X|Y). Prove it.

Summary
Consider a pair X and Y of discrete variables:
H(X): average information in observing X
H(Y): average information in observing Y
H(X,Y): average information in observing (X,Y)
H(X|Y): average information in observing X when Y is known
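A minimal sketch of computing H(X|Y) for the BSC example, assuming equally likely inputs and crossover probability p0 = p1 = 0.1 (these numbers are my own illustration, not from the slides):

```python
import math

def conditional_entropy(joint):
    """H(X|Y) = -sum_{x,y} P(x,y) log2 P(x|y), from a joint table joint[x][y]."""
    n_x, n_y = len(joint), len(joint[0])
    # Marginalize over x to get P(y).
    p_y = [sum(joint[x][y] for x in range(n_x)) for y in range(n_y)]
    h = 0.0
    for x in range(n_x):
        for y in range(n_y):
            if joint[x][y] > 0:
                h -= joint[x][y] * math.log2(joint[x][y] / p_y[y])  # P(x|y) = P(x,y)/P(y)
    return h

# BSC: equally likely inputs, crossover probability p0 = p1 = 0.1.
p = 0.1
joint = [[0.5 * (1 - p), 0.5 * p],    # X=0: received correctly / flipped
         [0.5 * p, 0.5 * (1 - p)]]    # X=1: flipped / received correctly
H_X_given_Y = conditional_entropy(joint)   # about 0.469 bits left about X after seeing Y
```

Since H(X) = 1 bit for equally likely inputs, observing Y here removes roughly half a bit of uncertainty.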
Mutual Information and Entropy
The mutual information relates to the entropies as:

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

Differential entropy: for a continuous random variable X with density f(x), self-information extends to the differential entropy h(X) = -integral of f(x) log2 f(x) dx.
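The identity above can be verified numerically via the chain rule H(X|Y) = H(X,Y) - H(Y); a sketch with a made-up 2x2 joint distribution:

```python
import math

def entropy(probs):
    """H = -sum p log2 p over the given probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(x, y); rows are x, columns are y.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
p_x = [sum(row) for row in joint]
p_y = [sum(joint[x][y] for x in range(2)) for y in range(2)]

H_X, H_Y = entropy(p_x), entropy(p_y)
H_XY = entropy([joint[x][y] for x in range(2) for y in range(2)])

# Chain rule: H(X|Y) = H(X,Y) - H(Y), and likewise H(Y|X) = H(X,Y) - H(X).
I_via_X = H_X - (H_XY - H_Y)   # I(X;Y) = H(X) - H(X|Y)
I_via_Y = H_Y - (H_XY - H_X)   # I(X;Y) = H(Y) - H(Y|X)
```

Both forms give the same positive mutual information, as the symmetric identity requires.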