Nikesh Bajaj: Information Theory and Coding

This document provides an overview and introduction to an information theory and coding course. It discusses the course expectations, aims, and prerequisites. The course will cover information theory, source coding, and the foundations of error correcting codes. It also briefly discusses communication systems and examples like radio, telephone, mobile communication, and television. Coding is introduced in the context of improving efficiency, reliability, and security in communication systems.


Information Theory and Coding
ECE533

Nikesh Bajaj
[email protected]
Digital Signal Processing, SECE
Lovely Professional University

Overview
Introduction to Course
Information Theory and Coding
Syllabus & Books
Course Logistics
Prerequisite Knowledge

Introduction to Course
Expectations and Aim
What are your expectations?
What is the course?
What are the course's expectations?
What comes to your mind when you hear the word "Coding"?

Communication System
Purpose: transmitting information to a destination through some medium or channel.
Typical block diagram of a communication system?
Information Source -> Tx -> Channel -> Rx -> Information User
Examples: FM radio, telephone, mobile communication, television.
Storage channel: CD, DVD, magnetic tape.


Typical block diagram of a digital communication system (transmit side):
Information Source (and other sources) -> Formatting -> Source Encoding -> Encryption -> Channel Encoding -> Multiplexing -> Modulation -> Frequency Spreading -> Multiple Access -> Tx -> Channel

Receive side (mirror image):
Channel -> Rx -> Multiple Access -> Frequency Despreading -> Demodulation -> Demultiplexing -> Channel Decoding -> Decryption -> Source Decoding -> Formatting -> Information Sink

Synchronization runs across both sides.

CODING appears throughout this chain and serves three goals:
1. Efficiency
2. Reliability
3. Security


Philosophy
The basic philosophy of the course is:
Most of the ideas in modern coding are very intuitive and natural.
If someone had not invented them a few years ago, you could invent them yourself.

Syllabus
Units:
1. Information Theory
2. Source Coding
3. Foundations of Error Correcting Codes
4. Linear Block Codes
5. Cyclic Codes
6. Convolutional Codes


Syllabus: Overview of Subject
Part 1: Information Theory and Source Coding
Information Theory
Source Coding
Channel Capacity and Coding
Part 2: Channel Coding
Linear Block Codes
Cyclic Codes
Convolutional Codes
##Check updated IP

Books
Text Book:
Error Correction Coding by Todd K. Moon, Wiley, India, 1st Edition (2005). ISBN: 978-0471648000
Other Specific Books:
Ranjan Bose, Information Theory, Coding and Cryptography, TMH Publication, 2005.


Books (continued)
Other Specific Books:
Richard E. Blahut, Algebraic Codes for Data Transmission, Cambridge University Press, 2003.
Cover, Thomas M., and Joy A. Thomas, Elements of Information Theory, 2nd ed., New York, NY: Wiley-Interscience, 2006. ISBN: 9780471241959
Andre Neubauer, Jurgen Freudenberger, and Volker Kuhn, Coding Theory: Algorithms, Architectures and Applications, John Wiley & Sons, Ltd.

Prerequisite Knowledge
Communication systems
Mathematics & probability (strong conceptual understanding)
Programming: MATLAB, Maple, Python, Euler, and others


Course Logistics
Assignments & Homework
HW1 (30 marks): Assignment (not to be submitted) + Test 1
HW2 (30 marks): Programming assignment + Test 2 (open book*)
HW3 (30 marks): Design problem (unique to each student)

Readings & Exercises
Readings: to go through (for better understanding)
Exercises: numericals, problems, programming
QTT! Question to Think!
Challenge Problems:
Open problems of the state of the art
Contest for code designing (*maybe)

What else? (Learn with fun!)
Online group/forum for discussion and sharing: tinyurl.com/CodyNyk
Update yourselves with:
IEEE Information Theory Society: http://www.itsoc.org
IEEE Communication Society: http://www.comsoc.org
Google Group/Forum: https://groups.google.com/forum/?fromgroups#!forum/codynyk or http://tinyurl.com/CodyNyk
Other online links: http://www.usna.edu/Users/math/wdj/research.php

Aim of Subject
Strong understanding of various coding techniques:
Source coding techniques
Channel coding techniques
Able to implement these techniques in MATLAB, LabVIEW, or any other language
Develop your own coding techniques; research work

So What You Will Be
Able to reason about coding in communications
Able to analyze the performance of a communication system
Able to understand the needs of any communication system
Aware of the state of the art in the same field
Able to contribute to the same field


PART I
Information Theory
Source Coding

Communication System (recap): Information Source -> Tx -> Channel -> Rx -> Information User; examples include FM radio, telephone, mobile communication, television, and storage channels (CD, DVD, magnetic tape).

Communication Blocks
Information source/sink
Tx/Rx
Channel
Formatting
Modulation/Demodulation
Coding/Decoding:
Source coding
Channel coding
Multiplexing/Demultiplexing
Multiple access
Encryption/Decryption
Equalization
Synchronization
Coding/Decoding
Source Coding:
Block coding
Variable-length coding
Lossless compression
Lossy compression
Predictive coding
Channel Coding (error correction codes):
Waveform: M-ary signaling, orthogonal signaling, trellis-coded modulation
Structured sequences: block, convolutional, turbo

Introduction: Information Theory


Guess..?
Claude Elwood Shannon (April 30, 1916 - February 24, 2001)
Sir Isaac Newton (4 January 1643 - 31 March 1727)
Jean Baptiste Joseph Fourier (21 March 1768 - 16 May 1830)

Claude E. Shannon
(April 30, 1916 - February 24, 2001)
Father of Information Theory
1948: "A Mathematical Theory of Communication", Bell System Technical Journal
University of Michigan, MIT


Introduction
Communication theory deals with systems for transmitting information from one point to another.
Key issues in evaluating the performance of a digital communication system:
Efficiency with which information from a given source can be transmitted.
Rate at which information can be transmitted reliably over a noisy channel.
The fundamental limits on these key aspects have their roots in information theory (the mathematical theory of communication).

Introduction
Information theory was born with the discovery of the fundamental laws of data compression and transmission.
Information theory deals only with mathematical modeling and analysis of communication systems, rather than with physical sources and physical channels.
Purpose: given an information source and a noisy channel, information theory provides limits on:
The minimum number of bits per symbol required to fully represent the source. Answer: the entropy H.
The minimum rate at which reliable communication can take place over the channel. Answer: the channel capacity C.
Shannon's Considerations
In the early days it was thought that increasing the transmission rate over a channel increases the error rate.
Shannon showed that this is not true as long as the rate is below the channel capacity.
Shannon further showed that random processes have an irreducible complexity below which they cannot be compressed.

Information
Syntactic
Semantic
Pragmatic


Information Source
Analog sources: speech, temperature variation, natural vision
Discrete sources: English alphabet, computer files/data, digitized voice, songs, or video
Source output is random. WHY?
DMS: Discrete Memoryless Source

Uncertainty and Information
Consider these news items:
"Tomorrow, the sun will rise in the east."
"There will be a thunderstorm in the afternoon."
"A group of aliens arrived on the earth this morning."
Information content and probability are inversely related.


Self Information
Information content and probability are inversely related.
The self-information of an event X = x_i, having probability P(x_i), is defined as:

I(x_i) = log2(1 / P(x_i)) = -log2 P(x_i)  bits

This means that less probable events need more bits.
Units:
base 2: bits
base e: nats
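A minimal Python sketch of this definition (the helper name self_information is mine, not from the slides; the course itself uses MATLAB/LabVIEW for implementation):

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x) = -log2 P(x), in bits."""
    return -math.log2(p)

print(self_information(0.5))       # fair coin outcome: exactly 1 bit
print(self_information(0.5 ** 8))  # block of 8 fair coin flips: 8.0 bits
print(self_information(0.999))     # near-certain event: ~0.0014 bits
```

This also answers the fair-coin question on the next slide: one fair coin output carries 1 bit, and a block of m binary digits carries m bits, which is why base 2 is the natural choice.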
Self Information
Why is the base 2?
Consider a fair coin, giving output HEAD or TAIL. How many bits are required to represent the output?
Consider the same for a block of m binary digits.

Properties of self information
1. I(x_m) > I(x_n), if P_m < P_n
2. I(x_k) = 0, if P_k = 1
3. I(x_k) >= 0, since 0 <= P_k <= 1
4. For two independent messages, the total information is the sum of each:
For x = x1 x2 with P = P1 P2:

I(x) = log2(1/P) = log2(1/(P1 P2)) = log2(1/P1) + log2(1/P2) = I(x1) + I(x2)


Grade Problem
Consider that your grade in one course examination is equally likely to be any of the 5 grades A, B, C, D, F. Your course teacher tells you that your grade is not F. Compute:
a) How much information does this statement give you?
b) How much more information do you need to know your exact grade?
c) How much information does this statement give you if you are quite sure that you will at least pass the same course examination?
d) If your friend is an average student (normally scores neither A nor F), how much information does he need to know his grade exactly?
(A worked sketch follows the next slide.)

Mutual Information
Two random variables, X and Y, with outcomes x_i, i = 1, 2, ..., n and y_j, j = 1, 2, ..., m.
Information about x from y: mutual information.
Extreme cases:
If X and Y are independent: no information about x from y, or vice versa.
If X and Y are dependent: the information of x can be determined from y.
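A worked sketch of the Grade Problem under one reasonable reading (all listed grades equally likely; for part (d) the friend's grade is taken as uniform over {B, C, D}):

$$
\begin{aligned}
\text{(a)}\;& P(\text{not } F) = \tfrac{4}{5}, \quad I = \log_2 \tfrac{5}{4} \approx 0.322 \text{ bits}\\
\text{(b)}\;& \text{four grades remain equally likely: } \log_2 4 = 2 \text{ bits}\\
\text{(c)}\;& P(\text{not } F) = 1, \quad I = \log_2 1 = 0 \text{ bits}\\
\text{(d)}\;& \log_2 3 \approx 1.585 \text{ bits}
\end{aligned}
$$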


Mutual Information
The mutual information is defined as:

I(x_i; y_j) = log2 [ P(x_i | y_j) / P(x_i) ]

Let's consider the same two extremes:
If X and Y are independent, P(x_i | y_j) = P(x_i), so I(x_i; y_j) = 0.
If x_i is fully determined by y_j, P(x_i | y_j) = 1, so I(x_i; y_j) = log2(1 / P(x_i)) = I(x_i).
The information of x from y is identical to the information of y from x:

I(x_i; y_j) = I(y_j; x_i)
Party Problem
----

Conditional Self Information
The conditional self-information of x when y is given:

I(x_i | y_j) = log2 [ 1 / P(x_i | y_j) ]

Then the mutual information can be written as:

I(x_i; y_j) = I(x_i) - I(x_i | y_j)


Average Mutual Information
Averaging over all pairs of outcomes:

I(X; Y) = Σ_i Σ_j P(x_i, y_j) log2 [ P(x_i, y_j) / (P(x_i) P(y_j)) ]

Average Self Information

H(X) = Σ_i P(x_i) log2 [ 1 / P(x_i) ]

This is called the entropy of X.
It is analogous to entropy in statistical mechanics: a measure of disorder.
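A minimal Python sketch of both averages on a made-up joint PMF (the numbers are illustrative only, not from the slides):

```python
import math

# Hypothetical joint PMF P(x, y) for two binary variables (rows x, columns y).
joint = [[0.30, 0.10],
         [0.20, 0.40]]

px = [sum(row) for row in joint]        # marginal P(x) = [0.4, 0.6]
py = [sum(col) for col in zip(*joint)]  # marginal P(y) = [0.5, 0.5]

# Average self-information (entropy): H(X) = sum_i P(x_i) log2 1/P(x_i)
H_X = -sum(p * math.log2(p) for p in px if p > 0)

# Average mutual information: I(X;Y) = sum_ij P(x,y) log2 P(x,y)/(P(x)P(y))
I_XY = sum(pxy * math.log2(pxy / (px[i] * py[j]))
           for i, row in enumerate(joint)
           for j, pxy in enumerate(row) if pxy > 0)

print(f"H(X) = {H_X:.4f} bits, I(X;Y) = {I_XY:.4f} bits")  # 0.9710, 0.1245
```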

Entropy: Problems
Calculate the average information in bits/character in English assuming each letter is equally likely:

H = Σ_{i=1}^{26} (1/26) log2 26 = log2 26 ≈ 4.7 bits/char

Q. Determine the entropy of an English letter using the letter frequencies given at Wikipedia (http://en.wikipedia.org/wiki/Letter_frequency), approximated as:
P = 0.10 for a, e, o, t
P = 0.07 for h, i, n, r, s
P = 0.02 for c, d, f, l, m, p, u, y
P = 0.01 for b, g, j, k, q, v, w, x, z
Q. Entropy?
Q. Consider practical cases: compute the entropy of any given image; compute the entropy of any given signal.
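A short Python check of the frequency-based exercise, using the probabilities exactly as given on the slide:

```python
import math

# Approximate letter probabilities from the slide (they sum to 1.0).
groups = [(0.10, "aeot"), (0.07, "hinrs"),
          (0.02, "cdflmpuy"), (0.01, "bgjkqvwxz")]

H = -sum(p * math.log2(p) for p, letters in groups for _ in letters)
print(f"H = {H:.2f} bits/letter")  # ~4.17, below log2(26) = 4.70 for uniform
```

The non-uniform distribution gives about 4.17 bits/letter, below the 4.7 bits/char of the equally-likely case, as the entropy bound predicts.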


Entropy
If a source has n different letters and each letter has the same probability, then the entropy is:

H(X) = log2 n

Entropy of a random binary source, with P(0) = q and P(1) = 1 - q:

H(X) = -q log2 q - (1 - q) log2(1 - q)

Properties of Entropy
For a DMS, the entropy is bounded as:

0 <= H(X) <= log2 N

where N is the total number of symbols of the source.
The lower bound on entropy corresponds to no uncertainty.
The upper bound corresponds to maximum uncertainty:

H(X) = log2 N  if  P_k = 1/N for all k
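A Python sketch of the binary-source entropy, assuming the slide means P(0) = q and P(1) = 1 - q; it also illustrates both bounds:

```python
import math

def binary_entropy(q: float) -> float:
    """H(q) = -q log2 q - (1-q) log2(1-q) for a binary source with P(0) = q."""
    if q in (0.0, 1.0):
        return 0.0  # no uncertainty at the endpoints (lower bound)
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

for q in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"q = {q:.1f}: H = {binary_entropy(q):.3f} bits")
# Maximum of 1 bit at q = 0.5, consistent with H(X) <= log2 N for N = 2.
```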


Conditional Entropy
The average conditional self-information, or conditional entropy, is defined as:

H(X | Y) = Σ_i Σ_j P(x_i, y_j) log2 [ 1 / P(x_i | y_j) ]

It is interpreted as the average amount of uncertainty in X after Y is observed.
Therefore, the information can be given as:

I(X; Y) = H(X) - H(X | Y)   (Prove it)

Since I(X; Y) >= 0, therefore H(X) >= H(X | Y).

BSC transition diagram:
Tx 0 -> Rx 0 with probability 1 - p0
Tx 0 -> Rx 1 with probability p0
Tx 1 -> Rx 0 with probability p1
Tx 1 -> Rx 1 with probability 1 - p1
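A Python sketch of these relations on the BSC above, with illustrative values p0 = p1 = 0.1 and an equiprobable input (my choices, not the slide's):

```python
import math

def H(probs):
    """Entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative numbers: equiprobable input, crossover p0 = p1 = 0.1.
p_x = [0.5, 0.5]
p = 0.1
trans = [[1 - p, p],   # P(y | x = 0)
         [p, 1 - p]]   # P(y | x = 1)

joint = [[p_x[i] * trans[i][j] for j in range(2)] for i in range(2)]
p_y = [sum(joint[i][j] for i in range(2)) for j in range(2)]

# H(X|Y) = sum_{i,j} P(x_i, y_j) log2 1/P(x_i | y_j), with P(x|y) = P(x,y)/P(y)
H_X_given_Y = sum(joint[i][j] * math.log2(p_y[j] / joint[i][j])
                  for i in range(2) for j in range(2) if joint[i][j] > 0)

I_XY = H(p_x) - H_X_given_Y
print(f"H(X) = {H(p_x):.3f}, H(X|Y) = {H_X_given_Y:.3f}, I(X;Y) = {I_XY:.3f}")
# H(X) >= H(X|Y) as claimed; here I(X;Y) = 1 - H([0.1, 0.9]) ~ 0.531 bits.
```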

Prove:

Example
Entropy of X?


Example
Conditional entropy H(X | Y)?

Summary
Consider a pair X and Y of discrete variables:
H(X): average information in observing X
H(Y): average information in observing Y
H(X, Y): average information in observing (X, Y)
H(X | Y): average information in observing X when Y is known
H(Y | X): average information in observing Y when X is known
I(X; Y): average mutual information between X and Y


Information Measures for Continuous Random Variables
If X and Y are random variables with joint PDF p(x, y) and marginal PDFs p(x) and p(y), the average mutual information between X and Y is defined as:

I(X; Y) = ∫∫ p(x, y) log2 [ p(x, y) / (p(x) p(y)) ] dx dy

The self-information, or differential entropy, of the random variable X is:

H(X) = -∫ p(x) log2 p(x) dx

The average conditional entropy of the random variable X given Y is:

H(X | Y) = -∫∫ p(x, y) log2 p(x | y) dx dy

Also, the average mutual information between X and Y can be given as:

I(X; Y) = H(X) - H(X / Y) = H(Y) - H(Y / X)
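As a concrete instance (a standard closed form, not from the slides), the differential entropy of a Gaussian with standard deviation sigma, sketched in Python:

```python
import math

def gaussian_diff_entropy(sigma: float) -> float:
    """Differential entropy of a Gaussian: 0.5 * log2(2*pi*e*sigma^2) bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

for sigma in (0.1, 1.0, 10.0):
    print(f"sigma = {sigma:5.1f}: H(X) = {gaussian_diff_entropy(sigma):+.3f} bits")
# Unlike discrete entropy, differential entropy can be negative (small sigma).
```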
