Lec-1 Introduction. Information Theory

The document outlines the course ECE533 on Information Theory and Coding, detailing its objectives, syllabus, and prerequisites. It covers key topics such as source coding, channel coding, and various coding techniques, emphasizing the importance of understanding communication systems. The course aims to equip students with the ability to analyze and develop coding techniques for effective communication.

Uploaded by praveen.yadav

1/12/2013

Information Theory and Coding
ECE533

Overview
- Introduction to Course
- Advance Coding Theory Syllabus & Books
- Course Logistics
- Prerequisite Knowledge
- Information theory

Nikesh Bajaj
[email protected]
Asst. Prof., ECE Dept.
Lovely Professional University

Introduction to Course
- Expectations and Aim
- What are your Expectations?
- What is the Course?
- What comes in your mind when you listen to the word "Information"?

Communication System
Purpose: Transmitting the information to the destination through some media or channel.
Typical Block Diagram of Comm. System ???
- Information Source -> Tx -> Channel -> Rx -> User
Examples: FM Radio, Telephone, Mobile Comm., Television
Storage Channel: CD, DVD, Magnetic Tape

Digital Communication System: Block Diagram
Transmit chain: Information Source -> Formatting -> Source Encoding -> Encryption -> Channel Encoding -> Multiplexing -> Modulation -> Freq. Spreading -> Multiple Access -> Tx -> Channel
(Other sources feed the multiplexer; synchronization runs alongside the chain.)
Receive chain: Channel -> Rx -> Multiple Access -> Freq. Despreading -> Demodulation -> Demultiplexing -> Channel Decoding -> Decryption -> Source Decoding -> Formatting -> Information Sink

CODING addresses:
1. Efficiency
2. Reliability
3. Security
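The reliability role of coding in this chain can be illustrated with a toy end-to-end sketch. This is an assumption-laden example, not from the slides: it uses a 3-repetition code (a specific channel code chosen only for illustration) over a binary symmetric channel with crossover probability 0.1.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit with crossover probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def repeat_encode(bits, n=3):
    """Channel encoder (illustrative): repeat every bit n times (rate 1/n)."""
    return [b for b in bits for _ in range(n)]

def repeat_decode(coded, n=3):
    """Channel decoder: majority vote over each group of n received bits."""
    return [int(sum(coded[i:i + n]) > n // 2) for i in range(0, len(coded), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10000)]

# Uncoded transmission: bit error rate is about p = 0.1
uncoded_errors = sum(a != b for a, b in zip(message, bsc(message, 0.1, rng)))

# Coded transmission: encode -> channel -> decode
received = repeat_decode(bsc(repeat_encode(message), 0.1, rng))
coded_errors = sum(a != b for a, b in zip(message, received))

print(uncoded_errors / len(message))  # about 0.1
print(coded_errors / len(message))    # about 3p^2 - 2p^3 = 0.028
```

The repetition code buys reliability at the cost of efficiency (three channel bits per message bit), which is exactly the trade-off the efficiency/reliability/security triad on this slide refers to.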


Philosophy
The basic philosophy of the course:
- Most of the ideas in modern coding are very intuitive and natural.
- If someone had not invented them a few years ago, you could invent them yourself.

Syllabus
Units:
1. Information Theory and Source Coding
2. Foundation of Error Correcting Codes
3. Groups and Vector Space
4. Linear Block Codes
5. Cyclic Codes
6. Number Theory and Algebra
7. BCH and Reed-Solomon Codes
8. Convolutional Codes
9. Trellis and Turbo Codes
10. Bounds on Codes and Other Codes

##Check updated IP

Syllabus: Overview of Subject
Part 1: Information Theory and Source Coding
- Source Coding
- Channel Capacity and Coding
Part 2: Channel Coding-I
- Linear Block Codes
- Cyclic Codes
Part 3: Channel Coding-II
- BCH Codes
- Convolutional Codes
- Trellis Coded Modulation

Books:
Text Book
- Error Correction Codes by Todd K. Moon, Wiley Blackwell, India, 1st Edition (2005). ISBN: 978-0471648000
Other Specific Books
- Ranjan Bose, Information Theory, Coding and Cryptography, TMH Publication, 2005.

Books..
Other Specific Books
- Richard E. Blahut, Algebraic Codes for Data Transmission, Cambridge University Press, 2003.
- Cover, Thomas, and Joy Thomas. Elements of Information Theory. 2nd ed. New York, NY: Wiley-Interscience, 2006. ISBN: 9780471241959
- Andre Neubauer, Jurgen Freudenberger, Volker Kuhn, Coding Theory: Algorithms, Architectures and Applications. John Wiley & Sons, Ltd.

Prerequisite Knowledge
- Communication system
- Mathematics & Probability
- Strong conceptual understanding
- Programming: MATLAB, Maple, Python, Euler, others


Course Logistics

Assignments & Homework
- HW1 (30): Assignment (not to submit) + Test 1
- HW2 (30): Programming Assignment + Test 2 (Open Book)
- HW3 (30): Design Problem (unique to each student)

Update yourselves with
- Online Group/Forum for discussion and sharing: tinyurl.com/CodyNyk
  https://groups.google.com/forum/?fromgroups#!forum/codynyk
  https://tinyurl.com/CodyNyk
- Readings & Exercises
  Readings: to go through (for better understanding)
  Exercises: numerical problems, programming
- QTT! Question to Think!
- Challenge Problem
- Contest
- IEEE Information Theory Society: https://www.itsoc.org
- IEEE Communication Society: https://www.comsoc.org
- Other online links: https://www.usna.edu/Users/math/wdj/research.php

Aim of Subject
- Strong understanding of various coding:
  - Source coding techniques
  - Channel coding techniques
- Able to perform these techniques in MATLAB or LabVIEW or any other language
- Develop your own coding techniques
- Research work

Outcomes:
- Reason about coding in communications
- Can analyze performance of a Comm. Sys.
- Can understand the need of any Comm. Sys.
- Will be able to contribute in the same field


PART-I
- Information Theory
- Source Coding

Communication System (recap)
Purpose: Transmitting the information to the destination through some media or channel.
Typical Block Diagram of Comm. System ???
- Information Source -> Tx -> Channel -> Rx -> User
Examples: FM Radio, Telephone, Mobile Comm., Television
Storage Channel: CD, DVD, Magnetic Tape

Communication blocks
- Information Source/Sink
- Tx/Rx
- Channel
- Formatting
- Modulation/Demodulation
- Coding/Decoding: Source Coding, Channel Coding
- Multiplexing/Demultiplexing
- Multiple Access
- Encryption/Decryption
- Equalization
- Synchronization

Coding/Decoding
- Source Coding
  - Block Coding
  - Variable Length Coding
  - Lossless Compression
  - Lossy Compression
  - Predictive Coding
- Channel Coding
  - Error Correction Codes
    - Waveform: M-ary signaling, Orthogonal, Trellis Coded Modulation
    - Structured sequences: Block, Convolutional, Turbo

Introduction: Information Theory


Claude E. Shannon (April 30, 1916 - February 24, 2001)
Father of Information Theory
1948: University of Michigan, MIT
[Portraits on slide: Claude Elwood Shannon (April 30, 1916 - February 24, 2001); Sir Isaac Newton (4 January 1643 - 31 March 1727); Jean Baptiste Joseph Fourier (21 March 1768 - 16 May 1830)]

Introduction
Communication theory deals with systems for transmitting information from one point to another.
Key issues in evaluating performance of a digital communication system:
- Efficiency with which information from a given source can be transmitted.
- Rate at which information can be transmitted reliably over a noisy channel.
The fundamental limits on these key aspects have their roots in information theory (or the mathematical theory of communication).

Introduction
Information theory was born with the discovery of the fundamental laws of data compression and transmission.
Information theory deals only with mathematical modeling and analysis of communication systems, rather than with physical sources and physical channels.
Purpose: given an information source and a noisy channel, information theory provides limits on:
- What is the minimum number of bits per symbol required to fully represent the source? Answer: the Entropy H.
- The minimum rate at which reliable communication can take place over the channel. Answer: Channel Capacity C.

Information
- In early days it was thought that increasing transmission rate over a channel increases the error rate.
- Shannon showed that this is not true as long as the rate is below Channel Capacity.
- Shannon further showed that random processes have an irreducible complexity below which they cannot be compressed.

Aspects of information:
- Syntactic
- Semantic
- Pragmatic


Information Source
- Analog Sources: speech, temperature variation, natural vision
- Discrete Sources: English alphabets, computer files/data, digitized voice or songs or video
- Source output is random. WHY?
- DMS: Discrete Memoryless Source

Uncertainty and Information
Consider these news items:
- Tomorrow, the sun will rise from the east.
- There will be a thunderstorm in the afternoon.
- A group of aliens arrived on the earth this morning.
Information content and probability are inversely related.

Self Information
Information content and probability are inversely related.
The self-information of an event X = x_i, having probability P(x_i), is defined as:

I(x_i) = log2( 1 / P(x_i) ) = -log2 P(x_i)  bits

This means that less probable events need more bits.
Unit:
- base 2: bits
- base e: nats
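The definition above can be checked numerically. A minimal sketch (the event probabilities below are made-up illustrative values, not from the slides):

```python
import math

def self_information(p, base=2):
    """Self-information I(x) = -log(P(x)); base 2 gives bits, base e gives nats."""
    return -math.log(p, base)

# Less probable events carry more information
print(self_information(0.5))    # 1.0 bit (fair coin flip)
print(self_information(0.125))  # about 3 bits
print(self_information(0.5, base=math.e))  # about 0.693 nats
```

A certain event (p = 1) yields zero self-information, consistent with the properties listed on the next slide.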

Self Information
Why base is 2?
- Consider a fair coin, giving output as HEAD or TAIL. How many bits are required to represent the output?
- Consider the same for a block of m binary digits.

Properties of self information
- I(x_m) > I(x_n), if P_m < P_n
- I(x_k) = 0, if P_k = 1
- I(x_k) >= 0, since 0 <= P_k <= 1
- For two independent messages x_1 and x_2 with P = P_1 P_2, the total information is the sum of each:

I(x) = log2( 1 / (P_1 P_2) ) = log2( 1 / P_1 ) + log2( 1 / P_2 ) = I(x_1) + I(x_2)
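The additivity property and the base-2 question above can be verified directly. A short sketch (p1 and p2 are made-up probabilities for illustration):

```python
import math

def info_bits(p):
    """Self-information in bits of an event with probability p."""
    return -math.log2(p)

# Two independent messages: I(x1 x2) = I(x1) + I(x2)
p1, p2 = 0.25, 0.1
assert abs(info_bits(p1 * p2) - (info_bits(p1) + info_bits(p2))) < 1e-9

# A block of m fair binary digits has probability (1/2)^m,
# so it carries exactly m bits -- the motivation for base 2.
m = 8
print(info_bits(0.5 ** m))  # 8.0
```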


Mutual Information
Two random variables X, Y, with outcomes x_i, i = 1, 2, ..., n and y_j, j = 1, 2, ..., m.
Information about x from y: mutual information.
The mutual information is defined as

I(x_i ; y_j) = log2( P(x_i | y_j) / P(x_i) )

Let's consider the two extreme cases:
- If X and Y are independent, there is no information about x from y, or vice versa.
- If X and Y are dependent, then the information of x can be determined from y.
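The two extremes can be checked with the pointwise definition I(x; y) = log2( P(x|y) / P(x) ). A sketch with made-up probabilities:

```python
import math

def pointwise_mi(p_x_given_y, p_x):
    """I(x; y) = log2( P(x|y) / P(x) ): information about x gained from y."""
    return math.log2(p_x_given_y / p_x)

# Independent: P(x|y) = P(x), so observing y tells us nothing about x
print(pointwise_mi(0.25, 0.25))  # 0.0
# Fully dependent: P(x|y) = 1, so y reveals x completely: I = -log2 P(x) = I(x)
print(pointwise_mi(1.0, 0.25))   # 2.0
```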

Mutual Information
Mutual information is symmetric:

I(x_i ; y_j) = I(y_j ; x_i)

The information of x from y is identical to the information of y from x.

Conditional Self Information
The conditional self-information of x when y is given:

I(x_i | y_j) = log2( 1 / P(x_i | y_j) )

Then the mutual information can be written as

I(x_i ; y_j) = I(x_i) - I(x_i | y_j)

Average Mutual Information

I(X ; Y) = sum_i sum_j P(x_i, y_j) log2( P(x_i | y_j) / P(x_i) )

Average Self Information

H(X) = sum_i P(x_i) log2( 1 / P(x_i) )

This is called the Entropy of X.
(Analogy with statistical mechanics: entropy measures disorder.)
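Entropy as the expected self-information can be written as a short function. A sketch (the distributions below are illustrative values):

```python
import math

def entropy(probs):
    """H(X) = sum p_i * log2(1/p_i): average self-information in bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 -- fair coin: maximum binary uncertainty
print(entropy([1.0]))        # 0.0 -- certain outcome: no uncertainty
print(entropy([0.25] * 4))   # 2.0 -- four equally likely symbols
```

The `if p > 0` guard follows the convention 0 * log2(1/0) = 0, so impossible symbols contribute nothing.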


Entropy: Problems
Q. Calculate the average information in bits/character in English, assuming each letter is equally likely:

H = sum_{i=1}^{26} (1/26) log2(26) = 4.7 bits/char

Entropy
If a source has n different letters and each letter has the same probability, then the entropy is log2 n.

Q. Consider a practical case:
P = 0.10 for a, e, o, t
P = 0.07 for h, i, n, r, s
P = 0.02 for c, d, f, l, m, p, u, y
P = 0.01 for b, g, j, k, q, v, w, x, z
Entropy?

Q. Entropy of a random binary source, if P(0) = P(1) = q?
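Both letter-frequency problems above can be checked numerically. A sketch reusing the standard entropy formula, with the probabilities taken from the slide:

```python
import math

def entropy(probs):
    """H = sum p * log2(1/p), in bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Equally likely English letters: H = log2(26)
uniform = entropy([1 / 26] * 26)

# The "practical" letter distribution from the slide (probabilities sum to 1)
practical = entropy([0.10] * 4 + [0.07] * 5 + [0.02] * 8 + [0.01] * 9)

print(round(uniform, 2))    # 4.7
print(round(practical, 2))  # about 4.17 -- realistic statistics lower the entropy

# Random binary source with P(0) = P(1) = q = 1/2
print(entropy([0.5, 0.5]))  # 1.0
```

The drop from 4.7 to roughly 4.17 bits/char is the compression opportunity that source coding exploits.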

Properties of Entropy
For a DMS, the entropy is bounded as

0 <= H(X) <= log2 N

where N is the total number of symbols of the source.
- The lower bound on entropy corresponds to no uncertainty.
- The upper bound corresponds to maximum uncertainty.

H(X) = log2 N  if  P_k = 1/N for all k

BSC (Binary Symmetric Channel): Tx sends 0 or 1; each bit reaches Rx correctly with probability 1 - p and is flipped with crossover probability p.
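The bounds 0 <= H(X) <= log2 N can be illustrated with the binary entropy function (a sketch; the value q = 0.1 is an arbitrary illustrative skew):

```python
import math

def binary_entropy(q):
    """H(q) = q*log2(1/q) + (1-q)*log2(1/(1-q)) for a binary source."""
    return sum(p * math.log2(1 / p) for p in (q, 1 - q) if p > 0)

# Lower bound: no uncertainty when one symbol is certain
print(binary_entropy(0.0))  # 0.0
# Upper bound log2(2) = 1 is reached only for the uniform case q = 1/2
print(binary_entropy(0.5))  # 1.0
# Any skew pulls the entropy below the upper bound
print(binary_entropy(0.1))  # about 0.469
```

This same function reappears for the BSC sketched on the slide: the channel's capacity is C = 1 - H(p), where p is the crossover probability.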

Conditional Entropy
The average conditional self-information, or the conditional entropy, is defined as

H(X|Y) = sum_j sum_i P(x_i, y_j) log2( 1 / P(x_i | y_j) )

It is interpreted as the average amount of uncertainty in X after Y is observed.
Therefore, the average mutual information can be given as

I(X ; Y) = H(X) - H(X|Y)

Prove it.

Prove: H(X) >= H(X|Y), since I(X ; Y) >= 0.
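The identities I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) and the non-negativity claim can be verified on a small joint distribution. A sketch (the 2x2 joint table is made up for illustration):

```python
import math

def H(probs):
    """Entropy in bits of a probability list."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Made-up joint distribution P(x, y) over 2x2 outcomes
joint = [[0.3, 0.2],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal P(x)
py = [sum(col) for col in zip(*joint)]  # marginal P(y)

# H(X|Y) = sum_{x,y} P(x,y) * log2( 1 / P(x|y) ), with P(x|y) = P(x,y)/P(y)
HXgY = sum(joint[i][j] * math.log2(py[j] / joint[i][j])
           for i in range(2) for j in range(2))
HYgX = sum(joint[i][j] * math.log2(px[i] / joint[i][j])
           for i in range(2) for j in range(2))

I1 = H(px) - HXgY
I2 = H(py) - HYgX
print(round(I1, 6) == round(I2, 6))  # True: mutual information is symmetric
print(I1 >= 0)                       # True: observing Y cannot raise uncertainty in X
```

Since I(X;Y) >= 0, the computation also confirms H(X) >= H(X|Y) for this example.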


Example
Entropy of X??
Conditional Entropy H(X|Y)???
[Worked example: a joint distribution table for X and Y is given on the slide.]

Summary
Consider a pair X and Y of discrete variables:
- H(X): average information in observing X
- H(Y): average information in observing Y
- H(X,Y): average information in observing (X,Y)
- H(X|Y): average information in observing X when Y is known
- H(Y|X): average information in observing Y when X is known
- I(X;Y): average mutual information between X and Y

Information Measures for Continuous Random Variables
For continuous random variables with densities p(x) and p(y), the average mutual information between X and Y is defined as

I(X ; Y) = integral integral p(x, y) log2( p(x | y) / p(x) ) dx dy

The self-information or differential entropy of the random variable X is

H(X) = integral p(x) log2( 1 / p(x) ) dx

The average conditional entropy of the random variable X given Y is

H(X|Y) = integral integral p(x, y) log2( 1 / p(x | y) ) dx dy

Also, the average mutual information between X and Y is

I(X ; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
