
Digital Communication

Dr. Mahadev S. Patil


Professor and Head
Department of ETC Engineering
Kasegaon Education Society’s

RAJARAMBAPU INSTITUTE OF TECHNOLOGY,


Islampur, Dist. Sangli, Maharashtra, India - 415 414
Unit II
Information Theory
Significance of Information

Course Outcomes:
1. Compute statistical parameters in source coding and
channel coding.

Topic Learning Outcomes:


1. Illustrate the measure of information and its relation with probability.
Information?

• Information is that commodity, produced by a transmitter or source such as a human being or an electronic circuit, which is full of surprise.
• The intelligence signal is said to carry information if its probability of occurrence is small, i.e., if a certain event is less probable, then it conveys more information.
Illustration

• Consider the following three examples:

1. The temperature in Kashmir in the month of December is 40 °C --- P = 0.0001
2. It will rain tomorrow --- P = 0.03
3. The sun rises in the east --- P = 1

Conclusion: If a certain event has a very low probability of occurrence, then that event conveys more information when it does occur. So information is the intelligence-carrying content of a signal that is surprising and unpredictable.
Measure of information

• How to quantify information?


• All events are probabilistic!
Where and when IT started?
Roots?

The roots of modern digital communication stem from


the ground-breaking paper “A Mathematical Theory
of Communication” by Claude Elwood Shannon in
1948.

In his 1948 paper he built a rich theory around the problem of reliable communication, now called “Information Theory” or “Shannon Theory” in his honor.
Information Theory
Shannon’s paper
Block diagram of Digital Communication system
Information content of a message
The output of a discrete information source is a message that consists of a sequence of symbols.
• The actual message that is emitted by the source during a message interval is selected at random from a set of possible messages.
• The communication system is designed to reproduce at the receiver, either exactly or approximately, the message emitted by the source.
• As mentioned earlier, some messages produced by an information source contain more information than other messages. The question we ask ourselves now is: how can we measure the information content of a message quantitatively, so that we can compare the information content of various messages produced by the source?
• In order to answer this question, let us first review our intuitive concept of the amount of information.
How to quantify information?
Since a less probable event conveys more information, information must be inversely proportional to the probability of occurrence of that event.
How to quantify information?
Unit of information?
Relation between different units of information?
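The standard measure, which the problems in this unit use, is:

I(sk) = log(1/pk) = −log(pk), where pk is the probability of symbol sk.

• Base-2 logarithm → information in bits
• Natural logarithm → information in nats
• Base-10 logarithm → information in Hartleys (decits)

Relation between units: 1 nat = log2(e) ≈ 1.443 bits, and 1 Hartley = log2(10) ≈ 3.322 bits.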
Entropy of Information

Topic Learning Outcomes:


1. Illustrate entropy of information.
Information?

• Information is that commodity which is full of surprise.
• The intelligence signal is said to carry information if its probability of occurrence is small, i.e., if a certain event is less probable, then it conveys more information.
How to quantify information?
Unit of information
Relation between different units of information?
Q

Calculate the amount of information in bits if p=1/4
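Solution: I = log2(1/p) = log2 4 = 2 bits.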


Q
Calculate the amount of information if binary digits occur
with equal likelihood in binary PCM
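Solution: p(0) = p(1) = 1/2, so each binary digit conveys I = log2 2 = 1 bit.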
Q

In binary PCM, if ‘0’ occurs with probability 1/4 and ‘1’ occurs with probability 3/4, calculate the amount of information conveyed by each binary digit.
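Solution: I(0) = log2(4) = 2 bits and I(1) = log2(4/3) ≈ 0.415 bits.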
Q

A fair coin is tossed and a head falls.

How much information is conveyed by this event?
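Solution: p(head) = 1/2, so I = log2 2 = 1 bit.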
Entropy (Average information)
For message sources in communication systems, the average information provided by the source is the most important quantity: once the average information per symbol is known, the source is essentially characterised for communication purposes.
Symbols No of times the symbol is produced
S1 m1
S2 m2
S3 m3
. .
. .
Sq mq
Entropy………
Then the total information provided by symbol S1 is
I(S1) + I(S1) + I(S1) + ………………………. (m1 times)
= m1 I(S1)
Similarly, the total information provided by symbol S2 is
I(S2) + I(S2) + I(S2) + ………………………. (m2 times)
= m2 I(S2)
Therefore the total information I provided by the source is
I = m1 I(S1) + m2 I(S2) + m3 I(S3) + ………………….. + mq I(Sq) bits
Let N = m1 + m2 + m3 + ………………….. + mq be the total number
of symbols produced by the source.
Then the average information, or entropy, of the source, denoted by H(X),
is defined as the information produced per symbol and is given by
H(X) = I/N
H(X) = (m1 I(S1) + m2 I(S2) + m3 I(S3) + ………………….. + mq I(Sq)) / N
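For a long message, i.e., as N becomes very large, the ratio mk/N approaches the probability pk of symbol Sk, and I(Sk) = log2(1/pk). Hence

H(X) = p1 I(S1) + p2 I(S2) + ………………….. + pq I(Sq) = Σ pk log2(1/pk) bits per symbol.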
Entropy………
Entropy………
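A minimal sketch of this entropy calculation in Python, using an assumed set of symbol probabilities (illustrative only, not taken from the slides):

```python
import math

def entropy(probabilities):
    """Average information H(X) = sum of pk*log2(1/pk), in bits per symbol."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# Example: a four-symbol source with assumed probabilities
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol
```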
Information Rate

Topic Learning Outcomes:


1. Explain information rate as applied to digital communication.
Information rate

• A number of sources may provide information to the channel.
• A source that is faster than another delivers more symbols per unit time.
• More information per unit time is therefore transferred by the faster source.
• Hence the term information rate is coined (defined below).
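If the source emits r symbols per second and its entropy is H(X) bits per symbol, the information rate is

R = r H(X) bits per second.

For example, a source emitting 1000 symbols/sec with H(X) = 1.75 bits/symbol has R = 1750 bits/sec.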
Channel capacity

Topic Learning Outcomes:


1. Describe Shannon’s theorem as applied to channel capacity.
Channel

Efficiency:
It is the ratio of the rate of information transmission to the channel capacity,
i.e., η = R/C

Capacity:
The maximum rate at which information can be transmitted through the channel, measured in bits/sec.
Shannon’s Theorem

This theorem sets the limit on the rate at which information can be transmitted reliably through the channel.

Positive statement: if R ≤ C, the probability of error can be made to approach 0.

Negative statement: if R > C, the probability of error approaches 1.


Channel capacity of Continuous channel
Channel capacity is the tightest upper bound on the rate at which information can be sent with arbitrarily low bit error rate, for a given average signal power S, through an additive white Gaussian noise channel of power N. It is given by

C = B log2(1 + S/N)

where

C is the channel capacity in bits per second


B is the bandwidth of the channel in hertz
S is the total received signal power over bandwidth, in watts
N is the total noise or interference power over bandwidth, in watts
S/N is the signal-to-noise ratio (SNR) expressed as a linear power
ratio (not as logarithmic decibels).
This formula is called the Shannon–Hartley theorem.
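A minimal sketch of how this formula is applied in Python; the bandwidth and SNR values below are assumed only for illustration (roughly a telephone-grade channel) and are not taken from the slides:

```python
import math

def channel_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B*log2(1 + S/N), with S/N given in dB."""
    snr_linear = 10 ** (snr_db / 10)          # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: B = 3100 Hz, SNR = 30 dB
print(channel_capacity(3100, 30))             # about 30,900 bits/sec
```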
Shannon’s Theorem
BW to SNR - trade off

From C = B log2(1 + S/N), the bandwidth needed per bit/sec of capacity is B/C = 1/log2(1 + S/N):

S/N :  0.5    1     2     5     10    15    20    25
B/C :  1.71   1.00  0.63  0.39  0.29  0.25  0.23  0.21

For example, the capacity obtained with S/N = 15 can also be obtained with S/N = 1 by using four times the bandwidth.
Implications of the Shannon–Hartley Theorem

• It gives the upper limit that can be reached for the reliable data transmission rate over a Gaussian channel.

• There is a trade-off between signal-to-noise ratio and bandwidth: the same capacity can be maintained with less bandwidth if the SNR is increased, and vice versa.

• This trade-off is the basis of bandwidth compression.
Constructions of source coding

1. Shannon–Fano coding

2. Huffman coding (a sketch of Huffman coding is given below)
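As a rough sketch of how one of these constructions works, the following Python code builds a Huffman code for an assumed four-symbol source; the probabilities are illustrative only and are not taken from the slides:

```python
import heapq

def huffman_code(probabilities):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Each heap entry is (probability, tie-breaker, {symbol: partial codeword});
    # the integer tie-breaker keeps tuples comparable when probabilities are equal.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)   # two least probable groups
        p2, _, codes2 = heapq.heappop(heap)
        # Prefix '0' to one group and '1' to the other, then merge them
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Example source with assumed probabilities
print(huffman_code({"s1": 0.4, "s2": 0.3, "s3": 0.2, "s4": 0.1}))
# One valid result: {'s1': '0', 's2': '10', 's4': '110', 's3': '111'},
# giving average length L = 1.9 bits/symbol (entropy H ≈ 1.846 bits/symbol)
```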

• Source coding
• Channel coding
• Code efficiency
• Code redundancy
• Average codeword length L = average number of bits per source symbol
(the standard definitions of these quantities are given below)
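For a source with symbol probabilities pk and codeword lengths lk, the standard definitions are:

Average codeword length: L = Σ pk lk bits per symbol
Code efficiency: η = H(X)/L
Code redundancy: γ = 1 − η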
Take away?

What have we
learnt today??
Thank you
