Chapter 1 - Information Theory

The document describes the basic components of a digital communication system: 1) A source encoder compresses data from the source into the minimum number of bits before transmission. 2) A channel encoder adds redundant bits for error correction before modulation and transmission through the channel. 3) Noise is introduced during transmission through the channel that can corrupt the received signal. Channel capacity and Shannon's theorem relate to the maximum error-free transmission rate.


Chapter 1

INFORMATION THEORY
INFORMATION THEORY: C312.1

1.1 Introduction.
Block diagram of basic digital communication system
Following are the sections of the digital communication system.
➢Source
The source can be an analog signal, e.g. a sound signal.
➢Input Transducer
This is a transducer which takes a physical input and converts it to an electrical
signal (example: a microphone). This block also includes an analog-to-digital
converter, used where a digital signal is needed for further processing. A digital
signal is generally represented by a binary sequence.
➢Source Encoder
The source encoder compresses the data into the minimum number of bits. This process
helps in effective utilization of the bandwidth, as it removes redundant bits.
➢Channel Encoder
The channel encoder performs coding for error correction. During transmission,
noise in the channel may alter the signal; to guard against this, the channel
encoder adds some redundant bits to the transmitted data. These are the
error-correcting bits.
➢Digital Modulator
The signal to be transmitted is modulated here by a carrier. The digital sequence is
also converted to an analog waveform so that it can travel through the channel or
medium.
➢Channel
The channel, or medium, carries the analog signal from the transmitter end to the
receiver end.
➢Digital Demodulator
This is the first step at the receiver end. The received signal is demodulated and
converted back from analog to digital. The signal is reconstructed here.
➢Channel Decoder
The channel decoder, after detecting the sequence, performs error correction. The
distortions which might occur during transmission are corrected using the redundant
bits added at the transmitter. These bits help in the complete recovery of the
original signal.
➢Source Decoder
The source decoder reverses the compression performed by the source encoder, so that
the pure digital output is obtained without loss of information. The source decoder
recreates the source output.
➢Output Transducer
This is the last block; it converts the signal back into the original physical form
which was at the input of the transmitter. It converts the electrical signal into a
physical output (example: a loudspeaker).
➢Output Signal
This is the output produced after the whole process, e.g. the sound signal received.

1.2 Measure of information


➢Communication systems are designed to transmit the information generated by a
source to some destination.
➢The source of information may be an analog source or a discrete source. The simplest
type of discrete source is one that emits a sequence of letters selected from a finite
alphabet.
➢For example, a binary source emits a binary sequence of the form 100101110...,
where the alphabet consists of the two letters {0, 1}.
➢More generally, a discrete information source with an alphabet of L possible letters,
say {x1, x2, ..., xL}, emits a sequence of letters selected from the alphabet.
➢To construct a mathematical model for a discrete source, we assume that each
letter in the alphabet {x1, x2, ..., xM} has a given probability Pk of occurrence:
Symbols: x1, x2, x3, ..., xk, ..., xM
Probabilities: P1, P2, P3, ..., Pk, ..., PM
Of course, this set of probabilities must satisfy the condition

P1 + P2 + ... + PM = 1, i.e. Σ Pk = 1 (sum over k = 1 to M)
1.2 Amount of information


Information Content of A Message
➢Now we will try to find a measure of how much "information" is produced by
such a source. The amount of information depends on the "uncertainty" or
"surprise", as discussed below.
➢Consider the event X = xk, which describes the emission of symbol xk by the
source with probability Pk.
➢The amount of information gained after observing the event X = xk, with
probability Pk, is

I(k) = log2(1 / Pk) bits
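As a quick numerical check, the sketch below evaluates I(k) = log2(1/Pk) for a few probabilities (plain Python, no external assumptions):

```python
import math

def self_information(p: float) -> float:
    """Information gained (in bits) from observing an event of probability p."""
    return math.log2(1.0 / p)

# A certain event carries no information; rarer events carry more.
print(self_information(1.0))   # 0.0 bits
print(self_information(0.5))   # 1.0 bit (e.g. a fair coin flip)
print(self_information(0.25))  # 2.0 bits
```

Note that halving the probability adds exactly one bit of information, which is why the logarithm is the natural choice of measure.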

1.2 Average information or Entropy


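The average information, or entropy, of a discrete source is the expectation of I(k) over all symbols: H = Σ Pk log2(1/Pk) bits per symbol (the standard definition). A minimal sketch:

```python
import math

def entropy(probs):
    """Average information H = sum of Pk * log2(1/Pk), in bits per symbol."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Four equiprobable symbols carry log2(4) = 2 bits each on average.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# A skewed source is more predictable, so its entropy is lower.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

Entropy is maximum when all symbols are equiprobable, and it falls as the source becomes more predictable.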

1.2 Information rate


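The information rate is conventionally R = r × H, where r is the symbol rate in symbols per second and H is the source entropy in bits per symbol (this r·H form is the standard definition; the next slide defines R in words). A sketch under that assumption:

```python
import math

def information_rate(symbol_rate: float, probs) -> float:
    """R = r * H bits/second for a source emitting symbols at rate symbol_rate."""
    h = sum(p * math.log2(1.0 / p) for p in probs if p > 0)  # entropy, bits/symbol
    return symbol_rate * h

# 1000 symbols/s from a 4-symbol equiprobable alphabet (H = 2 bits/symbol):
print(information_rate(1000, [0.25] * 4))  # 2000.0 bits/s
```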

1.2 Channel capacity – Definition and Expression


➢It is defined as the measure of the amount of information a channel can transfer
from the source to the destination in any communication channel. It is denoted by
'C'.
➢It can also be defined as the maximum possible bit rate a channel can support
without introducing any error. The unit is bits/sec or bps.
➢Another term closely related to the channel capacity is the information rate 'R'. It
is defined as the rate at which information is transmitted by a communication
source per second; its unit is also bits/sec. 'R' should be equal to or less than C.

1.3 Hartley’s laws related to channel capacity


➢Hartley’s 1st law states that in the total absence of noise the capacity of a
communication channel can be calculated as

C = 2 B log2(N)

where B = bandwidth of the communication channel in Hertz,
N = number of coding levels in the system.
NOTE: When a binary coding system is used, bit ‘0’ is represented by a certain
voltage level and bit ‘1’ is represented by another voltage level. Therefore N = 2.

∴ C = 2 B log2(2) = 2B
➢Suppose we want to calculate the channel capacity of a telephone channel which
uses a binary coding system.
➢Hartley’s law can be applied as follows:

C = 2 B log2(N)

➢For a standard telephone communication channel, B = 3100 Hz (the standard
telephone bandwidth), since the standard telephone channel occupies the range
300 Hz - 3400 Hz.
∴ C = 2 × 3100 × log2(2)

∴ C = 6200 bps
➢This shows that the maximum information a channel can transfer grows with the
number of coding levels N (as log2 N).
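The telephone-channel calculation above can be reproduced directly (a sketch of Hartley's first law, reusing the slide's 3100 Hz bandwidth figure):

```python
import math

def hartley_capacity(bandwidth_hz: float, levels: int) -> float:
    """C = 2 * B * log2(N): noiseless channel capacity for N coding levels."""
    return 2 * bandwidth_hz * math.log2(levels)

print(hartley_capacity(3100, 2))  # 6200.0 bps, as computed on the slide
print(hartley_capacity(3100, 4))  # two bits per symbol: 12400.0 bps
```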
➢Hartley’s 2nd law states that the total information that can be sent in a given
transmission time ‘t’ is given by

H = (C × t) bits
∴ H = {2 B log2(N) × t} bits

➢Both of Hartley’s laws are valid only for a perfectly noiseless channel.
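Hartley's second law then gives the total information sent in t seconds; for example (a sketch, with an assumed 10-second transmission on the earlier telephone channel):

```python
import math

def total_information(bandwidth_hz: float, levels: int, seconds: float) -> float:
    """H = C * t = 2 * B * log2(N) * t bits over a noiseless channel."""
    return 2 * bandwidth_hz * math.log2(levels) * seconds

# A binary-coded 3100 Hz telephone channel used for 10 s:
print(total_information(3100, 2, 10))  # 62000.0 bits
```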
1.4 Shannon Theorem


➢For a given communication source with information rate ‘R’ and channel
capacity ‘C’, Shannon’s theorem can be stated as follows:

➢If R < C, there exists a coding technique such that the information can be
transmitted over the channel with an arbitrarily small probability of error, in spite
of the presence of noise.
➢If R = C, the channel capacity is fully utilized and the communication efficiency
is 100%.
➢If R > C, it is NOT possible to transmit the information without errors.
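The three cases can be phrased as a small decision helper (purely illustrative; the function name and return strings are mine, not standard):

```python
def shannon_regime(rate_r: float, capacity_c: float) -> str:
    """Classify an information rate R against channel capacity C per Shannon's theorem."""
    if rate_r < capacity_c:
        return "reliable transmission possible (a suitable code exists)"
    if rate_r == capacity_c:
        return "capacity fully utilized (100% efficiency, limiting case)"
    return "error-free transmission NOT possible"

print(shannon_regime(6000, 6200))  # R < C: a suitable code exists
```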
1.5 Shannon-Hartley theorem


➢The Shannon-Hartley theorem states that the channel capacity is given by

C = B log2(1 + S/N) bits/second

where C is the capacity in bits per second, B is the bandwidth of the channel in
Hertz, and S/N is the signal-to-noise ratio (as a power ratio, not in dB).
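As a worked example (the 30 dB SNR below is an assumed, typical telephone-line figure, not taken from the slide):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """C = B * log2(1 + S/N); the SNR in dB is converted to a linear ratio first."""
    snr_linear = 10 ** (snr_db / 10)  # e.g. 30 dB -> 1000
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3100 Hz telephone channel with a 30 dB signal-to-noise ratio:
print(round(shannon_capacity(3100, 30)))  # roughly 31 kbps
```

Unlike Hartley's noiseless laws, this capacity is finite even as the number of coding levels grows, because noise limits how finely levels can be distinguished.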
1.6 Channel noise and its effect


➢Digital transmission systems are used for the transmission of a sequence of
binary digits, 0s and 1s.
➢In general, the binary digits are encoded in such a way that a "1" is represented by
a signal x1(t) and a "0" by a signal x2(t), where x1(t) and x2(t) each have a
duration of T seconds.
➢The resulting signal may be transmitted directly or used to modulate a carrier.
➢The received signal is corrupted by noise. Therefore there is a probability that the
receiver will make an error in deciding whether a 1 or a 0 was transmitted.
➢Consider a binary sequence 1 0 1 1 being transmitted. While travelling from the
transmitter to the receiver, noise gets added to it.
➢Thus the signal received by the receiver is corrupted by noise, as shown in the
figure below.

➢The first transmitted bit is represented by a voltage of +A volts which extends over
the time t1 to t2, i.e. over one bit interval.
➢Noise has been superimposed on this signal. In order to judge whether a 1 or a 0
has been received, the receiver samples the received signal once in every bit
interval.
➢In the first bit interval of the figure, if the sampling happens to take place at instant
(t1 + Δt), then the receiver will decide that a 0 has been received, thus introducing
an error.
➢In order to reduce the probability of error, the sampling instant in each interval
should be selected in such a way that the signal amplitude is maximum at the
instant of sampling.
1.7.1 Comparison with binary coding system, communication efficiency


➢The efficiency of these two coding schemes can be analyzed by considering an
example of transmitting 13 equiprobable messages. The probability of selecting
one message is p = 1/13.
➢If a binary coding system is used, then the information carried by each message
will be

I = log2(1/p) = log2(13) ≈ 3.7 bits

➢To transmit this information we require 4 bits, therefore the efficiency of the
binary coding system in transmitting one of the 13 messages is

(3.7 / 4) × 100 = 92.5 %
➢If we use a decimal coding system, then I is equal to

I = log10(1/p) = log10(13) ≈ 1.11 decimal digits (decits)

➢To transmit this information we require 2 decits, therefore the efficiency of the
decimal coding system is

(1.11 / 2) × 100 = 55.5 %

➢Thus the binary coding system has better efficiency than the decimal coding
system.
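The two efficiency figures above can be reproduced as follows (a sketch, using the 13 equiprobable messages of the example):

```python
import math

messages = 13
p = 1 / messages

info_bits = math.log2(1 / p)        # about 3.70 bits of information per message
bits_needed = math.ceil(info_bits)  # 4 binary digits to encode 13 messages
info_decits = math.log10(1 / p)     # about 1.11 decits of information per message
decits_needed = 2                   # 2 decimal digits to encode 13 messages

print(f"binary efficiency:  {100 * info_bits / bits_needed:.1f} %")
print(f"decimal efficiency: {100 * info_decits / decits_needed:.1f} %")
# Prints about 92.5 % and 55.7 % (the slide rounds log10(13) to 1.11
# before dividing, giving 55.5 %).
```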

1.7 Multilevel coding systems


According to Hartley’s law,

C = 2 B log2(N)

That is, the channel capacity increases with the number of coding levels N
(as log2 N). Hence, it is possible to increase the number of coding levels in order
to increase the channel capacity.
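Tabulating C = 2 B log2(N) for a fixed bandwidth shows this logarithmic growth with the number of levels (a sketch, reusing the 3100 Hz telephone bandwidth from the earlier example):

```python
import math

B = 3100  # Hz, telephone channel bandwidth from the earlier example
for n_levels in (2, 4, 8, 16):
    capacity = 2 * B * math.log2(n_levels)
    print(f"N = {n_levels:2d}: C = {capacity:7.0f} bps")
# Each doubling of N adds a fixed 6200 bps: 6200, 12400, 18600, 24800.
```

In practice the number of usable levels is limited by noise, which is exactly what the Shannon-Hartley theorem quantifies.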
The End
