DCS Unit - 1

The document discusses digital communication systems, outlining their components such as source encoders, channel encoders, and modulators, as well as the advantages and disadvantages of digital communication compared to analog communication. It also covers key concepts like information rate, entropy, and Shannon's theorems related to source coding and channel capacity. The document emphasizes the importance of digital signals in transmitting data efficiently and securely.

Nirmala K | DIGITAL COMMUNICATION SYSTEM (22428) | EJ4I

Chapter 1 (18 Marks)

DIGITAL COMMUNICATION AND CODING METHODS

MOTIVATION: Digital communication is the communication system in which a number of devices are arranged in a particular manner to send or transfer data or information from the sending end to the receiving end by means of digital signals.

Contents:

1.1 Block diagram of Digital communication


It consists of:

1. Discrete information source


2. Source Encoder
3. Channel Encoder
4. Digital Modulator
5. Channel
6. Digital Demodulator
7. Channel Decoder
8. Source Decoder
9. Destination
1. Discrete information source: The information source generates the message signal to be transmitted. In digital communication the information source produces a message signal that is not continuously varying with respect to time; rather, the message signal is intermittent in time.
E.g. data from a computer, teletype, etc.

2. Source Encoder: The symbols generated by the information source are given to the source encoder, which converts them into digital form. Every binary 1 and 0 is called a bit, and a group of bits is called a codeword. The source encoder therefore assigns a unique codeword to each symbol.
E.g. pulse code modulators, vector quantizers, delta modulators.

3. Channel Encoder: It adds redundant bits to the input sequence. These extra bits are added so that errors caused by noise and interference during the communication process can be detected and corrected.

4. Digital Modulator: The digital output of the channel encoder is superimposed on a carrier signal in the digital modulator. The carrier used for digital modulation is a continuous wave (C.W.) of very high frequency; in FSK, for example, a high frequency represents "1" and a low frequency represents "0". Modulation techniques used are ASK, FSK, PSK, MSK, etc.

5. Channel: It is the medium used to carry the data or information from the transmitter to the receiver. It may be a guided or unguided medium.

6. Digital Demodulator: At the receiver, it converts the incoming modulated signal back into a sequence of binary bits.

7. Channel Decoder: It removes the redundant bits added by the channel encoder. These extra bits are used at the receiver to check for errors with certain predefined logic before being removed.


8. Source Decoder: The information received from the channel decoder is given to the source decoder, which converts the data back into its original form and transfers it to the destination.

9. Destination: It is the final end, used to receive and display the information or data.
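The role of the channel encoder (block 3) and channel decoder (block 7) can be illustrated with a toy example. The Python sketch below (an illustration, not part of the original notes) uses a rate-1/3 repetition code: the encoder adds redundant bits by repeating each input bit three times, and the decoder removes them with a majority vote, which corrects any single flipped bit per group.

```python
def channel_encode(bits):
    """Channel encoder: add redundancy by repeating every bit three times."""
    return [b for b in bits for _ in range(3)]

def channel_decode(coded):
    """Channel decoder: majority vote over each group of three,
    then strip the redundancy."""
    out = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

message = [1, 0, 1, 1]
coded = channel_encode(message)   # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
coded[1] ^= 1                     # channel noise flips one bit
print(channel_decode(coded))      # [1, 0, 1, 1] -- the error is corrected
```

Real systems use far more efficient codes (Hamming, convolutional, etc.), but the add-redundancy / check-and-remove structure is the same.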

Advantages of Digital Communication:-

i. It has a simple design due to high-speed PCs.
ii. Cost is less.
iii. High data-carrying capacity.
iv. High security due to data encryption.
v. Using multiplexing, text, video, and audio data can be merged and transmitted over a common channel.
vi. The effect of noise is less.
vii. Error detection is easier.
viii. Very high processing speed.
ix. Very high accuracy.
x. Fast speed of response.

Disadvantages:
i. Large bandwidth is required.
ii. It needs synchronization.

Analog Communication | Digital Communication
It transmits analog information signals from source to destination. | It transmits digital information signals from source to destination.
Bandwidth requirement is less. | Bandwidth requirement is very large.
Design is complicated. | Design is simple.
Cost is high. | Cost is less.
Speed of response is slow. | Speed of response is fast.
Effect of noise is more. | Effect of noise is less.
EMI interference is more. | EMI interference is less.
Error detection is difficult. | Error detection is easier.
AM, FM, PM techniques are used. | ASK, FSK, PSK techniques are used.
Coding is not possible. | Coding is possible.
Less security. | More security.

1.2 Communication channel characteristics: Bit rate, Baud rate, Bandwidth, repeater distance, applications


1.3 Concept of Amount of Information:

Consider a communication system which transmits messages m1, m2, m3, … with probabilities of occurrence p1, p2, p3, …. The amount of information carried by the message mk with probability pk is given as:

Amount of Information: Ik = log2(1/pk)
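As a quick numeric check of this formula (an illustrative sketch, not part of the notes), a less probable message carries more information:

```python
import math

def amount_of_information(p):
    """I_k = log2(1/p_k), in bits, for a message with probability p_k."""
    return math.log2(1 / p)

# A rarer message carries more information:
print(amount_of_information(0.5))    # 1.0 bit
print(amount_of_information(0.25))   # 2.0 bits
print(amount_of_information(0.125))  # 3.0 bits
```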


The unit of information is bits.
• Entropy: The entropy of a source is the average information content per symbol in a long message; it is measured in bits/symbol.
Information Rate = Symbol Rate x Source Entropy
                 = (symbols/sec) x (bits/symbol)
                 = bits/sec

Consider that we have M different messages m1, m2, m3, …, mM with probabilities of occurrence p1, p2, p3, …, pM. Suppose a sequence of L messages is transmitted. If L is very large, then we may say that:

p1L messages of m1 are transmitted,
p2L messages of m2 are transmitted,
p3L messages of m3 are transmitted,
…
pML messages of mM are transmitted.
Hence the information due to one message m1 will be:
I1 = log2(1/p1)

Since there are p1L messages of m1, the total information due to all messages of m1 will be:
I1(total) = p1L log2(1/p1)

Similarly,
I2(total) = p2L log2(1/p2)
…
IM(total) = pML log2(1/pM)


I(total) = I1(total) + I2(total) + … + IM(total)

Average information per message = Total information / Number of messages = I(total)/L

Thus the entropy is:
H = p1 log2(1/p1) + p2 log2(1/p2) + … + pM log2(1/pM)

Entropy: H = Σ (k = 1 to M) pk log2(1/pk)

• Information Rate: The information rate is given as:

R = r x H

where R = information rate in bits/sec,
      H = entropy in bits/message,
      r = rate at which messages are generated, in messages/sec.

Definition: It represents the minimum average data rate required to transmit information from source to destination.
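The entropy and information-rate formulas above can be sketched in Python; the source probabilities and the message rate r below are assumed values chosen for illustration.

```python
import math

def entropy(probs):
    """H = sum of p_k * log2(1/p_k), in bits per message."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Hypothetical source: four messages with these probabilities
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)   # 1.75 bits/message
r = 1000             # assumed rate: 1000 messages/sec
R = r * H            # information rate, R = r x H
print(H, R)          # 1.75 1750.0
```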
• Shannon's First Theorem (Source Coding Theorem)
It states that, given a discrete memoryless source of entropy H, the average codeword length L for any source encoding is bounded as

L ≥ H, where the minimum achievable average codeword length is Lmin = H.

• Channel Capacity: It is the maximum rate of reliable transmission of information over a channel, in bits/sec.

It is also defined as the maximum information rate for which the error probability remains within tolerable limits.

• Shannon's Second Theorem (Channel Coding Theorem):

It states that, given a source of M equally likely messages with M >> 1 generating information at a rate R, and a channel of capacity C, then if

R ≤ C

there exists a coding technique such that the output of the source can be transmitted over the channel with an arbitrarily small probability of error.

• Shannon-Hartley Theorem for a Gaussian Channel:

This theorem is complementary to Shannon's second theorem; it applies to a channel in which the noise is Gaussian.

The channel capacity is given as:

C = B log2(1 + S/N)

where B = channel bandwidth in Hz,
      S = signal power,
      N = noise power within the channel bandwidth.
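As an illustration of the Shannon-Hartley formula (the channel figures below are assumed, not from the notes), the capacity of a hypothetical telephone-grade channel with B = 3000 Hz and SNR = 30 dB can be computed as:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits/sec."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 30
snr = 10 ** (snr_db / 10)    # 30 dB -> 1000 in linear terms
C = channel_capacity(3000, snr)
print(round(C))              # about 29902 bits/sec
```

Note that doubling the bandwidth doubles C, while doubling the SNR increases C only logarithmically.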

Source Coding: Huffman Coding

(added in 1.4 notes)
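Pending the 1.4 notes referenced above, here is a minimal Python sketch of the Huffman idea, assuming the symbol probabilities are known in advance: the two least probable entries are merged repeatedly, and codeword bits are assigned as the tree is built. For the probabilities used below (all powers of 1/2), the average codeword length works out to exactly the source entropy of 1.75 bits, the lower bound from Shannon's first theorem.

```python
import heapq

def huffman_codes(probs):
    """Build Huffman codewords for symbols given their probabilities.

    probs: dict mapping symbol -> probability.
    Returns a dict mapping symbol -> binary codeword string.
    """
    # Heap entries: (probability, tie-breaker, {symbol: code-so-far})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
codes = huffman_codes(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(codes)     # e.g. {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
print(avg_len)   # 1.75, equal to the entropy H of this source
```

Frequent symbols get short codewords and rare symbols get long ones, which is how Huffman coding approaches the entropy bound.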
