
Register No.

R.M.K. COLLEGE OF ENGINEERING AND TECHNOLOGY


(An Autonomous Institution)
RSM Nagar, Puduvoyal – 601 206
QP CODE: F22502
B.E. / B.Tech Degree End Semester Examinations – November/December 2022
Semester (Full Time)
Regulations – 2017
B.E - ECE
EC8501 – Digital Communication
(Specify Any Chart or Tables etc. to be Permitted)
Time: 3 Hours Maximum Marks: 100
Part A (10 x 2 = 20 Marks)

Answer ALL Questions

1. Mutual information I(X;Y) is a measure of the uncertainty about the channel input that is resolved by observing the channel output. It is also defined as the amount of information transferred when xi is transmitted and yj is received:

I(xi, yj) = log2 [ p(xi | yj) / p(xi) ] bits

where I(xi, yj) is the mutual information, p(xi | yj) is the conditional probability that xi was transmitted given that yj was received, and p(xi) is the probability of selecting symbol xi for transmission.

2. H(S) = Σ_k pk log2(1/pk) = 2.1219 bits/symbol

3. 1. Slope overload distortion – occurs when the step size is too small to follow steep segments of the input signal.

2. Granular noise – occurs when the step size is too large relative to the very small amplitude variations in the input signal.

4. DC component, self-clocking, error detection, bandwidth compression, differential encoding (polarity inversion), noise immunity. (ANY FOUR)

5. 1. In digital communications, the equalizer's purpose is to reduce intersymbol interference so that the transmitted symbols can be recovered. It may be a simple linear filter or a complex algorithm.
2. Equalizing filters must cancel out any group delay and phase delay between different frequency components.

6. Correlative-level coding is used to transmit a baseband signal at a signaling rate of 2Bo over a channel of bandwidth Bo. This is made physically possible by allowing ISI in the transmitted signal in a controlled manner; this ISI is known to the receiver. Correlative coding is implemented by duobinary signaling and modified duobinary signaling.
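As an illustration, here is a minimal Python sketch of duobinary signaling (a precoder followed by the 1 + D filter); the function name duobinary_encode and the bipolar level mapping are our own assumptions, not part of the answer key:

```python
import numpy as np

def duobinary_encode(bits):
    """Duobinary (class I partial response) signaling sketch.

    A precoder (d_k = b_k XOR d_{k-1}) avoids error propagation;
    the 1 + D filter then sums adjacent precoded symbols, introducing
    the controlled ISI that the receiver knows how to remove.
    """
    d, prev = [], 0
    for b in bits:
        prev = b ^ prev          # precoding
        d.append(prev)
    levels = 2 * np.array(d) - 1  # map 0 -> -1, 1 -> +1
    # 1 + D filter; assume the initial (k = -1) precoded symbol is 0 -> -1
    return levels + np.concatenate(([-1], levels[:-1]))

print(duobinary_encode([1, 1, 0, 1, 0]))  # three-level output {-2, 0, +2}
```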

7. 1. It has lower bandwidth efficiency.
2. The binary data are decoded by estimating the phase states of the signal; the detection and recovery algorithms are very complex.
3. It is more sensitive to phase variations.
8.

9. Channel coding is often used in digital communication systems to protect the digital
information from noise and interference and reduce the number of bit errors. It is mostly
accomplished by selectively introducing redundant bits into the transmitted information
stream.

10. G(P) = D³ + D² + 1
M(P) = [1 0 1 0] = 1·D³ + 0·D² + 1·D + 0·D⁰ = D³ + D
X(P) = M(P) G(P)
= (D³ + D)(D³ + D² + 1)
= D⁶ + D⁵ + D³ + D⁴ + D³ + D
= D⁶ + D⁵ + D⁴ + D (since D³ + D³ = 0 under modulo-2 addition)
X(P) = [1 1 1 0 0 1 0]
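The same modulo-2 polynomial multiplication can be checked numerically; a minimal Python sketch (the helper name gf2_poly_mul is hypothetical; coefficients are listed highest degree first):

```python
def gf2_poly_mul(m, g):
    """Multiply two GF(2) polynomials given as coefficient lists,
    highest degree first (e.g. D^3 + D -> [1, 0, 1, 0])."""
    out = [0] * (len(m) + len(g) - 1)
    for i, a in enumerate(m):
        for j, b in enumerate(g):
            out[i + j] ^= a & b   # modulo-2 addition of partial products
    return out

# M(P) = D^3 + D, G(P) = D^3 + D^2 + 1
print(gf2_poly_mul([1, 0, 1, 0], [1, 1, 0, 1]))  # -> [1, 1, 1, 0, 0, 1, 0]
```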

Part B (5 x 13 = 65 Marks)

Answer ALL Questions

11. a) i) The entropy of a discrete random variable, representing the output of a
source of information, is a measure of the average information content per
source symbol. (1M)

H(S) = E[I(sk)] = Σ_{k=0}^{K−1} pk I(sk) = Σ_{k=0}^{K−1} pk log2(1/pk)
Property 1: H(S) = 0, if, and only if, the probability pk = 1 for some k, and
the remaining probabilities in the set are all zero; this lower bound on
entropy corresponds to no uncertainty. (2M)
Proof:
We know that H(S) = Σ_{k=0}^{K−1} pk log2(1/pk).
Consider pk = 1 for one particular value of k and pk = 0 for all other values of k. The above equation then becomes
H(S) = 0 + 0 + ... + 0 + 1·log2(1/1) + 0 + ... + 0
H(S) = log10(1) / log10(2)
H(S) = 0

Property 2: H(S) = log2 K, if, and only if, pk = 1/K for all k (i.e., all the
symbols in the source alphabet S are equiprobable); this upper bound on
entropy corresponds to maximum uncertainty. (3M)
Proof:
Consider the probability of all K messages as 1/K:
p0 = p1 = p2 = ... = pK−1 = 1/K
H(S) = Σ_{k=0}^{K−1} pk log2(1/pk)
Substituting the values of k and expanding the above equation, we get
H(S) = p0 log2(1/p0) + p1 log2(1/p1) + ... + pK−1 log2(1/pK−1)
Substituting the probability values,
H(S) = (1/K) log2(K) + (1/K) log2(K) + ... + (1/K) log2(K)
Since the log2(K) term is present K times, the equation becomes
H(S) = (1/K) [K log2(K)]
H(S) = log2 K
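Both bounds can be verified numerically; a minimal Python sketch (the helper name entropy is ours, not from the answer key):

```python
import math

def entropy(p):
    """H(S) = sum_k p_k log2(1/p_k); terms with p_k = 0 contribute nothing."""
    return sum(pk * math.log2(1.0 / pk) for pk in p if pk > 0)

print(entropy([1.0, 0.0, 0.0, 0.0]))      # lower bound: 0.0 (no uncertainty)
print(entropy([0.25] * 4), math.log2(4))  # upper bound: log2 K = 2.0
```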
ii)

(OR)
b) i) The mutual information I(X;Y) is a measure of the uncertainty about the
channel input that is resolved by observing the channel output.
The mutual information I(xi, yj) of a channel is defined as the amount of
information transferred when xi is transmitted and yj is received:

I(xi, yj) = log2 [ p(xi | yj) / p(xi) ] bits

where I(xi, yj) is the mutual information, p(xi | yj) is the conditional
probability that xi was transmitted given that yj was received, and p(xi) is
the probability of selecting symbol xi for transmission. (1M)
Properties of Mutual Information
a) Symmetry Property and its Proof
The mutual information of a channel is symmetric in the sense that I(X;Y) = I(Y;X). (2M)
b) Expansion of the Mutual Information Property and its Proof
The mutual information of a channel is related to the joint entropy of the channel input and channel output by
I(X;Y) = H(X) + H(Y) − H(X,Y). (2M)
c) Non-negativity Property and its Proof
The mutual information is always non-negative: we cannot lose information, on the average, by observing the output of a channel.
I(X;Y) ≥ 0 (2M)
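A short numerical check of the symmetry and non-negativity properties from a joint probability matrix; a minimal Python sketch assuming a binary symmetric channel with crossover probability 0.1 and equiprobable inputs (the helper name mutual_information is hypothetical):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) = sum_{i,j} p(x_i,y_j) log2[ p(x_i,y_j) / (p(x_i) p(y_j)) ]."""
    px = p_xy.sum(axis=1, keepdims=True)   # marginal p(x_i)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal p(y_j)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

# Joint distribution of a binary symmetric channel (crossover 0.1)
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])
print(mutual_information(p))    # positive: non-negativity
print(mutual_information(p.T))  # identical value: symmetry I(X;Y) = I(Y;X)
```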
ii)

12. a) Definition – (1M)


DM Transmitter:
Block diagram (2M)
Explanation (4M)

DM Receiver:
Block diagram (2M)
Explanation (4M)
(OR)
b) Adaptive quantization with forward estimate (AQF)
Block diagram (1M)
Explanation (2M)

Adaptive quantization with backward estimate (AQB)


Block diagram (1M)
Explanation (2M)

Adaptive Prediction with Forward Estimation (APFE)


Block diagram (1M)
Explanation (2M)

Adaptive Prediction with Backward Estimation (APBE)


Block diagram (1M)
Explanation (3M)
13. a) Time Domain Criterion (3M)

p(iTb − kTb) = 1 for i = k ; 0 for i ≠ k

where p(0) = 1, due to normalizing. If p(t) satisfies the above condition,
then the signal is free from ISI: y(ti) = μ ai, which indicates zero ISI in
the absence of noise. Hence the condition assures perfect reception in the
absence of noise.

Frequency Domain Criterion (3M)

Σ_{n=−∞}^{∞} P(f − nRb) = Tb

This represents the frequency-domain condition for zero ISI, and is the
Nyquist criterion for distortionless baseband transmission.
Raised Cosine Spectrum (7M)

P(f) = 1/(2Bo), for |f| < f1
P(f) = (1/(4Bo)) [1 + cos( π(|f| − f1) / (2Bo − 2f1) )], for f1 ≤ |f| < 2Bo − f1
P(f) = 0, for |f| ≥ 2Bo − f1

where f1 = Bo(1 − α) and α is the roll-off factor.
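The piecewise spectrum can be evaluated directly; a minimal Python sketch (the function name and the sample frequencies are our own; f1 = Bo(1 − α) as above):

```python
import numpy as np

def raised_cosine(f, B0, alpha):
    """Raised cosine spectrum P(f), with f1 = B0 (1 - alpha)."""
    f1 = B0 * (1.0 - alpha)
    af = abs(f)
    if af < f1:                       # flat portion
        return 1.0 / (2.0 * B0)
    if af < 2.0 * B0 - f1:            # cosine roll-off portion
        return (1.0 + np.cos(np.pi * (af - f1) / (2.0 * B0 - 2.0 * f1))) / (4.0 * B0)
    return 0.0                        # stopband

# alpha = 0 reduces to the ideal Nyquist (brick-wall) channel of bandwidth B0
for f in (0.0, 0.5, 0.99, 1.5):
    print(f, raised_cosine(f, B0=1.0, alpha=0.5))
```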

(OR)
b) Definition (3M)
When the signal is passed through the channel, distortion is introduced in
terms of amplitude and phase. This distortion creates the problem of ISI, and
hence signal detection becomes difficult, i.e., closure of the eye pattern
occurs. This distortion can be compensated with the help of equalizers
(filters), which help in improving the system performance.
Adaptive Equalization (7M)

y(nT) = Σ_{i=0}^{N} wi x(nT − iT)
e(nT) = d(nT) − y(nT)

Operating Modes of Adaptive Equalizer (3M)
Training Mode
Decision-directed Mode
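As a sketch of the training mode, here is a least-mean-squares (LMS) tap update, one common adaptation rule for such an equalizer; the toy two-tap channel, the step size mu = 0.01 and the function name are assumptions for illustration:

```python
import numpy as np

def lms_equalizer(x, d, num_taps=11, mu=0.01):
    """Training mode: adapt taps w so that y(nT) = sum_i w_i x(nT - iT)
    tracks the known training sequence d, using e(nT) = d(nT) - y(nT)."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]  # x(nT), x(nT - T), ...
        e = d[n] - w @ xn                     # error against training symbol
        w += mu * e * xn                      # LMS (stochastic gradient) update
    return w

# Toy run: known +/-1 training symbols through a mild two-tap ISI channel
rng = np.random.default_rng(0)
d = rng.choice([-1.0, 1.0], size=2000)
x = d + 0.5 * np.concatenate(([0.0], d[:-1]))
print(np.round(lms_equalizer(x, d)[:3], 2))   # taps approach the channel inverse
```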

14. a) Principle: (2M)

In a coherent binary PSK system, binary 1 and 0 are represented by s1(t)
and s2(t) respectively:

s1(t) = √(2Eb/Tb) cos(2πfc t)
s2(t) = −√(2Eb/Tb) cos(2πfc t)

Generation: (3M)

Detection: (3M)

Probability of error: (5M)

Pe(0) = Pe(1) = (1/2) erfc( √(Eb/No) )
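The error-probability expression is straightforward to evaluate; a minimal Python sketch (the dB conversion and the sample SNR values are our own):

```python
from math import erfc, sqrt

def bpsk_ber(eb_n0_db):
    """Pe = (1/2) erfc(sqrt(Eb/N0)) for coherent binary PSK."""
    eb_n0 = 10.0 ** (eb_n0_db / 10.0)   # dB -> linear
    return 0.5 * erfc(sqrt(eb_n0))

for snr_db in (0, 5, 10):
    print(snr_db, "dB ->", bpsk_ber(snr_db))
```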

(OR)
b) Principle: (2M)
In QPSK the phase of the carrier takes any one of the four values π/4, 3π/4,
5π/4 and 7π/4. This is achieved by grouping the input binary sequence into
dibits; each angle corresponds to a dibit combination.

Thus si(t) = √(2E/T) cos( 2πfc t + (2i − 1)π/4 ), i = 1, 2, 3, 4

Generation: (3M)

Detection: (3M)

Probability of error: (5M)

Pe = 1 − Pc ≈ erfc( √(E/2No) )
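A minimal Python sketch of the symbol-error expression, assuming E denotes energy per symbol (the function name is ours):

```python
from math import erfc, sqrt

def qpsk_symbol_error(e_n0):
    """Pe = 1 - Pc = 1 - [1 - (1/2) erfc(sqrt(E/2N0))]^2,
    approximately erfc(sqrt(E/2N0)) when E/N0 is large."""
    p = 0.5 * erfc(sqrt(e_n0 / 2.0))  # error probability per quadrature channel
    return 1.0 - (1.0 - p) ** 2

print(qpsk_symbol_error(10.0))  # E/N0 = 10 (linear scale)
```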

15. a) n = 7, k = 4, n − k = q = 3

(i) Construct code words for this (7,4) code [6M]


The block size of the message vector is 4 bits, giving 16 possible message
vectors, from 0000 to 1111.

Generator Matrix

Code Vectors: [c1 c2 c3] = [m1 m2 m3 m4] P, where P is the parity sub-matrix

c1 = m1 ⊕ m3 ⊕ m4 ; c2 = m1 ⊕ m2 ⊕ m4 ; c3 = m1 ⊕ m2 ⊕ m3

(ii) Show that this code is a Hamming code [2M]

Number of check bits: q ≥ 3; here q = 3
Block length: n = 2^q − 1 = 2³ − 1 = 7
Number of message bits: k = n − q = 7 − 3 = 4
dmin = 3
Hence, it is a Hamming code.
(iii) Decode the received word 0101100 [5M]

The syndrome equals the 7th row of Hᵀ; hence the 7th bit is in error.

Corrected Word Y = [0 1 0 1 1 0 1]
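The encoding and syndrome decoding above can be reproduced in a few lines; a minimal Python sketch built from the parity equations in part (i) (the systematic layout [m1..m4 c1 c2 c3] is assumed):

```python
import numpy as np

# Parity sub-matrix from c1 = m1+m3+m4, c2 = m1+m2+m4, c3 = m1+m2+m3 (mod 2)
P = np.array([[1, 1, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix G = [I | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix H = [P^T | I]

def encode(m):
    return (np.array(m) @ G) % 2

def decode(r):
    s = (H @ np.array(r)) % 2               # syndrome
    if s.any():                             # match syndrome to a column of H
        err = next(i for i in range(7) if (H[:, i] == s).all())
        r = list(r)
        r[err] ^= 1                         # flip the erroneous bit
    return r

print(decode([0, 1, 0, 1, 1, 0, 0]))        # -> [0, 1, 0, 1, 1, 0, 1]
```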
(OR)
b)
Part C (1 x 15 = 15 Marks)

Answer ALL Questions

16. a) i) Datastream: 11011010 (EACH CODING FORMAT 4M)

Datastream: 1100110 (EACH CODING FORMAT 4M)


ii) For a uni-polar format of the NRZ type,

Ak = a for symbol 1 ; Ak = 0 for symbol 0

with P(Ak = 0) = P(Ak = a) = 1/2. Hence for n = 0 it can be written as

RA(0) = E[Ak Ak] = (0)² P(Ak = 0) + (a)² P(Ak = a) = a²/2

Consider the next product Ak Ak−n for n ≠ 0: RA(n) = E[Ak Ak−n].

This product has four possible values, namely 0, 0, 0 and a². Assuming that
the successive symbols in the binary sequence are statistically independent,
these four values occur with a probability of 1/4 each. Hence for n ≠ 0,

RA(n) = E[Ak Ak−n] = 3(0)(1/4) + a²(1/4) = a²/4

RA(n) = a²/2, n = 0 ; a²/4, n ≠ 0

For the basic pulse v(t) we have a rectangular pulse of unit amplitude and
duration Tb. Hence the Fourier transform of v(t) equals

V(f) = Tb sinc(fTb) (3M)

The power spectral density is given by

S(f) = (a²Tb/4) sinc²(fTb) + (a²/4) δ(f) (4M)
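The continuous part of this PSD can be evaluated directly; a minimal Python sketch (the function name is ours; np.sinc uses the same normalized sinc convention as V(f) above):

```python
import numpy as np

def unipolar_nrz_psd(f, a=1.0, Tb=1.0):
    """Continuous part of S(f) = (a^2 Tb / 4) sinc^2(f Tb) + (a^2 / 4) delta(f);
    the discrete delta(f) line at dc is omitted here."""
    return (a ** 2) * Tb / 4.0 * np.sinc(f * Tb) ** 2

f = np.linspace(-3.0, 3.0, 7)
print(unipolar_nrz_psd(f))   # spectral nulls at multiples of the bit rate 1/Tb
```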
(OR)
b) i) Δ = 0.75 V
fs = 30(2W) = 60W (2M)
The maximum permissible value of ao is set by the slope-overload condition
ao(max) = Δ fs / (2π fo). Assume fo = W:
ao(max) = (0.75 × 60W) / (2πW) = 7.16 V (6M)
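The slope-overload arithmetic as a minimal Python sketch (variable names are ours; W cancels out, so it is set to 1):

```python
from math import pi

# Slope-overload condition for delta modulation with input a0*sin(2*pi*f0*t):
# a0 <= delta * fs / (2 * pi * f0)
delta, W = 0.75, 1.0       # step size in volts; W is the signal bandwidth
fs = 30 * (2 * W)          # sampled at 30 times the Nyquist rate -> 60W
f0 = W                     # assume the worst case f0 = W
a0_max = delta * fs / (2 * pi * f0)
print(round(a0_max, 2))    # -> 7.16 V
```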


ii) Signaling rate
PCM System (1M)
r = v × fs = 8 × 10 kHz = 80 kbps

DM System (2M)
r = v × fs = 1 × 10 kHz = 10 kbps (DM transmits only one bit per sample)

Bandwidth
PCM System (2M)
BT = signaling rate / 2 = 80000 / 2 = 40 kHz

DM System (2M)
BT = signaling rate / 2 = 10000 / 2 = 5 kHz
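The rate and bandwidth comparison as a minimal Python sketch (variable names are ours):

```python
# Signaling rate and transmission bandwidth, PCM vs DM
v, fs = 8, 10_000          # bits per sample, sampling rate in Hz
r_pcm = v * fs             # PCM: 80 kbps
r_dm = 1 * fs              # DM transmits one bit per sample: 10 kbps
bt_pcm = r_pcm / 2         # 40 kHz
bt_dm = r_dm / 2           # 5 kHz
print(r_pcm, r_dm, bt_pcm, bt_dm)
```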



Knowledge Level (Bloom's Taxonomy)

K1 – Remembering (Knowledge)
K2 – Understanding (Comprehension)
K3 – Applying (Application of Knowledge)
K4 – Analyzing (Analysis)
K5 – Evaluating (Evaluation)
K6 – Creating (Synthesis)

Course Outcomes
After successful completion of the course, the students should be able to
CO1 Describe the concepts of information theory and coding
CO2 Compare the various waveform coding techniques
CO3 Describe the baseband transmission and reception schemes
CO4 Illustrate the different digital modulation schemes and equalization techniques
CO5 Determine PSD and BER of various digital modulation schemes
CO6 Construct different error control codes
Knowledge Level – Question wise Mapping

Part A
Question:  1    2    3    4    5    6    7    8    9    10
K Level:   K2   K1   K1   K1   K1   K1   K1   K1   K2   K1
COs:       CO1  CO1  CO2  CO2  CO4  CO3  CO4  CO4  CO6  CO6

Part B and Part C
Question:  11a  11b  12a  12b  13a  13b  14a  14b  15a  15b  16a      16b
K Level:   K1   K2   K2   K2   K2   K4   K2   K2   K2   K2   K2       K2
COs:       CO1  CO1  CO2  CO2  CO3  CO4  CO4  CO4  CO6  CO6  CO2,CO5  CO2,CO5
