EC8501 Digital Communication
1. Mutual information I(X;Y) is a measure of the uncertainty about the channel input that is resolved by observing the channel output. It is also defined as the amount of information transferred when xi is transmitted and yj is received.
2. H(S) = \sum_k p_k \log_2 \frac{1}{p_k} = 2.1219 bits/symbol
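As a quick numerical check, a minimal Python sketch; the source probabilities did not survive extraction, so the set {0.4, 0.2, 0.2, 0.1, 0.1} below is an assumed illustration chosen to reproduce the quoted entropy:

```python
from math import log2

# Assumed illustrative probabilities (the original figures were lost);
# this set reproduces the 2.1219 bits/symbol quoted above.
p = [0.4, 0.2, 0.2, 0.1, 0.1]

# H(S) = sum over k of p_k * log2(1 / p_k)
H = sum(pk * log2(1 / pk) for pk in p)
print(f"H(S) = {H:.4f} bits/symbol")  # H(S) = 2.1219 bits/symbol
```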
6. Correlative level coding is used to transmit a baseband signal at a signaling rate of 2Bo over a channel of bandwidth Bo. This is made physically possible by allowing ISI in the transmitted signal in a controlled manner; since this ISI is known, the receiver can compensate for it. Correlative coding is implemented by duobinary signaling and modified duobinary signaling, as in the sketch below.
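A minimal Python sketch of duobinary signaling with precoding (an assumed illustration; the answer scheme itself gives no code). The precoder d_k = b_k XOR d_{k-1} prevents error propagation, and the correlative filter c_k = a_k + a_{k-1} introduces the controlled ISI:

```python
def duobinary_encode(bits, d0=0):
    """Duobinary (class-1 partial response) signaling with precoding.
    Precoding d_k = b_k XOR d_{k-1} prevents error propagation."""
    d = d0
    a = [2 * d0 - 1]                 # reference level for the first symbol
    for b in bits:
        d ^= b                       # precoder
        a.append(2 * d - 1)          # binary PAM levels +/-1
    # Correlative-level filter: c_k = a_k + a_{k-1} -> three levels {-2, 0, +2}
    return [a[k] + a[k - 1] for k in range(1, len(a))]

def duobinary_decode(c):
    # Receiver decision rule: |c_k| < 1 -> bit 1, otherwise bit 0
    return [1 if abs(ck) < 1 else 0 for ck in c]

bits = [0, 1, 1, 0, 1, 0, 0, 1]
assert duobinary_decode(duobinary_encode(bits)) == bits
```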
9. Channel coding is often used in digital communication systems to protect the digital
information from noise and interference and reduce the number of bit errors. It is mostly
accomplished by selectively introducing redundant bits into the transmitted information
stream.
10. G(D) = D^3 + D^2 + 1
M(D) = [1 0 1 0] = 1·D^3 + 0·D^2 + 1·D + 0 = D^3 + D
X(D) = M(D) G(D)
     = (D^3 + D)(D^3 + D^2 + 1)
     = D^6 + D^5 + D^4 + D^3 + D^3 + D
     = D^6 + D^5 + D^4 + D   (the two D^3 terms cancel under modulo-2 addition)
X(D) = [1 1 1 0 0 1 0]
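The product can be verified with a short GF(2) polynomial multiplication, sketched here in Python (the helper name gf2_poly_mul is illustrative):

```python
def gf2_poly_mul(m, g):
    """Multiply two GF(2) polynomials given as coefficient lists,
    highest degree first (matching the vector notation above)."""
    out = [0] * (len(m) + len(g) - 1)
    for i, mi in enumerate(m):
        for j, gj in enumerate(g):
            out[i + j] ^= mi & gj   # modulo-2 addition of partial products
    return out

M = [1, 0, 1, 0]           # M(D) = D^3 + D
G = [1, 1, 0, 1]           # G(D) = D^3 + D^2 + 1
print(gf2_poly_mul(M, G))  # [1, 1, 1, 0, 0, 1, 0] -> X(D) = D^6 + D^5 + D^4 + D
```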
Part B (5 x 13 = 65 Marks)
The entropy of a discrete memoryless source S is

H(S) = E[I(s_k)] = \sum_{k=0}^{K-1} p_k I(s_k) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k}
Property 1: H(S) = 0, if, and only if, the probability pk = 1 for some k, and
the remaining probabilities in the set are all zero; this lower bound on
entropy corresponds to no uncertainty. (2M)
Proof
We know that

H(S) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k}

Consider p_k = 1 for a particular value of k and p_k = 0 for all other values of k; then the above equation becomes

H(S) = 0 + 0 + \cdots + 0 + 1 \cdot \log_2 \frac{1}{1} + 0 + \cdots + 0 = \frac{\log_{10} 1}{\log_{10} 2}

H(S) = 0
Property 2: H(S) = log K, if, and only if, pk = 1/K for all k (i.e., all the
symbols in the source alphabet S are equiprobable); this upper bound on
entropy corresponds to maximum uncertainty. (3M)
Proof:
Consider the probability of all K messages as 1/K:

p_0 = p_1 = p_2 = \cdots = p_{K-1} = \frac{1}{K}

H(S) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k}

Substituting these values and expanding the above equation, we get

H(S) = p_0 \log_2 \frac{1}{p_0} + p_1 \log_2 \frac{1}{p_1} + \cdots + p_{K-1} \log_2 \frac{1}{p_{K-1}}

H(S) = K \cdot \frac{1}{K} \log_2 K

H(S) = \log_2 K
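Both bounds can be verified numerically; a short sketch (K = 8 is chosen arbitrarily for illustration):

```python
from math import log2

def entropy(p):
    # Terms with p_k = 0 contribute nothing to H(S)
    return sum(pk * log2(1 / pk) for pk in p if pk > 0)

K = 8
print(entropy([1.0] + [0.0] * (K - 1)))   # 0.0        -> lower bound (Property 1)
print(entropy([1 / K] * K), log2(K))      # 3.0 3.0    -> upper bound (Property 2)
```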
ii)
(OR)
b) i) The mutual information I(X;Y) is a measure of the uncertainty about the
channel input, which is resolved by observing the channel output.
Mutual information I(xi, yj) of a channel is defined as the amount of information transferred when xi is transmitted and yj is received:

I(x_i, y_j) = \log_2 \frac{p(x_i \mid y_j)}{p(x_i)} \text{ bits}

I(xi, yj) – mutual information
p(xi | yj) – conditional probability that xi was transmitted, given that yj was received
p(xi) – probability of selecting symbol xi for transmission (1M)
Properties of Mutual Information
a) Symmetry Property and its Proof
The mutual information of a channel is symmetric in the
sense that I(X;Y) =I(Y;X) (2M)
b) Expansion of the Mutual Information property and its Proof
The mutual information of a channel is related to the joint
entropy of the channel input and channel output by
I ( X ;Y ) = H ( X ) + H (Y ) − H ( X ,Y ) (2M)
c) Non negativity Property and its Proof
The mutual information is always nonnegative. We cannot
lose information, on the average, by observing the output of a channel.
I(X;Y) ≥ 0 (2M)
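Properties (b) and (c) can be checked numerically; a sketch for a binary symmetric channel with an assumed crossover probability of 0.1 (these parameters are illustrative, not from the question paper):

```python
from math import log2

def H(p):
    """Entropy of a distribution given as an iterable of probabilities."""
    return -sum(x * log2(x) for x in p if x > 0)

eps = 0.1                 # assumed crossover probability
pX = [0.5, 0.5]           # equiprobable channel input
# Joint distribution p(x, y) = p(x) * p(y | x)
pXY = [[0.5 * (1 - eps), 0.5 * eps],
       [0.5 * eps, 0.5 * (1 - eps)]]
pY = [sum(row[j] for row in pXY) for j in range(2)]

joint = [p for row in pXY for p in row]
I = H(pX) + H(pY) - H(joint)      # I(X;Y) = H(X) + H(Y) - H(X,Y)
print(f"I(X;Y) = {I:.4f} bits")   # 0.5310: symmetric in X and Y, and >= 0
```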
ii)
DM Receiver:
Block diagram (2M)
Explanation (4M)
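A minimal behavioural sketch of a DM transmitter and receiver (an assumed illustration standing in for the block diagram; the receiver is an accumulator followed by a low-pass filter, approximated here by a moving average):

```python
import numpy as np

def dm_transmit(x, delta):
    """1-bit delta modulator: the staircase approximation chases the input."""
    approx, bits = 0.0, []
    for sample in x:
        b = 1 if sample >= approx else 0    # comparator
        approx += delta if b else -delta    # accumulator in the feedback path
        bits.append(b)
    return bits

def dm_receive(bits, delta):
    """Receiver: identical accumulator, then a simple low-pass (moving average)."""
    approx, stair = 0.0, []
    for b in bits:
        approx += delta if b else -delta
        stair.append(approx)
    return np.convolve(stair, np.ones(4) / 4, mode="same")

t = np.arange(0, 1, 1 / 200)
x = np.sin(2 * np.pi * 2 * t)
y = dm_receive(dm_transmit(x, delta=0.1), delta=0.1)
```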
(OR)
b) Adaptive quantization with forward estimate (AQF)
Block diagram (1M)
Explanation (2M)
This represents the frequency-domain condition for zero ISI and describes the Nyquist criterion for distortionless baseband transmission.
Raised Cosine Spectrum (7M)
P(f) = \begin{cases}
\dfrac{1}{2B_0}, & |f| < f_1 \\[4pt]
\dfrac{1}{4B_0}\left[1 + \cos\dfrac{\pi(|f| - f_1)}{2B_0 - 2f_1}\right], & f_1 \le |f| < 2B_0 - f_1 \\[4pt]
0, & |f| \ge 2B_0 - f_1
\end{cases}
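A direct transcription of the three-branch spectrum in Python, with illustrative values B0 = 1 Hz and f1 = 0.5 Hz (so the rolloff is alpha = 1 - f1/B0 = 0.5):

```python
import numpy as np

def raised_cosine(f, B0, f1):
    """Raised-cosine spectrum P(f) exactly as in the expression above
    (B0 = Nyquist bandwidth, f1 sets the rolloff)."""
    f = abs(f)
    if f < f1:
        return 1 / (2 * B0)
    if f < 2 * B0 - f1:
        return (1 / (4 * B0)) * (1 + np.cos(np.pi * (f - f1) / (2 * B0 - 2 * f1)))
    return 0.0

B0, f1 = 1.0, 0.5   # illustrative values
print(raised_cosine(0.0, B0, f1), raised_cosine(2 * B0 - f1, B0, f1))  # 0.5 0.0
```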
(OR)
b) Definition (3M)
When a signal is passed through a channel, distortion is introduced in terms of amplitude and phase. This distortion creates the problem of ISI, and signal detection becomes difficult, i.e., closure of the eye occurs. The distortion can be compensated with the help of equalizers (filters), which help improve system performance.
Adaptive Equalization (7M)
y(nT) = \sum_{i=0}^{N} w_i \, x(nT - iT)

e(nT) = d(nT) - y(nT)
Operating Modes of Adaptive Equalizer (3M)
Training Mode
Decision-directed Mode
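A minimal LMS sketch of these equations in training mode (the tap count, step size, and two-path test channel are assumed for illustration; in decision-directed mode d(nT) is replaced by the receiver's own decisions):

```python
import numpy as np

def lms_equalize(x, d, num_taps=11, mu=0.01):
    """LMS adaptive equalizer implementing the equations above:
    y(nT) = sum_i w_i x(nT - iT),  e(nT) = d(nT) - y(nT),
    with the steepest-descent update w_i <- w_i + mu * e(nT) * x(nT - iT)."""
    w = np.zeros(num_taps)
    y = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]   # tap-delay line: x(nT), x(nT-T), ...
        y[n] = w @ xn                          # equalizer output
        e = d[n] - y[n]                        # error against the training symbol
        w += mu * e * xn                       # LMS tap update
    return w, y

# Illustrative training-mode run over an assumed two-path channel 1 + 0.5 z^-1
rng = np.random.default_rng(0)
d = rng.choice([-1.0, 1.0], size=2000)          # known training sequence
x = d + 0.5 * np.concatenate(([0.0], d[:-1]))   # distorted channel output
w, y = lms_equalize(x, d)                       # taps converge toward the inverse
```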
Generation: (3M)
Detection: (3M)
(OR)
b) Principle: (2M)
In QPSK the phase of the carrier takes any one of four values, such as π/4, 3π/4, 5π/4 and 7π/4, one for each input dibit. Thus the transmitted signal may be written as s_i(t) = \sqrt{2E/T} \cos(2\pi f_c t + (2i - 1)\pi/4), i = 1, 2, 3, 4.
Generation: (3M)
Detection: (3M)
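An illustrative baseband-level sketch of QPSK generation and coherent detection (the carrier frequency, symbol duration, and Gray map are assumptions; correlating over an integer number of carrier cycles recovers the in-phase and quadrature components):

```python
import numpy as np

def qpsk_modulate(bits, fc=4.0, T=1.0, fs=100.0):
    """Map each dibit to one of the four carrier phases pi/4, 3pi/4, 5pi/4, 7pi/4."""
    gray = {(0, 0): 1, (0, 1): 3, (1, 1): 5, (1, 0): 7}   # Gray-coded dibits
    t = np.arange(0, T, 1 / fs)
    s = [np.cos(2 * np.pi * fc * t + gray[(bits[k], bits[k + 1])] * np.pi / 4)
         for k in range(0, len(bits), 2)]
    return np.concatenate(s)

def qpsk_detect(s, fc=4.0, T=1.0, fs=100.0):
    """Coherent detection: correlate each symbol interval with cos/sin carriers."""
    t = np.arange(0, T, 1 / fs)
    n = len(t)
    bits = []
    for k in range(0, len(s), n):
        seg = s[k:k + n]
        i = np.dot(seg, np.cos(2 * np.pi * fc * t))    # in-phase component
        q = np.dot(seg, -np.sin(2 * np.pi * fc * t))   # quadrature component
        bits += [int(q < 0), int(i < 0)]               # invert the Gray map
    return bits

bits = [0, 0, 0, 1, 1, 1, 1, 0]
assert qpsk_detect(qpsk_modulate(bits)) == bits
```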
Generator Matrix
c1 = m1 ⊕ m3 ⊕ m4 ; c2 = m1 ⊕ m2 ⊕ m4 ; c3 = m1 ⊕ m2 ⊕ m3
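A direct encoding sketch of these parity equations in Python (the systematic codeword ordering [message | parity] is an assumption):

```python
def hamming_encode(m):
    """(7,4) encoder for the parity equations above:
    c1 = m1^m3^m4, c2 = m1^m2^m4, c3 = m1^m2^m3."""
    m1, m2, m3, m4 = m
    c1 = m1 ^ m3 ^ m4
    c2 = m1 ^ m2 ^ m4
    c3 = m1 ^ m2 ^ m3
    return [m1, m2, m3, m4, c1, c2, c3]   # assumed ordering: message then parity

print(hamming_encode([1, 0, 1, 1]))  # [1, 0, 1, 1, 1, 0, 0]
```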
A_k = a for symbol 1, and A_k = 0 for symbol 0,

with P(A_k = 0) = P(A_k = a) = 1/2. Hence for n = 0, it can be written as

E[A_k^2] = a^2 \cdot \tfrac{1}{2} + 0 \cdot \tfrac{1}{2} = \frac{a^2}{2}

The product A_k A_{k+n} has four possible values, namely 0, 0, 0 and a^2. Assuming that the successive symbols in the binary sequence are statistically independent, these four values occur with a probability of 1/4 each. Hence for n ≠ 0, this can be written as

E[A_k A_{k+n}] = \frac{a^2}{4}

For the basic pulse v(t) we have a rectangular pulse of unit amplitude and duration T_b. Hence the Fourier transform of v(t) equals

V(f) = T_b \, \mathrm{sinc}(f T_b)

and the power spectral density of on-off signaling follows as

S(f) = \frac{a^2 T_b}{4} \mathrm{sinc}^2(f T_b) + \frac{a^2}{4} \delta(f)   (4M)
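The two autocorrelation values can be confirmed by simulation; a short sketch with an assumed amplitude a = 2:

```python
import numpy as np

# Monte-Carlo check of the autocorrelation values used above:
# E[A_k^2] = a^2/2 for n = 0 and E[A_k A_{k+n}] = a^2/4 for n != 0.
rng = np.random.default_rng(1)
a = 2.0
A = rng.choice([0.0, a], size=1_000_000)   # P(0) = P(a) = 1/2
print(np.mean(A * A), a**2 / 2)            # ~2.0 vs 2.0
print(np.mean(A[:-1] * A[1:]), a**2 / 4)   # ~1.0 vs 1.0
```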
(OR)
b) i) Δ = 0.75W
fs = 30(2W) = 60W (2M)
The maximum permissible value of a0 follows from the slope-overload condition \Delta f_s \ge 2\pi f_0 a_0, i.e., a_{0,\max} = \Delta f_s / (2\pi f_0).
Assume f0 = W.
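A sketch of the slope-overload computation, treating W as 1 normalised unit and Δ = 0.75W as the step size exactly as quoted (both interpretations are assumptions):

```python
import numpy as np

# Slope-overload condition for a DM input a0*sin(2*pi*f0*t):
#   delta * fs >= 2*pi*f0*a0   =>   a0_max = delta * fs / (2*pi*f0)
W = 1.0                              # normalised bandwidth unit (assumed)
delta, fs, f0 = 0.75 * W, 60 * W, W  # values as quoted above
a0_max = delta * fs / (2 * np.pi * f0)
print(a0_max)                        # ~7.16, in the same units as delta
```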
DM System (2M)
Bandwidth
PCM System (2M)
BT = Signaling rate/2 = 80000/2 = 40 kHz
DM System (2M)
Course Outcomes
After successful completion of the course, the students should be able to
CO1 Describe the concepts of information theory and coding
CO2 Compare the various waveform coding techniques
CO3 Describe the baseband transmission and reception schemes
CO4 Illustrate the different digital modulation schemes and equalization techniques
CO5 Determine PSD and BER of various digital modulation schemes
CO6 Construct different error control codes
Knowledge Level – Question wise Mapping
Part A
Question : 1    2    3    4    5    6    7    8    9    10
K Level  : K2   K1   K1   K1   K1   K1   K1   K1   K2   K1
COs      : CO1  CO1  CO2  CO2  CO4  CO3  CO4  CO4  CO6  CO6

Part B
COs      : CO1  CO1  CO2  CO2  CO3  CO4  CO4  CO4  CO6  CO6  CO2 & CO5  CO2 & CO5