ITC MCQ

This question bank covers channel classification and channel capacities. It includes examples based on 4-ary PAM and QAM sources and asks for various entropies and mutual information for these sources. Several questions require calculating entropy, conditional entropy, joint entropy, and mutual information for random variables from given probability distributions. Key concepts include independent sources, QAM symbol formation, entropy, conditional entropy, joint entropy, and mutual information.


Module 2 – Channel classification and capacities

1) The levels of two independent 4-ary PAM sources are SA=SB={−3α,−α,α,3α}
with Pr(3α)=Pr(α)=2Pr(−α)=2Pr(−3α). A QAM source SQ is now formed from
symbols x=a+jb, where a, b are symbols from sources SA, SB. The source entropy H(SA) is

1.92

1.65

1.43

1.22
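
The stated probabilities must sum to one: with p=Pr(−α)=Pr(−3α) and 2p=Pr(3α)=Pr(α), 6p=1, so the distribution over the four levels is {1/3, 1/3, 1/6, 1/6}. A minimal Python check of the entropy value (an illustrative sketch, not part of the original question; the variable names are just for clarity):

from math import log2

# Probabilities of {3a, a, -a, -3a} implied by Pr(3a)=Pr(a)=2Pr(-a)=2Pr(-3a)
probs = [1/3, 1/3, 1/6, 1/6]

# Shannon entropy in bits: H = -sum p*log2(p)
H_SA = -sum(p * log2(p) for p in probs)
print(round(H_SA, 2))  # about 1.92 bits
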
2) The levels of two independent 4-ary PAM sources are SA=SB={−3α,−α,α,3α}
with Pr(3α)=Pr(α)=2Pr(−α)=2Pr(−3α). A QAM source SQ is now formed from
symbols x=a+jb, where a, b are symbols from sources SA, SB. The entropy H(SQ) is

1.92

2.86

2.55

3.84
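
Worked note (not part of the original question): since a and b are drawn from independent sources, the entropy of the QAM symbol x=a+jb is additive, H(SQ) = H(SA) + H(SB) = 1.92 + 1.92 = 3.84 bits.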

3) Consider Table 1, where the entry for si, rj denotes Pr(X=si,Y=rj). The
entropy H(X) is

1.5

3
4) Consider Table 1, where the entry for si, rj denotes Pr(X=si,Y=rj). The
entropy H(Y) is

1.5

3
5) The levels of two independent 4-ary PAM sources are SA=SB={−3α,−α,α,3α} with
Pr(3α)=Pr(α)=2Pr(−α)=2Pr(−3α). A QAM source SQ is now formed from symbols
x=a+jb, where a, b are symbols from sources SA, SB. The conditional entropy H(SA|SB) is,

1.92

2.86

3.84
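
Worked note (not part of the original question): because SA and SB are independent, conditioning on SB tells us nothing about SA, so H(SA|SB) = H(SA) = 1.92 bits.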

6) Consider Table 1, where the entry for si, rj denotes Pr(X=si,Y=rj). The joint entropy
H(X,Y) is,

1.5

7) Consider Table 1, where the entry for si, rj denotes Pr(X=si,Y=rj). The conditional
entropy H(X|Y) is,

1

1.5

8) Consider Table 1, where the entry for si, rj denotes Pr(X=si,Y=rj). The mutual
information I(X;Y) is,

1.5

3
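
Table 1 is not reproduced in this text, but every quantity asked for in questions 3)–8) above follows from the joint probabilities Pr(X=si,Y=rj). A minimal Python sketch of the calculations (the joint matrix below is a placeholder, not the actual Table 1):

from math import log2

def H(probs):
    # Shannon entropy in bits, skipping zero-probability entries
    return -sum(p * log2(p) for p in probs if p > 0)

# Placeholder joint pmf: rows are X=si, columns are Y=rj
joint = [[0.25, 0.25],
         [0.25, 0.25]]

px = [sum(row) for row in joint]             # marginal of X
py = [sum(col) for col in zip(*joint)]       # marginal of Y
H_X, H_Y = H(px), H(py)
H_XY = H([p for row in joint for p in row])  # joint entropy H(X,Y)
H_X_given_Y = H_XY - H_Y                     # H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X + H_Y - H_XY                      # mutual information I(X;Y)
print(H_X, H_Y, H_XY, H_X_given_Y, I_XY)
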
9) The joint entropy H(X,Y) equals,

H(X)+H(Y)

H(X)+H(X|Y)

H(X|Y)+H(Y|X)+I(X;Y)

H(X)−H(X|Y)+H(Y)
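
For reference, the chain rule gives H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y); and since H(Y) = H(Y|X) + I(X;Y), the joint entropy can also be written as H(X|Y) + H(Y|X) + I(X;Y).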
10) Consider two sources X, Y. The mutual information I(X;Y) equals,

H(X)+H(Y)

H(X)−H(Y)

H(X)+H(X|Y)

H(X)+H(Y)–H(X,Y)
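
For reference, the standard identities are I(X;Y) = H(X) + H(Y) − H(X,Y) = H(X) − H(X|Y) = H(Y) − H(Y|X).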

11) The levels of two independent 4-ary PAM sources are SA=SB={−3α,−α,α,3α}
with Pr(3α)=Pr(α)=2Pr(−α)=2Pr(−3α). A QAM source SQ is now formed from
symbols x=a+jb, where a, b are symbols from sources SA, SB. The mutual information I(SA;SB)
is,
0

1.92

2.86

3.84
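
Worked note (not part of the original question): SA and SB are independent, so H(SA|SB) = H(SA) and hence I(SA;SB) = H(SA) − H(SA|SB) = 0.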

12) Consider the random variable X with probability density function Ke^(−β|x|) for −∞<x<∞
and β≥0. What is the value of the constant K?

β/2

β^2

1/2
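
Worked normalization (assuming β>0 so the density is integrable): the integral of Ke^(−β|x|) over −∞<x<∞ is 2K times the integral of e^(−βx) from 0 to ∞, i.e. 2K/β; setting this equal to 1 gives K = β/2.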

13) Consider the binary erasure channel shown in the figure, where the input X∈{0,1}
and the output Y∈{0,1,e}, and the symbol e is termed an erasure. As shown in the
figure, Pr(Y=e|X=0)=Pr(Y=e|X=1)=p. Let Pr(X=0)=α. What is the
entropy H(X)?

H(α)

(1−p)H(α)

H(p)+(1−p)H(α)
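
Worked note (not part of the original question): H(X) depends only on the input distribution, not on the channel; X is binary with Pr(X=0)=α, so H(X) = H(α) = −α log2(α) − (1−α) log2(1−α).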

14) Consider the binary erasure channel shown in the figure, where the input X∈{0,1}
and the output Y∈{0,1,e}, and the symbol e is termed an erasure. As shown in the
figure, Pr(Y=e|X=0)=Pr(Y=e|X=1)=p. Let Pr(X=0)=α. What is the
entropy H(Y)?

H(α)

H(p)

H(α)+H(p)

(1−p)H(α)+H(p)
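
Worked note (not part of the original question): the output probabilities are Pr(Y=e)=p, Pr(Y=0)=α(1−p), Pr(Y=1)=(1−α)(1−p). Expanding H(Y) with the grouping property (first erasure vs. no erasure, then 0 vs. 1 given no erasure) gives H(Y) = H(p) + (1−p)H(α).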
