R2031043
041. When the input noise is white, the optimum filter is known as A
A Matched filter B Base band signal receiver
C Optimum filter D Integrate and dump receiver
042. The probability of error ----- rapidly as Es/η ----- D
A Increases, increases B Decreases, decreases
C Increases, decreases D Decreases, increases
043. Probability of error for baseband signal receiver is given by C
A erfc[2Es/η]^(1/2) B erfc[Es/4η]^(1/2)
C erfc[Es/η]^(1/2) D erfc[0.6Es/η]^(1/2)
044. The output of Baseband signal receiver is B
A Square voltage B Ramp voltage
C Pulse voltage D Random voltage
045. Which detection technique gives a lower bit error rate A
A Coherent B Non-coherent
C Envelope detector D Square-law detector
046. Probability of error depends on B
A Signal energy and signal wave shape B Signal energy and not on signal wave shape
C Signal wave shape and not on signal energy D Neither signal energy nor signal wave shape
047. In ---- the frequency of the carrier signal is varied based on the information in a digital D
signal
A ASK B QPSK
C PSK D FSK
048. Which of the following statements is/are correct? S1: The probability of error decreases B
rapidly as Es/η decreases. S2: The maximum value of probability of error is 0.5
A S1 and S2 are correct B only S2 correct
C only S1 correct D Both S1 and S2 are wrong
049. For M equally likely messages, the average amount of information H is B
A H = log10 M B H = log2 M
C H = log10 M^2 D H = 2 log10 M
050. The coding efficiency is given by A
A 1-Redundancy B 1+Redundancy
C 1/Redundancy D 2/Redundancy
051. A source emits one of three possible messages m1, m2, m3 with the probabilities of 1/2, B
1/4, 1/4. Calculate the entropy.
A 2.5 bits/symbol B 1.5 bits/symbol
C 1.75 bits/symbol D 2.25 bits/symbol
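As a quick check of Q051, a minimal Python sketch of the entropy arithmetic H = Σ p·log2(1/p); the variable names are illustrative:

```python
from math import log2

# Entropy of the three-message source in Q051 (probabilities 1/2, 1/4, 1/4).
probs = [1/2, 1/4, 1/4]
H = sum(p * log2(1/p) for p in probs)
print(H)  # 1.5 bits/symbol, matching option B
```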
052. If there are M = 2^N equally likely messages, then the amount of information carried by each B
message will be ---- bits
A M B N
C N-1 D N-2
053. What is the maximum value of probability of error B
A 1 B 1/2
C 1/4 D 1/8
054. Which of the following gives moderate probability of error B
A PSK B FSK
C ASK D QPSK
055. As per Shannon's theorem, the condition for error-free transmission is ----- (R: information B
rate, C: channel capacity)
A R greater than C B R less than or equal to C
C R much greater than C D R much less than C
056. As the bandwidth approaches infinity , the channel capacity becomes C
A Infinite B Zero
C 1.44 S/η D One
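A short sketch illustrating Q056: as the bandwidth B grows, C = B·log2(1 + S/ηB) approaches S/(η·ln 2) ≈ 1.44 S/η. The values of S and η below are assumed placeholders:

```python
from math import log2, log

# Q056: the capacity B*log2(1 + S/(eta*B)) tends to S/(eta*ln 2) as B grows.
S, eta = 1.0, 1.0                      # assumed placeholder values
for B in (1e3, 1e6, 1e9):
    print(B, B * log2(1 + S / (eta * B)))
print("limit:", S / (eta * log(2)))    # ~1.4427 * S/eta
```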
057. Which of the following is incorrect B
A I(x; y) = H(x) +H(y)- H(x, y) B I(x; y) = H(x) - H(y/x)
C I(x; y) = H(x) - H(x/y) D I(x; y) = H(y) - H(y/x)
058. What is the relationship between amount of information and probability of an event A
A Inversely proportional B Directly proportional
C Square relation D No relation
059. A given source will have maximum entropy if the messages produced are D
A Mutually exclusive B Statistically independent
C Two in number D Equiprobable
060. Which of the following is incorrect D
A Mutual information of a channel is symmetric B Mutual information is always positive
C Mutual information of a noise-free channel is H(X) or H(Y) D Mutual information is always negative
061. The efficiency of Huffman code is linearly proportional to C
A Average length of code B Maximum length of code
C Average entropy D Received bits
062. The condition for Mutual information of noise free channel is C
A H(x) = 0 B H(y) = 0
C H(X/y) = 0 D H(X,y) = 0
063. A source emits one of four possible messages m1, m2, m3, m4 with the probabilities of A
1/2, 1/4, 1/8, 1/8. The information content of message m1 is
A 1 bit B 2 bits
C 3 bits D 4 bits
064. Following is not a unit of information C
A Bit B Decit
C Hz D Nat
065. The channel capacity of a discrete memoryless channel with a bandwidth of 2 MHz B
and SNR of 31 is
A 20 Mbps B 10 Mbps
C 30 Mbps D 60 Mbps
066. The channel capacity of a discrete memoryless channel with a bandwidth of 5 MHz A
and SNR of 15 is
A 20 Mbps B 10 Mbps
C 30 Mbps D 60 Mbps
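A small sketch of the Shannon-Hartley arithmetic used in Q065 and Q066; the helper name `capacity` is illustrative:

```python
from math import log2

# C = B * log2(1 + SNR), with SNR given as a linear ratio.
def capacity(bandwidth_hz, snr_linear):
    return bandwidth_hz * log2(1 + snr_linear)

print(capacity(2e6, 31) / 1e6)   # 10.0 Mbps (Q065, option B)
print(capacity(5e6, 15) / 1e6)   # 20.0 Mbps (Q066, option A)
```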
067. Which of the following is incorrect C
A H(x, y) = H(x/y) + H(y) B H(y/x) = H(x, y) - H(x)
C I(x, y) = H(x) - H(y/x) D I(x, y) = H(y) - H(y/x)
068. A source emits one of four possible messages m1, m2, m3, m4 with the probabilities of C
1/2, 1/4, 1/8, 1/8. The information content of message m3 is
A 1 bit B 2 bits
C 3 bits D 4 bits
069. A source emits one of four possible messages m1, m2, m3, m4 with the probabilities of C
1/2, 1/4, 1/8, 1/8. Calculate the entropy.
A 2.5 bits/symbol B 1.5 bits/symbol
C 1.75 bits/symbol D 2.25 bits/symbol
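A brief sketch covering the source shared by Q063, Q068 and Q069: self-information I(m) = log2(1/p) per message and the resulting entropy.

```python
from math import log2

# Source with probabilities 1/2, 1/4, 1/8, 1/8 (Q063, Q068, Q069).
probs = {"m1": 1/2, "m2": 1/4, "m3": 1/8, "m4": 1/8}
for name, p in probs.items():
    print(name, log2(1/p), "bits")       # m1: 1, m2: 2, m3: 3, m4: 3
print("H =", sum(p * log2(1/p) for p in probs.values()))  # 1.75 bits/symbol
```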
070. The relation between information rate (R), entropy (H) and symbol rate (r) is A
A R= r*H B R= r/H
C R= r+H D R= r-H
071. The average information per individual message is known as C
A Encoding B Information Rate
C Entropy D Decoding
072. Channel capacity for noise-free channel is D
A C= max [H(X) - H(X/Y) ] B C= max [H(Y) - H(Y/X) ]
C C= max I(X;Y) D C= max H(X)
073. The units of entropy when logarithm base is 10, 2, and e, respectively. D
A dats, bits, and bans B bits, bytes, and Hartley
C bytes, dits, and nat D bans, bits, and nat
074. Consider a discrete random variable X with possible outcomes xi, i= 1, 2, .....,n. The C
self-information of the event X = xi is defined as (where i in xi denotes a subscript)
A I(xi)=log(p(xi)) B I(xi)=log(p(xi))/p(xi)
C I(xi)=-log(p(xi)) D I(xi)=log(p(xi))-p(xi)
075. An event has two possible outcomes with probabilities p1 = 1/2 and p2 = 1/64. The rate of information A
with 16 outcomes per second is
A 38/4 bits/sec B 38/64 bits/sec
C 38/2 bits/sec D 38/32 bits/sec
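A sketch of the Q075 arithmetic, assuming the reconstructed value p1 = 1/2 (the value implied by the 38/4 bits/sec answer key); the two probabilities are used exactly as the question states them:

```python
from math import log2

# Q075: H = sum(p * log2(1/p)) over the stated probabilities, R = r * H.
p1, p2 = 1/2, 1/64                      # p1 = 1/2 is an assumed reconstruction
H = p1 * log2(1/p1) + p2 * log2(1/p2)   # 38/64 bits per outcome
R = 16 * H                              # 16 outcomes per second
print(H, R)                             # 0.59375, 9.5 = 38/4 bits/sec (option A)
```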
076. What is mutual information for the channel with H(X) = 1.571 bit/message H(X/Y) = B
0.612 bit/message
A 1.959 bit/message B 0.959 bit/message
C 1.462 bit/message D 0.689 bit/message
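The arithmetic behind Q076 as a one-line check; Q092 later follows the same pattern with I(X;Y) = H(Y) - H(Y/X):

```python
# Q076: I(X;Y) = H(X) - H(X/Y).
H_X, H_X_given_Y = 1.571, 0.612
print(H_X - H_X_given_Y)   # 0.959 bit/message (option B)
```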
077. Entropy gives B
A Rate of information B Measure of uncertainty
C Amount of information D Probability of message
078. The ideal communication channel is defined for a system which has D
A Finite capacity B Bandwidth = 0
C Signal to Noise ratio = 0 D Infinite capacity
079. The entropy for a fair coin toss is exactly C
A 3 bits B 5 bits
C 1 bit D 2 bits
080. For which value(s) of p is the binary entropy function H(p) maximized? B
A 0 B 0.5
C 1 D 2
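A sketch for Q080 showing that the binary entropy function peaks at p = 0.5; the helper `binary_entropy` is illustrative:

```python
from math import log2

# H(p) = -p*log2(p) - (1-p)*log2(1-p), the entropy of a binary source.
def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(p, binary_entropy(p))   # maximum value of 1 bit occurs at p = 0.5
```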
081. A discrete source emits one of 5 symbols once every millisecond with probabilities 1/2, A
1/4, 1/8, 1/16, 1/16, respectively. What is the information rate
A 9400 bits/sec B 18800 bits/sec
C 19400 bits/sec D 16800 bits/sec
082. The relation between entropy and mutual information is A
A I(X;Y) = H(X) - H(X/Y) B I(X;Y) = H(X/Y) - H(Y/X)
C I(X;Y) = H(X) - H(Y) D I(X;Y) = H(Y) - H(X)
083. The information I contained in a message with probability of occurrence P is given by (k A
is a constant)
A I = k log2(1/P) B I = k log2 P
C I = k log2(1/2P) D I = k log2(1/P^2)
084. 1 nat is equal to C
A 3.32 bits B 2.32 bits
C 1.44 bits D 3.44 bits
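A one-line check of the nat-to-bit conversion in Q084 (1 nat = log2 e bits):

```python
from math import e, log2

# Q084: 1 nat expressed in bits.
print(log2(e))   # ~1.4427, i.e. about 1.44 bits (option C)
```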
085. A source produces 26 symbols with equal probabilities. What is the average D
information produced by the source
A Less than 4 bits/symbol B 6 bits/symbol
C 7 bits/symbol D Between 4 bits/symbol and 6 bits/symbol
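A quick check of Q085: 26 equiprobable symbols give H = log2(26) ≈ 4.7 bits/symbol.

```python
from math import log2

# Q085: average information of 26 equally likely symbols.
print(log2(26))   # ~4.70, between 4 and 6 bits/symbol (option D)
```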
086. Assertion (A): Entropy of a binary source is maximum if the probabilities of occurrence B
of both events are equal. Reason (R): The average amount of information per
source symbol is called entropy for a memoryless source
A Both A and R are individually true and R is the correct explanation of A B Both A and R are individually true but R is not the correct explanation of A
C A is true but R is false D A is false but R is true
087. Discrete source S1 has 4 Equiprobable symbols while discrete source S2 has 1 B
Equiprobable symbols. When the entropy of these two sources is compared, entropy of
A S1 is greater than S2 B S1 is less than S2
C S1 is equal to S2 D Depends on rate of symbols/second
088. Which of the following statements is true? B
A Redundancy is minimum in Shannon-Fano coding compared to Huffman coding B Redundancy is minimum in Huffman coding compared to Shannon-Fano coding
C Shannon-Fano coding is known as optimum code D Lempel-Ziv coding requires probabilities
089. In Shannon-Fano coding B
A the messages are first written in the order of increasing probability B the messages are first written in the order of decreasing probability
C lowest two probabilities are combined D lowest three probabilities are combined
090. The mutual information I(X,Y) = H(X) - H(X/Y) between two random variables X and Y D
satisfies
A I(X,Y) > 0 B I(X,Y) < 0
C I(X,Y) ≥ 0, equality holds when X and Y are uncorrelated D I(X,Y) ≥ 0, equality holds when X and Y are independent
091. The average information associated with an extremely unlikely message is zero. What C
is the average information associated with an extremely likely message
A Zero B Infinity
C Depends on the total number of messages D Depends on the speed of transmission of the message
092. What is mutual information for the channel with H(Y) = 1.49 bit/message H(Y/X) = 0.53 C
bit/message
A 1.96 bit/message B 1 bit/message
C 0.96 bit/message D 0.689 bit/message
093. Hamming distance between 101 and 110 is B
A 1 B 2
C 3 D 4
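A small sketch of the Hamming-distance count in Q093; the helper `hamming_distance` is illustrative:

```python
# Q093: Hamming distance = number of positions where the bits differ.
def hamming_distance(a: str, b: str) -> int:
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("101", "110"))   # 2 (option B)
```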
094. A discrete memoryless source X with two symbols x1 and x2 and p(x1) = 0.9, p(x2) = 0.1. B