Assignment - Week 4
TYPE OF QUESTION: MCQ
Number of questions: 10, Total marks: 10 × 2 = 20
______________________________________________________________________________
QUESTION 1:
A man is known to speak the truth 2 out of 3 times. He throws a die and reports that the number
obtained is 4. Find the probability that the number obtained is actually 4.
A. 2/3
B. 3/4
C. 5/22
D. 2/7
Correct Answer: D. 2/7
Detailed Solution: Let A be the event that the man reports a 4 and B the event that a 4 is actually obtained. Then P(B) = 1/6, P(A|B) = 2/3 and P(A|¬B) = 1/3 (assuming that when he lies, he reports 4). By Bayes' theorem,
P(B|A) = P(A|B) P(B) / [P(A|B) P(B) + P(A|¬B) P(¬B)] = (2/3 × 1/6) / (2/3 × 1/6 + 1/3 × 5/6) = 2/7
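As a quick numerical check, the Bayes computation above can be reproduced with the following minimal Python sketch (variable names are purely illustrative):

from fractions import Fraction

p_four = Fraction(1, 6)                   # prior: die shows 4
p_not_four = Fraction(5, 6)               # prior: die does not show 4
p_report_given_four = Fraction(2, 3)      # he reports 4 truthfully 2 out of 3 times
p_report_given_not_four = Fraction(1, 3)  # he (falsely) reports 4 when lying

# Bayes' theorem: P(four | reported 4)
posterior = (p_report_given_four * p_four) / (
    p_report_given_four * p_four + p_report_given_not_four * p_not_four
)
print(posterior)  # 2/7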
_________________________________________________________________
QUESTION 2:
Consider the following graphical model. Which of the following pairs of random variables is independent given no evidence?
A. a,b
B. c,d
C. e,d
D. c,e
Correct Answer : A. a,b
Detailed Solution: Nodes a and b have no predecessor nodes and share no common parent, so a and b are independent given no evidence.
______________________________________________________________________________
QUESTION 3:
Two cards are drawn at random from a deck of 52 cards without replacement. What is the
probability of drawing a 2 and an Ace in that order?
A. 4/51
B. 1/13
C. 4/256
D. 4/663
Correct Answer: D. 4/663
Detailed Solution: Let A be the event of drawing a 2 first and B the event of drawing an Ace second. Then P(A) = 4/52 = 1/13 and P(B|A) = 4/51, so
P(AB) = P(A) × P(B|A) = (1/13) × (4/51) = 4/663
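A minimal sketch of the same computation, assuming a standard 52-card deck (4 twos and 4 aces):

from fractions import Fraction

p_two_first = Fraction(4, 52)      # a 2 on the first draw
p_ace_second = Fraction(4, 51)     # an Ace on the second draw, given a 2 was removed
print(p_two_first * p_ace_second)  # 4/663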
______________________________________________________________________________
QUESTION 4:
Consider the following Bayesian network. The random variables given in the model are
modeled as discrete variables (Rain = R, Sprinkler = S and Wet Grass = W) and the
corresponding probability values are given below.
P(R) = 0.1
P(S) = 0.2
P(W | R, S) = 0.8
P(W | R, ¬ S) = 0.7
P(W | ¬ R, S) = 0.6
P(W | ¬ R, ¬ S) = 0.5
What is the probability that the sprinkler was on given that the grass is wet and it rained, i.e., P(S | W, R)?
A. 1
B. 0.5
C. 0.22
D. 0.78
Correct Answer: C. 0.22
Detailed Solution:
P(S | W, R) = P(W, S, R) / P(W, R) = P(W, S, R) / [P(W, S, R) + P(W, ¬S, R)]
P(W, S, R) = P(W | S, R) × P(S) × P(R) = 0.8 × 0.2 × 0.1 = 0.016
P(W, ¬S, R) = P(W | ¬S, R) × P(¬S) × P(R) = 0.7 × 0.8 × 0.1 = 0.056
P(S | W, R) = 0.016 / (0.016 + 0.056) = 0.016 / 0.072 ≈ 0.22
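The posterior can also be checked by enumerating the joint distribution of this small network, as in the Python sketch below (variable names are illustrative):

p_r, p_s = 0.1, 0.2
p_w = {(True, True): 0.8, (True, False): 0.7,
       (False, True): 0.6, (False, False): 0.5}  # P(W | R=r, S=s)

def joint(r, s, w):
    """P(R=r, S=s, W=w) for the network R -> W <- S."""
    pr = p_r if r else 1 - p_r
    ps = p_s if s else 1 - p_s
    pw = p_w[(r, s)] if w else 1 - p_w[(r, s)]
    return pr * ps * pw

num = joint(True, True, True)                            # P(W, S, R)
den = sum(joint(True, s, True) for s in (True, False))   # P(W, R)
print(num / den)  # 0.222...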
____________________________________________________________________________
QUESTION 5:
What is the naive assumption in a Naive Bayes Classifier?
Correct Answer: B. All the features of a class are independent of each other
Detailed Solution: The Naive Bayes assumption is that all the features of a class are independent of each other, which is rarely the case in real life. Because of this (naive) assumption, the classifier is called the Naive Bayes Classifier.
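Written out in the notation used above (a standard formulation, not part of the original question), the assumption says the class-conditional likelihood factorizes feature by feature:
P(x1, x2, ..., xn | C) = P(x1 | C) × P(x2 | C) × ... × P(xn | C)
so that P(C | x1, ..., xn) ∝ P(C) × P(x1 | C) × ... × P(xn | C)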
_____________________________________________________________________________
QUESTION 6:
A drug test (random variable T) has 1% false positives (i.e., 1% of those not taking drugs show
positive in the test), and 5% false negatives (i.e., 5% of those taking drugs test negative).
Suppose that 2% of those tested are taking drugs. Determine the probability that somebody who
tests positive is actually taking drugs (random variable D).
A. 0.66
B. 0.34
C. 0.50
D. 0.91
Correct Answer: A. 0.66
Detailed Solution: By Bayes' theorem, P(D | T) = P(T | D) P(D) / [P(T | D) P(D) + P(T | ¬D) P(¬D)] = (0.95 × 0.02) / (0.95 × 0.02 + 0.01 × 0.98) = 0.019 / 0.0288 ≈ 0.66
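The result can also be sanity-checked with a small Monte Carlo simulation in Python (sample size and seed are arbitrary choices):

import random

random.seed(0)
n = 1_000_000
positives = 0
positive_users = 0

for _ in range(n):
    user = random.random() < 0.02           # 2% of those tested take drugs
    if user:
        positive = random.random() < 0.95   # 5% false negatives
    else:
        positive = random.random() < 0.01   # 1% false positives
    if positive:
        positives += 1
        positive_users += user

print(positive_users / positives)  # approximately 0.66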
______________________________________________________________________________
QUESTION 7:
It is given that P(A | B) = 2/3 and P(A | ¬B) = 1/4. Compute the value of P(B | A).
A. ½
B. ⅔
C. ¾
D. Not enough information.
Correct Answer: D. Not enough information.
Detailed Solution: The probabilities P(A), P(B) and P(AB) are all unknown and cannot be computed from the two given conditional probabilities. In particular, P(B | A) = P(A | B) P(B) / [P(A | B) P(B) + P(A | ¬B) (1 − P(B))] depends on the unspecified P(B), so we do not have enough information to compute P(B | A).
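To see this concretely, the sketch below (with hypothetical values of P(B)) shows that different choices of the unknown P(B) give different values of P(B | A):

for p_b in (0.2, 0.5, 0.8):
    # P(B|A) = P(A|B)P(B) / [P(A|B)P(B) + P(A|not B)(1 - P(B))]
    p_b_given_a = (2/3 * p_b) / (2/3 * p_b + 1/4 * (1 - p_b))
    print(p_b, round(p_b_given_a, 3))
# 0.2 -> 0.4, 0.5 -> 0.727, 0.8 -> 0.914: P(B|A) is underdetermined.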
______________________________________________________________________________
QUESTION 8:
Answer Questions 8-9 with the data given below:
A patient goes to a doctor with symptoms S1, S2 and S3. The doctor suspects diseases D1 and D2 and constructs a Bayesian network for the relation among the diseases and symptoms as follows:
Correct Answer: D.
Detailed Solution: From the figure, we can see that D1 and D2 are not dependent on any
variable as they don’t have any incoming directed edges. S1 has an incoming edge from D1,
hence S1 depends on D1. S2 has 2 incoming edges from D1 and D2, hence S2 depends on D1
and D2. S3 has an incoming edge from D2, so S3 depends on D2. Hence, (D) is the answer.
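Equivalently, the edge structure described above (D1 → S1, D1 → S2, D2 → S2, D2 → S3) corresponds to the joint factorization
P(D1, D2, S1, S2, S3) = P(D1) P(D2) P(S1 | D1) P(S2 | D1, D2) P(S3 | D2).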
______________________________________________________________________________
QUESTION 9:
Suppose P(D1) = 0.5, P(D2) = 0.6, P(S1 | D1) = 0.4 and P(S1 | D1') = 0.6. Find P(S1).
A. 0.14
B. 0.36
C. 0.50
D. 0.66
Correct Answer: C. 0.50
Detailed Solution: By the law of total probability, P(S1) = P(S1 | D1) P(D1) + P(S1 | D1') P(D1') = 0.4 × 0.5 + 0.6 × 0.5 = 0.50
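A one-line Python check of the total-probability computation:

p_d1, p_s1_given_d1, p_s1_given_not_d1 = 0.5, 0.4, 0.6
print(p_s1_given_d1 * p_d1 + p_s1_given_not_d1 * (1 - p_d1))  # 0.5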
______________________________________________________________________________
QUESTION 10:
In a Bayesian network, a node with only outgoing edge(s) represents
Detailed Solution: As there is no incoming edge for the node, it has no parents and is not conditionally dependent on any other node; it represents a variable with an unconditional (prior) probability.
___________________________________________________________________________
************END*******