
Introduction to Machine Learning

Week 9
Prof. B. Ravindran, IIT Madras

1. (2 marks) In the undirected graph given below, how many terms will there be in its potential
function factorization?

(a) 7
(b) 3
(c) 5
(d) 9
(e) None of the above

Soln. B - The graph has three maximal cliques, {A, B, C, D}, {A, D, E}, and {B, C, F}, so the factorization has three terms.
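Written out with the standard clique-potential notation (the potentials \psi_i and the normalization constant Z are standard notation, not given in the question), the factorization has one term per maximal clique:

P(A, B, C, D, E, F) = \frac{1}{Z} \psi_1(A, B, C, D) \psi_2(A, D, E) \psi_3(B, C, F)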


2. (2 Marks) Consider the following directed graph:

[Figure: directed graph with nodes A, B, C, D, E; the edges are not recoverable from the extracted text.]

Based on the d-separation rules, which of the following statements is true?


(a) A and C are conditionally independent given B
(b) A and E are conditionally independent given D
(c) B and E are conditionally dependent given C
(d) A and C are conditionally dependent given D and E

Soln. A - Conditioning on B blocks the only path A-B-C between A and C, making A and C
conditionally independent given B.
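As a quick numeric sanity check of this claim (a sketch, assuming A -> B -> C is the only path between A and C; the binary CPT values are made up, not part of the question):

import itertools

p_a = {0: 0.6, 1: 0.4}                                  # P(A)
p_b_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}      # P(B | A)
p_c_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}      # P(C | B)

# Joint distribution P(A, B, C) implied by the chain A -> B -> C.
joint = {(a, b, c): p_a[a] * p_b_a[a][b] * p_c_b[b][c]
         for a, b, c in itertools.product([0, 1], repeat=3)}

# Check P(A, C | B = b) = P(A | B = b) * P(C | B = b) for both values of b.
for b in (0, 1):
    p_b = sum(v for (a, bb, c), v in joint.items() if bb == b)
    for a, c in itertools.product([0, 1], repeat=2):
        p_ac_given_b = joint[(a, b, c)] / p_b
        p_a_given_b = sum(joint[(a, b, cc)] for cc in (0, 1)) / p_b
        p_c_given_b = sum(joint[(aa, b, c)] for aa in (0, 1)) / p_b
        assert abs(p_ac_given_b - p_a_given_b * p_c_given_b) < 1e-12
print("A and C are conditionally independent given B")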

3. (1 Mark) Consider the following undirected graph:

[Figure: undirected graph; the node labels A, B, E, C, D are visible in the extracted text, but the edges are not recoverable.]

In the undirected graph given above, which nodes are conditionally independent of each other
given C? Select all that apply.

(a) A, E
(b) B, F
(c) A, D
(d) B, D
(e) None of the above

Soln. E - None of the paths between the given pairs are blocked when C is conditioned on.

4. (1 Mark) Consider the following statements about Hidden Markov Models (HMMs):

I. The "Hidden" in HMM refers to the fact that the state transition probabilities are unknown.
II. The "Markov" property means that the current state depends only on the previous state.
III. The "Hidden" aspect relates to the underlying state sequence that is not directly observable.
IV. The "Markov" in HMM indicates that the model uses matrix operations for calculations.

Which of the statements correctly describe the "Hidden" and "Markov" aspects of Hidden
Markov Models?

(a) I and II
(b) I and IV
(c) II and III
(d) III and IV

Soln. C - The "Hidden" aspect refers to the underlying state sequence that is not directly observable (statement III), and the "Markov" property means that the current state depends only on the previous state (statement II). Both properties are written out below.
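Using hypothetical notation not given in the question (hidden states z_t, observed emissions x_t), the two properties can be stated as:

P(z_t | z_{t-1}, ..., z_1) = P(z_t | z_{t-1})                        (Markov property, statement II)
P(x_t | z_t, z_{t-1}, ..., z_1, x_{t-1}, ..., x_1) = P(x_t | z_t)    (only the x_t are observed; the z_t remain hidden, statement III)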

5. (2 marks) For the given graphical model, what is the optimal variable elimination order when
trying to calculate P(E=e)?

[Figure: directed chain A -> B -> C -> D -> E (structure implied by the factorization in the solution below).]

(a) A, B, C, D
(b) D, C, B, A
(c) A, D, B, C
(d) D, A, C, A

Soln. A - Eliminating A first, then B, then C, then D lets each summation be pushed inside the product, so every intermediate factor involves only a single remaining variable:

P(E = e) = \sum_d \sum_c \sum_b \sum_a P(a, b, c, d, e)

P(E = e) = \sum_d \sum_c \sum_b \sum_a P(a) P(b|a) P(c|b) P(d|c) P(e|d)

P(E = e) = \sum_d P(e|d) \sum_c P(d|c) \sum_b P(c|b) \sum_a P(a) P(b|a)
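A minimal sketch of this elimination order in code, assuming the chain A -> B -> C -> D -> E with made-up binary CPTs (the numeric values are illustrative, not from the question):

import numpy as np

P_A = np.array([0.5, 0.5])                  # P(A)
P_B_A = np.array([[0.7, 0.3], [0.2, 0.8]])  # P(B | A), rows indexed by A
P_C_B = np.array([[0.6, 0.4], [0.1, 0.9]])  # P(C | B)
P_D_C = np.array([[0.8, 0.2], [0.3, 0.7]])  # P(D | C)
P_E_D = np.array([[0.9, 0.1], [0.4, 0.6]])  # P(E | D)

m_B = P_A @ P_B_A    # eliminate A: m_B(b) = sum_a P(a) P(b|a)
m_C = m_B @ P_C_B    # eliminate B: m_C(c) = sum_b m_B(b) P(c|b)
m_D = m_C @ P_D_C    # eliminate C: m_D(d) = sum_c m_C(c) P(d|c)
P_E = m_D @ P_E_D    # eliminate D: P(E) = sum_d m_D(d) P(e|d)
print(P_E)           # marginal over E; every intermediate factor is over one variable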

6. (1 Mark) Consider the following statements regarding belief propagation:

I. Belief propagation is used to compute marginal probabilities in graphical models.


II. Belief propagation can be applied to both directed and undirected graphical models.
III. Belief propagation guarantees an exact solution when applied to loopy graphs.
IV. Belief propagation works by passing messages between nodes in a graph.

Which of the statements correctly describe the use of belief propagation?

(a) I and II
(b) II and III
(c) I, II, and IV
(d) I, III, and IV
(e) II, III, and IV

Soln. C - Statements I, II, and IV are correct: belief propagation computes marginal probabilities by passing messages between nodes, and it applies to both directed and undirected models. Statement III is false: on loopy graphs, (loopy) belief propagation is only approximate and is not guaranteed to give the exact marginals.
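A minimal sketch of sum-product message passing on a small tree (the chain A - B - C), illustrating statements I, II, and IV; the pairwise potentials below are made-up illustrative numbers, not from the question:

import numpy as np

psi_AB = np.array([[1.0, 0.5], [0.5, 2.0]])   # pairwise potential on (A, B)
psi_BC = np.array([[2.0, 1.0], [1.0, 0.3]])   # pairwise potential on (B, C)

# Messages are passed along edges; on a tree, one inward and one outward
# sweep of messages yields the exact node marginals (beliefs).
m_A_to_B = psi_AB.sum(axis=0)                 # message from A to B (sum over A)
m_C_to_B = psi_BC.sum(axis=1)                 # message from C to B (sum over C)
m_B_to_A = psi_AB @ m_C_to_B                  # message from B to A (sum over B)
m_B_to_C = m_A_to_B @ psi_BC                  # message from B to C (sum over B)

# Beliefs: product of incoming messages, normalized.
bel_A = m_B_to_A / m_B_to_A.sum()
bel_B = m_A_to_B * m_C_to_B
bel_B = bel_B / bel_B.sum()
bel_C = m_B_to_C / m_B_to_C.sum()
print(bel_A, bel_B, bel_C)                    # exact marginals of A, B, C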

7. (1 Mark) HMMs are used for finding these. Select all that apply.
(a) Probability of a given observation sequence
(b) All possible hidden state sequences given an observation sequence
(c) Most probable observation sequence given the hidden states
(d) Most probable hidden states given the observation sequence
Soln. A, D - The forward algorithm computes the probability of a given observation sequence (a), and the Viterbi algorithm finds the most probable hidden state sequence given the observation sequence (d).
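A minimal sketch of both computations for a toy 2-state HMM: the forward algorithm for (a) and the Viterbi algorithm for (d). All parameter values below are made-up illustrative numbers, not from the question:

import numpy as np

pi = np.array([0.6, 0.4])                   # initial state distribution
A = np.array([[0.7, 0.3], [0.4, 0.6]])      # A[i, j] = P(z_t = j | z_{t-1} = i)
B = np.array([[0.9, 0.1], [0.2, 0.8]])      # B[i, k] = P(x_t = k | z_t = i)
obs = [0, 1, 0]                             # an observation sequence

# Forward algorithm: probability of the observation sequence (option a).
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
print("P(observations) =", alpha.sum())

# Viterbi algorithm: most probable hidden state sequence (option d).
delta = pi * B[:, obs[0]]
backptr = []
for o in obs[1:]:
    scores = delta[:, None] * A             # scores[i, j]: best path ending in i, then i -> j
    backptr.append(scores.argmax(axis=0))
    delta = scores.max(axis=0) * B[:, o]
path = [int(delta.argmax())]
for bp in reversed(backptr):
    path.append(int(bp[path[-1]]))
path.reverse()
print("Most probable hidden states:", path)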
