
Introduction to Machine Learning

Week 3
Prof. B. Ravindran, IIT Madras

1. (1 Mark) For a two-class problem using discriminant functions (δk - discriminant function for
class k), where is the separating hyperplane located?

(a) Where δ1 > δ2


(b) Where δ1 < δ2
(c) Where δ1 = δ2
(d) Where δ1 + δ2 = 1

Soln. C

2. (1 Mark) Given the following dataset consisting of two classes, A and B, calculate the prior
probability of each class.

Feature 1 Class
2.3 A
1.8 A
3.2 A
2.7 B
3.0 A
2.1 A
1.9 B
2.4 B

What are the prior probabilities of class A and class B?

(a) P (A) = 0.5, P (B) = 0.5


(b) P (A) = 0.625, P (B) = 0.375
(c) P (A) = 0.375, P (B) = 0.625
(d) P (A) = 0.6, P (B) = 0.4

Soln. B
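The prior probabilities follow from the class counts: class A appears 5 times and class B appears 3 times out of 8 rows, so P(A) = 5/8 = 0.625 and P(B) = 3/8 = 0.375. A quick Python sketch of this count-based computation:

```python
from collections import Counter

# (feature, class) pairs from the dataset in question 2
data = [(2.3, "A"), (1.8, "A"), (3.2, "A"), (2.7, "B"),
        (3.0, "A"), (2.1, "A"), (1.9, "B"), (2.4, "B")]

counts = Counter(label for _, label in data)          # A: 5, B: 3
priors = {c: n / len(data) for c, n in counts.items()}  # relative frequencies
```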
3. (1 Mark) In a 3-class classification problem using linear regression, the output vectors for three
data points are [0.8, 0.3, -0.1], [0.2, 0.6, 0.2], and [-0.1, 0.4, 0.7]. To which classes would these
points be assigned?

(a) 1, 2, 1
(b) 1, 2, 2
(c) 1, 3, 2
(d) 1, 2, 3

Soln. D
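With linear regression on indicator targets, each point is assigned to the class whose output component is largest. The argmax entries here are positions 1, 2, and 3 respectively, which a short sketch confirms:

```python
outputs = [[0.8, 0.3, -0.1], [0.2, 0.6, 0.2], [-0.1, 0.4, 0.7]]

# Assign each point to the class with the largest output (1-indexed classes)
assigned = [max(range(3), key=lambda k: vec[k]) + 1 for vec in outputs]
```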

4. (1 Mark) If you have a 5-class classification problem and want to avoid masking using polynomial regression, what is the minimum degree of the polynomial you should use?
(a) 3
(b) 4
(c) 5
(d) 6
Soln. B (to avoid masking among K classes, polynomial terms up to degree K − 1 = 4 are needed)
5. (1 Mark) Consider a logistic regression model where the predicted probability for a given data
point is 0.4. If the actual label for this data point is 1, what is the contribution of this data
point to the log-likelihood?
(a) -1.3219
(b) -0.9163
(c) +1.3219
(d) +0.9163
Soln. B
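For a point with true label y = 1 and predicted probability p̂, the per-point log-likelihood term y ln p̂ + (1 − y) ln(1 − p̂) reduces to ln(0.4) ≈ −0.9163. A minimal check in Python:

```python
import math

p_hat, y = 0.4, 1
# Per-point Bernoulli log-likelihood: y*ln(p) + (1 - y)*ln(1 - p)
ll = y * math.log(p_hat) + (1 - y) * math.log(1 - p_hat)  # ln(0.4)
```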
6. (1 Mark) What additional assumption does LDA make about the covariance matrix in comparison to the basic assumption of Gaussian class conditional density?
(a) The covariance matrix is diagonal
(b) The covariance matrix is identity
(c) The covariance matrix is the same for all classes
(d) The covariance matrix is different for each class
Soln. C
7. (1 Mark) What is the shape of the decision boundary in LDA?
(a) Quadratic
(b) Linear
(c) Circular
(d) Cannot be determined
Soln. B
8. (1 Mark) For two classes C1 and C2 with within-class variances σw1² = 1 and σw2² = 4 respectively, if the projected means are µ′1 = 1 and µ′2 = 3, what is the Fisher criterion J(w)?

(a) 0.5
(b) 0.8
(c) 1.25

(d) 1.5

Soln. B
Sw = σw1² + σw2² = 1 + 4 = 5; Sb = (µ′2 − µ′1)² = (3 − 1)² = 4; J(w) = Sb/Sw = 4/5 = 0.8
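The arithmetic in this solution can be verified with a short Python sketch of the Fisher criterion for one projected dimension:

```python
# Within-class variances and projected means from question 8
var_w1, var_w2 = 1.0, 4.0
mu1_p, mu2_p = 1.0, 3.0

S_w = var_w1 + var_w2            # within-class scatter: 1 + 4 = 5
S_b = (mu2_p - mu1_p) ** 2       # between-class scatter: (3 - 1)^2 = 4
J = S_b / S_w                    # Fisher criterion
```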
9. (2 Marks) Given two classes C1 and C2 with means µ1 = [2, 3]ᵀ and µ2 = [5, 7]ᵀ respectively, what is the direction vector w for LDA when the within-class covariance matrix Sw is the identity matrix I?

(a) [4, 3]ᵀ
(b) [5, 7]ᵀ
(c) [0.7, 0.7]ᵀ
(d) [0.6, 0.8]ᵀ

Soln. D
w ∝ Sw⁻¹(µ2 − µ1) = µ2 − µ1 = [3, 4]ᵀ; normalizing to unit length gives w = [0.6, 0.8]ᵀ
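Since Sw = I, the LDA direction reduces to the difference of the class means, normalized to unit length. A minimal sketch of this computation:

```python
import math

mu1, mu2 = (2.0, 3.0), (5.0, 7.0)

# With Sw = I, the LDA direction is proportional to mu2 - mu1
d = (mu2[0] - mu1[0], mu2[1] - mu1[1])   # (3, 4)
norm = math.hypot(*d)                    # length 5
w = (d[0] / norm, d[1] / norm)           # unit direction vector
```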
