Assignment 5 Solution

The document contains a series of questions and answers related to Support Vector Machines (SVM), focusing on concepts such as the margin of a hyperplane, optimization problems, and the role of Lagrange multipliers. Key points include the definitions of hard margin and soft margin SVMs, and the dual optimization problem solved using quadratic programming. The answers provided clarify the conditions under which SVMs are effective and the mathematical formulations involved in their design.

Data Mining: Assignment Week 5: Support Vector Machine

1. The margin of a hyperplane is defined as:

A. The angle it makes with the axes

B. The intercept it makes on the axes

C. Perpendicular distance from its closest point

D. Perpendicular distance from origin

Ans: C
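Option C can be checked numerically: the margin is the perpendicular distance from the hyperplane to its closest point, |WᵀX + b| / ‖W‖. A minimal sketch (the hyperplane and the points are assumed purely for illustration):

```python
import numpy as np

# Hypothetical 2-D hyperplane W^T X + b = 0 with W = (3, 4), b = -5.
w = np.array([3.0, 4.0])
b = -5.0

def distance_to_hyperplane(x, w, b):
    """Perpendicular distance from point x to the hyperplane W^T X + b = 0."""
    return abs(w @ x + b) / np.linalg.norm(w)

# The margin of the hyperplane w.r.t. a point set is the distance to the closest point.
points = np.array([[5.0, 0.0], [0.0, 5.0], [2.0, 2.0]])
margin = min(distance_to_hyperplane(x, w, b) for x in points)
# Here ||w|| = 5, and (2, 2) is closest: |3*2 + 4*2 - 5| / 5 = 1.8
```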

2. In a hard margin support vector machine:

A. No training instances lie inside the margin

B. All the training instances lie inside the margin

C. Only few training instances lie inside the margin

D. None of the above

Ans: A

3. The primal optimization problem solved to obtain the hard margin optimal
separating hyperplane is:

A. Minimize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 for all i

B. Maximize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 for all i

C. Minimize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i

D. Maximize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i

Ans: A
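The primal problem in option A can be approximated with scikit-learn by taking a very large C, which effectively forbids slack (the toy data below is an assumption for illustration). The geometric margin of the resulting hyperplane is 1/‖W‖:

```python
import numpy as np
from sklearn.svm import SVC

# Tiny linearly separable toy set: negatives at x=0,1 and positives at x=3,4.
X = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0], [4.0, 0.0]])
y = np.array([-1, -1, 1, 1])

# A very large C approximates the hard-margin problem:
#   minimize (1/2) W^T W   s.t.   y_i (W^T X_i + b) >= 1 for all i.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

# Every training point satisfies the constraint with functional margin >= 1 ...
constraints = y * (X @ w + b)
# ... and the geometric margin of the separating hyperplane is 1 / ||W||.
geometric_margin = 1.0 / np.linalg.norm(w)
```

For this data the optimal hyperplane sits midway between x = 1 and x = 3, so the geometric margin comes out to about 1.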

4. The dual optimization problem solved to obtain the hard margin optimal separating
hyperplane is:

A. Maximize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 − αᵢ for all i

B. Minimize ½ WᵀW − Σᵢ αᵢ(yᵢ(WᵀXᵢ + b) − 1), such that αᵢ ≥ 0 for all i

C. Minimize ½ WᵀW − Σᵢ αᵢ, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i

D. Maximize ½ WᵀW + Σᵢ αᵢ, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i


Ans: B

5. The Lagrange multipliers corresponding to the support vectors have a value:

A. equal to zero

B. less than zero

C. greater than zero

D. can take on any value

Ans: C
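Option C can be observed in scikit-learn: `SVC` stores `yᵢαᵢ` only for the support vectors in `dual_coef_`, so the multipliers recovered there are strictly positive, while all other training points implicitly have αᵢ = 0 (the toy data is an assumption for illustration):

```python
import numpy as np
from sklearn.svm import SVC

# Linearly separable toy set; only the two innermost points should matter.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.5],
              [3.0, 0.5], [4.0, 0.0], [4.0, 1.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)

# dual_coef_ holds y_i * alpha_i for the support vectors only, so every
# Lagrange multiplier recovered here is strictly greater than zero.
alphas = np.abs(clf.dual_coef_[0])
```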

6. SVMs are less effective when:

A. The data is linearly separable

B. The data is clean and ready to use

C. The data is noisy and contains overlapping points

D. None of the above

Ans: C

7. The dual optimization problem in SVM design is solved using:

A. Linear programming

B. Quadratic programming

C. Dynamic programming

D. Integer programming

Ans: B
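The dual is a quadratic program: a quadratic objective in the αᵢ with linear constraints. As a hedged sketch (not how production SVM solvers work, which use specialized methods such as SMO), a tiny instance can be handed to a generic QP-capable solver like SciPy's SLSQP:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed toy data: negatives at x=0,1 and positives at x=3,4.
X = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0], [4.0, 0.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])

# Dual QP: minimize (1/2) a^T Q a - sum(a), with Q_ij = y_i y_j <X_i, X_j>,
# subject to a_i >= 0 and sum_i a_i y_i = 0.
Q = np.outer(y, y) * (X @ X.T)

def dual_objective(a):
    return 0.5 * a @ Q @ a - a.sum()

res = minimize(
    dual_objective,
    x0=np.zeros(len(y)),
    method="SLSQP",
    bounds=[(0.0, None)] * len(y),                       # alpha_i >= 0
    constraints={"type": "eq", "fun": lambda a: a @ y},  # sum_i alpha_i y_i = 0
)
alpha = res.x

# Recover the weight vector: W = sum_i alpha_i y_i X_i.
w = (alpha * y) @ X
```

For this data the solver should recover the same hyperplane as the hard-margin primal, with only the two inner points carrying nonzero multipliers.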
8. The relative performance of an SVM on the training set and on unknown samples is
controlled by:

A. Lagrange multipliers

B. Margin

C. Slack

D. Generalization constant C

Ans: D
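One way to see C at work (on assumed noisy toy data): a small C tolerates more slack, so the margin widens and more points end up on or inside it as support vectors, which usually generalizes better on overlapping data; a large C penalizes slack heavily and fits the training set harder:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Noisy, overlapping two-class data (assumed toy set).
X = np.vstack([rng.normal(-1, 1.2, (40, 2)), rng.normal(1, 1.2, (40, 2))])
y = np.array([-1] * 40 + [1] * 40)

# Small C: wide margin, much slack tolerated -> many support vectors.
soft = SVC(kernel="linear", C=0.01).fit(X, y)
# Large C: narrow margin, slack heavily penalized -> fewer support vectors.
hard = SVC(kernel="linear", C=100.0).fit(X, y)

n_sv_soft = len(soft.support_)
n_sv_hard = len(hard.support_)
```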

9. The primal optimization problem that is solved to obtain the optimal separating
hyperplane in soft margin SVM is:

A. Minimize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 − ξᵢ for all i

B. Minimize ½ WᵀW + C Σᵢ ξᵢ², such that yᵢ(WᵀXᵢ + b) ≥ 1 − ξᵢ for all i

C. Minimize ½ WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 − ξᵢ² for all i

D. Minimize ½ WᵀW + C Σᵢ ξᵢ², such that yᵢ(WᵀXᵢ + b) ≥ 1 for all i

Ans: B
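The slack variables ξᵢ can be recovered from any fitted soft-margin model as ξᵢ = max(0, 1 − yᵢ(WᵀXᵢ + b)); they are zero for points outside the margin and positive for points inside it or misclassified. A hedged sketch on assumed toy data (note scikit-learn's `SVC` penalizes slack linearly, C Σᵢ ξᵢ, rather than the squared form in option B, but the slack recovery is the same idea):

```python
import numpy as np
from sklearn.svm import SVC

# Toy set where one negative point (x = 2.6) crowds the positive class,
# so the soft-margin solution prefers slack over a tiny margin.
X = np.array([[0.0, 0.0], [1.0, 0.0], [2.6, 0.0], [3.0, 0.0], [4.0, 0.0]])
y = np.array([-1, -1, -1, 1, 1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

# Slack xi_i = max(0, 1 - y_i (W^T X_i + b)); zero outside the margin.
xi = np.maximum(0.0, 1.0 - y * (X @ w + b))
```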

10. We are designing an SVM WᵀX + b = 0; suppose the Xⱼ are the support vectors and the
αⱼ the corresponding Lagrange multipliers. Which of the following statements are
correct:

A. W = Σⱼ αⱼyⱼXⱼ

B. Σⱼ αⱼyⱼ = 0

C. Either A or B

D. Both A and B

Ans: D
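Both identities can be verified on a fitted model: scikit-learn's `dual_coef_` stores αⱼyⱼ for the support vectors Xⱼ, so statement A says `coef_` equals `dual_coef_ @ support_vectors_`, and statement B says the entries of `dual_coef_` sum to zero (the toy data is an assumption for illustration):

```python
import numpy as np
from sklearn.svm import SVC

# Small linearly separable toy set.
X = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 1.0], [4.0, 1.0]])
y = np.array([-1, -1, 1, 1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)

# dual_coef_[0] holds alpha_j * y_j for the support vectors X_j.
alpha_y = clf.dual_coef_[0]
sv = clf.support_vectors_

# Statement A: W = sum_j alpha_j y_j X_j
w_from_duals = alpha_y @ sv
# Statement B: sum_j alpha_j y_j = 0
balance = alpha_y.sum()
```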
