
SVM Tutorial

Machine Learning
Question 1: Which one of the following sets is a possible set of support vectors for the data points x1–x5 shown in the accompanying figure?
1. {x1,x2,x5}
2. {x3,x4,x5}
3. {x4,x5}
4. {x1,x2,x3,x4}
Question 2: Consider the hard-margin SVM trained on only the two data points shown
below. What are the values for αi (the Lagrange multipliers) and the offset b that would
give the maximal margin linear classifier for these points?

i   xi   yi
1   0    +1
2   4    -1
A. α1 = α2 = 1/8, b = 1
B. α1 = α2 = 1/4, b = 0
C. α1 = α2 = 1/2, b = 1
D. α1 = α2 = 1/4, b = -1
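For a hard-margin SVM in one dimension, each candidate answer can be checked against the KKT conditions: stationarity w = Σ αi yi xi, the balance constraint Σ αi yi = 0, and yi(w·xi + b) = 1 at the support vectors (here both points). A minimal sketch of such a check, assuming the four options label (α, b) pairs with α1 = α2 = α:

```python
# Check which (alpha, b) candidates satisfy the hard-margin KKT
# conditions for the 1-D data x1 = 0 (y = +1), x2 = 4 (y = -1).
xs = [0.0, 4.0]
ys = [1.0, -1.0]

# Candidate (alpha, b) pairs taken from options A-D (alpha1 = alpha2 = alpha).
candidates = {"A": (1/8, 1.0), "B": (1/4, 0.0),
              "C": (1/2, 1.0), "D": (1/4, -1.0)}

def satisfies_kkt(alpha, b):
    # Stationarity: w = sum_i alpha_i * y_i * x_i
    w = sum(alpha * y * x for x, y in zip(xs, ys))
    # Balance constraint: sum_i alpha_i * y_i = 0
    balanced = abs(sum(alpha * y for y in ys)) < 1e-9
    # Both points are support vectors: y_i * (w * x_i + b) = 1
    on_margin = all(abs(y * (w * x + b) - 1) < 1e-9 for x, y in zip(xs, ys))
    return balanced and on_margin

valid = [name for name, (a, b) in candidates.items() if satisfies_kkt(a, b)]
print(valid)  # → ['A']
```

Only one option survives all three conditions, which is a useful self-test after working the problem by hand.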
Question 3: Consider a dataset with the following points in a two-dimensional space:
Positive class: (4, 2), (6, 2).
Negative class: (1, 5), (3, 5).
Find the equation of the optimal hyperplane that separates these points using a hard-margin SVM.
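A candidate answer can be verified numerically. Since both classes lie on the horizontal lines x2 = 2 and x2 = 5, the geometry suggests the boundary x2 = 3.5; scaled so the support vectors satisfy y(w·x + b) = 1, that corresponds to w = (0, -2/3), b = 7/3. This candidate is a worked guess to verify, not a value given in the text:

```python
# Numerically check a candidate maximum-margin hyperplane for Question 3.
# Candidate (an assumption derived from the geometry): w = (0, -2/3), b = 7/3,
# i.e. the line x2 = 3.5.
points = [((4, 2), 1), ((6, 2), 1), ((1, 5), -1), ((3, 5), -1)]
w = (0.0, -2/3)
b = 7/3

# Functional margin y * (w . x + b) for every training point;
# a valid hard-margin solution needs all values >= 1,
# with equality exactly at the support vectors.
margins = [y * (w[0] * x[0] + w[1] * x[1] + b) for (x, y) in points]
print(margins)
```

Here every margin equals 1, so all four points are support vectors, consistent with the two parallel class lines.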
Question 4: Consider a soft-margin SVM applied to a training set three times, each with a different
value of the hyperparameter C (the slack penalty weight). The results of these runs are as
follows:
In the first run, with C=CA, the decision boundary is denoted as A.
In the second run, with C=CB, the decision boundary is denoted as B.
In the third run, with C=CC, the decision boundary is denoted as C.

Give the relationship between CA, CB, and CC, based on the decision boundaries A, B, and C shown in the accompanying figure.
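The underlying idea being tested: a larger C penalizes slack more heavily, forcing the boundary to fit individual points at the cost of a narrower margin, while a small C tolerates margin violations and yields a wider margin. A minimal sketch illustrating this on a toy 1-D data set with one positive outlier (the data and the subgradient-descent solver are assumptions for illustration, not part of the question):

```python
# Illustrate how the slack penalty C affects a soft-margin SVM's margin.
# Objective minimised: 0.5 * w^2 + C * sum_i max(0, 1 - y_i * (w * x_i + b)),
# via plain subgradient descent on toy 1-D data with one positive outlier.
data = [(-3.0, -1), (-2.0, -1), (2.0, 1), (3.0, 1), (-1.0, 1)]  # (x, y)

def fit(C, steps=20000, lr=1e-3):
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw, gb = w, 0.0                    # gradient of the regulariser 0.5*w^2
        for x, y in data:
            if y * (w * x + b) < 1:        # point violates the margin
                gw -= C * y * x            # hinge-loss subgradient
                gb -= C * y
        w -= lr * gw
        b -= lr * gb
    return w, b

w_small, _ = fit(C=0.01)
w_large, _ = fit(C=100.0)
# Geometric margin is 2/|w|: small C tolerates slack and widens the margin.
print(2 / abs(w_small), 2 / abs(w_large))
```

With C = 0.01 the regularizer dominates, |w| stays small, and the margin is wide (the outlier is simply allowed slack); with C = 100 the solver contorts to satisfy the outlier and the margin shrinks. Ranking the figure's boundaries by how tightly they hug individual points gives the ordering of CA, CB, CC.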


Question 5: Using the SVM algorithm, find the hyperplane with maximum margin
for the following data.

X1   X2   Y
2    1    +1
4    3    -1
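With only two points, the maximum-margin boundary passes through their midpoint, perpendicular to the segment joining them; scaling so that y(w·x + b) = 1 at both points gives the candidate w = (-0.5, -0.5), b = 2.5 (a worked guess to verify, not a value stated in the text). A quick numerical check:

```python
# Check a candidate maximum-margin hyperplane for Question 5's two points:
# (2, 1) with y = +1 and (4, 3) with y = -1.
# Candidate (an assumption to be verified): w = (-0.5, -0.5), b = 2.5.
points = [((2, 1), 1), ((4, 3), -1)]
w = (-0.5, -0.5)
b = 2.5

# Functional margins: both should equal 1 (both points are support vectors).
margins = [y * (w[0] * x[0] + w[1] * x[1] + b) for (x, y) in points]
print(margins)

# Sanity check: the geometric margin 1/||w|| must equal half the
# Euclidean distance between the two points.
norm_w = (w[0] ** 2 + w[1] ** 2) ** 0.5
half_gap = 0.5 * ((4 - 2) ** 2 + (3 - 1) ** 2) ** 0.5
print(1 / norm_w, half_gap)
```

Both checks pass (1/||w|| = √2, exactly half the distance between the points), confirming the candidate hyperplane x1 + x2 = 5 is the maximum-margin separator.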
