Support Vector Machines
Week 12
(b) 1-Dimensional Data
What classifier would you pick now?
(c) 1-Dimensional Data
What classifier would you pick this time?
Soft margins
For 1-D data we only need a single point to set the threshold; it is considered a 0-dimensional hyperplane. Soft margins are more flexible and allow some misclassification.
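The 0-dimensional hyperplane idea can be sketched in a few lines (the data here is made up for illustration): the decision "point" sits midway between the closest opposing samples, which is the maximum-margin choice.

```python
# Minimal sketch with hypothetical 1-D data: the "hyperplane" is just a
# threshold point placed midway between the closest opposing samples.
def fit_threshold(neg, pos):
    """Place the decision point halfway between the largest negative
    and the smallest positive sample (the maximum-margin choice)."""
    return (max(neg) + min(pos)) / 2

def classify(x, threshold):
    return "pos" if x > threshold else "neg"

neg = [1.0, 1.5, 2.0]        # class A measurements
pos = [4.0, 4.5, 5.0]        # class B measurements
t = fit_threshold(neg, pos)  # midway between 2.0 and 4.0 -> 3.0
print(classify(3.5, t))      # prints "pos"
```

A soft margin would additionally tolerate a few samples on the wrong side of the threshold instead of letting a single outlier drag it away.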
2-Dimensional Data
For 2-dimensional data we need a line to set the threshold; it is considered a 1-dimensional hyperplane. A similar situation can be imagined for higher dimensions.
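In 2-D the separating hyperplane is a line w·x + b = 0, and a point is classified by the sign of w·x + b. A tiny sketch with assumed (not trained) weights:

```python
# Sketch with hand-picked weights: the hypothetical line x + y = 5
# separates the plane; the sign of w.x + b decides the side.
def predict(point, w=(1.0, 1.0), b=-5.0):
    score = w[0] * point[0] + w[1] * point[1] + b
    return "above" if score > 0 else "below"

print(predict((4, 3)))  # 4 + 3 - 5 = 2 > 0, prints "above"
print(predict((1, 2)))  # 1 + 2 - 5 = -2,    prints "below"
```

The same sign test generalizes unchanged to any number of dimensions; only the length of w grows.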
SVM: Linear Kernel
What classifier would you pick?
SVM: Polynomial Kernel
We can plot our data in the following coordinates:
X = x
Y = x²
The algorithm of transforming the data and then finding a Support Vector Classifier is called Support Vector Machines.
SVM: Polynomial Kernel
In order to classify a new data point, we'll need to classify its (x, x²) position.
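The transform from the slide can be sketched directly (the data points are made up): mapping each 1-D point x to (x, x²) can turn a class arrangement that no single threshold separates into one a straight line separates.

```python
# Sketch of the lifting idea with made-up data: map each 1-D point x
# to (x, x**2); class A surrounds the origin, class B lies outside it,
# so no single threshold on the line separates them.
def lift(x):
    return (x, x ** 2)

inner = [-1.0, 0.0, 1.0]   # class A, near the origin
outer = [-3.0, 3.0, 4.0]   # class B, on both sides of class A

# In the lifted (X, Y) plane the horizontal line Y = 4.5 separates them:
# every class-A point has x**2 <= 1, every class-B point has x**2 >= 9.
assert all(lift(x)[1] < 4.5 for x in inner)
assert all(lift(x)[1] > 4.5 for x in outer)
```

Kernels let an SVM work in such a lifted space without computing the new coordinates explicitly.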
● Sigmoid kernel
● Radial kernel
● Polynomial kernel
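For reference, the textbook forms of these three kernels, written out as plain functions (the hyperparameter values gamma, r, and d below are illustrative defaults, not tuned choices):

```python
import math

# Textbook forms of the three kernels named above; gamma, r and d are
# hyperparameters, set here only for illustration.
def polynomial(u, v, gamma=1.0, r=0.0, d=2):
    return (gamma * sum(a * b for a, b in zip(u, v)) + r) ** d

def rbf(u, v, gamma=1.0):
    # "Radial" (RBF) kernel: similarity decays with squared distance.
    sq_dist = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-gamma * sq_dist)

def sigmoid(u, v, gamma=1.0, r=0.0):
    return math.tanh(gamma * sum(a * b for a, b in zip(u, v)) + r)

print(rbf([0.0, 0.0], [0.0, 0.0]))  # identical points -> 1.0
```

Each function returns a similarity score between two points; the SVM only ever needs these scores, never the lifted coordinates themselves.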
SVM: Multi-Class
Besides binary classification (2 classes), SVM can also handle tasks classifying more than 2 classes (multi-class).
Strategies involved:
● 1 vs all
● 1 vs 1
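The 1-vs-all strategy can be sketched with toy scorers (hand-written stand-ins for trained binary SVMs, not real models): one scorer per class, each rating "this class vs the rest", with the most confident scorer winning.

```python
# Toy sketch of 1 vs all: one binary scorer per class (hand-written
# stand-ins for trained decision functions), highest score wins.
scorers = {
    "red":    lambda x: x - 2.0,   # hypothetical decision functions
    "green":  lambda x: -abs(x),
    "purple": lambda x: -x - 2.0,
}

def predict_one_vs_all(x):
    return max(scorers, key=lambda cls: scorers[cls](x))

print(predict_one_vs_all(5.0))   # red scores 3.0, the highest
```

With k classes, 1 vs all trains only k classifiers, but each one sees an imbalanced "one class vs everything else" training set.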
SVM: Multi-Class: 1 vs all
Creating training sets: each class is trained against all of the remaining classes combined.

SVM: Multi-Class: 1 vs 1
Compared to 1 vs all, 1 vs 1 compares whether the data point should be classified as either class of a pair. The data point is assigned to the class with the most "wins":
Red vs Purple: Winner: Red
Green vs Red: Winner: Red
Result: Red
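The voting step above can be sketched as follows. The pairwise classifier here is a hard-coded stand-in that simply reproduces the slide's outcomes (Red beats both Purple and Green), not a trained model:

```python
from itertools import combinations
from collections import Counter

# Sketch of 1-vs-1 voting. pairwise_winner is a hard-coded stand-in
# for a trained binary SVM on each pair, reproducing the slide's
# outcomes: Red beats Purple and Green.
def pairwise_winner(a, b, x):
    return "Red" if "Red" in (a, b) else a

def predict_one_vs_one(classes, x):
    votes = Counter(pairwise_winner(a, b, x)
                    for a, b in combinations(classes, 2))
    return votes.most_common(1)[0][0]

print(predict_one_vs_one(["Red", "Green", "Purple"], x=None))  # "Red"
```

With k classes, 1 vs 1 trains k·(k−1)/2 classifiers, but each one sees only the (balanced) data of its two classes.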
Support Vector Machines
Pros:
● Works well on datasets with many features
● Provides a clear separation margin
● Effective for datasets where the number of features is greater than the number of data points
● Possible to specify different kernel functions to obtain a proper decision boundary
Cons:
● Requires high training time, so not recommended for large datasets
● Very sensitive to outliers