ML Chapter 5 Part 2
• Figure – 6 illustrates one such example, where a support vector classifier was fit to a small data set. The solid line shows the hyperplane and the dashed lines show the margins.
• For the purple observations, observations 3, 4, 5, and 6 are on the correct side of the margin, observation 2 is on the margin, and observation 1 is on the wrong side of the margin.
• For the blue observations, observations 7 and 10 are on the correct side of the margin, observation 9 is on the margin, and observation 8 is on the wrong side of the margin.
• The right-hand panel is largely the same as the left-hand panel, except for two additional observations, 11 and 12. These two points are on the wrong side of the hyperplane as well as the margin; the sketch below shows how such positions can be read off a fitted classifier.
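As a minimal sketch (the coordinates and labels below are made-up toy data, not the figure's actual observations), scikit-learn's linear support vector classifier scales its decision function f(x) so that the margins sit at f(x) = ±1, which lets us read off each observation's position:

import numpy as np
from sklearn.svm import SVC

# Toy two-class data; y = -1 plays the role of "purple", y = +1 of "blue"
X = np.array([[1.0, 1.0], [1.5, 2.0], [1.0, 3.0], [2.0, 2.5],
              [3.5, 3.5], [4.0, 2.5], [4.5, 3.0], [3.0, 1.0]])
y = np.array([-1, -1, -1, -1, 1, 1, 1, 1])

# Note: scikit-learn's C penalizes margin violations, so it plays the
# *inverse* role of the violation budget C used in the next section.
clf = SVC(kernel="linear", C=1.0).fit(X, y)

f = clf.decision_function(X)
for i, (fi, yi) in enumerate(zip(f, y), start=1):
    m = yi * fi                     # > 0 iff on the correct side of the hyperplane
    if np.isclose(m, 1.0):
        pos = "on the margin"
    elif m > 1.0:
        pos = "correct side of the margin"
    elif m > 0.0:
        pos = "inside the margin (wrong side of the margin)"
    else:
        pos = "wrong side of the hyperplane"
    print(f"observation {i}: y*f(x) = {m:+.2f} -> {pos}")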
Support Vector Classifier - Explanation
• The support vector classifier, sometimes called a soft margin classifier, allows some observations to be on the wrong side of the margin, or even of the hyperplane, in exchange for greater robustness and better classification of the remaining observations.
Slack Variable and Tuning Parameter
• The slack variable ε_i tells us where the i-th observation lies relative to the margin: if ε_i = 0, the observation is on the correct side of the margin; if ε_i > 0, it has violated the margin; and if ε_i > 1, it is on the wrong side of the hyperplane.
• The tuning parameter C acts as a budget for the total amount of margin violation. A larger C permits more violations and gives a wider margin; a smaller C gives a narrower margin that is violated less often.
• C therefore controls the bias-variance trade-off of the classifier and is generally chosen via cross-validation. The optimization problem is sketched below.
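As a sketch in LaTeX of the standard soft-margin formulation these slides appear to follow (the notation β_j, ε_i, M, and C is assumed from that standard treatment):

\begin{align*}
\underset{\beta_0,\ldots,\beta_p,\;\varepsilon_1,\ldots,\varepsilon_n,\;M}{\text{maximize}}\quad & M\\
\text{subject to}\quad & \sum_{j=1}^{p}\beta_j^{2}=1,\\
& y_i\bigl(\beta_0+\beta_1 x_{i1}+\cdots+\beta_p x_{ip}\bigr)\ge M(1-\varepsilon_i),\\
& \varepsilon_i\ge 0,\qquad \sum_{i=1}^{n}\varepsilon_i\le C.
\end{align*}

Here M is the width of the margin, each ε_i measures how severely observation i violates it, and the budget C caps the total violation.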
Support Vector Machines
• In this section, we shall learn about a mechanism for extending a linear classifier so that it produces non-linear decision boundaries.
• Support vector machines do this in an automatic way.
Non-Linear Decision Boundary
• Support vector classifiers work well in a two-class setting when the decision boundary between the two classes is linear.
• In practice, however, we often face non-linear class boundaries. In such cases, a support vector classifier, or any other linear classifier, performs poorly.
Non-Linear Decision Boundary
• One way to handle a non-linear boundary is to enlarge the feature space using non-linear functions of the predictors, for example fitting the classifier with X_1, X_1^2, X_2, X_2^2, ..., X_p, X_p^2 (2p features rather than p).
• In the enlarged feature space, the fitted decision boundary is linear, but in the original feature space it is non-linear; a short sketch of this idea follows.
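As a minimal sketch of this enlargement (the data set here is an assumption for illustration, using scikit-learn's make_circles rather than anything from the slides):

import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two classes separated by a circle: no linear boundary exists in (x1, x2)
X, y = make_circles(n_samples=200, factor=0.4, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
print("accuracy in original space:", linear.score(X, y))       # roughly chance level

# Enlarge the feature space: (x1, x2) -> (x1, x2, x1^2, x2^2).
# The circular boundary x1^2 + x2^2 = r^2 is *linear* in the new features.
X_big = np.hstack([X, X ** 2])
enlarged = SVC(kernel="linear").fit(X_big, y)
print("accuracy in enlarged space:", enlarged.score(X_big, y))  # near perfect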
Kernel Function
• It is not easy to handle non-linear transformations of the input data into a higher-dimensional space.
• Many options are available, but they can be computationally expensive.
• Kernel functions were introduced to avoid some of these problems.
• The support vector machine (SVM) is an extension of the support vector classifier.
• It results from transforming the input data in a specific way using kernels.
• The main idea behind the support vector machine is to enlarge the feature space so that it can accommodate a non-linear boundary between the classes.
• This can be achieved efficiently using the kernel approach.
Kernel Function
• A kernel is a function K(x_i, x_i') that quantifies the similarity of two observations.
• The linear support vector classifier can be represented using only the inner products of pairs of observations, rather than the observations themselves.
• Replacing each inner product with a more general kernel yields the support vector machine; the representation is sketched below.
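Sketched below, under the standard notation (α_i are the fitted coefficients, S the set of support vectors, and γ > 0 and the degree d are assumed kernel parameters), is the kernel representation of the classifier together with the common kernel choices:

\begin{align*}
f(x) &= \beta_0 + \sum_{i \in S} \alpha_i \, K(x, x_i),\\
K(x_i, x_{i'}) &= \sum_{j=1}^{p} x_{ij} x_{i'j} && \text{(linear kernel)},\\
K(x_i, x_{i'}) &= \Bigl(1 + \sum_{j=1}^{p} x_{ij} x_{i'j}\Bigr)^{d} && \text{(polynomial kernel of degree } d\text{)},\\
K(x_i, x_{i'}) &= \exp\Bigl(-\gamma \sum_{j=1}^{p} (x_{ij} - x_{i'j})^{2}\Bigr) && \text{(radial basis kernel)}.
\end{align*}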
Example – Kernel Function
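As a sketch, consider a standard example (assumed here, with two predictors and a degree-2 polynomial kernel) of how a kernel implicitly enlarges the feature space:

\begin{align*}
K(x, z) &= (1 + x_1 z_1 + x_2 z_2)^{2} = \langle \varphi(x), \varphi(z) \rangle,\\
\varphi(x) &= \bigl(1,\ \sqrt{2}\,x_1,\ \sqrt{2}\,x_2,\ x_1^{2},\ x_2^{2},\ \sqrt{2}\,x_1 x_2\bigr).
\end{align*}

Evaluating K(x, z) directly costs only a few arithmetic operations, yet it equals an inner product in a six-dimensional enlarged feature space. For higher degrees d and more predictors p, the enlarged space grows combinatorially while the kernel evaluation stays cheap; this is why the kernel approach is efficient.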
Kernel Function
• Some common kernels are the polynomial kernel, the sigmoid kernel, and the Gaussian radial basis function (RBF).
• Each of these results in a different non-linear classifier in the original input space; the sketch below compares them.
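As a sketch comparing these kernels on assumed toy data (scikit-learn's make_moons, chosen for illustration, not from the slides):

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The same machinery with different kernels gives different non-linear
# classifiers in the original two-dimensional input space.
for kernel in ["poly", "sigmoid", "rbf"]:
    clf = SVC(kernel=kernel, degree=3, gamma="scale").fit(X_train, y_train)
    print(f"{kernel:>7} kernel: test accuracy = {clf.score(X_test, y_test):.2f}")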
SVM with Two or More Classes
• The concept of separating hyperplanes does not extend naturally to more than two classes.
• However, several proposals have been made to extend SVMs to the K-class case.
• The two most popular approaches are the one-versus-one and one-versus-all approaches.
One-Versus-One Classification
• The one-versus-one (all-pairs) approach constructs K(K-1)/2 binary SVMs, each comparing one pair of classes.
• A test observation is classified by all of these SVMs and assigned to the class that wins the most pairwise comparisons (a majority vote).
One-Versus-All Classification
• The one-versus-all (one-versus-rest) approach fits K SVMs, each comparing one of the K classes against the remaining K - 1 classes.
• A test observation is assigned to the class whose fitted decision function gives the largest value, since this corresponds to the highest level of confidence that the observation belongs to that class. Both strategies are sketched below.
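As a minimal sketch (using scikit-learn's iris data purely for illustration), both strategies are available as wrappers around any binary classifier:

from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)          # K = 3 classes

ovo = OneVsOneClassifier(SVC(kernel="rbf")).fit(X, y)
ovr = OneVsRestClassifier(SVC(kernel="rbf")).fit(X, y)

# One-versus-one fits K(K-1)/2 = 3 classifiers; one-versus-all fits K = 3.
print("one-versus-one fits", len(ovo.estimators_), "pairwise classifiers")
print("one-versus-all fits", len(ovr.estimators_), "one-vs-rest classifiers")
print("OvO accuracy:", ovo.score(X, y), "| OvA accuracy:", ovr.score(X, y))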
Thanks
Samatrix Consulting Pvt Ltd