Machine Learning Unit 5 Part 1

Machine Learning

Prof. Shraddha Kumar


Department of Computer Science

S . D. Bansal College, Indore


What we will learn

1. Support Vector Machines
2. Types of SVM
3. Pros and Cons of Support Vector Machines

Support Vector Machine

• Support Vector Machine (SVM) is one of the most popular Supervised Learning algorithms.
• It can be used for both Classification and Regression problems.
• However, it is primarily used for Classification problems in Machine Learning.
Support Vector Machine

• The goal of the SVM algorithm is to find the line or decision boundary that segregates n-dimensional space into classes, so that new data points can easily be placed in the correct category in the future.
• This best decision boundary is called a hyperplane.
Support Vector Machine
• The SVM algorithm can be used for:
• Face detection
• Image classification
• Text categorization, etc.
Support Vector Machine

• SVM chooses the extreme data points/vectors that help in creating the hyperplane.
• These extreme cases are called support vectors, and hence the algorithm is termed a Support Vector Machine.
Support Vector Machine

• Consider the diagram below, in which two different categories are classified using a decision boundary or hyperplane:
Hyperplane and Support Vectors in the SVM algo:

• Hyperplane: There can be multiple lines/decision boundaries that segregate the classes in n-dimensional space, but we need to find the best decision boundary for classifying the data points. This best boundary is known as the hyperplane of SVM.
Hyperplane and Support Vectors in the SVM algo:

• The dimensions of the hyperplane depend on the number of features in the dataset: if there are 2 features (as shown in the image), the hyperplane is a straight line; if there are 3 features, it is a 2-dimensional plane.
• We always create the hyperplane with the maximum margin, i.e., the maximum distance between the hyperplane and the nearest data points of either class.
Hyperplane and Support Vectors in the SVM algo:

Support Vectors:
The data points or vectors that are closest to the hyperplane, and which affect its position, are termed support vectors. Since these vectors support the hyperplane, they are called support vectors.
Types of SVM
• Linear SVM: Linear SVM is used for linearly separable data. If a dataset can be classified into two classes by a single straight line, the data is termed linearly separable, and the classifier used is called a Linear SVM classifier.
• Non-linear SVM: Non-linear SVM is used for non-linearly separable data. If a dataset cannot be classified by a straight line, the data is termed non-linear, and the classifier used is called a Non-linear SVM classifier.
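The two types above can be sketched with scikit-learn's SVC, assuming it is available; the kernel choice distinguishes them. The toy datasets below are illustrative, not from the slides:

```python
# Sketch (illustrative data): linear vs non-linear SVM with scikit-learn.
from sklearn.svm import SVC

# Linearly separable data: class 0 clustered lower-left, class 1 upper-right.
X_lin = [[0, 0], [0, 1], [1, 0], [3, 3], [3, 4], [4, 3]]
y_lin = [0, 0, 0, 1, 1, 1]
linear_clf = SVC(kernel="linear").fit(X_lin, y_lin)

# Non-linearly separable data: class 0 near the origin, class 1 on a
# surrounding ring. No single straight line separates them, so a
# non-linear (RBF) kernel is used.
X_ring = [[0, 0], [0.1, 0], [0, 0.1], [-0.1, 0],
          [2, 0], [0, 2], [-2, 0], [0, -2]]
y_ring = [0, 0, 0, 0, 1, 1, 1, 1]
rbf_clf = SVC(kernel="rbf").fit(X_ring, y_ring)

print(linear_clf.predict([[4, 4]]))     # a point deep in class 1 territory
print(rbf_clf.predict([[0.05, 0.05]]))  # a point near the origin (class 0)
```

The only change between the two classifiers is the `kernel` argument, which is exactly the Linear vs Non-linear SVM distinction described above.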
How does SVM work?
Linear SVM:
The working of the SVM algorithm can be understood through an example. Suppose we have a dataset with two tags (green and blue), and two features, x1 and x2. We want a classifier that can classify the pair (x1, x2) of coordinates as either green or blue. Consider the image below:
How does SVM work?

• The SVM algorithm helps to find the best line or decision boundary; this best boundary or region is called a hyperplane.
• The SVM algorithm finds the closest points of the lines from both classes. These points are called support vectors.
• The distance between the vectors and the hyperplane is called the margin, and the goal of SVM is to maximize this margin.
• The hyperplane with the maximum margin is called the optimal hyperplane.
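The margin in the steps above can be computed directly: the distance of a point x to the hyperplane w·x + b = 0 is |w·x + b| / ||w||, and the margin is the distance of the closest point. A minimal NumPy sketch, where the hyperplane (w, b) and points are assumed for illustration:

```python
# Sketch (illustrative hyperplane and points, not from the slides):
# distance of points to the hyperplane w·x + b = 0, margin = min distance.
import numpy as np

w = np.array([1.0, 1.0])   # hyperplane normal (assumed)
b = -3.0                   # offset, i.e. the line x1 + x2 = 3

points = np.array([[0.0, 1.0],   # one class
                   [4.0, 3.0],   # other class
                   [2.0, 2.0]])  # closest point -> a support vector

# Distance of each point to the hyperplane: |w·x + b| / ||w||
dist = np.abs(points @ w + b) / np.linalg.norm(w)
margin = dist.min()
print(dist)    # per-point distances
print(margin)  # distance of the closest point
```

SVM training searches over (w, b) for the choice that maximizes this margin; here both are fixed just to show the distance computation.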
How does SVM work?
Non-Linear SVM:
If data is linearly arranged, we can separate it with a straight line, but for non-linear data we cannot draw a single straight line. Consider the image below.
To separate these data points, we need to add one more dimension. For linear data we used the two dimensions x and y, so for non-linear data we add a third dimension z, which can be calculated as:
z = x² + y²
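The mapping z = x² + y² can be checked numerically: points in an inner cluster and an outer ring are not separable by a line in (x, y), but in the lifted space a simple threshold on z separates them. The data below is illustrative:

```python
# Sketch (illustrative data): the z = x^2 + y^2 lift described above.
import numpy as np

inner = np.array([[0.1, 0.2], [-0.3, 0.1], [0.2, -0.2]])  # one class
outer = np.array([[2.0, 0.5], [-1.5, 1.5], [0.5, -2.0]])  # other class

z_inner = (inner ** 2).sum(axis=1)   # z = x^2 + y^2 for each point
z_outer = (outer ** 2).sum(axis=1)

# Every inner z is below every outer z, so any threshold between the
# two ranges (e.g. z = 1) is a separating hyperplane in the new space.
print(z_inner.max(), z_outer.min())
```

This is the intuition behind kernels: a non-linear boundary (a circle) in the original space becomes a linear one (a plane) in the higher-dimensional space.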
Pros and Cons associated with SVM

Pros:
• It works really well when there is a clear margin of separation.
• It is effective in high-dimensional spaces.
• It uses a subset of training points in the decision function (the support vectors), so it is also memory-efficient.
Pros and Cons associated with SVM

Cons:
• It doesn’t perform well on large datasets, because the required training time is high.
• It also doesn’t perform very well when the dataset has more noise, i.e., the target classes overlap.
• SVM doesn’t directly provide probability estimates; these are calculated using an expensive five-fold cross-validation.
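The last point can be seen in scikit-learn (assumed here as the implementation): SVC gives no probabilities by default, and passing probability=True triggers the extra cross-validated calibration step. The toy dataset is illustrative:

```python
# Sketch (illustrative data): probability estimates in scikit-learn's SVC
# require probability=True, which fits an extra calibration step using
# internal cross-validation -- the expense mentioned above.
from sklearn.svm import SVC

X = [[float(i)] for i in range(10)] + [[float(i + 20)] for i in range(10)]
y = [0] * 10 + [1] * 10  # two well-separated classes on a line

clf = SVC(kernel="linear", probability=True).fit(X, y)
proba = clf.predict_proba([[0.0], [29.0]])
print(proba)  # one row per query point; each row sums to 1
```

Without probability=True, calling predict_proba raises an error; only class labels and decision-function scores are available.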
Conclusion

01 Support Vector Machines

02 Types of SVM

03 Pros and Cons of Support Vector Machines

Thank you so much!
For more info please contact us

0755 - 3501700
[email protected]

Video Lecture by : Prof. Shraddha Kumar
