SVM Presentation (ppt)

Uploaded by tabhanesumit348

Support Vector Machines (SVM)

Understanding the SVM Algorithm in Machine Learning
Introduction to SVM
• The Support Vector Machine (SVM) is a
supervised machine learning algorithm used
for both classification and regression tasks.
• In classification, SVM finds the hyperplane
that best separates the different classes.
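The basic workflow can be sketched in a few lines. This is a minimal example, assuming scikit-learn is installed; the toy data points are our own illustration, not from the slides.

```python
# Minimal sketch (assumes scikit-learn): fit an SVM classifier on a tiny
# two-class dataset and predict labels for new points.
from sklearn.svm import SVC

# Four training points in 2D, two per class (illustrative data)
X = [[0, 0], [1, 1], [4, 4], [5, 5]]
y = [0, 0, 1, 1]

clf = SVC(kernel="linear")  # linear kernel: look for a separating hyperplane
clf.fit(X, y)

print(clf.predict([[0.5, 0.5], [4.5, 4.5]]))  # one point near each class
```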
Key Concepts of SVM
• 1. Hyperplane: A decision boundary that
separates different classes in the dataset.
• 2. Support Vectors: Data points that are
closest to the hyperplane and influence its
position.
• 3. Margin: The distance between the
hyperplane and the nearest support vectors.
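The three concepts above can be inspected directly on a fitted model. A sketch, assuming scikit-learn; the data and the large-`C` setting (to approximate a hard margin) are our choices:

```python
# Sketch (assumes scikit-learn): read off the support vectors and compute
# the margin width of a linear SVM on toy data.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0, 0], [1, 1], [4, 4], [5, 5]])
y = [0, 0, 1, 1]

clf = SVC(kernel="linear", C=1000).fit(X, y)  # large C: near hard-margin fit

print(clf.support_vectors_)      # the points closest to the hyperplane
w = clf.coef_[0]
margin = 2 / np.linalg.norm(w)   # total margin width = 2 / ||w||
print(margin)
```

Here only the two inner points, (1, 1) and (4, 4), end up as support vectors; the outer points do not influence the hyperplane.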
Linear SVM
• Linear SVM is used when the data is linearly
separable, meaning a straight line or
hyperplane can separate the classes.
• Equation: w·x - b = 0
• Where w is the weight vector, x is the input
vector, and b is the bias term.
Example: Linear SVM
• In a 2D space, a linear SVM creates a line that
best separates two classes, with maximum
margin between the line and the nearest data
points (support vectors).
Non-Linear SVM
• Non-linear SVM is used when data is not
linearly separable. SVM uses kernel functions
to map data to higher dimensions.
• Common kernel functions:
• Polynomial Kernel
• Radial Basis Function (RBF) Kernel
Kernel Trick in SVM
• The kernel trick allows SVM to perform linear
classification in a transformed higher-
dimensional space, making it possible to
separate non-linearly separable data.
• This avoids explicitly computing the
coordinates of the data in higher dimensions,
making the algorithm efficient.
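The trick can be demonstrated with a degree-2 polynomial kernel: k(x, z) = (x·z)² equals the dot product of explicitly mapped quadratic features, but the kernel never builds those features. A pure-NumPy sketch; the feature map `phi` and the sample vectors are our own illustration:

```python
# Sketch of the kernel trick: the polynomial kernel (x·z)^2 computed in the
# original 2D space equals an inner product in a 3D quadratic feature space.
import numpy as np

def phi(v):
    # explicit map to the quadratic feature space for 2D input
    x1, x2 = v
    return np.array([x1 * x1, x2 * x2, np.sqrt(2) * x1 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

kernel_value = (x @ z) ** 2        # computed in the original 2D space
explicit_value = phi(x) @ phi(z)   # computed in the 3D feature space
print(np.isclose(kernel_value, explicit_value))  # same number, cheaper path
```

For the RBF kernel the corresponding feature space is infinite-dimensional, so avoiding the explicit mapping is not just efficient but essential.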
Example: Non-Linear SVM
• In non-linear SVM, the algorithm uses a kernel
function (e.g., RBF) to map data into a higher
dimension, allowing for separation using a
hyperplane.
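This can be seen on a dataset of concentric circles, where no straight line separates the classes. A sketch, assuming scikit-learn; the dataset parameters are our choices:

```python
# Sketch (assumes scikit-learn): a linear SVM fails on concentric circles,
# while an RBF-kernel SVM separates them, as described above.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear_acc = SVC(kernel="linear").fit(X, y).score(X, y)
rbf_acc = SVC(kernel="rbf").fit(X, y).score(X, y)
print(round(linear_acc, 2), round(rbf_acc, 2))  # rbf_acc is near 1.0
```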
SVM Hyperparameters
• 1. C (Regularization): Controls the trade-off
between maximizing the margin and
minimizing the classification error.
• 2. Gamma (for RBF Kernel): Defines how far
the influence of a single training example
reaches. A low gamma means a far-reaching
influence (smoother boundary); a high gamma
limits influence to nearby points (more
complex boundary).
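The effect of these two hyperparameters can be observed on training accuracy. A sketch, assuming scikit-learn; the dataset and the specific C/gamma values are our choices for illustration:

```python
# Sketch (assumes scikit-learn): larger C and gamma make an RBF SVM track
# the training data more tightly on a noisy two-moons dataset.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.25, random_state=0)

scores = {}
for C, gamma in [(0.1, 0.1), (1.0, 1.0), (1000.0, 100.0)]:
    clf = SVC(kernel="rbf", C=C, gamma=gamma).fit(X, y)
    scores[(C, gamma)] = clf.score(X, y)
    print(C, gamma, round(scores[(C, gamma)], 3))  # training accuracy rises
```

Note that the high-C, high-gamma model is likely overfitting: its high training accuracy would not carry over to held-out data, which is why these values are normally tuned by cross-validation.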
Advantages of SVM
• 1. Effective in high-dimensional spaces.
• 2. Works well for both linear and non-linear
data using kernel functions.
• 3. Effective when the number of dimensions
exceeds the number of samples.
Disadvantages of SVM
• 1. Choosing the right kernel function and
tuning its hyperparameters can be challenging.
• 2. SVM is less practical for large datasets,
since training time grows rapidly with the
number of samples.
• 3. It can be computationally expensive and
memory-intensive, as the kernel matrix grows
quadratically with the training set size.
Applications of SVM
• 1. Text and hypertext categorization
• 2. Image classification
• 3. Bioinformatics (e.g., cancer classification)
• 4. Handwriting recognition
• 5. Face detection in images
Conclusion
• The Support Vector Machine (SVM) is a versatile
algorithm that is widely used for classification
and regression tasks, especially when dealing
with high-dimensional data or when a clear
margin of separation between classes is desired.

• Reference: https://medium.com/@qjbqvwzmg/kernel-methods-in-machine-learning-theory-and-practice-b030bbe0eacc
