Support Vector Machines

Support vector machines (SVMs) provide an alternative view of logistic regression that focuses on maximizing the margin between classes rather than minimizing errors. The SVM optimization objective aims to minimize a cost that depends on examples inside the margin and maximize the distance between the closest examples and the decision boundary. Nonlinear decision boundaries can be achieved using kernels to implicitly map data to higher dimensional spaces where a linear boundary can be found.


Alternative view of logistic regression

Hypothesis: h(x) = 1 / (1 + e^(-theta' x))

If y = 1, we want h(x) ≈ 1, i.e. theta' x >> 0
If y = 0, we want h(x) ≈ 0, i.e. theta' x << 0
Alternative view of logistic regression
Cost of a single example: -( y * log h(x) + (1 - y) * log(1 - h(x)) )

If y = 1 (want theta' x >> 0): cost = -log( 1 / (1 + e^(-theta' x)) )
If y = 0 (want theta' x << 0): cost = -log( 1 - 1 / (1 + e^(-theta' x)) )

Close approximation: replace each smooth logistic curve with a piecewise-linear hinge, cost1(z) for y = 1 (zero once z >= 1) and cost0(z) for y = 0 (zero once z <= -1).
This flat-then-linear shape gives the SVM a computational advantage.
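A minimal numpy sketch of this approximation (the helper names are mine; the break-points at z = 1 and z = -1 follow the hinge shape described above):

```python
import numpy as np

def logistic_cost_pos(z):
    # Logistic regression cost for a y = 1 example: -log(1 / (1 + e^-z))
    return np.log(1.0 + np.exp(-z))

def cost1(z):
    # SVM surrogate for y = 1: zero once z >= 1, linear penalty below that.
    return np.maximum(0.0, 1.0 - z)

def cost0(z):
    # SVM surrogate for y = 0: zero once z <= -1, linear penalty above that.
    return np.maximum(0.0, 1.0 + z)

# Compare the smooth logistic cost with its hinge approximation.
z = np.linspace(-3, 3, 7)
print(np.column_stack([z, logistic_cost_pos(z), cost1(z)]))
```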
Support vector machine
Logistic regression:
min over theta of (1/m) * sum_i [ y(i) * (-log h(x(i))) + (1 - y(i)) * (-log(1 - h(x(i)))) ] + (lambda / (2m)) * sum_j theta_j^2

Removing the constant factor 1/m will not change the minimizing theta.
Example: the u that minimizes (u - 5)^2 + 1 is unchanged if the whole expression is multiplied by any positive constant. Does the optimal theta change much? Not at all: scaling an objective changes its minimum value but not its minimizer.

Support vector machine:
min over theta of C * sum_i [ y(i) * cost1(theta' x(i)) + (1 - y(i)) * cost0(theta' x(i)) ] + (1/2) * sum_j theta_j^2

Writing the logistic regression objective as A + lambda * B and the SVM objective as C * A + B, where C plays the role of 1/lambda:

In logistic regression, a larger lambda means we care more about the 2nd term, i.e. B.
In the SVM, a larger C means we care more about the first term, i.e. A.
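To see why C behaves like 1/lambda, here is a small numeric check with made-up stand-ins for A and B (any positive terms would do): minimizing A + lambda * B and minimizing C * A + B with C = 1/lambda pick out the same theta.

```python
import numpy as np

# Toy 1-D stand-ins: A(theta) for the data-fit term, B(theta) for regularization.
thetas = np.linspace(-2, 4, 601)
A = (thetas - 3.0) ** 2          # stands in for the sum of cost1/cost0 terms
B = thetas ** 2                  # stands in for sum_j theta_j^2

lam = 0.5
C = 1.0 / lam

lr_objective  = A + lam * B      # logistic-regression form: A + lambda * B
svm_objective = C * A + B        # SVM form: C * A + B

# Both are minimized by the same theta (they differ only by the factor C).
print(thetas[np.argmin(lr_objective)], thetas[np.argmin(svm_objective)])
```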


SVM hypothesis

Hypothesis: unlike logistic regression, the SVM does not output a probability; it predicts the label directly:
h(x) = 1 if theta' x >= 0
h(x) = 0 otherwise
Support Vector Machine

[Figure: the two hinge costs; cost1(z) is zero for z >= 1 and grows linearly for z < 1, cost0(z) is zero for z <= -1 and grows linearly for z > -1]

If y = 1, we want theta' x >= 1 (not just >= 0)
If y = 0, we want theta' x <= -1 (not just < 0)
SVM Decision Boundary

Optimization objective of the SVM:
min over theta of C * sum_i [ y(i) * cost1(theta' x(i)) + (1 - y(i)) * cost0(theta' x(i)) ] + (1/2) * sum_j theta_j^2

Whenever y(i) = 1, we want theta' x(i) >= 1 (so that cost1(theta' x(i)) = 0).
Whenever y(i) = 0, we want theta' x(i) <= -1 (so that cost0(theta' x(i)) = 0).

Suppose we have set C to a very large value. In order to minimize this objective we then have to drive the first term (the sum over the examples) to zero. This can be done by setting every cost1() and cost0() to zero, which is achieved by manipulating theta' x: making theta' x(i) >= 1 whenever y(i) = 1, and theta' x(i) <= -1 whenever y(i) = 0.
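To make the role of C concrete, here is a small numpy sketch of this unconstrained objective (the function and variable names are mine, for illustration): as C grows, any violated margin dominates the objective, so the minimizer is forced to satisfy theta' x(i) >= 1 and theta' x(i) <= -1 exactly.

```python
import numpy as np

def svm_objective(theta, X, y, C):
    # C * sum_i [ y_i * cost1(theta' x_i) + (1 - y_i) * cost0(theta' x_i) ]
    #   + 0.5 * sum_j theta_j^2
    z = X @ theta
    cost1 = np.maximum(0.0, 1.0 - z)   # penalty for y = 1 examples
    cost0 = np.maximum(0.0, 1.0 + z)   # penalty for y = 0 examples
    data_term = np.sum(y * cost1 + (1 - y) * cost0)
    return C * data_term + 0.5 * np.sum(theta ** 2)

# Two 1-D examples: a positive at x = 2 and a negative at x = -2.
X = np.array([[2.0], [-2.0]])
y = np.array([1, 0])
theta_ok  = np.array([0.5])   # theta' x = +1 / -1: both margins met, data term 0
theta_bad = np.array([0.1])   # margins violated, data term > 0
for C in (1.0, 1e6):
    print(C, svm_objective(theta_ok, X, y, C), svm_objective(theta_bad, X, y, C))
```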
SVM Decision Boundary: Linearly separable case

[Figure: linearly separable data in the (x1, x2) plane; among all separating lines, the SVM picks the one with the largest margin to the nearest examples]

Large margin classifier

Large margin classifier in presence of outliers

[Figure: the same data with a single outlier; with C very large, the decision boundary swings to accommodate the outlier]

Disadvantage of the large margin classifier: with C very large it is sensitive to individual outliers. Choosing a smaller C lets the SVM ignore a few outliers and keep the sensible wide-margin boundary.
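This trade-off is easy to reproduce with scikit-learn's linear SVM (the toy data below is made up for illustration): with a moderate C the boundary keeps the wide margin between the clusters and misclassifies the outlier, while a huge C contorts the boundary to classify it.

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters, plus one point labeled 0 sitting
# inside the positive cluster (the outlier, last row).
X = np.array([[1.0, 1.0], [1.5, 1.2], [2.0, 1.0],
              [4.0, 4.0], [4.5, 4.2], [5.0, 4.0],
              [4.2, 3.8]])
y = np.array([0, 0, 0, 1, 1, 1, 0])

for C in (0.1, 1e5):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    w, b = clf.coef_[0], clf.intercept_[0]
    # Small C: wide margin, outlier misclassified. Huge C: tight boundary
    # squeezed in to classify every point, outlier included.
    print(f"C={C}: w={w}, b={b:.2f}, training accuracy={clf.score(X, y):.2f}")
```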
Non-linear Decision Boundary

[Figure: positive and negative examples in the (x1, x2) plane that no straight line can separate]

One option is high-order polynomial features: predict y = 1 if theta0 + theta1*x1 + theta2*x2 + theta3*x1*x2 + theta4*x1^2 + theta5*x2^2 + ... >= 0.
Is there a different / better choice of the features?

Kernel
Given x, compute new features f1, f2, f3 depending on proximity to landmarks l(1), l(2), l(3):
fi = similarity(x, l(i)) = exp( -||x - l(i)||^2 / (2 * sigma^2) )

[Figure: three landmarks l(1), l(2), l(3) marked in the (x1, x2) plane]
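A minimal numpy sketch of these landmark features, using the Gaussian similarity above (the landmark coordinates other than l(1) = (3, 5), and sigma = 1, are arbitrary choices for illustration):

```python
import numpy as np

def gaussian_similarity(x, landmark, sigma=1.0):
    # f = exp( -||x - l||^2 / (2 sigma^2) ): 1 at the landmark, -> 0 far away
    return np.exp(-np.sum((x - landmark) ** 2) / (2.0 * sigma ** 2))

landmarks = [np.array([3.0, 5.0]),   # l(1)
             np.array([0.0, 1.0]),   # l(2)
             np.array([4.0, 0.0])]   # l(3)

x = np.array([3.1, 4.9])             # a point close to l(1)
f = np.array([gaussian_similarity(x, l) for l in landmarks])
print(f)                             # f1 near 1; f2 and f3 near 0
```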
Kernels and Similarity
If x = l(1), the similarity is exp(0) = 1, so f1 is maximal, i.e. 1; with the landmark l(1) = (3, 5), that happens when x1 = 3 and x2 = 5. As x moves far from l(1), f1 falls toward 0.

Example: with theta1 = theta2 = 1, theta3 = 0, and a negative intercept theta0 (e.g. -0.5), we predict y = 1 for points near l(1) or l(2) (since theta1 = theta2 = 1), and we predict y = 0 otherwise.

[Figure: the region of the (x1, x2) plane near l(1) and l(2) where the classifier predicts y = 1]

x is close to l(1) and far away from l(2) and l(3), so f1 ≈ 1 while f2 ≈ 0 and f3 ≈ 0.
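Putting the example together as a short Python sketch (the landmark coordinates besides l(1) = (3, 5), and the intercept theta0 = -0.5, are illustrative assumptions, not from the slides): predict y = 1 when theta0 + theta1*f1 + theta2*f2 + theta3*f3 >= 0.

```python
import numpy as np

def features(x, landmarks, sigma=1.0):
    # f_i = exp(-||x - l(i)||^2 / (2 sigma^2)) for each landmark
    return np.array([np.exp(-np.sum((x - l) ** 2) / (2 * sigma ** 2))
                     for l in landmarks])

landmarks = [np.array([3.0, 5.0]), np.array([0.0, 1.0]), np.array([4.0, 0.0])]
theta0 = -0.5                           # assumed illustrative intercept
theta = np.array([1.0, 1.0, 0.0])       # slide example: theta1 = theta2 = 1, theta3 = 0

for x in (np.array([3.0, 5.0]),         # at l(1): f1 = 1, predict y = 1
          np.array([0.2, 1.1]),         # near l(2): f2 ~ 1, predict y = 1
          np.array([8.0, 8.0])):        # far from all landmarks: predict y = 0
    score = theta0 + theta @ features(x, landmarks)
    print(x, int(score >= 0))
```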
