Lecture 09

AI for Mechanical Engineering

Dr. Arsalan Arif

SVM

Artificial Intelligence: A Modern Approach
Stuart J. Russell and Peter Norvig

Spring 2023
Support Vector Machine
• A Y axis is added to the same data: Y axis = (X axis)²
• SVM converts the data from a lower dimension to a higher dimension
[Figure: 1-D observations on the X axis lifted onto the curve Y = (X axis)²]
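To make the idea concrete, here is a minimal sketch (my own, not from the slides): hypothetical 1-D observations are lifted to 2-D by explicitly adding Y = X², after which a plain linear support vector classifier can separate the classes.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical 1-D data: observations in the middle belong to class 1.
x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0, 0])

# Explicitly add the second axis: Y axis = (X axis) squared.
X_2d = np.column_stack([x, x ** 2])

# In 2-D the two classes can be separated by a straight line, so a plain
# linear support vector classifier works.
clf = SVC(kernel="linear").fit(X_2d, y)

new = np.array([2.5, 5.5])
print(clf.predict(np.column_stack([new, new ** 2])))  # likely [0 1]
```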
Support Vector Machine
Now the SVM can classify new observations.
• In this case the one-dimensional data was converted to 2 dimensions; why not 3 or 4?
• The kernel function decides this while finding the support vector classifier in higher dimensions
[Figure: the same data plotted with Y axis = (X axis)²]
Support Vector Machine
Polynomial Kernel
• The parameter d is the degree of the polynomial
• For d = 1, the kernel finds the relationship between each pair of observations in 1-D
• For d = 2, the kernel finds the relationship between each pair of observations in 2-D
• For d = 3, the kernel finds the relationship between each pair of observations in 3-D
• These pairwise relationships are then used to find the support vector classifier
• A good value of d can be found by cross validation (see the sketch below)
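As a hedged illustration of the cross-validation point (the data and the parameter grid below are made up, not from the lecture), scikit-learn's SVC with kernel="poly" can be combined with GridSearchCV to pick d:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Made-up 1-D observations; the middle region belongs to class 1.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = (np.abs(X[:, 0]) < 1.5).astype(int)

# Try several polynomial degrees; cross validation picks the d that
# generalises best. The coef0 parameter plays the role of the coefficient r.
search = GridSearchCV(
    SVC(kernel="poly", coef0=1.0),
    param_grid={"degree": [1, 2, 3, 4]},
    cv=5,
)
search.fit(X, y)
print("best degree d:", search.best_params_["degree"])
```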


Support Vector Machine
Radial Kernel
• The kernel computes the relationship between every pair of observations
• It behaves like a weighted nearest neighbour
• A new observation is classified mainly by the training observations closest to it (illustrated in the sketch below)
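A small sketch of this weighted-nearest-neighbour behaviour, with made-up observations and an assumed gamma value (not taken from the lecture):

```python
import numpy as np

def rbf(a, b, gamma=0.5):
    """Radial basis function kernel for scalar observations."""
    return np.exp(-gamma * (a - b) ** 2)

train = np.array([1.0, 2.0, 8.0, 9.0])   # made-up training observations
new = 2.5                                 # new observation to classify

weights = rbf(new, train)
for obs, w in zip(train, weights):
    print(f"observation {obs}: weight {w:.4f}")
# Nearby observations (1.0, 2.0) get large weights; distant ones (8.0, 9.0)
# get weights close to zero, so the nearest data dominate the classification.
```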
Support Vector Machine
Radial Kernel
• Kernel functions calculate the relationship between each pair of observations as if the data were in a higher dimension, but do not actually transform it; this is called the Kernel Trick
• The Kernel Trick removes the computation needed to transform the data from a low dimension to a high dimension (see the sketch below)
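The sketch below (same toy data as in the earlier explicit-transform example, which is assumed rather than taken from the slides) shows the practical consequence: with kernel="rbf" the classifier is trained on the raw 1-D data, and the higher-dimensional relationships are handled inside the kernel, so no transformed feature matrix is ever built.

```python
import numpy as np
from sklearn.svm import SVC

# Same toy 1-D data as before, but kept in 1-D: no explicit transformation.
x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float).reshape(-1, 1)
y = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0, 0])

# The radial kernel computes the "as if in a higher dimension" relationships
# internally, so we never build the higher-dimensional features ourselves.
clf = SVC(kernel="rbf", gamma=0.5).fit(x, y)
print(clf.predict([[2.5], [5.5]]))  # expected for this toy data: [0 1]
```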
Support Vector Machine
Polynomial Kernel (Kernel Trick)

Polynomial Kernel: (a · b + r)^d

r = coefficient of the polynomial
d = degree of the polynomial

Let r = 1/2 and d = 2:

(a · b + 1/2)² = (a · b + 1/2)(a · b + 1/2)
              = a · b + a²b² + 1/4
              = (a, a², 1/2) · (b, b², 1/2)

The coordinates of this dot product correspond to the X axis (a, b), the Y axis (a², b²) and the Z axis (the constant 1/2).

Once r and d are decided, put in the values of each pair of observations to get their high-dimensional relationship.

For example, 20,880.25 is one of the 2-D relationships obtained this way; these values are what the classifier uses to compare the two observations.
We can now solve for the SVM using these relationships, even though we never actually transformed the data to 2-D.
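A quick numeric check of the expansion above. The observation values a and b here are hypothetical (the slide's 20,880.25 would come from some particular pair), with r = 1/2 and d = 2 as in the worked example:

```python
# Hypothetical pair of observations; r = 1/2 and d = 2 as assumed above.
a, b = 3.0, 5.0
r, d = 0.5, 2

kernel_value = (a * b + r) ** d                  # (a*b + 1/2)^2, no transform
explicit_a = (a, a ** 2, 0.5)                    # explicit higher-D coordinates
explicit_b = (b, b ** 2, 0.5)
dot_product = sum(u * v for u, v in zip(explicit_a, explicit_b))

print(kernel_value, dot_product)  # both 240.25: same relationship either way
```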
