Data An-6
FROM SVM TO KM
Kernel Machines
• Gaussian (RBF) Kernel:
K(x, y) = exp(-gamma * ||x - y||^2)
where x and y are the input feature vectors, gamma is a parameter that controls the width of
the Gaussian function, and ||x - y||^2 is the squared Euclidean distance between the input vectors.
• Laplace Kernel:
K(x, y) = exp(-gamma * ||x - y||_1)
where x and y are the input feature vectors, gamma is a parameter that controls the width of
the Laplacian function, and ||x - y||_1 is the L1 norm (Manhattan distance) between the input vectors.
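As a sketch, both kernels above can be computed directly with NumPy; the helper names `rbf_kernel` and `laplace_kernel` and the choice gamma = 0.5 are illustrative, not from the slides:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Gaussian (RBF) kernel: exp(-gamma * squared Euclidean distance)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def laplace_kernel(x, y, gamma=0.5):
    # Laplace kernel: exp(-gamma * L1 / Manhattan distance)
    return np.exp(-gamma * np.sum(np.abs(x - y)))

x = np.array([3.0, 2.0, 0.0, 5.0])
y = np.array([1.0, 0.0, 0.0, 0.0])

print(rbf_kernel(x, y))      # squared Euclidean distance = 4 + 4 + 0 + 25 = 33
print(laplace_kernel(x, y))  # Manhattan distance = 2 + 2 + 0 + 5 = 9
```

Note that both kernels return 1 when x == y (distance zero) and decay toward 0 as the vectors move apart, with gamma controlling how fast.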
Consider an example: finding the similarity between two vectors x and y using cosine similarity.
Let x = { 3, 2, 0, 5 } and y = { 1, 0, 0, 0 }. The formula for cosine similarity is:
Cosine_Sim(x, y) = (x . y) / (||x|| * ||y||)
Here x . y = 3, ||x|| = sqrt(38), and ||y|| = 1, so Cosine_Sim(x, y) = 3 / sqrt(38) ≈ 0.49.
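The worked example can be checked in a few lines of NumPy (a minimal sketch; the variable names are illustrative):

```python
import numpy as np

x = np.array([3.0, 2.0, 0.0, 5.0])
y = np.array([1.0, 0.0, 0.0, 0.0])

# Cosine_Sim(x, y) = (x . y) / (||x|| * ||y||)
cos_sim = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(round(cos_sim, 4))  # 3 / sqrt(38) ≈ 0.4867
```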
• Characteristics of Kernel Function
• Symmetry: A kernel function is symmetric, meaning that it
produces the same value regardless of the order in which the
inputs are given.
• Reproducing property: it can be used to reconstruct the input
data in the feature space.
• Smoothness: a smooth transformation of the input data into the
feature space.
• Complexity: more complex kernel functions may lead to
overfitting and reduced generalization performance.
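The symmetry property above is easy to verify numerically. The sketch below checks K(x, y) == K(y, x) for a Gaussian kernel; the function name and gamma value are assumptions for illustration:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Symmetric by construction: ||x - y|| == ||y - x||
    return np.exp(-gamma * np.sum((x - y) ** 2))

x = np.array([3.0, 2.0, 0.0, 5.0])
y = np.array([1.0, 0.0, 0.0, 0.0])

# Symmetry: the value is the same regardless of argument order
print(np.isclose(rbf_kernel(x, y), rbf_kernel(y, x)))  # True
```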
Conclusion
- Kernel machines extend linear algorithms to work on non-linear data.
- SVM is a powerful tool when combined with kernel functions.
- Final Thoughts:
- Choosing the right kernel is crucial for the success of the model.
- Kernel methods continue to be relevant in various domains of machine learning.
---