Regression Bayesian SVM Notes
1. REGRESSION
Linear Regression:
- Definition:
It is a statistical method to predict the value of one variable (dependent) based on another
(independent).
- Key Points:
- Used for predicting continuous values like house prices, salaries, etc.
- The goal is to find the best-fit line that minimizes the error.
- Error is measured with the Least Squares Method: the sum of squared differences between actual
and predicted values.
- Formula:
y = mx + c
Where:
- y = predicted (dependent) value
- x = input (independent) value
- m = slope of the best-fit line
- c = intercept (value of y when x = 0)
- Example:
- Input: Size of the house.
- Output: Price.
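As an illustration of the least-squares fit, a minimal sketch using NumPy; the house sizes and
prices below are made up for the example:

import numpy as np

# Hypothetical data: house sizes (sq. ft.) and prices (in thousands).
sizes = np.array([800, 1000, 1200, 1500, 1800], dtype=float)
prices = np.array([150, 180, 210, 260, 300], dtype=float)

# np.polyfit with degree 1 returns the least-squares slope m and intercept c.
m, c = np.polyfit(sizes, prices, 1)

# Predict the price of a new house with the best-fit line y = mx + c.
new_size = 1300
predicted_price = m * new_size + c
print(f"m = {m:.3f}, c = {c:.1f}, predicted price for {new_size} sq. ft. = {predicted_price:.1f}")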
Logistic Regression:
- Definition:
Used when the output is a categorical value (e.g., Yes/No, 0/1). Instead of predicting a continuous
value, it predicts the probability that the input belongs to a class.
- Key Points:
- The output is a probability between 0 and 1, not a continuous value.
- A threshold (commonly 0.5) converts the probability into a class label.
- Despite the name, it is used for classification, not regression.
- Formula:
P = 1 / (1 + e^(-z))
Where z = mx + c.
- How it works:
Compute z = mx + c, pass z through the sigmoid to get a probability between 0 and 1, then assign
the class by comparing that probability to the threshold.
- Example:
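A minimal sketch of the sigmoid turning z = mx + c into a probability and a Yes/No label; the
values of m, c, and the 0.5 threshold are illustrative, not fitted:

import math

def sigmoid(z):
    """Map any real number z to a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative (not fitted) parameters for z = m*x + c.
m, c = 1.2, -6.0

for x in [2.0, 5.0, 8.0]:
    p = sigmoid(m * x + c)               # probability of class "Yes" / 1
    label = "Yes" if p >= 0.5 else "No"  # decision threshold at 0.5
    print(f"x = {x}: P = {p:.2f} -> {label}")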
2. BAYESIAN LEARNING
Bayes' Theorem:
- Definition:
A formula used to find the probability of an event occurring based on prior knowledge.
- Formula:
P(A|B) = (P(B|A) * P(A)) / P(B)
Where:
- P(A|B) = probability of A given B (posterior)
- P(B|A) = probability of B given A (likelihood)
- P(A) = prior probability of A
- P(B) = total probability of B (evidence)
- Example:
If 1% of people have a disease and a test is 99% accurate, what's the probability you have the
disease given a positive test? Bayes' theorem gives only about 50%, because false positives among
the 99% of healthy people are as common as true positives among the 1% who are sick.
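Working the numbers with Bayes' theorem (assuming "99% accurate" means 99% sensitivity and 99%
specificity):

# P(disease), P(positive | disease), P(positive | no disease)
p_disease = 0.01
p_pos_given_disease = 0.99
p_pos_given_healthy = 0.01   # 1 - specificity, assuming 99% accuracy both ways

# P(B): total probability of testing positive
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive test) = {p_disease_given_pos:.2f}")  # about 0.50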
Concept Learning:
- Definition:
The process of learning a general rule from examples. Bayesian methods use probabilities to pick
the most probable hypothesis given the observed examples.
- Example:
If it rains, people use umbrellas 80% of the time. From past data, the model learns the concept:
"rain -> umbrella" with probability 0.8.
Bayes Optimal Classifier:
- Definition:
Combines the predictions of all possible hypotheses, weighted by their posterior probabilities, and
classifies new data with the class that has the highest combined probability.
- Key Point:
On average no other classifier using the same hypothesis space and prior knowledge can outperform
it, but computing it exactly is usually too expensive, which motivates simpler approximations such
as Naive Bayes.
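A toy sketch of the idea: every hypothesis votes for a class, and the votes are weighted by
posterior probability. The hypotheses and posteriors below are made up for illustration:

# Made-up hypotheses: each has a posterior probability and a predicted class for one new example.
hypotheses = [
    {"posterior": 0.4, "prediction": "No"},   # the single most probable (MAP) hypothesis
    {"posterior": 0.3, "prediction": "Yes"},
    {"posterior": 0.3, "prediction": "Yes"},
]

# Sum the posterior weight behind each class, then pick the class with the largest total.
weights = {}
for h in hypotheses:
    weights[h["prediction"]] = weights.get(h["prediction"], 0.0) + h["posterior"]

best_class = max(weights, key=weights.get)
print(weights)     # {'No': 0.4, 'Yes': 0.6}
print(best_class)  # 'Yes' -- even though the single most probable hypothesis says 'No'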
Naive Bayes Classifier:
- Definition:
A simpler version of the Bayes classifier that assumes all features are independent of each other
given the class.
- Formula:
P(C | x1, ..., xn) ∝ P(C) * P(x1|C) * P(x2|C) * ... * P(xn|C)
- Example:
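As a small illustration of the formula, a hand-rolled sketch on a made-up "play tennis" dataset;
the features, values, and counts are invented for the example:

# Tiny made-up training data: (weather, windy) -> play tennis?
data = [
    ("sunny", "no",  "yes"),
    ("sunny", "yes", "no"),
    ("rainy", "yes", "no"),
    ("rainy", "no",  "yes"),
    ("sunny", "no",  "yes"),
]

def naive_bayes(weather, windy):
    scores = {}
    for c in ("yes", "no"):
        rows = [r for r in data if r[2] == c]
        prior = len(rows) / len(data)                                # P(C)
        p_weather = sum(r[0] == weather for r in rows) / len(rows)   # P(x1|C)
        p_windy = sum(r[1] == windy for r in rows) / len(rows)       # P(x2|C)
        scores[c] = prior * p_weather * p_windy                      # P(C) * P(x1|C) * P(x2|C)
    return max(scores, key=scores.get), scores

print(naive_bayes("sunny", "no"))  # predicts "yes" with the higher score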
- Definition:
- Example:
EM Algorithm (Expectation-Maximization):
- Definition:
An iterative method for estimating model parameters when some of the data or variables are hidden
or missing.
- Steps:
1. E-step: estimate the hidden/missing values using the current parameter estimates.
2. M-step: re-estimate the parameters to maximize the likelihood given those estimates.
3. Repeat the E and M steps until the parameters converge.
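A compact sketch of EM fitting a mixture of two 1-D Gaussians to synthetic data, with equal fixed
variances to keep the update rules short:

import numpy as np

rng = np.random.default_rng(0)
# Synthetic data drawn from two hidden groups with means near 0 and 5.
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])

mu = np.array([1.0, 4.0])   # initial guesses for the two means
pi = np.array([0.5, 0.5])   # initial mixing weights
var = 1.0                   # variance held fixed for simplicity

for _ in range(50):
    # E-step: responsibility of each component for each point (estimate of the hidden variable).
    dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate the means and mixing weights from the responsibilities.
    mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
    pi = resp.mean(axis=0)

print("estimated means:", mu.round(2))    # close to 0 and 5
print("estimated weights:", pi.round(2))  # close to 0.5 and 0.5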
3. SUPPORT VECTOR MACHINES (SVM)
Introduction:
- SVM tries to find the best boundary (hyperplane) that separates data points into classes.
Hyperplane:
- Definition:
The decision boundary that separates the classes: a line in 2D, a plane in 3D, and a hyperplane in
higher dimensions.
- Example:
In 2D, a straight line that separates the points of one class from the points of the other.
Support Vectors:
- The points closest to the hyperplane that help define its position.
Kernels:
1. Linear Kernel:
K(x, y) = x · y. Used when the classes are (roughly) linearly separable.
2. Polynomial Kernel:
K(x, y) = (x · y + c)^d. Maps the data into a higher-degree feature space so a curved boundary can
be learned.
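A short sketch with scikit-learn's SVC on a tiny made-up 2-D dataset, showing the support vectors
of a linear-kernel model and how to switch to a polynomial kernel; the data and parameters are
illustrative:

import numpy as np
from sklearn.svm import SVC

# Made-up 2-D points for two classes.
X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

# Linear kernel: straight-line (flat hyperplane) boundary.
linear_svm = SVC(kernel="linear", C=1.0).fit(X, y)
print("support vectors:\n", linear_svm.support_vectors_)   # the points defining the boundary
print("prediction for [4, 4]:", linear_svm.predict([[4.0, 4.0]]))

# Polynomial kernel: allows a curved boundary when the classes are not linearly separable.
poly_svm = SVC(kernel="poly", degree=3, C=1.0).fit(X, y)
print("prediction for [4, 4]:", poly_svm.predict([[4.0, 4.0]]))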
Properties of SVM:
- Maximizes the margin between the classes, which helps generalization.
- The decision boundary depends only on the support vectors, not on the other training points.
- Kernels let SVM learn non-linear boundaries without explicitly transforming the data.
Issues in SVM:
- Choosing the right kernel and tuning its parameters is difficult.
- Training is slow on very large datasets.
- Performance drops on noisy data with overlapping classes.
- It does not directly output class probabilities.