Unit 3
Linear Regression
Definition:
Linear Regression is a supervised machine learning algorithm used to predict a value (the dependent variable) from the value of one or more input variables (independent variables). It models a linear, straight-line relationship between the independent variables and the dependent variable.
Example:
Predicting a person's salary (dependent variable) from their years of experience (independent variable); as experience increases, salary tends to rise roughly along a straight line.
Types:
Simple Linear Regression: one independent variable is used to predict the dependent variable.
Multiple Linear Regression: two or more independent variables are used.
In short:
Linear regression fits the straight line (or hyperplane) that minimizes the error between the predicted and the actual values of the dependent variable.
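To make this concrete, here is a minimal sketch that fits a line with scikit-learn on a small made-up dataset (years of experience vs. salary; the numbers are illustrative and not from these notes):

# Minimal linear regression sketch (illustrative; the toy data are made up).
from sklearn.linear_model import LinearRegression

X = [[1], [2], [3], [4], [5]]            # independent variable: years of experience
y = [30000, 35000, 41000, 46000, 50000]  # dependent variable: salary

model = LinearRegression()
model.fit(X, y)                          # learn the best-fit line y = a*x + b

print("slope (a):", model.coef_[0])
print("intercept (b):", model.intercept_)
print("prediction for 6 years:", model.predict([[6]])[0])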
Decision Tree
Definition:
A Decision Tree is a supervised learning algorithm that repeatedly splits the data into branches based on feature values, forming a tree of if/then decision rules that can be used for classification or regression.
Pros:
Easy to understand and interpret; the learned tree can be read as a set of rules.
Handles both numerical and categorical features and needs little data preparation.
Cons:
Prone to overfitting, especially when the tree is allowed to grow deep.
Unstable: small changes in the training data can produce a very different tree.
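A minimal sketch of training a decision tree classifier with scikit-learn on a made-up toy dataset (features and labels are illustrative only):

# Decision tree sketch (illustrative; the toy data are made up).
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[2, 8], [3, 9], [1, 7], [7, 2], [8, 3], [6, 1]]  # two numeric features per sample
y = [0, 0, 0, 1, 1, 1]                                 # binary class labels

tree = DecisionTreeClassifier(max_depth=2)  # limiting depth helps against overfitting
tree.fit(X, y)

print(export_text(tree))        # the learned if/then rules, readable as text
print(tree.predict([[5, 5]]))   # classify a new point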
Support Vector Machine (SVM)
SVM is a supervised learning algorithm, used mainly for classification, that tries to find the best decision boundary (also called a hyperplane) that maximally separates the classes.
Key Concepts:
Hyperplane: the decision boundary that separates the classes.
Support vectors: the data points closest to the hyperplane; they determine where the boundary lies.
Margin: the distance between the hyperplane and the nearest data points of each class; SVM tries to make this as large as possible.
Goal of SVM
Imagine we have two classes that are linearly separable (you can draw a
straight line between them).
SVM finds the hyperplane with the largest margin.
Such a hyperplane can be written as
w · x + b = 0
Where:
w is the weight (normal) vector of the hyperplane,
x is the input feature vector, and
b is the bias (intercept) term.
The margin is 2 / ||w||, so finding the largest margin amounts to minimizing ||w|| while keeping the two classes on opposite sides of the hyperplane.
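The following minimal sketch shows a linear SVM fitting such a maximum-margin boundary; it uses scikit-learn's SVC with a linear kernel on a few made-up, linearly separable points (illustrative only):

# Linear SVM sketch (illustrative; the toy points are made up and linearly separable).
from sklearn.svm import SVC

X = [[1, 1], [2, 1], [1, 2], [5, 5], [6, 5], [5, 6]]
y = [0, 0, 0, 1, 1, 1]

clf = SVC(kernel="linear", C=1.0)   # linear kernel -> a straight-line hyperplane w.x + b = 0
clf.fit(X, y)

print("w (weights):", clf.coef_[0])              # normal vector of the hyperplane
print("b (bias):", clf.intercept_[0])
print("support vectors:", clf.support_vectors_)  # the points that define the margin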
Naïve Bayes
The fundamental Naïve Bayes assumption is that each feature makes an independent and equal contribution to the outcome.
Naive Bayes assumes that all features are independent given the class. That is:
P(x1, x2, …, xn | y) = P(x1 | y) · P(x2 | y) · … · P(xn | y)
Example:
Consider the car theft problem with attributes Color, Type, and Origin, and the target Stolen, which can be Yes or No. We need to predict whether the car will be stolen or not, given the features of the car. The columns of the dataset represent these features and the rows represent individual entries. If we take the first row of the dataset, we can observe that the car is stolen if the Color is Red and the Type and Origin take the values shown in that row. The variable y is the class variable (Stolen), which represents whether the car is stolen, and the variable X represents the parameters/features.
X is given as,
X = (x1, x2, x3, …, xn)
Here x1, x2, …, xn represent the features, i.e., they can be mapped to Color, Type, and Origin. By substituting for X and expanding using the chain rule we get,
P(y | x1, …, xn) = [ P(x1 | y) · P(x2 | y) · … · P(xn | y) · P(y) ] / [ P(x1) · P(x2) · … · P(xn) ]
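As a rough illustration of how these conditional probabilities are used in practice, here is a minimal sketch with scikit-learn's CategoricalNB. The integer encoding of Color, Type, and Origin and the toy rows are made up for illustration; they are not the dataset these notes refer to:

# Naive Bayes sketch (illustrative; the encoded toy data are made up).
from sklearn.naive_bayes import CategoricalNB

# Features encoded as integers, e.g. Color (0 = Red, 1 = other), Type (0 or 1),
# Origin (0 or 1). Target: Stolen (1 = Yes, 0 = No).
X = [[0, 0, 0], [0, 0, 0], [0, 0, 1], [1, 0, 1], [1, 1, 0], [1, 1, 1]]
y = [1, 1, 0, 1, 0, 0]

nb = CategoricalNB()
nb.fit(X, y)   # estimates P(y) and P(xi | y) for each feature value

# Posterior probabilities for a new car encoded as (0, 1, 0),
# obtained by multiplying the per-feature conditional probabilities.
print(nb.predict_proba([[0, 1, 0]]))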