Unit 2 ML - Ver 2
Machine Learning
Subject code: AL3451
Regulations: 2021
Simple Linear Regression

y = β0 + β1x

where:
y is the dependent variable
x is the independent variable
β0 is the intercept
β1 is the slope
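The intercept and slope above can be estimated with the least square method using the standard closed-form formulas. A minimal sketch on a small made-up dataset (the data values are illustrative, not from the slides):

```python
# Least-squares fit of y = b0 + b1*x using the closed-form estimates:
#   b1 = sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))**2)
#   b0 = mean(y) - b1 * mean(x)

def fit_simple_linear(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b0 = my - b1 * mx
    return b0, b1

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
b0, b1 = fit_simple_linear(xs, ys)
print(b0, b1)  # -> 2.2 0.6
```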
Least Square Method
Multiple Linear Regression using Least Square Method
y = β0 + β1x1 + β2x2 + .... + βnxn

where:
y is the dependent variable
x1, x2, x3, .... are the independent variables
β0 is the intercept
β1, β2, β3, .... are the slopes
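The multiple-regression coefficients can be found with the least square method via the normal equations, beta = (XᵀX)⁻¹Xᵀy, where the first column of X is all ones so that beta[0] is the intercept. A sketch with NumPy on hypothetical data generated from y = 1 + 2x1 + 3x2:

```python
import numpy as np

# Multiple linear regression by least squares.
# np.linalg.lstsq solves the normal equations in a numerically stable way.
def fit_multiple_linear(X, y):
    # prepend a column of ones so beta[0] is the intercept b0
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta

# hypothetical data lying exactly on y = 1 + 2*x1 + 3*x2
X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1]]
y = [1, 3, 4, 6, 8]
beta = fit_multiple_linear(X, y)
print(beta)  # -> approximately [1. 2. 3.]
```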
Linear Regression
Multiple Linear Regression
Bayesian Linear Regression
• Regression is a machine learning task that predicts continuous
values (real numbers), as compared to classification, which is used
to predict categorical (discrete) values.
• Bayesian Regression can be very useful when we have
insufficient data in the dataset or the data is poorly distributed.
• The output of a Bayesian Regression model is obtained from a
probability distribution.
• The aim of Bayesian Linear Regression is to find the ‘posterior’
distribution for the model parameters.
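With a Gaussian prior on the weights and Gaussian noise, the posterior over the model parameters is itself Gaussian and can be computed in closed form. A minimal sketch, assuming illustrative precision hyperparameters alpha (prior) and beta (noise) that are not from the slides:

```python
import numpy as np

# Bayesian linear regression with a conjugate Gaussian prior:
#   prior:      w ~ N(0, alpha^{-1} I)
#   likelihood: y ~ N(Xw, beta^{-1} I)
# Posterior over w is N(m_N, S_N) with
#   S_N = (alpha*I + beta * X^T X)^{-1}   (posterior covariance)
#   m_N = beta * S_N @ X^T y              (posterior mean)
def posterior(X, y, alpha=1.0, beta=25.0):
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    S_N = np.linalg.inv(alpha * np.eye(X.shape[1]) + beta * X.T @ X)
    m_N = beta * S_N @ X.T @ y
    return m_N, S_N

# design matrix with a bias column; data roughly on y = 0.5 + x
X = np.array([[1, 0.0], [1, 1.0], [1, 2.0], [1, 3.0]])
y = np.array([0.4, 1.6, 2.4, 3.6])
m_N, S_N = posterior(X, y)
print(m_N)  # posterior mean of [intercept, slope]
```

Unlike ordinary least squares, the output is a full distribution: S_N quantifies how uncertain the parameters remain, which is what makes this useful with insufficient data.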
Example data points: (2.9, 3.2), (0.5, 1.4), (2.3, 1.9)
Gradient Descent
Take the derivatives of the sum of squared residuals
SSR = Σ (yi − (intercept + slope·xi))²
with respect to the intercept and the slope:
∂SSR/∂intercept = −2 Σ (yi − (intercept + slope·xi))
∂SSR/∂slope = −2 Σ xi (yi − (intercept + slope·xi))
[Figure: scatter plot of Height (y-axis, 0 to 4) vs Weight (x-axis, 0 to 2.5)]
Gradient Descent
• Types of Gradient Descent:
1. Batch Gradient Descent
2. Stochastic Gradient Descent
3. Minibatch Gradient Descent
• Batch Gradient Descent involves calculations
over the full training set.
• Stochastic Gradient Descent runs one training
example per iteration.
• Minibatch Gradient Descent divides the training
datasets into small batch sizes and performs the
updates on those batches separately.
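Batch gradient descent can be sketched directly from the SSR derivatives above, using the three example data points listed earlier, (0.5, 1.4), (2.3, 1.9), (2.9, 3.2), assumed here to be the (weight, height) pairs from the scatter plot:

```python
# Batch gradient descent for simple linear regression: every step
# computes the gradient of the SSR over the FULL training set.
def batch_gd(data, lr=0.01, steps=2000):
    intercept, slope = 0.0, 0.0
    for _ in range(steps):
        # gradients of SSR = sum (y - (intercept + slope*x))^2
        d_int = sum(-2 * (y - (intercept + slope * x)) for x, y in data)
        d_slp = sum(-2 * x * (y - (intercept + slope * x)) for x, y in data)
        intercept -= lr * d_int
        slope -= lr * d_slp
    return intercept, slope

data = [(0.5, 1.4), (2.3, 1.9), (2.9, 3.2)]
print(batch_gd(data))  # converges to the least-squares fit (~0.95, ~0.64)
```

Stochastic gradient descent would replace the two sums with a single randomly chosen point per update; minibatch would sum over a small subset.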
Reminder .....
• Supervised learning has 2 types of tasks based
on the nature of the target feature (dependent
variable):
1. Regression
2. Classification
g(x) = wᵀx
• w = [w0 w1 .... wd]ᵀ (augmented weight vector)
Example: the boundary x1 + x2 − 1 = 0
can be written as [−1 1 1]·[1 x1 x2]ᵀ = 0
Perceptron Algorithm
• Perceptron is an example of a linear discriminant function.
• Discriminative models excel at classification tasks by effectively distinguishing between different classes.
• Discriminative models set out to answer the following question:
“What side of the decision boundary is this data found in?”
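The perceptron learning rule can be sketched on a tiny made-up linearly separable dataset, using the augmented weight vector w = [bias, w1, w2] from the example above:

```python
# Perceptron learning: update weights only on misclassified points.
# Decision rule: sign(w . [1, x1, x2]) -- which side of the boundary?
def train_perceptron(points, labels, lr=0.1, epochs=50):
    w = [0.0, 0.0, 0.0]  # [bias, w1, w2]
    for _ in range(epochs):
        for (x1, x2), t in zip(points, labels):
            pred = 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else -1
            if pred != t:  # mistake-driven update
                w[0] += lr * t
                w[1] += lr * t * x1
                w[2] += lr * t * x2
    return w

# made-up points: below the line x1 + x2 - 1 = 0 labelled -1, above +1
points = [(0.0, 0.0), (0.2, 0.3), (1.0, 1.0), (0.9, 0.8)]
labels = [-1, -1, 1, 1]
w = train_perceptron(points, labels)
print(w)  # a separating boundary's weights
```

For linearly separable data the perceptron convergence theorem guarantees this loop stops making mistakes after finitely many updates.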
Logistic Regression
• Linear regression predicts the numerical response but is not suitable for predicting categorical values.
• When categorical variables are involved, it is called a classification problem, and logistic regression is suitable for the binary classification problem.
• Odds(θ) = p / (1 − p)
• Take log on the odds formula to get the log-odds (logit):
log(p / (1 − p)) = z
• Exponentiating on both sides, we have
p / (1 − p) = e^z
• Let Y = e^z. Then p = Y(1 − p) = Y − Y·p
• Since p + Y·p = p(1 + Y) = Y,
p = Y / (1 + Y) = e^z / (1 + e^z) = 1 / (1 + e^(−z))
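The derivation above ends at the sigmoid, p = 1 / (1 + e^(−z)). A minimal sketch of binary logistic regression that feeds a linear score z = b0 + b1·x through the sigmoid and trains by gradient descent on the log-loss (the dataset is illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Per-sample gradient of the log-loss w.r.t. b0 and b1 is
# (p - y) and (p - y)*x respectively.
def fit_logistic(xs, ys, lr=0.1, steps=5000):
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)
            b0 -= lr * (p - y)
            b1 -= lr * (p - y) * x
    return b0, b1

# made-up data: class 0 for small x, class 1 for large x
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
print(sigmoid(b0 + b1 * 1.0), sigmoid(b0 + b1 * 3.5))  # low, high
```

The model outputs a probability p in (0, 1); thresholding it at 0.5 gives the binary class prediction.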
Types of Logistic Regression
• There are three types of Logistic Regressions:
1. Binary logistic regression
2. Multinomial logistic regression
3. Ordinal logistic regression