2-Logistic Regression
Why use Logistic Regression?
The logistic regression equation can be obtained from the linear regression equation. The mathematical steps to derive the logistic regression equation are given below:
We know the equation of a straight line can be written as:
$$y = \theta^T x$$
In logistic regression, y can only take values between 0 and 1, so we divide the above equation by $(1 - y)$; since we need a range from $-\infty$ to $+\infty$, we then take the logarithm, which gives (writing the predicted probability as $p_i$):
$$\log_e\!\left(\frac{p_i}{1-p_i}\right) = \theta^T x$$
Logistic Regression Equation:
$$\log_e\!\left(\frac{p_i}{1-p_i}\right) = \theta^T x$$
$$\frac{p_i}{1-p_i} = e^{\theta^T x}$$
$$p_i = (1 - p_i)\, e^{\theta^T x}$$
$$p_i = \frac{1}{1 + e^{-\theta^T x}}$$
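To make the final equation concrete, here is a minimal Python sketch (NumPy and all names here are illustrative choices, not part of the notes) that computes $p_i = 1/(1 + e^{-\theta^T x})$ for one example:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_probability(theta, x):
    """p = 1 / (1 + e^(-theta^T x)) for a single feature vector x."""
    z = np.dot(theta, x)            # linear combination theta^T x
    return sigmoid(z)

# Made-up parameters and features, purely for illustration
theta = np.array([0.5, -1.2, 2.0])
x = np.array([1.0, 0.3, 0.8])         # the leading 1.0 can serve as the bias term
print(predict_probability(theta, x))  # a probability strictly between 0 and 1
```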
Types of Logistic Regression
Logistic regression is usually grouped into three types: binomial (two possible classes, e.g. spam or not spam), multinomial (three or more unordered classes), and ordinal (three or more ordered classes, e.g. low, medium, high).
Advantages
• Logistic regression is easier to implement and interpret, and very efficient to train.
• It makes no assumptions about the distributions of classes in feature space.
• It gives good accuracy for many simple data sets and performs well when the dataset is linearly separable.
• It is less inclined to overfitting, although it can overfit in high-dimensional datasets.
Limitations
• If the number of observations is less than the number of features, logistic regression should not be used; otherwise, it may lead to overfitting.
• It can only be used to predict discrete functions.
• In linear regression the independent and dependent variables are related linearly, but logistic regression requires the independent variables to be linearly related to the log odds, log(p/(1-p)).
Difference between Linear Regression and Logistic Regression
• Linear regression is used to predict a continuous dependent variable from a given set of independent variables; logistic regression is used to predict a categorical dependent variable from a given set of independent variables.
• Linear regression is used for solving regression problems; logistic regression is used for solving classification problems.
• In linear regression we predict the value of continuous variables; in logistic regression we predict the value of categorical variables.
• In linear regression we find the best-fit line, by which we can easily predict the output; in logistic regression we find the S-curve, by which we can classify the samples.
• In linear regression no threshold value is needed; in logistic regression a threshold value is needed (see the sketch after this list).
• In linear regression we plot the training data and draw a straight line that touches the maximum number of points; in logistic regression any change in a coefficient changes the direction of the logistic function, so a positive slope gives an S-shaped curve and a negative slope gives a Z-shaped curve.
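As a small illustration of the threshold point above, here is a minimal sketch (using the same sigmoid model as before; the 0.5 threshold is just the common default, and all names are illustrative) contrasting the two kinds of prediction:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def linear_prediction(theta, x):
    """Linear regression: the linear combination itself is the output."""
    return np.dot(theta, x)          # no threshold involved

def logistic_prediction(theta, x, threshold=0.5):
    """Logistic regression: S-curve output, then a threshold picks the class."""
    p = sigmoid(np.dot(theta, x))    # probability in (0, 1)
    return 1 if p >= threshold else 0

theta = np.array([0.5, -1.2, 2.0])    # made-up coefficients
x = np.array([1.0, 0.3, 0.8])
print(linear_prediction(theta, x))    # a continuous value
print(logistic_prediction(theta, x))  # a class label, 0 or 1
```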
The Similarities between Linear Regression and Logistic Regression
• Linear Regression and Logistic Regression are both supervised Machine Learning algorithms.
• Linear Regression and Logistic Regression are both parametric regression models, i.e. both models use linear equations for predictions (see the sketch after this list).
That's all the similarities we have between these two models.
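A minimal sketch of this shared supervised fit/predict workflow, using scikit-learn (a library choice made for this example, not something the notes prescribe) on a tiny made-up dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Tiny illustrative dataset: one feature, a continuous target and a binary target
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y_continuous = np.array([1.1, 1.9, 3.2, 3.9, 5.1, 6.2])  # for linear regression
y_binary = np.array([0, 0, 0, 1, 1, 1])                  # for logistic regression

# Both parametric models follow the same supervised fit/predict pattern
lin = LinearRegression().fit(X, y_continuous)
log = LogisticRegression().fit(X, y_binary)

print(lin.predict([[3.5]]))         # continuous prediction
print(log.predict([[3.5]]))         # predicted class label (0 or 1)
print(log.predict_proba([[3.5]]))   # class probabilities from the S-curve
```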
The Cost function of linear regression
For linear regression with hypothesis $h_\theta(x) = \theta^T x$ and $m$ training examples, the cost function is the mean squared error:
$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$
Want: $\min_\theta J(\theta)$
Repeat until convergence (gradient descent), updating every $\theta_j$ simultaneously:
$$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta)$$
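A minimal sketch of this cost function and the gradient descent update in Python (NumPy, the variable names, the learning rate, and the iteration count are all illustrative assumptions):

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/(2m) * sum of squared errors (the cost above)."""
    m = len(y)
    errors = X @ theta - y
    return (errors @ errors) / (2 * m)

def gradient_descent(X, y, alpha=0.05, iterations=2000):
    """Repeat: theta_j := theta_j - alpha * dJ/dtheta_j, all j updated together."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iterations):
        gradient = (X.T @ (X @ theta - y)) / m   # partial derivatives of J(theta)
        theta -= alpha * gradient                # simultaneous update of every theta_j
    return theta

# Tiny made-up dataset; the column of ones plays the role of the bias term
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([2.1, 3.9, 6.2, 8.1])
theta = gradient_descent(X, y)
print(theta, cost(theta, X, y))
```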