ML Logistic Regression
Sigmoid function:

    Y = 1 / (1 + e^(-Z)),  where Z = w.X + b

Large negative values of Z push Y close to 0, and large positive values push Y close to 1. Example: suppose Z = 5X + 10.
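A minimal sketch of the sigmoid in Python (NumPy-based; the code itself is not on the slide), showing that large negative Z gives values near 0 and large positive Z gives values near 1:

```python
import numpy as np

def sigmoid(z):
    """Map any real-valued z to the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(-10))  # close to 0
print(sigmoid(0))    # exactly 0.5
print(sigmoid(10))   # close to 1
```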
Logistic Regression
Advantages
Easy to implement.
Performs well on data with a linear relationship.
Less prone to overfitting on low-dimensional datasets.
Disadvantages
High-dimensional datasets can cause overfitting.
Difficult to capture complex relationships in a dataset.
Sensitive to outliers.
Needs a larger dataset.
Logistic Regression – Example Calculation….
Suppose Z = 5X + 10. Then Y = 1 / (1 + e^(-Z)) (sigmoid function).
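The slide's example Z = 5X + 10 (i.e. w = 5, b = 10) can be worked through numerically; the sample X values below are chosen for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Z = 5X + 10, i.e. w = 5 and b = 10 (the slide's example)
X = np.array([-10.0, -2.0, 0.0, 2.0])
Z = 5 * X + 10
Y = sigmoid(Z)
for x, z, y in zip(X, Z, Y):
    print(f"X = {x:6.1f}  Z = {z:6.1f}  ->  Y = {y:.4f}")
```

Note that X = -2 gives Z = 0 and hence Y = 0.5, the decision boundary.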
Logistic Regression - Inference….
If Z is a large positive number, then e^(-Z) approaches 0, so

    Y = 1 / (1 + 0)
    Y = 1
Cost function (J): here 'm' represents the number of data points in the training set.
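The slide's cost formula is not reproduced in the text; assuming the standard binary cross-entropy (log-loss) definition used for logistic regression, J over m training points can be computed as:

```python
import numpy as np

def cost(y_true, y_pred):
    """Binary cross-entropy cost J averaged over m training points.
    Assumes the usual log-loss definition for logistic regression."""
    m = len(y_true)
    eps = 1e-12  # clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(1.0 / m) * np.sum(
        y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)
    )

print(cost(np.array([1, 0, 1]), np.array([0.9, 0.9, 0.8])))  # poor middle prediction raises J
print(cost(np.array([1, 0, 1]), np.array([0.9, 0.1, 0.8])))  # good predictions give a small J
```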
Gradient Descent for Logistic Regression...
Working of Gradient Descent.
Gradient Descent - Logistic Regression
- Gradient descent is an optimization algorithm for minimizing
the cost function in various ML algorithms.
- It is used for updating the parameters of the learning
model.
- Formula for updating w and b: w = w - L*dw and b = b - L*db, where:
- w --> weight
- b --> bias
- L --> Learning rate
- dw --> partial derivative of loss function with respect to w.
- db --> partial derivative of loss function with respect to b.
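The update rules above can be sketched on a toy dataset. This assumes the standard log-loss gradients dw = (1/m) X^T (y_hat - y) and db = (1/m) sum(y_hat - y); the data and learning rate are illustrative, not from the slides:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(w, b, X, y, L):
    """One update: w = w - L*dw, b = b - L*db (log-loss gradients)."""
    m = len(y)
    y_hat = sigmoid(X @ w + b)            # current predictions
    dw = (1.0 / m) * (X.T @ (y_hat - y))  # dJ/dw
    db = (1.0 / m) * np.sum(y_hat - y)    # dJ/db
    return w - L * dw, b - L * db

# toy data: label is 1 when x > 0
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

w, b = np.zeros(1), 0.0
for _ in range(1000):
    w, b = gradient_descent_step(w, b, X, y, L=0.5)

print(np.round(sigmoid(X @ w + b), 3))  # probabilities rise with x
```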
Logistic Regression
- Sigmoid function
- Updating weights through gradient descent
- Differentiate to get dw and db.
Chapter Reading
Chapter 01
- Machine Learning by Tom Mitchell
- Pattern Recognition and Machine Learning by Christopher M. Bishop