PCCAIML601
(Affiliated To MAKAUT)
Detailed Report on
Logistic Regression and Maximum Likelihood
Estimation
Submitted as CA2 in
(PCCAIML601)
for
B. Tech in
Submitted by:
Eshika Giri
(34230822009)
Table of Contents
1. Introduction
2. Logistic Regression
o Definition and Importance
o Mathematical Formulation
o Sigmoid Function
o Decision Boundary
3. Maximum Likelihood Estimation (MLE)
o Concept of Likelihood
o Derivation of MLE for Logistic Regression
o Log-Likelihood Function
o Optimization Using Gradient Descent
4. Cost Function for Logistic Regression
5. Regularization in Logistic Regression
o L1 (Lasso) Regularization
o L2 (Ridge) Regularization
6. Advantages of Logistic Regression
7. Limitations of Logistic Regression
8. Applications of Logistic Regression
9. Conclusion
10. References
Abstract
Introduction
Logistic Regression is a widely used statistical method for binary classification tasks. It is
particularly effective when the target variable has two possible outcomes, such as 'yes' or 'no',
'spam' or 'not spam', and 'fraudulent' or 'non-fraudulent'. Unlike linear regression, which
predicts continuous values, logistic regression estimates the probability of a particular class
label.
The key idea behind logistic regression is to model the relationship between a set of
independent variables and the probability of a dependent variable belonging to a particular
class. The model uses the sigmoid function to ensure that the output values are constrained
between 0 and 1, making them interpretable as probabilities.
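The effect of the sigmoid can be illustrated with a minimal Python sketch; the example scores used here are arbitrary values chosen only for demonstration:

```python
import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) function: 1 / (1 + e^(-z)).
    return 1.0 / (1.0 + np.exp(-z))

# Any real-valued score is squashed into the open interval (0, 1):
# large negative scores map close to 0, large positive scores close to 1.
for z in [-6.0, -2.0, 0.0, 2.0, 6.0]:
    print(f"z = {z:+.1f}  ->  sigmoid(z) = {sigmoid(z):.4f}")
```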
Logistic Regression
Logistic regression is a supervised learning algorithm used for classification problems where
the dependent variable is categorical. The model predicts the probability of an event
occurring, making it useful in numerous domains, including medical diagnosis, fraud
detection, and marketing.
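As a brief illustration (assuming scikit-learn is available and using a synthetic dataset rather than real diagnostic or fraud data), a logistic regression classifier for such a binary problem can be trained as follows:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (illustrative only).
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression()     # L2-regularized by default in scikit-learn
model.fit(X_train, y_train)      # coefficients found by maximizing the (penalized) log-likelihood

# predict_proba returns the estimated probability of each class for a sample.
print(model.predict_proba(X_test[:3]))
print("test accuracy:", model.score(X_test, y_test))
```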
Mathematical Formulation
For a given set of input features X = (X1, X2, ..., Xn), the logistic regression model is expressed as:

P(Y = 1 | X) = 1 / (1 + e^-(β0 + β1X1 + β2X2 + ... + βnXn))

where:
o P(Y = 1 | X) is the probability that the dependent variable belongs to the positive class given the features X
o β0 is the intercept (bias) term
o β1, β2, ..., βn are the model coefficients for the features X1, X2, ..., Xn
o e is the base of the natural logarithm
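The evaluation of this formula can be traced in a short NumPy sketch; the coefficient values and the single feature vector below are hypothetical numbers used only to show the calculation:

```python
import numpy as np

# Hypothetical fitted parameters (illustrative values, not from the report).
beta_0 = -1.5                           # intercept term
beta = np.array([0.8, -0.4, 2.0])       # coefficients for X1, X2, X3
x = np.array([1.2, 0.5, 0.3])           # one observation's feature values

z = beta_0 + beta @ x                   # linear combination beta_0 + sum(beta_i * X_i)
p = 1.0 / (1.0 + np.exp(-z))            # sigmoid of z gives P(Y = 1 | X)

print(f"z = {z:.3f}, P(Y = 1 | X) = {p:.3f}")
# The observation is assigned class 1 when p >= 0.5, otherwise class 0.
```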
Sigmoid Function
Decision Boundary
Concept of Likelihood
Conclusion
References