Linear vs Logistic Regression
What is Regression?
Regression is a statistical method used to model the relationship between a dependent variable and one or more independent variables. It helps us understand how changes in the independent variables affect the dependent variable.
Types of regression:
1. Linear regression
(i) Simple linear regression
(ii) Multiple linear regression
2. Logistic regression
Simple linear regression:
A statistical method used to model the relationship between a single independent variable and a dependent variable using a straight line.
Formula:
y = β₀ + β₁x + ε
where:
● y is the dependent variable
● x is the independent variable
● β₀ is the intercept
● β₁ is the slope
● ε is the error term
Formula for slope:
β₁ = Σ(xᵢ - x̄)(yᵢ - ȳ) / Σ(xᵢ - x̄)²
Formula for intercept:
β₀ = ȳ - β₁x̄
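As a minimal sketch (assuming NumPy and a small made-up dataset), the slope and intercept formulas above can be computed directly:

import numpy as np

# Hypothetical example data (x = hours studied, y = exam score)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([52.0, 55.0, 61.0, 64.0, 70.0])

x_mean, y_mean = x.mean(), y.mean()

# Slope: β₁ = Σ(xᵢ - x̄)(yᵢ - ȳ) / Σ(xᵢ - x̄)²
beta1 = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)

# Intercept: β₀ = ȳ - β₁x̄
beta0 = y_mean - beta1 * x_mean

print(f"y = {beta0:.2f} + {beta1:.2f}x")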
Pros and Cons:
● Pros: simple to implement, computationally cheap, and easy to interpret.
● Cons: assumes a linear relationship, is sensitive to outliers, and can underfit complex data.
Multiple linear regression:
A statistical method used to model the relationship between a dependent variable and two or more independent variables.
Formula:
y = β₀ + β₁x₁ + β₂x₂ + ... + βₖxₖ + ε
where:
● y is the dependent variable
● x₁, x₂, ..., xₖ are the independent variables
● β₀, β₁, β₂, ..., βₖ are the coefficients
● ε is the error term
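A minimal sketch of multiple linear regression, assuming scikit-learn and an invented two-feature dataset; the fitted intercept_ and coef_ attributes correspond to β₀ and β₁, β₂:

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: two independent variables (x₁, x₂) and one dependent variable y
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([8.0, 7.0, 15.0, 14.0, 20.0])

model = LinearRegression().fit(X, y)

print("β₀ (intercept):", model.intercept_)
print("β₁, β₂ (coefficients):", model.coef_)
print("prediction for x₁=6, x₂=2:", model.predict([[6.0, 2.0]]))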
Logistic regression:
A statistical method used to model the probability of a binary outcome from one or more independent variables.
Formula:
P(y = 1) = 1 / (1 + e^-(β₀ + β₁x₁ + ... + βₖxₖ))
Why is the sigmoid function used in logistic regression?
1. The sigmoid function maps any real number to a value between 0 and 1. This is ideal for representing probabilities, since probabilities must fall within this range.
2. Other activation functions such as ReLU (Rectified Linear Unit) and tanh are commonly used in neural networks, but they are less suited to logistic regression. For example, ReLU's output is not bounded between 0 and 1, which makes it inappropriate for modeling probabilities.
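A minimal sketch, assuming NumPy and made-up coefficients, showing how the sigmoid squashes the linear combination β₀ + β₁x₁ + β₂x₂ into a probability between 0 and 1:

import numpy as np

def sigmoid(z):
    # Maps any real number to a value in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fitted coefficients for two features
beta0 = -1.5
beta = np.array([0.8, 0.4])

# One example observation
x = np.array([2.0, 1.0])

z = beta0 + beta @ x   # linear combination, can be any real number
p = sigmoid(z)         # probability that y = 1

print(f"z = {z:.2f}, P(y = 1) = {p:.3f}")
print("predicted class:", int(p >= 0.5))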
Comparison:
● Output: linear regression predicts a continuous value; logistic regression predicts a probability between 0 and 1.
● Task: linear regression is used for regression problems; logistic regression is used for (binary) classification.
● Function: linear regression fits a straight line; logistic regression fits an S-shaped sigmoid curve.
● Fitting: linear regression is typically fit by minimizing squared error; logistic regression by maximizing likelihood (minimizing log loss).
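To make the contrast concrete, a minimal sketch (assuming scikit-learn and a tiny invented dataset): LinearRegression returns unbounded continuous values, while LogisticRegression returns probabilities and class labels:

import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y_continuous = np.array([1.1, 2.3, 2.9, 4.2, 5.1, 5.8])  # continuous target
y_binary = np.array([0, 0, 0, 1, 1, 1])                  # binary target

lin = LinearRegression().fit(X, y_continuous)
log = LogisticRegression().fit(X, y_binary)

x_new = [[3.5]]
print("linear regression output:", lin.predict(x_new))                # continuous value
print("logistic regression P(y=1):", log.predict_proba(x_new)[:, 1])  # probability in (0, 1)
print("logistic regression class:", log.predict(x_new))               # 0 or 1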