#3 Logistic Regression Basics
The elegance of the logistic (sigmoid) function lies in its simplicity: it takes the linear equation, akin to a straight road, and bends it into an S-shaped path that transitions gracefully from one state to another.
#3.1 From Linear To Logistic Regression
Step 1. Linear Regression Foundation: We begin with the linear regression equation (a short fitting sketch follows the list below):

Y = Ax + B,

where:
• Y is the dependent variable (the outcome we're trying to predict).
• x is the independent variable (the predictor).
• A and B are the coefficients representing the slope and the y-intercept of the regression line, respectively.
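As a minimal sketch (not part of the original text), the snippet below fits A and B to toy data with NumPy; the data values are invented for illustration, and the variable names mirror the equation above.

```python
import numpy as np

# Toy data: x (predictor) and Y (outcome) with a roughly linear relationship.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# np.polyfit with degree 1 returns the slope A and intercept B
# of the least-squares line Y = Ax + B.
A, B = np.polyfit(x, Y, 1)
Y_hat = A * x + B
print(f"A (slope) = {A:.3f}, B (intercept) = {B:.3f}")
```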
The main problem? A straight line's output is unbounded: it can take any value from −∞ to +∞, which makes it unsuitable for predicting a probability, which must lie between 0 and 1.

[Plot: linear regression fitting a straight line through the data points.]
Step 2. Probability Adjustment: Since linear regression outputs can fall outside the range [0, 1], they are not suitable as probabilities. The equation is therefore rewritten as P = Ax + B to reflect that a probability P is being modeled instead of a direct measurement Y.
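A quick sketch of the issue (the coefficients here are hypothetical, chosen only for illustration): for x far enough from the center, P = Ax + B leaves [0, 1].

```python
import numpy as np

# Hypothetical coefficients for P = Ax + B.
A, B = 0.3, 0.1

x = np.array([-2.0, 0.0, 1.0, 3.0, 5.0])
P = A * x + B
print(P)  # [-0.5  0.1  0.4  1.   1.6] -- values below 0 and above 1
```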
Step 3. Odds Transformation: However, the linear form P = Ax + B can still produce values below 0 or above 1, which is impossible for a probability. To constrain P to [0, 1], we first compute the odds,

odds = P / (1 − P),

which map probabilities in (0, 1) onto (0, ∞), and then take their logarithm, the log-odds (logit):

ln(P / (1 − P)) = Ax + B.

This is a pivotal step in moving from linear to logistic regression: it lets us model P as a linear combination of x in the log-odds space rather than the probability space.
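To make the transformation concrete, here is a small sketch of the odds and log-odds maps; the helper function names are my own, not from the text.

```python
import numpy as np

def odds(p):
    """Odds = p / (1 - p): maps (0, 1) onto (0, inf)."""
    return p / (1.0 - p)

def log_odds(p):
    """Logit = ln(p / (1 - p)): maps (0, 1) onto (-inf, inf)."""
    return np.log(odds(p))

p = np.array([0.1, 0.5, 0.9])
print(odds(p))      # [0.111... 1.  9.]
print(log_odds(p))  # [-2.197...  0.  2.197...]
```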
Solving the log-odds equation for P yields

P = 1 / (1 + e^−(Ax + B)).

This equation is the sigmoid function, which bounds P between 0 and 1 and translates the linear combination of x into a probability.
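A final sketch (again with hypothetical coefficients) implements the sigmoid and shows that, unlike the raw linear output, its values always stay strictly between 0 and 1.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid: maps any real z to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# The sigmoid inverts the log-odds: sigmoid(ln(p / (1 - p))) == p.
A, B = 0.3, 0.1
x = np.array([-10.0, 0.0, 10.0])
P = sigmoid(A * x + B)
print(P)  # roughly [0.052 0.525 0.957] -- all strictly between 0 and 1
```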