
Lecture 4 (Part 4) - LogisticRegression

The document discusses the differences between linear regression and logistic regression, highlighting that linear regression predicts continuous variables while logistic regression predicts binary outcomes. It explains the output function of logistic regression, which uses the sigmoid function to map real numbers to a probability range of 0 to 1, and introduces the loss function for both types of regression. Additionally, it mentions the use of gradient descent for optimizing parameters in logistic regression and notes that the model can be extended to multi-dimensional inputs.


COSC 434

Intro to Machine Learning


Logistic Regression
Linear regression vs logistic regression
• Linear regression: The target variable (output) is a continuous
  variable (like price, weight, height, etc.).

• Logistic regression: The target variable (output) is a binary
  discrete variable (0 or 1). It can be used for the classification
  problem.

Note: Logistic regression is considered a generalized linear model.
Logistic regression with single input variable
• For simplicity, let us look at logistic regression with one input variable first.
• This can later be easily extended into logistic regression with two or more
input variables (i.e., multi-dimensional input).
• For example, we want to classify whether a mouse is
“not obese” (0) or “obese” (1) using its weight as the input.
Output function
• In linear regression, the output can be any real number.
• In logistic regression, the output is a real number in the range 0 to 1.
• This allows us to interpret the output as the probability that the result is 1.
• For example, if the output is 0.75 we can say with moderate confidence that
  the result will be 1.
• If the output is 0.97 we can say very confidently that the result will be 1.
• If the output is 0.45 we will say the result is 0, but with low confidence.
Output function
• The sigmoid function

      g(x) = 1 / (1 + e^−x)

  serves this purpose, mapping any real number into the range 0 to 1.
Examples of Sigmoid function outputs
     x     g(x)
  -100     3.72008E-44
   -50     1.92875E-22   (very small numbers)
   -10     4.53979E-05
    -5     0.006692851
    -4     0.01798621
    -3     0.047425873
    -2     0.119202922
    -1     0.268941421
     0     0.5
     1     0.731058579
     2     0.880797078
     3     0.952574127
     4     0.98201379
     5     0.993307149
    10     0.999954602
    50     ≈ 1 (to displayed precision)
   100     ≈ 1 (to displayed precision)
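These values can be reproduced in a few lines of Python. A minimal sketch,
assuming NumPy is available (sigmoid is our own helper name):

    import numpy as np

    def sigmoid(x):
        """Map any real number into the range (0, 1)."""
        return 1.0 / (1.0 + np.exp(-x))

    # Reproduce a few rows of the table above.
    for x in [-100, -5, -1, 0, 1, 5, 100]:
        print(f"{x:6d}  {sigmoid(x):.9g}")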
Output of logistic regression
• Assume we know the parameters 𝑤 and 𝑏.

• Output of linear regression (a real number):

      ŷ = wx + b

• Output of logistic regression (a real number in the range 0..1):

      ŷ = g(wx + b) = 1 / (1 + e^−(wx + b))

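A minimal sketch contrasting the two outputs; the parameter and input values
below are hypothetical, chosen only for illustration:

    import numpy as np

    def linear_output(x, w, b):
        # Linear regression: any real number.
        return w * x + b

    def logistic_output(x, w, b):
        # Logistic regression: squash the linear output into (0, 1).
        return 1.0 / (1.0 + np.exp(-(w * x + b)))

    # Hypothetical parameters for the mouse-weight example.
    w, b = 1.2, -30.0
    weight = 26.0  # grams (illustrative)
    print(linear_output(weight, w, b))    # 1.2 (unbounded)
    print(logistic_output(weight, w, b))  # ≈ 0.769, read as P(obese)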
Loss function
• Loss function of linear regression on a single data point (xᵢ, yᵢ):

      εᵢ = (ŷᵢ − yᵢ)²

• Loss function of logistic regression on a single data point (xᵢ, yᵢ),
  where yᵢ ∈ {0, 1}:

      εᵢ = −[ yᵢ log(ŷᵢ) + (1 − yᵢ) log(1 − ŷᵢ) ]

  (The tables below evaluate this loss with log base 10.)
If yᵢ = 1, the loss reduces to εᵢ = −log₁₀(ŷᵢ):

    ŷᵢ         εᵢ
    1          0
    0.9        0.045757
    0.8        0.09691
    0.7        0.154902
    0.6        0.221849
    0.5        0.30103
    0.4        0.39794
    0.3        0.522879
    0.2        0.69897
    0.1        1
    0.0001     4
    0.00001    5
    0          infinity
If yᵢ = 0, the loss reduces to εᵢ = −log₁₀(1 − ŷᵢ):

    ŷᵢ         εᵢ
    1          infinity
    0.99999    5
    0.9999     4
    0.9        1
    0.8        0.69897
    0.7        0.522879
    0.6        0.39794
    0.5        0.30103
    0.4        0.221849
    0.3        0.154902
    0.2        0.09691
    0.1        0.045757
    0          0
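A minimal sketch of the per-point loss, using log base 10 to match the tables
above (the natural log is more common in practice and differs only by a
constant factor):

    import numpy as np

    def logistic_loss(y_hat, y):
        """Cross-entropy loss on a single data point (log base 10)."""
        if y == 1:
            return -np.log10(y_hat)        # rows of the first table
        return -np.log10(1.0 - y_hat)      # rows of the second table

    print(logistic_loss(0.9, 1))  # ≈ 0.045757
    print(logistic_loss(0.9, 0))  # ≈ 1.0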
Solving for the optimal 𝑤 and 𝑏
• For linear regression, find the w and b that minimize the total loss over
  all N data points:

      min over w, b:  Σᵢ (ŷᵢ − yᵢ)²

• For logistic regression, find the w and b that minimize the total loss:

      min over w, b:  Σᵢ −[ yᵢ log(ŷᵢ) + (1 − yᵢ) log(1 − ŷᵢ) ]

Note: the average of the summation can also be used (by multiplying the
summation by 1/N).
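As a sketch, the total logistic loss can be written directly as a function of
w and b (log base 10 as above; the data values are hypothetical):

    import numpy as np

    def total_logistic_loss(w, b, x, y):
        """Total cross-entropy loss over all data points; this is the
        quantity minimized over w and b."""
        y_hat = 1.0 / (1.0 + np.exp(-(w * x + b)))
        return np.sum(-(y * np.log10(y_hat) + (1 - y) * np.log10(1 - y_hat)))

    x = np.array([18.0, 20.0, 28.0, 32.0])  # illustrative inputs
    y = np.array([0.0, 0.0, 1.0, 1.0])      # binary labels
    print(total_logistic_loss(0.5, -12.0, x, y))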
Solving w and b with gradient descent
• w and b can be solved with gradient descent.
• https://www.coursera.org/lecture/neural-networks-deep-learning/logistic-regression-gradient-descent-5sdh6

(Note: this will not be covered in the exams.)
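For concreteness, here is a minimal sketch of gradient descent for the
one-dimensional model. The data, learning rate, and iteration count are
illustrative, and the gradient formulas assume the natural-log version of
the loss:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logistic_1d(x, y, lr=0.1, n_iters=5000):
        """Fit w and b by gradient descent on the average cross-entropy
        loss. With the natural-log loss the gradients take the simple form
        dL/dw = mean((y_hat - y) * x) and dL/db = mean(y_hat - y)."""
        w, b = 0.0, 0.0
        for _ in range(n_iters):
            y_hat = sigmoid(w * x + b)
            error = y_hat - y
            w -= lr * np.mean(error * x)
            b -= lr * np.mean(error)
        return w, b

    # Toy mouse-weight data (illustrative), standardized for stability.
    weights = np.array([18.0, 20.0, 22.0, 28.0, 30.0, 32.0])
    labels = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
    x = (weights - weights.mean()) / weights.std()
    w, b = fit_logistic_1d(x, labels)
    print(w, b, sigmoid(w * x + b))  # parameters and per-point probabilities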


Notes
• The formulation of logistic regression for one-dimensional input can be
  easily extended to multi-dimensional input, as sketched below.
• The idea of regularization can also be applied to logistic regression.
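A minimal sketch of the multi-dimensional extension (the feature names and
parameter values are hypothetical): the scalar product wx simply becomes a
dot product between a weight vector w and a feature vector x.

    import numpy as np

    def logistic_output_nd(X, w, b):
        """Logistic regression output for d-dimensional inputs.
        X is an (N, d) matrix of N examples with d features each,
        w is a (d,) weight vector, and b is a scalar bias."""
        return 1.0 / (1.0 + np.exp(-(X @ w + b)))

    # Hypothetical features per example: (weight, age).
    X = np.array([[18.0, 3.0],
                  [30.0, 9.0]])
    w = np.array([0.2, 0.5])  # illustrative parameters
    b = -6.0
    print(logistic_output_nd(X, w, b))  # one probability per example

Regularization (the second note above) would then add a penalty term such as
λ·‖w‖² to the loss before optimizing.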
