Problem Set: Logistic Regression
The parameters of the hypothesis function in logistic regression are updated by the gradient descent algorithm as

$$w_j \leftarrow w_j - \alpha \frac{\partial \mathrm{Loss}}{\partial w_j} \qquad (1)$$

Calculate α ∂Loss/∂w_j for the binary cross-entropy loss

$$\mathrm{Loss} = -\frac{1}{m}\sum_{i=1}^{m}\left[\, y^{(i)} \log h\!\left(x^{(i)}\right) + \left(1 - y^{(i)}\right) \log\!\left(1 - h\!\left(x^{(i)}\right)\right) \right]$$

where:
• m is the number of training examples.
• y^(i) is the actual class label of the i-th training example, which can be 0 or 1.
• h(x^(i)) is the predicted probability that the i-th training example belongs to the class with label 1, calculated as h(x^(i)) = σ(z^(i)), where σ is the sigmoid function, z^(i) = W · x^(i) + b, and b is the bias.
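For checking your result, the following is a sketch of the standard derivation, using the chain rule together with the fact that the sigmoid satisfies σ'(z) = σ(z)(1 − σ(z)):

$$\frac{\partial \mathrm{Loss}}{\partial w_j}
= -\frac{1}{m}\sum_{i=1}^{m}\left[\frac{y^{(i)}}{h(x^{(i)})} - \frac{1 - y^{(i)}}{1 - h(x^{(i)})}\right] h(x^{(i)})\left(1 - h(x^{(i)})\right) x_j^{(i)}
= \frac{1}{m}\sum_{i=1}^{m}\left(h(x^{(i)}) - y^{(i)}\right) x_j^{(i)}$$

so the update rule (1) becomes

$$w_j \leftarrow w_j - \frac{\alpha}{m}\sum_{i=1}^{m}\left(h(x^{(i)}) - y^{(i)}\right) x_j^{(i)},
\qquad
b \leftarrow b - \frac{\alpha}{m}\sum_{i=1}^{m}\left(h(x^{(i)}) - y^{(i)}\right).$$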
Task: Simulate the gradient descent algorithm for logistic regression, performing up to two iterations.
Instructions:
• Initial Parameters: Set all model parameters to an initial value of 1.
Dataset 1: Spam Email Prediction This dataset predicts whether an email is spam based on its keyword frequency and length.
Keyword Frequency (%) Email Length (1000s of characters) Spam (1) or Not Spam (0)
20 2 1
5 3 0
30 1 1
7 2 0
25 1 1
3 4 0
15 1.5 1
4 3.5 0
Dataset 2: University Admission Prediction This dataset predicts whether a student will be admitted to a
university based on their GRE score and GPA.
GRE Score (out of 340) GPA (out of 4.0) Admitted (1) or Not Admitted (0)
330 3.9 1
315 3.5 1
300 3.2 0
320 3.8 1
310 3.0 0
305 3.3 0
325 3.7 1
318 3.6 1
Dataset 3: Loan Default Prediction This dataset predicts whether a borrower will default on a loan based on
their annual income and loan amount.
Annual Income ($1000s) Loan Amount ($1000s) Default (1) or Not Default (0)
45 15 0
85 20 0
50 30 1
60 22 0
30 25 1
100 35 0
40 18 1
75 28 0
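The problem asks you to carry out the iterations by hand, but as a point of comparison, here is a minimal Python sketch of the procedure described in the Task above, run on Dataset 1. The learning rate α = 0.1, the use of batch (whole-dataset) gradients, and the absence of feature scaling are assumptions, since the problem statement does not fix them.

```python
import numpy as np

# Dataset 1 (spam prediction): [keyword frequency %, email length in 1000s of characters]
X = np.array([
    [20, 2.0], [5, 3.0], [30, 1.0], [7, 2.0],
    [25, 1.0], [3, 4.0], [15, 1.5], [4, 3.5],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Initial parameters as per the instructions: all weights and the bias start at 1.
w = np.ones(X.shape[1])
b = 1.0
alpha = 0.1          # learning rate (assumed; not specified in the problem)
m = len(y)

for iteration in range(2):             # "up to two iterations"
    z = X @ w + b                       # z = W · x + b for every training example
    h = sigmoid(z)                      # predicted probabilities h(x^(i))
    grad_w = (X.T @ (h - y)) / m        # ∂Loss/∂w_j = (1/m) Σ (h - y) x_j
    grad_b = np.mean(h - y)             # ∂Loss/∂b   = (1/m) Σ (h - y)
    w -= alpha * grad_w                 # gradient descent updates
    b -= alpha * grad_b
    print(f"iteration {iteration + 1}: w = {w}, b = {b:.4f}")
```

The same loop can be pointed at Dataset 2 or Dataset 3 by swapping in the corresponding feature matrix X and label vector y.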