Supervised Learning - Regression Summary

1. Regression Overview:

- Definition: Models the relationship between a dependent variable (Y) and one or more independent variables (X).

- Types:

1. Simple Linear Regression: One independent variable.

Formula: Y = b0 + b1X

Example:

Data: X = [1, 2, 3], Y = [2, 4, 5]

- Step 1: Calculate means X_mean = 2, Y_mean = 3.67

- Step 2: Compute slope b1 = sum((X - X_mean)(Y - Y_mean)) / sum((X - X_mean)^2) = 3/2 = 1.5

- Step 3: Intercept b0 = 0.67

- Final Equation: Y = 0.67 + 1.5X
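The three steps above can be sketched in plain Python (a minimal example using the same data, X = [1, 2, 3] and Y = [2, 4, 5]):

```python
# Simple linear regression by hand, following steps 1-3 above.
X = [1, 2, 3]
Y = [2, 4, 5]
n = len(X)

# Step 1: means
x_mean = sum(X) / n  # 2.0
y_mean = sum(Y) / n  # 3.67 (exactly 11/3)

# Step 2: slope b1 = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2)
b1 = sum((x - x_mean) * (y - y_mean) for x, y in zip(X, Y)) / \
     sum((x - x_mean) ** 2 for x in X)

# Step 3: intercept b0 = y_mean - b1 * x_mean
b0 = y_mean - b1 * x_mean

print(f"Y = {b0:.2f} + {b1:.2f}X")  # Y = 0.67 + 1.50X
```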

2. Regression Algorithms:

- Linear Regression: Assumes a linear relationship between X and Y.

- Polynomial Regression: Adds polynomial terms for nonlinear data.

Example:

Data: X = [1, 2, 3], Y = [1, 4, 9]

Fit: Y = 0 + 0X + 1X^2
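The polynomial fit above can be reproduced with NumPy's `polyfit` (a sketch; `deg=2` matches the quadratic data X = [1, 2, 3], Y = [1, 4, 9]):

```python
import numpy as np

# Fit a degree-2 polynomial to the example data.
X = np.array([1, 2, 3])
Y = np.array([1, 4, 9])

# polyfit returns coefficients highest power first: [c2, c1, c0]
coeffs = np.polyfit(X, Y, deg=2)
print(coeffs)  # approximately [1, 0, 0], i.e. Y = X^2
```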

3. Regularization Techniques:

- Ridge Regression (L2): Adds penalty lambda * sum(beta^2) to reduce overfitting.

- Lasso Regression (L1): Adds penalty lambda * sum(abs(beta)) for feature selection.
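The L2 penalty can be illustrated with ridge regression's closed-form solution, beta = (X^T X + lambda I)^(-1) X^T y (a minimal NumPy sketch; the design matrix with an intercept column and the lambda values are illustrative assumptions, and lasso is omitted because the L1 penalty has no closed form):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge closed form: solve (X^T X + lambda*I) beta = X^T y.
    Note: this sketch penalizes the intercept too; in practice the
    intercept column is usually excluded from the penalty."""
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

# Same data as section 1; first column is the intercept term.
X = np.array([[1.0, 1], [1, 2], [1, 3]])
y = np.array([2.0, 4, 5])

print(ridge_fit(X, y, lam=0.0))  # lambda = 0 recovers OLS: [0.67, 1.5]
print(ridge_fit(X, y, lam=1.0))  # larger lambda shrinks the coefficients
```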


4. Evaluation Metrics:

- MAE (Mean Absolute Error): Average of the absolute differences between actual and predicted values.

Example: Actual: [3, 5, 7], Predicted: [2.5, 5.5, 6.5]

MAE = 0.5

- MSE (Mean Squared Error): Average of the squared errors; RMSE (Root Mean Squared Error) is its square root.

Same example: MSE = 0.25, RMSE = 0.5
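The three metrics can be computed directly from the example data above:

```python
# MAE, MSE, and RMSE for actual [3, 5, 7] vs. predicted [2.5, 5.5, 6.5].
actual    = [3, 5, 7]
predicted = [2.5, 5.5, 6.5]

errors = [a - p for a, p in zip(actual, predicted)]
n = len(errors)

mae  = sum(abs(e) for e in errors) / n  # 0.5
mse  = sum(e ** 2 for e in errors) / n  # 0.25
rmse = mse ** 0.5                       # 0.5

print(mae, mse, rmse)  # 0.5 0.25 0.5
```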

5. Optimization Techniques:

- OLS (Ordinary Least Squares): Finds the coefficients that minimize the sum of squared residuals.

Example: Data X = [1, 2], Y = [2, 4]

Fit: Y = 0 + 2X
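The OLS fit above can be sketched via the normal equations, beta = (X^T X)^(-1) X^T y (a minimal NumPy example on the two-point data X = [1, 2], Y = [2, 4]):

```python
import numpy as np

# OLS via the normal equations on the example data.
x = np.array([1.0, 2.0])
y = np.array([2.0, 4.0])

# Design matrix with a column of ones for the intercept.
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)

print(beta)  # [0. 2.], i.e. Y = 0 + 2X
```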

- Gradient Descent:

Formula: theta = theta - alpha * dJ(theta)/d(theta), where alpha is the learning rate and J(theta) is the cost function.
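The update rule can be sketched for simple linear regression with an MSE cost (a minimal example on the section-1 data; the learning rate 0.1 and the 5000 iterations are illustrative choices, not values from the text):

```python
# Gradient descent for Y = b0 + b1*X on X = [1, 2, 3], Y = [2, 4, 5],
# minimizing J = (1/n) * sum((b0 + b1*x - y)^2).
X = [1, 2, 3]
Y = [2, 4, 5]
n = len(X)

b0, b1 = 0.0, 0.0  # initial parameters (theta)
alpha = 0.1        # learning rate

for _ in range(5000):
    preds = [b0 + b1 * x for x in X]
    # Partial derivatives of J with respect to b0 and b1
    grad_b0 = (2 / n) * sum(p - y for p, y in zip(preds, Y))
    grad_b1 = (2 / n) * sum((p - y) * x for p, y, x in zip(preds, Y, X))
    b0 -= alpha * grad_b0  # theta = theta - alpha * gradient
    b1 -= alpha * grad_b1

print(round(b0, 2), round(b1, 2))  # converges to the OLS fit: 0.67 1.5
```

With a small enough learning rate, the iterates converge to the same coefficients OLS gives in closed form.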

6. Conclusion:

- Regression techniques predict continuous values.

- Regularization mitigates overfitting, and optimization techniques improve parameter estimation.
