Lasso & Ridge Regression
Regression models are trained on labeled data, where the relationships between features
and the target variable are learned, allowing the model to make predictions on new, unseen
data. Common regression models include:
1. Linear Regression
2. Logistic Regression (for classification tasks)
3. Ridge Regression
4. Lasso Regression
5. Polynomial Regression
6. Elastic Net Regression
7. Support Vector Regression (SVR)
8. Decision Tree Regression
9. Random Forest Regression
10. Gradient Boosting Regression
11. Bayesian Linear Regression
12. Quantile Regression
13. Poisson Regression
14. Theil-Sen Estimator
15. Stepwise Regression
In ML, regression models are applied in various fields such as finance, healthcare,
marketing, and more, where the goal is to predict continuous outcomes like prices,
temperatures, or stock market trends.
Ridge Regression
Ridge regression is a linear regression technique with L2 regularization that helps prevent
overfitting by adding a penalty to the size of the model's coefficients. This penalty shrinks
coefficients, making the model simpler and improving its ability to generalize to new data.
Concretely, ridge regression adds an L2 penalty term to the ordinary least squares (OLS)
cost function, penalizing large coefficients to reduce their magnitude and prevent
overfitting.
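A common way to write this objective (using β for the coefficient vector and λ ≥ 0 for the regularization strength; this notation is an addition, not from the original text):

```latex
J(\beta) = \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_2^2
```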
The value of λ can be selected using techniques like cross-validation, which helps find the
best tradeoff between bias and variance.
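A minimal sketch of this selection using scikit-learn's RidgeCV (the data here is synthetic and purely illustrative; note that scikit-learn calls λ "alpha"):

```python
import numpy as np
from sklearn.linear_model import RidgeCV

# Synthetic data for illustration: 100 samples, 10 features.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=100)

# RidgeCV evaluates each candidate alpha by cross-validation
# and keeps the one with the best score.
model = RidgeCV(alphas=[0.01, 0.1, 1.0, 10.0], cv=5).fit(X, y)
print("Selected alpha (λ):", model.alpha_)
```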
Key Features:
● L2 Penalty: Adds the squared magnitude of the coefficients to the cost function.
● Coefficient Shrinkage: Shrinks coefficients toward zero but never exactly to zero.
● Prevents Overfitting: A simpler model generalizes better to new, unseen data.
Use Cases:
● High-Dimensional Data: Works well when there are more features than
observations.
● Multicollinearity: Effective for highly correlated predictors (see the sketch after this
list).
● Generalization: Useful when generalization is more important than perfect fitting.
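To illustrate the multicollinearity point, here is a minimal sketch on synthetic, nearly collinear data (the variable names, alpha value, and data are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)  # nearly identical to x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.normal(scale=0.5, size=n)

# OLS coefficients can blow up into large, offsetting values when
# predictors are highly correlated; the ridge penalty stabilizes them.
ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)
print("OLS coefficients:  ", ols.coef_)
print("Ridge coefficients:", ridge.coef_)
```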
Lasso Regression
Lasso regression (Least Absolute Shrinkage and Selection Operator) is another regularized
linear regression technique, but it uses L1 regularization instead of L2. The key feature of
lasso is that it can force some coefficients to become exactly zero, thus performing
automatic feature selection.
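Using the same notation as the ridge objective above, the lasso objective replaces the squared L2 penalty with an L1 penalty, the sum of absolute coefficient values (here p denotes the number of features):

```latex
J(\beta) = \lVert y - X\beta \rVert_2^2 + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert
```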
Key Features:
1. Feature Selection: Lasso can set some coefficients exactly to zero, removing irrelevant
features.
2. Prevents Overfitting: The L1 penalty constrains coefficient size, reducing overfitting.
3. Improved Generalization: A sparser model tends to generalize better to unseen data.
Use Cases:
1. High-Dimensional Data: Ideal for models with many features, enabling automatic
feature selection.
2. Sparse Models: Suitable for creating models with only the most relevant features.
Lasso vs. Ridge:
● Lasso (L1 regularization) can eliminate features entirely (i.e., set coefficients to
zero), which is useful when some features are not important.
● Ridge (L2 regularization) only shrinks coefficients but does not eliminate any
features, making it useful when all features are potentially relevant. The sketch below
contrasts the two.
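A minimal sketch of this contrast, on synthetic data where only 3 of 20 features carry signal (the data and alpha values are illustrative assumptions, not tuned recommendations):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: 20 features, only the first 3 actually matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))
true_coef = np.zeros(20)
true_coef[:3] = [5.0, -3.0, 2.0]
y = X @ true_coef + rng.normal(scale=0.5, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# Lasso typically zeroes out the irrelevant features;
# ridge only shrinks them, leaving all 20 nonzero.
print("Lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
print("Ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)))
```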