Supervised Learning Regression

Presentation on supervised learning

Uploaded by Syed Ali Asad

REGRESSION

Saad Naeem

SP21-BCS-040

What is Regression?

Regression is a supervised machine learning technique used to predict a continuous numerical output variable from one or more input variables. It is commonly used for forecasting, modeling relationships, and understanding the impact of independent variables on a dependent variable.

Continuous Output: Unlike classification, which predicts discrete labels, regression predicts continuous values. Examples include predicting house prices, temperature, and stock prices.
Linear Regression

1. Fundamental Model: Linear regression is a supervised learning algorithm that models the relationship between a dependent variable and one or more independent variables using a linear equation.

2. Finds Best-Fit Line: The algorithm aims to find the line of best fit that minimizes the sum of squared differences between the predicted and actual values.

3. Used for Prediction: Once the model is trained, it can be used to make predictions of the dependent variable based on new inputs of the independent variables.

4. Versatile Applications: Linear regression has a wide range of applications, from predicting housing prices to forecasting sales, making it a fundamental technique in machine learning.
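As an illustration of the least-squares fit described above, here is a minimal NumPy sketch; the data points are made up for the example, and the intercept is handled by adding a column of ones:

```python
import numpy as np

# Toy data: y is roughly 2*x + 1 plus a little noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.0, 8.8])

# Design matrix with a column of ones for the intercept
X = np.column_stack([np.ones_like(x), x])

# Least squares: minimizes the sum of squared residuals ||y - X @ beta||^2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta

# Predict the dependent variable for a new input
y_new = intercept + slope * 5.0
print(round(slope, 2), round(intercept, 2))  # slope ≈ 1.95, intercept ≈ 1.1
```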
Polynomial Regression

Non-Linear Relationships: Polynomial regression is used to model non-linear relationships between the independent and dependent variables.

Curved Fit: This model fits a curved line (a polynomial function) to the data, allowing it to capture more complex patterns than linear regression.

Polynomial Degree: The degree of the polynomial (1st, 2nd, 3rd, etc.) determines the complexity of the model and how well it can fit the data.
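A minimal sketch of a curved fit using NumPy; the toy data is chosen to follow y = x² exactly, so the degree-2 fit recovers it:

```python
import numpy as np

# Toy data following y = x^2 exactly
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = x ** 2

# Fit a degree-2 polynomial: coeffs are [a, b, c] for a*x^2 + b*x + c
coeffs = np.polyfit(x, y, deg=2)
predict = np.poly1d(coeffs)

print(predict(3.0))  # recovers 9.0 for a new input
```

Raising `deg` increases model complexity; on noisy data a high degree can overfit, which is exactly the trade-off the slide describes.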
KEY CONCEPTS
1. Cost Function: A function that measures the performance of a model by quantifying the error between the predicted values and the actual values of the target variable.

2. Error Measurement: The cost function calculates the error between the predicted outputs of the model and the actual outputs. The type of error measurement depends on the specific problem and the chosen cost function.

3. Optimization: Training a machine learning model involves finding the set of model parameters (e.g., weights and biases in linear regression) that minimize the cost function. Optimization algorithms like gradient descent are often used for this purpose.

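The three concepts above can be tied together in a toy gradient-descent loop for a one-parameter linear model; the data and learning rate here are illustrative choices:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x          # true slope is 2; model: y_hat = w * x

w = 0.0              # initial parameter
lr = 0.05            # learning rate

for _ in range(200):
    y_hat = w * x
    # Cost function: mean squared error between predictions and targets
    cost = np.mean((y_hat - y) ** 2)
    # Gradient of the cost with respect to w
    grad = 2.0 * np.mean((y_hat - y) * x)
    w -= lr * grad   # optimization step: move downhill on the cost surface

print(round(w, 3))   # converges to ≈ 2.0
```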
Ridge Regression
What is Ridge Regression? Ridge Regression is a type of linear regression that adds a penalty term to the cost function to prevent overfitting. It shrinks the coefficients towards zero, reducing the variance of the model without greatly increasing the bias.

How it Works: Ridge Regression adds a regularization term to the cost function, which penalizes large coefficients. This encourages the model to use smaller, more stable coefficients, improving generalization and preventing overfitting.
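A minimal sketch of the ridge closed-form solution on toy data (the intercept is omitted for brevity), showing how the penalty shrinks the coefficient vector relative to plain least squares:

```python
import numpy as np

# Toy data with two correlated features
X = np.array([[1.0, 1.0],
              [2.0, 2.1],
              [3.0, 2.9],
              [4.0, 4.2]])
y = np.array([2.0, 4.1, 6.0, 8.1])

def ridge_fit(X, y, alpha):
    # Closed-form ridge solution: (X^T X + alpha * I)^-1 X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

beta_ols = ridge_fit(X, y, alpha=0.0)     # plain least squares
beta_ridge = ridge_fit(X, y, alpha=10.0)  # penalized

# The regularization term shrinks the coefficient vector towards zero
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))  # True
```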
Lasso Regression
L1 Regularization: Lasso regression uses L1 regularization to shrink less important feature coefficients to exactly zero, resulting in a sparse model.

Variable Selection: This makes Lasso useful for feature selection, as it can automatically identify and remove irrelevant predictors from the model.
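A short sketch using scikit-learn's `Lasso` on synthetic data in which only the first two of five features actually drive the target; the `alpha` value is an illustrative choice:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# 100 samples, 5 features; only the first two matter
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

model = Lasso(alpha=0.5)
model.fit(X, y)

# The L1 penalty drives the irrelevant coefficients to exactly zero
print(model.coef_.round(2))
```

Inspecting `model.coef_` shows the sparsity directly: the last three entries are exactly 0, which is the automatic variable selection the slide describes.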
Gradient Boosting Regression

1. Ensemble Learning: Gradient boosting is an ensemble learning technique that combines multiple weak models, like decision trees, into a strong predictive model.

2. Sequential Modeling: The model is trained in a stagewise fashion, with each new tree trying to correct the errors of the previous trees.

3. Handles Nonlinearity: Gradient boosting can capture complex, nonlinear relationships in the data by adding more trees to the model.
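The stagewise idea can be sketched by hand with shallow scikit-learn decision trees, each fit to the residuals of the ensemble so far. This is a toy illustration for squared loss, not a production implementation; the data is synthetic:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])  # nonlinear target

# Hand-rolled boosting for squared loss: each shallow tree is fit
# to the residuals (the negative gradient) of the ensemble so far
learning_rate = 0.1
prediction = np.zeros_like(y)
trees = []
for _ in range(100):
    residual = y - prediction
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residual)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

mse = np.mean((y - prediction) ** 2)
print(mse)  # training error shrinks as trees are added
```

In practice one would use `sklearn.ensemble.GradientBoostingRegressor`, which implements this stagewise scheme with shrinkage and more loss functions.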
Random Forest Regression
Ensemble Approach: Random Forest Regression combines multiple decision trees to make more robust and accurate predictions.

Handles Complexity: It can effectively model complex, non-linear relationships in the data, making it a versatile regression technique.

Feature Importance: Random Forest can determine the relative importance of each feature, providing valuable insights for feature selection.
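A short sketch using scikit-learn's `RandomForestRegressor` on synthetic data where only the first of three features is informative, illustrating the feature-importance point:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Three features; only the first one determines y (nonlinearly)
X = rng.normal(size=(300, 3))
y = X[:, 0] ** 2

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# feature_importances_ singles out the informative feature
print(model.feature_importances_.round(2))
```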
Regression Evaluation Metrics
When evaluating the performance of a regression model, there are several important metrics to consider. These metrics help
measure how well the model can predict the target variable based on the input features.

These metrics provide a complementary view of the model's performance: MAE gives the average magnitude of errors in the target's own units, MSE gives the average squared error (penalizing large errors more heavily), and RMSE is the square root of MSE, bringing it back to the original units.
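The three metrics can be computed directly with NumPy; the values below are made up for illustration:

```python
import numpy as np

y_true = np.array([3.0, 5.0, 2.0, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])

errors = y_pred - y_true
mae = np.mean(np.abs(errors))   # mean absolute error
mse = np.mean(errors ** 2)      # mean squared error
rmse = np.sqrt(mse)             # root mean squared error

print(mae, mse, rmse)  # 0.875, 1.3125, ~1.146
```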
