Machine Learning Note 1
Uploaded by fengsijie360

Class Note 1: Linear Regression

Introduction to Linear Regression


Linear regression is one of the simplest and most widely used
supervised machine learning algorithms. It is used for predicting a
continuous target variable based on one or more input features. The
algorithm assumes a linear relationship between the input features
and the target variable. The equation for a simple linear regression
model is:

y = β₀ + β₁x + ε

where y is the target variable, x is the input feature, β₀ is the
intercept, β₁ is the coefficient (slope), and ε is the error term.
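As a quick illustration, we can generate synthetic data from a known line and recover the intercept and slope with a least-squares fit (a minimal NumPy sketch; the true values β₀ = 2 and β₁ = 3 and the noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Synthetic data from y = beta0 + beta1*x + eps, with beta0 = 2, beta1 = 3.
x = rng.uniform(0, 10, size=200)
eps = rng.normal(0, 0.5, size=200)
y = 2.0 + 3.0 * x + eps

# Least-squares fit: np.polyfit with deg=1 returns [slope, intercept].
beta1_hat, beta0_hat = np.polyfit(x, y, deg=1)
print(beta0_hat, beta1_hat)  # close to the true values 2 and 3
```

With 200 points and modest noise, the estimates land very close to the true parameters; shrinking the sample or raising the noise widens the error.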

Major Points

1. Assumptions: Linear regression assumes linearity,
independence, homoscedasticity (constant variance of errors),
and normality of errors. Violations of these assumptions can
lead to poor model performance.
2. Cost Function: The most common cost function used in
linear regression is the Mean Squared Error (MSE), which
measures the average squared difference between the
predicted and actual values.
3. Optimization: The model parameters (β₀ and β₁) are
optimized using techniques like Gradient Descent or the
Ordinary Least Squares (OLS) method.
4. Regularization: To prevent overfitting, regularization
techniques like Ridge (L2) and Lasso (L1) regression can be
applied. Ridge regression adds a penalty proportional to the
square of the coefficients, while Lasso adds a penalty
proportional to the absolute value of the coefficients.
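Points 2 and 3 can be sketched together: the snippet below minimizes the MSE with plain gradient descent (an illustrative NumPy sketch, not a production optimizer; the learning rate and iteration count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
x = rng.uniform(0, 5, size=300)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, size=300)

beta0, beta1 = 0.0, 0.0  # start from zero
lr = 0.05                # learning rate (arbitrary choice)

for _ in range(2000):
    pred = beta0 + beta1 * x
    err = pred - y
    # Gradients of MSE = mean((pred - y)^2) w.r.t. beta0 and beta1.
    grad0 = 2.0 * err.mean()
    grad1 = 2.0 * (err * x).mean()
    beta0 -= lr * grad0
    beta1 -= lr * grad1

mse = ((beta0 + beta1 * x - y) ** 2).mean()
print(beta0, beta1, mse)  # parameters near 1 and 2; MSE near the noise variance
```

In practice OLS gives the same answer in closed form for linear regression; gradient descent matters when the dataset is too large for the closed form or the model is not linear.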
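For point 4, ridge regression has a simple closed form: add λ times a penalty matrix to the normal equations. A small NumPy sketch follows (λ = 1.0 is an arbitrary illustrative value; following common practice, the intercept is left unpenalized):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
x = rng.uniform(0, 5, size=100)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, size=100)

X = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]
lam = 1.0                                  # penalty strength (arbitrary)

# Penalty matrix: zero for the intercept, one for the slope.
P = np.diag([0.0, 1.0])

# Ridge solution: (X'X + lam*P)^{-1} X'y; OLS is the lam = 0 special case.
beta_ridge = np.linalg.solve(X.T @ X + lam * P, X.T @ y)
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

print(beta_ols, beta_ridge)  # the ridge slope is shrunk slightly toward zero
```

Lasso has no closed form because the L1 penalty is not differentiable at zero; it is typically solved with coordinate descent, as in scikit-learn's `Lasso`.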

Use Cases

1. Predicting House Prices: Linear regression can be used to
predict house prices based on features like square footage,
number of bedrooms, and location.
2. Sales Forecasting: Businesses can use linear regression to
forecast future sales based on historical data and factors like
advertising spend.
3. Risk Assessment: In finance, linear regression can be used
to assess the risk of investments by modeling the relationship
between risk factors and returns.

Optimization Techniques

1. Feature Scaling: Standardizing or normalizing input features
can improve the performance of linear regression, especially
when using Gradient Descent.
2. Feature Selection: Removing irrelevant or redundant
features can reduce overfitting and improve model
interpretability.
3. Cross-Validation: Using k-fold cross-validation helps in
assessing the model's performance on unseen data and
selecting the best hyperparameters.
4. Regularization: Applying Ridge or Lasso regression can help
in reducing model complexity and preventing overfitting.
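Feature scaling and k-fold cross-validation can be combined in a short sketch (pure NumPy for self-containment; real projects typically use scikit-learn's StandardScaler and cross_val_score — the 5-fold split and the synthetic "square footage" feature here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n = 200
x = rng.uniform(100, 1000, size=n)             # e.g. square footage
y = 50.0 + 0.1 * x + rng.normal(0, 5, size=n)  # price with noise (sd = 5)

# Feature scaling: standardize to zero mean, unit variance.
x_std = (x - x.mean()) / x.std()

# 5-fold cross-validation of a simple least-squares fit.
k = 5
idx = rng.permutation(n)
folds = np.array_split(idx, k)
mses = []
for i in range(k):
    test = folds[i]
    train = np.concatenate([folds[j] for j in range(k) if j != i])
    b1, b0 = np.polyfit(x_std[train], y[train], deg=1)
    pred = b0 + b1 * x_std[test]
    mses.append(((pred - y[test]) ** 2).mean())

cv_mse = float(np.mean(mses))
print(cv_mse)  # should be close to the noise variance, 25
```

Note that to avoid leakage, a stricter version would compute the scaling statistics on each training fold only; scaling on the full dataset is kept here for brevity.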
