simple linear regression with example problem

Linear regression is a fundamental machine learning algorithm used for predictive analysis of continuous variables, establishing a linear relationship between dependent and independent variables. It can be categorized into Simple Linear Regression, which uses one independent variable, and Multiple Linear Regression, which uses multiple independent variables. The goal is to find the best fit line that minimizes the error between predicted and actual values, often using the Mean Squared Error (MSE) as the cost function to optimize the regression coefficients.

Uploaded by

Srikanth Sri
Copyright
© All Rights Reserved

Linear Regression in Machine Learning

Linear regression is one of the easiest and most popular Machine Learning algorithms. It is a
statistical method that is used for predictive analysis. Linear regression makes predictions for
continuous/real or numeric variables such as sales, salary, age, product price, etc.

The linear regression algorithm models a linear relationship between a dependent variable (y)
and one or more independent variables (x), hence the name linear regression. Because the
relationship is linear, the model describes how the value of the dependent variable changes as
the value of the independent variable changes.

The linear regression model provides a sloped straight line representing the relationship between
the variables. Mathematically, the line can be written as y = a0 + a1*x, where a0 is the
intercept and a1 is the slope of the line.
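As a concrete illustration (not part of the original document), the following minimal Python sketch fits such a sloped straight line to made-up data using ordinary least squares; the experience/salary numbers are invented for demonstration.

```python
# Minimal sketch: fit the line y = a0 + a1*x with ordinary least squares.
# The data values below are made up for illustration only.

def fit_simple_linear_regression(xs, ys):
    """Return intercept a0 and slope a1 of the least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by the variance of x.
    a1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    # Intercept: the line passes through the point of means.
    a0 = mean_y - a1 * mean_x
    return a0, a1

# Example: salary (in thousands) against years of experience.
experience = [1, 2, 3, 4, 5]
salary = [30, 35, 41, 44, 50]
a0, a1 = fit_simple_linear_regression(experience, salary)
print(f"y = {a0:.2f} + {a1:.2f} * x")  # prints "y = 25.30 + 4.90 * x"
```

The function names and data here are hypothetical; the closed-form slope/intercept formulas are the standard least-squares solution for one independent variable.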

Types of Linear Regression

Linear regression can be further divided into two types of algorithm:

 Simple Linear Regression:
If a single independent variable is used to predict the value of a numerical dependent
variable, then such a Linear Regression algorithm is called Simple Linear Regression.

 Multiple Linear Regression:
If more than one independent variable is used to predict the value of a numerical
dependent variable, then such a Linear Regression algorithm is called Multiple Linear
Regression.
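To make the distinction concrete, here is an illustrative sketch (not from the original document, and assuming NumPy is available): the same least-squares machinery handles both cases, and the only difference is the number of columns in the design matrix. All data values are synthetic.

```python
import numpy as np

# Synthetic data, for demonstration only.
x1 = np.array([1.0, 2.0, 3.0, 4.0])   # first independent variable
x2 = np.array([2.0, 1.0, 4.0, 3.0])   # second independent variable
y = np.array([5.0, 6.0, 11.0, 12.0])  # dependent variable

# Simple linear regression: design matrix has columns [1, x1].
X_simple = np.column_stack([np.ones_like(x1), x1])
coef_simple, *_ = np.linalg.lstsq(X_simple, y, rcond=None)

# Multiple linear regression: design matrix has columns [1, x1, x2].
X_multi = np.column_stack([np.ones_like(x1), x1, x2])
coef_multi, *_ = np.linalg.lstsq(X_multi, y, rcond=None)

print("simple coefficients:  ", coef_simple)  # [a0, a1]
print("multiple coefficients:", coef_multi)   # [a0, a1, a2]
```

The variable names are illustrative; `np.linalg.lstsq` simply solves the least-squares problem for whichever design matrix it is given.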
Linear Regression Line

A straight line showing the relationship between the dependent and independent variables is
called a regression line. A regression line can show two types of relationship:

 Positive Linear Relationship:
If the dependent variable increases on the Y-axis as the independent variable increases on
the X-axis, the relationship is termed a positive linear relationship.

 Negative Linear Relationship:
If the dependent variable decreases on the Y-axis as the independent variable increases on
the X-axis, the relationship is termed a negative linear relationship.
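In practice, the sign of the fitted slope distinguishes the two cases. A short sketch (the data below is made up for illustration):

```python
# Sketch: the sign of the fitted slope a1 tells whether the relationship
# is positive or negative. Data is invented for illustration.

def slope(xs, ys):
    """Least-squares slope of y against x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

rising = slope([1, 2, 3, 4], [10, 20, 30, 40])    # positive relationship
falling = slope([1, 2, 3, 4], [40, 30, 20, 10])   # negative relationship
print(rising, falling)  # prints "10.0 -10.0"
```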
Finding the best fit line:

When working with linear regression, the main goal is to find the best fit line, meaning the
error between the predicted values and the actual values should be minimized. The best fit line
has the least error.

Different values of the weights or line coefficients (a0, a1) give different regression lines,
so we need to calculate the best values for a0 and a1 to find the best fit line. To calculate
these, we use a cost function.

Cost function-

 Different values of the weights or line coefficients (a0, a1) give different regression
lines, and the cost function is used to estimate the coefficient values for the best fit
line.
 The cost function optimizes the regression coefficients or weights, and it measures how
well a linear regression model is performing.
 We can use the cost function to assess the accuracy of the mapping function, which maps
the input variable to the output variable. This mapping function is also known as the
hypothesis function.

For linear regression, we use the Mean Squared Error (MSE) cost function, which is the average
of the squared errors between the predicted values and the actual values. For the linear
equation y = a0 + a1*x, MSE can be calculated as:

MSE = (1/N) * Σ (yi − (a0 + a1*xi))²

where N is the total number of observations, yi is the actual value of the i-th observation,
and (a0 + a1*xi) is the corresponding predicted value.
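The MSE cost can be sketched directly from its definition. In this illustrative example (data invented for demonstration), the true relationship is y = 1 + 2x, so the correct coefficients give zero cost and any other choice gives a larger cost:

```python
# Sketch of the MSE cost: the average of squared differences between
# predicted values (a0 + a1*x) and actual values y. Data is illustrative.

def mse(xs, ys, a0, a1):
    n = len(xs)
    return sum((y - (a0 + a1 * x)) ** 2 for x, y in zip(xs, ys)) / n

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]          # exactly y = 1 + 2x

print(mse(xs, ys, 1.0, 2.0))   # best-fit coefficients -> 0.0
print(mse(xs, ys, 0.0, 2.0))   # worse intercept      -> 1.0
```

This shows why minimizing MSE over (a0, a1) yields the best fit line: the cost bottoms out exactly at the coefficients whose predictions match the data most closely.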
