
Simple linear regression:

- 1 predictor (x) and 1 response (y)
- y = intercept + slope * x
- y = b0 + b1 * x (see the sketch below)
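A minimal sketch of the model equation in Python (the intercept, slope, and x value here are made-up numbers for illustration):

# Simple linear regression model: y = b0 + b1 * x
b0 = 2.0   # intercept (hypothetical value)
b1 = 0.5   # slope (hypothetical value)
x = 10
y_predicted = b0 + b1 * x
print(y_predicted)   # 2.0 + 0.5 * 10 = 7.0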

Need to find the best fit line.

 To do that, we need to measure the error.

 Residual = Observed or actual value - Predicted value

 The sum of the residuals (not the SSR) is always 0 for OLS estimators when the model includes an intercept.

 To get the total error in the model, square each residual and then add up the squared residuals for every data point. This is called the Sum of Squared Residuals (SSR).

SSR - the sum of the squared differences between each observed value and its associated predicted value

(ex. if a residual = -1.5, the squared residual is 2.25); to get the SSR, add up all the squared residuals.
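As a sketch, this is how the residuals and the SSR could be computed in Python for a small made-up dataset (the observed and predicted values below are assumptions for illustration):

# Made-up observed y values and the values a candidate line predicts for them
observed = [3.0, 4.5, 6.0, 7.5]
predicted = [3.5, 4.0, 6.5, 7.0]

# Residual = observed value - predicted value
residuals = [obs - pred for obs, pred in zip(observed, predicted)]

# SSR: square each residual, then add the squared residuals up
ssr = sum(r ** 2 for r in residuals)
print(residuals)   # [-0.5, 0.5, -0.5, 0.5]
print(ssr)         # 1.0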

For linear regression, we'll be using a technique called Ordinary Least Squares (OLS) to find the best fit line.

 OLS - a method that minimizes the SSR to estimate the parameters of a linear regression model. Using OLS, we can calculate b0 (the intercept) and b1 (the slope).
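As a hedged sketch, the OLS estimates have a closed form: b1 is the sum of (x - x_mean) * (y - y_mean) divided by the sum of (x - x_mean)^2, and b0 = y_mean - b1 * x_mean. The data below are made up for illustration:

# Made-up data for illustration
x = [1, 2, 3, 4, 5]
y = [2.1, 4.3, 5.9, 8.2, 9.8]

n = len(x)
x_mean = sum(x) / n
y_mean = sum(y) / n

# Closed-form OLS estimates of the slope and intercept
b1 = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y)) / sum((xi - x_mean) ** 2 for xi in x)
b0 = y_mean - b1 * x_mean

residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
print(b0, b1)
print(sum(residuals))   # for an OLS fit with an intercept, the residuals sum to ~0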

 To calculate the SSR: 1st, find the predicted y value for each x; then find the residual for every observed value of x (the residuals are the differences between each observed value and what the line predicted); finally, square each residual and add them up.

- The above method is time consuming if you test out every candidate line by hand to see which fits best.

USE PYTHON (it uses OLS to fit and return the best line)
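A minimal sketch using the statsmodels library (assuming statsmodels is installed; the data below are made up for illustration). Its summary output also reports the p-values and confidence intervals mentioned next:

import statsmodels.api as sm

# Made-up data for illustration
x = [1, 2, 3, 4, 5]
y = [2.1, 4.3, 5.9, 8.2, 9.8]

X = sm.add_constant(x)        # adds the intercept column so OLS can estimate b0
model = sm.OLS(y, X).fit()    # fits the line that minimizes the SSR

print(model.params)           # b0 (const) and b1 (slope)
print(model.ssr)              # sum of squared residuals for the fitted line
print(model.summary())        # the summary includes p-values and confidence intervals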

Later, we'll talk about uncertainty using p-values and confidence intervals to aid in the interpretation of results.
