
Derivation of Normal Equations and R² in Linear Regression

TG Mathivha

March 19, 2025

Introduction
Linear Regression Model:

y = β0 + β1 x + ε (1)

where:
y is the dependent variable.
x is the independent variable.
β0 is the intercept.
β1 is the slope.
ε is the error term.
Note:
We call ŷ = β0 + β1 x the fitted value produced by the regression model.
The assumption underlying the model is that the dependent variable y depends linearly on the parameters; nonlinear transformations of the independent variables are allowed.
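
To make the model concrete, here is a minimal Python sketch that simulates data from equation (1). The parameter values, sample size, and use of numpy are illustrative assumptions, not part of the slides:

    import numpy as np

    rng = np.random.default_rng(0)            # fixed seed for reproducibility
    beta0_true, beta1_true = 1.0, 2.0         # assumed true intercept and slope
    x = np.linspace(0.0, 10.0, 50)            # independent variable
    eps = rng.normal(0.0, 1.0, size=x.shape)  # error term ε, here drawn from N(0, 1)
    y = beta0_true + beta1_true * x + eps     # dependent variable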
Objective: Least Squares Minimization

Goal: Find β0 and β1 that minimize the sum of squared errors over all n data points:

J(β0, β1) = Σ_{i=1}^n (yi − (β0 + β1 xi))²    (2)
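
Equation (2) translates directly into code. A minimal sketch (the function name and numpy usage are illustrative):

    import numpy as np

    def sse_loss(beta0: float, beta1: float, x: np.ndarray, y: np.ndarray) -> float:
        # J(β0, β1): sum of squared errors over all n data points, per equation (2)
        residuals = y - (beta0 + beta1 * x)
        return float(np.sum(residuals ** 2))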

Derivation of Normal Equations

Compute Partial Derivatives


∂J/∂β0 = −2 Σ_{i=1}^n (yi − (β0 + β1 xi)) = 0    (3)

∂J/∂β1 = −2 Σ_{i=1}^n xi (yi − (β0 + β1 xi)) = 0    (4)
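
As a sanity check, the factored forms (3) and (4) can be verified symbolically. A minimal sketch assuming sympy is available (n = 3 is an arbitrary small example):

    import sympy as sp

    b0, b1 = sp.symbols('beta0 beta1')
    xs = sp.symbols('x1:4')  # x1, x2, x3
    ys = sp.symbols('y1:4')  # y1, y2, y3

    # J(β0, β1) for n = 3 data points
    J = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))

    # Differences between sympy's derivatives and the slides' factored forms; both print 0
    print(sp.simplify(sp.diff(J, b0) - (-2) * sum(y - (b0 + b1 * x) for x, y in zip(xs, ys))))
    print(sp.simplify(sp.diff(J, b1) - (-2) * sum(x * (y - (b0 + b1 * x)) for x, y in zip(xs, ys))))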

Solving for β0 and β1

Expand and Solve


Σ yi = n β0 + β1 Σ xi    (5)

Σ xi yi = β0 Σ xi + β1 Σ xi²    (6)

Solve for Parameters


β1 = (n Σ xi yi − Σ xi Σ yi) / (n Σ xi² − (Σ xi)²)    (7)

β0 = (Σ yi − β1 Σ xi) / n    (8)
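
Equations (7) and (8) translate into a few lines of code. A minimal sketch using numpy (the function name and toy data are illustrative):

    import numpy as np

    def fit_simple_ols(x: np.ndarray, y: np.ndarray) -> tuple[float, float]:
        # Closed-form least-squares estimates from equations (7) and (8)
        n = len(x)
        beta1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x ** 2) - np.sum(x) ** 2)
        beta0 = (np.sum(y) - beta1 * np.sum(x)) / n
        return float(beta0), float(beta1)

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.1, 3.9, 6.2, 7.8])
    beta0, beta1 = fit_simple_ols(x, y)
    print(beta0, beta1)  # approximately 0.15 and 1.94 for this toy data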

Coefficient of Determination R²

Definition: Measures how well the model explains the variance in y.

R² = 1 − SSE/SST    (9)

Where:
SST = Σ (yi − ȳ)² (total sum of squares: the total variance in y)
SSE = Σ (yi − ŷi)² (residual sum of squares)
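
Continuing the sketch from the previous slide, R² follows directly from equation (9); the values 0.15 and 1.94 are the estimates from that toy example:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.1, 3.9, 6.2, 7.8])
    beta0, beta1 = 0.15, 1.94            # estimates from the earlier sketch

    y_hat = beta0 + beta1 * x            # fitted values ŷi
    sse = np.sum((y - y_hat) ** 2)       # residual sum of squares SSE
    sst = np.sum((y - np.mean(y)) ** 2)  # total sum of squares SST
    r2 = 1 - sse / sst
    print(r2)  # ≈ 0.996: the line explains about 99.6% of the variance in y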

Interpretation of R²

R² ≥ 0.8: strong fit
0.5 ≤ R² < 0.8: moderate fit
R² < 0.5: poor fit
A higher R² indicates a better fit; these cutoffs are rules of thumb, and what counts as a good fit varies by field.

Conclusion

Normal equations provide a closed-form solution for the regression parameters.

The coefficient of determination R² measures the model's goodness-of-fit.
