Regression
TG Mathivha
y = β0 + β1 x + ε (1)
where:
y is the dependent variable.
x is the independent variable.
β0 is the intercept.
β1 is the slope.
ε is the error term.
Note:
We call ŷ = β0 + β1 x the value approximated by the regression model.
The assumption underlying the model is that the dependent variable y
depends linearly on the parameters; nonlinear transformations of the
independent variables are allowed.
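As a rough illustration (not part of the slides), the NumPy sketch below evaluates ŷ = β0 + β1 x and then the same model with x replaced by x², showing that a nonlinear transformation of the independent variable still leaves the model linear in the parameters. The function name predict and the sample values are made up for this example.

import numpy as np

def predict(beta0, beta1, x):
    """Return y_hat = beta0 + beta1 * x for scalar or array x."""
    return beta0 + beta1 * np.asarray(x, dtype=float)

x = np.array([1.0, 2.0, 3.0])
print(predict(0.5, 2.0, x))       # linear in x:     [2.5 4.5 6.5]
print(predict(0.5, 2.0, x ** 2))  # linear in x**2:  [2.5 8.5 18.5]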
Objective: Least Squares Minimization
Goal: Find β0 and β1 that minimize the sum of squared errors over all n
data points:
J(β0, β1) = Σ_{i=1}^{n} (yi − (β0 + β1 xi))²    (2)
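A minimal sketch (not from the slides) of evaluating the objective J(β0, β1) in Eq. (2) and minimizing it with NumPy's least-squares solver; the data arrays x and y below are invented for illustration.

import numpy as np

def sse(beta0, beta1, x, y):
    """Sum of squared errors J(beta0, beta1) = sum_i (y_i - (beta0 + beta1 x_i))^2."""
    residuals = y - (beta0 + beta1 * x)
    return np.sum(residuals ** 2)

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.0, 6.2, 7.9])

# Design matrix [1, x]; lstsq returns the (beta0, beta1) that minimize J.
X = np.column_stack([np.ones_like(x), x])
beta0, beta1 = np.linalg.lstsq(X, y, rcond=None)[0]

print(beta0, beta1, sse(beta0, beta1, x, y))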