L3_twoVariable_re_estimation_2023 (two variable regression)

Two-Variable Regression Model:
The Problem of Estimation
The Method of Ordinary Least Squares
● Two-variable PRF: Yᵢ = β₁ + β₂Xᵢ + μᵢ
● The PRF is not directly observable. We estimate it from the SRF: Yᵢ = β̂₁ + β̂₂Xᵢ + μ̂ᵢ = Ŷᵢ + μ̂ᵢ
● But how is the SRF itself determined?


SRF
● Consider the following SRF: Ŷᵢ = β̂₁ + β̂₂Xᵢ
● One idea: choose the SRF in such a way that the sum of the residuals, Σμ̂ᵢ = Σ(Yᵢ − Ŷᵢ), is as small as possible.
● This criterion may lead to cancellation of errors: large positive and large negative residuals can offset each other, so even a poorly fitting line can have a residual sum near zero.
SRF
● We can avoid this problem if we adopt the least-squares criterion: choose the SRF so that Σμ̂ᵢ² = Σ(Yᵢ − Ŷᵢ)² is as small as possible.
Normal equations
● Minimizing Σμ̂ᵢ² with respect to β̂₁ and β̂₂ yields the following normal equations:
ΣYᵢ = nβ̂₁ + β̂₂ΣXᵢ
ΣXᵢYᵢ = β̂₁ΣXᵢ + β̂₂ΣXᵢ²
● Solving the normal equations simultaneously, we obtain
β̂₂ = Σxᵢyᵢ / Σxᵢ²  (where xᵢ = Xᵢ − X̄ and yᵢ = Yᵢ − Ȳ)
β̂₁ = Ȳ − β̂₂X̄
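The closed-form solution above can be sketched in numpy; the data here is hypothetical, chosen only to illustrate the formulas.

```python
import numpy as np

# Hypothetical paired observations (not from the lecture's dataset).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

x = X - X.mean()  # deviations x_i = X_i - X-bar
y = Y - Y.mean()  # deviations y_i = Y_i - Y-bar

b2 = (x * y).sum() / (x ** 2).sum()  # slope: b2 = Σx_i y_i / Σx_i²
b1 = Y.mean() - b2 * X.mean()        # intercept: b1 = Y-bar − b2 · X-bar
```

Working in deviations from the means is just an algebraic shortcut; it gives the same estimates as solving the two normal equations directly.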
Statistical properties of OLS estimators
● The regression line passes through the sample means of Y and X, i.e. Ȳ = β̂₁ + β̂₂X̄.
● Second, the mean of the fitted values equals the mean of the actual values: mean(Ŷᵢ) = Ȳ.
● The mean value of the residuals is zero: Σμ̂ᵢ = 0.
● The residuals are uncorrelated with the predicted Y: Σμ̂ᵢŶᵢ = 0.
● The residuals are uncorrelated with X: Σμ̂ᵢXᵢ = 0.
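These properties hold by construction for any OLS fit with an intercept, so they can be checked numerically; the data below is again hypothetical.

```python
import numpy as np

# Hypothetical data; the identities below hold for any OLS fit with an intercept.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 2.5, 3.5, 5.0, 4.5])

x = X - X.mean()
b2 = (x * (Y - Y.mean())).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()

Y_hat = b1 + b2 * X  # fitted values
u_hat = Y - Y_hat    # residuals

print((u_hat).sum())          # ≈ 0: residuals sum (and hence average) to zero
print((u_hat * X).sum())      # ≈ 0: residuals uncorrelated with X
print((u_hat * Y_hat).sum())  # ≈ 0: residuals uncorrelated with fitted Y
```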
The Classical Linear Regression Model: The Assumptions Underlying
the Method of Least Squares
● The Gaussian, standard, or classical linear regression model
(CLRM) makes 7 assumptions.
● ASSUMPTION 1 Linear Regression Model: The regression model
is linear in the parameters, though it may or may not be linear in
the variables. That is, Yᵢ = β₁ + β₂Xᵢ + μᵢ.
ASSUMPTION 2
● Fixed X Values or X Values Independent of the Error Term.
● cov(Xᵢ, μᵢ) = 0
ASSUMPTION 3
● Zero Mean Value of Disturbance: E(μᵢ|Xᵢ) = 0.
● If the conditional mean of one random variable given another
random variable is zero, the covariance between the two
variables is zero, and hence the two variables are uncorrelated.
● If X and μ are correlated, it is not possible to assess their
individual effects on Y.
ASSUMPTION 4
● Homoscedasticity or Constant Variance of μᵢ: The variance of the
error, or disturbance, term is the same regardless of the value of
X: var(μᵢ|Xᵢ) = σ².
Homoscedasticity
● It also means that the conditional variance of Y is constant: var(Yᵢ|Xᵢ) = σ².
● Whereas under heteroscedasticity the variance differs across observations: var(μᵢ|Xᵢ) = σᵢ².
ASSUMPTION 5
● No Autocorrelation between the Disturbances: cov(μᵢ, μⱼ) = 0 for i ≠ j.
ASSUMPTION 6
● The Number of Observations n Must Be Greater than the
Number of Parameters to Be Estimated.
ASSUMPTION 7
● The Nature of X Variables: The X values in a given sample must
not all be the same.
● If all the X values are identical, then Xᵢ = X̄, so Σxᵢ² = 0 and the
denominator of the slope formula β̂₂ = Σxᵢyᵢ / Σxᵢ² will be zero.
Precision or Standard Errors of Least-Squares Estimates
● Standard errors of the OLS estimates:
var(β̂₂) = σ² / Σxᵢ²,  se(β̂₂) = σ / √(Σxᵢ²)
var(β̂₁) = σ²ΣXᵢ² / (nΣxᵢ²),  se(β̂₁) = √[σ²ΣXᵢ² / (nΣxᵢ²)]
● σ² is estimated by σ̂² = Σμ̂ᵢ² / (n − 2); its positive square root, σ̂,
is called the standard error of estimate or the
standard error of the regression (se).
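A numpy sketch of these variance formulas, using hypothetical data (the estimator σ̂² divides by n − 2 because two parameters are estimated):

```python
import numpy as np

# Hypothetical data for illustrating the standard-error formulas.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.0, 4.1, 5.8, 8.2, 9.9, 12.1])

n = len(X)
x = X - X.mean()
b2 = (x * (Y - Y.mean())).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
u_hat = Y - (b1 + b2 * X)

sigma2_hat = (u_hat ** 2).sum() / (n - 2)     # sigma-hat² = Σu_hat_i² / (n − 2)
se_b2 = np.sqrt(sigma2_hat / (x ** 2).sum())  # se of the slope
se_b1 = np.sqrt(sigma2_hat * (X ** 2).sum() / (n * (x ** 2).sum()))  # se of the intercept
```

Note that the two standard errors are linked: se(β̂₁) = se(β̂₂) · √(ΣXᵢ²/n), which follows directly from the formulas above.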
Example : Food File
● For the food expenditure data, the simple regression command is:
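The slide's actual command and dataset are not shown here. As a stand-in, a minimal sketch of a food-expenditure regression in Python, with made-up income/food figures (not the lecture's data):

```python
import numpy as np

# Hypothetical weekly income (X) and food expenditure (Y) figures.
income = np.array([80, 100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)
food = np.array([65, 70, 84, 80, 118, 130, 95, 135, 145, 175], dtype=float)

# Design matrix with a column of ones for the intercept.
A = np.column_stack([np.ones_like(income), income])

# Least-squares solution of A @ [b1, b2] ≈ food.
coef, *_ = np.linalg.lstsq(A, food, rcond=None)
b1, b2 = coef  # intercept and slope
```

Any OLS routine (e.g. a stats package's regression command) would return the same two coefficients for this model.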
Properties of Least-Squares Estimators: The Gauss–Markov
Theorem
● An estimator is said to be the best linear unbiased estimator (BLUE) if
the following hold:
● It is linear
● It is unbiased
● It has minimum variance in the class of all such linear unbiased
estimators; an unbiased estimator with the least variance is
known as an efficient estimator.
Gauss–Markov Theorem
● Given the assumptions of the classical linear regression model,
the least-squares estimators, in the class of unbiased linear
estimators, have minimum variance, that is, they are BLUE.

● The Gauss–Markov theorem makes no assumptions about the
probability distribution of the random variable μᵢ, and therefore
of Y. As long as the assumptions of the CLRM are satisfied, the
theorem holds.
The Coefficient of Determination r2: A Measure of
“Goodness of Fit”
● A summary measure that tells how well the sample regression line
fits the data.
Coefficient of Determination
● r² = ESS/TSS = Σŷᵢ² / Σyᵢ² = 1 − RSS/TSS = 1 − Σμ̂ᵢ² / Σyᵢ²
● Further, 0 ≤ r² ≤ 1: an r² of 1 means a perfect fit, while an r² of 0
means no linear relationship between Y and X.
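In the two-variable model, r² also equals the squared sample correlation between X and Y; a quick numerical check with hypothetical data:

```python
import numpy as np

# Hypothetical data for illustrating r² = 1 − RSS/TSS.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([1.8, 4.2, 5.9, 8.1, 9.8])

x = X - X.mean()
b2 = (x * (Y - Y.mean())).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
Y_hat = b1 + b2 * X

tss = ((Y - Y.mean()) ** 2).sum()  # total sum of squares
rss = ((Y - Y_hat) ** 2).sum()     # residual sum of squares
r2 = 1 - rss / tss                 # coefficient of determination
```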