Simple Linear Regression Analysis

This document provides an outline and introduction to simple linear regression analysis. It defines regression analysis and distinguishes it from causation. It describes the ordinary least squares (OLS) method used to estimate regression parameters including the assumptions of the classical linear regression model. It discusses properties of least squares estimators such as being best linear unbiased and having minimum variance. It also defines the coefficient of determination and standard errors of OLS estimates.

School of Business, Economics and Management
Rabecca Hatoongo-Masenke

Introduction to Econometrics
Date
Outline
2. Simple Linear Regression Analysis
• Assumptions of classical linear regression model.
• Estimation of regression parameters.
• Properties of least squares estimators.
Outline Cont
• Maximum likelihood estimators.
• Sampling distribution of least squares estimators.
• Confidence intervals, hypothesis testing, and goodness of fit (R²).
• Reporting the results of regression analysis.
Definition of regression
• Regression analysis is concerned with the study of the dependence of
one variable, the dependent variable, on one or more other variables,
the explanatory variables, with a view to estimating and/or predicting
the (population) mean or average value of the former in terms of the
known or fixed (in repeated sampling) values of the latter.
• Regression Versus Causation: Although regression analysis deals with
the dependence of one variable on other variables, it does not
necessarily imply causation. In the words of Kendall and Stuart, "A
statistical relationship, however strong and however suggestive, can
never establish causal connection: our ideas of causation must come
from outside statistics, ultimately from some theory or other."
Method of OLS
• Recall the two-variable PRF:
Yi = β1 + β2Xi + ui
• The PRF is not directly observable. Thus we estimate the SRF:
Yi = β̂1 + β̂2Xi + ûi
   = Ŷi + ûi
so that ûi = Yi − Ŷi,
• which shows that the ûi (the residuals) are simply the differences
between the actual and estimated Y values.
Method of OLS Cont
• Now given n pairs of observations on Y and X, we would like to
determine the SRF in such a manner that it is as close as possible to
the actual Y. To this end, we may adopt the following criterion:
choose the SRF in such a way that the sum of the residuals
Σûi = Σ(Yi − Ŷi) is as small as possible.
• However, summing alone does not give a true picture: large positive
residuals can cancel large negative ones, leaving a small sum even
when the fit is poor.
• Hence the Ordinary Least Squares (OLS) method minimizes the sum of
squared residuals:
Σûi² = Σ(Yi − β̂1 − β̂2Xi)²
• The principle or method of least squares chooses β̂1 and β̂2 in such a
manner that, for a given sample or set of data, Σûi² is as small as possible.
• We derive the OLS estimators (Demonstrate)
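The derivation yields the familiar closed-form estimators β̂2 = Σxiyi / Σxi² and β̂1 = Ȳ − β̂2X̄, where xi and yi denote deviations from the sample means. A minimal sketch in Python; the sample data here are invented purely for illustration:

```python
import numpy as np

# Invented illustrative sample of n pairs (X, Y)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Deviations from the sample means
x = X - X.mean()
y = Y - Y.mean()

beta2_hat = (x * y).sum() / (x ** 2).sum()   # slope: Σxy / Σx²
beta1_hat = Y.mean() - beta2_hat * X.mean()  # intercept: Ȳ − β̂2·X̄
```

The same estimates can be recovered with `np.polyfit(X, Y, 1)`, which fits the same least-squares line.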
Properties
• Expressed in terms of observable sample quantities.
• They are point estimators.
• The regression line passes through the sample means (X̄, Ȳ).
• The mean value of the estimated Y (Ȳ̂) is equal to the mean value of
the actual Y (Ȳ).
• The mean value of the residuals is zero.
• The residuals are uncorrelated with Xi and with the predicted Ŷi.
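These numerical properties can be verified directly from any OLS fit. A sketch on invented illustrative data, with each property checked as an assertion:

```python
import numpy as np

# Invented illustrative sample
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

x, y = X - X.mean(), Y - Y.mean()
b2 = (x * y).sum() / (x ** 2).sum()  # slope estimate
b1 = Y.mean() - b2 * X.mean()        # intercept estimate
Y_hat = b1 + b2 * X                  # fitted (estimated) Y values
u_hat = Y - Y_hat                    # residuals

# Line passes through the means: Ȳ = b1 + b2·X̄ by construction
assert abs(Y.mean() - (b1 + b2 * X.mean())) < 1e-12
# Mean of fitted Y equals mean of actual Y
assert abs(Y_hat.mean() - Y.mean()) < 1e-12
# Mean residual is zero
assert abs(u_hat.mean()) < 1e-12
# Residuals are uncorrelated with X and with the fitted Ŷ
assert abs((u_hat * X).sum()) < 1e-9
assert abs((u_hat * Y_hat).sum()) < 1e-9
```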
Assumptions of CLRM
• Assumption 1: Linear regression model. The regression model is
linear in the parameters: Yi = β1 + β2Xi + ui
• Assumption 2: X values are fixed in repeated sampling. Values taken
by the regressor X are considered fixed in repeated samples. More
technically, X is assumed to be nonstochastic.
• Assumption 3: Zero mean value of disturbance ui. Given the value of X,
the mean, or expected, value of the random disturbance term ui is zero.
Technically, the conditional mean value of ui is zero:
E(ui | Xi) = 0
Assumptions Cont
• Assumption 4: Homoscedasticity or equal variance of ui. Given the
value of X, the variance of ui is the same for all observations. That is,
the conditional variances of ui are identical:
var(ui | Xi) = E[ui − E(ui | Xi)]²
             = E(ui² | Xi) = σ²
• Assumption 5: No autocorrelation between the disturbances. Given
any two X values, Xi and Xj (i ≠ j), the correlation between any two ui
and uj (i ≠ j) is zero:
cov(ui, uj | Xi, Xj) = 0
Assumptions Cont
• Assumption 6: Zero covariance between ui and Xi, or E(uiXi) = 0. Formally,
cov(ui, Xi) = E[ui − E(ui)][Xi − E(Xi)]
            = E[ui (Xi − E(Xi))]      since E(ui) = 0
            = E(uiXi) − E(Xi)E(ui)    since E(Xi) is nonstochastic
            = E(uiXi)                 since E(ui) = 0
            = 0                       by assumption
Assumptions Cont
• Assumption 7: The number of observations n must be greater than
the number of parameters to be estimated. Alternatively, the
number of observations n must be greater than the number of
explanatory variables.
• Assumption 8: Variability in X values. The X values in a given sample
must not all be the same. Technically, var (X) must be a finite positive
number.
• Assumption 9: The regression model is correctly specified.
Alternatively, there is no specification bias or error in the model used
in empirical analysis
Assumptions Cont
• Assumption 10: There is no perfect multicollinearity. That is, there
are no perfect linear relationships among the explanatory variables.
Standard Errors of OLS Estimates
• var(β̂2) = σ² / Σxi²
• se(β̂2) = σ / √(Σxi²)
• var(β̂1) = σ² ΣXi² / (n Σxi²)
• se(β̂1) = √[σ² ΣXi² / (n Σxi²)]
where xi = Xi − X̄ denotes deviations from the sample mean.
• Note that all values can be estimated from the data except σ². This is
estimated using the following formula:
σ̂² = Σûi² / (n − 2)

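The formulas above translate directly into code. A sketch on invented illustrative data, estimating σ² from the residuals and then forming the variances and standard errors:

```python
import numpy as np

# Invented illustrative sample
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(X)

x = X - X.mean()  # deviations from the sample mean
b2 = (x * (Y - Y.mean())).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
u_hat = Y - (b1 + b2 * X)  # residuals

sigma2_hat = (u_hat ** 2).sum() / (n - 2)  # σ̂² = Σû² / (n − 2)

var_b2 = sigma2_hat / (x ** 2).sum()                         # var(β̂2)
var_b1 = sigma2_hat * (X ** 2).sum() / (n * (x ** 2).sum())  # var(β̂1)
se_b2, se_b1 = np.sqrt(var_b2), np.sqrt(var_b1)
```

Note that var(β̂1) can equivalently be written σ²(1/n + X̄²/Σxi²), since ΣXi² = Σxi² + nX̄².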
Properties of Least Square Estimators
• Best linear unbiasedness (BLUE) property:
1. Linear: a linear function of a random variable, such as Y.
2. Unbiased: E(β̂) should be equal to the true value β.
3. Minimum variance in the class of all such linear unbiased estimators.
An unbiased estimator with minimum variance is known as an efficient
estimator.
Read more!
Coefficient of Determination
• Shows how much of the variation in Y is explained by X.
• TSS = ESS + RSS
• r² = ESS / TSS = 1 − RSS / TSS
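Continuing the invented illustrative sample, the decomposition TSS = ESS + RSS and both forms of r² can be computed directly:

```python
import numpy as np

# Invented illustrative sample
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

x = X - X.mean()
b2 = (x * (Y - Y.mean())).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
Y_hat = b1 + b2 * X
u_hat = Y - Y_hat

TSS = ((Y - Y.mean()) ** 2).sum()      # total sum of squares
ESS = ((Y_hat - Y.mean()) ** 2).sum()  # explained sum of squares
RSS = (u_hat ** 2).sum()               # residual sum of squares

r2 = ESS / TSS  # share of variation in Y explained by X
# equivalently: r2 = 1 - RSS / TSS, since TSS = ESS + RSS
```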
Reference
Gujarati, D. N. (2004). Basic Econometrics, 4th Edition. McGraw Hill
Education (India) Private Limited, New Delhi.
