
Econometrics: Chapter 4: Bivariate Regression Model

This document provides an overview of bivariate regression models. It defines the population and sample regression functions, and outlines the classical assumptions of the regression model, including linearity, zero mean errors, homoskedasticity, no autocorrelation, non-stochastic X variables, and variation in X. It also discusses OLS estimation, the properties of OLS estimators, measures of fit like R2, limitations of R2, and techniques for forecasting with regression models. Exercises are provided to apply these concepts using sample data sets.

Uploaded by: Nuur Ahmed
Copyright: © Attribution Non-Commercial (BY-NC)

Econometrics

Chapter 4: Bivariate Regression Model

Population Regression Function


The relationship between the dependent and the independent variable in the population:

Yi = β0 + β1Xi + ui,

where β0 and β1 are the population intercept and slope parameters and ui is the error term.

Sample regression function

The relationship between the dependent and independent variables in the sample. It requires estimation of the intercept and slope parameters from the sample data:

Ŷi = b0 + b1Xi,

where b0 and b1 are the sample estimates of β0 and β1.

Population regression function and sample observations

Population and sample regression functions

Assumptions of the classical regression model

Ideal conditions that guarantee that estimated parameters are unbiased, consistent, and attain the lowest variance among linear unbiased estimators.

Assumption 1: linearity

A linear relationship between the dependent and independent variable: Yi = β0 + β1Xi + ui (linear in the parameters).

Assumption 2: E(u)=0

The error terms have zero mean: E(ui) = 0 for all i.

Assumption 3: Homoskedasticity

The error terms have a constant variance: Var(ui) = σ² for all i.

Heteroskedastic vs. homoskedastic error processes

Assumption 4: No autocorrelation

The error terms are independent across observations: Cov(ui, uj) = 0 for all i ≠ j.

Assumption 5: Nonstochastic X

The Xi are nonstochastic (not random). Common violations:

- measurement error in X
- endogenous explanatory variables

This assumption guarantees that the covariance between the independent variable and the error term will be zero.

Violation of Assumption 5

Assumption 6: Variation in X

There is some variation in the independent variable (X); otherwise the denominator of the slope estimator, Σ(Xi − X̄)², would be zero and the slope could not be estimated.

OLS estimation

Ordinary least squares (OLS) chooses the intercept and slope estimates to minimize the sum of squared residuals: min over b0, b1 of Σ(Yi − b0 − b1Xi)².

OLS Estimators

b1 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)²

b0 = Ȳ − b1X̄
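A minimal sketch of the OLS estimators applied to a small hypothetical data set (the numbers are made up for illustration):

```python
# OLS estimators for the bivariate model Y = b0 + b1*X + u.

def ols(x, y):
    """Return (b0, b1) minimizing the sum of squared residuals."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Slope: b1 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b1 = sxy / sxx
    # Intercept: b0 = Ybar - b1 * Xbar
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Hypothetical sample data
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = ols(x, y)
print(b0, b1)
```

Note that the assumption of variation in X is what keeps `sxx` nonzero, so the division is well defined.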

Properties of OLS estimators

Under the conditions of the classical regression model, OLS estimators are:

- unbiased
- consistent
- best linear unbiased estimators (BLUE) (Gauss-Markov theorem)

Variance of the intercept estimator

Var(b0) = σ² ΣXi² / (n Σ(Xi − X̄)²)

Variance of the slope estimator

Var(b1) = σ² / Σ(Xi − X̄)²

Covariance of intercept and slope terms

Cov(b0, b1) = −σ² X̄ / Σ(Xi − X̄)²

Example: Consumption function

Coefficient of Determination R2
With a bit of algebraic manipulation, the total variation in Y decomposes into an explained and an unexplained part: Σ(Yi − Ȳ)² = Σ(Ŷi − Ȳ)² + Σei²

Explained and unexplained deviation

R2

TSS = RSS + ESS, where RSS is the regression (explained) sum of squares and ESS is the error (residual) sum of squares

R² = RSS/TSS, or equivalently R² = 1 − (ESS/TSS)

R² is the proportion of the variation in Y explained by the regression model, so 0 ≤ R² ≤ 1
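A short sketch of the R² computation on hypothetical data, using the decomposition above (an intercept is included, so TSS = RSS + ESS holds):

```python
# R^2 = RSS/TSS for a fitted bivariate regression (hypothetical data).

def r_squared(x, y):
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    # OLS fit
    b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sum(
        (xi - x_bar) ** 2 for xi in x)
    b0 = y_bar - b1 * x_bar
    fitted = [b0 + b1 * xi for xi in x]
    tss = sum((yi - y_bar) ** 2 for yi in y)                 # total SS
    ess = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))   # error (residual) SS
    rss = tss - ess                                          # regression (explained) SS
    return rss / tss  # equivalently 1 - ess/tss

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
print(round(r_squared(x, y), 4))
```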

R² = 1: all sample observations lie exactly on the fitted line (ESS = 0)

R² = 0: the regression explains none of the variation in Y (RSS = 0)

R2 and the intercept

R² may be appropriately computed as RSS/TSS only if an intercept term is included in the regression model; without an intercept, the decomposition TSS = RSS + ESS no longer holds.

Cautions in interpreting R2

R² is not a statistic that can be directly used for hypothesis testing. If there is a large random component in the data-generating process, R² will be low even if the model is correctly specified. R² is a measure of correlation, not causation.

Forecasting

Forecasts have lower variance when:


- the variance of the error term is smaller,
- the sample size is larger, or
- the value of X is close to the sample mean
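All three conditions above can be read off the standard forecast-error variance formula for a forecast at X = x0, Var = σ²[1 + 1/n + (x0 − X̄)²/Σ(Xi − X̄)²]; a small sketch with hypothetical values:

```python
# Forecast-error variance for a bivariate OLS forecast at X = x0.
# sigma2, n, x_bar, sxx, and x0 below are hypothetical values for illustration.

def forecast_error_variance(sigma2, n, x_bar, sxx, x0):
    """Var = sigma2 * (1 + 1/n + (x0 - xbar)^2 / Sxx)."""
    return sigma2 * (1.0 + 1.0 / n + (x0 - x_bar) ** 2 / sxx)

# Variance is smallest when forecasting at the sample mean of X...
at_mean = forecast_error_variance(sigma2=4.0, n=20, x_bar=10.0, sxx=50.0, x0=10.0)
# ...and grows as x0 moves away from the mean.
far_out = forecast_error_variance(sigma2=4.0, n=20, x_bar=10.0, sxx=50.0, x0=15.0)
print(at_mean, far_out)
```

A larger n and a smaller σ² shrink both values, matching the list above.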

Forecast accuracy

Assignments:

- Select one of the data sets accompanying the text and compute basic descriptive statistics (mean, variance, correlations).
- pp. 147-154, #10, 18
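A starting point for the first assignment, sketched on a made-up income/consumption sample (with one of the text's data sets, read the columns in the same way):

```python
# Basic descriptive statistics: mean, sample variance, Pearson correlation.
import math
import statistics

# Hypothetical data standing in for a data set from the text
income = [10.0, 12.0, 14.0, 16.0, 18.0]
consumption = [8.0, 9.5, 11.5, 12.5, 14.5]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(statistics.mean(income))        # sample mean
print(statistics.variance(income))    # sample variance (n - 1 divisor)
print(round(pearson(income, consumption), 4))
```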
