Econometrics 5 and 6
Introduction to Econometrics
BSc Eco 2023, Spring 2025
Instructor: Sunaina Dhingra
Email-id: [email protected]
Lecture Date: 17th February (2 lectures combined)
Introduction to Sample Regression Function
Key Takeaways from PRF
• In reality, as we do not have data for the entire population, sample data is used
to discover relationships in the population
• The slope and intercept parameters for the population are estimated using
sample information
• An estimator is denoted with a hat sign "^"
• $\hat{\beta}_0$ and $\hat{\beta}_1$ are estimators; they are random variables that take as many values as the number of samples
• Sample regression function is defined below:
$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$ ------(8)
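• To make the SRF concrete, below is a minimal Stata sketch using the Wage1 dataset referred to later in these slides; the file name wage1.dta and the variable names wage and educ are assumptions based on the standard Wooldridge dataset, and the course Do file remains the reference for the exact commands.

* Load the Wage1 dataset (assumed to be saved as wage1.dta in the working directory)
use wage1, clear

* Estimate the sample regression function of wage on education
regress wage educ

* _b[_cons] and _b[educ] hold the estimated intercept and slope of the SRF
display "Intercept estimate = " _b[_cons] "   Slope estimate = " _b[educ]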
Sample Regression Function and Residuals
• The difference between the actual $y_i$ and the predicted $\hat{y}_i$, termed the "residual", is shown below:
$\hat{\mu}_i = y_i - \hat{y}_i$ -----(9)
• $\hat{\mu}_i$ is an estimator for the error disturbance $\mu_i$
$y_i = \hat{y}_i + \hat{\mu}_i$ -------(10)
• $\hat{\mu}_i$ can be positive or negative
Figure 1: SRF and the residuals
Source: Wooldridge, Chapter 2, Figure 2.4
Sample Regression Function and Residuals
• The figure shows the SRF, which is an estimator of the PRF, and the residuals, which are estimators of the errors or disturbances
Source: Author's estimation using the Wage1 dataset in STATA; refer to the Do file for the command
Estimation and Interpretation of OLS Estimators
Estimation of OLS parameters
$wage_i = \widehat{wage}_i + \hat{\mu}_i$
• $\widehat{wage}_i$ is the estimated conditional mean value of $wage_i$
• The parameters of this regression model are estimated using the ordinary least squares (OLS) method
Estimation of OLS Parameters
$wage_i = \hat{\beta}_0 + \hat{\beta}_1\, education_i + \hat{\mu}_i$ -------(2)
$\hat{\mu}_i = wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i$ -------(3)
• As the sum of disturbances equals zero, the mean also equals zero
$E(\mu) = \bar{\mu} = 0$ -------(4)
• As the average value of the residuals equals zero, we square the residuals, sum them, and then try to minimize that sum
$\min \sum_{i=1}^{n} \hat{\mu}_i^2 = \min \sum_{i=1}^{n} \left(wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i\right)^2$ -------(5)
• The estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ are obtained by minimizing the sum of squared residuals.
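• As an illustration of the least-squares criterion in equation (5), the Stata sketch below (again assuming the Wage1 data with variables wage and educ) computes the sum of squared residuals at the OLS estimates and shows that it rises when the slope is pushed away from its OLS value by an arbitrary 0.1.

* Estimate the regression and recover the residuals
use wage1, clear
quietly regress wage educ
predict u_ols, residuals

* Sum of squared residuals (SSR) at the OLS estimates
generate u_ols_sq = u_ols^2
quietly summarize u_ols_sq
display "SSR at the OLS estimates: " r(sum)

* Perturb the slope by 0.1; the resulting SSR should exceed the OLS minimum
generate u_alt = wage - _b[_cons] - (_b[educ] + 0.1)*educ
generate u_alt_sq = u_alt^2
quietly summarize u_alt_sq
display "SSR with the perturbed slope: " r(sum)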
Estimation of OLS Parameters
$\min \sum_{i=1}^{n} \hat{\mu}_i^2 = \min \sum_{i=1}^{n} \left(wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i\right)^2$ -------(5)
• Take the FOC of equation 5 with respect to $\hat{\beta}_0$, and equate it to 0
$-2 \sum_{i=1}^{n} \left(wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i\right) = 0$ -------(6)
• Take the FOC of equation 5 with respect to $\hat{\beta}_1$, and equate it to 0
$-2 \sum_{i=1}^{n} education_i \left(wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i\right) = 0$ -------(7)
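• The two first-order conditions can be checked numerically. The Stata sketch below, again assuming the Wage1 data with variables wage and educ, verifies that the OLS residuals sum to (approximately) zero, as in equation (6), and are orthogonal to education, as in equation (7).

* Estimate the regression and recover the residuals
use wage1, clear
quietly regress wage educ
predict resid_hat, residuals

* FOC (6): the residuals sum to zero up to rounding error
quietly summarize resid_hat
display "Sum of residuals: " r(sum)

* FOC (7): the residuals are orthogonal to education
generate educ_resid = educ * resid_hat
quietly summarize educ_resid
display "Sum of educ * residuals: " r(sum)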
Estimation of OLS Parameters
• Solving 6 and 7 simultaneously, we get the OLS estimators $\hat{\beta}_0$ and $\hat{\beta}_1$
$-2 \sum_{i=1}^{n} \left(wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i\right) = 0$ -------(6)
$-2 \sum_{i=1}^{n} education_i \left(wage_i - \hat{\beta}_0 - \hat{\beta}_1\, education_i\right) = 0$ -------(7)
$\hat{\beta}_0 = \overline{wage} - \hat{\beta}_1\, \overline{education}$ ------(8)
• Covariance and slope have the same sign: the slope estimator equals the sample covariance of x and y divided by the (positive) sample variance of x. Thus, the sign of the covariance determines the expected direction in which x affects y.
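• As a sketch of the closed-form solution under the same Wage1 assumptions, the Stata code below builds the slope estimate from sums of deviations (the sample covariance of educ and wage over the sample variance of educ) and then the intercept from equation (8); the results should match the regress output.

* Sample means of education and wage
use wage1, clear
quietly summarize educ
scalar educ_bar = r(mean)
quietly summarize wage
scalar wage_bar = r(mean)

* Slope estimate: sum of cross-deviations over sum of squared deviations of educ
generate cross_dev = (educ - educ_bar) * (wage - wage_bar)
generate educ_dev_sq = (educ - educ_bar)^2
quietly summarize cross_dev
scalar num = r(sum)
quietly summarize educ_dev_sq
scalar den = r(sum)
scalar b1_hat = num / den

* Intercept estimate from equation (8)
scalar b0_hat = wage_bar - b1_hat * educ_bar
display "b1_hat = " b1_hat "   b0_hat = " b0_hat

* Compare with the built-in OLS routine
regress wage educ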
Fitted Values and Residuals
• Predicted y: for any given value of $x_i$, using the estimated $\hat{\beta}_0$ and $\hat{\beta}_1$ values we get
$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$ -----(13)
$y_i = \hat{y}_i + \hat{u}_i$ -----(14)
• The fitted regression line is called the line of best fit
• The OLS residual associated with each observation i, $\hat{u}_i$, is
$\hat{u}_i = y_i - \hat{y}_i$ -----(15)
• If $\hat{u}_i$ is positive, the line under-predicts $y_i$; if $\hat{u}_i$ is negative, the line over-predicts $y_i$
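• A short Stata sketch of equations (13)-(15), again assuming the Wage1 data with variables wage and educ: it generates the fitted values and residuals and flags whether the line under- or over-predicts each of the first few observations.

* Estimate the regression, then compute fitted values (13) and residuals (15)
use wage1, clear
quietly regress wage educ
predict wage_fit, xb
predict resid_fit, residuals

* Indicator = 1 if the line under-predicts (positive residual), 0 if it over-predicts
generate byte under_pred = (resid_fit > 0)
list wage educ wage_fit resid_fit under_pred in 1/5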
STATA Results
STATA Result 1: Estimation of OLS Regression Line
Figure 1: Line of best fit
Source: Author's estimation using the Wage1 dataset in STATA; refer to the Do file for the command
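• A possible Stata command for a figure like the line of best fit above, given as a sketch under the same Wage1 assumptions (the exact command used is in the course Do file), overlays the OLS fitted line on the scatter of wage against education.

* Scatter plot of wage against education with the OLS fitted line overlaid
use wage1, clear
twoway (scatter wage educ) (lfit wage educ), ytitle("Average hourly wage") xtitle("Years of education")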