hw1 Econometrics

This problem set contains 10 problems on econometrics and statistics: properties of least squares residuals, the conditional mean as the best predictor, equivalence of estimators across regression specifications, the Frisch–Waugh–Lovell residual identity, properties of the R2 statistic, and the consistency of the sample variance and of asymptotically normal estimators.

S1 2021 Econometrics I

Eric Weese

Problem Set 1
Due: 18 April 2021

1. Suppose that b is the least squares coefficient vector in the regression of Y on X,
and that c is any other k × 1 vector. Show that the difference between the two sums
of squared residuals can be written as

(Y − Xc)′ (Y − Xc) − (Y − Xb)′ (Y − Xb) = (c − b)′ X ′ X(c − b),

and show that this difference is nonnegative.
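As a sanity check on the identity (not a substitute for the proof), it can be verified on simulated data; the dimensions, coefficients, and seed below are arbitrary choices:

```python
# Numerical check: SSR(c) - SSR(b) = (c - b)' X'X (c - b) >= 0,
# where b is the OLS coefficient vector and c is any other vector.
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))
Y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

b = np.linalg.lstsq(X, Y, rcond=None)[0]   # least squares coefficients
c = b + rng.normal(size=k)                 # any other k x 1 vector

def ssr(v):
    """Sum of squared residuals at coefficient vector v."""
    e = Y - X @ v
    return e @ e

lhs = ssr(c) - ssr(b)
rhs = (c - b) @ (X.T @ X) @ (c - b)
print(lhs, rhs)   # the two sides agree, and both are nonnegative
```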

2. Let y be a scalar random variable, and x be a k × 1 random vector. Show that the
following inequality holds for any function g(x).

E[(y − g(x))2 |x] ≥ E[(y − E[y|x])2 |x].
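A Monte Carlo illustration of the inequality (not a proof): here y = x2 + ε, so E[y|x] = x2, and g(x) = x is an arbitrary competing predictor chosen for the example.

```python
# The conditional mean E[y|x] attains the smallest mean squared
# prediction error among all functions g(x); compare it to g(x) = x.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)
y = x**2 + rng.normal(size=x.size)

mse_cond_mean = np.mean((y - x**2) ** 2)   # predictor E[y|x] = x^2
mse_other = np.mean((y - x) ** 2)          # competing predictor g(x) = x
print(mse_cond_mean, mse_other)            # the first is smaller
```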

3. Consider the following regression model

yi = β1 xi1 + β2 xi2 + εi , i = 1, . . . , n,

where β1 and β2 are scalar, and xi1 = 1 for all i. Let b2 denote the least squares
estimate of β2 from this regression model. Consider another regression model

yi = γ1 (xi2 − x̄2 ) + ui , i = 1, . . . , n,

where x̄2 = n−1 ∑ni=1 xi2 . Let c1 denote the least squares estimate of γ1 from this
regression model.
A researcher argues that b2 = c1 . Is her claim correct? Give a proof of your answer.
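One way to form a conjecture before attempting the proof is a numerical experiment on simulated data (the seed and coefficients below are arbitrary):

```python
# Compare the slope b2 from regressing y on (1, x2) with the
# coefficient c1 from regressing y on the demeaned x2 alone.
import numpy as np

rng = np.random.default_rng(2)
n = 100
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x2 + rng.normal(size=n)

# Model 1: y on a constant and x2; b2 is the slope estimate.
X = np.column_stack([np.ones(n), x2])
b1_, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# Model 2: y on the demeaned x2 only (no constant); c1 is its coefficient.
z = x2 - x2.mean()
c1 = (z @ y) / (z @ z)
print(b2, c1)
```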

4. Consider the linear regression model

y = Xβ + ε = X1 β1 + X2 β2 + ε,

where y and ε are n × 1, X1 is n × k1 , X2 is n × k2 , β1 is k1 × 1, and β2 is k2 × 1.

Let (b1 , b2 ) denote the OLS estimator of (β1 , β2 ) from regressing y on X. Let b̃1
denote the OLS estimator of β1 from regressing y on X1 only. Show that b1 = b̃1
if X′1 X2 = 0.

5. Consider the linear regression model

y = Xβ + ε = X1 β1 + X2 β2 + ε,

where y and ε are n × 1, X1 is n × k1 , X2 is n × k2 , β1 is k1 × 1, and β2 is k2 × 1.


Let Ỹ be the residuals from regressing Y on X1 . Let X̃2 be the matrix whose
columns are the residuals obtained by regressing each column of X2 on X1 . Let ẽ
be the residuals from regressing Ỹ on X̃2 . Show that ẽ is identical to the residuals
from regressing Y on X.
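The residual identity can be checked numerically before proving it; the dimensions and seed below are arbitrary:

```python
# Frisch-Waugh-Lovell check: residuals from the partialled-out regression
# equal the residuals from the full regression of Y on (X1, X2).
import numpy as np

rng = np.random.default_rng(3)
n, k1, k2 = 60, 2, 2
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))
Y = X1 @ np.ones(k1) + X2 @ np.ones(k2) + rng.normal(size=n)

def resid(a, B):
    """Residuals from regressing a (vector or matrix) on the columns of B."""
    coef = np.linalg.lstsq(B, a, rcond=None)[0]
    return a - B @ coef

Y_t = resid(Y, X1)             # residuals of Y on X1
X2_t = resid(X2, X1)           # column-by-column residuals of X2 on X1
e_tilde = resid(Y_t, X2_t)     # residuals of the partialled-out regression
e_full = resid(Y, np.hstack([X1, X2]))   # residuals of the full regression
print(np.max(np.abs(e_tilde - e_full)))  # essentially zero
```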

6. Consider the regression model yi = x′i β + εi , where i = 1, . . . , n. Let ŷi denote
the least squares fitted value for the ith observation. Show that the R2 from this
regression is equal to the square of the sample correlation coefficient between yi and
ŷi .
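A numerical illustration on simulated data (note the regression below includes a constant, which the equivalence relies on):

```python
# Check that R^2 equals the squared sample correlation of y and yhat.
import numpy as np

rng = np.random.default_rng(4)
n = 80
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.5, -1.0]) + rng.normal(size=n)

b = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ b

r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
corr = np.corrcoef(y, yhat)[0, 1]
print(r2, corr**2)   # the two quantities coincide
```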

7. Consider a linear regression model Y = Xβ + ε, where the explanatory variables
include a constant term. Let a and b be nonzero constants.
A researcher argues that the R2 from regressing aY on bX is the same as the R2
from regressing Y on X. Is his claim correct? Give a proof of your answer.
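A quick experiment for forming a conjecture (the constants and seed are arbitrary; c below plays the role of b in the problem, to avoid clashing with the coefficient name):

```python
# Compare the centered R^2 from regressing Y on X with the R^2
# from regressing a*Y on c*X, for nonzero constants a and c.
import numpy as np

rng = np.random.default_rng(5)
n = 70
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Y = X @ np.array([2.0, 1.5]) + rng.normal(size=n)

def r2(y, Z):
    """Centered R^2 from regressing y on Z (Z assumed to span a constant)."""
    b = np.linalg.lstsq(Z, y, rcond=None)[0]
    e = y - Z @ b
    return 1 - (e @ e) / np.sum((y - y.mean()) ** 2)

a, c = 3.0, -0.5   # nonzero constants
print(r2(Y, X), r2(a * Y, c * X))
```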

8. Consider a simple regression model

y i = β 1 + β 2 xi + εi , i = 1, . . . , n.

Let β̂2 denote the OLS estimator of β2 from this regression model. Show that
β̂2 = ∑ni=1 (xi − x̄)(yi − ȳ)/ ∑ni=1 (xi − x̄)2 , where x̄ = n−1 ∑ni=1 xi and
ȳ = n−1 ∑ni=1 yi .
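The closed-form slope can be checked against a direct least squares fit on simulated data:

```python
# Compare the OLS slope from regressing y on (1, x) with the
# closed-form expression sum((x - xbar)(y - ybar)) / sum((x - xbar)^2).
import numpy as np

rng = np.random.default_rng(6)
n = 40
x = rng.normal(size=n)
y = 0.5 + 2.0 * x + rng.normal(size=n)

# OLS via least squares on [1, x]
X = np.column_stack([np.ones(n), x])
b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# Closed-form slope from the problem statement
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
print(b2, slope)
```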
9. Suppose xi ∼ iid with Exi = µ and var(xi ) = σ 2 . Let x̄ = n−1 ∑ni=1 xi . Let
S 2 = (n − 1)−1 ∑ni=1 (xi − x̄)2 denote the sample variance of x. Show that S 2 →p σ 2 .
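A simulation illustrating the convergence (not a proof); normal draws and the particular µ, σ2 are arbitrary choices:

```python
# The sample variance S^2 (with the n-1 divisor) concentrates
# around sigma^2 = 4 as the sample size n grows.
import numpy as np

rng = np.random.default_rng(7)
mu, sigma2 = 1.0, 4.0
for n in (10, 1_000, 100_000):
    x = rng.normal(mu, np.sqrt(sigma2), size=n)
    s2 = x.var(ddof=1)   # sample variance with the (n-1) divisor
    print(n, s2)         # approaches sigma2 = 4 as n grows
```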

10. Suppose √n(θ̂ − θ) →d N (0, σ 2 ). Prove θ̂ →p θ.
