
Linear Restrictions Using Matrix Approach

The document discusses hypothesis testing with linear restrictions and the Wald test. It gives four examples of setting up null and alternative hypotheses by specifying restriction matrices and vectors for different linear restrictions on the coefficients of a regression model, progressing from a single restriction on one coefficient to multiple simultaneous restrictions. Finally, it works through a numerical hypothesis test that the coefficient on X1 equals 1 and that the coefficients on X2 and X3 are equal to each other.


Lecture 3: Hypothesis testing – linear restrictions using matrix approach

F-statistic of the Wald test:

F = [ (Rβ̂ − r)′ [R(X′X)⁻¹R′]⁻¹ (Rβ̂ − r) / g ] / [ SSE / (n − k − 1) ]
where:

R is the restriction matrix,

r is the restriction vector,

g is the number of restrictions,

n is the sample size,

k is the number of independent variables.
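The statistic above can be computed directly with numpy. The sketch below is a minimal illustration, not the lecture's own code: it assumes `X` is the n × (k + 1) design matrix including an intercept column, `SSE` is the unrestricted residual sum of squares, and the function name `wald_F` is ours.

```python
import numpy as np

def wald_F(beta_hat, X, SSE, R, r):
    """F-statistic of the Wald test for H0: R beta = r.

    X is the n x (k+1) design matrix (intercept column included),
    so the denominator degrees of freedom are n - k - 1 = n - X.shape[1].
    """
    n = X.shape[0]
    g = R.shape[0]                    # number of restrictions
    d = R @ beta_hat - r              # R beta_hat - r
    mid = np.linalg.inv(R @ np.linalg.inv(X.T @ X) @ R.T)
    return float((d @ mid @ d) / g / (SSE / (n - X.shape[1])))
```

Note that when r is set equal to Rβ̂ the numerator vanishes, so the statistic is exactly zero; this is a convenient sanity check on an implementation.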

Example 1:
Consider a linear regression model:
Y = β0 + β1X1 + β2X2 + β3X3 + β4X4 + u

The null and alternative hypotheses are:


H0: β2 = 0 (one linear restriction on the coefficient vector β)
H1: β2 ≠ 0

 The restrictions matrix R in this case is the 1 × 5 vector:


R=[ 0 0 1 0 0 ]

 The restrictions vector r is:


r = [ 0 ]

 The matrix-vector product Rβ in this case is:

Rβ = [ 0 0 1 0 0 ][ β0 β1 β2 β3 β4 ]′ = 0β0 + 0β1 + 1β2 + 0β3 + 0β4 = β2

 The null hypothesis H0: Rβ = r is therefore H0: β2 = 0.
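A quick numpy check of this restriction, using illustrative (made-up) coefficient values:

```python
import numpy as np

# H0: beta_2 = 0, written as R beta = r.
R = np.array([[0, 0, 1, 0, 0]])
r = np.array([0])
beta = np.array([3.0, 1.5, -0.7, 2.2, 0.4])  # illustrative values, not estimates
print(R @ beta)  # R beta selects beta_2, i.e. [-0.7]
```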


Example 2:
Consider a linear regression model:
Y = β0 + β 1 X 1 + β 2 X 2 + β 3 X 3 + β 4 X 4 +u

The null and alternative hypotheses are:


H0: β1 = 0 and β2 = 0 (two linear restrictions on the coefficient vector β)
H1: β1 ≠ 0 and/or β2 ≠ 0

 The restrictions matrix R in this case is the 2 × 5 matrix:

R = [ 0 1 0 0 0
      0 0 1 0 0 ]

 The restrictions vector r in this case is the 2 × 1 vector:

r = [ 0 0 ]′
 The matrix-vector product Rβ in this case is:

Rβ = [ 0 1 0 0 0
       0 0 1 0 0 ] [ β0 β1 β2 β3 β4 ]′ = [ β1 β2 ]′

 The null hypothesis H0: Rβ = r is therefore H0: [ β1 β2 ]′ = [ 0 0 ]′, which says "β1 = 0 and β2 = 0".
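With two restrictions, each row of R picks off one coefficient. A numpy sketch with the same illustrative coefficient values as before:

```python
import numpy as np

# H0: beta_1 = 0 and beta_2 = 0, one restriction per row of R.
R = np.array([[0, 1, 0, 0, 0],
              [0, 0, 1, 0, 0]])
r = np.array([0, 0])
beta = np.array([3.0, 1.5, -0.7, 2.2, 0.4])  # illustrative values, not estimates
print(R @ beta)  # selects [beta_1, beta_2]
```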


3

Example 3:
Consider a linear regression model:
Y = β0 + β 1 X 1 + β 2 X 2 + β 3 X 3 + β 4 X 4 +u

The null and alternative hypotheses are:


H0: β1 = β3 and β2 = −β4, or β1 − β3 = 0 and β2 + β4 = 0

H1: β1 ≠ β3 and/or β2 ≠ −β4, or β1 − β3 ≠ 0 and/or β2 + β4 ≠ 0

 The restrictions matrix R in this case is the 2 × 5 matrix:

R = [ 0 1 0 −1 0
      0 0 1 0 1 ]

 The restrictions vector r in this case is the 2 × 1 vector:

r = [ 0 0 ]′

 The matrix-vector product Rβ in this case is:

Rβ = [ 0 1 0 −1 0
       0 0 1 0 1 ] [ β0 β1 β2 β3 β4 ]′ = [ β1 − β3  β2 + β4 ]′

 The null hypothesis H0: Rβ = r is therefore H0: [ β1 − β3  β2 + β4 ]′ = [ 0 0 ]′, which says "β1 − β3 = 0 and β2 + β4 = 0".
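Here the rows of R form differences and sums of coefficients rather than selecting single ones. A numpy sketch, again with made-up coefficient values:

```python
import numpy as np

# H0: beta_1 - beta_3 = 0 and beta_2 + beta_4 = 0.
R = np.array([[0, 1, 0, -1, 0],
              [0, 0, 1,  0, 1]])
r = np.array([0, 0])
beta = np.array([3.0, 1.5, -0.7, 2.2, 0.4])  # illustrative values, not estimates
print(R @ beta)  # [beta_1 - beta_3, beta_2 + beta_4]
```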



Example 4:
Consider a linear regression model:
Y = β0 + β 1 X 1 + β 2 X 2 + β 3 X 3 + β 4 X 4 +u

The null and alternative hypotheses are:


H0: β1 + 2β2 = β3 + 2β4, or β1 + 2β2 − β3 − 2β4 = 0

H1: β1 + 2β2 ≠ β3 + 2β4, or β1 + 2β2 − β3 − 2β4 ≠ 0

 The restrictions matrix R in this case is the 1 × 5 vector:


R=[ 0 1 2 −1 −2 ]

 The restrictions vector r is:


r = [ 0 ]

 The matrix-vector product Rβ in this case is:

Rβ = [ 0 1 2 −1 −2 ][ β0 β1 β2 β3 β4 ]′ = β1 + 2β2 − β3 − 2β4

 The null hypothesis H0: Rβ = r is therefore H0: β1 + 2β2 − β3 − 2β4 = 0.
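A single row of R can encode a weighted combination of coefficients, as this numpy sketch with made-up coefficient values shows:

```python
import numpy as np

# H0: beta_1 + 2*beta_2 - beta_3 - 2*beta_4 = 0 as a single row of weights.
R = np.array([[0, 1, 2, -1, -2]])
r = np.array([0])
beta = np.array([3.0, 1.5, -0.7, 2.2, 0.4])  # illustrative values, not estimates
print(R @ beta)  # beta_1 + 2*beta_2 - beta_3 - 2*beta_4
```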



Example:

Y X1 X2 X3
20 10 12 13
35 15 10 10
30 21 9 11
47 26 8 12
60 40 5 13
68 37 7 8
76 42 4 10
90 33 5 14
100 30 7 6
105 38 5 5
130 60 3 8
140 66 4 9
125 50 3 5

Y = β0 + β1X1 + β2X2 + β3X3 + u

H0: β1 = 1 and β2 = β3

H1: β1 ≠ 1 and/or β2 ≠ β3

Or

H0: β1 = 1 and β2 − β3 = 0

H1: β1 ≠ 1 and/or β2 − β3 ≠ 0

 The restrictions matrix R :



R = [ 0 1 0 0
      0 0 1 −1 ]

 The restrictions vector r :

r = [ 1 0 ]′
 The matrix-vector product:

Rβ̂ = [ 0 1 0 0
       0 0 1 −1 ] [ β0 β1 β2 β3 ]′ = [ β1  β2 − β3 ]′

 The null hypothesis H0: Rβ = r :

H0: [ β1  β2 − β3 ]′ = [ 1 0 ]′

 F-statistic:

F = [ (Rβ̂ − r)′ [R(X′X)⁻¹R′]⁻¹ (Rβ̂ − r) / g ] / [ SSE / (n − k − 1) ]

F = (149.4627 / 2) / (2278.153 / (13 − 3 − 1)) = 0.2952

 Decision and conclusion:

At α = 0.05, F = 0.2952 < F(2, 9) = 4.26; therefore we do not reject H0.

We conclude that the data are consistent with β1 = 1 and β2 = β3.
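The whole test can be reproduced from the data table with a short numpy script. This is a sketch, not the lecture's own code; it fits the OLS regression with `np.linalg.lstsq` and then applies the Wald F formula above.

```python
import numpy as np

# Data from the table above.
Y  = np.array([20, 35, 30, 47, 60, 68, 76, 90, 100, 105, 130, 140, 125], dtype=float)
X1 = np.array([10, 15, 21, 26, 40, 37, 42, 33, 30, 38, 60, 66, 50], dtype=float)
X2 = np.array([12, 10, 9, 8, 5, 7, 4, 5, 7, 5, 3, 4, 3], dtype=float)
X3 = np.array([13, 10, 11, 12, 13, 8, 10, 14, 6, 5, 8, 9, 5], dtype=float)

X = np.column_stack([np.ones_like(Y), X1, X2, X3])   # n x (k+1) design matrix
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)     # OLS estimates
SSE = float(np.sum((Y - X @ beta_hat) ** 2))         # unrestricted residual SS

# H0: beta_1 = 1 and beta_2 - beta_3 = 0, i.e. R beta = r.
R = np.array([[0.0, 1.0, 0.0,  0.0],
              [0.0, 0.0, 1.0, -1.0]])
r = np.array([1.0, 0.0])
g, (n, k1) = R.shape[0], X.shape                     # k1 = k + 1

d = R @ beta_hat - r
F = float(d @ np.linalg.inv(R @ np.linalg.inv(X.T @ X) @ R.T) @ d / g
          / (SSE / (n - k1)))
print(F)   # the worked example above reports F = 0.2952
```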
