Basic Econometrics Notes
Course Leader: Prof. Dr.Sc. Vu Thieu
Prof.VuThieu May 2004
Basic Econometrics
Introduction:
What is Econometrics?
Definition 1: Economic Measurement
Definition 2: The application of
mathematical statistics to economic data
in order to lend empirical support to
economic mathematical models and to
obtain numerical results (Gerhard Tintner,
1968)
Definition 3: The quantitative
analysis of actual economic phenomena
based on concurrent development of
theory and observation, related by
appropriate methods of inference
(P.A. Samuelson, T.C. Koopmans and J.R.N. Stone, 1954)
Definition 4: The social science
which applies economics, mathematics
and statistical inference to the analysis of
economic phenomena (Arthur S. Goldberger, 1964)
Definition 5: The empirical
determination of economic laws (H. Theil, 1971)
Definition 6: A conjunction of
economic theory and actual
measurements, using the theory and
technique of statistical inference as a
bridge pier (T. Haavelmo, 1944)
And others
[Diagram: Econometrics at the intersection of Economic Theory, Mathematical Economics, Economic Statistics and Mathematical Statistics]
Introduction
Why a separate discipline?
Economic theory makes statements
that are mostly qualitative in nature,
while econometrics gives empirical
content to most economic theory
Mathematical economics expresses
economic theory in mathematical
form without empirical verification
of the theory, while econometrics is
mainly interested in the latter
Economic Statistics is mainly
concerned with collecting, processing
and presenting economic data. It is
not concerned with using the collected
data to test economic theories
Introduction
Methodology of Econometrics
(1) Statement of theory or
hypothesis:
Keynes stated: ”Consumption
increases as income increases, but
not as much as the increase in
income”. It means that “the marginal
propensity to consume (MPC) for a
unit change in income is greater than
zero but less than one”
(2) Specification of the
mathematical model of the
theory
Y = ß1+ ß2X ; 0 < ß2< 1
Y= consumption expenditure
X= income
ß1 and ß2 are parameters; ß1 is the
intercept and ß2 is the slope coefficient
(3) Specification of the
econometric model of the
theory
Y = ß1+ ß2X + u ; 0 < ß2< 1;
Y = consumption expenditure;
X = income;
ß1 and ß2 are parameters; ß1 is the
intercept and ß2 is the slope coefficient;
u is the disturbance or error term, a
random (stochastic) variable
Y= Personal consumption
expenditure
X= Gross Domestic Product
all in billions of US dollars
(4) Obtaining Data
[Table of annual data (Year, X, Y) not reproduced]
(8) Using model for control or
policy purposes
Given Y = 4000: 4000 = -231.8 + 0.7194X => X ≈ 5882
With MPC = 0.72, an income of $5882 billion
will produce an expenditure of $4000
billion. By fiscal and monetary policy, the
Government can manipulate the control
variable X to get the desired level of the
target variable Y
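The policy arithmetic above is a one-line inversion of the estimated model; a minimal Python sketch (the intercept -231.8 and MPC 0.7194 are the slide's estimates; the function name is ours):

```python
# Estimated consumption function from the slide: Y = -231.8 + 0.7194*X
b1, b2 = -231.8, 0.7194   # intercept and MPC (slide's estimates)

def required_income(target_y):
    # Invert Y = b1 + b2*X  =>  X = (Y - b1) / b2
    return (target_y - b1) / b2

# Income needed for a target expenditure of $4000 billion
print(round(required_income(4000)))  # about 5882, as on the slide
```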
Figure 1.4: Anatomy of economic
modelling
• 1) Economic Theory
• 2) Mathematical Model of Theory
• 3) Econometric Model of Theory
• 4) Data
• 5) Estimation of Econometric Model
• 6) Hypothesis Testing
• 7) Forecasting or Prediction
• 8) Using the Model for control or policy
purposes
[Flowchart: Economic Theory → Estimation → Hypothesis Testing → Forecasting → Application in control or policy studies]
Basic Econometrics
Chapter 1:
THE NATURE OF
REGRESSION
ANALYSIS
1-1. Historical origin of the term
“Regression”
1-4. Regression vs. Causation:
1-5. Regression vs.
Correlation
Correlation Analysis: the primary objective
is to measure the strength or degree of
linear association between two variables
(both are assumed to be random)
Regression Analysis: we try to estimate or
predict the average value of one variable
(dependent, and assumed to be stochastic)
on the basis of the fixed values of other
variables (independent, and non-stochastic)
1-6. Terminology and Notation
Dependent Variable      Explanatory Variable(s)
Explained Variable      Independent Variable(s)
Predictand              Predictor(s)
Regressand              Regressor(s)
Response                Stimulus or control variable(s)
Endogenous              Exogenous(es)
1-7. The Nature and Sources
of Data for Econometric
Analysis
1) Types of Data :
Time series data;
Cross-sectional data;
Pooled data
2) The Sources of Data
3) The Accuracy of Data
1-8. Summary and Conclusions
Basic Econometrics
Chapter 2:
TWO-VARIABLE
REGRESSION ANALYSIS:
Some basic Ideas
2-1. A Hypothetical Example
Table 2-2: Weekly family income X ($), and consumption Y ($)
[conditional distribution table not reproduced; column totals:]
Total  325  462  445  707  678  750  685  1043  966  1211
2-2. The concepts of population
regression function (PRF)
2-4. Stochastic Specification of PRF
Ui = Yi - E(Y|X=Xi), or Yi = E(Y|X=Xi) + Ui
Ui = the stochastic disturbance or stochastic error
term; it is the nonsystematic component
The component E(Y|X=Xi) is systematic or
deterministic: it is the mean consumption
expenditure of all the families with the same
level of income
The assumption that the regression line
passes through the conditional means of Y
implies that E(Ui|Xi) = 0
2-5. The Significance of the Stochastic
Disturbance Term
Ui = the stochastic disturbance term, a
surrogate for all variables that are
omitted from the model but that
collectively affect Y
There are many reasons for not including
such variables in the model:
Why not include as many variables as possible in
the model? (the reasons for using ui)
+ Vagueness of theory
+ Unavailability of Data
+ Core Variables vs. Peripheral Variables
+ Intrinsic randomness in human behavior
+ Poor proxy variables
+ Principle of parsimony
+ Wrong functional form
2-6. The Sample Regression
Function (SRF)
Table 2-4: A random sample      Table 2-5: Another random sample
from the population             from the population
  Y     X                         Y     X
------------------              -------------------
70 80 55 80
65 100 88 100
90 120 90 120
95 140 80 140
110 160 118 160
115 180 120 180
120 200 145 200
140 220 135 220
155 240 145 240
150 260 175 260
------------------ --------------------
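A minimal plain-Python OLS sketch applied to the two samples above makes the point concrete: the two SRFs have different intercepts and slopes even though both samples come from the same population.

```python
# OLS by the textbook formulas: slope = sum(x*y)/sum(x^2) in deviation
# form, intercept = Ybar - slope*Xbar. Data are Tables 2-4 and 2-5 above.
X  = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y1 = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]   # Table 2-4
Y2 = [55, 88, 90, 80, 118, 120, 145, 135, 145, 175]   # Table 2-5

def ols(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b2 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
          / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b2 * xbar, b2   # (intercept, slope)

print(ols(X, Y1))  # SRF1: intercept about 24.45, slope about 0.509
print(ols(X, Y2))  # SRF2: intercept about 17.17, slope about 0.576
```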
[Figure: two sample regression lines, SRF1 and SRF2; axes: Weekly Income (X) vs. Weekly Consumption Expenditure (Y)]
2-7. Summary and Conclusions
For empirical purposes, it is the stochastic
PRF that matters. The stochastic
disturbance term ui plays a critical role in
estimating the PRF.
The PRF is an idealized concept, since in
practice one rarely has access to the entire
population of interest. Generally, one has a
sample of observations from the population
and uses the stochastic sample regression
function (SRF) to estimate the PRF.
Basic Econometrics
Chapter 3:
TWO-VARIABLE
REGRESSION MODEL:
The problem of Estimation
3-1. The method of ordinary
least square (OLS)
Least-squares criterion:
Minimize Σu^2i = Σ(Yi – Y^i)^2
       = Σ(Yi – ß^1 – ß^2Xi)^2   (3.1.2)
Solving the normal equations for
ß^1 and ß^2 gives the least-squares
estimators [see (3.1.6) and (3.1.7)]
The numerical and statistical
properties of OLS are as follows:
OLS estimators are expressed solely in terms of
observable quantities. They are point estimators
The sample regression line passes through sample
means of X and Y
The mean value of the estimated Y^ is equal to the
mean value of the actual Y: E(Y) = E(Y^)
The mean value of the residuals u^i is zero:
Σu^i/n = 0
The u^i are uncorrelated with the predicted Y^i and
with Xi, that is: Σu^iY^i = 0 and Σu^iXi = 0
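These properties are easy to verify numerically; a plain-Python sketch using the Table 2-4 sample as illustrative data:

```python
# Fit Y = b1 + b2*X by OLS, then check the listed numerical properties
# of the residuals (they hold by construction of the normal equations).
X = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]

n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
b2 = (sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))
      / sum((x - xbar) ** 2 for x in X))
b1 = ybar - b2 * xbar

yhat = [b1 + b2 * x for x in X]            # fitted values
u    = [y - yh for y, yh in zip(Y, yhat)]  # residuals

# 1) line passes through (Xbar, Ybar); 2) residuals sum to zero;
# 3) residuals uncorrelated with X and with the fitted Y
print(abs(b1 + b2 * xbar - ybar) < 1e-9)
print(abs(sum(u)) < 1e-8)
print(abs(sum(ui * xi for ui, xi in zip(u, X))) < 1e-6)
print(abs(sum(ui * yh for ui, yh in zip(u, yhat))) < 1e-6)
```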
3-2. The assumptions underlying
the method of least squares
3-5. The coefficient of determination r2:
A measure of “Goodness of fit”
Basic Econometrics
Chapter 4:
THE NORMALITY
ASSUMPTION:
Classical Normal Linear
Regression Model
(CNLRM)
4-2.The normality assumption
CNLRM assumes that each ui is distributed
normally, ui ~ N(0, σ2), with:
Mean: E(ui) = 0                      (Assumption 3)
Variance: E(u2i) = σ2                (Assumption 4)
Cov(ui, uj) = E(ui uj) = 0, i ≠ j    (Assumption 5)
Note: For two normally distributed variables,
zero covariance (correlation) implies
independence, so ui and uj are not only
uncorrelated but also independently distributed.
Therefore ui ~ NID(0, σ2): Normally and
Independently Distributed
4-2.The normality assumption
Why the normality assumption?
(1) With a few exceptions, the distribution of the sum
of a large number of independent and
identically distributed random variables tends
to a normal distribution as the number of such
variables increases indefinitely
(2) Even if the number of variables is not very large or
they are not strictly independent, their sum
may still be normally distributed
(3) Under the normality assumption
for ui, the OLS estimators ß^1 and
ß^2 are also normally distributed
(4) The normal distribution is a
comparatively simple distribution
involving only two parameters
(mean and variance)
4-3. Properties of OLS
estimators under the normality
assumption
With the normality assumption, the
OLS estimators ß^1, ß^2 and σ^2 have
the following properties:
1. They are unbiased
2. They have minimum variance.
Combined 1 and 2, they are efficient
estimators
3. Consistency, that is, as the sample size
increases indefinitely, the estimators
converge to their true population
values
4. ß^1 is normally distributed:
ß^1 ~ N(ß1, σ2_ß1)
and Z = (ß^1 – ß1)/σ_ß1 is N(0,1)
5. ß^2 is normally distributed: ß^2 ~ N(ß2, σ2_ß2)
and Z = (ß^2 – ß2)/σ_ß2 is N(0,1)
6. (n–2)·σ^2/σ2 is distributed as χ2(n–2)
7. ß^1 and ß^2 are distributed
independently of σ^2. They have
minimum variance in the entire class
of unbiased estimators, whether linear
or not: they are best unbiased
estimators (BUE)
8. If ui is N(0, σ2), then Yi is
N[E(Yi), Var(Yi)] = N[ß1 + ß2Xi, σ2]
Some last points of chapter 4
Chapter 5:
TWO-VARIABLE
REGRESSION:
Interval Estimation
and Hypothesis Testing
Chapter 5 TWO-VARIABLE REGRESSION:
Interval Estimation and Hypothesis Testing
Confidence intervals for regression coefficients:
Z = (ß^2 – ß2)/se(ß^2) = (ß^2 – ß2)·√(Σx2i)/σ ~ N(0,1)   (5.3.1)
We do not know σ and have to use σ^ instead, so:
t = (ß^2 – ß2)/se(ß^2) = (ß^2 – ß2)·√(Σx2i)/σ^ ~ t(n–2)   (5.3.2)
=> Interval for ß2:
Pr[–t_α/2 ≤ t ≤ t_α/2] = 1 – α   (5.3.3)
Or: the confidence interval for ß2 is
Pr[ß^2 – t_α/2·se(ß^2) ≤ ß2 ≤ ß^2 + t_α/2·se(ß^2)] = 1 – α
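Numerically the interval is ß^2 ± t_α/2·se(ß^2). A sketch with illustrative values for a consumption-function slope (the point estimate 0.5091 and standard error 0.0357 are assumed example numbers, not results from these notes; t_0.025,8 = 2.306 is a standard table entry for n − 2 = 8 df):

```python
# 95% confidence interval for the slope: b2_hat ± t_crit * se(b2_hat)
b2_hat = 0.5091   # illustrative point estimate
se_b2  = 0.0357   # illustrative standard error
t_crit = 2.306    # t_{0.025, 8} from the t table (n - 2 = 8 df)

lo = b2_hat - t_crit * se_b2
hi = b2_hat + t_crit * se_b2
print(round(lo, 4), round(hi, 4))  # roughly 0.4268 to 0.5914
```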
Tests of hypotheses about σ2 (df = n – 2):
H0: σ2 = σ2_0   H1: σ2 > σ2_0   Reject H0 if df·σ^2/σ2_0 > χ2_α,df
H0: σ2 = σ2_0   H1: σ2 < σ2_0   Reject H0 if df·σ^2/σ2_0 < χ2_(1–α),df
H0: σ2 = σ2_0   H1: σ2 ≠ σ2_0   Reject H0 if df·σ^2/σ2_0 > χ2_(α/2),df
                                or < χ2_(1–α/2),df
An illustration (5-12, continued):
Under the null hypothesis H0 that the residuals
are normally distributed, Jarque and Bera show
that in large samples (asymptotically) the JB
statistic given in (5.12.12) follows the chi-square
distribution with 2 df. If the p-value of the
computed chi-square statistic in an application is
sufficiently low, one can reject the hypothesis that
the residuals are normally distributed; but if the
p-value is reasonably high, one does not reject the
normality assumption.
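The JB statistic referenced as (5.12.12) is the standard Jarque-Bera form JB = n·[S²/6 + (K−3)²/24], with S the skewness and K the kurtosis of the residuals; a plain-Python sketch:

```python
# Jarque-Bera statistic from a list of residuals. Under H0 (normality)
# JB is asymptotically chi-square with 2 df; a normal sample has
# skewness S = 0 and kurtosis K = 3, giving JB near 0.
def jarque_bera(u):
    n = len(u)
    m = sum(u) / n
    m2 = sum((x - m) ** 2 for x in u) / n   # variance
    m3 = sum((x - m) ** 3 for x in u) / n
    m4 = sum((x - m) ** 4 for x in u) / n
    S = m3 / m2 ** 1.5                      # skewness
    K = m4 / m2 ** 2                        # kurtosis
    return n * (S ** 2 / 6 + (K - 3) ** 2 / 24)

# A symmetric sample has S = 0, so only the kurtosis term contributes
print(jarque_bera([-2, -1, 0, 1, 2]))
```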
Basic Econometrics
Chapter 6
EXTENSIONS OF THE
TWO-VARIABLE LINEAR
REGRESSION MODEL
6-2. Scaling and units of measurement
6-4. How to measure elasticity
The log-linear model
Exponential regression model:
Yi = ß1·Xi^ß2·e^(ui)   (6.4.1)
Taking logs to the base e of both sides:
lnYi = lnß1 + ß2·lnXi + ui ; by setting α = lnß1:
lnYi = α + ß2·lnXi + ui   (6.4.3)
(log-log, double-log, or log-linear model)
This can be estimated by OLS as
Y*i = α + ß2X*i + ui , where Y*i = lnYi and X*i = lnXi.
ß2 measures the ELASTICITY of Y with respect to X, that is,
the percentage change in Y for a given (small) percentage
change in X.
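Estimation is then ordinary OLS on the transformed variables. A sketch with fabricated data generated from Y = 2·X^0.8, so the true elasticity is 0.8 and the regression should recover it:

```python
import math

# Log-log model: regress Y* = ln(Y) on X* = ln(X); the slope is the elasticity
X = [10, 20, 40, 80, 160]
Y = [2 * x ** 0.8 for x in X]   # fabricated: elasticity 0.8 by construction

lx = [math.log(x) for x in X]
ly = [math.log(y) for y in Y]

n = len(lx)
xbar, ybar = sum(lx) / n, sum(ly) / n
b2 = (sum((a - xbar) * (b - ybar) for a, b in zip(lx, ly))
      / sum((a - xbar) ** 2 for a in lx))
print(b2)  # elasticity estimate, 0.8 up to rounding
```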
Chapter 7
MULTIPLE REGRESSION
ANALYSIS:
The Problem of Estimation
7-1. The three-Variable Model:
Notation and Assumptions
Yi = ß1+ ß2X2i + ß3X3i + u i (7.1.1)
ß2 , ß3 are partial regression coefficients
With the following assumptions:
+ Zero mean value of ui: E(ui|X2i, X3i) = 0 for each i   (7.1.2)
+ No serial correlation: Cov(ui, uj) = 0, i ≠ j   (7.1.3)
+ Homoscedasticity: Var(ui) = σ2   (7.1.4)
+ Cov(ui, X2i) = Cov(ui, X3i) = 0   (7.1.5)
+ No specification bias (the model is correctly specified)   (7.1.6)
+ No exact collinearity between the X variables   (7.1.7)
(no multicollinearity; with more explanatory variables,
if an exact linear relationship exists, the X variables are
said to be linearly dependent)
+ The model is linear in parameters
7-2. Interpretation of Multiple
Regression
7-3. The meaning of partial
regression coefficients
Yi = ß1 + ß2X2i + ß3X3i + … + ßsXsi + ui
ßk measures the change in the mean
value of Y per unit change in Xk,
holding the other explanatory variables
constant. It gives the “direct” effect of a
unit change in Xk on E(Yi), net of the Xj
(j ≠ k)
How to assess the “true” effect of a
unit change in Xk on Y? (read pages
195-197)
7-4. OLS and ML estimation of the
partial regression coefficients
This section (pages 197-201) provides:
1. The OLS estimators in the case of the three-
variable regression
Yi = ß1 + ß2X2i + ß3X3i + ui
2. Variances and standard errors of the OLS
estimators
3. Eight properties of the OLS estimators (pp. 199-201)
4. An introduction to the ML estimators
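A sketch of the three-variable fit using numpy's least-squares routine (the data are fabricated so that the true coefficients ß1 = 1, ß2 = 2, ß3 = 3 are known and should be recovered):

```python
import numpy as np

# Three-variable model: Y = b1 + b2*X2 + b3*X3 (no noise, for illustration)
X2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X3 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])   # not collinear with X2
Y  = 1 + 2 * X2 + 3 * X3

# Design matrix with a column of ones for the intercept
X = np.column_stack([np.ones_like(X2), X2, X3])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta)  # close to [1, 2, 3]
```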
7-5. The multiple coefficient of
determination R2 and the multiple
coefficient of correlation R
This section provides:
1. The definition of R2 in the context of multiple
regression, analogous to r2 in the two-variable
case
2. R = √R2 is the coefficient of multiple
correlation; it measures the degree of association
between Y and all the explanatory variables
jointly
3. The variance of a partial regression coefficient:
Var(ß^k) = (σ2/Σx2_k)·(1/(1–R2_k))   (7.5.6)
where ß^k is the partial regression coefficient
of regressor Xk and R2_k is the R2 of the
regression of Xk on the remaining regressors
7-6. Example 7.1: The
expectations-augmented Philips
Curve for the US (1970-1982)
7-7. Simple regression in the
context of multiple regression:
Introduction to specification bias
7-8. R2 and the Adjusted-R2
R2 is a non-decreasing function of the number of
explanatory variables: an additional X variable will not
decrease R2
R2 = ESS/TSS = 1 – RSS/TSS = 1 – Σu^2i / Σy2i   (7.8.1)
This can point in the wrong direction when more
irrelevant variables are added to the regression, which
motivates the adjusted R2 (R2bar), taking the degrees of
freedom into account:
R2bar = 1 – [Σu^2i /(n–k)] / [Σy2i /(n–1)], or   (7.8.2)
R2bar = 1 – σ^2 / S2Y  (S2Y is the sample variance of Y;
k = number of parameters including the intercept term)
– Substituting (7.8.1) into (7.8.2) gives
R2bar = 1 – (1–R2)(n–1)/(n–k)   (7.8.4)
– For k > 1, R2bar < R2; thus as the number of X variables
increases, R2bar increases by less than R2, and R2bar can
be negative
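Formula (7.8.4) is a one-liner; the sketch below also shows the two behaviors just described: the adjustment pulls R2 down, and R2bar can even be negative (all numbers illustrative):

```python
# Adjusted R-squared, equation (7.8.4): R2bar = 1 - (1 - R2)(n - 1)/(n - k)
def r2_bar(r2, n, k):
    return 1 - (1 - r2) * (n - 1) / (n - k)

print(r2_bar(0.9, 20, 4))   # about 0.88125: below the raw R2 of 0.9
print(r2_bar(0.1, 10, 8))   # negative: many parameters, poor fit
```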
7-9. Partial Correlation Coefficients
7-10. Example 7.3: The Cobb-Douglas
production function
7-11. More on functional form (7.11.3)
Example 7.4: Estimating the total cost
function
The data set is in Table 7.4;
the empirical results are on page 221
--------------------------------------------------------------
7-12. Summary and Conclusions (page 221)
Basic Econometrics
Chapter 8
MULTIPLE REGRESSION
ANALYSIS:
The Problem of Inference
8-3. Hypothesis testing in multiple regression:
Testing hypotheses about an individual partial regression
coefficient
Testing the overall significance of the estimated multiple
regression model, that is, finding out if all the partial slope
coefficients are simultaneously equal to zero
Testing that two or more coefficients are equal to one
another
Testing that the partial regression coefficients satisfy
certain restrictions
Testing the stability of the estimated regression model over
time or in different cross-sectional units
Testing the functional form of regression models
8-4. Hypothesis testing about individual partial
regression coefficients
With the assumption that ui ~ N(0, σ2), we can
use the t-test to test a hypothesis about any
individual partial regression coefficient:
H0: ß2 = 0
H1: ß2 ≠ 0
If the computed t value > critical t value at the
chosen level of significance, we may reject the
null hypothesis; otherwise, we may not reject it
8-5. Testing the overall significance of a multiple
regression: The F-Test
For Yi = ß1 + ß2X2i + ß3X3i + … + ßkXki + ui
To test the hypothesis H0: ß2 = ß3 = … = ßk = 0 (all
slope coefficients are simultaneously zero) versus H1: not
all slope coefficients are simultaneously zero,
compute
F = (ESS/df)/(RSS/df) = (ESS/(k–1))/(RSS/(n–k))   (8.5.7)
(k = total number of parameters to be estimated,
including the intercept)
If F > F critical = F(k–1, n–k), reject H0;
otherwise do not reject it
8-5. Testing the overall significance of a multiple
regression
Alternatively, if the p-value of the F obtained from
(8.5.7) is sufficiently low, one can reject H0
An important relationship between R2 and F:
F = (ESS/(k–1))/(RSS/(n–k)), or
F = [R2/(k–1)] / [(1–R2)/(n–k)]   (8.5.11)
(see the proof on page 249)
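The equivalence of the two F expressions follows from R2 = ESS/TSS and can be checked numerically (ESS, RSS, n and k below are made-up values):

```python
# F from sums of squares (8.5.7) vs. F from R2; they must coincide
ESS, RSS = 80.0, 20.0
n, k = 25, 3

TSS = ESS + RSS
R2 = ESS / TSS

F_ss = (ESS / (k - 1)) / (RSS / (n - k))
F_r2 = (R2 / (k - 1)) / ((1 - R2) / (n - k))
print(F_ss, F_r2)  # identical values
```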
8-5. Testing the overall significance of a multiple
regression in terms of R2
For Yi = ß1 + ß2X2i + ß3X3i + … + ßkXki + ui
To test the hypothesis H0: ß2 = ß3 = … = ßk = 0 (all
slope coefficients are simultaneously zero) versus
H1: not all slope coefficients are simultaneously
zero, compute
F = [R2/(k–1)] / [(1–R2)/(n–k)]   (8.5.13)
(k = total number of parameters to be estimated,
including the intercept)
If F > F critical = F(k–1, n–k), reject H0
8-5. Testing the overall significance of a multiple
regression
Alternatively, if the p-value of the F obtained from
(8.5.13) is sufficiently low, one can reject H0
8-7. Restricted least squares: testing linear equality
restrictions
How to test the restriction in (8.7.3), ß2 + ß3 = 1:
The t-test approach (unrestricted): the test of the hypothesis
H0: ß2 + ß3 = 1 can be conducted by a t-test:
t = [(ß^2 + ß^3) – (ß2 + ß3)] / se(ß^2 + ß^3)   (8.7.4)
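A numerical sketch of (8.7.4): the standard error of the sum is se(ß^2 + ß^3) = √[var(ß^2) + var(ß^3) + 2·cov(ß^2, ß^3)]. All estimates and (co)variances below are hypothetical illustration values, not results from the text:

```python
import math

# t test of the restriction beta2 + beta3 = 1 (hypothetical numbers)
b2, b3 = 0.45, 0.65                          # estimated coefficients
var_b2, var_b3, cov_23 = 0.01, 0.02, -0.005  # (co)variances of estimates

se_sum = math.sqrt(var_b2 + var_b3 + 2 * cov_23)
t = ((b2 + b3) - 1.0) / se_sum
print(t)  # compare with the t critical value at n - k df
```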