Lecture 05 - Addendum - Chi-square, t, and F Distributions

1. The chi-square, t, and F distributions are related. Chi-square distributions are indexed by their degrees of freedom and relate to the t distribution: a t variable can be expressed as Z divided by the square root of a chi-square variable over its degrees of freedom. 2. The F distribution can be expressed as the ratio of two chi-square variables, each divided by its degrees of freedom. This allows chi-square tables to serve as part of the F tables. 3. The t and F distributions are also related: the t-statistic equals the square root of F with 1 and ν degrees of freedom. This relationship is useful in assessing partial effects in linear regression models.


Chi-square, t-, and F-Distributions (and Their Interrelationship)

1 Some Genesis

If $Z_1, Z_2, \ldots, Z_\nu \stackrel{\text{iid}}{\sim} N(0,1)$, then
$$X^2 = Z_1^2 + Z_2^2 + \cdots + Z_\nu^2 \sim \chi^2_\nu.$$

Specifically, if $\nu = 1$, then $Z^2 \sim \chi^2_1$. The density function of the chi-square distribution will not be pursued here. We only note that chi-square is a class of distributions indexed by its degrees of freedom, like the $t$-distribution. In fact, chi-square has a relation with $t$; we will show this later.
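This genesis is easy to check by simulation. The following is a minimal sketch (not part of the original notes), assuming NumPy and SciPy are available; the degrees of freedom and replication count are arbitrary illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nu = 5            # degrees of freedom (illustrative choice)
n_rep = 100_000   # number of simulated values of X^2

# X^2 = Z_1^2 + ... + Z_nu^2 with Z_j iid N(0, 1)
Z = rng.standard_normal((n_rep, nu))
X2 = (Z ** 2).sum(axis=1)

# Kolmogorov-Smirnov test against the chi^2_nu distribution;
# a large p-value is consistent with X^2 ~ chi^2_nu
print(stats.kstest(X2, stats.chi2(df=nu).cdf))
```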

2 Mean and Variance

If $X^2 \sim \chi^2_\nu$, we show that
$$E\{X^2\} = \nu, \qquad \mathrm{VAR}\{X^2\} = 2\nu.$$

For the above example, $Z_j^2 \sim \chi^2_1$; if $E Z_j^2 = 1$ and $\mathrm{VAR}\, Z_j^2 = 2$ for $j = 1, 2, \ldots, \nu$, then
$$E\{X^2\} = E\{Z_1^2 + \cdots + Z_\nu^2\} = E Z_1^2 + \cdots + E Z_\nu^2 = 1 + 1 + \cdots + 1 = \nu.$$

Similarly, by the independence of the $Z_j$, $\mathrm{VAR}\{X^2\} = 2\nu$. So it suffices to show $E\{Z^2\} = 1$ and $\mathrm{VAR}\{Z^2\} = 2$. Since $Z$ is distributed as $N(0,1)$, $E\{Z^2\}$ is calculated as
$$\int_{-\infty}^{\infty} z^2 \, \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\, dz,$$
which can easily be shown to be 1. Furthermore, $\mathrm{VAR}\{Z^2\}$ can be calculated through the formula $\mathrm{VAR}\{Z^2\} = E\{Z^4\} - (E\{Z^2\})^2$.
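The two moment claims, $E\{Z^2\} = 1$ and $\mathrm{VAR}\{Z^2\} = 2$, can be verified numerically. Below is a small sketch (again not from the original notes) assuming SciPy's `integrate` and `stats` modules; the choice $\nu = 7$ at the end is only illustrative.

```python
import numpy as np
from scipy import stats, integrate

# E{Z^2} = integral of z^2 * phi(z) dz over the real line (should be 1)
ez2, _ = integrate.quad(lambda z: z**2 * stats.norm.pdf(z), -np.inf, np.inf)

# E{Z^4}, needed for VAR{Z^2} = E{Z^4} - (E{Z^2})^2 (should give 3 - 1 = 2)
ez4, _ = integrate.quad(lambda z: z**4 * stats.norm.pdf(z), -np.inf, np.inf)

print(ez2, ez4 - ez2 ** 2)                    # approximately 1.0 and 2.0

# Consequently E{X^2} = nu and VAR{X^2} = 2*nu for X^2 ~ chi^2_nu
nu = 7
print(stats.chi2.stats(df=nu, moments="mv"))  # (7.0, 14.0)
```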

3 Illustration: An Example of the Law of Large Numbers (LLN) and the Central Limit Theorem (CLT)

If $X^2 \sim \chi^2_\nu$, $X^2$ can be written as $X^2 = Z_1^2 + \cdots + Z_\nu^2$ for some $Z_1, \ldots, Z_\nu \stackrel{\text{iid}}{\sim} N(0,1)$. Then, as $\nu \to \infty$,
$$\frac{X^2 - E X^2}{\sqrt{\mathrm{VAR}(X^2)}} = \frac{X^2 - \nu}{\sqrt{2\nu}} \Rightarrow N(0,1) \quad \text{(CLT)};$$
moreover,
$$\frac{X^2}{\nu} \to 1 \quad \text{(LLN)}.$$
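A quick simulation illustrates both limiting statements. This is a sketch only; the large $\nu$ and sample size are arbitrary choices, and NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
nu = 2_000                          # a "large" degrees of freedom
x2 = rng.chisquare(df=nu, size=50_000)

# LLN: X^2 / nu concentrates near 1
print((x2 / nu).mean())

# CLT: the standardized variable (X^2 - nu) / sqrt(2*nu) is approximately N(0, 1)
z = (x2 - nu) / np.sqrt(2 * nu)
print(z.mean(), z.std())            # roughly 0 and 1
```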

4 Application: To Make Inference on $\sigma^2$


If $X_1, \ldots, X_n \stackrel{\text{iid}}{\sim} N(\mu, \sigma^2)$, then $Z_j = (X_j - \mu)/\sigma \sim N(0,1)$, $j = 1, \ldots, n$. We know, from a previous context, that $\sum_{j=1}^{n} Z_j^2 \sim \chi^2_n$, or equivalently,
$$\sum_{j=1}^{n} \frac{(X_j - \mu)^2}{\sigma^2} \sim \chi^2_n,$$
if $\mu$ is known; otherwise (if $\mu$ is unknown) $\mu$ needs to be estimated (by $\bar{X}$, say) such that (still needs to be proved):
$$\sum_{j=1}^{n} \frac{(X_j - \bar{X})^2}{\sigma^2} \sim \chi^2_{n-1}. \qquad (1)$$

Usually $\mu$ is not known, so we use formula (1) to make inference on $\sigma^2$. Denote $\chi^2_{\nu,p}$ as the $p$-th percentile from the right for the $\chi^2_\nu$-distribution; the two-sided confidence interval (CI) for $\sigma^2$ can be constructed as follows:
$$\Pr\left\{\chi^2_{n-1,\,1-\alpha/2} < \sum_{j=1}^{n} \frac{(X_j - \bar{X})^2}{\sigma^2} < \chi^2_{n-1,\,\alpha/2}\right\} = 100(1-\alpha)\%$$
$$\iff \Pr\left\{\frac{\sum_{j=1}^{n} (X_j - \bar{X})^2}{\chi^2_{n-1,\,\alpha/2}} < \sigma^2 < \frac{\sum_{j=1}^{n} (X_j - \bar{X})^2}{\chi^2_{n-1,\,1-\alpha/2}}\right\} = 100(1-\alpha)\%.$$

That is, the $100(1-\alpha)\%$ CI for $\sigma^2$ is
$$\left(\frac{\sum_{j=1}^{n} (X_j - \bar{X})^2}{\chi^2_{n-1,\,\alpha/2}},\ \frac{\sum_{j=1}^{n} (X_j - \bar{X})^2}{\chi^2_{n-1,\,1-\alpha/2}}\right).$$

[QUESTION: How to use this formula with actual data?]
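One way to answer this question is sketched below (this is an illustration, not part of the original notes). It assumes SciPy; note that the notes define percentiles "from the right", while `chi2.ppf` counts from the left, so $\chi^2_{n-1,\,\alpha/2}$ corresponds to `chi2.ppf(1 - alpha/2, n - 1)`. The data vector is hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical data; replace with the actual observations
x = np.array([4.2, 5.1, 3.8, 4.9, 5.6, 4.4, 5.0, 4.7])
n = len(x)
alpha = 0.05

ss = np.sum((x - x.mean()) ** 2)   # sum of (X_j - Xbar)^2

# chi^2_{n-1, alpha/2} "from the right" = chi2.ppf(1 - alpha/2) "from the left"
chi_upper = stats.chi2.ppf(1 - alpha / 2, df=n - 1)
chi_lower = stats.chi2.ppf(alpha / 2, df=n - 1)

# 100(1 - alpha)% CI for sigma^2: (ss / chi_upper, ss / chi_lower)
print((ss / chi_upper, ss / chi_lower))
```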

5 The Interrelationship Between $t$-, $\chi^2$-, and $F$-Statistics


5.1 $t$ versus $\chi^2$

If $X_1, \ldots, X_n \stackrel{\text{iid}}{\sim} N(\mu, \sigma^2)$, then
$$\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1).$$
When $\sigma$ is unknown,
$$\frac{\bar{X} - \mu}{\hat{\sigma}/\sqrt{n}} \sim t_{n-1}, \quad \text{where } \hat{\sigma}^2 = \frac{\sum_i (X_i - \bar{X})^2}{n-1}. \qquad (3)$$
Note that
$$\frac{\bar{X} - \mu}{\hat{\sigma}/\sqrt{n}} = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \cdot \frac{1}{\hat{\sigma}/\sigma} = \frac{Z}{\hat{\sigma}/\sigma}$$
and
$$\frac{\hat{\sigma}^2}{\sigma^2} = \frac{\sum_i (X_i - \bar{X})^2}{(n-1)\,\sigma^2} \sim \frac{\chi^2_{n-1}}{n-1}. \qquad (4)$$
Combining (3) and (4) gives
$$t_{n-1} = \frac{Z}{\sqrt{\chi^2_{n-1}/(n-1)}},$$
or, in general,
$$t_\nu = \frac{Z}{\sqrt{\chi^2_\nu/\nu}},$$
with $Z$ independent of $\chi^2_\nu$.
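The general relation $t_\nu = Z/\sqrt{\chi^2_\nu/\nu}$ can be illustrated by simulation. A minimal sketch assuming NumPy and SciPy; $\nu = 10$ and the replication count are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
nu = 10            # illustrative degrees of freedom
n_rep = 100_000

# t_nu constructed from its definition: Z / sqrt(chi^2_nu / nu),
# with Z and chi^2_nu drawn independently
z = rng.standard_normal(n_rep)
chi2 = rng.chisquare(df=nu, size=n_rep)
t_sim = z / np.sqrt(chi2 / nu)

# Compare with the t_nu distribution; a large p-value is expected
print(stats.kstest(t_sim, stats.t(df=nu).cdf))
```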

5.2 $F$ versus $\chi^2$

From some notes about ANOVA, we have learned that
$$\frac{\chi^2_a / a}{\chi^2_b / b} \sim F_{a,b} \quad \text{(Sir R. A. Fisher)}. \qquad (5)$$

Fisher's original concern was to construct a statistic whose sampling distribution is, to some extent, free of the degrees of freedom $a$ and $b$ under the null hypothesis. With this concern, he presented his $F$-statistic in a way that: since $\chi^2_a$ has expectation $a$, the numerator $\chi^2_a/a$ has expectation 1; similarly, the denominator also has expectation 1. As Fisher said, the value of the $F$-statistic will fluctuate near 1 under the null hypothesis $H_0: \mu_1 = \cdots = \mu_k$ (if $k = a + 1$).

From (5), we are able to express the $\chi^2$-distribution in terms of $F$:
$$F_{a,\infty} = \lim_{b \to \infty} \frac{\chi^2_a / a}{\chi^2_b / b} = \frac{\chi^2_a / a}{1} = \frac{\chi^2_a}{a},$$
from Section 3 (LLN). So the $\chi^2$-table can be treated as a part of the $F$-tables.
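The limiting statement $F_{a,\infty} = \chi^2_a/a$ can be checked numerically by taking a very large denominator degree of freedom. A sketch assuming SciPy; the values of $a$, the quantile level, and the large $b$ are illustrative.

```python
from scipy import stats

a, q = 4, 0.95      # numerator df and quantile level (illustrative)

# F_{a, b} quantile with a very large b versus the chi^2_a / a quantile
f_quantile = stats.f.ppf(q, dfn=a, dfd=10**7)
chi2_quantile_over_a = stats.chi2.ppf(q, df=a) / a

print(f_quantile, chi2_quantile_over_a)   # nearly identical
```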

5.3 $t$ versus $F$

$$t_\nu = \frac{Z}{\sqrt{\chi^2_\nu/\nu}}, \qquad \text{so} \qquad t_\nu^2 = \frac{Z^2}{\chi^2_\nu/\nu} = \frac{\chi^2_1 / 1}{\chi^2_\nu / \nu} = F_{1,\nu}.$$

Or, in other words, $t_\nu^2 = F_{1,\nu}$, a formula useful in assessing partial effects in linear regression models.
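The identity $t_\nu^2 = F_{1,\nu}$ shows up directly in the critical values used for two-sided $t$-tests and the corresponding $F$-tests. A sketch assuming SciPy; $\nu$ and $\alpha$ are arbitrary illustrative values.

```python
from scipy import stats

nu, alpha = 15, 0.05   # illustrative values

# Two-sided t critical value, squared, versus the upper-alpha F(1, nu) critical value
t_crit = stats.t.ppf(1 - alpha / 2, df=nu)
f_crit = stats.f.ppf(1 - alpha, dfn=1, dfd=nu)

print(t_crit ** 2, f_crit)   # equal up to floating-point error
```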
