
Analysis of Variance and Design of Experiments - II

MODULE III
EXPERIMENTAL DESIGN MODELS
LECTURE - 16

Dr. Shalabh
Department of Mathematics and Statistics
Indian Institute of Technology Kanpur
Two-way classification with interactions

Consider the two-way classification with an equal number, say $K$, of observations per cell. Let $y_{ijk}$ denote the $k$-th observation in the $(i, j)$-th cell, i.e., receiving the $i$-th level of factor A and the $j$-th level of factor B,

$$i = 1, 2, \ldots, I; \quad j = 1, 2, \ldots, J; \quad k = 1, 2, \ldots, K.$$

The $y_{ijk}$ are independently drawn from $N(\mu_{ij}, \sigma^2)$, so that the linear model under consideration is

$$y_{ijk} = \mu_{ij} + \epsilon_{ijk}$$

where the $\epsilon_{ijk}$ are identically and independently distributed following $N(0, \sigma^2)$. Thus

$$E(y_{ijk}) = \mu_{ij} = \mu_{oo} + (\mu_{io} - \mu_{oo}) + (\mu_{oj} - \mu_{oo}) + (\mu_{ij} - \mu_{io} - \mu_{oj} + \mu_{oo}) = \mu + \alpha_i + \beta_j + \gamma_{ij}$$

where

$$\mu = \mu_{oo}, \quad \alpha_i = \mu_{io} - \mu_{oo}, \quad \beta_j = \mu_{oj} - \mu_{oo}, \quad \gamma_{ij} = \mu_{ij} - \mu_{io} - \mu_{oj} + \mu_{oo}$$

with

$$\sum_{i=1}^{I} \alpha_i = 0, \quad \sum_{j=1}^{J} \beta_j = 0, \quad \sum_{i=1}^{I} \gamma_{ij} = 0, \quad \sum_{j=1}^{J} \gamma_{ij} = 0.$$

Assume that the design matrix $X$ is of full rank so that all the parametric functions of $\mu_{ij}$ are estimable.
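As a numerical illustration, the following minimal NumPy sketch simulates data from this model with the identifiability constraints imposed exactly; the dimensions and effect values are illustrative, not taken from the text.

```python
# Sketch: simulate y_ijk = mu + alpha_i + beta_j + gamma_ij + eps_ijk
# with sum_i alpha_i = 0, sum_j beta_j = 0, and gamma_ij summing to
# zero over each row and each column (illustrative values throughout).
import numpy as np

rng = np.random.default_rng(0)
I, J, K, sigma = 3, 4, 5, 1.0

mu = 10.0
alpha = rng.normal(size=I); alpha -= alpha.mean()   # sum to zero
beta = rng.normal(size=J);  beta -= beta.mean()     # sum to zero
gamma = rng.normal(size=(I, J))
gamma -= gamma.mean(axis=1, keepdims=True)          # row sums -> 0
gamma -= gamma.mean(axis=0, keepdims=True)          # column sums -> 0 (rows stay 0)

# Cell means mu_ij, then K independent N(mu_ij, sigma^2) replicates per cell.
mu_ij = mu + alpha[:, None] + beta[None, :] + gamma
y = mu_ij[:, :, None] + rng.normal(scale=sigma, size=(I, J, K))
print(y.shape)  # (I, J, K): K observations in each (i, j) cell
```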

The null hypotheses are

$$H_{0\alpha}: \alpha_1 = \alpha_2 = \cdots = \alpha_I = 0$$
$$H_{0\beta}: \beta_1 = \beta_2 = \cdots = \beta_J = 0$$
$$H_{0\gamma}: \gamma_{ij} = 0 \text{ for all } i, j.$$

The corresponding alternative hypotheses are

$$H_{1\alpha}: \text{at least one } \alpha_i \neq \alpha_j \text{ for } i \neq j$$
$$H_{1\beta}: \text{at least one } \beta_i \neq \beta_j \text{ for } i \neq j$$
$$H_{1\gamma}: \text{at least one } \gamma_{ij} \neq \gamma_{ik} \text{ for } j \neq k.$$
Minimizing the error sum of squares

$$E = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j - \gamma_{ij})^2,$$

the normal equations are obtained as

$$\frac{\partial E}{\partial \mu} = 0, \quad \frac{\partial E}{\partial \alpha_i} = 0 \text{ for all } i, \quad \frac{\partial E}{\partial \beta_j} = 0 \text{ for all } j, \quad \frac{\partial E}{\partial \gamma_{ij}} = 0 \text{ for all } i \text{ and } j.$$

The least squares estimates are obtained as

$$\hat{\mu} = \bar{y}_{ooo} = \frac{1}{IJK} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} y_{ijk}$$

$$\hat{\alpha}_i = \bar{y}_{ioo} - \bar{y}_{ooo}, \qquad \bar{y}_{ioo} = \frac{1}{JK} \sum_{j=1}^{J} \sum_{k=1}^{K} y_{ijk}$$

$$\hat{\beta}_j = \bar{y}_{ojo} - \bar{y}_{ooo}, \qquad \bar{y}_{ojo} = \frac{1}{IK} \sum_{i=1}^{I} \sum_{k=1}^{K} y_{ijk}$$

$$\hat{\gamma}_{ij} = \bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo}, \qquad \bar{y}_{ijo} = \frac{1}{K} \sum_{k=1}^{K} y_{ijk}.$$
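These estimates are just marginal means, where a subscript "o" denotes averaging over that index. A short self-contained NumPy sketch (illustrative data):

```python
# Sketch: least squares estimates of mu, alpha_i, beta_j, gamma_ij
# from a data array y of shape (I, J, K).
import numpy as np

rng = np.random.default_rng(1)
I, J, K = 3, 4, 5
y = rng.normal(loc=10.0, size=(I, J, K))   # illustrative data

y_ooo = y.mean()                 # grand mean
y_ioo = y.mean(axis=(1, 2))      # factor-A level means, shape (I,)
y_ojo = y.mean(axis=(0, 2))      # factor-B level means, shape (J,)
y_ijo = y.mean(axis=2)           # cell means, shape (I, J)

mu_hat = y_ooo
alpha_hat = y_ioo - y_ooo
beta_hat = y_ojo - y_ooo
gamma_hat = y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo

# The estimates satisfy the model constraints (up to rounding error).
print(alpha_hat.sum(), beta_hat.sum())
print(gamma_hat.sum(axis=0), gamma_hat.sum(axis=1))
```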

The error sum of squares is

$$SSE = \min_{\mu, \alpha_i, \beta_j, \gamma_{ij}} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j - \gamma_{ij})^2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \bar{y}_{ijo})^2$$

with

$$\frac{SSE}{\sigma^2} \sim \chi^2\big(IJ(K-1)\big).$$
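In code, SSE is simply the within-cell sum of squares (self-contained NumPy sketch, illustrative data):

```python
# Sketch: SSE = sum over i,j,k of (y_ijk - ybar_ijo)^2,
# with IJ(K - 1) error degrees of freedom.
import numpy as np

rng = np.random.default_rng(2)
I, J, K = 3, 4, 5
y = rng.normal(loc=10.0, size=(I, J, K))

y_ijo = y.mean(axis=2, keepdims=True)   # cell means, shape (I, J, 1)
SSE = ((y - y_ijo) ** 2).sum()
df_error = I * J * (K - 1)
print(SSE, df_error)
```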


Now minimizing the error sum of squares under $H_{0\alpha}: \alpha_1 = \alpha_2 = \cdots = \alpha_I = 0$, i.e., minimizing

$$E_1 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \beta_j - \gamma_{ij})^2$$

with respect to $\mu$, $\beta_j$ and $\gamma_{ij}$, and solving the normal equations

$$\frac{\partial E_1}{\partial \mu} = 0, \quad \frac{\partial E_1}{\partial \beta_j} = 0 \text{ for all } j, \quad \frac{\partial E_1}{\partial \gamma_{ij}} = 0 \text{ for all } i \text{ and } j$$

gives the least squares estimates

$$\hat{\mu} = \bar{y}_{ooo}, \quad \hat{\beta}_j = \bar{y}_{ojo} - \bar{y}_{ooo}, \quad \hat{\gamma}_{ij} = \bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo}.$$

The sum of squares due to $H_{0\alpha}$ is

$$\min_{\mu, \beta_j, \gamma_{ij}} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \beta_j - \gamma_{ij})^2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \bar{y}_{ijo})^2 + JK \sum_{i=1}^{I} (\bar{y}_{ioo} - \bar{y}_{ooo})^2 = SSE + JK \sum_{i=1}^{I} (\bar{y}_{ioo} - \bar{y}_{ooo})^2.$$

Thus the sum of squares due to deviation from $H_{0\alpha}$, or the sum of squares due to effect A, is

$$SSA = \text{Sum of squares due to } H_{0\alpha} - SSE = JK \sum_{i=1}^{I} (\bar{y}_{ioo} - \bar{y}_{ooo})^2$$

with

$$\frac{SSA}{\sigma^2} \sim \chi^2(I-1).$$
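The decomposition of the restricted minimum into SSE plus SSA can be checked numerically. A NumPy sketch (illustrative data), using the restricted estimates given above:

```python
# Sketch: verify that min E_1 under H_0alpha equals SSE + SSA,
# where SSA = JK * sum_i (ybar_ioo - ybar_ooo)^2.
import numpy as np

rng = np.random.default_rng(3)
I, J, K = 3, 4, 5
y = rng.normal(loc=10.0, size=(I, J, K))

y_ooo = y.mean()
y_ioo = y.mean(axis=(1, 2))
y_ojo = y.mean(axis=(0, 2))
y_ijo = y.mean(axis=2)

SSE = ((y - y_ijo[:, :, None]) ** 2).sum()
SSA = J * K * ((y_ioo - y_ooo) ** 2).sum()

# Fitted values under H_0alpha: mu_hat + beta_hat_j + gamma_hat_ij.
beta_hat = y_ojo - y_ooo
gamma_hat = y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo
fit = y_ooo + beta_hat[None, :] + gamma_hat      # shape (I, J)
min_E1 = ((y - fit[:, :, None]) ** 2).sum()

print(np.isclose(min_E1, SSE + SSA))   # True
```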

Minimizing the error sum of squares under $H_{0\beta}: \beta_1 = \beta_2 = \cdots = \beta_J = 0$, i.e., minimizing

$$E_2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \gamma_{ij})^2$$

and solving the normal equations

$$\frac{\partial E_2}{\partial \mu} = 0, \quad \frac{\partial E_2}{\partial \alpha_i} = 0 \text{ for all } i, \quad \frac{\partial E_2}{\partial \gamma_{ij}} = 0 \text{ for all } i \text{ and } j$$

yields the least squares estimators

$$\hat{\mu} = \bar{y}_{ooo}, \quad \hat{\alpha}_i = \bar{y}_{ioo} - \bar{y}_{ooo}, \quad \hat{\gamma}_{ij} = \bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo}.$$

The minimum error sum of squares is

$$\sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \hat{\mu} - \hat{\alpha}_i - \hat{\gamma}_{ij})^2 = SSE + IK \sum_{j=1}^{J} (\bar{y}_{ojo} - \bar{y}_{ooo})^2$$

and the sum of squares due to deviation from $H_{0\beta}$, or the sum of squares due to effect B, is

$$SSB = \text{Sum of squares due to } H_{0\beta} - SSE = IK \sum_{j=1}^{J} (\bar{y}_{ojo} - \bar{y}_{ooo})^2$$

with

$$\frac{SSB}{\sigma^2} \sim \chi^2(J-1).$$

Next, minimizing the error sum of squares under $H_{0\gamma}: \gamma_{ij} = 0$ for all $i, j$, i.e., minimizing

$$E_3 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j)^2$$
with respect to $\mu$, $\alpha_i$ and $\beta_j$, and solving the normal equations

$$\frac{\partial E_3}{\partial \mu} = 0, \quad \frac{\partial E_3}{\partial \alpha_i} = 0 \text{ for all } i, \quad \frac{\partial E_3}{\partial \beta_j} = 0 \text{ for all } j$$

yields the least squares estimators

$$\hat{\mu} = \bar{y}_{ooo}, \quad \hat{\alpha}_i = \bar{y}_{ioo} - \bar{y}_{ooo}, \quad \hat{\beta}_j = \bar{y}_{ojo} - \bar{y}_{ooo}.$$

The sum of squares due to $H_{0\gamma}$ is

$$\min_{\mu, \alpha_i, \beta_j} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j)^2 = SSE + K \sum_{i=1}^{I} \sum_{j=1}^{J} (\bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo})^2.$$

Thus the sum of squares due to deviation from $H_{0\gamma}$, or the sum of squares due to the interaction effect AB, is

$$SSAB = \text{Sum of squares due to } H_{0\gamma} - SSE = K \sum_{i=1}^{I} \sum_{j=1}^{J} (\bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo})^2$$

with

$$\frac{SSAB}{\sigma^2} \sim \chi^2\big((I-1)(J-1)\big).$$
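A parallel NumPy sketch for the remaining sums of squares (self-contained, illustrative data):

```python
# Sketch: SSB = IK * sum_j (ybar_ojo - ybar_ooo)^2 and
# SSAB = K * sum_ij (ybar_ijo - ybar_ioo - ybar_ojo + ybar_ooo)^2.
import numpy as np

rng = np.random.default_rng(4)
I, J, K = 3, 4, 5
y = rng.normal(loc=10.0, size=(I, J, K))

y_ooo = y.mean()
y_ioo = y.mean(axis=(1, 2))
y_ojo = y.mean(axis=(0, 2))
y_ijo = y.mean(axis=2)

SSB = I * K * ((y_ojo - y_ooo) ** 2).sum()
SSAB = K * ((y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo) ** 2).sum()
print(SSB, SSAB)   # degrees of freedom: J - 1 and (I - 1)(J - 1)
```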


The total sum of squares can be partitioned as

$$TSS = SSA + SSB + SSAB + SSE$$

where SSA, SSB, SSAB and SSE are mutually orthogonal. So, either using the independence of SSA, SSB, SSAB and SSE as well as their respective $\chi^2$ distributions, or using the likelihood ratio test approach, the decision rules for the null hypotheses at level of significance $\alpha$ are based on F-statistics as follows:

$$F_1 = \frac{IJ(K-1)}{I-1} \cdot \frac{SSA}{SSE} \sim F\big(I-1,\, IJ(K-1)\big) \text{ under } H_{0\alpha},$$

$$F_2 = \frac{IJ(K-1)}{J-1} \cdot \frac{SSB}{SSE} \sim F\big(J-1,\, IJ(K-1)\big) \text{ under } H_{0\beta},$$

and

$$F_3 = \frac{IJ(K-1)}{(I-1)(J-1)} \cdot \frac{SSAB}{SSE} \sim F\big((I-1)(J-1),\, IJ(K-1)\big) \text{ under } H_{0\gamma}.$$

So

Reject $H_{0\alpha}$ if $F_1 > F_{1-\alpha}\big[(I-1),\, IJ(K-1)\big]$
Reject $H_{0\beta}$ if $F_2 > F_{1-\alpha}\big[(J-1),\, IJ(K-1)\big]$
Reject $H_{0\gamma}$ if $F_3 > F_{1-\alpha}\big[(I-1)(J-1),\, IJ(K-1)\big].$
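The three decision rules translate directly into code; the sketch below (NumPy/SciPy, illustrative data) uses scipy.stats.f.ppf for the $F_{1-\alpha}$ critical values.

```python
# Sketch: the three F-tests of the two-way model with interaction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
I, J, K, level = 3, 4, 5, 0.05
y = rng.normal(loc=10.0, size=(I, J, K))

y_ooo = y.mean()
y_ioo = y.mean(axis=(1, 2))
y_ojo = y.mean(axis=(0, 2))
y_ijo = y.mean(axis=2)

SSE = ((y - y_ijo[:, :, None]) ** 2).sum()
SSA = J * K * ((y_ioo - y_ooo) ** 2).sum()
SSB = I * K * ((y_ojo - y_ooo) ** 2).sum()
SSAB = K * ((y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo) ** 2).sum()

df_e = I * J * (K - 1)
for name, ss, df in [("A", SSA, I - 1), ("B", SSB, J - 1),
                     ("AB", SSAB, (I - 1) * (J - 1))]:
    F = (ss / df) / (SSE / df_e)                # ratio of mean squares
    crit = stats.f.ppf(1 - level, df, df_e)     # F_{1-alpha}(df, df_e)
    print(f"{name}: F = {F:.3f}, critical value = {crit:.3f}, reject = {F > crit}")
```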


If $H_{0\alpha}$ or $H_{0\beta}$ is rejected, one can use the t-test or a multiple comparison test to find which pairs of $\alpha_i$'s or $\beta_j$'s are significantly different. If $H_{0\gamma}$ is rejected, one would not usually explore it further, but theoretically the t-test or multiple comparison tests can be used.
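The text does not prescribe a specific follow-up procedure; one common choice is a least-significant-difference style comparison, sketched below for the factor-A level means using the pooled MSE, with $t = (\bar{y}_{ioo} - \bar{y}_{i'oo})/\sqrt{2\,MSE/(JK)}$ on $IJ(K-1)$ degrees of freedom.

```python
# Sketch: LSD-style pairwise comparison of factor-A level means
# (an illustrative choice of multiple comparison, not from the text).
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(6)
I, J, K, level = 3, 4, 5, 0.05
y = rng.normal(loc=10.0, size=(I, J, K))

y_ioo = y.mean(axis=(1, 2))
y_ijo = y.mean(axis=2, keepdims=True)
MSE = ((y - y_ijo) ** 2).sum() / (I * J * (K - 1))   # pooled error mean square

se = np.sqrt(2 * MSE / (J * K))                      # s.e. of a mean difference
t_crit = stats.t.ppf(1 - level / 2, I * J * (K - 1))
for i, ip in combinations(range(I), 2):
    t = (y_ioo[i] - y_ioo[ip]) / se
    print(f"levels {i} vs {ip}: t = {t:.3f}, significant = {abs(t) > t_crit}")
```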
It can also be shown that

$$E\left(\frac{SSA}{I-1}\right) = \sigma^2 + \frac{JK}{I-1} \sum_{i=1}^{I} \alpha_i^2$$

$$E\left(\frac{SSB}{J-1}\right) = \sigma^2 + \frac{IK}{J-1} \sum_{j=1}^{J} \beta_j^2$$

$$E\left(\frac{SSAB}{(I-1)(J-1)}\right) = \sigma^2 + \frac{K}{(I-1)(J-1)} \sum_{i=1}^{I} \sum_{j=1}^{J} \gamma_{ij}^2$$

$$E\left(\frac{SSE}{IJ(K-1)}\right) = \sigma^2.$$
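The first of these identities can be checked by Monte Carlo. A sketch with illustrative effect sizes (only the factor-A effects are made nonzero here, so the others vanish from the model):

```python
# Sketch: Monte Carlo check of E(SSA/(I-1)) = sigma^2 + JK/(I-1) * sum_i alpha_i^2.
import numpy as np

rng = np.random.default_rng(7)
I, J, K, sigma = 3, 4, 5, 1.0
alpha = np.array([-0.5, 0.0, 0.5])     # sums to zero, as required

msa_values = []
for _ in range(5000):
    y = alpha[:, None, None] + rng.normal(scale=sigma, size=(I, J, K))
    y_ioo = y.mean(axis=(1, 2))
    SSA = J * K * ((y_ioo - y.mean()) ** 2).sum()
    msa_values.append(SSA / (I - 1))

theory = sigma**2 + J * K * (alpha**2).sum() / (I - 1)
print(np.mean(msa_values), theory)     # the two should agree closely
```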
The analysis of variance table is as follows:

Source of variation   Degrees of freedom   Sum of squares   Mean squares                   F-value
Factor A              (I - 1)              SSA              MSA = SSA/(I - 1)              F_1 = MSA/MSE
Factor B              (J - 1)              SSB              MSB = SSB/(J - 1)              F_2 = MSB/MSE
Interaction AB        (I - 1)(J - 1)       SSAB             MSAB = SSAB/[(I - 1)(J - 1)]   F_3 = MSAB/MSE
Error                 IJ(K - 1)            SSE              MSE = SSE/[IJ(K - 1)]
Total                 (IJK - 1)            TSS
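As a cross-check, assuming pandas and statsmodels are available, the same table can be produced with an off-the-shelf two-way ANOVA; for a balanced design its sums of squares agree with the formulas above.

```python
# Sketch: the ANOVA table via statsmodels' formula interface.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(8)
I, J, K = 3, 4, 5
y = rng.normal(loc=10.0, size=(I, J, K))

# Long-format data frame with factor labels for each observation.
idx_A, idx_B, _ = np.indices((I, J, K))
df = pd.DataFrame({"y": y.ravel(), "A": idx_A.ravel(), "B": idx_B.ravel()})

model = smf.ols("y ~ C(A) * C(B)", data=df).fit()
print(anova_lm(model, typ=2))   # rows: C(A), C(B), C(A):C(B), Residual
```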
