

Linear Models Homework 2 (080501)

HW2 Due: May 6

1. Prove that if $r(A^2) = r(A)$, then $C(A)$ is the orthogonal complement of $N(A)$.


2. In the two-way crossed example, if differences of the form $\alpha_i - \alpha_{i'}$ ($i \neq i'$) and $\beta_j - \beta_{j'}$ ($j \neq j'$) are estimable, prove that the cell means $\mu_{ij} = \mu + \alpha_i + \beta_j$ are estimable for all $i, j$, irrespective of whether the cell is filled or empty.


3. If the inner product of two vectors $u$ and $v$ is defined as $u'V^{-1}v$, show that $X\beta^0_{GLS}$ is an orthogonal projection of $y$ onto $C(X)$ and that the projection matrix is $W = X(X'V^{-1}X)^{-}X'V^{-1}$.

4. Find an expression for $\Pr[U > c \mid U \sim \chi^2_{k+2}] - \Pr[U > c \mid U \sim \chi^2_k]$ using integration by parts.

0   1 2
5. Let Y  a  BX with a   1 , B   1 0  , and X ~ N2 (0, I 2 ) .
 
1   0 1 

a. Find a basis for N(V) where V=cov(Y)


b. In what space does the distribution of Y lie?
c. Find Pr( 2Y1  Y2  2 )
d. Find Pr( Y1  Y2  2Y3  0 )

Homework 2 Solutions
2. Since $\alpha_i - \alpha_p$ and $\beta_j - \beta_q$ are estimable for all $i \neq p$ and $j \neq q$, $i = 1, \ldots, a$, $j = 1, \ldots, b$, these contrasts can be written as $C\theta$ with
$$C = \begin{pmatrix} 0_{a-1} & I_{a-1} & -1_{a-1} & 0_{(a-1)\times b} \\ 0_{b-1} & 0_{(b-1)\times a} & I_{b-1} & -1_{b-1} \end{pmatrix}, \qquad \theta = (\mu,\ \alpha_1, \ldots, \alpha_a,\ \beta_1, \ldots, \beta_b)', \qquad r(C) = a + b - 2,$$
where the rows of $C$ give $\alpha_i - \alpha_a$ ($i = 1, \ldots, a-1$) and $\beta_j - \beta_b$ ($j = 1, \ldots, b-1$).

Let $X$ be the design matrix of the two-way crossed model. Every row of $C$ lies in the row space of $X$, so $r(C) = a + b - 2 \le r(X) \le a + b - 1$.

Moreover, at least one cell, say cell $(p, q)$, is filled, so $\mu + \alpha_p + \beta_q = \mathrm{E}(y_{pqk})$ is estimable for some $p, q$. Linear combinations of estimable functions are estimable, and
$$\mu_{ij} = \mu + \alpha_i + \beta_j = (\mu + \alpha_p + \beta_q) + (\alpha_i - \alpha_p) + (\beta_j - \beta_q),$$
so $\mu_{ij}$ is estimable for all $i, j$.
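
As an optional numerical check of this result (not part of the assignment), the sketch below builds the design matrix of a hypothetical connected but incomplete $2 \times 3$ layout and verifies that every cell mean $\mu + \alpha_i + \beta_j$ is estimable, including the empty cells. It uses the fact that $c'\theta$ is estimable exactly when $c'$ lies in the row space of $X$, which can be tested with a rank comparison.

```python
# Numerical sanity check of problem 2 (not part of the original solution):
# in a connected but incomplete a x b additive layout, every cell mean
# mu + alpha_i + beta_j is estimable, even for empty cells.
import numpy as np

a, b = 2, 3
filled = [(0, 0), (0, 1), (1, 1), (1, 2)]   # hypothetical filled cells (connected)

# Design matrix X: columns are (mu, alpha_1..alpha_a, beta_1..beta_b).
rows = []
for (i, j) in filled:
    r = np.zeros(1 + a + b)
    r[0] = 1          # mu
    r[1 + i] = 1      # alpha_i
    r[1 + a + j] = 1  # beta_j
    rows.append(r)
X = np.array(rows)

def estimable(c, X):
    """c'theta is estimable iff adding c as a row does not raise the rank of X."""
    return np.linalg.matrix_rank(np.vstack([X, c])) == np.linalg.matrix_rank(X)

for i in range(a):
    for j in range(b):
        c = np.zeros(1 + a + b)
        c[0], c[1 + i], c[1 + a + j] = 1, 1, 1   # coefficients of mu + alpha_i + beta_j
        print(f"cell ({i+1},{j+1}) filled={(i, j) in filled}: estimable={estimable(c, X)}")
```

The printout should show estimable=True for all six cells, including the two empty ones, as the proof predicts.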

3. Let $Py = X\beta^0_{GLS} = X(X'V^{-1}X)^{-}X'V^{-1}y$. We already know that $P = X(X'V^{-1}X)^{-}X'V^{-1}$ is a projection matrix onto $C(X)$. It remains to show that the projection is orthogonal with respect to the inner product $\langle u, v \rangle = u'V^{-1}v$, i.e. that $\langle Py, (I - P)y \rangle = 0$:
$$\begin{aligned}
\langle Py, (I - P)y \rangle &= y'P'V^{-1}(I - P)y \\
&= \bigl[X(X'V^{-1}X)^{-}X'V^{-1}y\bigr]'\, V^{-1} \bigl[I - X(X'V^{-1}X)^{-}X'V^{-1}\bigr] y \\
&= y'V^{-1}X(X'V^{-1}X)^{-}X'V^{-1}y - y'V^{-1}X(X'V^{-1}X)^{-}X'V^{-1}X(X'V^{-1}X)^{-}X'V^{-1}y \\
&= y'V^{-1}X(X'V^{-1}X)^{-}X'V^{-1}y - y'V^{-1}X(X'V^{-1}X)^{-}X'V^{-1}y \\
&= 0,
\end{aligned}$$
where the transpose step takes a symmetric generalized inverse of $X'V^{-1}X$, and the last reduction uses $X(X'V^{-1}X)^{-}X'V^{-1}X = X$. Done!
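
As a quick numerical illustration (not part of the original solution), the sketch below generates an arbitrary $X$ and a positive definite $V$, forms $W = X(X'V^{-1}X)^{-}X'V^{-1}$ using the Moore-Penrose inverse as one choice of generalized inverse, and checks that $\langle Wy, (I - W)y \rangle = 0$ in the $V^{-1}$ inner product, that $W$ is idempotent, and that $Wy$ lies in $C(X)$.

```python
# Numerical illustration of problem 3 (not part of the original solution).
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 3
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)          # positive definite covariance
Vinv = np.linalg.inv(V)

# W = X (X'V^{-1}X)^- X'V^{-1}, with pinv as one choice of generalized inverse
W = X @ np.linalg.pinv(X.T @ Vinv @ X) @ X.T @ Vinv

y = rng.standard_normal(n)
inner = (W @ y) @ Vinv @ ((np.eye(n) - W) @ y)          # <Wy, (I-W)y> in V^{-1} inner product
print("inner product  :", inner)                         # ~ 0 up to rounding
print("idempotent     :", np.allclose(W @ W, W))         # True
print("Wy lies in C(X):", np.allclose(W @ y, X @ np.linalg.lstsq(X, W @ y, rcond=None)[0]))
```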

4.
$$\Pr[U > c \mid U \sim \chi^2_{k+2}] = \int_c^\infty \frac{x^{k/2}\, e^{-x/2}}{\Gamma\!\left(\frac{k+2}{2}\right) 2^{(k+2)/2}}\, dx.$$

Integrate by parts with
$$u = x^{k/2}, \quad du = \tfrac{k}{2}\, x^{k/2-1}\, dx, \qquad dv = e^{-x/2}\, dx, \quad v = -2\, e^{-x/2}.$$

Then
$$\int_c^\infty \frac{x^{k/2}\, e^{-x/2}}{\Gamma\!\left(\frac{k+2}{2}\right) 2^{(k+2)/2}}\, dx
= \left.\frac{-2\, x^{k/2}\, e^{-x/2}}{\Gamma\!\left(\frac{k+2}{2}\right) 2^{(k+2)/2}}\right|_c^{\infty}
+ \int_c^\infty \frac{k\, x^{k/2-1}\, e^{-x/2}}{\Gamma\!\left(\frac{k+2}{2}\right) 2^{(k+2)/2}}\, dx
= \frac{c^{k/2}\, e^{-c/2}}{\Gamma\!\left(\frac{k+2}{2}\right) 2^{k/2}}
+ \int_c^\infty \frac{x^{k/2-1}\, e^{-x/2}}{\Gamma\!\left(\frac{k}{2}\right) 2^{k/2}}\, dx,$$
where the last step uses $\Gamma\!\left(\frac{k+2}{2}\right) = \frac{k}{2}\,\Gamma\!\left(\frac{k}{2}\right)$. The remaining integral is exactly $\Pr[U > c \mid U \sim \chi^2_k]$, so
$$\Pr[U > c \mid U \sim \chi^2_{k+2}] - \Pr[U > c \mid U \sim \chi^2_k] = \frac{c^{k/2}\, e^{-c/2}}{\Gamma\!\left(\frac{k+2}{2}\right) 2^{k/2}}.$$
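
The identity can be checked numerically. The short sketch below (not part of the original solution) compares the left-hand side, computed from the chi-square survival function in scipy, with the closed-form right-hand side for a few arbitrary values of $k$ and $c$.

```python
# Numerical check of the identity derived in problem 4 (not part of the
# original solution): Pr[U > c | chi^2_{k+2}] - Pr[U > c | chi^2_k]
# should equal c^{k/2} e^{-c/2} / (2^{k/2} Gamma((k+2)/2)).
from math import exp, gamma
from scipy.stats import chi2

for k, c in [(1, 0.5), (3, 2.0), (10, 7.3)]:
    lhs = chi2.sf(c, k + 2) - chi2.sf(c, k)          # survival functions Pr[U > c]
    rhs = c**(k / 2) * exp(-c / 2) / (2**(k / 2) * gamma((k + 2) / 2))
    print(f"k={k:2d}, c={c}: lhs={lhs:.10f}, rhs={rhs:.10f}")
```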
5. $Y = a + BX$ with $X \sim N_2(0, I_2)$, so
$$Y \sim N_3(a, BB') = N_3\!\left(\begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix},\ \begin{pmatrix} 5 & -1 & 2 \\ -1 & 1 & 0 \\ 2 & 0 & 1 \end{pmatrix}\right).$$

a). $V = \operatorname{cov}(Y) = BB'$ and $r(V) = 2$, so $N(V)$ is one-dimensional: $N(V) = \operatorname{span}\{(1, 1, -2)'\}$.

b). By (a), $(1, 1, -2)'$ is a normal vector of the plane in which $Y$ lies, i.e. $Y_1 + Y_2 - 2Y_3 = c$ for some constant. The plane passes through $a = (0, 1, -1)'$, so the distribution of $Y$ lies in the plane $Y_1 + Y_2 - 2Y_3 = 3$.

c). $\Pr(2Y_1 - Y_2 \le 2)$. Note that a linear combination of a multivariate normal vector is again normally distributed. Let $W = 2Y_1 - Y_2$. Then
$$\mu_W = 2 \cdot 0 - 1 = -1, \qquad \sigma^2_W = 4 \cdot 5 + 1 - 4 \cdot (-1) = 25,$$
so $W \sim N(-1, 25)$ and $\Pr(W \le 2) = \Pr\!\left(Z \le \tfrac{3}{\sqrt{25}}\right) = \Pr(Z \le 0.6) \approx 0.7257$.

d). $\Pr(Y_1 + Y_2 - 2Y_3 \le 0) = 0$, because the distribution of $Y$ lies entirely in the plane $Y_1 + Y_2 - 2Y_3 = 3$, which is parallel to the plane $Y_1 + Y_2 - 2Y_3 = 0$ and does not intersect it.
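
The numbers above can be verified numerically. The sketch below (not part of the original solution) uses the vector $a$ and matrix $B$ as reconstructed in the problem statement, so the signs are an assumption carried over from that reconstruction.

```python
# Numerical check of the quantities in problem 5 (not part of the original
# solution), using the reconstructed a and B from the problem statement.
import numpy as np
from scipy.stats import norm
from scipy.linalg import null_space

a = np.array([0.0, 1.0, -1.0])
B = np.array([[1.0, 2.0], [-1.0, 0.0], [0.0, 1.0]])
V = B @ B.T                                    # cov(Y) = BB'

print("V =\n", V)                              # [[5,-1,2],[-1,1,0],[2,0,1]]
print("rank(V) =", np.linalg.matrix_rank(V))   # 2
print("basis of N(V):\n", null_space(V))       # proportional to (1, 1, -2)'

n = np.array([1.0, 1.0, -2.0])                 # normal vector of the support plane
print("plane constant n'a =", n @ a)           # 3

# (c) W = 2Y1 - Y2 ~ N(-1, 25)
w = np.array([2.0, -1.0, 0.0])
mu_W, var_W = w @ a, w @ V @ w
print("mu_W, var_W =", mu_W, var_W)            # -1, 25
print("Pr(W <= 2) =", norm.cdf((2 - mu_W) / np.sqrt(var_W)))   # ~0.7257
```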
