Module 11: GLM Estimation
GLM
A standard GLM can be written:
$Y = X\beta + \epsilon$
where
$\epsilon \sim N(0, V)$
$$
\begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix}
=
\begin{bmatrix}
1 & X_{11} & \cdots & X_{1p} \\
1 & X_{21} & \cdots & X_{2p} \\
\vdots & \vdots & & \vdots \\
1 & X_{n1} & \cdots & X_{np}
\end{bmatrix}
\begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_p \end{bmatrix}
+
\begin{bmatrix} \epsilon_1 \\ \epsilon_2 \\ \vdots \\ \epsilon_n \end{bmatrix}
$$
V is the covariance matrix whose format depends on the noise model.
(Figure: the components of the GLM. Y contains the fMRI data, X is the design matrix, β holds the regression coefficients, and ε is the noise.)
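As a concrete illustration, here is a minimal NumPy sketch that simulates data from this model; the design size, coefficient values, and noise level are made-up, not taken from the module.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 200, 2                       # hypothetical: n scans, p task regressors
X = np.column_stack([np.ones(n),                     # intercept column
                     rng.standard_normal((n, p))])   # hypothetical task regressors

beta = np.array([10.0, 2.0, -1.0])  # hypothetical true coefficients (beta_0, beta_1, beta_2)
sigma = 1.5                         # hypothetical noise standard deviation

# Y = X beta + eps, with eps ~ N(0, sigma^2 I), i.e. i.i.d. noise (V = I up to scale)
eps = sigma * rng.standard_normal(n)
Y = X @ beta + eps
```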
Problem Formulation
- Assume the model:
  $Y = X\beta + \epsilon$, where $\epsilon \sim N(0, \sigma^2 I)$
- The matrices X and Y are assumed to be known, and the noise is considered to be uncorrelated.
- Our goal is to find the value of $\beta$ that minimizes:
  $(y - X\beta)^T (y - X\beta)$
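Setting the gradient of this quadratic objective to zero gives the normal equations, which lead directly to the OLS solution on the next slide:

$$
\frac{\partial}{\partial \beta}\,(y - X\beta)^T (y - X\beta) = -2 X^T (y - X\beta) = 0
\quad\Longrightarrow\quad
X^T X \hat{\beta} = X^T y .
$$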
OLS Solution
Ordinary least squares solution
$\hat{\beta} = (X^T X)^{-1} X^T y$
(Figure: geometric interpretation of OLS. The data vector y is projected onto the space spanned by the design columns X1 and X2; the projection gives the fit and e is the residual.)

Properties:
- Maximum likelihood estimate
- $E(\hat{\beta}) = \beta$
- $\text{Var}(\hat{\beta}) = \sigma^2 (X^T X)^{-1}$
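A minimal sketch of the OLS fit for the simulated data above, using np.linalg.solve on the normal equations rather than forming the inverse explicitly (the usual numerical advice):

```python
# OLS estimate: solve (X'X) beta_hat = X'Y instead of inverting X'X
beta_hat_ols = np.linalg.solve(X.T @ X, X.T @ Y)

# Sampling covariance of the estimator under the model: Var(beta_hat) = sigma^2 (X'X)^{-1}
# (sigma is the known simulation value here; in practice it is estimated, see the variance slide)
cov_beta_hat = sigma**2 * np.linalg.inv(X.T @ X)
```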
Can we do better?
Gauss-Markov Theorem
- The Gauss-Markov Theorem states that any other linear unbiased estimator of $\beta$ will have a variance at least as large as that of the OLS solution.
- Assume $\tilde{\beta}$ is another linear unbiased estimator of $\beta$.
- Then, according to the Gauss-Markov Theorem,
  $\text{Var}(\tilde{\beta}) \ge \text{Var}(\hat{\beta})$
- Hence $\hat{\beta}$ is the best linear unbiased estimator (BLUE) of $\beta$.
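A quick Monte Carlo sketch can illustrate the theorem for the toy model above. The alternative linear unbiased estimator chosen here (OLS applied to only the first half of the scans) is an arbitrary example, not something from the module:

```python
# Both estimators below are linear and unbiased, but full-data OLS has smaller variance.
n_sim = 5000
est_full, est_half = [], []
for _ in range(n_sim):
    Y_s = X @ beta + sigma * rng.standard_normal(n)
    est_full.append(np.linalg.solve(X.T @ X, X.T @ Y_s))
    Xh, Yh = X[: n // 2], Y_s[: n // 2]
    est_half.append(np.linalg.solve(Xh.T @ Xh, Xh.T @ Yh))

print(np.var(est_full, axis=0))  # componentwise variance of full-data OLS
print(np.var(est_half, axis=0))  # larger variance, same expectation
```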
Estimation
- If $\epsilon$ is i.i.d., then the Ordinary Least Squares (OLS) estimate is optimal.
  Model: $Y = X\beta + \epsilon$
  Estimate: $\hat{\beta} = (X'X)^{-1} X'Y$
- If $\text{Var}(\epsilon) = V\sigma^2 \neq I\sigma^2$, then the Generalized Least Squares (GLS) estimate is optimal.
  Model: $Y = X\beta + \epsilon$
  Estimate: $\hat{\beta} = (X'V^{-1}X)^{-1} X'V^{-1}Y$
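A sketch of the GLS formula for the toy data, assuming a hypothetical AR(1) noise covariance V with known autocorrelation rho; in a real analysis V must itself be estimated (e.g., via prewhitening), and the simulated Y above actually has i.i.d. noise, so this only illustrates the algebra:

```python
# Hypothetical AR(1) covariance: V[i, j] = rho ** |i - j|
rho = 0.4
V = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# GLS estimate: beta_hat = (X' V^{-1} X)^{-1} X' V^{-1} Y
Vinv_X = np.linalg.solve(V, X)   # V^{-1} X
Vinv_Y = np.linalg.solve(V, Y)   # V^{-1} Y
beta_hat_gls = np.linalg.solve(X.T @ Vinv_X, X.T @ Vinv_Y)
```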
GLM Summary
Model:
  $Y = X\beta + \epsilon$
Estimate:
  $\hat{\beta} = (X'V^{-1}X)^{-1} X'V^{-1}Y$
Fitted values:
  $\hat{Y} = X\hat{\beta}$
Residuals:
  $r = Y - \hat{Y} = (I - X(X'V^{-1}X)^{-1}X'V^{-1})Y = RY$
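Continuing the sketch, the fitted values, residuals, and the residual-forming matrix R for the GLS fit above:

```python
# Fitted values and residuals
Y_hat = X @ beta_hat_gls
r = Y - Y_hat

# Residual-forming matrix R = I - X (X'V^{-1}X)^{-1} X'V^{-1}, so that r = R Y
R = np.eye(n) - X @ np.linalg.solve(X.T @ Vinv_X, Vinv_X.T)
assert np.allclose(r, R @ Y)
```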
Estimating the Variance
- Even if we assume $\epsilon$ is i.i.d., we still need to estimate the residual variance $\sigma^2$.
- Our estimate:
  $\hat{\sigma}^2 = \dfrac{r^T r}{\mathrm{tr}(RV)}$
- For OLS:
  $\hat{\sigma}^2 = \dfrac{r^T r}{N - p}$
- Estimating $V \neq I$ is more difficult.
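For the OLS fit of the toy data, the residual variance estimate is a one-liner (N is the number of scans and p the number of columns of X, including the intercept):

```python
# Residual variance estimate for OLS: sigma2_hat = r'r / (N - p)
N, p_cols = X.shape
r_ols = Y - X @ beta_hat_ols
sigma2_hat = (r_ols @ r_ols) / (N - p_cols)

# Plug-in estimate of the coefficient covariance: sigma2_hat * (X'X)^{-1}
var_beta_hat = sigma2_hat * np.linalg.inv(X.T @ X)
```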
End of Module