
Module 11: GLM Estimation

GLM
A standard GLM can be written:

Y = Xβ + ε

where

ε ~ N(0, V)

In matrix form:

\[
\begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}
=
\begin{pmatrix}
1 & X_{11} & \cdots & X_{1p} \\
1 & X_{21} & \cdots & X_{2p} \\
\vdots & \vdots &        & \vdots \\
1 & X_{n1} & \cdots & X_{np}
\end{pmatrix}
\begin{pmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_p \end{pmatrix}
+
\begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix}
\]

V is the covariance matrix whose format depends on the noise model.
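The model above can be sketched numerically. This is a minimal illustration with assumed sizes and coefficient values, not part of the original slides:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 100, 2                       # observations and regressors (plus an intercept)
X = np.column_stack([np.ones(n), rng.standard_normal((n, p))])  # design matrix
beta = np.array([1.0, 2.0, -0.5])   # true regression coefficients (assumed values)
V = np.eye(n)                       # noise covariance; identity = i.i.d. noise
eps = rng.multivariate_normal(np.zeros(n), V)

Y = X @ beta + eps                  # the GLM: Y = X beta + eps, eps ~ N(0, V)
```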

[Figure: the GLM applied to fMRI data — the data vector Y, the design matrix X, the regression coefficients β, and the noise ε.]

Problem Formulation
- Assume the model:

  Y = Xβ + ε,  ε ~ N(0, Iσ²)

- The matrices X and Y are assumed to be known, and the noise is considered to be uncorrelated.
- Our goal is to find the value of β that minimizes:

  (y − Xβ)ᵀ(y − Xβ)

OLS Solution
Ordinary least squares solution
β̂ = (XᵀX)⁻¹Xᵀy

[Figure: geometric view of least squares — y is projected onto the space spanned by X1 and X2; the fit is the projection Xβ̂ and e is the residual.]

Properties:

- β̂ is the maximum likelihood estimate.
- E(β̂) = β
- Var(β̂) = σ²(XᵀX)⁻¹
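The OLS estimate and its covariance can be computed directly. A sketch on simulated data (the design and true coefficients are assumed for illustration); a linear solve replaces the explicit inverse in the β̂ formula for numerical stability:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([2.0, 0.5]) + rng.standard_normal(n)   # assumed true beta

# beta_hat = (X'X)^{-1} X'y, via a linear solve on the normal equations
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Var(beta_hat) = sigma^2 (X'X)^{-1}, with sigma^2 estimated from residuals
r = y - X @ beta_hat
sigma2_hat = r @ r / (n - X.shape[1])
cov_beta = sigma2_hat * np.linalg.inv(X.T @ X)
```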

Can we do better?

Gauss-Markov Theorem

- The Gauss-Markov theorem states that any other linear unbiased estimator of β will have a variance at least as large as that of the OLS solution.
- Assume β̃ is a linear unbiased estimator of β.
- Then, according to the Gauss-Markov theorem,

  Var(β̃) ≥ Var(β̂)

- β̂ is the best linear unbiased estimator (BLUE) of β.

Estimation
- If ε is i.i.d., the ordinary least squares (OLS) estimate is optimal.

  Model:    Y = Xβ + ε
  Estimate: β̂ = (X′X)⁻¹X′Y

- If Var(ε) = Vσ² ≠ Iσ², the generalized least squares (GLS) estimate is optimal.

  Model:    Y = Xβ + ε
  Estimate: β̂ = (X′V⁻¹X)⁻¹X′V⁻¹Y

GLM Summary
Model:    Y = Xβ + ε

Estimate: β̂ = (X′V⁻¹X)⁻¹X′V⁻¹Y

Fitted values:

Ŷ = Xβ̂

Residuals:

r = Y − Ŷ
  = (I − X(X′V⁻¹X)⁻¹X′V⁻¹)Y
  = RY
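The residual-forming matrix R can be built and applied in a few lines. A sketch with assumed sizes, using V = I for simplicity (any positive-definite V works the same way):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
V = np.eye(n)                       # i.i.d. case for illustration
Y = rng.standard_normal(n)

# Residual-forming matrix: R = I - X (X' V^{-1} X)^{-1} X' V^{-1}
Vinv = np.linalg.inv(V)
R = np.eye(n) - X @ np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv)

r = R @ Y                           # residuals in a single matrix multiply
```

Note that R is idempotent: applying it twice gives the same result as applying it once.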

Estimating the Variance
- Even if we assume ε is i.i.d., we still need to estimate the residual variance, σ².
- Our estimate:

  σ̂² = rᵀr / tr(RV)

- For OLS:

  σ̂² = rᵀr / (N − p)

- Estimating V ≠ I is more difficult.
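The reduction of tr(RV) to N − p in the OLS case can be checked numerically. A sketch with assumed toy dimensions:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 60, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])  # p columns
Y = rng.standard_normal(n)

# OLS residual-forming matrix (V = I)
R = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
r = R @ Y

# General estimator r'r / tr(RV); with V = I, tr(R) = N - p
sigma2_hat = r @ r / np.trace(R)
```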

End of Module
