Lecture 9 PRINCIPAL COMPONENTS
Theorem: Let $\Sigma$ be the covariance matrix associated with the random vector $X' = [X_1, X_2, \ldots, X_p]$. Let $\Sigma$ have the eigenvalue-eigenvector pairs $(\lambda_1, e_1), (\lambda_2, e_2), \ldots, (\lambda_p, e_p)$, where $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0$. Then the $i$th principal component is given by

$$Y_i = e_i' X = e_{i1} X_1 + e_{i2} X_2 + \cdots + e_{ip} X_p, \qquad i = 1, 2, \ldots, p$$

With these choices,

$$\left.\begin{aligned}
\operatorname{Var}(Y_i) &= e_i' \Sigma e_i = \lambda_i, & i &= 1, 2, \ldots, p \\
\operatorname{Cov}(Y_i, Y_k) &= e_i' \Sigma e_k = 0, & i &\neq k
\end{aligned}\right\} \qquad (4)$$

If some $\lambda_i$ are equal, the choices of the corresponding coefficient vectors $e_i$, and hence $Y_i$, are not unique.
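The theorem translates directly into an eigendecomposition of $\Sigma$. The following is a minimal sketch in Python/NumPy, not part of the original notes; the matrix `Sigma` here is an assumed example, chosen only because its eigenvalues are $3 + 2\sqrt{2} \approx 5.83$, $2.00$, and $3 - 2\sqrt{2} \approx 0.17$, matching the numbers quoted in the text below.

```python
import numpy as np

# Assumed example covariance matrix (not given in this excerpt);
# its eigenvalues are approximately 5.83, 2.00, and 0.17, and they
# sum to trace(Sigma) = 8.
Sigma = np.array([[ 1.0, -2.0, 0.0],
                  [-2.0,  5.0, 0.0],
                  [ 0.0,  0.0, 2.0]])

# Eigendecomposition of the symmetric matrix Sigma.
# np.linalg.eigh returns eigenvalues in ascending order, so reverse
# both outputs to get lambda_1 >= lambda_2 >= ... >= lambda_p.
lam, E = np.linalg.eigh(Sigma)
lam, E = lam[::-1], E[:, ::-1]      # columns of E are e_1, ..., e_p

# By (4), Var(Y_i) = e_i' Sigma e_i = lambda_i and Cov(Y_i, Y_k) = 0
# for i != k, so E' Sigma E should equal diag(lambda_1, ..., lambda_p).
cov_Y = E.T @ Sigma @ E
print(np.round(lam, 2))             # [5.83 2.   0.17]
print(np.round(cov_Y, 2))           # diagonal matrix of the eigenvalues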
The proportion of total variance accounted for by the first principal component is

$$\frac{\lambda_1}{\lambda_1 + \lambda_2 + \lambda_3} = \frac{5.83}{8} = 0.73$$
Further, the first two components account for a proportion $(5.83 + 2.00)/8 = 0.98$ of the population variance. In this case, the components $Y_1$ and $Y_2$ could replace the original three variables with little loss of information.
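Continuing the sketch above (same assumed `Sigma` and the eigenvalues `lam` computed there), the per-component and cumulative proportions of variance follow directly:

```python
# Proportion of total variance explained by each component,
# and the cumulative proportions.
prop = lam / lam.sum()
print(np.round(prop, 2))                # [0.73 0.25 0.02]
print(np.round(np.cumsum(prop), 2))     # [0.73 0.98 1.  ]
```

The cumulative value 0.98 at the second entry is exactly the "first two components account for .98 of the population variance" claim in the text.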
Next, we obtain