Eigenvalues and Eigenvectors
Ari Fadli
Teknik Elektro Fakultas Teknik
Universitas Jenderal Soedirman
• Terminology
• Properties
• Sample Problem
• Eigenface
Solving the equation |A - λI_n| = 0 for λ yields all the eigenvalues of A. Expanding the determinant
|A - λI_n| gives a polynomial in λ. This polynomial is called the characteristic polynomial of A. The
equation |A - λI_n| = 0 is called the characteristic equation of A.
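As a quick numerical check of this definition, NumPy can compute the coefficients of the characteristic polynomial and its roots; the 2×2 matrix below is made up for illustration and does not come from the slides:

```python
import numpy as np

# Illustrative matrix (not from the text)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of det(lambda*I - A), highest power first:
# lambda^2 - 7*lambda + 10
coeffs = np.poly(A)
print('characteristic polynomial:', coeffs)

# The roots of the characteristic polynomial are the eigenvalues
print('eigenvalues:', np.roots(coeffs))  # 5 and 2
```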
If x1 and x2 are eigenvectors of A corresponding to the same eigenvalue λ, then Ax1 = λx1 and
Ax2 = λx2, so

    A(x1 + x2) = Ax1 + Ax2 = λx1 + λx2 = λ(x1 + x2)

that is, x1 + x2 is again an eigenvector for λ.
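Numerically, this closure property is easy to see; the diagonal matrix below is a made-up example whose eigenvalue 3 has multiplicity two:

```python
import numpy as np

# Hypothetical matrix with eigenvalue 3 of multiplicity 2
A = np.diag([3.0, 3.0, 7.0])
x1 = np.array([1.0, 0.0, 0.0])  # A @ x1 == 3 * x1
x2 = np.array([0.0, 1.0, 0.0])  # A @ x2 == 3 * x2

# The sum is again an eigenvector for the same eigenvalue
print(A @ (x1 + x2))   # [3. 3. 0.]
print(3 * (x1 + x2))   # [3. 3. 0.]
```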
Property 2: The eigenvalues of an upper (or lower) triangular matrix are the elements on the main
diagonal.
    A = | 1  0  0  0 |
        | 0  2  0  0 |        λ = 1, 2, 3, 4
        | 0  0  3  0 |
        | 0  0  0  4 |
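This can be confirmed numerically; in the sketch below the off-diagonal entries are chosen arbitrarily, since only the diagonal determines the eigenvalues of a triangular matrix:

```python
import numpy as np
from numpy.linalg import eigvals

# Upper triangular matrix; off-diagonal values are arbitrary
A = np.array([[1.0, 5.0, 2.0, 8.0],
              [0.0, 2.0, 7.0, 1.0],
              [0.0, 0.0, 3.0, 4.0],
              [0.0, 0.0, 0.0, 4.0]])

# The eigenvalues match the diagonal entries
print(np.sort(eigvals(A).real))  # approximately [1. 2. 3. 4.]
```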
Property 6: The eigenvalues of a triangular matrix are the entries on its main
diagonal.
Property 7: The scalar λ is an eigenvalue of A if and only if the equation (A - λI)x = 0 has a nontrivial
solution, that is, if and only if the equation has a free variable.
Property 9: If n × n matrices A and B are similar, then they have the same characteristic polynomial and
hence the same eigenvalues (with the same multiplicities).
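A small check of this property; the matrix A and the invertible change-of-basis matrix P below are both made up for illustration:

```python
import numpy as np
from numpy.linalg import eig, inv

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])   # invertible: det(P) = 1

# B = P^{-1} A P is similar to A
B = inv(P) @ A @ P

# Similar matrices have the same eigenvalues
print(np.sort(eig(A)[0].real))  # approximately [2. 3.]
print(np.sort(eig(B)[0].real))  # approximately [2. 3.]
```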
For λ = 2 we solve (A - 2I)x = 0:

    | -6  -6 | | x1 |   | 0 |
    |  3   3 | | x2 | = | 0 |

This leads to the system of equations

    -6x1 - 6x2 = 0
     3x1 + 3x2 = 0

giving x1 = -x2. The solutions to this system of equations are x1 = -r, x2 = r, where r is a
scalar. Thus the eigenvectors of A corresponding to λ = 2 are the nonzero vectors of the
form

    v1 = | x1 | = | -r | = r | -1 |
         | x2 |   |  r |     |  1 |
For λ = -1 we solve (A + I)x = 0:

    | -3  -6 | | x1 |   | 0 |
    |  3   6 | | x2 | = | 0 |

This leads to the system of equations

    -3x1 - 6x2 = 0
     3x1 + 6x2 = 0

Thus x1 = -2x2. The solutions to this system of equations are x1 = -2s and x2 = s, where s is a
scalar. Thus the eigenvectors of A corresponding to λ = -1 are the nonzero vectors of the form

    v2 = | x1 | = | -2s | = s | -2 |
         | x2 |   |   s |     |  1 |
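The two linear systems above are consistent with A = [[-4, -6], [3, 5]] (a reconstruction, since the slides do not show A itself); assuming that matrix, both eigenpairs check out numerically:

```python
import numpy as np

# A reconstructed from the two systems above (assumption)
A = np.array([[-4.0, -6.0],
              [ 3.0,  5.0]])

v1 = np.array([-1.0, 1.0])   # claimed eigenvector for lambda = 2
v2 = np.array([-2.0, 1.0])   # claimed eigenvector for lambda = -1

print(A @ v1, 2 * v1)    # [-2.  2.] [-2.  2.]
print(A @ v2, -1 * v2)   # [ 2. -1.] [ 2. -1.]
```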
Email: [email protected] Teknik Elektro Unsoed: AGREE
Sample Problem
# Example 1: eigenvalues and eigenvectors of small matrices
import numpy as np
from numpy.linalg import eig

a = np.array([[0, 2],
              [2, 3]])
w, v = eig(a)
print('E-value:', w)
print('E-vector', v)

a = np.array([[2, 2, 4],
              [1, 3, 5],
              [2, 3, 4]])
w, v = eig(a)
print('E-value:', w)
print('E-vector', v)

# Example 2: eigendecomposition of a covariance matrix
import numpy as np
from numpy.linalg import eig
from sklearn.preprocessing import StandardScaler

students = np.array([[85.4, 5],
                     [82.3, 6],
                     [97, 7],
                     [96.5, 6.5]])
sc = StandardScaler()
students_scaled = sc.fit_transform(students)
cov_matrix = np.cov(students_scaled.T)
eigenvalues, eigenvectors = eig(cov_matrix)

# Check the defining relation A v = lambda v for the first eigenpair
print(cov_matrix.dot(eigenvectors[:, 0]))
print(eigenvalues[0] * eigenvectors[:, 0])
Consider the matrix

    A = | 5  4  2 |
        | 4  5  2 |
        | 2  2  2 |

so that

    A - λI_3 = | 5-λ   4     2   |
               | 4     5-λ   2   |
               | 2     2     2-λ |
The characteristic polynomial of A is |A - λI_3|. Using row and column operations to simplify
determinants, we get

    |A - λI_3| = | 5-λ   4     2   |
                 | 4     5-λ   2   |
                 | 2     2     2-λ |

Subtracting row 2 from row 1 (R1 ← R1 - R2):

               = | 1-λ   λ-1   0   |
                 | 4     5-λ   2   |
                 | 2     2     2-λ |

Adding column 1 to column 2 (C2 ← C2 + C1):

               = | 1-λ   0     0   |
                 | 4     9-λ   2   |
                 | 2     4     2-λ |

Expanding along the first row:

               = (1-λ)[(9-λ)(2-λ) - 8]
               = (1-λ)(λ-1)(λ-10)

Setting |A - λI_3| = 0 gives the eigenvalues λ1 = 10 and λ2 = 1 (a repeated root).
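Reading this example's matrix as A = [[5, 4, 2], [4, 5, 2], [2, 2, 2]], the factorization can be cross-checked with NumPy:

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

# det(lambda*I - A) = lambda^3 - 12 lambda^2 + 21 lambda - 10
#                   = (lambda - 10)(lambda - 1)^2
coeffs = np.poly(A)
print(coeffs)  # approximately [1, -12, 21, -10]

# Roots: eigenvalue 10 once, eigenvalue 1 twice
print(np.sort(np.roots(coeffs)))  # approximately [1, 1, 10]
```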
For λ1 = 10 we solve (A - 10I_3)x = 0:

    | -5   4   2 | | x1 |   | 0 |
    |  4  -5   2 | | x2 | = | 0 |
    |  2   2  -8 | | x3 |   | 0 |

The solutions to this system of equations are x1 = 2r, x2 = 2r, and x3 = r, where r is a scalar.
Thus the eigenspace of λ1 = 10 is the one-dimensional space of vectors of the form

      | 2 |
    r | 2 |
      | 1 |

For λ2 = 1 we solve (A - I_3)x = 0:

    | 4  4  2 | | x1 |   | 0 |
    | 4  4  2 | | x2 | = | 0 |
    | 2  2  1 | | x3 |   | 0 |

The solutions to this system of equations can be shown to be x1 = -s - t, x2 = s, and x3 = 2t,
where s and t are scalars. Thus the eigenspace of λ2 = 1 is the space of vectors of the form

    | -s-t |       | -1 |       | -1 |
    |  s   |  =  s |  1 |  +  t |  0 |
    |  2t  |       |  0 |       |  2 |
Thus the eigenspace of λ = 1 is a two-dimensional subspace of R^3 with basis

    | -1 |   | -1 |
    |  1 | , |  0 |
    |  0 |   |  2 |
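Both basis vectors can be checked directly against the defining relation Av = λv with λ = 1, using A = [[5, 4, 2], [4, 5, 2], [2, 2, 2]] from this example:

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

# Basis of the eigenspace for lambda = 1, as derived above
b1 = np.array([-1.0, 1.0, 0.0])
b2 = np.array([-1.0, 0.0, 2.0])

print(A @ b1)  # [-1.  1.  0.] == 1 * b1
print(A @ b2)  # [-1.  0.  2.] == 1 * b2
```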
If an eigenvalue occurs as a root of the characteristic equation repeated k times, we say that
it is of multiplicity k.
• One of the simplest and most effective PCA approaches used in face recognition systems is the so-called eigenface
approach.
• Eigenfaces is the name given to a set of eigenvectors when they are used in the computer vision problem of
human face recognition. The eigenvectors are derived from the covariance matrix of the probability
distribution over the high-dimensional vector space of face images. The eigenfaces themselves form a basis set of
all images used to construct the covariance matrix. This produces dimension reduction by allowing the smaller set
of basis images to represent the original training images. Classification can be achieved by comparing how faces
are represented by the basis set.
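A minimal sketch of the eigenface computation described above, using random arrays in place of real face images (all shapes and names here are illustrative):

```python
import numpy as np
from numpy.linalg import eigh

# Synthetic data: 20 "face images", each flattened to 64 pixels
rng = np.random.default_rng(0)
faces = rng.random((20, 64))

# Center the images around the mean face
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Covariance matrix over the pixel dimensions and its eigenvectors
# (eigh is used because a covariance matrix is symmetric)
cov = np.cov(centered.T)
eigenvalues, eigenvectors = eigh(cov)

# Keep the eigenvectors with the largest eigenvalues: the "eigenfaces"
order = np.argsort(eigenvalues)[::-1]
eigenfaces = eigenvectors[:, order[:5]]

# Each face is now represented by 5 coefficients instead of 64 pixels;
# classification compares these coefficient vectors
weights = centered @ eigenfaces
print(weights.shape)  # (20, 5)
```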