Chapter 8 (Sec. 8.1, Sec. 8.3, Sec. 8.4)
Linear Algebra: Matrix Eigenvalue Problems
EXAMPLE 1  Determination of Eigenvalues and Eigenvectors
Find the eigenvalues and eigenvectors of the matrix

A = [ −5    2 ]
    [  2   −2 ].
Solution.
(a) Eigenvalues. These must be determined first.
Equation (1) is
Equation (1) is

Ax = [ −5    2 ] [ x1 ]  =  λ [ x1 ] ;
     [  2   −2 ] [ x2 ]       [ x2 ]

in components,

−5x1 + 2x2 = λx1
 2x1 − 2x2 = λx2.
Transferring the terms on the right to the left, we get
(2*)    (−5 − λ)x1 +         2x2 = 0
               2x1 + (−2 − λ)x2 = 0.
This can be written in matrix notation
(3*)    (A − λI)x = 0.
Because (1) is Ax − λx = Ax − λIx = (A − λI)x = 0,
which gives (3*).
We see that this is a homogeneous linear system. It has a
nontrivial solution (an eigenvector of A we are looking for)
if and only if its coefficient determinant is zero, that is,
(4*)   D(λ) = det(A − λI) = | −5 − λ       2    |
                            |    2      −2 − λ  |
                          = (−5 − λ)(−2 − λ) − 4 = λ² + 7λ + 6 = 0.
We call D(λ) the characteristic determinant or, if expanded,
the characteristic polynomial, and D(λ) = 0 the
characteristic equation of A. The solutions of this
quadratic equation are λ1 = −1 and λ2 = −6. These are the
eigenvalues of A.
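As a quick numerical cross-check (not part of the original example), the characteristic polynomial λ² + 7λ + 6 and its roots can be reproduced with NumPy's np.poly and np.roots, using the matrix A of Example 1:

import numpy as np

A = np.array([[-5.0, 2.0],
              [2.0, -2.0]])        # matrix of Example 1

coeffs = np.poly(A)                # coefficients of det(lambda*I - A)
print(coeffs)                      # -> [1. 7. 6.], i.e. lambda^2 + 7*lambda + 6
print(np.roots(coeffs))            # -> the eigenvalues -6 and -1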
(b1) Eigenvector of A corresponding to λ1. This vector is
obtained from (2*) with λ = λ1 = −1, that is,
−4x1 + 2x2 = 0
 2x1 −  x2 = 0.
A solution is x2 = 2x1, as we see from either of the two
equations, so that we need only one of them. This
determines an eigenvector corresponding to λ1 = −1 up to a
scalar multiple. If we choose x1 = 1, we obtain the
eigenvector
x1 = [ 1 ] ,     Check:   Ax1 = [ −5    2 ] [ 1 ]  =  [ −1 ]  =  (−1)x1  =  λ1x1 .
     [ 2 ]                      [  2   −2 ] [ 2 ]     [ −2 ]
(b2) Eigenvector of A corresponding to λ2.
For λ = λ2 = −6, equation (2*) becomes
 x1 + 2x2 = 0
2x1 + 4x2 = 0.
A solution is x2 = −x1/2 with arbitrary x1. If we choose x1 = 2,
we get x2 = −1. Thus an eigenvector of A corresponding to
λ2 = −6 is
x2 = [  2 ] ,    Check:   Ax2 = [ −5    2 ] [  2 ]  =  [ −12 ]  =  (−6)x2  =  λ2x2 .
     [ −1 ]                     [  2   −2 ] [ −1 ]     [   6 ]
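The whole example can also be checked at once with np.linalg.eig, which returns the eigenvalues together with eigenvectors scaled to unit length, so its columns are scalar multiples of the vectors x1 = (1, 2)ᵀ and x2 = (2, −1)ᵀ found above. A minimal sketch:

import numpy as np

A = np.array([[-5.0, 2.0],
              [2.0, -2.0]])

vals, vecs = np.linalg.eig(A)      # columns of vecs are eigenvectors of A
print(vals)                        # -1 and -6 (in some order)
print(vecs)                        # columns proportional to (1, 2) and (2, -1)

# Verify A x = lambda x for every eigenpair
for lam, x in zip(vals, vecs.T):
    assert np.allclose(A @ x, lam * x)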
This example illustrates the general case. Written out in components, equation (1), Ax = λx, is

a11x1 + ⋯ + a1nxn = λx1
a21x1 + ⋯ + a2nxn = λx2
  ⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯
an1x1 + ⋯ + annxn = λxn.
Transferring the terms on the right side to the left side, we
have
(2)    (a11 − λ)x1 +        a12x2 + ⋯ +        a1nxn = 0
              a21x1 + (a22 − λ)x2 + ⋯ +        a2nxn = 0
         ⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯⋯
              an1x1 +        an2x2 + ⋯ + (ann − λ)xn = 0.
In matrix notation,
(3)    (A − λI)x = 0.
By Cramer’s theorem in Sec. 7.7, this homogeneous linear
system of equations has a nontrivial solution if and only if
the corresponding determinant of the coefficients is zero:
                              | a11 − λ     a12      ⋯      a1n    |
(4)    D(λ) = det(A − λI) =   |   a21     a22 − λ    ⋯      a2n    |  = 0.
                              |    ⋯         ⋯                ⋯    |
                              |   an1       an2      ⋯    ann − λ  |
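For an n × n matrix, D(λ) is a polynomial of degree n in λ, the characteristic polynomial. As an illustration with an arbitrary 3 × 3 matrix (not from the text), np.poly returns the coefficients of det(λI − A), which differs from det(A − λI) only by the factor (−1)ⁿ and hence has the same roots:

import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])    # arbitrary illustration matrix

coeffs = np.poly(A)                # det(lambda*I - A) = lambda^3 - 7*lambda^2 + 15*lambda - 9
print(coeffs)                      # -> [ 1. -7. 15. -9.]
print(np.roots(coeffs))            # -> approximately 3, 3, 1
print(np.linalg.eigvals(A))        # same eigenvalues, computed directly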
Theorem 1
Eigenvalues
The eigenvalues of a square matrix A are the roots of the
characteristic equation (4) of A.
Hence an n × n matrix has at least one eigenvalue and at most n
numerically different eigenvalues.
Theorem 2
Eigenvectors, Eigenspace
If w and x are eigenvectors of a matrix A corresponding to the
same eigenvalue λ, so are w + x (provided x ≠ −w) and kx for
any k ≠ 0.
Hence the eigenvectors corresponding to one and the same
eigenvalue λ of A, together with 0, form a vector space, called the
eigenspace of A corresponding to that λ.
For instance, the matrix

A = [ −2    2   −3 ]
    [  2    1   −6 ]
    [ −1   −2    0 ]

has the eigenvalues 5 and −3; the eigenvalue −3 has a two-dimensional eigenspace.
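Assuming the minus signs reconstructed above, this matrix also illustrates Theorem 2: λ = −3 is a double eigenvalue whose eigenspace is two-dimensional, so every linear combination of two independent eigenvectors for −3 is again an eigenvector. A sketch of the check:

import numpy as np

A = np.array([[-2.0, 2.0, -3.0],
              [2.0, 1.0, -6.0],
              [-1.0, -2.0, 0.0]])

print(np.linalg.eigvals(A))                        # approximately 5, -3, -3

# Eigenspace of lambda = -3: A + 3I has rank 1, so its null space has dimension 2
print(np.linalg.matrix_rank(A + 3 * np.eye(3)))    # -> 1

u = np.array([-2.0, 1.0, 0.0])                     # satisfies (A + 3I)u = 0
v = np.array([3.0, 0.0, 1.0])                      # satisfies (A + 3I)v = 0
w = 2 * u + 5 * v                                  # any combination of u and v ...
assert np.allclose(A @ w, -3 * w)                  # ... is again an eigenvector for -3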
Theorem 3
Eigenvalues of the Transpose
The transpose Aᵀ of a square matrix A has the same eigenvalues as A.
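This follows because det(Aᵀ − λI) = det((A − λI)ᵀ) = det(A − λI), so A and Aᵀ have the same characteristic polynomial. A short numerical illustration with an arbitrary matrix:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])                 # arbitrary example matrix
print(np.sort(np.linalg.eigvals(A)))       # approximately -0.372 and 5.372
print(np.sort(np.linalg.eigvals(A.T)))     # the same two values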
Definitions
Symmetric, Skew-Symmetric, and Orthogonal Matrices
A real square matrix A = [ajk] is called
symmetric if transposition leaves it unchanged,
(1)    Aᵀ = A,    thus  akj = ajk,
skew-symmetric if transposition gives the negative of A,
(2)    Aᵀ = −A,    thus  akj = −ajk,
orthogonal if transposition gives the inverse of A,
(3)    Aᵀ = A⁻¹.
Theorem 1
Eigenvalues of Symmetric
and Skew-Symmetric Matrices
(a) The eigenvalues of a symmetric matrix are real.
(b) The eigenvalues of a skew-symmetric matrix are pure
imaginary or zero.
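A small numerical illustration of Theorem 1, with one arbitrary symmetric and one arbitrary skew-symmetric matrix (neither taken from the text):

import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])         # symmetric: S^T = S
K = np.array([[0.0, 3.0],
              [-3.0, 0.0]])        # skew-symmetric: K^T = -K

print(np.linalg.eigvals(S))        # real eigenvalues 1 and 3
print(np.linalg.eigvals(K))        # pure imaginary eigenvalues 3i and -3i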
Orthogonal Transformations
and Orthogonal Matrices
Orthogonal transformations are transformations
(5) y = Ax where A is an orthogonal matrix.
Theorem 3
Orthonormality of Column and Row Vectors
A real square matrix is orthogonal if and only if its column vectors
a1, … , an (and also its row vectors) form an orthonormal system,
that is,
(10)    aj • ak = ajᵀ ak =   0   if  j ≠ k
                             1   if  j = k.
Theorem 4
Determinant of an Orthogonal Matrix
The determinant of an orthogonal matrix has the value +1 or −1.
Theorem 5
Eigenvalues of an Orthogonal Matrix
The eigenvalues of an orthogonal matrix A
are real or complex conjugates in pairs and
have absolute value 1.
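Theorems 3–5 can all be seen at once on a plane rotation matrix, the standard example of an orthogonal matrix (the angle below is arbitrary): its columns are orthonormal, its determinant is +1, and its eigenvalues cos θ ± i sin θ are complex conjugates of absolute value 1.

import numpy as np

theta = 0.4                                # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(A.T @ A, np.eye(2)))     # True: A^T = A^(-1), columns orthonormal
print(np.linalg.det(A))                    # 1.0
print(np.abs(np.linalg.eigvals(A)))        # [1. 1.]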
Theorem 1
Basis of Eigenvectors
If an n × n matrix A has n distinct eigenvalues, then A has a basis of eigenvectors x1, … , xn for Rⁿ.
Theorem 2
Symmetric Matrices
A symmetric matrix has an orthonormal basis of eigenvectors for Rⁿ.
DEFINITION
Similar Matrices. Similarity Transformation
An n × n matrix Â is called similar to an n × n matrix A if Â = P⁻¹AP for some nonsingular n × n matrix P. This transformation, which gives Â from A, is called a similarity transformation.
Theorem 3
Eigenvalues and Eigenvectors of Similar Matrices
If Â is similar to A, then Â has the same eigenvalues as A. Furthermore, if x is an eigenvector of A, then y = P⁻¹x is an eigenvector of Â corresponding to the same eigenvalue.
Theorem 4
Diagonalization of a Matrix
If an n × n matrix A has a basis of eigenvectors, then D = X⁻¹AX is diagonal, with the eigenvalues of A as the entries on the main diagonal. Here X is the matrix with these eigenvectors as column vectors. Also, Dᵐ = X⁻¹AᵐX  (m = 2, 3, …).
EXAMPLE 4  Diagonalization
Diagonalize

A = [   7.3    0.2   −3.7 ]
    [ −11.5    1.0    5.5 ]
    [  17.7    1.8   −9.3 ] .

Solution.
• The characteristic determinant gives the characteristic
equation −λ³ − λ² + 12λ = 0. The roots (eigenvalues of A)
are λ1 = 3, λ2 = −4, λ3 = 0.
• By Gauss elimination applied to (A − λI)x = 0 with λ = λ1, λ2, λ3 we find eigenvectors and take them as the columns of X; we then compute X⁻¹ by Gauss–Jordan elimination.
The results are
x1 = [ −1 ] ,   x2 = [  1 ] ,   x3 = [ 2 ] ,   X = [ −1    1    2 ]
     [  3 ]          [ −1 ]          [ 1 ]         [  3   −1    1 ]
     [ −1 ]          [  3 ]          [ 4 ]         [ −1    3    4 ] ,

D = X⁻¹AX = X⁻¹A[x1  x2  x3] = X⁻¹[λ1x1  λ2x2  λ3x3]

  = [ −0.7    0.2    0.3 ] [ −3    −4     0 ]     [ 3    0    0 ]
    [ −1.3   −0.2    0.7 ] [  9     4     0 ]  =  [ 0   −4    0 ] .
    [  0.8    0.2   −0.2 ] [ −3   −12     0 ]     [ 0    0    0 ]
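The arithmetic of Example 4 is easy to confirm numerically, using A and X as written above; a minimal sketch:

import numpy as np

A = np.array([[7.3, 0.2, -3.7],
              [-11.5, 1.0, 5.5],
              [17.7, 1.8, -9.3]])
X = np.array([[-1.0, 1.0, 2.0],
              [3.0, -1.0, 1.0],
              [-1.0, 3.0, 4.0]])           # eigenvectors of A as columns

D = np.linalg.inv(X) @ A @ X
print(np.round(D, 10))                     # diag(3, -4, 0)
print(np.linalg.eigvals(A))                # 3, -4, 0 (in some order)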
A quadratic form Q in the components x1, … , xn of a vector x is

Q = xᵀAx = Σⱼ Σₖ ajk xj xk     (j, k = 1, … , n),

where the coefficient matrix A = [ajk] may be taken to be symmetric.
Theorem 5
Principal Axes Theorem
The substitution x = Xy transforms a quadratic form Q = xᵀAx with symmetric coefficient matrix A into the principal axes form
Q = λ1y1² + λ2y2² + ⋯ + λnyn²,
where λ1, … , λn are the (not necessarily distinct) eigenvalues of A and X is an orthogonal matrix whose column vectors are corresponding orthonormal eigenvectors of A.
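A sketch of the principal axes transformation in NumPy, with a symmetric matrix chosen only for illustration: np.linalg.eigh returns the eigenvalues together with an orthogonal matrix X of orthonormal eigenvectors, and the substitution x = Xy turns xᵀAx into the pure sum of squares λ1y1² + λ2y2².

import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])                 # symmetric coefficient matrix (illustration only)

lam, X = np.linalg.eigh(A)                 # eigenvalues 2, 4 and orthogonal X
y = np.array([0.7, -1.2])                  # any y; the substitution is x = X y
x = X @ y

Q_x = x @ A @ x                            # quadratic form in the original variables
Q_y = lam[0] * y[0]**2 + lam[1] * y[1]**2  # principal axes form
print(np.isclose(Q_x, Q_y))                # True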
Notations
Ā = [ājk] is obtained from A = [ajk] by replacing each entry ajk = α + iβ by its complex conjugate ājk = α − iβ; Āᵀ = [ākj] is the conjugate transpose of A.
DEFINITION
Hermitian, Skew-Hermitian, and Unitary Matrices
A square matrix A = [ajk] is called
Hermitian if Āᵀ = A, that is, ākj = ajk,
skew-Hermitian if Āᵀ = −A, that is, ākj = −ajk,
unitary if Āᵀ = A⁻¹.
Eigenvalues
It is quite remarkable that the matrices under
consideration have spectra (sets of eigenvalues; see Sec.
8.1) that can be characterized in a general way as follows
(see Fig. 163).
Theorem 1
Eigenvalues
(a) The eigenvalues of a Hermitian matrix (and thus of a symmetric matrix) are real.
(b) The eigenvalues of a skew-Hermitian matrix (and thus of a skew-symmetric matrix) are pure imaginary or zero.
(c) The eigenvalues of a unitary matrix (and thus of an orthogonal matrix) have absolute value 1.
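A small numerical illustration of Theorem 1, with arbitrary 2 × 2 matrices of each kind (none of them from the text):

import numpy as np

H = np.array([[4.0, 1 - 3j],
              [1 + 3j, 7.0]])              # Hermitian
S = np.array([[3j, 2 + 1j],
              [-2 + 1j, -1j]])             # skew-Hermitian
U = np.array([[1j, 0.0],
              [0.0, -1j]])                 # unitary

print(np.linalg.eigvals(H))                # real: 2 and 9
print(np.linalg.eigvals(S))                # pure imaginary: 4i and -2i
print(np.abs(np.linalg.eigvals(U)))        # absolute value 1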
Theorem 2
Invariance of Inner Product
A unitary transformation, that is, y = Ax with a unitary matrix A, preserves the value of the inner product and hence also the norm.
DEFINITION
Unitary System
A unitary system is a set of complex vectors satisfying
aj • ak = ājᵀ ak =   0   if  j ≠ k
                     1   if  j = k.
Theorem 4
Determinant of a Unitary Matrix
The determinant of a unitary matrix has absolute value 1, that is, |det A| = 1.
Theorem 5
Basis of Eigenvectors
A Hermitian, skew-Hermitian, or unitary matrix has a basis of eigenvectors for Cⁿ that is a unitary system.
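For a Hermitian matrix this can be checked directly: np.linalg.eigh returns eigenvectors that already form a unitary system, that is, the matrix V of eigenvectors satisfies (conjugate transpose of V) V = I. A sketch with the same illustrative Hermitian matrix as above:

import numpy as np

H = np.array([[4.0, 1 - 3j],
              [1 + 3j, 7.0]])              # Hermitian illustration matrix

vals, V = np.linalg.eigh(H)                # columns of V: orthonormal eigenvectors
print(vals)                                # real eigenvalues 2 and 9
print(np.allclose(np.conj(V).T @ V, np.eye(2)))   # True: the columns form a unitary system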