Eigenvalues and Eigenvectors 2 Lectures
Department of Mathematics
Indian Institute of Technology Kharagpur
Definition
Let V be a vector space over a field F and let T : V → V be a linear operator.
1 A nonzero vector v ∈ V is called an eigenvector or characteristic vector of T if T(v) = λv for some scalar λ ∈ F.
2 A scalar λ is called an eigenvalue or characteristic value of T if
there exists a non-zero vector v ∈ V such that T(v) = λv.
In this case, v is called an eigenvector of T corresponding to the
eigenvalue λ.
3 The set of all the eigenvalues of T is called the spectrum of T.
Definition
Let A be an n × n matrix over a field F (e.g., F = R or C).
1 A nonzero column vector v ∈ Fn is called an eigenvector or
characteristic vector of A if Av = λv for some λ ∈ F.
2 A scalar λ is called an eigenvalue or characteristic value of A if
there exists a nonzero column vector v ∈ Fn such that Av = λv.
When F = R, there is a geometric interpretation to this notion.
[Figure: in R², an eigenvector v and its images λv lie on the same line through the origin; λ > 0 preserves the direction of v, while λ < 0 reverses it.]
Dr. Dipankar Ghosh (IIT Kharagpur) Eigenvalues and eigenvectors
Example 1: eigenvalues and eigenvectors of stretching
Let c ∈ R. Define T : R2 −→ R2 by
T(x, y) = c(x, y) = (cx, cy) for every (x, y) ∈ R².

The matrix representation: [T] = [ c 0 ; 0 c ].

[Figure: each point (x, y) is stretched by the factor c along the line joining it to the origin.]
Every v ≠ 0 in R² is an eigenvector of T with the eigenvalue c.
Example 2: reflection about the x-axis

Define T : R² → R² by T(x, y) = (x, −y), the mirror image of (x, y) in the x-axis.

[Figure: the point (x, y) is reflected across the x-axis to (x, −y).]

For x ≠ 0, (x, 0) is an eigenvector of T with eigenvalue 1.
For y ≠ 0, (0, y) is an eigenvector of T with eigenvalue −1.
These are ALL the eigenvectors of T. (Verify it!)
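The reflection example is easy to check numerically. The sketch below (numpy is an assumption here; the slides do everything by hand) represents T by its matrix and verifies both eigenvalue claims.

```python
import numpy as np

# Matrix of the reflection T(x, y) = (x, -y) in the standard basis.
T = np.array([[1.0, 0.0],
              [0.0, -1.0]])

# (x, 0) is fixed by T (eigenvalue 1); (0, y) is flipped (eigenvalue -1).
assert np.allclose(T @ np.array([3.0, 0.0]), 1.0 * np.array([3.0, 0.0]))
assert np.allclose(T @ np.array([0.0, 2.0]), -1.0 * np.array([0.0, 2.0]))

# The spectrum of T is exactly {1, -1}.
eigenvalues = np.linalg.eigvals(T)
assert np.allclose(np.sort(eigenvalues), [-1.0, 1.0])
```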
Consider the matrix A = [ 0 1 ; −1 0 ] over C, the set of complex numbers.
Does A have eigenvalues and eigenvectors over C? Ans. Yes.
Note that λ² + 1 = 0 has solutions ±i ∈ C.
Then, for each λ = ±i, in view of the previous slide, one should solve the system
[ −λ 1 ; −1 −λ ] [ x ; y ] = [ 0 ; 0 ]
to get
[ 0 1 ; −1 0 ] [ i ; −1 ] = i [ i ; −1 ] and [ 0 1 ; −1 0 ] [ i ; 1 ] = −i [ i ; 1 ].
Conclusion: The matrix A has eigenvalues and eigenvectors
over C, but not over R.
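A short numerical check of the same computation (numpy is an illustration, not part of the slides): numpy works over C, so it recovers the eigenvalues ±i that do not exist over R.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# Over C the characteristic polynomial x^2 + 1 has the roots +-i.
eigenvalues, eigenvectors = np.linalg.eig(A)
assert np.allclose(np.sort_complex(eigenvalues), [-1j, 1j])

# Each column of `eigenvectors` satisfies A v = lambda v.
for k in range(2):
    lam, v = eigenvalues[k], eigenvectors[:, k]
    assert np.allclose(A @ v, lam * v)
```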
[Figure: multiplication by A rotates the point (x, y) through an angle θ to T(x, y); no nonzero real vector is mapped to a real multiple of itself, so A has no real eigenvectors.]
Lemma
The following statements are equivalent:
1 λ ∈ C is an eigenvalue of A.
2 det(λIn − A) = 0.
Definition
1 The characteristic polynomial of A, denoted by pA (x), is the
polynomial defined by pA (x) := det(xIn − A).
Theorem
Let A be an n × n matrix over C. For every eigenvalue λ of A, we have
1 1 ≤ AM_A(λ) ≤ n and 1 ≤ GM_A(λ) ≤ n.
2 Σ_{i=1}^{r} AM_A(λ_i) = n, where the sum varies over all the distinct eigenvalues λ_1, . . . , λ_r of A.
3 GM_A(λ) ≤ AM_A(λ).
Proof.
From the facts deg(p_A(x)) = n and p_A(x) = (x − λ)^{AM_A(λ)} f(x) for some polynomial f(x), one concludes that 1 ≤ AM_A(λ) ≤ n.
Since GM_A(λ) = dim(Null(A − λI_n)), we have 1 ≤ GM_A(λ) ≤ n.
The statement (2) follows from p_A(x) = ∏_{i=1}^{r} (x − λ_i)^{AM_A(λ_i)} and deg(p_A(x)) = n.
We will skip the proof of (3).
General process:
First compute the characteristic polynomial of A:
pA (x) = det(xIn − A).
Next compute the roots of p_A(x) by factorizing it into linear factors; this gives the eigenvalues (and their algebraic multiplicities).
Then, for each eigenvalue λ, solve the homogeneous system:
(A − λIn )X = 0
to get eigenspace of A associated to λ. (The dimension of
Null(A − λIn ) gives the geometric multiplicity of λ).
Recall that in order to solve a homogeneous linear system, you
may apply elementary row operations on the coefficient matrix to
make it into a row echelon form or row reduced echelon form,
then get the set of solutions by back substitution.
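The whole process can be sketched in a few lines of numpy (an illustration with a made-up 2 × 2 example, not from the slides); np.poly returns the coefficients of the characteristic polynomial and np.roots its roots.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
n = A.shape[0]

# Step 1: coefficients of p_A(x) = det(x I_n - A); here x^2 - 4x + 4.
coeffs = np.poly(A)
assert np.allclose(coeffs, [1.0, -4.0, 4.0])

# Step 2: the roots of p_A(x) are the eigenvalues, listed with their
# algebraic multiplicities; here lambda = 2 with AM(2) = 2.
roots = np.roots(coeffs)
assert np.allclose(np.sort(roots.real), [2.0, 2.0])

# Step 3: GM(lambda) = dim Null(A - lambda I_n) = n - rank(A - lambda I_n);
# here GM(2) = 1 < AM(2) = 2.
gm = n - np.linalg.matrix_rank(A - 2.0 * np.eye(n))
assert gm == 1
```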
Definition
Two n × n matrices A and B are called similar if there exists an
invertible n × n matrix P such that B = P−1 AP.
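Similar matrices have the same characteristic polynomial, since det(xI − P⁻¹AP) = det(P⁻¹(xI − A)P) = det(xI − A), and hence the same eigenvalues. A quick numerical check (the matrices A and P below are arbitrary illustrations):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # any invertible matrix works here

B = np.linalg.inv(P) @ A @ P        # B is similar to A

# Same characteristic polynomial, hence the same eigenvalues.
assert np.allclose(np.poly(A), np.poly(B))
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```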
Definition
A matrix A is said to be diagonalizable if A is similar to a diagonal
matrix D, i.e., if there is an invertible matrix P such that
P⁻¹AP = diag(λ1, λ2, . . . , λn) (a diagonal matrix).
Example 1. Show that A = [ 3 10 5 ; −2 −3 −4 ; 3 5 7 ] is not diagonalizable.
Hint. First find the eigenvalues of A. Then find the algebraic and geometric multiplicities of each eigenvalue of A. Observe that GM_A(λ) ≠ AM_A(λ) for some eigenvalue λ of A. Using the above theorem, conclude that A is not diagonalizable.
Example 2. Show that A = [ 2 −1 1 ; −1 2 −1 ; 1 −1 2 ] is diagonalizable.
Hint. Same procedure as above. Verify that GM_A(λ) = AM_A(λ) for each eigenvalue λ of A. Hence conclude that A is diagonalizable.
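Both hints can be verified numerically. The sketch below (numpy, for illustration; am_gm is a helper defined here, not in the slides) computes the two multiplicities for the relevant eigenvalues.

```python
import numpy as np

def am_gm(A, lam, tol=1e-6):
    """Algebraic and geometric multiplicity of the eigenvalue lam of A."""
    n = A.shape[0]
    am = int(np.sum(np.abs(np.linalg.eigvals(A) - lam) < tol))
    gm = n - np.linalg.matrix_rank(A - lam * np.eye(n))
    return am, gm

A1 = np.array([[3.0, 10.0, 5.0], [-2.0, -3.0, -4.0], [3.0, 5.0, 7.0]])
A2 = np.array([[2.0, -1.0, 1.0], [-1.0, 2.0, -1.0], [1.0, -1.0, 2.0]])

# Example 1: the eigenvalue 2 has AM = 2 but GM = 1, so A1 is not diagonalizable.
assert am_gm(A1, 2.0) == (2, 1)

# Example 2: the eigenvalue 1 has AM = GM = 2 (and 4 has AM = GM = 1),
# so A2 is diagonalizable.
assert am_gm(A2, 1.0) == (2, 2)
assert am_gm(A2, 4.0) == (1, 1)
```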
Theorem (Cayley-Hamilton)
Consider the characteristic polynomial pA (x) := det(xIn − A). Then
pA (A) = 0 (zero matrix of order n × n).
Example
If A = [ 1 2 ; 3 4 ], then p_A(x) = x² − 5x − 2. The Cayley-Hamilton Theorem says that A² − 5A − 2I_2 = [ 0 0 ; 0 0 ].
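This can be verified directly in numpy (an illustration; np.poly computes the characteristic polynomial coefficients):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# p_A(x) = x^2 - 5x - 2, i.e. coefficients [1, -5, -2].
assert np.allclose(np.poly(A), [1.0, -5.0, -2.0])

# Cayley-Hamilton: p_A(A) = A^2 - 5A - 2 I_2 is the zero matrix.
pA_of_A = A @ A - 5.0 * A - 2.0 * np.eye(2)
assert np.allclose(pA_of_A, np.zeros((2, 2)))
```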
Theorem
Let λ1 , λ2 , . . . , λn be the eigenvalues (not necessarily distinct) of an n × n
matrix A. Then
1 det(A) = λ1 λ2 . . . λn .
2 trace(A) = λ1 + λ2 + · · · + λn , where trace(A) is the sum of the main
diagonal entries of A.
Proof. p_A(x) = det(xI_n − A)
= det [ x − a11  −a12  · · ·  −a1n ; −a21  x − a22  · · ·  −a2n ; . . . ; −an1  −an2  · · ·  x − ann ]
= ∏_{i=1}^{n} (x − λ_i)
= (x − λ1)(x − λ2) · · · (x − λn)
= xⁿ + xⁿ⁻¹(−λ1 − λ2 − · · · − λn) + · · · + (−1)ⁿ (λ1 λ2 . . . λn)
From the determinant formula, we can find that the coefficient of xn−1 in
the left hand side is given by (−a11 − a22 − · · · − ann ). While the
coefficient of xn−1 from the right hand side is (−λ1 − λ2 − · · · − λn ).
Comparing both sides, we get
a11 + a22 + · · · + ann = λ1 + λ2 + · · · + λn .
Thus trace(A) = λ1 + λ2 + · · · + λn .
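Both identities are easy to confirm numerically (a sketch with an arbitrary 3 × 3 matrix, chosen only for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 5.0]])

eigenvalues = np.linalg.eigvals(A)

# det(A) is the product of the eigenvalues ...
assert np.isclose(np.prod(eigenvalues), np.linalg.det(A))
# ... and trace(A) is their sum.
assert np.isclose(np.sum(eigenvalues), np.trace(A))
```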
Some properties of eigenvalues and eigenvectors
Theorem
All the eigenvalues of a Hermitian matrix are real. In particular, all the
eigenvalues of a real symmetric matrix are real.
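A numerical illustration (the 2 × 2 Hermitian matrix below is a made-up example, not from the slides):

```python
import numpy as np

# H equals its own conjugate transpose, so H is Hermitian.
H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(H, H.conj().T)

# All its eigenvalues are real: the imaginary parts vanish (here 1 and 4).
eigenvalues = np.linalg.eigvals(H)
assert np.allclose(eigenvalues.imag, 0.0)
assert np.allclose(np.sort(eigenvalues.real), [1.0, 4.0])
```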
Skew-Hermitian matrix
1 A complex square matrix A is called a skew-Hermitian matrix if its conjugate transpose is equal to the negative of A, i.e., Āᵀ = −A.
Here Āᵀ is the transpose of the conjugate matrix Ā, and −A is a matrix of the same order whose (i, j)th entry is −aij, where aij is the (i, j)th entry of A.
2 A real skew-Hermitian matrix A is nothing but a real
skew-symmetric matrix, i.e., AT = −A.
Theorem
1 All the eigenvalues of a skew-Hermitian matrix are either purely
imaginary complex numbers or zero.
2 In particular, all the eigenvalues of a real skew-symmetric matrix are
either purely imaginary complex numbers or zero.
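A corresponding numerical check (the 2 × 2 skew-Hermitian matrix below is a made-up example):

```python
import numpy as np

# S satisfies conj(S)^T = -S, so S is skew-Hermitian.
S = np.array([[1.0j, 2.0 + 1.0j],
              [-2.0 + 1.0j, 0.0]])
assert np.allclose(S.conj().T, -S)

# Every eigenvalue is purely imaginary or zero: the real parts vanish.
eigenvalues = np.linalg.eigvals(S)
assert np.allclose(eigenvalues.real, 0.0)
```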
Theorem
1 All Hermitian (hence real symmetric) matrices are diagonalizable.
2 All skew-Hermitian (hence real skew-symmetric) matrices are
diagonalizable.
3 All unitary (hence orthogonal) matrices are diagonalizable.
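For the Hermitian case, the diagonalization can be exhibited explicitly with np.linalg.eigh, which returns a unitary P whose columns are eigenvectors (a sketch with a made-up 2 × 2 Hermitian matrix):

```python
import numpy as np

H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])

# eigh diagonalizes a Hermitian matrix: the columns of P are eigenvectors.
eigenvalues, P = np.linalg.eigh(H)

# P is unitary, and P^{-1} H P is the diagonal matrix of eigenvalues.
assert np.allclose(P.conj().T @ P, np.eye(2))
D = np.linalg.inv(P) @ H @ P
assert np.allclose(D, np.diag(eigenvalues))
```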
Applications of Eigenvalues and Eigenvectors