
Defining Eigenstuffs The Characteristic Equation Introduction to Applications

Introduction to Eigenvalues and Eigenvectors

A. Havens

Department of Mathematics
University of Massachusetts, Amherst

April 2 - 6, 2018

A. Havens Introduction to Eigenvalues and Eigenvectors



Outline

1 Defining Eigenstuffs
    Motives
    Eigenvectors and Eigenvalues

2 The Characteristic Equation
    Determinant Review
    The Characteristic Polynomial
    Similar Matrices

3 Introduction to Applications
    Classifying Endomorphisms of R^2
    Linear Recursion and Difference Equations
    Linear Differential Equations




Motives

We will study the behavior of linear endomorphisms of R-vector
spaces, i.e., R-linear transformations T : V → V, by studying
subspaces E ⊆ V on which the endomorphism acts by scaling:

T (x) = λx for all x ∈ E.
Such a subspace is called an eigenspace of the endomorphism T ,
associated to the number λ, which is called an eigenvalue. A
nonzero vector x such that T (x) = λx for some number λ is called
an eigenvector.

“Eigen-” is a German adjective which means “characteristic” or
“own”. Henceforth, we’ll bandy the prefix “eigen-” about without
apology, whenever we refer to objects which arise from eigenspaces
of some linear endomorphism.


Motivating Applications

Understanding eigendata associated to a linear endomorphism T is
among the most fruitful ways to analyze linear behavior in
applications. There are three primary, non-exclusive themes, which
I name principal directions, characteristic dynamical modes, and
spectral methods.
Principal directions arise whenever an eigenvector determines
a physically/geometrically relevant axis or direction.
Characteristic dynamical modes arise in dynamical problems,
whenever a general solution is a superposition (i.e., a linear
combination) of certain characteristic solutions.
Spectral methods involve studying the eigenvalues themselves,
as an invariant of the object to which they are associated.
A wildly noncomprehensive list of applications follows.


In mechanics, the eigenvectors of the moment of inertia tensor
for a rigid body give the principal axes.
In differential geometry, the eigenvalues of the shape operator
of a smooth surface give the principal curvature functions of
the surface, and the eigenvectors give tangent vector fields to
the lines of curvature.
In statistics, one may study large data sets via principal
component analysis (PCA), which uses eigendecomposition to
stratify the data into components which are statistically
independent (so the covariance vanishes between
components). The eigenvectors giving principal component
directions are a data-science analog of the principal axes in
mechanics.


First-order linear difference equations x_k = A x_{k−1}, which
model some discrete dynamical systems and recursive linear
equation systems, can be solved using eigentheory.
A special case is linear Markov chains, which model
probabilistic processes, and are used, e.g., in signal and image
processing, and also in machine learning.
Facial recognition software uses the concept of an eigenface in
facial identification, while voice recognition software employs
the concept of an eigenvoice. These allow dimension
reduction, and are special cases of principal component
analysis.


In the study of continuous dynamical systems, eigenfunctions
of a linear differential operator are used to construct general
solutions. The eigenvalues may correspond to physically
important quantities, like rates or energies, and
eigenvectors/eigenfunctions represent solutions of the
dynamics.
In particular:
in an oscillatory system, the eigenvalues are called
eigenfrequencies, while the associated eigenfunctions represent
the shapes of corresponding vibrational modes.
quantum numbers are eigenvalues, associated to eigenstates,
which are solutions to the Schrödinger equation.
In epidemiology, the basic reproduction number, which
measures the average number of infected cases generated by
an infected individual in an uninfected population, is the
maximum eigenvalue of the “next generation matrix.”


The study of spectral graph theory examines the eigenvalues
of adjacency matrices of graphs and their associated discrete
Laplacian operators to deduce properties of graphs. Such
eigenanalysis made the Google era possible (as the original
Google PageRank algorithm is based on spectral graph
analysis).
The spectra of smooth Laplacians are of interest in the study
of elastic, vibrating membranes, such as a drum head. A
famous problem in continuum mechanics, as phrased by
Lipman Bers, is “can you hear the shape of a drum?”
The most tractable application for us, in this class, is the
complete classification of the geometry of linear
transformations T : R^2 → R^2.


We will develop the theory of real eigenvectors and eigenvalues of
real square matrices and examine a few simple applications.

Many of the examples listed above require more sophisticated
mathematics, as well as additional application-specific background
beyond the scope of this course.

Nevertheless, you shall discover the power of eigenstuffs in a few
examples.


Eigenvectors and Eigenvalues

Formal Definitions

Definition
Let T : R^n → R^n be a linear transformation of R^n. Then a
nonzero vector x ∈ R^n − {0} is called an eigenvector of T if there
exists some number λ ∈ R such that

T (x) = λx.

The real number λ is called a real eigenvalue of the real linear
transformation T.
Let A be an n × n matrix representing the linear transformation T.
Then x is an eigenvector of the matrix A if and only if it is an
eigenvector of T, if and only if

Ax = λx

for some eigenvalue λ.
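The defining property is easy to test numerically. The following sketch uses NumPy (not part of the notes); the matrix and candidate eigenpair are hypothetical choices for illustration:

```python
import numpy as np

# Hypothetical example: a diagonal matrix, whose standard basis
# vectors are clearly eigenvectors.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
x = np.array([1.0, 0.0])   # candidate eigenvector (nonzero!)
lam = 2.0                  # candidate eigenvalue

# The defining property: A x equals lambda x.
is_eigenpair = np.allclose(A @ x, lam * x)
```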


Remark
We will prioritize the study of real eigenstuffs, primarily using 2 × 2
and 3 × 3 matrices, which give a good general sense of the theory.

However, it will later be fruitful, even for real matrices, to allow
λ ∈ C and x ∈ C^n (the case of complex eigenvalues is related to
the geometry of rotations, and occurs in dynamical systems
featuring oscillatory behavior).

Thus, we will have a modified definition for eigenvalues and
eigenvectors in the future, when we are ready to study complex
eigentheory for real matrices.

Examples in 2-Dimensions

Example
Let v ∈ R^2 be a nonzero vector, and ℓ = Span{v}. Let
Ref_ℓ : R^2 → R^2 be the linear transformation of the plane given by
reflection through the line ℓ.
Then, since Ref_ℓ(v) = 1v, v is an eigenvector of Ref_ℓ with
eigenvalue 1, and ℓ = Span{v} is an eigenline or eigenspace of the
reflection. Note, any nonzero multiple of v is also an eigenvector
with eigenvalue 1, by linearity.
Can you describe another eigenvector of Ref_ℓ, with a different
associated eigenvalue? What is the associated eigenspace?
If u ∈ R^2 is any nonzero vector perpendicular to v, then u is an
eigenvector of Ref_ℓ with eigenvalue −1. The line spanned by u is
also an eigenspace.
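As a numerical illustration (a NumPy sketch, not part of the notes; the angle theta of the line ℓ is a hypothetical choice), the standard matrix of a reflection confirms both eigenvalues:

```python
import numpy as np

theta = 0.3  # hypothetical angle of the line ell through the origin
# Standard matrix of reflection across the line at angle theta:
R = np.array([[np.cos(2 * theta),  np.sin(2 * theta)],
              [np.sin(2 * theta), -np.cos(2 * theta)]])

v = np.array([np.cos(theta), np.sin(theta)])     # spans ell
u = np.array([-np.sin(theta), np.cos(theta)])    # perpendicular to v

on_line = np.allclose(R @ v, v)    # eigenvalue +1 on ell
perp = np.allclose(R @ u, -u)      # eigenvalue -1 on the perpendicular line
```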


Example
For v and ℓ as above, the orthogonal projection
proj_ℓ(x) = ((v · x)/(v · v)) v
has the same eigenspaces as Ref_ℓ, but a different eigenvalue for
the line ℓ⊥ = Span{u} for u ∈ R^2 − {0} with u · v = 0.

Indeed, proj_ℓ(u) = 0, whence u is an eigenvector whose associated
eigenvalue is 0.

It is crucial to remember: eigenvectors must be nonzero, but
eigenvalues may be zero, or any other real number.
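The same numerical check works for the projection (a NumPy sketch; the direction vector v below is a hypothetical choice):

```python
import numpy as np

v = np.array([2.0, 1.0])            # hypothetical direction spanning ell
P = np.outer(v, v) / (v @ v)        # standard matrix of proj_ell

u = np.array([-1.0, 2.0])           # u . v = 0, so u spans ell-perp
eig_one = np.allclose(P @ v, 1.0 * v)    # v is a 1-eigenvector
eig_zero = np.allclose(P @ u, 0.0 * u)   # u is a 0-eigenvector
```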

Example
Let A = [ 1  k ]
        [ 0  1 ],
for a nonzero real number k.

The map x ↦ Ax is a shearing transformation of R^2.

Given that 1 is the only eigenvalue of A, describe a basis of the
associated eigenspace.

Example
An eigenvector x of the shearing matrix A with eigenvalue 1 must
satisfy Ax = x, whence x is a solution of the homogeneous
equation Ax − I2 x = (A − I2)x = 0.
Therefore the components x1 and x2 of x must satisfy

[ 0 ]   [ 1−1   k  ] [ x1 ]   [ 0x1 + kx2 ]
[ 0 ] = [  0   1−1 ] [ x2 ] = [ 0x1 + 0x2 ]

=⇒ kx2 = 0 =⇒ x2 = 0.


Example
Thus, x = [ t ]
          [ 0 ],  t ∈ R − {0},
is an eigenvector of the shearing matrix A, with eigenvalue 1, and
the x1-axis is the corresponding eigenspace.

One can check directly that there are no other eigenvalues or
eigenspaces (a good exercise!).
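NumPy's eigensolver agrees (a sketch, not part of the notes; the shear factor k is a hypothetical choice):

```python
import numpy as np

k = 2.0                        # hypothetical nonzero shear factor
A = np.array([[1.0, k],
              [0.0, 1.0]])

vals, vecs = np.linalg.eig(A)
all_ones = np.allclose(vals, 1.0)     # 1 is the only eigenvalue
e1 = np.array([1.0, 0.0])
fixed = np.allclose(A @ e1, e1)       # the x1-axis is the eigenspace
```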


Example
The matrix J = [ 0  −1 ]
               [ 1   0 ]
has no real eigenvectors.

Indeed, the only proper subspace of R^2 preserved by the map
x ↦ Jx is the trivial subspace.

All lines through 0 are rotated by π/2. We will later see that this
matrix has purely imaginary eigenvalues, as will be the case with
other rotation matrices.
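Numerically (a NumPy sketch, not part of the notes), the eigenvalues of J indeed come out purely imaginary:

```python
import numpy as np

J = np.array([[0.0, -1.0],
              [1.0,  0.0]])    # rotation of the plane by pi/2

vals = np.linalg.eigvals(J)    # NumPy returns complex eigenvalues here
purely_imaginary = np.allclose(vals.real, 0.0)
pair = np.allclose(sorted(vals.imag), [-1.0, 1.0])   # eigenvalues are +i, -i
no_real_eig = not np.any(np.isclose(vals.imag, 0.0))
```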


Example: a 3 × 3 Upper triangular Matrix

Example
Consider the upper triangular matrix

A = [ a11  a12  a13 ]
    [  0   a22  a23 ]
    [  0    0   a33 ].

Show that the eigenvalues are the entries a11 , a22 and a33 along
the main diagonal.

If λ is an eigenvalue of A, then there exists a nonzero vector
x ∈ R^3 such that Ax = λx. But then Ax − λx = (A − λI3)x = 0
must have a nontrivial solution.
But the homogeneous equation has a nontrivial solution if and only
if the square matrix A − λI3 has determinant equal to 0.
Example
Since

A − λI3 = [ a11 − λ    a12       a13   ]
          [    0     a22 − λ     a23   ]
          [    0        0     a33 − λ ],

det(A − λI3) = (a11 − λ)(a22 − λ)(a33 − λ).

Thus det(A − λI3) = 0 ⇐⇒ (a11 − λ)(a22 − λ)(a33 − λ) = 0,
which holds if and only if λ ∈ {a11, a22, a33}.

The equation det(A − λI3) = (a11 − λ)(a22 − λ)(a33 − λ) = 0 is an
example of a characteristic equation.


A Theorem: Eigenvalues of an Upper triangular Matrix

We can extend the idea of the above example to prove the
following theorem.

Theorem
If A is an n × n triangular matrix, then the eigenvalues of A are
precisely the elements on the main diagonal.

In particular, the eigenvalues of a diagonal matrix are the entries
{a11, . . . , ann} of the main diagonal.
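A quick numerical spot-check of the theorem (a NumPy sketch; the triangular matrix below is a hypothetical example):

```python
import numpy as np

# Hypothetical upper triangular matrix
A = np.array([[4.0, 1.0, 7.0],
              [0.0, 2.0, 5.0],
              [0.0, 0.0, 9.0]])

vals = np.linalg.eigvals(A)
# The eigenvalues should be exactly the diagonal entries 4, 2, 9.
matches_diagonal = np.allclose(sorted(vals), sorted(np.diag(A)))
```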

A Theorem: Independence of Eigenvectors with Distinct Eigenvalues


Theorem
If v1 , . . . , vr are eigenvectors that correspond respectively to
distinct eigenvalues λ1 , . . . , λr of an n × n matrix A, then the set
{v1 , . . . , vr } is linearly independent.

Proof.
We proceed by contradiction. Suppose {v1 , . . . , vr } is a linearly
dependent set.
Observe that, being eigenvectors, vi ≠ 0 for each
i = 1, . . . , r, and by linear dependence we can find an index p,
1 ≤ p < r, such that {v1, . . . , vp} is linearly independent, and
vp+1 ∈ Span{v1, . . . , vp}.


A Proof

Proof (continued.)
So there exist constants c1, . . . , cp, not all zero, such that

vp+1 = c1 v1 + · · · + cp vp.

Left-multiplying both sides of this relation by A, we obtain

A vp+1 = A(c1 v1 + · · · + cp vp) =⇒
λp+1 vp+1 = c1 λ1 v1 + · · · + cp λp vp.

Scaling the original relation by λp+1 and subtracting it from the
relation above, we obtain

0 = c1 (λ1 − λp+1) v1 + · · · + cp (λp − λp+1) vp.

Proof (continued.)
This final relation is impossible:
since the set {v1, . . . , vp} is linearly independent, this equation
requires that the scalar weights all vanish; but at least
one ci must be nonzero, since vp+1 is an eigenvector (and hence
nonzero), while, since the eigenvalues are all distinct, λi − λp+1 ≠ 0
for each i = 1, . . . , p.

Thus our assumption that the set {v1, . . . , vr} was linearly
dependent is untenable.
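Numerically, independence shows up as full rank of the matrix whose columns are the eigenvectors (a NumPy sketch; the symmetric matrix below is a hypothetical example with distinct eigenvalues):

```python
import numpy as np

# Hypothetical symmetric matrix with three distinct eigenvalues
# (they are 3, 3 + sqrt(3), 3 - sqrt(3))
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

vals, vecs = np.linalg.eig(A)
distinct = len(np.unique(np.round(vals, 6))) == 3
# Columns of vecs are eigenvectors; distinct eigenvalues force full rank:
independent = np.linalg.matrix_rank(vecs) == 3
```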


An Eigenvalue of 0

Let A be a matrix which has an eigenvector x such that the
associated eigenvalue is λ = 0. Then the eigenspace associated to
the zero eigenvalue is the null space of A.

This is easy to see. Let E0 be the 0-eigenspace. Then for any
x ∈ E0, Ax = 0x = 0, whence x ∈ Nul A for all x ∈ E0, which
implies E0 ⊆ Nul A. Conversely, for any x ∈ Nul A, Ax = 0 = 0x,
so x ∈ E0, and Nul A ⊆ E0. Thus E0 = Nul A.

It follows that A is invertible if and only if 0 is not an eigenvalue of
A.
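A numerical illustration of this equivalence (a NumPy sketch; the singular matrix below is a hypothetical example):

```python
import numpy as np

# Hypothetical singular matrix: the second column is twice the first
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

vals = np.linalg.eigvals(A)
has_zero = np.any(np.isclose(vals, 0.0))            # 0 is an eigenvalue
not_invertible = np.isclose(np.linalg.det(A), 0.0)  # so A is not invertible

x = np.array([2.0, -1.0])            # a nonzero vector in Nul A
in_null = np.allclose(A @ x, 0.0)    # hence a 0-eigenvector of A
```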


Determinant Review

Determinant Via Row-Ops

We recall some facts about determinants.

Suppose a square matrix A can be row reduced to an echelon form
B = (bij) using only r row interchanges and elementary row
replacements Ri − sRj ↦ Ri, without row scalings sRi ↦ Ri. Then

det A = (−1)^r b11 b22 · · · bnn   if A is invertible,
det A = 0                          if A is not invertible.

In particular, A is invertible if and only if det A ≠ 0.


Review of Determinant Properties

Theorem
Let A, B ∈ R^{n×n}. Then
a. If A = (aij) is triangular, then det A = a11 a22 · · · ann, the
product of the diagonal entries.
b. det(AB) = (det A)(det B).
c. det A^t = det A.
d. A is invertible if and only if det A ≠ 0.
e. A row replacement operation on A does not alter det A. A
row swap operation on A reverses the sign of det A. Scaling a
row of A by s scales det A by s.
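Properties (b), (c), and (e) are easy to spot-check numerically (a NumPy sketch on seeded random matrices, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

detA = np.linalg.det(A)
product_rule = np.isclose(np.linalg.det(A @ B), detA * np.linalg.det(B))
transpose_rule = np.isclose(np.linalg.det(A.T), detA)

A_swapped = A[[1, 0, 2], :]      # swap rows 0 and 1
swap_rule = np.isclose(np.linalg.det(A_swapped), -detA)

A_scaled = A.copy()
A_scaled[0] *= 5.0               # scale one row by s = 5
scale_rule = np.isclose(np.linalg.det(A_scaled), 5.0 * detA)
```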


The Characteristic polynomial

Finding Eigenvalues

Given eigenvalues of A, it is straightforward to solve for associated
eigenvectors using our knowledge of linear systems. But how do we
find the eigenvalues of A?

The observations about the determinant and invertibility are the
key.

We’ll construct a determinant equation, yielding a polynomial,
such that its solutions are the eigenvalues.


Determinants and Characteristic Equations


Let A ∈ R^{n×n}. Suppose λ is an eigenvalue of A with eigenvector
x. Then

Ax = λx =⇒ (A − λIn)x = 0 =⇒ x ∈ Nul(A − λIn).

Since x ≠ 0 (being an eigenvector), we deduce that Nul(A − λIn)
is nontrivial, whence A − λIn is noninvertible and det(A − λIn) = 0.

Definition
Given a matrix A ∈ R^{n×n}, the characteristic equation for A is

det(A − λIn) = 0.

The left hand expression det(A − λIn) determines a polynomial in
λ, called the characteristic polynomial, whose real roots are
precisely the real eigenvalues of A.
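NumPy can compute the characteristic polynomial’s coefficients directly, and its roots match the eigenvalues (a sketch, not part of the notes, using the 2 × 2 matrix from the next example):

```python
import numpy as np

A = np.array([[ 6.0,  8.0],
              [-2.0, -4.0]])

coeffs = np.poly(A)     # coefficients of det(lambda*I - A) = lambda^2 - 2 lambda - 8
roots = np.roots(coeffs)
eigs = np.linalg.eigvals(A)
roots_are_eigenvalues = np.allclose(np.sort(roots), np.sort(eigs))
```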
A 2 × 2 Example

Example
Let A = [  6   8 ]
        [ −2  −4 ].
Find the eigenvalues and eigenvectors of A.

Solution: The characteristic equation is

0 = det( [  6   8 ] − [ λ  0 ] )
         [ −2  −4 ]   [ 0  λ ]
  = (6 − λ)(−4 − λ) − (−2)(8)
  = −24 + 4λ − 6λ + λ^2 + 16 = λ^2 − 2λ − 8
  = (λ + 2)(λ − 4).

Thus λ1 = −2 and λ2 = 4 are the eigenvalues of A.

Example
To obtain the eigenvectors, we must solve the homogeneous system
associated to each eigenvalue:

(A − (−2)I2)x = 0  and  (A − 4I2)x = 0.

For λ1 = −2, this yields a homogeneous system with augmented
matrix
[  8   8 | 0 ]
[ −2  −2 | 0 ],
which is solved so long as the components x1 and x2 of x satisfy
x2 = −x1.

Thus, e.g., [  1 ] spans the (−2)-eigenspace.
            [ −1 ]

Example
For λ2 = 4 the corresponding homogeneous system has augmented
matrix
[  2   8 | 0 ]
[ −2  −8 | 0 ],
which is solved whenever the components x1 and x2 satisfy
x1 = −4x2.

Thus, e.g., [  4 ] spans the 4-eigenspace.
            [ −1 ]
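The full eigendata of this example can be verified with NumPy (a sketch, not part of the notes; the eigensolver returns eigenvalues in no guaranteed order, so we sort):

```python
import numpy as np

A = np.array([[ 6.0,  8.0],
              [-2.0, -4.0]])

vals = np.linalg.eigvals(A)
expected = np.allclose(np.sort(vals), [-2.0, 4.0])

v1 = np.array([1.0, -1.0])      # spans the (-2)-eigenspace
v2 = np.array([4.0, -1.0])      # spans the 4-eigenspace
check1 = np.allclose(A @ v1, -2.0 * v1)
check2 = np.allclose(A @ v2,  4.0 * v2)
```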

A 3 × 3 example

Example
Let
A = [  2  −2  −1 ]
    [ −1   1  −1 ]
    [ −1  −2   2 ]
and let v = (1, 1, 1)^t.
(a) Show that v is an eigenvector of A. What is its associated
eigenvalue?
(b) Find the characteristic equation of A.
(c) Find the remaining eigenvalue(s) of A, and describe the
associated eigenspace(s).

Example
Solution:
(a) It is easy to check that Av = −v, whence v is an eigenvector
with associated eigenvalue −1.
Observe that since v = e1 + e2 + e3, this can only be the case
because the sum of the entries in each row of A is −1. More
generally, v = e1 + e2 + · · · + en is an eigenvector of an n × n
matrix if and only if each row of the matrix sums to a common
constant λ, which is then the eigenvalue for v.

(b) Let χA(λ) = det(A − λI3) be the characteristic polynomial.
To find χA(λ), we thus need to calculate the determinant of
the 3 × 3 matrix A − λI3.

(b) (continued.)

              | 2 − λ   −2      −1    |
    χA (λ) =  | −1      1 − λ   −1    |
              | −1      −2      2 − λ |

           = (2 − λ)(1 − λ)(2 − λ) − 2 − 2
             − (1 − λ) − 2(2 − λ) − 2(2 − λ)
           = −λ3 + 5λ2 − 3λ − 9
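The expansion can be spot-checked numerically: a direct 3 × 3 determinant of A − λI3 must agree with the polynomial −λ3 + 5λ2 − 3λ − 9 at every λ. A minimal sketch in plain Python:

```python
# Compare chi_A(l) = det(A - l*I) with the expanded cubic at integer samples.
A = [[2, -2, -1],
     [-1, 1, -1],
     [-1, -2, 2]]

def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def chi(l):
    M = [[A[r][c] - (l if r == c else 0) for c in range(3)] for r in range(3)]
    return det3(M)

for l in range(-3, 5):
    assert chi(l) == -l**3 + 5 * l**2 - 3 * l - 9
print(chi(-1), chi(3))   # → 0 0: both eigenvalues are roots
```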

(c) Any eigenvalue λ of A satisfies the characteristic equation,
and thus is a root of the characteristic polynomial χA (λ).
Thus, we seek solutions of the polynomial equation

0 = χA (λ) =⇒ 0 = −λ3 + 5λ2 − 3λ − 9, or equivalently,

0 = λ3 − 5λ2 + 3λ + 9 .

We already know −1 is an eigenvalue, so we can divide by
λ + 1 to obtain a quadratic:

0 = λ2 − 6λ + 9 = (λ − 3)2 .

Thus there is precisely one other eigenvalue, λ = 3.

(c) (continued.) To find the associated eigenvector(s), we need to
solve the homogeneous system (A − 3I3 )x = 0.

    [ −1  −2  −1  0 ]  RREF  [ 1  2  1  0 ]
    [ −1  −2  −1  0 ]   ∼    [ 0  0  0  0 ]
    [ −1  −2  −1  0 ]        [ 0  0  0  0 ]

The solutions thus have the form

        [ −2s − t ]     [ −2 ]     [ −1 ]
    x = [    s    ] = s [  1 ] + t [  0 ] .
        [    t    ]     [  0 ]     [  1 ]

(c) (continued.) Observe that this solution space is merely the
plane with equation x1 + 2x2 + x3 = 0, and it is spanned by
the vectors

    [ −2 ]        [ −1 ]
    [  1 ]  and   [  0 ] .
    [  0 ]        [  1 ]

Thus the 3-eigenspace is two-dimensional.
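The two spanning vectors can be checked directly: each must satisfy Av = 3v and lie on the plane x1 + 2x2 + x3 = 0. A quick sketch:

```python
# Verify the two spanning vectors of the 3-eigenspace of A.
A = [[2, -2, -1],
     [-1, 1, -1],
     [-1, -2, 2]]

def matvec(M, x):
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

for v in ([-2, 1, 0], [-1, 0, 1]):
    assert matvec(A, v) == [3 * vi for vi in v]   # Av = 3v
    assert v[0] + 2 * v[1] + v[2] == 0            # on the plane x1 + 2x2 + x3 = 0
print("both vectors are 3-eigenvectors")
```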

Repeated Eigenvalues: Multiplicities

Remark
In the preceding example, the eigenvalue 3 appeared as a double
root of the characteristic polynomial. We say that 3 has algebraic
multiplicity 2.

The associated eigenspace was spanned by two independent
eigenvectors, so the eigenvalue 3 is said in this case to also have
geometric multiplicity 2.

Multiplicities Defined

Definition
Let A ∈ Rn×n be a real square matrix with characteristic
polynomial χA (λ). Suppose ν ∈ R is an eigenvalue of A, so
χA (ν) = 0. Let Eν := {x ∈ Rn | Ax = νx} ⊆ Rn be the
ν-eigenspace.
The algebraic multiplicity m := m(ν) of the eigenvalue ν is
the largest integer such that λ − ν divides χA (λ):
χA (λ) = (λ − ν)m q(λ),
where q(λ) is a polynomial of degree n − m with q(ν) 6= 0.
The geometric multiplicity µ := µ(ν) of the eigenvalue ν is
the dimension of Eν : µν = dim Eν .

Algebraic versus Geometric Multiplicity

An important question, whose answer is relevant for our
forthcoming discussion of similar matrices and diagonalization, is
the following:

For a given real eigenvalue ν of a real n × n matrix A, are m(ν)
and µ(ν) equal?

A little thought about previous examples shows they need not be
equal. Indeed, consider the shearing transform of R2 discussed above.
A. Havens Introduction to Eigenvalues and Eigenvectors


Defining Eigenstuffs The Characteristic Equation Introduction to Applications

The Characteristic polynomial

Let k ∈ R and recall that the matrix

    A = [ 1  k ]
        [ 0  1 ]

has eigenvalue λ = 1 with algebraic multiplicity 2.

If k ≠ 0, then the 1-eigenspace is the line Span {e1 }, whence the
geometric multiplicity of λ = 1 is 1.

On the other hand, if k = 0, then the algebraic and geometric
multiplicity are equal, since the entire plane R2 becomes the
1-eigenspace.

Thus a discrepancy can occur between algebraic and geometric
multiplicities of eigenvalues.
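The shear computation can be automated: µ(1) = 2 − rank(A − I2 ), and for a 2 × 2 matrix the rank is easy to read off (0 iff the matrix is zero, 1 iff it is nonzero with zero determinant). A minimal sketch:

```python
# Geometric multiplicity of lambda = 1 for the shear A = [[1, k], [0, 1]]:
# mu(1) = 2 - rank(A - I).
def geometric_multiplicity_of_1(k):
    M = [[0, k], [0, 0]]                      # A - 1*I
    if all(entry == 0 for row in M for entry in row):
        return 2                              # rank 0: k = 0, A = I
    a, b = M[0]
    c, d = M[1]
    rank = 1 if a * d - b * c == 0 else 2
    return 2 - rank

assert geometric_multiplicity_of_1(0) == 2    # A = I: whole plane is eigenspace
assert geometric_multiplicity_of_1(5) == 1    # k != 0: only the line Span{e1}
```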

An Inequality

Proposition
An eigenvalue ν of an n × n matrix A has algebraic multiplicity at
least as large as its geometric multiplicity:

1 ≤ µ(ν) ≤ m(ν) ≤ n .

The inequalities 1 ≤ µ(ν) and m(ν) ≤ n should be clear. The


interesting thing to prove is that µ(ν) ≤ m(ν).

Before we prove this, we introduce a useful equivalence relation for


square matrices, called similarity, which implies strong relationships
between the eigendata of matrices among a given similarity
equivalence class.

Similar Matrices

An Equivalence Relation: Similarity of Matrices

Definition
Given two n × n matrices A and B, the matrix A is said to be
similar to B if there exists an invertible matrix P such that
A = PBP−1 .

Observe that if A is similar to B via some invertible P, then taking
Q = P−1 , one has B = QAQ−1 , whence B is similar to A. Thus
we can say unambiguously that A and B are similar matrices.

It is easy to check the remaining conditions to show that similarity
is an equivalence relation on square matrices: convince yourself
that A is always similar to itself, and that if A is similar to B, and
B is similar to C, then A and C are also similar.

Similarity and Characteristic Polynomials

Similar matrices are not necessarily row equivalent, but there is a
relationship between their characteristic polynomials, and
correspondingly, their eigenvalues:

Theorem
Let A and B be similar matrices. Then:
χA = χB , and thus A and B share eigenvalues and respective
algebraic multiplicities;
for any eigenvalue λ of A and B, the geometric multiplicity of
λ for A is the same as for B.
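A small numerical illustration of the theorem for 2 × 2 matrices, where χ(λ) = λ2 − tr (A)λ + det(A), so it suffices to compare traces and determinants. The matrices B and P below are arbitrary choices, not from the slides:

```python
# Similar matrices share trace and determinant, hence (for 2x2) the same
# characteristic polynomial chi(l) = l^2 - tr*l + det.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = [[2, 1], [3, 4]]
P = [[1, 1], [0, 1]]              # det P = 1, so P_inv has integer entries
P_inv = [[1, -1], [0, 1]]
A = matmul(matmul(P, B), P_inv)   # A = P B P^{-1}

def tr(M):
    return M[0][0] + M[1][1]

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

assert tr(A) == tr(B) and det(A) == det(B)   # so chi_A = chi_B
print(A, tr(A), det(A))
```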

A Proof

Proof.
Assume A = PBP−1 for some invertible matrix P ∈ Rn×n .
Observe that

A − λIn = PBP−1 − λPP−1 = PBP−1 − PλIn P−1 = P(B − λIn )P−1 ,

whence

χA (λ) = det(A − λIn )
       = det(P(B − λIn )P−1 )
       = det(P) det(B − λIn ) det(P−1 )
       = det(B − λIn )
       = χB (λ) .

Proof (continued.)
Now, suppose λ is an eigenvalue of both A and B, and suppose
the geometric multiplicity of λ for A is µ. Then there exist linearly
independent vectors v1 , . . . , vµ spanning the λ-eigenspace of A, and
for any vi , i = 1, . . . , µ, Avi = λvi .
Then
λP−1 vi = P−1 Avi = B(P−1 vi ) ,
whence P−1 vi is an eigenvector for B with eigenvalue λ. Since P is
invertible, the map x 7→ P−1 x is an isomorphism, whence it
induces a one-to-one correspondence between eigenvectors of A and
B with eigenvalue λ. Thus, the geometric multiplicity of λ for B is
also µ.

Proving the inequality

We can now prove the inequality µ(ν) ≤ m(ν) for an eigenvalue ν
of an n × n matrix A.

Proof.
Let ν be an eigenvalue of A with geometric multiplicity µ := µ(ν).
Thus, there exists an eigenbasis v1 , . . . , vµ spanning the
ν-eigenspace Eν , and this eigenbasis can be extended to a basis
B = {v1 , . . . , vµ , u1 , . . . , un−µ } of Rn .

Let
P = [ v1 . . . vµ u1 . . . un−µ ] .

Consider the product AP.

Proof (continued.)

AP = [ Av1 . . . Avµ Au1 . . . Aun−µ ]
   = [ νv1 . . . νvµ Au1 . . . Aun−µ ] .

Observe that since B is a basis, P is invertible, whence we can
compute P−1 AP:

P−1 AP = [ νe1 . . . νeµ P−1 Au1 . . . P−1 Aun−µ ]

       = [ νIµ        ∗ ]
         [ 0(n−µ)×µ   ∗ ] .

Proof (continued.)
Since there is a diagonal block of νIµ in P−1 AP, we see that
P−1 AP has a factor of (ν − λ)µ in its characteristic polynomial.
But since A and P−1 AP are similar, they share the same
characteristic polynomial.
Thus, the algebraic multiplicity m(ν) for the eigenvalue ν of A is
at least µ.

Observation
If µ(ν) = m(ν), then we get a maximal diagonal block νIm in
P−1 AP; if χA factors completely into a product of terms
(νi − λ)^µ(νi ) with ∑i µ(νi ) = n for real numbers νi , then P−1 AP
will be a completely diagonal matrix. We’ll study the process of
diagonalization shortly.
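For the earlier 3 × 3 example both eigenvalues satisfy µ = m, so stacking the three eigenvectors into P gives AP = PD with D = diag(−1, 3, 3) — the complete diagonalization the observation describes. A check in plain Python (avoiding the inverse by verifying AP = PD):

```python
# Eigenvectors of the earlier 3x3 example, assembled as the columns of P.
A = [[2, -2, -1],
     [-1, 1, -1],
     [-1, -2, 2]]
cols = [[1, 1, 1], [-2, 1, 0], [-1, 0, 1]]   # eigenvectors for -1, 3, 3
D = [[-1, 0, 0], [0, 3, 0], [0, 0, 3]]

P = [[cols[j][i] for j in range(3)] for i in range(3)]   # eigenvectors as columns

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

assert matmul(A, P) == matmul(P, D)   # A P = P D, i.e. A = P D P^{-1}
```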

Classifying Endomorphisms of R2

Linear transformations T : R2 → R2

We’ll briefly discuss the role of eigenanalysis in studying the
geometry of linear transformations of the plane R2 .

First, we remark that there is a dichotomy: linear maps
T : R2 → R2 are either invertible or non-invertible. We know that
the map T (x) = Ax is non-invertible if and only if A is singular, if
and only if 0 is an eigenvalue of A.

Thus, let us first understand the geometry of maps x 7→ Ax where
A has a 0 eigenvalue.

Projections
Since χA is a degree two polynomial for any A ∈ R2×2 , there are
two possibilities for zero eigenvalues: a single zero eigenvalue and
one nonzero eigenvalue λ, or a zero eigenvalue with algebraic
multiplicity m = 2.

If the eigenvalues of A are 0 and λ ≠ 0, then A is similar to

    [ λ  0 ]
    [ 0  0 ] ,

which represents a stretched projection map T (x) = Ax
projecting onto its nonzero eigenspace Eλ , with stretching factor λ:
if λ = 1 then T is an unstretched orthogonal or oblique
projection onto the eigenline Eλ ,
if |λ| < 1 then T is a contracted projection onto Eλ ,
if |λ| > 1 then T is a dilated projection onto Eλ ,
if λ < 0 then T additionally acts by reflection, reversing
the eigenline Eλ .
Nilpotent maps

In the case of a zero eigenvalue of algebraic multiplicity 2, there are
two possibilities: the zero matrix, or a nilpotent matrix. Nilpotent
matrices are (nonzero) square matrices N ∈ Rn×n for which there
exists a positive integer power r such that Nr = 0n×n .

Every 2 × 2 nilpotent matrix is similar to

    N = [ 0  1 ]
        [ 0  0 ] .

Can you give a geometric interpretation of nilpotence, given the
similarity to N? (Once we study diagonalization, you will hopefully
see how to show this claim, and interpret it. . . )
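A quick check that the model nilpotent N squares to zero, and that this survives similarity, since (PNP−1 )2 = PN2 P−1 . The conjugating matrix P below is an arbitrary choice:

```python
# N^2 = 0 for the model 2x2 nilpotent, and for any matrix similar to it.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

N = [[0, 1], [0, 0]]
assert matmul(N, N) == [[0, 0], [0, 0]]

P, P_inv = [[1, 0], [1, 1]], [[1, 0], [-1, 1]]   # det P = 1
M = matmul(matmul(P, N), P_inv)                  # a matrix similar to N
assert matmul(M, M) == [[0, 0], [0, 0]]          # still nilpotent
print(M)
```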

Linear Automorphisms of the Plane

Now we can examine invertible matrices A, which always
determine linear automorphisms of R2 .

This classification is more subtle than the classification of singular
A, and will require some additional results from the theory of
diagonalization and the theory of complex eigenvalues, which we
will visit later.

To get a better picture, we first examine the general form of the
characteristic polynomial of a 2 × 2 matrix.

The Characteristic Polynomial of A ∈ R2×2

If

    A = [ a11  a12 ]
        [ a21  a22 ] ,

then the characteristic polynomial satisfies

χA (λ) = λ2 − (a11 + a22 )λ + a11 a22 − a12 a21
       = λ2 − tr (A)λ + det(A) ,

where tr (A) = a11 + a22 is called the trace of the matrix A.

By the fundamental theorem of algebra, the characteristic
polynomial factors into linear factors as χA (λ) = (λ − ν1 )(λ − ν2 ),
where ν1 , ν2 may be complex and are not necessarily distinct.

Eigenvalues, Trace, and Determinant

Multiplying this factorization out, observe that

λ2 − (ν1 + ν2 )λ + ν1 ν2 = λ2 − tr (A)λ + det A ,

whence ν1 ν2 = det A and ν1 + ν2 = tr A. That is, the product of
the eigenvalues is the determinant, and the sum of the eigenvalues
is the trace. This rule holds generally for any size of square matrix.

By the quadratic formula, we can also express the eigenvalues of a
2 × 2 matrix directly in terms of its trace and determinant.

The following proposition gives the explicit formulae, and describes
easily proved results characterizing the eigendata and geometry of a
linear map x 7→ Ax.
A. Havens Introduction to Eigenvalues and Eigenvectors


Defining Eigenstuffs The Characteristic Equation Introduction to Applications

2
Classifying Endomorphisms of R

Proposition
Let A be a matrix determining a linear map T (x) = Ax of R2 and
let ∆ = det(A), τ = tr (A). Then the eigenvalues of A are

λ+ := (τ + √(τ 2 − 4∆))/2 ,  λ− := (τ − √(τ 2 − 4∆))/2 .

A has a repeated eigenvalue if and only if τ = ±2√∆, and
otherwise has two distinct eigenvalues.
A has a zero eigenvalue if and only if ∆ = 0, and if in addition
τ = 0, then the matrix is either nilpotent or the zero matrix.
If τ 2 ≥ 4∆ then the eigenvalues λ± are real.
Otherwise, if τ 2 < 4∆ then the matrix has distinct complex
eigenvalues with strictly nonzero imaginary parts, occurring as
a conjugate pair λ = a + bi, λ̄ = a − bi. Moreover, the
determinant in this case is |λ|2 = λλ̄ = a2 + b 2 > 0.

Proposition (continued.)
T is area preserving if and only if |∆| = 1, contracts areas if
and only if |∆| < 1, and expands areas if and only if |∆| > 1.
Assuming no zero eigenvalues, T is orientation preserving if
and only if ∆ > 0, and orientation reversing if and only if
∆ < 0. If there is one zero eigenvalue and one nonzero
eigenvalue λ, it reverses the eigenline if and only if λ < 0.

Henceforth, assume that the eigenvalues of A are both nonzero, so
T (x) = Ax is an automorphism of R2 .
We’ll characterize the possible geometric actions of this map from
the eigendata.
We first consider repeated eigenvalues, where the possibilities are
quite limited. We then investigate distinct eigenvalues.
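The formulas λ± = (τ ± √(τ 2 − 4∆))/2 can be exercised directly; the sketch below (a helper written for illustration, not from the slides) also confirms λ+ + λ− = τ and λ+ λ− = ∆ in the real case:

```python
import math

# Eigenvalues of a 2x2 matrix from its trace tau and determinant Delta,
# in the real case tau^2 >= 4*Delta.
def eigenvalues_2x2(a11, a12, a21, a22):
    tau = a11 + a22
    delta = a11 * a22 - a12 * a21
    disc = tau * tau - 4 * delta
    if disc < 0:
        raise ValueError("complex eigenvalues: tau^2 < 4*Delta")
    r = math.sqrt(disc)
    return (tau + r) / 2, (tau - r) / 2

lp, lm = eigenvalues_2x2(2, 1, 1, 2)   # tau = 4, Delta = 3
assert (lp, lm) == (3.0, 1.0)
assert lp + lm == 4 and lp * lm == 3   # sum = trace, product = determinant
```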

Generalized Shearing

In the repeated eigenvalue case with eigenvalue λ of algebraic
multiplicity 2, the matrix is either ±I2 , a contraction or dilation
matrix obtained from scaling ±I2 by λ, or the matrix of a
generalized shearing map.

A generalized shearing map with eigenvalue λ is a map x 7→ Ax
such that m(λ) = 2 but µ(λ) = 1, and such that A is similar to a
matrix of the form

    Jλ = λ [ 1  1 ]
           [ 0  1 ] .

The invertible matrix P = [ b1 b2 ] giving the similarity
A = PJλ P−1 consists of a vector b1 spanning Eλ and a vector b2
which is the pre-image of λb1 under the transformation
x 7→ (A − λI2 )x.

The Geometry of Plane Linear Automorphisms

Next we consider the case where A has two distinct eigenvalues.
The map T (x) = Ax may then be classified by the following
geometric considerations:
1 the effect of the map on areas,
2 the effect of the map on orientations,
3 the effect of the map on distances from the origin,
4 the existence or nonexistence of an eigenframe, or
equivalently, the nonexistence or existence respectively of a
minimum rotation angle between x and the line spanned by its
image T (x).
We’ll unpack each of these effects via conditions on the
eigenvalues.

It’s clear that condition (4) is related to whether the eigenvalues
are real or complex.

If the eigenvalues are real, since they are assumed distinct, we
know there are two linearly independent eigenvectors spanning
distinct eigenlines.

Any such pair gives an eigenframe, which is a frame of vectors
giving a basis of R2 that remains invariant under the action
of the linear map T (x) = Ax.

We thus will need to examine the other geometric effects to
understand the map.

On the other hand, if the eigenvalues are complex, we’ll be able to
prove that all vectors are rotated by T (not necessarily equally),
and there will be some minimum rotation angle between a vector
and the line spanned by its image.

The map will necessarily be orientation preserving (as the
determinant is positive), but the other geometric considerations
still apply.

In every case, knowing the eigenvalues, we can construct a matrix
similar to A which captures the essential geometry in a suitable
coordinate system.

We summarize the results in a table.

Linear Recursion and Difference Equations

Definition
An n-th order recurrence relation is a discrete relation of the form

xk = f (xk−n , xk−n+1 , . . . , xk−1 ) ,

for integers k ≥ n, where f is some function.

Such a relation, if solvable, defines a sequence {xk : k ∈ Z≥0 }
determined by the first n terms {x0 , . . . , xn−1 }.

An initial value recurrence problem for such a recurrence relation is
given if one knows the function f and the values of the first n
terms x0 , x1 , . . . , xn−1 , and wishes to solve the recurrence to
express the general term xk , k ≥ n, as a function of k.

Definition
An n-th order recurrence is linear homogeneous if f is a
homogeneous linear function, i.e., if the recurrence relation is of
the form

xk = a0 xk−n + a1 xk−n+1 + · · · + an−1 xk−1 = ∑_{i=0}^{n−1} ai xk−n+i ,

for numbers a0 , . . . , an−1 .

Observe that for an n-th order linear recurrence, xn satisfies
xn = a0 x0 + a1 x1 + · · · + an−1 xn−1 ,
whence the sequence {xk : k ∈ Z≥0 } is determined uniquely by the
initial values x0 , x1 , . . . , xn−1 .

Observe that a linear recurrence can be written in the form

    xk = a · xk−1 = aᵀ xk−1 ,

where a := (a0 , . . . , an−1 )ᵀ and xk−1 := (xk−n , . . . , xk−1 )ᵀ .

We can define a vector xk consisting of n consecutive terms ending
with xk , and a matrix C, called a companion matrix, such that
xk = Cxk−1 :

    xk := (xk−n+1 , xk−n+2 , . . . , xk−1 , xk )ᵀ ,

         [ 0    1    0    · · ·   0    ]
         [ 0    0    1    · · ·   0    ]
    C := [ .    .    .     .      .    ] ,
         [ 0    0    · · ·  0     1    ]
         [ a0   a1   · · · · · ·  an−1 ]

i.e., the first n − 1 rows of C are [ 0 | In−1 ] and the last row is aᵀ .
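The companion-matrix construction can be sketched as follows; iterating xk = Cxk−1 must reproduce direct iteration of the recurrence (Fibonacci, defined shortly by a0 = a1 = 1, serves as a convenient test case):

```python
# Build the companion matrix C of x_k = a_0 x_{k-n} + ... + a_{n-1} x_{k-1}.
def companion(a):
    n = len(a)
    C = [[1 if j == i + 1 else 0 for j in range(n)] for i in range(n - 1)]
    C.append(list(a))               # last row holds a_0, ..., a_{n-1}
    return C

def step(C, x):
    """One application of x_k = C x_{k-1}."""
    return [sum(c * xi for c, xi in zip(row, x)) for row in C]

a = [1, 1]                          # x_k = x_{k-2} + x_{k-1}
x = [0, 1]                          # initial values (x_0, x_1)
for _ in range(8):
    x = step(companion(a), x)
print(x)                            # → [21, 34], the pair (x_8, x_9)
```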

In this formulation, one sees that xn+k = Ck+1 xn−1 , for k ≥ −1. If
one can find an invertible matrix P such that C is similar to a
diagonal matrix D via P, then one can compute an explicit formula

xn+k = PDk+1 P−1 xn−1 ,

whose first entry gives an expression for xk in terms of the first n
terms x0 , x1 , . . . , xn−1 and powers of the entries of D.

In particular, one will see that the eigenvalues and eigenvectors of
C build a solution to the linear homogeneous initial value recursion
problem.

The Fibonacci Numbers

Definition
The Fibonacci numbers Fk are the numbers defined by the simple
linear recurrence

Fk+1 = Fk + Fk−1 , F0 = 0 , F1 = 1 .

The Fibonacci sequence is thus the sequence starting with
0, 1, 1, 2, 3, 5, 8, . . . whose next term is always the sum of the
preceding two terms.

We can get an explicit formula for the k-th Fibonacci number
using eigentheory.

From the recurrence relation Fk+1 = Fk + Fk−1 with the initial
values F0 = 0 and F1 = 1, we can rewrite this as a linear discrete
dynamical system of the form xk = Cxk−1 where

    xk = [ Fk−1 ] ,   C = [ 0  1 ] .
         [  Fk  ]         [ 1  1 ]

We will diagonalize C in order to obtain an explicit
formula for Fk .
The first step is to compute the characteristic polynomial χC (λ):

χC (λ) = λ2 − λ − 1 .



1± 5
We get two real, irrational eigenvalues, λ± = 2 .

We momentarily digress to mention that this polynomial and its


roots have been widely studied since antiquity; the positive
√ root
1+ 5
λ+ is the same as the famous golden ration φ = 2 . The
negative root is just −1/φ.

Observe that φ satisfies the useful and interesting relations


1 √
φ−1= , φ + 1 = φ2 , 1 + φ2 = 5φ ,
φ

as well as being given by the amusing (but less useful) formulae



q
1 »
φ=1+ 1 = 1+ 1+ 1 + ....
1+ 1+ 1
..
.

Exercise
Show that

    Eφ = Span { [ 1 ] } ,   E−1/φ = Span { [ −φ ] } .
                [ φ ]                      [  1 ]

Let

    P = [ 1  −φ ] .
        [ φ   1 ]

Show that P−1 = (1/√5) [ 1/φ   1  ] , and check that
                       [ −1   1/φ ]

    [ 0  1 ] = (1/√5) [ 1  −φ ] [ φ    0   ] [ 1/φ   1  ] .
    [ 1  1 ]          [ φ   1 ] [ 0  −1/φ  ] [ −1   1/φ ]

Then, using that

    [  Fk  ] = [ 0  1 ]^k [ 0 ] ,
    [ Fk+1 ]   [ 1  1 ]   [ 1 ]

we have that

    [  Fk  ] = (1/√5) [ 1  −φ ] [ φ^k     0      ] [ 1/φ   1  ] [ 0 ]
    [ Fk+1 ]          [ φ   1 ] [ 0   (−1/φ)^k  ] [ −1   1/φ ] [ 1 ]

             = (1/√5) [ φ^(k−1) − (−1/φ)^(k−1)   φ^k − (−1/φ)^k         ] [ 0 ]
                      [ φ^k − (−1/φ)^k           φ^(k+1) − (−1/φ)^(k+1) ] [ 1 ]

             = (1/√5) [ φ^k − (−1/φ)^k         ] ,
                      [ φ^(k+1) − (−1/φ)^(k+1) ]

whence

    Fk = (1/√5) (φ^k − (−1/φ)^k) = (1/√5) ( ((1 + √5)/2)^k − ((1 − √5)/2)^k ) .
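Binet's formula can be checked against the recurrence for small k, where double-precision floats are exact enough to round correctly; `fib_round` is the nearest-integer variant, which works because |(−1/φ)^k|/√5 < 1/2:

```python
import math

# Closed forms for F_k, checked against direct iteration of the recurrence.
phi = (1 + math.sqrt(5)) / 2

def fib_binet(k):
    return round((phi ** k - (-1 / phi) ** k) / math.sqrt(5))

def fib_round(k):
    # the (-1/phi)^k correction has magnitude < 1/2, so rounding suffices
    return round(phi ** k / math.sqrt(5))

F = [0, 1]
for _ in range(18):
    F.append(F[-2] + F[-1])
assert all(fib_binet(k) == F[k] for k in range(20))
assert all(fib_round(k) == F[k] for k in range(20))
```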

Observe that

    Fk = ⌊ (1/√5) ((1 + √5)/2)^k ⌉ ,

where ⌊x⌉ is the nearest integer function.

More generally, n-th order linear recursions can be solved with
general solutions that are linear combinations of products of
powers of eigenvalues of the associated companion matrix times
certain powers of the index.

Challenge Problem: Consider a general homogeneous n-th order
linear recurrence of the form xn = a0 x0 + · · · + an−1 xn−1 .
(1) Show that the polynomial t^n − ∑_{i=0}^{n−1} ai t^i is the characteristic
polynomial of the associated companion matrix C.
(2) For any eigenvalue λ which is a root of order m of the
characteristic polynomial, show that xk = k^p λ^k is a solution
of the recurrence equation for p ∈ {0, 1, . . . , m − 1}.

Challenge Problem (continued):

(3) Show that the general solution is given by linear combinations
    of terms of the form k^p λ^k. That is, show that any solution x_k
    of the recurrence has the form

        x_k = Σ_(i=1)^(l) Σ_(j=0)^(m_i − 1) b_(i,j) k^j λ_i^k
            = λ_1^k (b_(1,0) + b_(1,1) k + ⋯ + b_(1,m_1−1) k^(m_1−1))
              + ⋯ + λ_l^k (b_(l,0) + b_(l,1) k + ⋯ + b_(l,m_l−1) k^(m_l−1)) ,

    where λ_1, . . . , λ_l are the distinct eigenvalues of C with respective
    algebraic multiplicities m_1, . . . , m_l, and the b_(i,j) are constants.

(4) If one specifies values for x_0, . . . , x_(n−1), does this uniquely
    determine the constants b_(i,j)?
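To make part (2) concrete, here is a small Python check on a hypothetical recurrence chosen to have a repeated eigenvalue: x_k = 4 x_(k−1) − 4 x_(k−2) has characteristic polynomial t² − 4t + 4 = (t − 2)², so λ = 2 with multiplicity m = 2, and both 2^k and k·2^k should solve it.

```python
# For the recurrence x_k = 4 x_{k-1} - 4 x_{k-2} (double root lambda = 2),
# verify that x_k = k^p * 2^k solves it for p in {0, 1, ..., m-1} = {0, 1}.
for p in (0, 1):
    for k in range(2, 30):
        xk  = k ** p * 2 ** k
        rhs = 4 * ((k - 1) ** p * 2 ** (k - 1)) - 4 * ((k - 2) ** p * 2 ** (k - 2))
        assert xk == rhs   # exact integer arithmetic, no tolerance needed
```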

Linear Differential Equations

An n-dimensional linear system of differential equations is a system
of the form

    dx_1/dt = Σ_(j=1)^(n) a_(1,j)(t) x_j(t)
    dx_2/dt = Σ_(j=1)^(n) a_(2,j)(t) x_j(t)
        ⋮
    dx_i/dt = Σ_(j=1)^(n) a_(i,j)(t) x_j(t)
        ⋮
    dx_n/dt = Σ_(j=1)^(n) a_(n,j)(t) x_j(t) .

For the system to be linear in the variables x_k, we must assert that
the coefficients a_(i,j) are independent of x_k for all k, i.e. that
∂a_(i,j)/∂x_k = 0 for all i, j, and k. The system is called autonomous
if the coefficients a_(i,j) satisfy da_(i,j)/dt = 0 for all i and j, i.e., if the
coefficients are also constant in time.


Such a system can be compactly described by a linear vector
differential equation

    dx/dt (t) = A(t) x(t) .

Often one has an initial value problem, where at time t = 0 one is
given x(0) = x_0 for some constant vector x_0 ∈ R^n.

In the case of autonomous systems with a constant coefficient
matrix A, one can attempt to construct solutions as linear
combinations of eigenfunctions of the form e^(λt) v_λ, where λ is an
eigenvalue of A and v_λ is an associated eigenvector.


It’s not hard to see that such vectors furnish solutions: if λ is an
eigenvalue of A and v_λ is an associated eigenvector, then

    d/dt ( e^(λt) v_λ ) = λ e^(λt) v_λ ,

while

    A ( e^(λt) v_λ ) = e^(λt) A v_λ = e^(λt) (λ v_λ) = λ e^(λt) v_λ ,

so e^(λt) v_λ satisfies the differential equation x′(t) = A x(t).

The remarkable fact is that when A has n distinct eigenvalues
λ_1, . . . , λ_n, any solution is an element of

    Span{ e^(λ_1 t) v_(λ_1), . . . , e^(λ_n t) v_(λ_n) } .


Example
Consider the linear system of differential equations:

    dx_1/dt = −x_1 + 2 x_2
    dx_2/dt = 3 x_1 + 4 x_2    ←→    d/dt x(t) = A x(t) ,    A = [−1, 2; 3, 4] .

The matrix A has eigenvalues −2 and 5, with respective eigenvectors

    v_(−2) = [−2; 1]    and    v_5 = [1; 3] ,

whence

    x(t) = c_1 e^(−2t) v_(−2) + c_2 e^(5t) v_5    ←→    x_1(t) = −2 c_1 e^(−2t) + c_2 e^(5t) ,
                                                        x_2(t) = c_1 e^(−2t) + 3 c_2 e^(5t)

gives a general solution.
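One can verify this general solution directly by differentiating the exponentials and comparing with the right-hand side of the system; a Python sketch with arbitrarily chosen constants c_1, c_2 (illustrative only):

```python
import math

# Sample constants for the general solution (any values work).
c1, c2 = 1.3, -0.7

def x1(t):  return -2 * c1 * math.exp(-2 * t) + c2 * math.exp(5 * t)
def x2(t):  return c1 * math.exp(-2 * t) + 3 * c2 * math.exp(5 * t)
def dx1(t): return 4 * c1 * math.exp(-2 * t) + 5 * c2 * math.exp(5 * t)   # exact derivative of x1
def dx2(t): return -2 * c1 * math.exp(-2 * t) + 15 * c2 * math.exp(5 * t)  # exact derivative of x2

# Check x1' = -x1 + 2 x2 and x2' = 3 x1 + 4 x2 at several times.
for t in (0.0, 0.3, 1.0):
    assert abs(dx1(t) - (-x1(t) + 2 * x2(t))) < 1e-9
    assert abs(dx2(t) - (3 * x1(t) + 4 * x2(t))) < 1e-9
```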


Example (An Example with Imaginary Eigenvalues)

Consider the second order linear homogeneous differential equation

    x″(t) + x(t) = 0 .

We can convert it to a first order linear system by introducing a
new variable: the velocity v(t) = x′(t). The system becomes the
matrix equation

    d/dt [x; v] = [0, 1; −1, 0] [x; v] .

The matrix of the system has eigenvalues ±i, with respective
complex eigenvectors v_i = [1; i] and v_(−i) = [1; −i].

Example (An Example with Imaginary Eigenvalues - continued)

The general solution to the first order system is

    [x; v] = c_1 e^(it) [1; i] + c_2 e^(−it) [1; −i] ,    c_1, c_2 ∈ C .

It follows that a complex solution to the second order equation has
the form x(t) = c_1 e^(it) + c_2 e^(−it). Using that e^(it) = cos t + i sin t, and
setting a = c_1 + c_2 and b = i(c_1 − c_2), we obtain

    x(t) = a cos(t) + b sin(t) .

One can determine the real coefficients a and b provided sufficient
real initial conditions are given, such as real values for x(t_0) and
x′(t_0) for some initial time t_0 ∈ R.
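A short Python check (illustrative; the conjugate pair c_2 = c̄_1 is chosen so that a and b come out real) confirms that c_1 e^(it) + c_2 e^(−it) = a cos t + b sin t:

```python
import cmath
import math

# Conjugate constants give a real-valued solution.
c1, c2 = 0.5 + 0.25j, 0.5 - 0.25j
a = c1 + c2          # = 1 (real)
b = 1j * (c1 - c2)   # = -0.5 (real)

for t in (0.0, 0.7, 2.0):
    lhs = c1 * cmath.exp(1j * t) + c2 * cmath.exp(-1j * t)
    rhs = a * math.cos(t) + b * math.sin(t)
    assert abs(lhs - rhs) < 1e-12
```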


Final Exam Information

The final exam for all sections will be held


Monday 5/7/18, 10:30AM-12:30PM, in Boyden gym.
