Orthogonal diagonalization
By using what we have learned about eigenvalues and similarity, we can prove a really intriguing fact.

Technical fact

If $Q$ is an orthogonal matrix, its only possible real eigenvalues are $1$ and $-1$.

Proof

Assume that $Q$ is an orthogonal matrix, $\lambda$ is one of its eigenvalues and $\mathbf{v}$ is one of its eigenvectors. Then:

$$Q\mathbf{v} = \lambda\mathbf{v} \;\implies\; \|Q\mathbf{v}\| = \|\lambda\mathbf{v}\|$$

But, being an orthogonal matrix, $Q$ preserves lengths. Moreover, the scalar multiple can be separated by using its absolute value. Therefore:

$$\|Q\mathbf{v}\| = \|\lambda\mathbf{v}\| \;\implies\; \|\mathbf{v}\| = |\lambda|\,\|\mathbf{v}\| \;\implies\; |\lambda| = 1 \;\implies\; \lambda = \pm 1$$

So, orthogonal matrices are also nice in the sense that their eigenvalues are simple.

Yes, and that implies that a matrix similar to an orthogonal matrix can also have only $1$ and $-1$ as eigenvalues, a fact that may make other computations easy. But it turns out that another linkage between similarity and orthogonal matrices leads to interesting results.

Definition

A matrix $A$ is orthogonally diagonalizable if it can be written as

$$A = QDQ^{-1}$$

where $D$ is a diagonal matrix and $Q$ is an orthogonal matrix.

So, we are looking at a matrix that is diagonalizable through an orthogonal matrix.
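To see the technical fact above in action, here is a quick numerical check (my addition, not part of the original notes; it assumes Python with numpy). A Householder reflection is a standard example of an orthogonal matrix, and its eigenvalues do come out as $1$ and $-1$:

```python
import numpy as np

# A Householder reflection H = I - 2 (w w^T) / (w^T w) is orthogonal
# for any nonzero vector w; the particular w below is an arbitrary choice.
w = np.array([1.0, 2.0, 2.0])
H = np.eye(3) - 2.0 * np.outer(w, w) / w.dot(w)

# Orthogonality check: H^T H equals the identity, up to rounding.
print(np.allclose(H.T @ H, np.eye(3)))         # True

# Each eigenvalue is 1 or -1, as the technical fact predicts.
print(np.round(np.linalg.eigvals(H).real, 6))  # e.g. [-1.  1.  1.]
```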
Yes: isn’t that fun to say? But there is an easier characterization of these matrices, one that may be a big surprise for you. In fact, the big surprise I have mentioned a few sections back!

Let me start by pointing out a key property of orthogonally diagonalizable matrices.

Technical fact

Every orthogonally diagonalizable matrix is symmetric.

Proof

If $A = QDQ^{-1}$ with $Q$ orthogonal, then $Q^{-1} = Q^T$ and, since $D$ is diagonal, $D^T = D$. Hence:

$$A^T = \left(QDQ^{-1}\right)^T = \left(QDQ^T\right)^T = \left(Q^T\right)^T D^T Q^T = QDQ^{-1} = A$$

Therefore, the matrix is symmetric.

Too bad that this is another one of those one-way statements, or we could say that orthogonally diagonalizable matrices and symmetric matrices are the same thing.

Well, that is the surprise: they are!

Technical fact

The Spectral Theorem for symmetric matrices

Every symmetric matrix is orthogonally diagonalizable.

Example:

$$A = \begin{bmatrix} 1 & -2 & 0 \\ -2 & 0 & 2 \\ 0 & 2 & -1 \end{bmatrix}$$

This matrix is symmetric, so we should be able to diagonalize it by using an orthogonal matrix. We use the usual method of finding eigenvalues and eigenvectors and then arranging them properly.
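Before working through the algebra by hand, here is a quick preview of where the computation will land (my addition, assuming Python with numpy; `eigh` is numpy's eigensolver for symmetric matrices):

```python
import numpy as np

A = np.array([[ 1, -2,  0],
              [-2,  0,  2],
              [ 0,  2, -1]], dtype=float)

# The matrix equals its own transpose, so it is symmetric.
print(np.allclose(A, A.T))               # True

# For a symmetric matrix, eigh returns real eigenvalues and
# orthonormal eigenvectors (the columns of Q).
eigenvalues, Q = np.linalg.eigh(A)
print(np.round(eigenvalues, 6))          # [-3.  0.  3.]
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
```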
For eigenvalues, we get (expanding the determinant along the third row):

$$\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & -2 & 0 \\ -2 & -\lambda & 2 \\ 0 & 2 & -1-\lambda \end{vmatrix} = -2\begin{vmatrix} 1-\lambda & 0 \\ -2 & 2 \end{vmatrix} - (1+\lambda)\begin{vmatrix} 1-\lambda & -2 \\ -2 & -\lambda \end{vmatrix}$$

$$= -4 + 4\lambda - (1+\lambda)\left(\lambda^2 - \lambda - 4\right) = -4 + 4\lambda - \lambda^2 + \lambda + 4 - \lambda^3 + \lambda^2 + 4\lambda = -\lambda^3 + 9\lambda = -\lambda\left(\lambda^2 - 9\right)$$
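If you want to double-check that expansion without redoing the algebra, numpy can recover the coefficients of the characteristic polynomial (my addition; note that `np.poly` returns the coefficients of $\det(\lambda I - A) = \lambda^3 - 9\lambda$, the negative of the polynomial computed above):

```python
import numpy as np

A = np.array([[ 1, -2,  0],
              [-2,  0,  2],
              [ 0,  2, -1]], dtype=float)

# Coefficients of det(lambda*I - A), highest power first:
# 1*lambda^3 + 0*lambda^2 - 9*lambda + 0
print(np.round(np.poly(A), 6))   # approximately [ 1.  0. -9.  0.]
```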
Therefore, the eigenvalues are $\lambda = -3, 0, 3$, which allows us to get the diagonal matrix:

$$D = \begin{bmatrix} -3 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 3 \end{bmatrix}$$

Now we construct unit eigenvectors for each eigenvalue:

$$\lambda = -3: \quad \begin{bmatrix} 4 & -2 & 0 \\ -2 & 3 & 2 \\ 0 & 2 & 2 \end{bmatrix} \xrightarrow{\text{REF}} \begin{bmatrix} 2 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix} \implies \mathbf{v}_1 = s\begin{bmatrix} -1 \\ -2 \\ 2 \end{bmatrix}$$

$$\lambda = 0: \quad \begin{bmatrix} 1 & -2 & 0 \\ -2 & 0 & 2 \\ 0 & 2 & -1 \end{bmatrix} \xrightarrow{\text{REF}} \begin{bmatrix} 1 & 0 & -1 \\ 0 & 2 & -1 \\ 0 & 0 & 0 \end{bmatrix} \implies \mathbf{v}_2 = t\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}$$

$$\lambda = 3: \quad \begin{bmatrix} -2 & -2 & 0 \\ -2 & -3 & 2 \\ 0 & 2 & -4 \end{bmatrix} \xrightarrow{\text{REF}} \begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & -2 \\ 0 & 0 & 0 \end{bmatrix} \implies \mathbf{v}_3 = k\begin{bmatrix} -2 \\ 2 \\ 1 \end{bmatrix}$$

Now notice that the three basic vectors we have found are perpendicular to each other. By normalizing each of them, we can obtain an orthonormal set of vectors:

$$\mathbf{v}_1 = s\begin{bmatrix} -1 \\ -2 \\ 2 \end{bmatrix} \implies \mathbf{u}_1 = \frac{1}{3}\begin{bmatrix} -1 \\ -2 \\ 2 \end{bmatrix}, \qquad \mathbf{v}_2 = t\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} \implies \mathbf{u}_2 = \frac{1}{3}\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}, \qquad \mathbf{v}_3 = k\begin{bmatrix} -2 \\ 2 \\ 1 \end{bmatrix} \implies \mathbf{u}_3 = \frac{1}{3}\begin{bmatrix} -2 \\ 2 \\ 1 \end{bmatrix}$$

If we arrange these three vectors as columns of a matrix, we obtain an orthogonal one, as claimed. Therefore:

$$Q = \frac{1}{3}\begin{bmatrix} -1 & 2 & -2 \\ -2 & 1 & 2 \\ 2 & 2 & 1 \end{bmatrix}, \qquad D = \begin{bmatrix} -3 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 3 \end{bmatrix}, \qquad A = QDQ^{-1}$$

Cool! But why is it called the Spectral Theorem?

Because its proof is based on finding certain characteristics of the spectrum of its eigenvalues, that is, the entire set of eigenvalues of the matrix. It is also related to a larger set of theorems that are also called spectral for similar reasons (they relate to the entire spectrum of something).

Outline of the proof

The really difficult part of the proof is to show that a symmetric $n \times n$ matrix $A$ only has real eigenvalues, that is, that all solutions of the characteristic equation are real numbers. This part of the proof requires the use of complex numbers and their theory.

Once this is done, we know that the sum of the algebraic multiplicities of the eigenvalues of $A$ equals $n$; but what about their geometric multiplicities? It turns out that for a symmetric matrix, geometric and algebraic multiplicities are equal for each eigenvalue. This is less difficult, but still tricky.

Once we know this, we know that $A$ has enough eigenvectors to make it diagonalizable, so one only needs to prove that the eigenspaces of different eigenvalues are not just independent, but orthogonal. This is another challenging step, but it turns out to be true. This, together with some steps of the modified Gram-Schmidt process, tells us that we can find a set of orthonormal eigenvectors to make up the matrix $Q$.

Although I have not given you the details of the proof, we can extract from the outline an interesting set of facts about symmetric matrices.
Technical facts

Given a symmetric $n \times n$ matrix $A$:

It is always possible to find an orthogonal set consisting of $n$ eigenvectors for $A$.

Any two eigenvectors of $A$ corresponding to different eigenvalues are orthogonal.

If $\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n$ is a set of orthonormal column eigenvectors, corresponding, respectively, to the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, then $A = \lambda_1 \mathbf{u}_1 \mathbf{u}_1^T + \lambda_2 \mathbf{u}_2 \mathbf{u}_2^T + \cdots + \lambda_n \mathbf{u}_n \mathbf{u}_n^T$.

To see why, write the orthogonal diagonalization in expanded form:

$$A = \begin{bmatrix} \mathbf{u}_1 & \mathbf{u}_2 & \cdots & \mathbf{u}_n \end{bmatrix} \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} \begin{bmatrix} \mathbf{u}_1^T \\ \mathbf{u}_2^T \\ \vdots \\ \mathbf{u}_n^T \end{bmatrix}$$

And now we only have to compute this product:

$$A = \begin{bmatrix} \mathbf{u}_1 & \mathbf{u}_2 & \cdots & \mathbf{u}_n \end{bmatrix} \begin{bmatrix} \lambda_1 \mathbf{u}_1^T \\ \lambda_2 \mathbf{u}_2^T \\ \vdots \\ \lambda_n \mathbf{u}_n^T \end{bmatrix} = \lambda_1 \mathbf{u}_1 \mathbf{u}_1^T + \lambda_2 \mathbf{u}_2 \mathbf{u}_2^T + \cdots + \lambda_n \mathbf{u}_n \mathbf{u}_n^T$$

as claimed.
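To make these facts concrete, here is a short numerical check on the worked example above (my addition, assuming Python with numpy): it verifies both $A = QDQ^T$ and the rank-one expansion $A = \lambda_1\mathbf{u}_1\mathbf{u}_1^T + \lambda_2\mathbf{u}_2\mathbf{u}_2^T + \lambda_3\mathbf{u}_3\mathbf{u}_3^T$.

```python
import numpy as np

A = np.array([[ 1, -2,  0],
              [-2,  0,  2],
              [ 0,  2, -1]], dtype=float)

# Hand-computed orthogonal diagonalization from the example above.
Q = np.array([[-1,  2, -2],
              [-2,  1,  2],
              [ 2,  2,  1]], dtype=float) / 3.0
lams = np.array([-3.0, 0.0, 3.0])
D = np.diag(lams)

# Q is orthogonal, so Q^{-1} = Q^T and A = Q D Q^T.
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
print(np.allclose(Q @ D @ Q.T, A))       # True

# Spectral decomposition: A as a weighted sum of outer products
# of the orthonormal eigenvectors (the columns of Q).
S = sum(lam * np.outer(u, u) for lam, u in zip(lams, Q.T))
print(np.allclose(S, A))                 # True
```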
Summary
Symmetric matrices are the one and only matrices that can be diagonalized through an orthogonal matrix.
This implies that symmetric matrices can be decomposed into a sum of outer products of their orthonormal eigenvectors, weighted by the corresponding eigenvalues.
Learning questions for Section LA 10-6
Review questions:
Memory questions:
1. What are the possible eigenvalues for an orthogonal matrix?

2. What does the Spectral Theorem state?

3. Which matrices are orthogonally diagonalizable?

4. Are all symmetric matrices diagonalizable?
Computation questions:
Theory questions:
1. If Q is an orthogonal matrix, what can the product of its eigenvalues be?

2. What is special about symmetric matrices with respect to diagonalization?

3. What feature of the matrix A tells us that it is not orthogonally diagonalizable?

4. Can an upper triangular matrix be orthogonally diagonalizable?

5. Does every orthogonal matrix have eigenvalues?

6. Assuming that a matrix A is orthogonally diagonalizable, when may the Gram-Schmidt process be needed to complete such diagonalization?

7. Which theorem relates symmetric and orthogonal matrices?
Proof questions:
1. Prove that if Q is an orthogonal matrix and λ is one of its eigenvalues, then Q^n is also orthogonal for any whole number n.
2. Prove that if A and B are two orthogonal matrices of the same size, then the square of the product of all eigenvalues of BA is 1.
Templated questions: