8.2 Orthogonal Matrices
The fact that the eigenvectors of a symmetric matrix A are orthogonal implies that the coordinate system defined by the eigenvectors is an orthogonal coordinate system. In particular, it makes the diagonalization A = TDT^-1 of the matrix A particularly nice. Recall that T is the matrix whose columns are the eigenvectors v1, ..., vn of A and D is the diagonal matrix with the eigenvalues λ1, ..., λn of A on the main diagonal and zeros elsewhere. It is convenient to normalize the eigenvectors before using them as the columns of T. Recall that normalizing a vector v means dividing it by its length |v| so that the resulting vector has unit length. A normalized eigenvector is still an eigenvector, since any nonzero scalar multiple of an eigenvector is still an eigenvector. If we use the normalized eigenvectors as the columns of T, then the columns of T are orthogonal and have length one. Such a matrix is called an orthogonal matrix.
Definition 1. An orthogonal matrix is a square matrix S whose columns are orthogonal
and have length one.
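As a quick numerical companion to this definition, the sketch below (assuming NumPy; `is_orthogonal` is a hypothetical helper, not from the text) tests whether the columns of a matrix are orthogonal unit vectors by checking S^T S = I:

```python
import numpy as np

def is_orthogonal(S, tol=1e-10):
    """Return True when S is square and S^T S = I, i.e. the columns
    of S are orthogonal unit vectors."""
    S = np.asarray(S, dtype=float)
    if S.shape[0] != S.shape[1]:
        return False  # orthogonal matrices are square by definition
    return np.allclose(S.T @ S, np.eye(S.shape[0]), atol=tol)

S = np.array([[2.0, 1.0], [-1.0, 2.0]]) / np.sqrt(5)
print(is_orthogonal(S))                         # True
print(is_orthogonal([[1.0, 1.0], [0.0, 1.0]]))  # False: columns not orthonormal
```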
Example 1. Find the diagonalization of A = using the normalized eigenvectors as the columns of T. From Example 3 in the previous section, the eigenvalues of A are λ1 = 5 and λ2 = 20 and the eigenvectors are v1 = and v2 = . One has |v1| = |v2| = . So the normalized eigenvectors are u1 = and u2 = , and T = . The diagonalization of A is A = TDT^-1 = . Note that T = is an orthogonal matrix.
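The matrices of Example 1 are not reproduced in this copy of the text, so the sketch below uses a stand-in symmetric matrix (an assumption) that happens to have the same eigenvalues, 5 and 20, and checks the diagonalization numerically with NumPy's `eigh`, which returns unit-length eigenvectors so that T comes out orthogonal automatically:

```python
import numpy as np

# Stand-in symmetric matrix (an assumption; the text's A is not shown here).
# It happens to have eigenvalues 5 and 20, like the A of Example 1.
A = np.array([[8.0, 6.0], [6.0, 17.0]])

# eigh is for symmetric matrices; it returns eigenvalues in ascending order
# and normalized eigenvectors, so the columns of T are orthonormal.
eigenvalues, T = np.linalg.eigh(A)
D = np.diag(eigenvalues)

print(eigenvalues)                       # close to [5, 20]
print(np.allclose(T.T @ T, np.eye(2)))   # True: T is orthogonal
print(np.allclose(T @ D @ T.T, A))       # True: A = T D T^-1 with T^-1 = T^T
```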


Example 2. Let R = [cos θ, -sin θ; sin θ, cos θ] be the matrix for a rotation by an angle θ. Then R is orthogonal. Let F = be the matrix for a reflection across the y-axis. Then F is orthogonal. Also RF = is orthogonal.
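These claims are easy to spot-check numerically. The sketch below assumes the standard counterclockwise rotation matrix and one common convention, [-1, 0; 0, 1], for the reflection across the y-axis (the text's F is not shown, so this matrix is an assumption):

```python
import numpy as np

theta = 0.7  # an arbitrary angle

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by theta
F = np.array([[-1.0, 0.0],
              [ 0.0, 1.0]])                      # reflection across the y-axis (assumed convention)

I = np.eye(2)
print(np.allclose(R.T @ R, I))              # True
print(np.allclose(F.T @ F, I))              # True
print(np.allclose((R @ F).T @ (R @ F), I))  # True
```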
Proposition 1. If S = [a, b; c, d] is orthogonal then S = R or S = RF, where θ is the angle the first column of S makes with the x-axis.
Proof. S is orthogonal ⟹ (a, c) is a unit vector ⟹ a^2 + c^2 = 1 ⟹ r = 1, where (r, θ) are the polar coordinates of (a, c) ⟹ a = cos θ and c = sin θ. S is orthogonal ⟹ (a, c) and (b, d) are orthogonal ⟹ (b, d) = t(-c, a) for some number t. S is orthogonal ⟹ (b, d) is a unit vector ⟹ t = 1 or t = -1. If t = 1 then S = R. If t = -1 then S = RF. //

An orthogonal matrix is nice because it is easy to compute its inverse: the inverse turns out to be equal to the transpose.
Theorem 2. S is orthogonal if and only if S^-1 = S^T.


Proof. It suffices to show that S is orthogonal ⟺ S^T S = I, i.e. (S^T S)_ij = I_ij. One has (S^T S)_ij equal to the product of row i of S^T and column j of S. However, row i of S^T is the transpose of column i of S. So (S^T S)_ij = (S_,i)^T (S_,j) = S_,i · S_,j, where S_,i denotes column i of S. Note that S is orthogonal ⟺ S_,i · S_,j = 0 = I_ij if i ≠ j and S_,i · S_,j = 1 = I_ij if i = j. So S is orthogonal ⟺ (S^T S)_ij = I_ij. //
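Theorem 2 can be spot-checked numerically; here is a minimal sketch using the orthogonal matrix (1/√5)[2, 1; -1, 2] as an assumed example:

```python
import numpy as np

S = np.array([[2.0, 1.0], [-1.0, 2.0]]) / np.sqrt(5)  # columns are orthonormal

# Theorem 2: for an orthogonal matrix, inverse and transpose coincide.
print(np.allclose(np.linalg.inv(S), S.T))  # True
```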
Example 3. Find the inverse of T = . T is orthogonal, so T^-1 = T^T = . In particular, the diagonalization of A in Example 1 is A = TDT^T = . An application of this is computing A^n: one has A^n = T D^n T^T = .
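The A^n computation can be sketched as follows, again with a stand-in symmetric matrix (an assumption, since the text's A is not reproduced here). D^n just raises the diagonal entries to the n-th power, which is far cheaper than repeated matrix multiplication:

```python
import numpy as np

A = np.array([[8.0, 6.0], [6.0, 17.0]])  # stand-in symmetric matrix (assumption)
eigenvalues, T = np.linalg.eigh(A)       # T is orthogonal, so T^-1 = T^T

n = 4
A_n = T @ np.diag(eigenvalues ** n) @ T.T  # A^n = T D^n T^T

print(np.allclose(A_n, np.linalg.matrix_power(A, n)))  # True
```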
Problem 1. Let A = be the matrix in Problem 1 in section 6.1. (a) Normalize the eigenvectors you found in that problem. (b) What is the matrix T whose columns are the normalized eigenvectors? (c) Find the inverse of T. (d) Find A^n.
Problem 2. Let A = be the matrix in Problem 2 in section 6.1. (a) Normalize the eigenvectors you found in that problem. (b) What is the matrix T whose columns are the normalized eigenvectors? (c) Find the inverse of T. (d) Find A^n. (e) Find the solution to the difference equations
s_{n+1} = 0.98 s_n + 0.12 g_n
g_{n+1} = 0.12 s_n + 1.08 g_n
with the initial conditions s_0 = 1 and g_0 = 2.
Answers: (a) u1 = and u2 = . (b) T = . (c) T is orthogonal so T^-1 = T^T = . (d) A^n = . (e) (s_n, g_n)^T = A^n (s_0, g_0)^T = .
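For part (e), the recurrence can be written as (s_{n+1}, g_{n+1})^T = A (s_n, g_n)^T, so (s_n, g_n)^T = A^n (s_0, g_0)^T. A minimal numerical cross-check of the diagonalization approach against direct iteration:

```python
import numpy as np

A = np.array([[0.98, 0.12],
              [0.12, 1.08]])   # coefficient matrix of the recurrence
x0 = np.array([1.0, 2.0])      # (s_0, g_0)

# Direct iteration of the difference equations for 10 steps.
x = x0.copy()
for _ in range(10):
    x = A @ x

# Closed form via diagonalization: A^n = T D^n T^T since A is symmetric.
eigenvalues, T = np.linalg.eigh(A)
x_closed = T @ np.diag(eigenvalues ** 10) @ T.T @ x0

print(np.allclose(x, x_closed))  # True
```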

One consequence of Theorem 2 is that the product of two orthogonal matrices is orthogonal and the inverse of an orthogonal matrix is orthogonal.
Proposition 3. If S and T are orthogonal then so are ST and S^-1.
Proof. S and T are orthogonal ⟹ S^-1 = S^T and T^-1 = T^T ⟹ (ST)^-1 = T^-1 S^-1 = T^T S^T = (ST)^T ⟹ ST is orthogonal. S is orthogonal ⟹ S^-1 = S^T ⟹ (S^-1)^-1 = (S^T)^-1 = (S^-1)^T ⟹ S^-1 is orthogonal. //
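Proposition 3 is also easy to spot-check; the rotation and permutation matrices below are assumed examples:

```python
import numpy as np

def is_orthogonal(M, tol=1e-10):
    return np.allclose(M.T @ M, np.eye(M.shape[0]), atol=tol)

theta = 1.1
S = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a rotation
T = np.array([[0.0, 1.0],
              [1.0, 0.0]])                        # a permutation matrix (orthogonal)

print(is_orthogonal(S @ T))             # True: the product is orthogonal
print(is_orthogonal(np.linalg.inv(S)))  # True: the inverse is orthogonal
```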
Proposition 1 gave one characterization of orthogonal matrices. Another is the following, which says that the orthogonal matrices are exactly the ones that preserve inner products and lengths.
Proposition 4. (a) S is orthogonal if and only if (Sx) . (Sy) = x . y for all x and y. (b) S
is orthogonal if and only if | Sx | = | x | for all x.
Proof. (a) One has (Sx) · (Sy) = (Sx)^T (Sy) = x^T S^T S y. If S is orthogonal then S^T S = I by Theorem 2, so this equals x^T y = x · y. Conversely, taking x = e_i and y = e_j (the standard basis vectors) gives S_,i · S_,j = e_i · e_j, so the columns of S are orthogonal unit vectors and S is orthogonal. (b) Taking y = x in (a) gives |Sx|^2 = |x|^2. The converse follows since inner products can be recovered from lengths by the polarization identity. //
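Proposition 4 can be checked on random vectors; a minimal sketch with an assumed rotation matrix as the orthogonal S:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.4
S = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # an orthogonal matrix

x = rng.standard_normal(2)
y = rng.standard_normal(2)

# (a) inner products are preserved; (b) lengths are preserved.
print(np.isclose((S @ x) @ (S @ y), x @ y))                  # True
print(np.isclose(np.linalg.norm(S @ x), np.linalg.norm(x)))  # True
```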

