Math 5610 Fall 2018 Notes of 10/16/18

The document discusses orthogonal iteration and the QR algorithm for computing eigenvalues and eigenvectors of matrices. Orthogonal iteration uses nested orthogonal transformations to compute several eigenvalues and eigenvectors of a matrix simultaneously. The QR algorithm refines orthogonal iteration by applying orthogonal QR factorizations in an iterative process that reduces a matrix to real Schur form, from which the eigenvalues can be read off. Key aspects of the QR algorithm include an initial Householder-based reduction to upper Hessenberg form, Givens rotations that preserve that structure at each step, and a double shift of origin to accelerate convergence.



Orthogonal Iteration

• How can we use the power method to compute more than one eigenvalue?
• How about:
  1. Pick a random n × r matrix Y0 .
  2. For k = 0, 1, 2, . . .:

         Zk+1 = A Yk
         Yk+1 = Zk+1 / ‖Zk+1 ‖

• This is no good. The columns of Yk do not interact with each other, and so we run r copies of the scalar power method. Each column of Yk will converge to the same dominant eigenvector.
• We need to keep those columns linearly independent.
• How independent?
• How about orthogonal?
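To see the failure concretely, here is a numpy sketch of the scheme above; the symmetric test matrix with a well separated dominant eigenvalue (and the seed) are our own choices, not from the notes.

```python
# Naive block power method: normalizing each column on its own just runs
# r independent copies of the scalar power method.
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 3
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = V @ np.diag([5.0, 2.0, 1.0, 0.5, 0.2, 0.1]) @ V.T   # symmetric, known spectrum

Y = rng.standard_normal((n, r))
for _ in range(100):
    Z = A @ Y
    Y = Z / np.linalg.norm(Z, axis=0)   # per-column normalization

# Every column has collapsed onto (a sign of) the same dominant eigenvector:
overlap = min(abs(Y[:, 0] @ Y[:, j]) for j in range(1, r))
```

With the dominant eigenvalue 5 and the next one 2, the collapse is essentially complete after a few dozen iterations.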



• This gives rise to Orthogonal Iteration:
  1. Suppose Q0 = Q is n × r with orthonormal columns, i.e.,

         QT0 Q0 = Ir

  2. For k = 0, 1, 2, . . .:

         compute Zk+1 = A Qk
         factor  Zk+1 = Qk+1 Rk+1

     where Rk+1 is upper triangular and QTk+1 Qk+1 = Ir .

• Note that, for any s ≤ r, as far as the first s columns of Qk are concerned, this is just orthogonal iteration with s columns.
• We have r nested orthogonal iterations.
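The loop above can be sketched in numpy; the synthetic symmetric matrix with known spectrum (5, 4, 3, 1, 0.5, 0.1) is our own test setup so that convergence is easy to verify.

```python
# Orthogonal Iteration: the QR factorization keeps the columns orthonormal,
# so they converge to distinct eigenvectors instead of collapsing.
import numpy as np

rng = np.random.default_rng(1)
n, r = 6, 3
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = V @ np.diag([5.0, 4.0, 3.0, 1.0, 0.5, 0.1]) @ V.T

Qk = np.linalg.qr(rng.standard_normal((n, r)))[0]   # Q0 with orthonormal columns
for _ in range(300):
    Z = A @ Qk                  # compute Z_{k+1} = A Q_k
    Qk, Rk = np.linalg.qr(Z)    # factor  Z_{k+1} = Q_{k+1} R_{k+1}

# Q_k^T A Q_k now reveals the r largest eigenvalues on its diagonal:
evals = np.sort(np.diag(Qk.T @ A @ Qk))[::-1]
```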



How about r = n?

• We would find all eigenvectors.



• It turns out that Orthogonal Iteration with r = n is equivalent to running the QR iteration:
1. T0 = A
2. For k = 0, 1, 2, . . .:

Factor Tk = Qk Rk
Compute Tk+1 = Rk Qk

or, more succinctly


2. For k = 0, 1, 2, . . .:

Compute T = QR
Overwrite T = RQ

• Or yet more succinctly, if you don’t mind overwriting A:
  2. For k = 0, 1, 2, . . .:

         Compute A = QR
         Overwrite A = RQ

• matlab demo
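In place of the matlab demo, a numpy sketch of the succinct iteration above; the symmetric test matrix with eigenvalues 4, 3, 2, 1, 0.5 is our own choice.

```python
# Plain (unshifted) QR iteration: factor, then multiply back in reverse order.
import numpy as np

rng = np.random.default_rng(2)
V, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = V @ np.diag([4.0, 3.0, 2.0, 1.0, 0.5]) @ V.T

T = A.copy()
for _ in range(200):
    Q, R = np.linalg.qr(T)   # Compute  T = QR
    T = R @ Q                # Overwrite T = RQ

# For this symmetric matrix T is now numerically diagonal,
# with the eigenvalues of A on the diagonal.
off = np.linalg.norm(T - np.diag(np.diag(T)))
```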
• Why does this work?
• Go back to orthogonal iteration:



Q0 = I
For k = 1, 2, 3, . . .
Zk = AQk−1
Zk = Qk Rk

• Let
Tk = QTk AQk .

Tk is similar to A!

• Also note that

AQk−1 = Zk = Qk Rk

• Tk is obtained from Tk−1 by the QR iteration.


To see this note that the QR factorization of
Tk−1 is given by

Tk−1 = QTk−1 AQk−1 = QTk−1 Zk = QTk−1 Qk Rk = QR

where

Q = QTk−1 Qk and R = Rk . (1)



• Now consider Tk :

      Tk = QTk A Qk
         = QTk A Qk−1 (QTk−1 Qk )      (insert I = Qk−1 QTk−1 )
         = QTk Zk (QTk−1 Qk )          (A Qk−1 = Zk )
         = QTk Qk Rk (QTk−1 Qk )       (Zk = Qk Rk )
         = Rk (QTk−1 Qk )              (QTk Qk = I)
         = RQ

  where Q and R are given in (1).
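The equivalence can be checked numerically. One subtlety of such a check: a computed QR factorization is unique only up to column signs, so the two iterates agree only up to a diagonal ±1 similarity; comparing absolute values removes that ambiguity. A sketch under those assumptions:

```python
# Run orthogonal iteration with r = n, Q0 = I, alongside the QR iteration
# with T0 = A, and compare Q_k^T A Q_k against the k-th QR iterate.
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))

Qk = np.eye(n)          # orthogonal iteration, Q0 = I
T = A.copy()            # QR iteration, T0 = A
for _ in range(5):
    Qk, _ = np.linalg.qr(A @ Qk)   # Z_k = A Q_{k-1} = Q_k R_k
    Q, R = np.linalg.qr(T)         # T_{k-1} = QR
    T = R @ Q                      # T_k = RQ

# Entrywise agreement up to the diagonal sign similarity:
diff = np.linalg.norm(np.abs(Qk.T @ A @ Qk) - np.abs(T))
```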


• Note that every step of this algorithm requires O(n³) operations.
• The QR algorithm starts with this idea and greatly refines it.



QR Algorithm Overview

• The QR Algorithm computes the eigenvalues (and the eigenvectors, if required) of a general full matrix.
• Variants of the QR algorithm exist for symmetric or sparse matrices.
• The QR algorithm is much more complicated than its counterpart, Gaussian Elimination, the basic technique for solving linear systems.
• The best description of the QR algorithm is in Golub/van Loan: Gene H. Golub and Charles F. van Loan, Matrix Computations, 4th ed., Johns Hopkins University Press, 2013, ISBN 978-1-4214-0794-4. See sections 7.3–7.5 for the general QR algorithm, and section 8.3 for the symmetric matrix version.
• However, as a first introduction you may want to read the classic article by David S. Watkins, Understanding the QR Algorithm, SIAM Review, 1982, Vol. 24, No. 4, pp. 427–440.
• We sometimes cover the QR algorithm in depth
in Math 6610. If you contemplate taking that
course check with the instructor before the
semester starts.
• Studying the QR algorithm is well worth your
time and effort since it embodies most, if not
all, of the key concepts of numerical linear
algebra.



Outline of the QR Algorithm
• The basic idea is to apply orthogonal similarity transforms to convert A to Real Schur Form:

                    [ R11  R12  . . .  R1m ]
      R = QT AQ  =  [  0   R22  . . .  R2m ]
                    [  :    :    . .    :  ]
                    [  0    0   . . .  Rmm ]

  where R is real and upper block triangular, Q is orthogonal, and each Rii is either a 1 × 1 matrix or a 2 × 2 matrix having complex conjugate eigenvalues.
• A block triangular matrix whose diagonal blocks
are 1 × 1 or 2 × 2 is called upper quasi-
triangular.
• The eigenvalues of the Rii are the eigenvalues
of A.
• The Real Schur form exists for every square
matrix A.
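For a concrete picture of the target form, scipy's `schur` (assuming scipy is available) computes a Real Schur Form directly; a routine of this kind is what the QR Algorithm described here implements.

```python
# Real Schur Form of a matrix with a complex conjugate eigenvalue pair.
import numpy as np
from scipy.linalg import schur

A = np.array([[0.0, -1.0, 2.0],
              [1.0,  0.0, 3.0],
              [0.0,  0.0, 4.0]])   # eigenvalues: +i, -i, 4

T, Z = schur(A, output='real')
# A = Z T Z^T with Z orthogonal and T upper quasi-triangular: a 2 x 2
# diagonal block carries the pair +-i in real arithmetic, and a 1 x 1
# block carries the real eigenvalue 4.
```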
• The first step of the QR Algorithm consists
of finding an orthogonal similarity transform
that takes A to upper Hessenberg Form.
• A matrix A is upper Hessenberg if

      i > j + 1   =⇒   aij = 0.

  Thus it is upper triangular except that the entries immediately below the diagonal may also be non-zero. For example, a 6 × 6 upper Hessenberg matrix has the form
 
        [ x  x  x  x  x  x ]
        [ x  x  x  x  x  x ]
    H = [ 0  x  x  x  x  x ]
        [ 0  0  x  x  x  x ]
        [ 0  0  0  x  x  x ]
        [ 0  0  0  0  x  x ]
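The defining condition can be checked mechanically; the helper `is_upper_hessenberg` below is illustrative, not part of the notes.

```python
# Check the pattern i > j+1 => a_ij = 0 from the definition above.
import numpy as np

def is_upper_hessenberg(A, tol=0.0):
    """True iff every entry more than one place below the diagonal is (near) zero."""
    i, j = np.indices(A.shape)
    return bool(np.all(np.abs(A[i > j + 1]) <= tol))
```

For instance, `np.triu(M, -1)` keeps exactly the upper Hessenberg part of any square matrix M.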

• The initial reduction to upper Hessenberg form can be computed in a finite number of operations. The algorithm is based on using Householder-reflection-based orthogonal similarity transforms to go from column to column, similarly to the computation of the QR factorization.
• Exercise: work out the details of this initial stage.
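Rather than give away the exercise, here is a way to check a worked solution: scipy (assumed available) exposes the Householder-based reduction as `scipy.linalg.hessenberg`.

```python
# Hessenberg reduction A = Q H Q^T with Q orthogonal, H upper Hessenberg.
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 6))
H, Q = hessenberg(A, calc_q=True)

# The orthogonal similarity preserves eigenvalues, and H has the
# required zero pattern below the first subdiagonal.
```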
• An upper Hessenberg matrix is unreduced if
all subdiagonal entries are non-zero.
• The second stage of the QR Algorithm is an
iteration that consists of applying orthogonal
transformations, based on Givens Rotations,
that reduce the upper Hessenberg matrix to
Real Schur Form.
• The iterations in the second stage involve a
double shift of origin which accelerates the
reduction of the upper Hessenberg matrix to
quasi-triangular form.
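The basic tool of the second stage can be sketched in numpy: a 2 × 2 Givens rotation chosen to zero a selected entry (the helper name `givens` is ours, not from the notes).

```python
# Givens rotation: choose c, s so that [[c, s], [-s, c]] @ [a, b] = [r, 0].
import numpy as np

def givens(a, b):
    """Return (c, s) with [[c, s], [-s, c]] @ [a, b] = [r, 0], r = hypot(a, b)."""
    r = np.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

c, s = givens(3.0, 4.0)
G = np.array([[c, s], [-s, c]])   # orthogonal: G.T @ G = I
```

In the iteration such rotations are embedded in the identity matrix so that each acts on just two rows (or columns) of H, which is what makes it possible to preserve the upper Hessenberg structure.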



Ingredients and Principles

• Even though eigenvalues and eigenvectors may be complex, the arithmetic is real throughout.
• All similarity transforms are orthogonal, and
those in stage 2 preserve the upper Hessen-
berg structure.
• The initial stage requires O(n³) operations.
• Every step of the iteration in the second stage requires only O(n²) operations.
• Accomplishing each iteration in O(n²) operations is the most complicated part of the algorithm. The breakthrough that made this possible was a technique developed by John Francis and published in 1961. This is the Francis QR step, Algorithm 7.5.1 on page 390 in Golub/van Loan.
• The following outline is taken (and modified)
from Algorithm 7.5.2 of Golub/van Loan



The QR Algorithm
Let tol be a tolerance greater than the
roundoff unit.
Compute the Hessenberg Reduction

H = U0T AU0
where H is upper Hessenberg and U0 is
orthogonal.
Set q = 0
until q = n
Set to zero all subdiagonal entries of H
that satisfy

|hi,i−1 | ≤ tol(|hii | + |hi−1,i−1 |)

Find the largest non-negative q and the smallest non-negative p such that

                      p    n−p−q    q
                p  [ H11    H12    H13 ]
    H = n−p−q      [  0     H22    H23 ]
                q  [  0      0     H33 ]

where H33 is upper quasi-triangular and H22 is unreduced
If q < n perform a Francis QR step on H22
Upper triangularize all 2 × 2 blocks in H that
have real eigenvalues.
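The deflation test inside the loop above can be sketched as follows; the helper name `deflate` is illustrative.

```python
# Set to zero all subdiagonal entries of H that satisfy
# |h_{i,i-1}| <= tol * (|h_ii| + |h_{i-1,i-1}|).
import numpy as np

def deflate(H, tol):
    """Zero out negligible subdiagonal entries of an upper Hessenberg H."""
    H = H.copy()
    n = H.shape[0]
    for i in range(1, n):
        if abs(H[i, i - 1]) <= tol * (abs(H[i, i]) + abs(H[i - 1, i - 1])):
            H[i, i - 1] = 0.0
    return H

H = np.array([[2.0,   1.0, 0.0],
              [1e-14, 3.0, 1.0],
              [0.0,   0.5, 4.0]])
Hd = deflate(H, tol=1e-12)   # zeroes only the negligible 1e-14 entry
```

Each zeroed subdiagonal entry decouples the eigenvalue problem into smaller blocks, which is how q grows until the whole matrix is quasi-triangular.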



• According to Golub/van Loan, the algorithm requires 25n³ flops if the eigenvectors are computed and 10n³ flops if only the eigenvalues are computed. These counts are very approximate and based on the empirical observation that on average only two Francis iterations are required before the lower right 1 × 1 or 2 × 2 submatrix of H22 decouples.
• The QR Algorithm is extremely sophisticated,
but it grew out of simple ideas in natural
steps. Golub/van Loan and Watkins both ex-
plain this very well.



Summary

• These are some of the key ingredients of the design and analysis of the QR algorithm:
− Accomplish each task by multiplying with
an orthogonal matrix.
− Work on subproblems and embed the re-
quired matrices in the identity matrix.
− power iterations
− orthogonal iterations
− many nested orthogonal iterations
− shift of origin
− real Schur form
− Use of Householder reflections and Givens
rotations
− O(n²) effort per step
− real arithmetic
− start with upper Hessenberg
− keep it upper Hessenberg
− Implicit Q Theorem (underlying Francis step).



• The story has a happy ending. In 2009 the
following item was posted by Frank Uhlig on
the Numerical Analysis Bulletin board:

From: Frank Uhlig <[email protected]>


Date: Wed, 25 Mar 2009 08:48:14 -0500
Subject: John Francis of QR
John Francis and 50 years of QR
John Francis submitted his first QR paper al-
most 50 years ago in October 1959. By 1962 he
had left the NA field. When his algorithm was
judged one of the top ten algorithms of the 20th
century in 2000 by Jack Dongarra and Fran-
cis Sullivan, nobody alive in the mathematics
community had ever seen John Francis or knew
where or if he lived. Gene Golub and Frank
Uhlig independently tracked John Francis down,
joined forces, and visited and interviewed him
over the last couple of years.
When first contacted, John Francis had no idea
about QR’s impact. He is 74 years old now and
well. Re QR, he remembers his math and com-
putational work of 50 years ago clearly. John
Francis will be the lead-off speaker at a mini
symposium, held in his honor, at the 23rd Bi-
ennial Conference on Numerical Analysis, June
23rd - 26th 2009 in Glasgow to which everyone
is cordially invited.

