Lecture 6
Orthogonal matrices
• independent basis, orthogonal basis,
orthonormal vectors, normalization
• Put orthonormal vectors into a matrix
– Generally a rectangular matrix – a matrix with
orthonormal columns
– Square matrix with orthonormal columns –
orthogonal matrix Q
• Matrix A has non-orthogonal columns,
A → Q … Gram-Schmidt
Orthogonalization
• $Q^{-1} = Q^T$
• Having Q simplifies many formulas, e.g. the normal equations:
$$A^T A \hat{x} = A^T b \;\Rightarrow\; Q^T Q \hat{x} = Q^T b \;\Rightarrow\; I\hat{x} = Q^T b \;\Rightarrow\; \hat{x} = Q^T b$$
• QR decomposition: A = QR
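A minimal MATLAB sketch of both ideas (the matrix A, the vector b, and the loop are made-up illustrations, not the lecture's code): classical Gram-Schmidt turns the columns of A into orthonormal columns Q, and for a matrix with orthonormal columns the least-squares solution collapses to x̂ = Qᵀb.

% Classical Gram-Schmidt on a made-up 3x2 matrix
A = [1 1; 1 0; 0 1];                 % independent, non-orthogonal columns
Q = zeros(size(A));
for j = 1:size(A,2)
    v = A(:,j);
    for k = 1:j-1
        v = v - (Q(:,k)'*A(:,j))*Q(:,k);   % subtract projection onto q_k
    end
    Q(:,j) = v/norm(v);                    % normalize
end
Q'*Q                                 % ~ identity: columns are orthonormal
b = [1; 2; 3];
xhat = Q'*b                          % least squares for Q*x = b
norm(Q\b - xhat)                     % ~ 0: agrees with MATLAB's solver

(MATLAB's built-in [Q,R] = qr(A,0) computes an equivalent factorization more stably.)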
Determinants
$$\det\begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc$$
• Properties:
– Exchanging two rows/cols of a matrix reverses
the sign of the det
– Equals 0 if two rows/cols are the same
– Det of a triangular matrix is the product of its
diagonal terms
• determinant is a test for invertibility
– matrix is invertible if |A| ≠ 0
– matrix is singular if |A| = 0
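A quick MATLAB check of this test (made-up matrices):

A = [1 2; 3 4];                      % det = 1*4 - 2*3 = -2, invertible
det(A)
B = [1 2; 2 4];                      % second row = 2 * first row
det(B)                               % 0: B is singular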
Eigenvalues and eigenvectors
$$Ax = \lambda x$$
• The action of matrix A on an eigenvector x returns a
vector parallel to x (same direction when λ > 0).
• x … eigenvectors, λ … eigenvalues
• spectrum, trace (sum of λ), determinant
(product of λ)
• λ = 0 … singular matrix
• Find eigendecomposition by solving
characteristic equation (leading to
characteristic polynomial)
det(A - λI) = 0
• Find λ first, then find the eigenvectors by Gauss
elimination, as the solution of (A - λI)x = 0 (see the
sketch after this list).
• algebraic multiplicity of an eigenvalue
– multiplicity of the corresponding root
• geometric multiplicity of an eigenvalue
– number of linearly independent eigenvectors
with that eigenvalue
– degenerate, defective matrix
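A minimal MATLAB sketch of these facts (made-up symmetric matrix):

A = [2 1; 1 2];
[X, D] = eig(A)                      % eigenvectors in X, eigenvalues on diag(D)
trace(A)                             % 4 = sum of eigenvalues (1 + 3)
det(A)                               % 3 = product of eigenvalues (1 * 3)
null(A - 1*eye(2))                   % eigenvector for lambda = 1, from (A - lambda*I)x = 0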
Diagonalization
• Square matrix A … diagonalizable if there
exists an invertible matrix P such that $P^{-1}AP$
is a diagonal matrix.
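A short MATLAB sketch (same made-up matrix as above): the eigenvector matrix serves as P.

A = [2 1; 1 2];
[P, D] = eig(A);                     % for a diagonalizable A
P \ A * P                            % = D: P^{-1}*A*P is diagonal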
Singular value decomposition (SVD)
A: m × n
U: m × m
V: n × n
Σ: m × n
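The shapes can be checked directly in MATLAB (a random matrix as a stand-in):

A = randn(5, 3);                     % m = 5, n = 3
[U, S, V] = svd(A);
size(U)                              % 5 5
size(S)                              % 5 3
size(V)                              % 3 3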
• Consider the case of a square full-rank matrix
A of size m × m.
• Look again at Av1 = σ1u1
– σ1u1 is a linear combination of the columns of A.
Where does σ1u1 lie?
• In C(A)
– And where does v1 lie?
• In C(Aᵀ)
• In the SVD I’m looking for an orthogonal basis
in the row space of A that gets knocked over
into an orthogonal basis in the column space
of A.
• Can I construct an orthogonal basis of
C(Aᵀ)?
• Of course I can; Gram-Schmidt does this.
• However, when I take such a basis and
multiply it by A, there is no reason why the
result should also be orthogonal.
• I am looking for the special setup where A
turns the orthogonal basis vectors of the row
space into an orthogonal basis of the column
space (see the sketch below).
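A small MATLAB experiment illustrating this point (A is the example matrix worked out below): an arbitrary orthonormal basis of the row space is generally not mapped to orthogonal vectors, but the right singular vectors v1, v2 are.

A = [4 4; -3 3];
[Q, ~] = qr(randn(2));               % random orthonormal basis of the row space R^2
(A*Q(:,1))' * (A*Q(:,2))             % generally nonzero: images not orthogonal
[U, S, V] = svd(A);
(A*V(:,1))' * (A*V(:,2))             % ~ 0: A maps v1, v2 to orthogonal vectors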
$$A \underbrace{\begin{bmatrix} v_1 & v_2 & \cdots & v_m \end{bmatrix}}_{\text{basis vectors in row space of } A} = \underbrace{\begin{bmatrix} u_1 & u_2 & \cdots & u_m \end{bmatrix}}_{\text{basis vectors in column space of } A} \underbrace{\begin{bmatrix} \sigma_1 & & & \\ & \sigma_2 & & \\ & & \ddots & \\ & & & \sigma_m \end{bmatrix}}_{\text{multiplying factors}}$$
$$\underbrace{\begin{bmatrix} 4 & 4 \\ -3 & 3 \end{bmatrix}}_{A} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \sqrt{32} & 0 \\ 0 & \sqrt{18} \end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}$$

$$\underbrace{\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}}_{A} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \sqrt{2} & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}$$
Matlab code
c1 = [1 2 4 8]'
c2 = [3 6 9 12]'                     % column vector (transpose was missing)
c3 = [12 16 3 5]'
c4 = c1 - c2 + 0.0000001*rand(4,1)   % nearly dependent 4th column
A = [c1 c2 c3 c4]                    % 4x4, but numerically close to rank 3
format long
A                                    % 4th column ~ [-2 -4 -5 -4]'
[U,S,V] = svd(A)
U1 = U; S1 = S; V1 = V;
S1(4,4) = 0.0                        % zero out the tiny singular value
B = U1*S1*V1'                        % best rank-3 approximation of A
A - B
norm(A - B)                          % tiny: equals the discarded singular value
help norm
• m × n, m > n (overdetermined system)
– with all columns independent
A = [c1 c2 c3]                       % 4x3, all columns independent
[U,S,V] = svd(A)                     % U: 4x4, S: 4x3, V: 3x3
U(:,1:3)*S(1:3,1:3)*V'               % thin (economy-size) SVD reconstructs A
$$A \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} u_1 & u_2 & \cdots & u_n & u_{n+1} & \cdots & u_m \end{bmatrix} \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_n \\ & 0 & \end{bmatrix}$$
(m > n: complete u_{n+1}, …, u_m to an orthonormal basis for the whole
column space R^m)

$$A \begin{bmatrix} v_1 & v_2 & \cdots & v_m & v_{m+1} & \cdots & v_n \end{bmatrix} = \begin{bmatrix} u_1 & u_2 & \cdots & u_m \end{bmatrix} \begin{bmatrix} \sigma_1 & & & \\ & \ddots & & 0 \\ & & \sigma_m & \end{bmatrix}$$
(m < n: complete v_{m+1}, …, v_n to an orthonormal basis for the whole
row space R^n; these extra vectors come from the null space, which is
orthogonal to the row space, and Σ is completed by adding zero columns)

In both cases
$$A v_i = \sigma_i u_i, \quad i = 1, \ldots, r$$
• What is the rank of A?
– two: A is invertible, nonsingular
• I'm going to look for two vectors v1 and v2
in the row space, which of course is what?
– R2
• And I'm going to look for u1, u2 in the
column space, which is also R2, and I'm
going to look for numbers σ1, σ2 so that it
all comes out right.
SVD example
$$A = \begin{bmatrix} 4 & 4 \\ -3 & 3 \end{bmatrix}$$
• 1st step – form
$$A^T A = \begin{bmatrix} 25 & 7 \\ 7 & 25 \end{bmatrix}$$
• 2nd step – find its eigenvectors (they’ll be the v’s)
and eigenvalues (squares of the σ’s)
– the eigenvectors can be guessed just by looking
– what are the eigenvectors and their corresponding
eigenvalues?
• [1 1]ᵀ, λ1 = 32
• [-1 1]ᵀ, λ2 = 18
• Actually, I made one mistake, do you see which one?
• I didn’t normalize the eigenvectors; they should be
[1/sqrt(2), 1/sqrt(2)]ᵀ …
$$A = U \Sigma V^T: \quad \begin{bmatrix} 4 & 4 \\ -3 & 3 \end{bmatrix} = U \begin{bmatrix} \sqrt{32} & 0 \\ 0 & \sqrt{18} \end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}, \quad U = ?$$

$$A = U \Sigma V^T: \quad \begin{bmatrix} 4 & 4 \\ -3 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \sqrt{32} & 0 \\ 0 & \sqrt{18} \end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}$$
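A quick numerical check of this factorization in MATLAB (typing in the U, Σ, V found above):

U = [1 0; 0 1];
S = [sqrt(32) 0; 0 sqrt(18)];
Vt = [1 1; -1 1]/sqrt(2);            % V' with the normalized eigenvectors as rows
U*S*Vt                               % reproduces A = [4 4; -3 3]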
SVD expansion
• Do you remember the column-times-row picture
of matrix multiplication?
• If you look at the UΣVᵀ multiplication as a
column-times-row product, then you get the SVD
expansion:
$$A = \sum_i \sigma_i\, u_i v_i^T$$
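In MATLAB the expansion can be checked term by term (a made-up test matrix):

A = magic(4);                        % rank-3 test matrix
[U, S, V] = svd(A);
B = zeros(size(A));
for i = 1:rank(A)
    B = B + S(i,i) * U(:,i) * V(:,i)';   % add one rank-1 term sigma_i * u_i * v_i'
end
norm(A - B)                          % ~ 0: the expansion reproduces A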
Please note that if you put p = m (for m < n), then we need m² + m·n
numbers, which is much more than m·n. So there is apparently some border
at which storing the SVD matrices increases the storage needs!
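To make that border explicit (a short count, assuming each term of the expansion stores u_i ∈ R^m, v_i ∈ R^n, and one σ_i): keeping the first p terms takes p(m + n + 1) numbers, so the truncated SVD saves storage only while

$$p\,(m + n + 1) < m\,n \quad\Longleftrightarrow\quad p < \frac{m\,n}{m + n + 1}.$$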
Numerical Linear Algebra
Matrix Decompositions
A small change in the right-hand side b:
$$\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4.001 \\ 7.001 \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 1.999 \\ 1.001 \end{bmatrix} \quad \text{stable}$$

A small change in the coefficient matrix A:
$$\begin{bmatrix} 1 & 2 \\ 2 & 3.999 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4 \\ 7.999 \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$$

But now the same small change in b:
$$\begin{bmatrix} 1 & 2 \\ 2 & 3.999 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4.001 \\ 7.998 \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} -3.999 \\ 4.000 \end{bmatrix}$$
The perturbed matrix is nearly singular, so a tiny change in b produces a
huge change in the solution: the system is ill-conditioned.
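The condition number quantifies this sensitivity; a minimal MATLAB check of the systems above:

A1 = [1 2; 2 3];
A2 = [1 2; 2 3.999];                 % nearly singular (det = -0.001)
cond(A1)                             % modest: well-conditioned
cond(A2)                             % ~ 2.5e4: ill-conditioned
A2 \ [4; 7.999]                      % [2; 1]
A2 \ [4.001; 7.998]                  % [-3.999; 4.000]: huge change in x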