Linear Least Squares Problem: Cannot Be Satisfied Exactly
\[
Y = x_1 + x_2 T
\]
where \(x_1\) is the original length, T is the applied force, and \(x_2\) is the inverse coefficient of stiffness.
Suppose that the following measurements were taken:
T    10       15       20
Y    11.60    11.85    12.25
Problem: find \(x_1\) and \(x_2\) from these (inconsistent) measurements.

Of the three solution methods that follow (the normal equations / pseudo-inverse, QR factorization, and SVD), the first is the fastest but the least accurate; SVD is the slowest and the most accurate.
\[
A x \approx b, \qquad
A = \begin{pmatrix} 1 & 10 \\ 1 & 15 \\ 1 & 20 \end{pmatrix}, \qquad
b = \begin{pmatrix} 11.60 \\ 11.85 \\ 12.25 \end{pmatrix}
\]
\[
A^T A = \begin{pmatrix} 1 & 1 & 1 \\ 10 & 15 & 20 \end{pmatrix}
\begin{pmatrix} 1 & 10 \\ 1 & 15 \\ 1 & 20 \end{pmatrix}
= \begin{pmatrix} 3 & 45 \\ 45 & 725 \end{pmatrix}
\]
\[
x = (A^T A)^{-1} A^T b = \begin{pmatrix} 10.925 \\ 0.065 \end{pmatrix}
\]
\((A^T A)^{-1} A^T\) is called a pseudo-inverse.
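As a check on this worked example, here is a minimal numpy sketch (the array names T, Y, A are our own) that forms the normal equations and reproduces the fit above:

```python
import numpy as np

# Measurements from the spring example above.
T = np.array([10.0, 15.0, 20.0])
Y = np.array([11.60, 11.85, 12.25])

# Design matrix: row i is [1, T_i], so that Y ≈ x1 + x2*T.
A = np.column_stack([np.ones_like(T), T])

# Normal-equations / pseudo-inverse solution x = (A^T A)^{-1} A^T b.
x = np.linalg.solve(A.T @ A, A.T @ Y)
print(x)  # ~ [10.925  0.065]
```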
QR factorization 1
• A matrix Q is said to be orthogonal if its columns are orthonormal, i.e. \(Q^T Q = I\).

\[
A = QR \;\Rightarrow\; QRx = b \;\Rightarrow\; Rx = Q^T b \;\Rightarrow\; x = R^{-1} Q^T b
\]
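The same spring example can be solved this way; a small sketch using numpy's reduced QR factorization (again with our own variable names):

```python
import numpy as np

T = np.array([10.0, 15.0, 20.0])
Y = np.array([11.60, 11.85, 12.25])
A = np.column_stack([np.ones_like(T), T])

# Reduced QR: Q has orthonormal columns (Q^T Q = I), R is upper triangular.
Q, R = np.linalg.qr(A)

# x = R^{-1} Q^T b, computed by solving R x = Q^T b instead of forming R^{-1}.
x = np.linalg.solve(R, Q.T @ Y)
print(x)  # same fit as the pseudo-inverse: ~ [10.925  0.065]
```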
The singular value decomposition (SVD) of A is
\[
A = U \Sigma V^T = \sum_{i=1}^{p} \sigma_i u_i v_i^T ,
\]
from which
\[
A^T A = V \Sigma^2 V^T, \qquad A A^T = U \Sigma^2 U^T .
\]
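A small numpy check of these identities (a sketch with a random test matrix of our own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Reduced SVD: A = U @ diag(s) @ Vt, with s sorted in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A as a sum of rank-one terms sigma_i * u_i * v_i^T.
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(A, A_sum))                              # True

# A^T A = V Sigma^2 V^T and A A^T = U Sigma^2 U^T (reduced form).
print(np.allclose(A.T @ A, Vt.T @ np.diag(s**2) @ Vt))    # True
print(np.allclose(A @ A.T, U @ np.diag(s**2) @ U.T))      # True
```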
Approximation by a low-rank matrix
Fact II: let \(A \in \mathbb{R}^{m \times n}\) have rank p, and let
\[
A = U \Sigma V^T = \sum_{i=1}^{p} \sigma_i u_i v_i^T, \qquad \sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_p
\]
be the SVD of A. Then
\[
\tilde A = \sum_{i=1}^{r} \sigma_i u_i v_i^T = U \tilde\Sigma V^T, \qquad \tilde\Sigma = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_r), \quad r \le p
\]
is a rank-r approximation of A.
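A short numpy sketch of this truncation (the test matrix and names are ours):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 2  # keep only the r largest singular values
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print(np.linalg.matrix_rank(A_r))   # 2: the truncated sum has rank r
```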
[Figure: the unit sphere S, with orthonormal vectors v1 and v2, is mapped by \(A_{(m,n)}\) to the hyperellipse AS, whose principal semiaxes are σ1·u1 and σ2·u2.]

The n left singular vectors of A are the unit vectors {u1, …, un}, oriented along the principal semiaxes of AS; the n right singular vectors of A are the unit vectors {v1, …, vn} of S, which are the preimages of the principal semiaxes of AS: \(A v_i = \sigma_i u_i\).
Singular Value Decomposition
\[
A v_i = \sigma_i u_i, \qquad 1 \le i \le n
\]
Collecting these relations column by column,
\[
\underbrace{A}_{(m,n)}
\underbrace{\begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}}_{(n,n)}
=
\underbrace{\begin{pmatrix} u_1 & u_2 & \cdots & u_n \end{pmatrix}}_{(m,n)}
\underbrace{\begin{pmatrix} \sigma_1 & & & \\ & \sigma_2 & & \\ & & \ddots & \\ & & & \sigma_n \end{pmatrix}}_{(n,n)}
\]
i.e.
\[
A V = U \Sigma \quad\Longrightarrow\quad A = U \Sigma V^* ,
\]
the singular value decomposition.
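A numerical check of \(A v_i = \sigma_i u_i\) with numpy (random test matrix of our own choosing):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

# Column by column: A v_i = sigma_i u_i, i.e. A V = U Sigma.
for i in range(len(s)):
    print(np.allclose(A @ V[:, i], s[i] * U[:, i]))   # True for each i
```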
Consider \(b = A_{(m,n)} x\).
Any vector b (m×1) can be expanded in the basis of left singular vectors {u_i} of A; any vector x (n×1) can be expanded in the basis of right singular vectors {v_i} of A:
\[
b' = U^* b, \qquad x' = V^* x .
\]
Then
\[
b = A x \;\Longleftrightarrow\; U^* b = U^* A x = U^* U \Sigma V^* x \;\Longleftrightarrow\; b' = \Sigma x' .
\]
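A small sketch verifying that this change of basis diagonalizes the system (it assumes a square, real A for simplicity):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
b = A @ x

U, s, Vt = np.linalg.svd(A)

# Change of basis: b' = U* b and x' = V* x turn A x = b into Sigma x' = b'.
b_prime = U.T @ b
x_prime = Vt @ x
print(np.allclose(np.diag(s) @ x_prime, b_prime))   # True
```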
Rank of A
Let p = min{m, n}, and let r ≤ p denote the number of nonzero singular values of A.
Then: the rank of A equals r, the number of nonzero singular values.
Proof: the rank of a diagonal matrix equals the number of its nonzero entries, and in the decomposition A = UΣV*, U and V are of full rank.
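A quick illustration in numpy (the example matrix is ours): counting singular values above a small tolerance recovers the rank.

```python
import numpy as np

# A 3x3 matrix built from two independent rank-one terms, so its rank is 2.
a = np.array([1.0, 2.0, 3.0])
b = np.array([0.0, 1.0, -1.0])
A = np.outer(a, a) + np.outer(b, b)

s = np.linalg.svd(A, compute_uv=False)
print(s)                    # two nonzero singular values, one numerically zero
print(np.sum(s > 1e-10))    # 2, matching np.linalg.matrix_rank(A)
```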
Determinant of A
For A (m×m),
\[
|\det(A)| = \prod_{i=1}^{m} \sigma_i .
\]
Proof: the determinant of a product of square matrices is the product of their determinants. The determinant of a unitary matrix is 1 in absolute value, since U*U = I. Therefore,
\[
|\det(A)| = |\det(U \Sigma V^*)| = |\det(U)|\,|\det(\Sigma)|\,|\det(V^*)| = |\det(\Sigma)| = \prod_{i=1}^{m} \sigma_i .
\]
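A numerical check (random test matrix of our own choosing):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))

s = np.linalg.svd(A, compute_uv=False)
print(abs(np.linalg.det(A)))   # |det(A)|
print(np.prod(s))              # product of singular values: the same number
```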
2-Norm of A
The induced 2-norm of A is
\[
\|A\|_2 = \max_{x \ne 0} \frac{\|Ax\|_2}{\|x\|_2}, \qquad \text{where } \|x\|_2 = \Big(\sum_i x_i^2\Big)^{1/2} . \qquad (1)
\]
Therefore
\[
\|A\|_2 = \max_i (\sigma_i) ,
\]
since \(A^T A\, x_i = \lambda_i x_i\), where the eigenvalues \(\lambda_i\) of \(A^T A\) are the squared singular values \(\sigma_i^2\).
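A quick numpy check of this identity (sketch, random test matrix of our own choosing):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 3))

s = np.linalg.svd(A, compute_uv=False)
print(np.linalg.norm(A, 2))   # the induced 2-norm of A
print(s.max())                # equals the largest singular value
```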
Matrix Approximation in SVD basis
\[
\| A - \tilde A \|_2 \;=\; \inf_{\substack{B \in \mathbb{C}^{m \times n} \\ \mathrm{rank}(B) \le r}} \| A - B \|_2 \;=\; \sigma_{r+1} ,
\]
where \(\tilde A\) is the rank-r truncation of Fact II.
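A small numpy sketch (our own random test matrices) illustrating both the size of the truncation error and the optimality claim:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((6, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 2
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # rank-r truncation (Fact II)

# The 2-norm error of the truncation is sigma_{r+1} (s[r] in 0-based indexing) ...
print(np.linalg.norm(A - A_r, 2), s[r])       # the two numbers agree

# ... and no other rank-r matrix does better.
B = rng.standard_normal((6, r)) @ rng.standard_normal((r, 5))
print(np.linalg.norm(A - B, 2) >= s[r])       # True
```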