
Linear Algebra

- Least Squares Approximation

Least Squares Approximation:
Suppose that a linear system Ax = b of m equations in n unknowns is
inconsistent, and that we suspect the inconsistency is caused by
measurement errors in the entries of A.
Since no exact solution is possible, we look for a vector x that comes as
"close as possible" to being a solution, in the sense that it minimizes
||Ax - b|| with respect to the Euclidean inner product on R^m.
We call such an x a least squares solution of the system, Ax - b the
least squares error vector, and ||Ax - b|| the least squares error.
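Stated compactly (a restatement of the definition above in symbols, writing x̂ for a least squares solution):

\[
\hat{x} = \arg\min_{x \in \mathbb{R}^{n}} \| A x - b \|,
\qquad
\text{error vector} = A\hat{x} - b,
\qquad
\text{least squares error} = \| A\hat{x} - b \| .
\]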
We already know that the system Ax = b is consistent iff b is a vector in
the column space W of the coefficient matrix A.
Hence, to find an approximate solution of an inconsistent system, we have
to find the vector in the column space W that is nearest to b. That
nearest vector is proj_W b.
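One step spelled out: as x ranges over R^n, the vector Ax ranges over all of W = col(A), so minimizing over x is the same as minimizing the distance from b over W:

\[
\min_{x \in \mathbb{R}^{n}} \| A x - b \| = \min_{w \in W} \| w - b \|,
\]

and the minimum on the right is attained at the unique closest vector w = proj_W b.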

Theorem: For every linear system Ax = b, the associated normal system

\[ A^{T} A x = A^{T} b \]

is consistent, and all solutions of the normal system are least squares
solutions of Ax = b.
Moreover, if W is the column space of A and x is any least squares
solution of Ax = b, then the orthogonal projection of b on W is
proj_W b = Ax.
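As an illustration of the theorem's recipe, here is a minimal NumPy sketch (not part of the original slides); the function name and the random test system are ours:

```python
# Minimal sketch: obtain a least squares solution of Ax = b by solving the
# normal system A^T A x = A^T b (assumes A has linearly independent columns,
# so that A^T A is invertible and np.linalg.solve succeeds).
import numpy as np

def normal_equation_solution(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    return np.linalg.solve(A.T @ A, A.T @ b)

# Sanity check on a random, generically inconsistent system (hypothetical data):
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 3))          # 6 equations, 3 unknowns
b = rng.normal(size=6)
x = normal_equation_solution(A, b)
# Matches NumPy's built-in least squares solver.
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```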
Example:
a) Find all least squares solutions of the linear system
     x1 -  x2 = 4
    3x1 + 2x2 = 1
   -2x1 + 4x2 = 3
b) Find the least squares error vector and the least squares error.

Soln: The matrix form of the given system is

\[
\begin{bmatrix} 1 & -1 \\ 3 & 2 \\ -2 & 4 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}
=
\begin{bmatrix} 4 \\ 1 \\ 3 \end{bmatrix}
\]

Clearly, the given system is not consistent, so we can only find an
approximate (least squares) solution.

 1  1
T  1 3  2    14  3 
A A    3 2    
  1 2 4   2 4    3 21 
 
 4
T  1 3  2    1 
A b    1    
  1 2 4  3  10 
 
Hence the normal system of the given system is
 14  3  x1   1 
     
  3 21  x2  10 
14  3  x1   1 
     
 0 285  x2  143
143 17
x 
Hence the approximate solution is 2 285 .1 95, x 
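A short NumPy cross-check of this example (our addition, not from the slides), which also produces the error vector and error asked for in part (b):

```python
import numpy as np

# Data of the example: A from the matrix form above, b the right-hand side.
A = np.array([[ 1.0, -1.0],
              [ 3.0,  2.0],
              [-2.0,  4.0]])
b = np.array([4.0, 1.0, 3.0])

# Least squares solution via the normal system; ~ [0.1789, 0.5018] = (17/95, 143/285).
x = np.linalg.solve(A.T @ A, A.T @ b)

error_vector = A @ x - b              # least squares error vector Ax - b
error = np.linalg.norm(error_vector)  # least squares error ||Ax - b|| (~4.56)
print(x, error_vector, error)
```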

Example (Orthogonal Projection on a Subspace):
Find the orthogonal projection of the vector b = (3, 4, 5, 6) on the
subspace W of R^4 spanned by the vectors
u1 = (3, 1, 0, 1), u2 = (1, 2, 1, 1), u3 = (-1, 0, 2, -1).

Soln: The orthogonal projection of b = (3, 4, 5, 6) on W is Ax, where x is
a least squares solution of the system Ax = b whose coefficient matrix A
has u1, u2, u3 as its columns:

\[
\begin{bmatrix} 3 & 1 & -1 \\ 1 & 2 & 0 \\ 0 & 1 & 2 \\ 1 & 1 & -1 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
=
\begin{bmatrix} 3 \\ 4 \\ 5 \\ 6 \end{bmatrix}
\]

3 1  1
 3 1 0 1    11 6  4 
T
  1 2 0  
A A   1 2 1 1    6 7 0 
  1 0 2  1 0 1 2
   4 0 6 
  1 1  1

 3
 3 1 0 1    19 
T
  4   
A b   1 2 1 1     22 
  1 0 2  1 5   1 
  6   
 

Hence the normal system of the given system is

 11 6  4  x1   19 
    
 6 7 0  x2    22 
  4 0 6  x   1 
  3   

11 6  4  x1   19 
    
 0 41 11 24 11  x2   128 11
0 0 134 41 x   45 41 
  3   
x3  45 134 x2  196 67 x1  17 67

Hence the least square solution of the system is (45/134,


196/67, 17/67). Therefore the projection of the given vector is

3 1  1  493 134 
  45 134   
1 2 0    829 134 
0 196 67    
1 2  230 67
  17 67   
1 1  1  403 134 
  
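Again a NumPy cross-check (our addition): the projection of b onto span{u1, u2, u3} is A x̂ with x̂ the least squares solution, and the residual b - A x̂ must be orthogonal to every spanning vector:

```python
import numpy as np

# Columns of A are the spanning vectors u1, u2, u3; b is the vector to project.
A = np.array([[3.0, 1.0, -1.0],
              [1.0, 2.0,  0.0],
              [0.0, 1.0,  2.0],
              [1.0, 1.0, -1.0]])
b = np.array([3.0, 4.0, 5.0, 6.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # -> (17/67, 196/67, 45/134)
projection = A @ x_hat                      # -> (449/134, 409/67, 241/67, 381/134)

# Residual is orthogonal to the column space, confirming this is proj_W b.
assert np.allclose(A.T @ (b - projection), 0.0)
print(x_hat, projection)
```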
Theorem: If A is an m × n matrix, then the following are equivalent.
• A has linearly independent column vectors.
• A^T A is invertible.

Theorem: If A is an m × n matrix with linearly independent column
vectors, then for every m × 1 matrix b, the linear system Ax = b has a
unique least squares solution. This solution is given by

\[ x = (A^{T} A)^{-1} A^{T} b . \]

Moreover, if W is the column space of A, then the orthogonal projection
of b on W is

\[ \mathrm{proj}_{W}\, b = A x = A (A^{T} A)^{-1} A^{T} b . \]
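A brief NumPy illustration of these closed-form expressions (our addition), reusing the data of the previous example, whose coefficient matrix does have linearly independent columns:

```python
import numpy as np

A = np.array([[3.0, 1.0, -1.0],
              [1.0, 2.0,  0.0],
              [0.0, 1.0,  2.0],
              [1.0, 1.0, -1.0]])
b = np.array([3.0, 4.0, 5.0, 6.0])

AtA_inv = np.linalg.inv(A.T @ A)
x = AtA_inv @ A.T @ b           # x = (A^T A)^{-1} A^T b, the unique LS solution
proj_W_b = A @ x                # proj_W b = A (A^T A)^{-1} A^T b

assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
print(x, proj_W_b)
```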

Thank You
