0.1 Applications to Least Squares Problems

Consider the least squares problem $\min_x \|b - Ax\|_2$, where $A$ has full column rank. We now formulate this (primal) problem in terms of (2.7.1)-(2.7.6) as follows. It is an excellent exercise for students to verify the equivalence of these formulations; a numerical sketch follows the list.
1. (Euler-Lagrange) $(x^*, y^*)$ is the solution to the Euler-Lagrange equation:
$$\begin{bmatrix} I & A \\ A^T & 0 \end{bmatrix} \begin{bmatrix} y \\ x \end{bmatrix} = \begin{bmatrix} b \\ 0 \end{bmatrix}.$$
2. (Primal) Let $F(x) := \frac{1}{2}(b - Ax, b - Ax)$. Then $x^* = \arg\min_x F(x)$.
3. (Dual) Let $G(y) := \frac{1}{2}(b - y, b - y)$. Then $y^* = \arg\min_y G(y)$ subject to $A^T y = 0$.
4. (Lagrange Multiplier) $(x^*, y^*)$ is the critical point of the functional $L(x, y) := \frac{1}{2}(b - y, b - y) + (x, A^T y)$.
5. (Minimax) $(x^*, y^*)$ is the solution to the minimax problem $\max_x \min_y L(x, y)$, for $L(x, y)$ defined in (4).
6. (Saddle Point) $(x^*, y^*)$ is the saddle point of the quadratic form $L(x, y)$ defined in (4).
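As a concrete version of the verification exercise, here is a minimal numerical sketch. It assumes NumPy and uses a randomly generated full-column-rank $A$ and vector $b$ (both my own choices, not part of the notes): the block system in (1) is solved directly and compared against the primal minimizer from the normal equations, and the resulting $y^*$ is checked to be feasible for the dual (3).

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 3
A = rng.standard_normal((m, n))   # full column rank with probability 1
b = rng.standard_normal(m)

# Primal (2): x* minimizes ||b - Ax||_2, solved here via the normal equations.
x_primal = np.linalg.solve(A.T @ A, A.T @ b)

# Euler-Lagrange (1): solve the block system [[I, A], [A^T, 0]] [y; x] = [b; 0].
K = np.block([[np.eye(m), A], [A.T, np.zeros((n, n))]])
rhs = np.concatenate([b, np.zeros(n)])
sol = np.linalg.solve(K, rhs)
y_star, x_star = sol[:m], sol[m:]

assert np.allclose(x_star, x_primal)        # same minimizer as the primal
assert np.allclose(y_star, b - A @ x_star)  # y* is the residual b - Ax*
assert np.allclose(A.T @ y_star, 0)         # dual constraint A^T y = 0 holds
```

The checks also make the links explicit: $y^* = b - Ax^*$ is the residual, and the dual constraint $A^T y = 0$ is precisely the normal equation $A^T(b - Ax) = 0$.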
Definition 0.1.1 (Projection Matrix). $P = A(A^T A)^{-1} A^T$ is called the projection matrix, where $A$ has full column rank.
Proposition 0.1.2. Suppose $P$ is a projection matrix. Then
$$P^T = P, \qquad P^2 = P.$$
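Both identities follow from Definition 0.1.1 by a direct computation, using that $A^T A$ is symmetric (so its inverse is symmetric as well):
$$P^T = \left(A(A^T A)^{-1} A^T\right)^T = A\left((A^T A)^{-1}\right)^T A^T = A(A^T A)^{-1} A^T = P,$$
$$P^2 = A(A^T A)^{-1} (A^T A)(A^T A)^{-1} A^T = A(A^T A)^{-1} A^T = P.$$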

Theorem 0.1.3. $P = A(A^T A)^{-1} A^T$ projects any vector onto the column space of $A$. Moreover, $Pb = Ax^*$, where $x^* = \arg\min_x \|b - Ax\|_2$.
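Continuing the sketch in the same hypothetical setup (NumPy, random full-column-rank $A$), the following checks Proposition 0.1.2 and Theorem 0.1.3 numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 8, 3
A = rng.standard_normal((m, n))   # full column rank with probability 1
b = rng.standard_normal(m)

# Projection matrix from Definition 0.1.1.
P = A @ np.linalg.inv(A.T @ A) @ A.T

assert np.allclose(P.T, P)    # symmetric
assert np.allclose(P @ P, P)  # idempotent

# Theorem 0.1.3: P b = A x*, where x* is the least squares solution.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(P @ b, A @ x_star)
```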
Question: What is the trouble if the columns of A are linearly dependent? And why?
