
Introduction to Nonlinear Least Squares

Rajat Talak

VNAV
Fall 2020
Recall … Motion estimation

In the previous lecture:

• The perception problem can be systematically formulated using estimation theory

• Estimation theory:
(1) Maximum likelihood (ML) estimate
(2) Maximum a posteriori (MAP) estimate

• Abstract model: z_i = h_i(x) + ε_i, where x is the state variable, z_i are the measurements, and ε_i is the noise

If ε_i ~ N(0, Σ_i) and the noise is independent across measurements, the ML estimate is

x̂ = argmin_x Σ_i ||z_i − h_i(x)||²_{Σ_i}

where ||r||²_Σ = rᵀ Σ⁻¹ r is called the Mahalanobis distance.

• Linear model: z_i = A_i x + ε_i; under the same noise assumptions, the ML estimate becomes

x̂ = argmin_x Σ_i ||z_i − A_i x||²_{Σ_i}
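The ML objective above is a sum of squared Mahalanobis distances. A minimal sketch of evaluating that cost, assuming the Gaussian model above (the helper names `mahalanobis_sq` and `ml_cost`, and the toy identity-measurement example, are illustrative, not from the lecture):

```python
import numpy as np

def mahalanobis_sq(r, Sigma):
    """Squared Mahalanobis distance ||r||^2_Sigma = r^T Sigma^{-1} r."""
    return float(r @ np.linalg.solve(Sigma, r))

def ml_cost(x, zs, hs, Sigmas):
    """Negative log-likelihood (up to constants) under independent Gaussian
    noise: sum_i ||z_i - h_i(x)||^2_{Sigma_i}."""
    return sum(mahalanobis_sq(z - h(x), S) for z, h, S in zip(zs, hs, Sigmas))

# Toy example: two noisy measurements of a 1-D state, h_i(x) = x.
zs = [np.array([1.1]), np.array([0.9])]
hs = [lambda x: x, lambda x: x]
Sigmas = [np.eye(1), np.eye(1)]
cost = ml_cost(np.array([1.0]), zs, hs, Sigmas)  # residuals 0.1 and -0.1
```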
Today
• Nonlinear least squares problem
• Gauss-Newton Method

A quick detour
• Nonlinear optimization
• Convexity
• Optimality conditions
• Gradient descent and Newton’s method
Nonlinear Least Squares Problem

• For our abstract model z_i = h_i(x) + ε_i, the ML estimate requires solving

min_x Σ_i ||z_i − h_i(x)||²_{Σ_i}

• For the linear model, h_i(x) = A_i x, this reduces to a linear least squares problem

• In general, the measurement functions h_i are nonlinear, so this is a nonlinear optimization problem
Nonlinear Optimization Problem

• Unconstrained nonlinear optimization problem: min_x f(x)

• Global minimum: a point x* with f(x*) ≤ f(x) for all x

• Local minimum: a point x* with f(x*) ≤ f(x) for all x in a neighborhood of x*

• Necessary conditions for a local minimum: ∇f(x*) = 0 and ∇²f(x*) positive semidefinite

• Sufficient conditions for a local minimum: ∇f(x*) = 0 and ∇²f(x*) positive definite

• Gradient descent converges to a local minimum

Finding the global minimum is hard!! … possible with the added structure of convexity
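The gradient descent iteration mentioned above can be sketched in a few lines (a minimal illustration with a fixed step size; the function name and the quadratic test problem are illustrative):

```python
import numpy as np

def gradient_descent(grad_f, x0, step=0.1, tol=1e-8, max_iters=10_000):
    """Plain gradient descent: x <- x - step * grad_f(x).
    Stops when the gradient norm falls below tol, i.e. at a stationary
    point, which for a general nonlinear f is only a local minimum."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# f(x) = (x0 - 3)^2 + (x1 + 1)^2, so grad f = 2 * (x - [3, -1])
xmin = gradient_descent(lambda x: 2.0 * (x - np.array([3.0, -1.0])),
                        [0.0, 0.0])
```

For this convex quadratic the iteration contracts toward the unique minimizer; for a nonconvex f the same loop only finds a stationary point near the starting guess.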


Convex Problems

• Convex optimization problem: min_x f(x), where f is a convex function

• f is convex if f(λx + (1−λ)y) ≤ λ f(x) + (1−λ) f(y) for all x, y and λ ∈ [0, 1]

• A local minimum is also a global minimum

• Necessary and sufficient condition for optimality: ∇f(x*) = 0

• Gradient descent converges to a global minimum

Back to Nonlinear Least Squares Problem


Linear Least Squares Problem

min_x ||A x − b||²

• The objective function is convex!

• The gradient descent algorithm converges to the global minimum

• But we can do much better (computationally) by exploiting the problem structure and using the optimality conditions

• Recall: for a convex problem, ∇f(x) = 0 is necessary and sufficient; here it yields the normal equations

Aᵀ A x = Aᵀ b

• It suffices to solve this linear system of equations. Do not invert!
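The "do not invert" advice can be illustrated directly (a minimal sketch; the random test problem is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))    # 20 measurements, 3 unknowns
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                      # noiseless, so the solution is exact

# Normal equations A^T A x = A^T b: solve the linear system directly ...
x_solve = np.linalg.solve(A.T @ A, A.T @ b)

# ... rather than forming the explicit inverse, which costs more and is
# less numerically stable:
# x_bad = np.linalg.inv(A.T @ A) @ (A.T @ b)   # avoid this
```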
Cholesky Solver

• Assuming AᵀA is positive definite

• Cholesky decomposition of AᵀA: AᵀA = L Lᵀ, with L lower triangular

• We now have to solve L Lᵀ x = Aᵀ b. We solve it in two steps.

• Forward substitution: solve L y = Aᵀ b and obtain y

• Backward substitution: solve Lᵀ x = y and obtain x
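The two substitution steps can be sketched explicitly (a minimal illustration; the helper names `forward_sub`, `backward_sub`, and `cholesky_lsq` are illustrative, and the substitutions are written out by hand to mirror the slide rather than calling a library triangular solver):

```python
import numpy as np

def forward_sub(L, b):
    """Solve L y = b for lower-triangular L, top row first."""
    y = np.zeros_like(b, dtype=float)
    for i in range(len(b)):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def backward_sub(U, b):
    """Solve U x = b for upper-triangular U, bottom row first."""
    x = np.zeros_like(b, dtype=float)
    for i in reversed(range(len(b))):
        x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

def cholesky_lsq(A, b):
    """Solve the normal equations A^T A x = A^T b via A^T A = L L^T."""
    L = np.linalg.cholesky(A.T @ A)   # lower triangular factor
    y = forward_sub(L, A.T @ b)       # forward substitution:  L y = A^T b
    return backward_sub(L.T, y)       # backward substitution: L^T x = y

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 3))      # full column rank w.h.p.
x_true = np.array([2.0, -1.0, 0.5])
x_hat = cholesky_lsq(A, A @ x_true)
```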
QR Solver

• QR decomposition of A: A = Q R, where Q has orthonormal columns (QᵀQ = I) and R is upper triangular

• Substituting into the normal equations: AᵀA x = Aᵀ b becomes Rᵀ QᵀQ R x = Rᵀ Qᵀ b

• For A with full column rank, Rᵀ is invertible, leaving R x = Qᵀ b

• R x = Qᵀ b can be solved by backward substitution
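The same steps in code (a minimal sketch; the function name `qr_lsq` and the random test problem are illustrative):

```python
import numpy as np

def qr_lsq(A, b):
    """Solve min ||Ax - b||^2 via A = Q R, which reduces the normal
    equations to the triangular system R x = Q^T b."""
    Q, R = np.linalg.qr(A)              # reduced QR: Q^T Q = I, R upper triangular
    # R is upper triangular, so R x = Q^T b is a backward-substitution
    # problem; np.linalg.solve is used here for brevity (a dedicated
    # triangular solver would exploit the structure).
    return np.linalg.solve(R, Q.T @ b)

rng = np.random.default_rng(2)
A = rng.standard_normal((10, 3))
x_true = np.array([0.5, 1.5, -2.0])
x_hat = qr_lsq(A, A @ x_true)
```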
Cholesky vs QR Solver

• QR is slower than Cholesky


• QR gives better numerical stability than Cholesky
Linear Least Squares Problem

• The objective function is convex!

• Recall: the normal equations Aᵀ A x = Aᵀ b can be solved efficiently with a Cholesky or QR solver

Done!!

Back to Nonlinear Least Squares Problem


Linear Approximations

• First-order Taylor expansion of the measurement function around a current estimate x̂:

h_i(x̂ + δx) ≈ h_i(x̂) + J_i δx, where J_i is the Jacobian of h_i evaluated at x̂
Nonlinear Least Squares Problem

min_x Σ_i ||z_i − h_i(x)||²_{Σ_i}

• Can we set the gradient to zero and solve in closed form, as in the linear case? Will it? Yes or No?

• No!! The condition ∇f(x) = 0 is now a nonlinear system of equations with no closed-form solution in general.

• Idea: linearize each h_i around the current estimate x̂, solve the resulting linear least squares problem for a correction δx, update x̂, and repeat.

This is called the Gauss-Newton Method

Gauss-Newton Method

1. Start with an initial estimate x̂

2. Linearize each residual around x̂: z_i − h_i(x̂ + δx) ≈ (z_i − h_i(x̂)) − J_i δx

3. Solve the linear least squares problem min_δx Σ_i ||(z_i − h_i(x̂)) − J_i δx||²_{Σ_i} for the correction δx (e.g., with a Cholesky or QR solver)

4. Update x̂ ← x̂ + δx and repeat until convergence
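The four steps above can be sketched as a short loop (a minimal illustration with unweighted residuals, i.e. Σ_i = I; the function name `gauss_newton` and the exponential-fit example are hypothetical, not from the lecture):

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20):
    """Gauss-Newton for min ||r(x)||^2: linearize r around the current
    estimate, solve the linearized least squares problem for the
    correction dx via the normal equations, update, repeat."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)  # linearized normal equations
        x = x + dx
        if np.linalg.norm(dx) < 1e-12:           # converged
            break
    return x

# Hypothetical example: fit y = a * exp(b * t), unknowns x = (a, b).
t = np.linspace(0.0, 1.0, 8)
a_true, b_true = 2.0, -1.5
y = a_true * np.exp(b_true * t)

def residual(x):
    a, b = x
    return a * np.exp(b * t) - y

def jacobian(x):
    a, b = x
    e = np.exp(b * t)
    return np.column_stack([e, a * t * e])       # d r / d(a, b)

x_hat = gauss_newton(residual, jacobian, x0=[1.0, 0.0])
```

On this clean, zero-residual problem the iteration converges quickly from a reasonable initial guess; the next lecture's topics (step control, Levenberg-Marquardt) address the cases where plain Gauss-Newton does not.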
Summary
• Nonlinear least squares problem
• Linear least squares problem
• Gradient descent
• Cholesky solver
• QR solver
• Gauss-Newton Method

A quick detour
• Nonlinear optimization
• Convexity
• Optimality conditions
• Gradient descent

Next
• Issues with Gauss-Newton Method
• Levenberg-Marquardt Method
• Nonlinear least squares on Riemannian Manifolds
