
Bonus Point Project on Non-Linear Optimization

Gradient Descent, Newton's Method, and Backtracking Line Search Algorithms

Given the Rosenbrock function

f(x) = 100(x_2 − x_1^2)^2 + (1 − x_1)^2.
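Both algorithms require the gradient of f, and Newton's method additionally requires the Hessian; differentiating f directly gives

\nabla f(x) = \begin{pmatrix} -400\,x_1(x_2 - x_1^2) - 2(1 - x_1) \\ 200\,(x_2 - x_1^2) \end{pmatrix},
\qquad
\nabla^2 f(x) = \begin{pmatrix} 1200\,x_1^2 - 400\,x_2 + 2 & -400\,x_1 \\ -400\,x_1 & 200 \end{pmatrix}.

The unique stationary point is x* = (1, 1)^T with f(x*) = 0, which is a useful check for both methods.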

1. Use gradient descent with backtracking line search to find the minimum value of f,
given:

• The initial step size for backtracking line search is t0 = 1.
• The starting points for gradient descent are x0 = (1.2, 1.2)^T and x0 = (−1.2, 1)^T.
• The algorithm terminates when the number of iterates exceeds 1000, or when
  ‖∇f(x_k)‖ < 10^{−4}.

2. Use Newton's method with backtracking line search to find the minimum value of f,
using the same settings as above.

3. Compare the convergence speeds of the two algorithms (an illustrative sketch that runs both methods follows this list).
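As one possible way to set this up, here is a minimal Python/NumPy sketch, not a reference solution: the Armijo parameters alpha = 0.3 and beta = 0.8 and all helper names are illustrative assumptions; only t0 = 1, the 1000-iteration cap, and the 10^{−4} gradient tolerance come from the statement above.

```python
# Sketch only: gradient descent and Newton's method with backtracking
# line search on the Rosenbrock function. alpha and beta are assumed
# Armijo parameters; the assignment fixes only t0, max_iter, and tol.
import numpy as np

def f(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def hess(x):
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def backtracking(x, d, t0=1.0, alpha=0.3, beta=0.8):
    """Shrink t until the Armijo sufficient-decrease condition holds
    (assumes d is a descent direction)."""
    t = t0
    g = grad(x)
    while f(x + t * d) > f(x) + alpha * t * (g @ d):
        t *= beta
    return t

def minimize(x0, direction, max_iter=1000, tol=1e-4):
    """Iterate from x0 using `direction(x)` until the gradient norm
    drops below tol or the iteration cap is reached."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        if np.linalg.norm(grad(x)) < tol:   # termination criterion
            return x, k
        d = direction(x)
        x = x + backtracking(x, d) * d
    return x, max_iter

gd_dir = lambda x: -grad(x)                            # steepest descent
nt_dir = lambda x: np.linalg.solve(hess(x), -grad(x))  # Newton direction

for x0 in [(1.2, 1.2), (-1.2, 1.0)]:
    x_gd, k_gd = minimize(x0, gd_dir)
    x_nt, k_nt = minimize(x0, nt_dir)
    print(f"x0={x0}: gradient descent {k_gd} iters -> {x_gd}, "
          f"Newton {k_nt} iters -> {x_nt}")
```

With these settings one should expect Newton's method to reach the 10^{−4} tolerance in far fewer iterations than gradient descent, since Newton's method converges quadratically near the solution while gradient descent converges only linearly and is notoriously slow in the Rosenbrock valley.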
