Algorithm of Convex Minimization

Newton's method can be used to minimize a convex function g(x) by iteratively solving the equation ∇g(x) = 0. Each iteration evaluates the gradient and Hessian at the current iterate x, solves the Newton system for the Newton step v, and updates x as x⁺ = x + tv, where t is chosen by a backtracking line search to ensure sufficient decrease in g. If the Hessian is not positive definite, the method falls back to the negative gradient as the descent direction.


Newton’s method for minimizing a convex function

if ∇²g(x) is positive definite everywhere, we can minimize g(x) by solving

∇g(x) = 0

using Newton’s method

given initial x, tolerance ε > 0


repeat
1. evaluate ∇g(x) and ∇²g(x).
2. if ‖∇g(x)‖ ≤ ε, return x.
3. solve ∇²g(x)v = −∇g(x).
4. x := x + v.
until maximum number of iterations is exceeded

• v = −∇²g(x)⁻¹∇g(x) is called the Newton step at x


• converges if started sufficiently close to the solution
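
The following Python sketch is not part of the original slides; it shows one way to implement the pure Newton iteration above. The callables grad and hess, and the example function, are illustrative assumptions.

# A minimal sketch of the pure Newton iteration, assuming g is smooth
# and its Hessian is positive definite at every iterate.
import numpy as np

def newton(x, grad, hess, eps=1e-8, max_iter=50):
    """Minimize g by solving ∇g(x) = 0 with full Newton steps."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:        # step 2: stopping criterion
            return x
        v = np.linalg.solve(hess(x), -g)    # step 3: Newton system
        x = x + v                           # step 4: full step (t = 1)
    return x                                # maximum iterations exceeded

# example (illustrative): g(x) = x1^4 + x2^2, minimized at the origin;
# the Hessian stays positive definite along these iterates
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.diag([12 * x[0]**2, 2.0])
print(newton(np.array([1.0, 1.0]), grad, hess))  # ≈ [0, 0]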



Newton’s method with backtracking line search

use update x⁺ = x + tv; choose t so that g(x⁺) < g(x)

given initial x, tolerance ε > 0, parameter α ∈ (0, 1/2).


repeat
1. evaluate ∇g(x) and ∇²g(x).
2. if ‖∇g(x)‖ ≤ ε, return x.
3. solve ∇²g(x)v = −∇g(x).
4. t := 1.
while g(x + tv) > g(x) + αt∇g(x)ᵀv, t := t/2.
5. x := x + tv.
until maximum number of iterations is exceeded

• typical values of α are small (e.g., α = 0.01)


• t is called the step size
• inner loop is called backtracking line search
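
A sketch, again not from the slides and using illustrative names, of the damped iteration above; the halving rule t := t/2 implements the backtracking loop in step 4.

# Newton's method with the halving backtracking line search above.
import numpy as np

def newton_backtracking(x, g, grad, hess, eps=1e-8, alpha=0.01,
                        max_iter=100):
    for _ in range(max_iter):
        gr = grad(x)
        if np.linalg.norm(gr) <= eps:             # step 2
            return x
        v = np.linalg.solve(hess(x), -gr)         # step 3: Newton step
        t = 1.0                                   # step 4: backtracking
        while g(x + t * v) > g(x) + alpha * t * (gr @ v):
            t /= 2
        x = x + t * v                             # step 5
    return x

# example (illustrative): g(x) = sqrt(1 + x^2); from x = 2 the full
# Newton step overshoots badly, but the damped iteration converges to 0
g = lambda x: float(np.sqrt(1 + x[0]**2))
grad = lambda x: np.array([x[0] / np.sqrt(1 + x[0]**2)])
hess = lambda x: np.array([[(1 + x[0]**2) ** -1.5]])
print(newton_backtracking(np.array([2.0]), g, grad, hess))  # ≈ [0]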



Newton’s method when ∇²g(x) is not positive definite

if ∇²g(x) is not positive definite, the Newton system may have no solution, or the Newton step may not be a descent direction

solution: always use a descent direction v, for example, v = −∇g(x)

given initial x, tolerance ε > 0, parameter α ∈ (0, 1/2).


repeat
1. evaluate ∇g(x) and ∇²g(x).
2. if ‖∇g(x)‖ ≤ ε, return x.
3. if ∇²g(x) is positive definite, solve ∇²g(x)v = −∇g(x) for v;
   else, v := −∇g(x).
4. t := 1.
while g(x + tv) > g(x) + αt∇g(x)ᵀv, t := t/2.
5. x := x + tv.
until maximum number of iterations is exceeded

practical methods use more sophisticated choices of v if ∇²g(x) is not positive definite
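
A sketch, under the same illustrative assumptions as above, of the safeguarded variant. Positive definiteness is tested here with a Cholesky factorization, one common choice; the slides do not prescribe a particular test.

# Safeguarded Newton: Newton step when the Hessian is positive
# definite, otherwise fall back to the negative gradient.
import numpy as np

def safeguarded_newton(x, g, grad, hess, eps=1e-8, alpha=0.01,
                       max_iter=200):
    for _ in range(max_iter):
        gr = grad(x)
        if np.linalg.norm(gr) <= eps:             # step 2
            return x
        H = hess(x)
        try:                                      # step 3: PD test
            np.linalg.cholesky(H)                 # raises if H is not PD
            v = np.linalg.solve(H, -gr)           # Newton step
        except np.linalg.LinAlgError:
            v = -gr                               # negative-gradient fallback
        t = 1.0                                   # step 4: backtracking
        while g(x + t * v) > g(x) + alpha * t * (gr @ v):
            t /= 2
        x = x + t * v                             # step 5
    return x

In both branches ∇g(x)ᵀv < 0, so v is a descent direction and the sufficient-decrease condition can always be met; the halving loop therefore terminates for differentiable g.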

