OPT++: An Object-Oriented Nonlinear Optimization Library


Unconstrained minimization
In this section, we present the constructors for an objective function, supply a prototype function evaluator, and give examples of solving the unconstrained minimization problem.

Contents:
  Declaring objective functions
  Defining an unconstrained problem
  Specifying the optimization method

Defining an unconstrained problem


Let's consider the two-dimensional Rosenbrock problem with analytic derivatives.

minimize f(x1, x2) = 100 (x2 - x1^2)^2 + (1 - x1)^2

Representing the Rosenbrock problem as an NLF1 requires a user-supplied function that evaluates the objective and its gradient, followed by construction of an NLF1 object.

Step 1: Write a function that evaluates the Rosenbrock problem and gradient.

void rosen(int mode, int n, const ColumnVector& x, double& fx,
           ColumnVector& g, int& result)
{ // Rosenbrock's function
  double f1, f2, x1, x2;

  if (n != 2) return;

  x1 = x(1);
  x2 = x(2);
  f1 = (x2 - x1 * x1);
  f2 = 1. - x1;

  result = 0;
  if (mode & NLPFunction) {
    fx = 100.*f1*f1 + f2*f2;
    result |= NLPFunction;      // function value was evaluated
  }
  if (mode & NLPGradient) {
    g(1) = -400.*f1*x1 - 2.*f2;
    g(2) = 200.*f1;
    result |= NLPGradient;      // gradient was evaluated
  }
}

Step 2: Create an NLF1 object.

NLF1 rosen_problem(n,rosen,init_rosen);
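
The third constructor argument, init_rosen, is the routine that supplies the starting point; it is not shown on this page. A minimal sketch, assuming the usual OPT++ initialization signature and the conventional Rosenbrock starting point (-1.2, 1.0):

void init_rosen(int ndim, ColumnVector& x)
{
  if (ndim != 2) return;

  // Conventional starting point for the two-dimensional Rosenbrock problem
  x(1) = -1.2;
  x(2) =  1.0;
}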

Specifying the optimization method


There are several algorithms in OPT++ for solving unconstrained problems. We provide examples of solving the Rosenbrock problem with a conjugate gradient method and with a quasi-Newton method that uses a trust region; an illustrative driver sketch follows the list below.

1. Conjugate Gradient Method


2. Quasi-Newton Method with trust-region
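
The linked pages give complete programs. As a rough illustration of how such a driver is put together, the following sketch assumes the OptCG and OptQNewton solver classes and reuses the rosen and init_rosen routines shown above; the tolerance values and status messages are placeholders, not settings prescribed by this page.

#include "NLF.h"
#include "OptCG.h"
#include "OptQNewton.h"

using NEWMAT::ColumnVector;
using namespace OPTPP;

// rosen and init_rosen are assumed to be defined as shown earlier on this page.

int main()
{
  int n = 2;

  // 1. Conjugate gradient method
  NLF1 cg_problem(n, rosen, init_rosen);
  OptCG cg(&cg_problem);
  cg.setGradTol(1.e-6);               // gradient-norm stopping tolerance (placeholder value)
  cg.optimize();
  cg.printStatus("Solution from conjugate gradient");
  cg.cleanup();

  // 2. Quasi-Newton method with trust region
  NLF1 qn_problem(n, rosen, init_rosen);
  OptQNewton qn(&qn_problem);
  qn.setSearchStrategy(TrustRegion);  // globalize the quasi-Newton step with a trust region
  qn.setFcnTol(1.e-4);                // function-value stopping tolerance (placeholder value)
  qn.optimize();
  qn.printStatus("Solution from quasi-Newton");
  qn.cleanup();

  return 0;
}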

Next Section: Bound-constrained Minimization | Back to Solvers Page

Last revised July 13, 2006


