Optimization Toolbox™
User's Guide
R2020a
How to Contact MathWorks
Phone: 508-647-7000
Acknowledgments . . . xx

Getting Started
1
    Optimization Toolbox Product Description . . . 1-2
    Key Features . . . 1-2

Setting Up an Optimization
2
    Optimization Theory Overview . . . 2-2
    Linear Equality Constraints . . . 2-36
    Bibliography . . . 2-80

Examining Results
3
    Current Point and Function Value . . . 3-2
    Exit Message Options . . . 3-7

    Use a Sparse Solver or a Multiply Function . . . 4-10
    Use Parallel Computing . . . 4-11

Optimization App
5
    Optimization App . . . 5-2
        Optimization App Basics . . . 5-2
        Specifying Certain Options . . . 5-6
        Importing and Exporting Your Work . . . 5-8

    Minimization with Gradient and Hessian . . . 6-13
    Code Generation in fmincon . . . 6-114
        What Is Code Generation? . . . 6-114
        Code Generation Requirements . . . 6-114
        Generated Code Not Multithreaded . . . 6-115

Nonlinear Problem-Based
7
    Rational Objective Function, Problem-Based . . . 7-2

Multiobjective Algorithms and Examples
8
    Multiobjective Optimization Algorithms . . . 8-2
        Multiobjective Optimization Definition . . . 8-2
        Algorithms . . . 8-3

    Optimal Dispatch of Power Generators: Solver-Based . . . 9-55

Problem-Based Optimization
10
    Problem-Based Optimization Workflow . . . 10-2
    Named Index for Optimization Variables . . . 10-20
        Create Named Indices . . . 10-20
        Use Named Indices . . . 10-21
        View Solution with Index Variables . . . 10-22
    Create Initial Point for Optimization with Named Index Variables . . . 10-43

Quadratic Programming
11
    Quadratic Programming Algorithms . . . 11-2
        Quadratic Programming Definition . . . 11-2
        interior-point-convex quadprog Algorithm . . . 11-2
        trust-region-reflective quadprog Algorithm . . . 11-7
        active-set quadprog Algorithm . . . 11-11
    Step 1: Decide what part of H to pass to quadprog as the first argument . . . 11-17
    Step 2: Write a function to compute Hessian-matrix products for H . . . 11-17
    Step 3: Call a quadratic minimization routine with a starting point . . . 11-18
    Preconditioning . . . 11-19

Least Squares
12
    Least-Squares (Model Fitting) Algorithms . . . 12-2
        Least Squares Definition . . . 12-2
        Linear Least Squares: Interior-Point or Active-Set . . . 12-2
        Trust-Region-Reflective Least Squares . . . 12-3
        Levenberg-Marquardt Method . . . 12-6
    Setting Up the Problem . . . 12-27

Systems of Equations
13
    Equation Solving Algorithms . . . 13-2
        Equation Solving Definition . . . 13-2
        Trust-Region Algorithm . . . 13-2
        Trust-Region-Dogleg Algorithm . . . 13-4
        Levenberg-Marquardt Method . . . 13-5
        fzero Algorithm . . . 13-6
        \ Algorithm . . . 13-6
    Nonlinear Equations with Jacobian Sparsity Pattern . . . 13-13
    Step 1: Write a file nlsf1a.m that computes the objective function values . . . 13-13
    Step 2: Call the system of equations solve routine . . . 13-13

    Optimization Options Reference . . . 15-6
        Optimization Options . . . 15-6
        Hidden Options . . . 15-16

Functions
16
Acknowledgments
MathWorks® would like to acknowledge the following contributors to Optimization Toolbox
algorithms.
Thomas F. Coleman researched and contributed algorithms for constrained and unconstrained
minimization, nonlinear least squares and curve fitting, constrained linear least squares, quadratic
programming, and nonlinear equations.
Yin Zhang researched and contributed the large-scale linear programming algorithm.
1
Getting Started
Optimization Toolbox provides functions for finding parameters that minimize or maximize objectives
while satisfying constraints. The toolbox includes solvers for linear programming (LP), mixed-integer
linear programming (MILP), quadratic programming (QP), nonlinear programming (NLP),
constrained linear least squares, nonlinear least squares, and nonlinear equations. You can define
your optimization problem with functions and matrices or by specifying variable expressions that
reflect the underlying mathematics.
You can use the toolbox solvers to find optimal solutions to continuous and discrete problems,
perform tradeoff analyses, and incorporate optimization methods into algorithms and applications.
The toolbox lets you perform design optimization tasks, including parameter estimation, component
selection, and parameter tuning. It can be used to find optimal solutions in applications such as
portfolio optimization, resource allocation, and production planning and scheduling.
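As a minimal illustration of how these solvers are called (a sketch with made-up numbers, not an example taken from this guide), a small linear program can be solved with linprog:

```matlab
% Hypothetical LP: maximize 5*x1 + 4*x2 subject to
% 6*x1 + 4*x2 <= 24, x1 + 2*x2 <= 6, x >= 0.
% linprog minimizes, so negate the objective coefficients.
f = [-5; -4];        % objective coefficients
A = [6 4; 1 2];      % inequality constraint matrix
b = [24; 6];         % inequality right-hand sides
lb = [0; 0];         % nonnegativity bounds
[x, fval] = linprog(f, A, b, [], [], lb);
% expected: x = [3; 1.5], fval = -21
```

The same pattern, with different data structures, applies to the quadratic, least-squares, and nonlinear solvers.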
Key Features
• Nonlinear and multiobjective optimization of smooth constrained and unconstrained problems
• Solvers for nonlinear least squares, constrained linear least squares, data fitting, and nonlinear
equations
• Quadratic programming (QP) and linear programming (LP)
• Mixed-integer linear programming (MILP)
• Optimization modeling tools
• Graphical monitoring of optimization progress
• Gradient estimation acceleration (with Parallel Computing Toolbox™)
First Choose Problem-Based or Solver-Based Approach

There are two approaches to setting up an optimization: problem-based and solver-based. The main characteristics of the "Problem-Based Optimization Setup" approach are:

• Easier to create and debug
• Represents the objective and constraints symbolically
• Requires translation from problem form to matrix form, resulting in a longer solution time
• Does not allow direct inclusion of gradient or Hessian; see "Include Derivatives in Problem-Based Workflow" on page 7-20

See the steps in "Problem-Based Optimization Workflow" on page 10-2 or "Problem-Based Workflow for Solving Equations" on page 10-4. For a basic linear example, see "Mixed-Integer Linear Programming Basics: Problem-Based" on page 10-40 or the video Solve a Mixed-Integer Linear Programming Problem Using Optimization Modeling.
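As an illustrative sketch of the difference between the two setup approaches (the problem data below are invented for demonstration and do not come from this guide), here is the same small quadratic program posed both ways:

```matlab
% Hypothetical problem: minimize x1^2 + 2*x2^2 - 2*x2
% subject to x1 + x2 <= 1.

% Solver-based: supply matrices directly to quadprog.
H = [2 0; 0 4]; f = [0; -2];   % objective is 0.5*x'*H*x + f'*x
A = [1 1]; b = 1;              % linear inequality constraint
xs = quadprog(H, f, A, b);

% Problem-based: state the same objective and constraint symbolically.
x = optimvar('x', 2);
prob = optimproblem('Objective', x(1)^2 + 2*x(2)^2 - 2*x(2));
prob.Constraints.lincon = x(1) + x(2) <= 1;
sol = solve(prob);             % solve chooses an appropriate solver
```

The solver-based form requires you to derive H, f, A, and b by hand; the problem-based form expresses the mathematics directly and performs that translation internally.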
See Also
More About
• “Problem-Based Optimization Setup”
• “Solver-Based Optimization Problem Setup”
Solve a Constrained Nonlinear Problem, Problem-Based
This example shows how to solve a constrained nonlinear optimization problem using the problem-based approach. The example demonstrates the typical workflow: create an objective function, create constraints, solve the problem, and examine the results.

Note:

If your objective function or nonlinear constraints are not composed of elementary functions, you must convert the nonlinear functions to optimization expressions using fcn2optimexpr. See the last part of this example, Alternative Formulation Using fcn2optimexpr, or "Convert
Nonlinear Function to Optimization Expression" on page 7-8.
For the solver-based approach to this problem, see “Solve a Constrained Nonlinear Problem, Solver-
Based” on page 1-11.
Minimize Rosenbrock's function,

    f(x) = 100*(x2 − x1^2)^2 + (1 − x1)^2,

over the unit disk, meaning the disk of radius 1 centered at the origin. In other words, find x that
minimizes the function f(x) over the set x1^2 + x2^2 ≤ 1. This problem is a minimization of a nonlinear
function subject to a nonlinear constraint.
Rosenbrock's function is a standard test function in optimization. It has a unique minimum value of 0
attained at the point [1,1]. Finding the minimum is a challenge for some algorithms because the
function has a shallow minimum inside a deeply curved valley. The solution for this problem is not at
the point [1,1] because that point does not satisfy the constraint.
This figure shows two views of Rosenbrock's function in the unit disk. The vertical axis is log-scaled;
in other words, the plot shows log(1 + f (x)). Contour lines lie beneath the surface plot.
rosenbrock = @(x)100*(x(:,2) - x(:,1).^2).^2 + (1 - x(:,1)).^2; % Vectorized function

% Plotting grid and figure (reconstructed setup; these lines are not in the excerpt)
[X,Y] = meshgrid(linspace(-1,1,101));
Z = reshape(log(1 + rosenbrock([X(:),Y(:)])),size(X)); % Log-scaled values
figure1 = figure;

% Create subplot
subplot1 = subplot(1,2,1,'Parent',figure1);
view([124 34]);
grid('on');
hold on;

% Create surface
surf(X,Y,Z,'Parent',subplot1,'LineStyle','none');

% Create contour
contour(X,Y,Z,'Parent',subplot1);

% Create subplot
subplot2 = subplot(1,2,2,'Parent',figure1);
view([234 34]);
grid('on');
hold on

% Create surface
surf(X,Y,Z,'Parent',subplot2,'LineStyle','none');

% Create contour
contour(X,Y,Z,'Parent',subplot2);

% Create textarrow
annotation(figure1,'textarrow',[0.4 0.31],[0.055 0.16],...
    'String',{'Minimum at (0.7864,0.6177)'});

% Create arrow
annotation(figure1,'arrow',[0.59 0.62],[0.065 0.34]);

hold off
The rosenbrock function handle calculates Rosenbrock's function at any number of 2-D points at once. This vectorization (see "Vectorization" in the MATLAB documentation) speeds the plotting of the function, and can be useful in other contexts for speeding evaluation of a function at multiple points.
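For example, the vectorized handle can evaluate several points in one call (the points below are arbitrary):

```matlab
rosenbrock = @(x)100*(x(:,2) - x(:,1).^2).^2 + (1 - x(:,1)).^2;
pts = [0 0; 1 1; 0.5 0.25];   % three 2-D points, one per row (arbitrary values)
vals = rosenbrock(pts)        % evaluates all three points in one call
% vals = [1; 0; 0.25] -- f(1,1) = 0 is the unconstrained minimum
```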
The function f(x) is called the objective function. The objective function is the function you want to
minimize. The inequality x1^2 + x2^2 ≤ 1 is called a constraint. Constraints limit the set of x over which a
solver searches for a minimum. You can have any number of constraints, which are inequalities or
equations.
The problem-based approach to optimization uses optimization variables to define objective and
constraints. There are two approaches for creating expressions using these variables:

• For polynomial or rational functions, write the expressions directly in terms of the variables.
• For other functions, convert the functions to optimization expressions using fcn2optimexpr.

For this problem, both the objective function and the nonlinear constraint are polynomials, so you can
write the expressions directly in terms of optimization variables. Create a 2-D optimization variable
named 'x'.

x = optimvar('x',1,2);

Create the objective function as an expression in the optimization variable.

obj = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;

Create an optimization problem named prob having obj as the objective function.

prob = optimproblem('Objective',obj);

Create the nonlinear constraint as a polynomial in the optimization variable, and include it in the
problem under the name circlecons.

nlcons = x(1)^2 + x(2)^2 <= 1;
prob.Constraints.circlecons = nlcons;

Review the problem by calling show.

show(prob)

OptimizationProblem :
Solve for:
    x

minimize :
    ((100 .* (x(2) - x(1).^2).^2) + (1 - x(1)).^2)

subject to circlecons:
    (x(1).^2 + x(2).^2) <= 1
Solve Problem
To solve the optimization problem, call solve. The problem needs an initial point, which is a
structure giving the initial value of the optimization variable. Create the initial point structure x0
having an x-value of [0 0].
x0.x = [0 0];
[sol,fval,exitflag,output] = solve(prob,x0)
fval = 0.0457
exitflag =
OptimalSolution
Examine Solution
The solution shows exitflag = OptimalSolution. This exit flag indicates that the solution is a
local optimum. For information on trying to find a better solution, see “When the Solver Succeeds” on
page 4-18.
The exit message indicates that the solution satisfies the constraints. You can check that the solution
is indeed feasible in several ways.
• Check the reported infeasibility in the constrviolation field of the output structure.

infeas = output.constrviolation

infeas = 0

• Check the infeasibility of the nonlinear constraint at the solution by calling infeasibility.

infeas = infeasibility(nlcons,sol)

infeas = 0
• Check that the solution lies in the unit disk by computing the norm of sol.x.

nx = norm(sol.x)

nx = 1.0000
The output structure gives more information on the solution process, such as the number of
iterations (24), the solver (fmincon), and the number of function evaluations (84). For more
information on these statistics, see “Tolerances and Stopping Criteria” on page 2-68.
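These statistics can be read directly from the fields of the output structure; the field names below follow the standard fmincon output fields, which solve passes through:

```matlab
% After [sol,fval,exitflag,output] = solve(prob,x0):
output.iterations   % number of solver iterations (24 in this run)
output.funcCount    % number of function evaluations (84 in this run)
output.solver       % solver chosen by solve ('fmincon')
```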
For more complex expressions, write function files for the objective or constraint functions, and
convert them to optimization expressions using fcn2optimexpr. For example, the basis of the
nonlinear constraint function is in the disk.m file:

type disk

function radsqr = disk(x)

radsqr = x(1)^2 + x(2)^2;

Convert this function file to an optimization expression, and use it to create the constraint.

nlcons = fcn2optimexpr(@disk,x) <= 1;

Furthermore, you can also convert the rosenbrock function handle, which was defined at the
beginning of the plotting routine, into an optimization expression.

rosenexpr = fcn2optimexpr(rosenbrock,x);

Create an optimization problem using these converted optimization expressions, and review it.

convprob = optimproblem('Objective',rosenexpr,'Constraints',nlcons);
show(convprob)

OptimizationProblem :
Solve for:
    x

minimize :
    anonymousFunction2(x)

    where:
        anonymousFunction2 = @(x)100*(x(:,2)-x(:,1).^2).^2+(1-x(:,1)).^2;

subject to :
    disk(x) <= 1
Solve the new problem. The solution is essentially the same as before.
[sol,fval,exitflag,output] = solve(convprob,x0)