Optimization Toolbox™
User's Guide

R2020a
How to Contact MathWorks

Latest news: www.mathworks.com

Sales and services: www.mathworks.com/sales_and_services

User community: www.mathworks.com/matlabcentral

Technical support: www.mathworks.com/support/contact_us

Phone: 508-647-7000

The MathWorks, Inc.


1 Apple Hill Drive
Natick, MA 01760-2098
Optimization Toolbox™ User's Guide
© COPYRIGHT 1990–2020 by The MathWorks, Inc.
The software described in this document is furnished under a license agreement. The software may be used or copied
only under the terms of the license agreement. No part of this manual may be photocopied or reproduced in any form
without prior written consent from The MathWorks, Inc.
FEDERAL ACQUISITION: This provision applies to all acquisitions of the Program and Documentation by, for, or through
the federal government of the United States. By accepting delivery of the Program or Documentation, the government
hereby agrees that this software or documentation qualifies as commercial computer software or commercial computer
software documentation as such terms are used or defined in FAR 12.212, DFARS Part 227.72, and DFARS 252.227-7014.
Accordingly, the terms and conditions of this Agreement and only those rights specified in this Agreement, shall pertain
to and govern the use, modification, reproduction, release, performance, display, and disclosure of the Program and
Documentation by the federal government (or other entity acquiring for or through the federal government) and shall
supersede any conflicting contractual terms or conditions. If this License fails to meet the government's needs or is
inconsistent in any respect with federal procurement law, the government agrees to return the Program and
Documentation, unused, to The MathWorks, Inc.
Trademarks
MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See
www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand names may be
trademarks or registered trademarks of their respective holders.
Patents
MathWorks products are protected by one or more U.S. patents. Please see www.mathworks.com/patents for
more information.
Revision History
November 1990 First printing
December 1996 Second printing For MATLAB® 5
January 1999 Third printing For Version 2 (Release 11)
September 2000 Fourth printing For Version 2.1 (Release 12)
June 2001 Online only Revised for Version 2.1.1 (Release 12.1)
September 2003 Online only Revised for Version 2.3 (Release 13SP1)
June 2004 Fifth printing Revised for Version 3.0 (Release 14)
October 2004 Online only Revised for Version 3.0.1 (Release 14SP1)
March 2005 Online only Revised for Version 3.0.2 (Release 14SP2)
September 2005 Online only Revised for Version 3.0.3 (Release 14SP3)
March 2006 Online only Revised for Version 3.0.4 (Release 2006a)
September 2006 Sixth printing Revised for Version 3.1 (Release 2006b)
March 2007 Seventh printing Revised for Version 3.1.1 (Release 2007a)
September 2007 Eighth printing Revised for Version 3.1.2 (Release 2007b)
March 2008 Online only Revised for Version 4.0 (Release 2008a)
October 2008 Online only Revised for Version 4.1 (Release 2008b)
March 2009 Online only Revised for Version 4.2 (Release 2009a)
September 2009 Online only Revised for Version 4.3 (Release 2009b)
March 2010 Online only Revised for Version 5.0 (Release 2010a)
September 2010 Online only Revised for Version 5.1 (Release 2010b)
April 2011 Online only Revised for Version 6.0 (Release 2011a)
September 2011 Online only Revised for Version 6.1 (Release 2011b)
March 2012 Online only Revised for Version 6.2 (Release 2012a)
September 2012 Online only Revised for Version 6.2.1 (Release 2012b)
March 2013 Online only Revised for Version 6.3 (Release 2013a)
September 2013 Online only Revised for Version 6.4 (Release 2013b)
March 2014 Online only Revised for Version 7.0 (Release 2014a)
October 2014 Online only Revised for Version 7.1 (Release 2014b)
March 2015 Online only Revised for Version 7.2 (Release 2015a)
September 2015 Online only Revised for Version 7.3 (Release 2015b)
March 2016 Online only Revised for Version 7.4 (Release 2016a)
September 2016 Online only Revised for Version 7.5 (Release 2016b)
March 2017 Online only Revised for Version 7.6 (Release 2017a)
September 2017 Online only Revised for Version 8.0 (Release 2017b)
March 2018 Online only Revised for Version 8.1 (Release 2018a)
September 2018 Online only Revised for Version 8.2 (Release 2018b)
March 2019 Online only Revised for Version 8.3 (Release 2019a)
September 2019 Online only Revised for Version 8.4 (Release 2019b)
March 2020 Online only Revised for Version 8.5 (Release 2020a)
Contents

Acknowledgments

Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xx

Getting Started
1
Optimization Toolbox Product Description . . . . . . . . . . . . . . . . . . . . . . . . . 1-2
Key Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-2

First Choose Problem-Based or Solver-Based Approach . . . . . . . . . . . . . . 1-3

Solve a Constrained Nonlinear Problem, Problem-Based . . . . . . . . . . . . . 1-5

Solve a Constrained Nonlinear Problem, Solver-Based . . . . . . . . . . . . . . 1-11


Typical Optimization Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-11
Problem Formulation: Rosenbrock's Function . . . . . . . . . . . . . . . . . . . . . 1-11
Define the Problem in Toolbox Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-12
Run the Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-13
Interpret the Result . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-16

Set Up a Linear Program, Solver-Based . . . . . . . . . . . . . . . . . . . . . . . . . . 1-18


Convert a Problem to Solver Form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-18
Model Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-18
Solution Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-19
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-24

Set Up a Linear Program, Problem-Based . . . . . . . . . . . . . . . . . . . . . . . . . 1-25


Convert a Problem to Solver Form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-25
Model Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-25
First Solution Method: Create Optimization Variable for Each Problem
Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-26
Create Problem and Objective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-27
Create and Include Linear Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . 1-27
Solve Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-27
Examine Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-28
Second Solution Method: Create One Optimization Variable and Indices . . . . . . . . . 1-29
Set Variable Bounds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-29
Create Problem, Linear Constraints, and Solution . . . . . . . . . . . . . . . . . . 1-29
Examine Indexed Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-30
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-30

Setting Up an Optimization
2
Optimization Theory Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-2

Optimization Toolbox Solvers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-3

Optimization Decision Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-4

Choosing the Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-6


fmincon Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-6
fsolve Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-7
fminunc Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-7
Least Squares Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-8
Linear Programming Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-9
Quadratic Programming Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-9
Large-Scale vs. Medium-Scale Algorithms . . . . . . . . . . . . . . . . . . . . . . . . 2-10
Potential Inaccuracy with Interior-Point Algorithms . . . . . . . . . . . . . . . . 2-10

Problems Handled by Optimization Toolbox Functions . . . . . . . . . . . . . . 2-12

Complex Numbers in Optimization Toolbox Solvers . . . . . . . . . . . . . . . . . 2-14

Types of Objective Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-16

Writing Scalar Objective Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-17


Function Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-17
Anonymous Function Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-18
Including Gradients and Hessians . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-19

Writing Vector and Matrix Objective Functions . . . . . . . . . . . . . . . . . . . . 2-26


What Are Vector or Matrix Objective Functions? . . . . . . . . . . . . . . . . . . . 2-26
Jacobians of Vector Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-26
Jacobians of Matrix Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-27
Jacobians with Matrix-Valued Independent Variables . . . . . . . . . . . . . . . . 2-27

Writing Objective Functions for Linear or Quadratic Problems . . . . . . . 2-29

Maximizing an Objective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-30

Matrix Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-31

Types of Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-32

Iterations Can Violate Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-33


Intermediate Iterations can Violate Constraints . . . . . . . . . . . . . . . . . . . 2-33
Algorithms That Satisfy Bound Constraints . . . . . . . . . . . . . . . . . . . . . . . 2-33
Solvers and Algorithms That Can Violate Bound Constraints . . . . . . . . . . 2-33

Bound Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-34

Linear Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-35


What Are Linear Constraints? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-35
Linear Inequality Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-35

Linear Equality Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-36

Nonlinear Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-37


Including Gradients in Constraint Functions . . . . . . . . . . . . . . . . . . . . . . 2-38
Anonymous Nonlinear Constraint Functions . . . . . . . . . . . . . . . . . . . . . . 2-38

Or Instead of And Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-41

How to Use All Types of Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-45

Objective and Nonlinear Constraints in the Same Function . . . . . . . . . . 2-48

Objective and Constraints Having a Common Function in Serial or Parallel, Problem-Based . . . . . . . . . 2-52

Passing Extra Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-57


Extra Parameters, Fixed Variables, or Data . . . . . . . . . . . . . . . . . . . . . . . 2-57
Anonymous Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-57
Nested Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-58
Global Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-59

What Are Options? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-60

Options in Common Use: Tuning and Troubleshooting . . . . . . . . . . . . . . 2-61

Set and Change Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-62

Choose Between optimoptions and optimset . . . . . . . . . . . . . . . . . . . . . . 2-63

View Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-66

Tolerances and Stopping Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-68

Tolerance Details . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-71

Checking Validity of Gradients or Jacobians . . . . . . . . . . . . . . . . . . . . . . . 2-74


Check Gradient or Jacobian in Objective Function . . . . . . . . . . . . . . . . . . 2-74
How to Check Derivatives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-74
Example: Checking Derivatives of Objective and Constraint Functions . . 2-75

Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-80

Examining Results
3
Current Point and Function Value . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-2

Exit Flags and Exit Messages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-3


Exit Flags . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-3
Exit Messages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-4
Enhanced Exit Messages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-4

Exit Message Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-7

Iterations and Function Counts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-9

First-Order Optimality Measure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-11


What Is First-Order Optimality Measure? . . . . . . . . . . . . . . . . . . . . . . . . 3-11
Stopping Rules Related to First-Order Optimality . . . . . . . . . . . . . . . . . . 3-11
Unconstrained Optimality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-11
Constrained Optimality Theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-12
Constrained Optimality in Solver Form . . . . . . . . . . . . . . . . . . . . . . . . . . 3-13

Iterative Display . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-14


Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-14
Common Headings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-14
Function-Specific Headings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-15

Output Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-21

Lagrange Multiplier Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-22

Hessian Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-24


Returned Hessian . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-24
fminunc Hessian . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-24
fmincon Hessian . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-25

Plot Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-27


Plot an Optimization During Execution . . . . . . . . . . . . . . . . . . . . . . . . . . 3-27
Using a Plot Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-27

Output Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-32


What Is an Output Function? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-32
Example: Using Output Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-32

Steps to Take After Running a Solver


4
Overview of Next Steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-2

When the Solver Fails . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-3


Too Many Iterations or Function Evaluations . . . . . . . . . . . . . . . . . . . . . . 4-3
Converged to an Infeasible Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-6
Problem Unbounded . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-7
fsolve Could Not Solve Equation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-8

Solver Takes Too Long . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-9


Enable Iterative Display . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-9
Use Appropriate Tolerances . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-9
Use a Plot Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-9
Use 'lbfgs' HessianApproximation Option . . . . . . . . . . . . . . . . . . . . . . . . 4-10
Enable CheckGradients . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-10
Use Inf Instead of a Large, Arbitrary Bound . . . . . . . . . . . . . . . . . . . . . . 4-10
Use an Output Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-10

Use a Sparse Solver or a Multiply Function . . . . . . . . . . . . . . . . . . . . . . 4-10
Use Parallel Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-11

When the Solver Might Have Succeeded . . . . . . . . . . . . . . . . . . . . . . . . . . 4-12


Final Point Equals Initial Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-12
Local Minimum Possible . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-12

When the Solver Succeeds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-18


What Can Be Wrong If The Solver Succeeds? . . . . . . . . . . . . . . . . . . . . . 4-18
1. Change the Initial Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-18
2. Check Nearby Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-19
3. Check your Objective and Constraint Functions . . . . . . . . . . . . . . . . . 4-20

Local vs. Global Optima . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-22


Why the Solver Does Not Find the Smallest Minimum . . . . . . . . . . . . . . . 4-22
Searching for a Smaller Minimum . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-22
Basins of Attraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-23

Optimizing a Simulation or Ordinary Differential Equation . . . . . . . . . . 4-26


What Is Optimizing a Simulation or ODE? . . . . . . . . . . . . . . . . . . . . . . . . 4-26
Potential Problems and Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-26
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-30

Optimization App
5
Optimization App . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-2
Optimization App Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-2
Specifying Certain Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-6
Importing and Exporting Your Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-8

Optimization App Alternatives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-12


Optimize Without Using the App . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-12
Set Options Using Live Scripts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-12
Set Options: Command Line or Standard Scripts . . . . . . . . . . . . . . . . . . . 5-14
Choose Plot Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-15
Pass Solver Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-16

Nonlinear algorithms and examples


6
Unconstrained Nonlinear Optimization Algorithms . . . . . . . . . . . . . . . . . . 6-2
Unconstrained Optimization Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-2
fminunc trust-region Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-2
fminunc quasi-newton Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-4

fminsearch Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-9

Unconstrained Minimization Using fminunc . . . . . . . . . . . . . . . . . . . . . . 6-11

Minimization with Gradient and Hessian . . . . . . . . . . . . . . . . . . . . . . . . . 6-13

Minimization with Gradient and Hessian Sparsity Pattern . . . . . . . . . . . 6-16

Constrained Nonlinear Optimization Algorithms . . . . . . . . . . . . . . . . . . . 6-19


Constrained Optimization Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-19
fmincon Trust Region Reflective Algorithm . . . . . . . . . . . . . . . . . . . . . . . 6-19
fmincon Active Set Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-22
fmincon SQP Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-29
fmincon Interior Point Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-30
fminbnd Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-32
fseminf Problem Formulation and Algorithm . . . . . . . . . . . . . . . . . . . . . . 6-32

Tutorial for the Optimization Toolbox™ . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-36

Banana Function Minimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-49

Minimizing an Expensive Optimization Problem Using Parallel Computing Toolbox™ . . . . . . . . . 6-56

Nonlinear Inequality Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-61

Nonlinear Constraints with Gradients . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-63

fmincon Interior-Point Algorithm with Analytic Hessian . . . . . . . . . . . . . 6-66

Linear or Quadratic Objective with Quadratic Constraints . . . . . . . . . . . 6-71

Nonlinear Equality and Inequality Constraints . . . . . . . . . . . . . . . . . . . . 6-75

Optimization App with the fmincon Solver . . . . . . . . . . . . . . . . . . . . . . . . 6-77


Step 1: Write a file objecfun.m for the objective function. . . . . . . . . . . . . 6-77
Step 2: Write a file nonlconstr.m for the nonlinear constraints. . . . . . . . . 6-77
Step 3: Set up and run the problem with the Optimization app. . . . . . . . . 6-77

Minimization with Bound Constraints and Banded Preconditioner . . . . 6-81

Minimization with Linear Equality Constraints, Trust-Region Reflective Algorithm . . . . . . . . . 6-87

Minimization with Dense Structured Hessian, Linear Equalities . . . . . . 6-90


Hessian Multiply Function for Lower Memory . . . . . . . . . . . . . . . . . . . . . 6-90
Step 1: Write a file brownvv.m that computes the objective function, the
gradient, and the sparse part of the Hessian. . . . . . . . . . . . . . . . . . . . 6-91
Step 2: Write a function to compute Hessian-matrix products for H given a
matrix Y. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-91
Step 3: Call a nonlinear minimization routine with a starting point and
linear equality constraints. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-91
Preconditioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-93

Symbolic Math Toolbox™ Calculates Gradients and Hessians . . . . . . . . . 6-94

Using Symbolic Mathematics with Optimization Toolbox™ Solvers . . . 6-105

Code Generation in fmincon . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-114
What Is Code Generation? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-114
Code Generation Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-114
Generated Code Not Multithreaded . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-115

Code Generation for Optimization Basics . . . . . . . . . . . . . . . . . . . . . . . . 6-116


Generate Code for fmincon . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-116
Modify Example for Efficiency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-116

Static Memory Allocation for fmincon Code Generation . . . . . . . . . . . . 6-120

Optimization Code Generation for Real-Time Applications . . . . . . . . . . 6-122


Time Limits on Generated Code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-122
Match the Target Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-122
Set Coder Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-122
Benchmark the Solver . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-123
Set Initial Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-123
Set Options Appropriately . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-123
Global Minimum . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-124

One-Dimensional Semi-Infinite Constraints . . . . . . . . . . . . . . . . . . . . . . 6-125

Two-Dimensional Semi-Infinite Constraint . . . . . . . . . . . . . . . . . . . . . . . 6-128

Analyzing the Effect of Uncertainty Using Semi-Infinite Programming . . . . . . . . . 6-131

Nonlinear Problem-Based
7
Rational Objective Function, Problem-Based . . . . . . . . . . . . . . . . . . . . . . . 7-2

Solve Constrained Nonlinear Optimization, Problem-Based . . . . . . . . . . . 7-4

Convert Nonlinear Function to Optimization Expression . . . . . . . . . . . . . . 7-8

Constrained Electrostatic Nonlinear Optimization, Problem-Based . . . . 7-12

Problem-Based Nonlinear Minimization with Linear Constraints . . . . . . 7-17

Include Derivatives in Problem-Based Workflow . . . . . . . . . . . . . . . . . . . 7-20


Why Include Derivatives? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-20
Create Optimization Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-20
Convert Problem to Solver-Based Form . . . . . . . . . . . . . . . . . . . . . . . . . . 7-21
Calculate Derivatives and Keep Track of Variables . . . . . . . . . . . . . . . . . 7-21
Edit the Objective and Constraint Files . . . . . . . . . . . . . . . . . . . . . . . . . . 7-22
Run Problem Using Two Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-22

Output Function for Problem-Based Optimization . . . . . . . . . . . . . . . . . . 7-25

Solve Nonlinear Feasibility Problem, Problem-Based . . . . . . . . . . . . . . . 7-30

Multiobjective Algorithms and Examples
8
Multiobjective Optimization Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-2
Multiobjective Optimization Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-2
Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-3

Compare fminimax and fminunc . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-6

Using fminimax with a Simulink® Model . . . . . . . . . . . . . . . . . . . . . . . . . . 8-8

Signal Processing Using fgoalattain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-12


Step 1: Write a file filtmin.m . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-12
Step 2: Invoke optimization routine . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-12

Generate and Plot a Pareto Front . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-15

Multi-Objective Goal Attainment Optimization . . . . . . . . . . . . . . . . . . . . . 8-18

Minimax Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8-24

Linear Programming and Mixed-Integer Linear Programming


9
Linear Programming Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-2
Linear Programming Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-2
Interior-Point linprog Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-2
Interior-Point-Legacy Linear Programming . . . . . . . . . . . . . . . . . . . . . . . . 9-6
Dual-Simplex Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-9

Typical Linear Programming Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-13

Maximize Long-Term Investments Using Linear Programming: Solver-Based . . . . . . . . . 9-15

Mixed-Integer Linear Programming Algorithms . . . . . . . . . . . . . . . . . . . 9-26


Mixed-Integer Linear Programming Definition . . . . . . . . . . . . . . . . . . . . 9-26
intlinprog Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-26

Tuning Integer Linear Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-35


Change Options to Improve the Solution Process . . . . . . . . . . . . . . . . . . 9-35
Some “Integer” Solutions Are Not Integers . . . . . . . . . . . . . . . . . . . . . . . 9-36
Large Components Not Integer Valued . . . . . . . . . . . . . . . . . . . . . . . . . . 9-36
Large Coefficients Disallowed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-36

Mixed-Integer Linear Programming Basics: Solver-Based . . . . . . . . . . . 9-37

Factory, Warehouse, Sales Allocation Model: Solver-Based . . . . . . . . . . . 9-40

Traveling Salesman Problem: Solver-Based . . . . . . . . . . . . . . . . . . . . . . . 9-49

Optimal Dispatch of Power Generators: Solver-Based . . . . . . . . . . . . . . . 9-55

Mixed-Integer Quadratic Programming Portfolio Optimization: Solver-Based . . . . . . . . . 9-65

Solve Sudoku Puzzles Via Integer Programming: Solver-Based . . . . . . . 9-72

Office Assignments by Binary Integer Programming: Solver-Based . . . . 9-79

Cutting Stock Problem: Solver-Based . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-86

Factory, Warehouse, Sales Allocation Model: Problem-Based . . . . . . . . . 9-90

Traveling Salesman Problem: Problem-Based . . . . . . . . . . . . . . . . . . . . . 9-98

Optimal Dispatch of Power Generators: Problem-Based . . . . . . . . . . . . 9-104

Office Assignments by Binary Integer Programming: Problem-Based 9-113

Mixed-Integer Quadratic Programming Portfolio Optimization: Problem-Based . . . . . . . . . 9-118

Cutting Stock Problem: Problem-Based . . . . . . . . . . . . . . . . . . . . . . . . . 9-125

Solve Sudoku Puzzles Via Integer Programming: Problem-Based . . . . 9-129

Minimize Makespan in Parallel Processing . . . . . . . . . . . . . . . . . . . . . . 9-135

Investigate Linear Infeasibilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-139

Problem-Based Optimization
10
Problem-Based Optimization Workflow . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-2

Problem-Based Workflow for Solving Equations . . . . . . . . . . . . . . . . . . . . 10-4

Optimization Expressions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-6


What Are Optimization Expressions? . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-6
Expressions for Objective Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-6
Expressions for Constraints and Equations . . . . . . . . . . . . . . . . . . . . . . . 10-7
Optimization Variables Have Handle Behavior . . . . . . . . . . . . . . . . . . . . 10-9

Pass Extra Parameters in Problem-Based Approach . . . . . . . . . . . . . . . 10-11

Review or Modify Optimization Problems . . . . . . . . . . . . . . . . . . . . . . . . 10-14


Review Problem Using show or write . . . . . . . . . . . . . . . . . . . . . . . . . . 10-14
Change Default Solver or Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-14
Correct a Misspecified Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-16
Duplicate Variable Name . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-19

Named Index for Optimization Variables . . . . . . . . . . . . . . . . . . . . . . . . . 10-20
Create Named Indices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-20
Use Named Indices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-21
View Solution with Index Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-22

Examine Optimization Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-25


Obtain Numeric Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-25
Examine Solution Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-26
Infeasible Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-26
Solution Takes Too Long . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-27

Create Efficient Optimization Problems . . . . . . . . . . . . . . . . . . . . . . . . . 10-28

Separate Optimization Model from Data . . . . . . . . . . . . . . . . . . . . . . . . . 10-30

Problem-Based Optimization Algorithms . . . . . . . . . . . . . . . . . . . . . . . . 10-32

Variables with Duplicate Names Disallowed . . . . . . . . . . . . . . . . . . . . . . 10-34

Expression Contains Inf or NaN . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-35

Supported Operations on Optimization Variables and Expressions . . . 10-36


Notation for Supported Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-36
Operations Returning Optimization Expressions . . . . . . . . . . . . . . . . . . 10-36
Operations Returning Optimization Variables . . . . . . . . . . . . . . . . . . . . 10-38
Operations on Optimization Expressions . . . . . . . . . . . . . . . . . . . . . . . . 10-38
Operations Returning Constraint Expressions . . . . . . . . . . . . . . . . . . . . 10-38
Some Undocumented Operations Work on Optimization Variables and
Expressions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-38
Unsupported Functions and Operations Require fcn2optimexpr . . . . . . 10-38

Mixed-Integer Linear Programming Basics: Problem-Based . . . . . . . . 10-40

Create Initial Point for Optimization with Named Index Variables . . . . 10-43

Quadratic Programming
11
Quadratic Programming Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-2
Quadratic Programming Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-2
interior-point-convex quadprog Algorithm . . . . . . . . . . . . . . . . . . . . . . . . 11-2
trust-region-reflective quadprog Algorithm . . . . . . . . . . . . . . . . . . . . . . . 11-7
active-set quadprog Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-11

Quadratic Minimization with Bound Constraints . . . . . . . . . . . . . . . . . . 11-15


Step 1: Load the Hessian and define f, lb, and ub. . . . . . . . . . . . . . . . . . 11-15
Step 2: Call a quadratic minimization routine with a starting point xstart . . . . . . . . . 11-15

Quadratic Minimization with Dense, Structured Hessian . . . . . . . . . . . 11-17


Take advantage of a structured Hessian . . . . . . . . . . . . . . . . . . . . . . . . 11-17

Step 1: Decide what part of H to pass to quadprog as the first argument . . . . . . . . . 11-17
Step 2: Write a function to compute Hessian-matrix products for H. . . . 11-17
Step 3: Call a quadratic minimization routine with a starting point. . . . 11-18
Preconditioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-19

Large Sparse Quadratic Program with Interior Point Algorithm . . . . . 11-21

Bound-Constrained Quadratic Programming, Solver-Based . . . . . . . . . 11-24

Quadratic Programming for Portfolio Optimization Problems, Solver-Based . . . . . . . . . 11-28

Quadratic Programming with Bound Constraints: Problem-Based . . . . 11-34

Large Sparse Quadratic Program, Problem-Based . . . . . . . . . . . . . . . . . 11-37

Bound-Constrained Quadratic Programming, Problem-Based . . . . . . . 11-40

Quadratic Programming for Portfolio Optimization, Problem-Based . . 11-44

Code Generation for quadprog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-51


What Is Code Generation? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-51
Code Generation Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-51
Generated Code Not Multithreaded . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-52

Generate Code for quadprog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-53


First Steps in quadprog Code Generation . . . . . . . . . . . . . . . . . . . . . . . 11-53
Modify Example for Efficiency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-54

Quadratic Programming with Many Linear Constraints . . . . . . . . . . . . 11-57

Least Squares
12
Least-Squares (Model Fitting) Algorithms . . . . . . . . . . . . . . . . . . . . . . . . 12-2
Least Squares Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-2
Linear Least Squares: Interior-Point or Active-Set . . . . . . . . . . . . . . . . . . 12-2
Trust-Region-Reflective Least Squares . . . . . . . . . . . . . . . . . . . . . . . . . . 12-3
Levenberg-Marquardt Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-6

Nonlinear Data-Fitting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-9

lsqnonlin with a Simulink® Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-17

Nonlinear Least Squares Without and Including Jacobian . . . . . . . . . . 12-21

Nonnegative Linear Least Squares, Solver-Based . . . . . . . . . . . . . . . . . 12-24

Optimization App with the lsqlin Solver . . . . . . . . . . . . . . . . . . . . . . . . . 12-27


The Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-27

Setting Up the Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-27

Jacobian Multiply Function with Linear Least Squares . . . . . . . . . . . . . 12-30

Large-Scale Constrained Linear Least-Squares, Solver-Based . . . . . . . 12-34

Shortest Distance to a Plane . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-38

Nonnegative Linear Least Squares, Problem-Based . . . . . . . . . . . . . . . . 12-40

Large-Scale Constrained Linear Least-Squares, Problem-Based . . . . . 12-44

Nonlinear Curve Fitting with lsqcurvefit . . . . . . . . . . . . . . . . . . . . . . . . . 12-48

Fit a Model to Complex-Valued Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-50

Fit an Ordinary Differential Equation (ODE) . . . . . . . . . . . . . . . . . . . . . 12-54

Nonlinear Least-Squares, Problem-Based . . . . . . . . . . . . . . . . . . . . . . . . 12-62

Fit ODE, Problem-Based . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-77

Nonlinear Data-Fitting Using Several Problem-Based Approaches . . . 12-84

Write Objective Function for Problem-Based Least Squares . . . . . . . . . 12-92

Systems of Equations
13
Equation Solving Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-2
Equation Solving Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-2
Trust-Region Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-2
Trust-Region-Dogleg Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-4
Levenberg-Marquardt Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-5
fzero Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-6
\ Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-6

Nonlinear Equations with Analytic Jacobian . . . . . . . . . . . . . . . . . . . . . . . 13-7


Step 1: Write a file bananaobj.m to compute the objective function values
and the Jacobian. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-7
Step 2: Call the solve routine for the system of equations. . . . . . . . . . . . . 13-8

Nonlinear Equations with Finite-Difference Jacobian . . . . . . . . . . . . . . . 13-9

Nonlinear Equations with Jacobian . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-11


Step 1: Write a file nlsf1.m that computes the objective function values and
the Jacobian. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-11
Step 2: Call the solve routine for the system of equations. . . . . . . . . . . 13-11

Nonlinear Equations with Jacobian Sparsity Pattern . . . . . . . . . . . . . . . 13-13
Step 1: Write a file nlsf1a.m that computes the objective function values . . . . . . . . . 13-13
Step 2: Call the system of equations solve routine. . . . . . . . . . . . . . . . . 13-13

Nonlinear Systems with Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-15


Solve Equations with Inequality Constraints . . . . . . . . . . . . . . . . . . . . . 13-15
Use Different Start Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-15
Use Different Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-16
Use lsqnonlin with Bounds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-16
Set Equations and Inequalities as fmincon Constraints . . . . . . . . . . . . . 13-17

Solve Nonlinear System of Equations, Problem-Based . . . . . . . . . . . . . 13-19

Solve Nonlinear System of Polynomials, Problem-Based . . . . . . . . . . . . 13-21

Follow Equation Solution as a Parameter Changes . . . . . . . . . . . . . . . . 13-23

Nonlinear System of Equations with Constraints, Problem-Based . . . . 13-30

Parallel Computing for Optimization


14
What Is Parallel Computing in Optimization Toolbox? . . . . . . . . . . . . . . . 14-2
Parallel Optimization Functionality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-2
Parallel Estimation of Gradients . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-2
Nested Parallel Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-3

Using Parallel Computing in Optimization Toolbox . . . . . . . . . . . . . . . . . 14-5


Using Parallel Computing with Multicore Processors . . . . . . . . . . . . . . . 14-5
Using Parallel Computing with a Multiprocessor Network . . . . . . . . . . . . 14-5
Testing Parallel Computations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-6

Minimizing an Expensive Optimization Problem Using Parallel Computing Toolbox™ . . . . . . . . . 14-8

Improving Performance with Parallel Computing . . . . . . . . . . . . . . . . . 14-13


Factors That Affect Speed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-13
Factors That Affect Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-13
Searching for Global Optima . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-14

Argument and Options Reference


15
Function Input Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-2

Function Output Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-4

Optimization Options Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-6
Optimization Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-6
Hidden Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-16

Current and Legacy Option Name Tables . . . . . . . . . . . . . . . . . . . . . . . . 15-21

Output Function Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-26


What Are Output Functions? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-26
Structure of the Output Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-26
Fields in optimValues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-27
States of the Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-31
Stop Flag . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-31

intlinprog Output Function and Plot Function Syntax . . . . . . . . . . . . . . 15-33


What Are Output Functions and Plot Functions? . . . . . . . . . . . . . . . . . . 15-33
Custom Function Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-33
optimValues Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-34

Functions
16

Acknowledgments
MathWorks® would like to acknowledge the following contributors to Optimization Toolbox
algorithms.

Thomas F. Coleman researched and contributed algorithms for constrained and unconstrained
minimization, nonlinear least squares and curve fitting, constrained linear least squares, quadratic
programming, and nonlinear equations.

Dr. Coleman is Professor of Combinatorics and Optimization at the University of Waterloo.

Yin Zhang researched and contributed the large-scale linear programming algorithm.

Dr. Zhang is Professor of Computational and Applied Mathematics at Rice University.

1

Getting Started

• “Optimization Toolbox Product Description” on page 1-2


• “First Choose Problem-Based or Solver-Based Approach” on page 1-3
• “Solve a Constrained Nonlinear Problem, Problem-Based” on page 1-5
• “Solve a Constrained Nonlinear Problem, Solver-Based” on page 1-11
• “Set Up a Linear Program, Solver-Based” on page 1-18
• “Set Up a Linear Program, Problem-Based” on page 1-25

Optimization Toolbox Product Description


Solve linear, quadratic, integer, and nonlinear optimization problems

Optimization Toolbox provides functions for finding parameters that minimize or maximize objectives
while satisfying constraints. The toolbox includes solvers for linear programming (LP), mixed-integer
linear programming (MILP), quadratic programming (QP), nonlinear programming (NLP),
constrained linear least squares, nonlinear least squares, and nonlinear equations. You can define
your optimization problem with functions and matrices or by specifying variable expressions that
reflect the underlying mathematics.

You can use the toolbox solvers to find optimal solutions to continuous and discrete problems,
perform tradeoff analyses, and incorporate optimization methods into algorithms and applications.
The toolbox lets you perform design optimization tasks, including parameter estimation, component
selection, and parameter tuning. It can be used to find optimal solutions in applications such as
portfolio optimization, resource allocation, and production planning and scheduling.
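
For instance, a minimal unconstrained call to one of the nonlinear solvers looks like the following (a hypothetical two-variable objective used for illustration, not an example from this guide):

fun = @(x) x(1)^2 + 3*x(2)^2 + x(1)*x(2);   % smooth objective to minimize
x0 = [1; 1];                                % starting point
x = fminunc(fun,x0)                         % returns a point near [0; 0]

Later chapters describe each solver in detail, including its options and the problem types it handles.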

Key Features
• Nonlinear and multiobjective optimization of smooth constrained and unconstrained problems
• Solvers for nonlinear least squares, constrained linear least squares, data fitting, and nonlinear
equations
• Quadratic programming (QP) and linear programming (LP)
• Mixed-integer linear programming (MILP)
• Optimization modeling tools
• Graphical monitoring of optimization progress
• Gradient estimation acceleration (with Parallel Computing Toolbox™)


First Choose Problem-Based or Solver-Based Approach


Optimization Toolbox has two approaches to solving optimization problems or equations: problem-
based and solver-based. Before you start to solve a problem, you must first choose the appropriate
approach.

This table summarizes the main differences between the two approaches.

Approaches and characteristics:

“Problem-Based Optimization Setup”
• Easier to create and debug
• Represents the objective and constraints symbolically
• Requires translation from problem form to matrix form, resulting in a longer solution time
• Does not allow direct inclusion of gradient or Hessian; see “Include Derivatives in Problem-Based Workflow” on page 7-20
• See the steps in “Problem-Based Optimization Workflow” on page 10-2 or “Problem-Based Workflow for Solving Equations” on page 10-4
• Basic linear example: “Mixed-Integer Linear Programming Basics: Problem-Based” on page 10-40 or the video Solve a Mixed-Integer Linear Programming Problem Using Optimization Modeling
• Basic nonlinear example: “Solve a Constrained Nonlinear Problem, Problem-Based” on page 1-5
• Basic equation-solving example: “Solve Nonlinear System of Equations, Problem-Based” on page 13-19

“Solver-Based Optimization Problem Setup”
• Harder to create and debug
• Represents the objective and constraints as functions or matrices
• Does not require translation from problem form to matrix form, resulting in a faster solution time
• Allows direct inclusion of gradient or Hessian
• Allows use of a Hessian multiply function or Jacobian multiply function to save memory in large problems; see “Quadratic Minimization with Dense, Structured Hessian” on page 11-17 or “Jacobian Multiply Function with Linear Least Squares” on page 12-30
• See the steps in “Solver-Based Optimization Problem Setup”
• Basic linear example: “Mixed-Integer Linear Programming Basics: Solver-Based” on page 9-37
• Basic nonlinear example: “Solve a Constrained Nonlinear Problem, Solver-Based” on page 1-11
• Basic equation-solving examples: “Examples” on page 16-0
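
To make the contrast concrete, the following sketch (an illustrative two-variable linear program, not an example from this guide) solves the same problem with each approach.

% Problem-based: state the model with optimization variables and expressions
x = optimvar('x',2,'LowerBound',0);
prob = optimproblem('Objective',-x(1) - 2*x(2));   % maximize x1 + 2*x2
prob.Constraints.c1 = x(1) + x(2) <= 4;
sol = solve(prob);                                 % sol.x holds the solution

% Solver-based: assemble matrices and call linprog directly
f = [-1; -2];              % objective coefficients
A = [1 1]; b = 4;          % linear inequality A*x <= b
lb = [0; 0];               % lower bounds
xsol = linprog(f,A,b,[],[],lb);

Both calls return the same solution; the problem-based form reads like the mathematics, while the solver-based form requires you to translate the problem into matrix arguments yourself.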


See Also

More About
• “Problem-Based Optimization Setup”
• “Solver-Based Optimization Problem Setup”


Solve a Constrained Nonlinear Problem, Problem-Based


Typical Optimization Problem

This example shows how to solve a constrained nonlinear optimization problem using the problem-
based approach. The example demonstrates the typical workflow: create an objective function,
create constraints, solve the problem, and examine the results.

Note:

If your objective function or nonlinear constraints are not composed of elementary functions, you
must convert the nonlinear functions to optimization expressions using fcn2optimexpr. See the
last part of this example, Alternative Formulation Using fcn2optimexpr on page 1-0, or “Convert
Nonlinear Function to Optimization Expression” on page 7-8.

For the solver-based approach to this problem, see “Solve a Constrained Nonlinear Problem, Solver-
Based” on page 1-11.

Problem Formulation: Rosenbrock's Function

Consider the problem of minimizing Rosenbrock's function

f(x) = 100(x₂ − x₁²)² + (1 − x₁)²,

over the unit disk, meaning the disk of radius 1 centered at the origin. In other words, find x that
minimizes the function f(x) over the set x₁² + x₂² ≤ 1. This problem is a minimization of a nonlinear
function subject to a nonlinear constraint.

Rosenbrock's function is a standard test function in optimization. It has a unique minimum value of 0
attained at the point [1,1]. Finding the minimum is a challenge for some algorithms because the
function has a shallow minimum inside a deeply curved valley. The solution for this problem is not at
the point [1,1] because that point does not satisfy the constraint.
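
You can verify this directly: the unconstrained minimizer lies outside the unit disk (a quick check, not part of the original example).

xUnc = [1 1];
sum(xUnc.^2)    % returns 2, which violates x1^2 + x2^2 <= 1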

This figure shows two views of Rosenbrock's function in the unit disk. The vertical axis is log-scaled;
in other words, the plot shows log(1 + f(x)). Contour lines lie beneath the surface plot.
rosenbrock = @(x)100*(x(:,2) - x(:,1).^2).^2 + (1 - x(:,1)).^2; % Vectorized function

figure1 = figure('Position',[1 200 600 300]);


colormap('gray');
axis square;
R = 0:.002:1;
TH = 2*pi*(0:.002:1);
X = R'*cos(TH);
Y = R'*sin(TH);
Z = log(1 + rosenbrock([X(:),Y(:)]));
Z = reshape(Z,size(X));

% Create subplot
subplot1 = subplot(1,2,1,'Parent',figure1);
view([124 34]);
grid('on');
hold on;

% Create surface


surf(X,Y,Z,'Parent',subplot1,'LineStyle','none');

% Create contour
contour(X,Y,Z,'Parent',subplot1);

% Create subplot
subplot2 = subplot(1,2,2,'Parent',figure1);
view([234 34]);
grid('on');
hold on

% Create surface
surf(X,Y,Z,'Parent',subplot2,'LineStyle','none');

% Create contour
contour(X,Y,Z,'Parent',subplot2);

% Create textarrow
annotation(figure1,'textarrow',[0.4 0.31],...
[0.055 0.16],...
'String',{'Minimum at (0.7864,0.6177)'});

% Create arrow
annotation(figure1,'arrow',[0.59 0.62],...
[0.065 0.34]);

title("Rosenbrock's Function: Two Views")

hold off

The rosenbrock function handle calculates Rosenbrock's function at any number of 2-D points at
once. This “Vectorization” (MATLAB) speeds the plotting of the function, and can be useful in other
contexts for speeding evaluation of a function at multiple points.
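
For example, because the function handle accepts a matrix whose rows are points, you can evaluate it at several points in one call. The specific points below are chosen only for illustration.

pts = [0 0; 1 1; 0.5 0.5];   % each row is one 2-D point
fvals = rosenbrock(pts)      % returns [1; 0; 6.5], one value per row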


The function f(x) is called the objective function. The objective function is the function you want to
minimize. The inequality x₁² + x₂² ≤ 1 is called a constraint. Constraints limit the set of x over which a
solver searches for a minimum. You can have any number of constraints, which are inequalities or
equations.

Define Problem Using Optimization Variables

The problem-based approach to optimization uses optimization variables to define objective and
constraints. There are two approaches for creating expressions using these variables:

• For polynomial or rational functions, write expressions directly in the variables.


• For other types of functions, convert functions to optimization expressions using fcn2optimexpr.
See Alternative Formulation Using fcn2optimexpr at the end of this example.

For this problem, both the objective function and the nonlinear constraint are polynomials, so you can
write the expressions directly in terms of optimization variables. Create a 2-D optimization variable
named 'x'.
x = optimvar('x',1,2);

Create the objective function as a polynomial in the optimization variable.


obj = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;

Create an optimization problem named prob having obj as the objective function.
prob = optimproblem('Objective',obj);

Create the nonlinear constraint as a polynomial in the optimization variable.


nlcons = x(1)^2 + x(2)^2 <= 1;

Include the nonlinear constraint in the problem.


prob.Constraints.circlecons = nlcons;

Review the problem.


show(prob)

OptimizationProblem :

Solve for:
x

minimize :
((100 .* (x(2) - x(1).^2).^2) + (1 - x(1)).^2)

subject to circlecons:
(x(1).^2 + x(2).^2) <= 1

Solve Problem

To solve the optimization problem, call solve. The problem needs an initial point, which is a
structure giving the initial value of the optimization variable. Create the initial point structure x0
having an x-value of [0 0].


x0.x = [0 0];
[sol,fval,exitflag,output] = solve(prob,x0)

Solving problem using fmincon.

Local minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in
feasible directions, to within the value of the optimality tolerance,
and constraints are satisfied to within the value of the constraint tolerance.

sol = struct with fields:


x: [0.7864 0.6177]

fval = 0.0457

exitflag =
OptimalSolution

output = struct with fields:


iterations: 24
funcCount: 84
constrviolation: 0
stepsize: 6.9164e-06
algorithm: 'interior-point'
firstorderopt: 2.0934e-08
cgiterations: 4
message: '...'
solver: 'fmincon'

Examine Solution

The solution shows exitflag = OptimalSolution. This exit flag indicates that the solution is a
local optimum. For information on trying to find a better solution, see “When the Solver Succeeds” on
page 4-18.

The exit message indicates that the solution satisfies the constraints. You can check that the solution
is indeed feasible in several ways.

• Check the reported infeasibility in the constrviolation field of the output structure.

infeas = output.constrviolation

infeas = 0

An infeasibility of 0 indicates that the solution is feasible.

• Compute the infeasibility at the solution.

infeas = infeasibility(nlcons,sol)

infeas = 0

Again, an infeasibility of 0 indicates that the solution is feasible.

• Compute the norm of x to ensure that it is less than or equal to 1.


nx = norm(sol.x)

nx = 1.0000

The output structure gives more information on the solution process, such as the number of
iterations (24), the solver (fmincon), and the number of function evaluations (84). For more
information on these statistics, see “Tolerances and Stopping Criteria” on page 2-68.
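
For instance, you can read these statistics directly from the fields displayed above; this small snippet is illustrative only.

fprintf('Solver: %s, iterations: %d, function evaluations: %d\n', ...
    output.solver, output.iterations, output.funcCount)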

Alternative Formulation Using fcn2optimexpr

For more complex expressions, write function files for the objective or constraint functions, and
convert them to optimization expressions using fcn2optimexpr. For example, the basis of the
nonlinear constraint function is in the disk.m file:
type disk

function radsqr = disk(x)

radsqr = x(1)^2 + x(2)^2;

Convert this function file to an optimization expression.


radsqexpr = fcn2optimexpr(@disk,x);

Furthermore, you can also convert the rosenbrock function handle, which was defined at the
beginning of the plotting routine, into an optimization expression.
rosenexpr = fcn2optimexpr(rosenbrock,x);

Create an optimization problem using these converted optimization expressions.


convprob = optimproblem('Objective',rosenexpr,'Constraints',radsqexpr <= 1);

View the new problem.


show(convprob)

OptimizationProblem :

Solve for:
x

minimize :
anonymousFunction2(x)

where:

anonymousFunction2 = @(x)100*(x(:,2)-x(:,1).^2).^2+(1-x(:,1)).^2;

subject to :
disk(x) <= 1

Solve the new problem. The solution is essentially the same as before.
[sol,fval,exitflag,output] = solve(convprob,x0)

Solving problem using fmincon.
