Practical Methods
for Optimal Control
and Estimation Using
Nonlinear Programming
Advances in Design and Control
SIAM’s Advances in Design and Control series consists of texts and monographs dealing with all areas of
design and control and their applications. Topics of interest include shape optimization, multidisciplinary
design, trajectory optimization, feedback, and optimal control. The series focuses on the mathematical and
computational aspects of engineering design and control that are usable in a wide variety of scientific and
engineering disciplines.

Editor-in-Chief
Ralph C. Smith, North Carolina State University

Editorial Board
Athanasios C. Antoulas, Rice University
Siva Banda, Air Force Research Laboratory
Belinda A. Batten, Oregon State University
John Betts, The Boeing Company (retired)
Stephen L. Campbell, North Carolina State University
Eugene M. Cliff, Virginia Polytechnic Institute and State University
Michel C. Delfour, University of Montreal
Max D. Gunzburger, Florida State University
J. William Helton, University of California, San Diego
Arthur J. Krener, University of California, Davis
Kirsten Morris, University of Waterloo
Richard Murray, California Institute of Technology
Ekkehard Sachs, University of Trier

Series Volumes
Betts, John T., Practical Methods for Optimal Control and Estimation Using Nonlinear Programming, Second
Edition
Shima, Tal and Rasmussen, Steven, eds., UAV Cooperative Decision and Control: Challenges and Practical
Approaches
Speyer, Jason L. and Chung, Walter H., Stochastic Processes, Estimation, and Control
Krstic, Miroslav and Smyshlyaev, Andrey, Boundary Control of PDEs: A Course on Backstepping Designs
Ito, Kazufumi and Kunisch, Karl, Lagrange Multiplier Approach to Variational Problems and Applications
Xue, Dingyü, Chen, YangQuan, and Atherton, Derek P., Linear Feedback Control: Analysis and Design
with MATLAB
Hanson, Floyd B., Applied Stochastic Processes and Control for Jump-Diffusions: Modeling, Analysis,
and Computation
Michiels, Wim and Niculescu, Silviu-Iulian, Stability and Stabilization of Time-Delay Systems: An Eigenvalue-Based
Approach
Ioannou, Petros and Fidan, Barış, Adaptive Control Tutorial
Bhaya, Amit and Kaszkurewicz, Eugenius, Control Perspectives on Numerical Algorithms and Matrix Problems
Robinett III, Rush D., Wilson, David G., Eisler, G. Richard, and Hurtado, John E., Applied Dynamic Programming
for Optimization of Dynamical Systems
Huang, J., Nonlinear Output Regulation: Theory and Applications
Haslinger, J. and Mäkinen, R. A. E., Introduction to Shape Optimization: Theory, Approximation, and
Computation
Antoulas, Athanasios C., Approximation of Large-Scale Dynamical Systems
Gunzburger, Max D., Perspectives in Flow Control and Optimization
Delfour, M. C. and Zolésio, J.-P., Shapes and Geometries: Analysis, Differential Calculus, and Optimization
Betts, John T., Practical Methods for Optimal Control Using Nonlinear Programming
El Ghaoui, Laurent and Niculescu, Silviu-Iulian, eds., Advances in Linear Matrix Inequality Methods in Control
Helton, J. William and James, Matthew R., Extending H∞ Control to Nonlinear Systems: Control of Nonlinear
Systems to Achieve Performance Objectives
Practical Methods
for Optimal Control
and Estimation Using
Nonlinear Programming
SECOND EDITION

John T. Betts

Society for Industrial and Applied Mathematics


Philadelphia
Copyright © 2010 by the Society for Industrial and Applied Mathematics (SIAM)

10 9 8 7 6 5 4 3 2 1

All rights reserved. Printed in the United States of America. No part of this book may be
reproduced, stored, or transmitted in any manner without the written permission of the
publisher. For information, write to the Society for Industrial and Applied Mathematics,
3600 Market Street, 6th Floor, Philadelphia, PA 19104-2688 USA.

Trademarked names may be used in this book without the inclusion of a trademark
symbol. These names are used in an editorial context only; no infringement of trademark
is intended.

Dell is a registered trademark of Dell, Inc.

KNITRO is a registered trademark of Ziena Optimization, Inc.

Linux is a registered trademark of Linus Torvalds.

Maple is a registered trademark of Waterloo Maple, Inc.

NPSOL is a registered trademark of Stanford University.

SNOPT is a trademark of Stanford University and UC San Diego.

Library of Congress Cataloging-in-Publication Data


Betts, John T., 1943-
Practical methods for optimal control and estimation using nonlinear programming /
John T. Betts. — 2nd ed.
p. cm. — (Advances in design and control)
Includes bibliographical references and index.
ISBN 978-0-898716-88-7
1. Control theory. 2. Mathematical optimization. 3. Nonlinear programming. I. Title.
QA402.3.B47 2009
629.8’312—dc22
2009025106

SIAM is a registered trademark.
For Theon and Dorothy

He Inspired Creativity
She Cherished Education


Contents

Preface xiii

1 Introduction to Nonlinear Programming 1


1.1 Preliminaries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Newton’s Method in One Variable . . . . . . . . . . . . . . . . . . . . 2
1.3 Secant Method in One Variable . . . . . . . . . . . . . . . . . . . . . . 4
1.4 Newton’s Method for Minimization in One Variable . . . . . . . . . . . 5
1.5 Newton’s Method in Several Variables . . . . . . . . . . . . . . . . . . 7
1.6 Unconstrained Optimization . . . . . . . . . . . . . . . . . . . . . . . 8
1.7 Recursive Updates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.8 Equality-Constrained Optimization . . . . . . . . . . . . . . . . . . . . 12
1.8.1 Newton’s Method . . . . . . . . . . . . . . . . . . . . . . . . 15
1.9 Inequality-Constrained Optimization . . . . . . . . . . . . . . . . . . . 16
1.10 Quadratic Programming . . . . . . . . . . . . . . . . . . . . . . . . . . 18
1.11 Globalization Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . 21
1.11.1 Merit Functions . . . . . . . . . . . . . . . . . . . . . . . . . 21
1.11.2 Line-Search Methods . . . . . . . . . . . . . . . . . . . . . . 23
1.11.3 Trust-Region Methods . . . . . . . . . . . . . . . . . . . . . 25
1.11.4 Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
1.12 Nonlinear Programming . . . . . . . . . . . . . . . . . . . . . . . . . . 28
1.13 An SQP Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
1.14 Interior-Point Methods . . . . . . . . . . . . . . . . . . . . . . . . . . 30
1.15 Mathematical Program with Complementarity Conditions . . . . . . . . 36
1.15.1 The Signum or Sign Operator . . . . . . . . . . . . . . . . . . 37
1.15.2 The Absolute Value Operator . . . . . . . . . . . . . . . . . . 38
1.15.3 The Maximum Value Operator . . . . . . . . . . . . . . . . . 38
1.15.4 The Minimum Value Operator . . . . . . . . . . . . . . . . . 39
1.15.5 Solving an MPEC . . . . . . . . . . . . . . . . . . . . . . . . 39
1.16 What Can Go Wrong . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
1.16.1 Infeasible Constraints . . . . . . . . . . . . . . . . . . . . . . 40
1.16.2 Rank-Deficient Constraints . . . . . . . . . . . . . . . . . . . 40
1.16.3 Constraint Redundancy . . . . . . . . . . . . . . . . . . . . . 41
1.16.4 Discontinuities . . . . . . . . . . . . . . . . . . . . . . . . . 42
1.16.5 Scaling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
1.16.6 Nonunique Solution . . . . . . . . . . . . . . . . . . . . . . . 46


1.17 Derivative Approximation by Finite Differences . . . . . . . . . . . . . 46


1.17.1 Difference Estimates in Differential Equations . . . . . . . . . 48

2 Large, Sparse Nonlinear Programming 51


2.1 Overview: Large, Sparse NLP Issues . . . . . . . . . . . . . . . . . . . 51
2.2 Sparse Finite Differences . . . . . . . . . . . . . . . . . . . . . . . . . 52
2.2.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
2.2.2 Sparse Hessian Using Gradient Differences . . . . . . . . . . 53
2.2.3 Sparse Differences in Nonlinear Programming . . . . . . . . . 54
2.3 Sparse QP Subproblem . . . . . . . . . . . . . . . . . . . . . . . . . . 54
2.4 Merit Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
2.5 Hessian Approximation . . . . . . . . . . . . . . . . . . . . . . . . . . 58
2.6 Sparse SQP Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . 60
2.6.1 Minimization Process . . . . . . . . . . . . . . . . . . . . . . 60
2.6.2 Algorithm Strategy . . . . . . . . . . . . . . . . . . . . . . . 62
2.7 Defective Subproblems . . . . . . . . . . . . . . . . . . . . . . . . . . 62
2.8 Feasible Point Strategy . . . . . . . . . . . . . . . . . . . . . . . . . . 63
2.8.1 QP Subproblem . . . . . . . . . . . . . . . . . . . . . . . . . 63
2.8.2 Feasible Point Strategy . . . . . . . . . . . . . . . . . . . . . 64
2.8.3 An Illustration . . . . . . . . . . . . . . . . . . . . . . . . . . 65
2.9 Computational Experience . . . . . . . . . . . . . . . . . . . . . . . . 67
2.9.1 Large, Sparse Test Problems . . . . . . . . . . . . . . . . . . 67
2.9.2 Small, Dense Test Problems . . . . . . . . . . . . . . . . . . 68
2.10 Nonlinear Least Squares . . . . . . . . . . . . . . . . . . . . . . . . . 70
2.10.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
2.10.2 Sparse Least Squares . . . . . . . . . . . . . . . . . . . . . . 70
2.10.3 Residual Hessian . . . . . . . . . . . . . . . . . . . . . . . . 72
2.11 Barrier Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
2.11.1 External Format . . . . . . . . . . . . . . . . . . . . . . . . . 73
2.11.2 Internal Format . . . . . . . . . . . . . . . . . . . . . . . . . 74
2.11.3 Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
2.11.4 Logarithmic Barrier Function . . . . . . . . . . . . . . . . . . 77
2.11.5 Computing a Search Direction . . . . . . . . . . . . . . . . . 79
2.11.6 Inertia Requirements for the Barrier KKT System . . . . . . . 82
2.11.7 Filter Globalization . . . . . . . . . . . . . . . . . . . . . . . 83
2.11.8 Barrier Parameter Update Strategy . . . . . . . . . . . . . . . 86
2.11.9 Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
2.11.10 Outline of the Primary Algorithm . . . . . . . . . . . . . . . 87
2.11.11 Computational Experience . . . . . . . . . . . . . . . . . . . 88

3 Optimal Control Preliminaries 91


3.1 The Transcription Method . . . . . . . . . . . . . . . . . . . . . . . . . 91
3.2 Dynamic Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
3.3 Shooting Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
3.4 Multiple Shooting Method . . . . . . . . . . . . . . . . . . . . . . . . 95
3.5 Initial Value Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
3.6 Boundary Value Example . . . . . . . . . . . . . . . . . . . . . . . . . 105

3.7 Dynamic Modeling Hierarchy . . . . . . . . . . . . . . . . . . . . . . . 108


3.8 Function Generator . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
3.8.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
3.8.2 NLP Considerations . . . . . . . . . . . . . . . . . . . . . . . 109
3.9 Dynamic System Differentiation . . . . . . . . . . . . . . . . . . . . . 111
3.9.1 Simple Example . . . . . . . . . . . . . . . . . . . . . . . . . 111
3.9.2 Discretization versus Differentiation . . . . . . . . . . . . . . 115
3.9.3 External and Internal Differentiation . . . . . . . . . . . . . . 115
3.9.4 Variational Derivatives . . . . . . . . . . . . . . . . . . . . . 118

4 The Optimal Control Problem 123


4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
4.1.1 Dynamic Constraints . . . . . . . . . . . . . . . . . . . . . . 123
4.1.2 Algebraic Equality Constraints . . . . . . . . . . . . . . . . . 124
4.1.3 Singular Arcs . . . . . . . . . . . . . . . . . . . . . . . . . . 125
4.1.4 Algebraic Inequality Constraints . . . . . . . . . . . . . . . . 126
4.2 Necessary Conditions for the Discrete Problem . . . . . . . . . . . . . 126
4.3 Direct versus Indirect Methods . . . . . . . . . . . . . . . . . . . . . . 127
4.4 General Formulation . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
4.5 Direct Transcription Formulation . . . . . . . . . . . . . . . . . . . . . 132
4.6 NLP Considerations—Sparsity . . . . . . . . . . . . . . . . . . . . . . 134
4.6.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
4.6.2 Standard Approach . . . . . . . . . . . . . . . . . . . . . . . 136
4.6.3 Discretization Separability . . . . . . . . . . . . . . . . . . . 137
4.6.4 Right-Hand-Side Sparsity (Trapezoidal) . . . . . . . . . . . . 139
4.6.5 Hermite–Simpson (Compressed) (HSC) . . . . . . . . . . . . 141
4.6.6 Hermite–Simpson (Separated) (HSS) . . . . . . . . . . . . . . 143
4.6.7 K-Stage Runge–Kutta Schemes . . . . . . . . . . . . . . . . 145
4.6.8 General Approach . . . . . . . . . . . . . . . . . . . . . . . . 146
4.6.9 Performance Issues . . . . . . . . . . . . . . . . . . . . . . . 147
4.6.10 Performance Highlights . . . . . . . . . . . . . . . . . . . . . 149
4.7 Mesh Refinement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
4.7.1 Representing the Solution . . . . . . . . . . . . . . . . . . . . 153
4.7.2 Estimating the Discretization Error . . . . . . . . . . . . . . . 154
4.7.3 Estimating the Order Reduction . . . . . . . . . . . . . . . . 158
4.7.4 Constructing a New Mesh . . . . . . . . . . . . . . . . . . . 159
4.7.5 The Mesh-Refinement Algorithm . . . . . . . . . . . . . . . . 161
4.7.6 Computational Experience . . . . . . . . . . . . . . . . . . . 163
4.8 Scaling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
4.9 Quadrature Equations . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
4.10 Algebraic Variable Rate Constraints . . . . . . . . . . . . . . . . . . . 172
4.11 Estimating Adjoint Variables . . . . . . . . . . . . . . . . . . . . . . . 173
4.11.1 Quadrature Approximation . . . . . . . . . . . . . . . . . . . 175
4.11.2 Path Constraint Adjoints . . . . . . . . . . . . . . . . . . . . 176
4.11.3 Differential Constraint Adjoints . . . . . . . . . . . . . . . . 177
4.11.4 Numerical Comparisons . . . . . . . . . . . . . . . . . . . . 178
4.12 Discretize Then Optimize . . . . . . . . . . . . . . . . . . . . . . . . . 192

4.12.1 High Index Partial Differential-Algebraic Equation . . . . . . 192


4.12.2 State Vector Formulation . . . . . . . . . . . . . . . . . . . . 193
4.12.3 Direct Transcription Results . . . . . . . . . . . . . . . . . . 194
4.12.4 The Indirect Approach . . . . . . . . . . . . . . . . . . . . . 194
4.12.5 Optimality Conditions . . . . . . . . . . . . . . . . . . . . . 196
Unconstrained Arcs (s < 0) . . . . . . . . . . . . . . . . . . . 196
Constrained Arcs (s = 0) . . . . . . . . . . . . . . . . . . . . 197
Boundary Conditions . . . . . . . . . . . . . . . . . . . . . . 198
Optimality Conditions: Summary . . . . . . . . . . . . . . . 199
4.12.6 Computational Comparison—Direct versus Indirect . . . . . . 199
Direct Method . . . . . . . . . . . . . . . . . . . . . . . . . . 199
Indirect Method . . . . . . . . . . . . . . . . . . . . . . . . . 199
4.12.7 Analysis of Results . . . . . . . . . . . . . . . . . . . . . . . 200
The Quandary . . . . . . . . . . . . . . . . . . . . . . . . . . 200
The Explanation . . . . . . . . . . . . . . . . . . . . . . . . . 201
4.13 Questions of Efficiency . . . . . . . . . . . . . . . . . . . . . . . . . . 205
Question: Newton or Quasi-Newton Hessian? . . . . . . . . . 210
Question: Barrier or SQP Algorithm? . . . . . . . . . . . . . 210
4.14 What Can Go Wrong . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
4.14.1 Singular Arcs . . . . . . . . . . . . . . . . . . . . . . . . . . 212
4.14.2 State Constraints . . . . . . . . . . . . . . . . . . . . . . . . 215
4.14.3 Discontinuous Control . . . . . . . . . . . . . . . . . . . . . 216

5 Parameter Estimation 219


5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
5.2 The Parameter Estimation Problem . . . . . . . . . . . . . . . . . . . . 219
5.3 Computing the Residuals . . . . . . . . . . . . . . . . . . . . . . . . . 222
5.4 Computing Derivatives . . . . . . . . . . . . . . . . . . . . . . . . . . 223
5.4.1 Residuals and Sparsity . . . . . . . . . . . . . . . . . . . . . 224
5.4.2 Residual Decomposition . . . . . . . . . . . . . . . . . . . . 225
5.4.3 Auxiliary Function Decomposition . . . . . . . . . . . . . . . 225
5.4.4 Algebraic Variable Parameterization . . . . . . . . . . . . . . 227
5.5 Computational Experience . . . . . . . . . . . . . . . . . . . . . . . . 228
5.5.1 Reentry Trajectory Reconstruction . . . . . . . . . . . . . . . 230
5.5.2 Commercial Aircraft Rotational Dynamics Analysis . . . . . . 233
5.6 Optimal Control or Optimal Estimation? . . . . . . . . . . . . . . . . . 241

6 Optimal Control Examples 247


6.1 Space Shuttle Reentry Trajectory . . . . . . . . . . . . . . . . . . . . . 247
6.2 Minimum Time to Climb . . . . . . . . . . . . . . . . . . . . . . . . . 256
6.2.1 Tabular Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
6.2.2 Cubic Spline Interpolation . . . . . . . . . . . . . . . . . . . 258
6.2.3 Minimum Curvature Spline . . . . . . . . . . . . . . . . . . . 259
6.2.4 Numerical Solution . . . . . . . . . . . . . . . . . . . . . . . 262
6.3 Low-Thrust Orbit Transfer . . . . . . . . . . . . . . . . . . . . . . . . 265
6.3.1 Modified Equinoctial Coordinates . . . . . . . . . . . . . . . 265
6.3.2 Gravitational Disturbing Acceleration . . . . . . . . . . . . . 267

6.3.3 Thrust Acceleration—Burn Arcs . . . . . . . . . . . . . . . . 267


6.3.4 Boundary Conditions . . . . . . . . . . . . . . . . . . . . . . 269
6.3.5 Numerical Solution . . . . . . . . . . . . . . . . . . . . . . . 269
6.4 Two-Burn Orbit Transfer . . . . . . . . . . . . . . . . . . . . . . . . . 271
6.4.1 Simple Shooting Formulation . . . . . . . . . . . . . . . . . . 273
6.4.2 Multiple Shooting Formulation . . . . . . . . . . . . . . . . . 278
6.4.3 Collocation Formulation . . . . . . . . . . . . . . . . . . . . 279
6.5 Hang Glider . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 282
6.6 Abort Landing in the Presence of Windshear . . . . . . . . . . . . . . . 284
6.6.1 Dynamic Equations . . . . . . . . . . . . . . . . . . . . . . . 286
6.6.2 Objective Function . . . . . . . . . . . . . . . . . . . . . . . 288
6.6.3 Control Variable . . . . . . . . . . . . . . . . . . . . . . . . . 289
6.6.4 Model Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 289
6.6.5 Computational Results . . . . . . . . . . . . . . . . . . . . . 291
6.7 Space Station Attitude Control . . . . . . . . . . . . . . . . . . . . . . 293
6.8 Reorientation of an Asymmetric Rigid Body . . . . . . . . . . . . . . . 299
6.8.1 Computational Issues . . . . . . . . . . . . . . . . . . . . . . 300
6.9 Industrial Robot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
6.10 Multibody Mechanism . . . . . . . . . . . . . . . . . . . . . . . . . . 310
6.11 Kinematic Chain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
6.12 Dynamic MPEC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
6.13 Free-Flying Robot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
6.14 Kinetic Batch Reactor . . . . . . . . . . . . . . . . . . . . . . . . . . . 331
6.15 Delta III Launch Vehicle . . . . . . . . . . . . . . . . . . . . . . . . . 336
6.16 A Two-Strain Tuberculosis Model . . . . . . . . . . . . . . . . . . . . 345
6.17 Tumor Anti-angiogenesis . . . . . . . . . . . . . . . . . . . . . . . . . 348

7 Advanced Applications 353


7.1 Optimal Lunar Swingby Trajectories . . . . . . . . . . . . . . . . . . . 353
7.1.1 Background and Motivation . . . . . . . . . . . . . . . . . . 353
7.1.2 Optimal Lunar Transfer Examples . . . . . . . . . . . . . . . 355
Synchronous Equatorial . . . . . . . . . . . . . . . . . . . . . 355
Polar, 24 hr (A) . . . . . . . . . . . . . . . . . . . . . . . . . 355
Polar, 24 hr (B) . . . . . . . . . . . . . . . . . . . . . . . . . 355
Retrograde Molniya . . . . . . . . . . . . . . . . . . . . . . . 357
7.1.3 Equations of Motion . . . . . . . . . . . . . . . . . . . . . . 357
7.1.4 Kepler Orbit Propagation . . . . . . . . . . . . . . . . . . . . 358
7.1.5 Differential-Algebraic Formulation of Three-Body Dynamics . 360
7.1.6 Boundary Conditions . . . . . . . . . . . . . . . . . . . . . . 360
7.1.7 A Four-Step Solution Technique . . . . . . . . . . . . . . . . 362
Step 1: Three-Impulse, Conic Solution . . . . . . . . . . . . . 362
Step 2: Three-Body Approximation . . . . . . . . . . . . . . 364
Step 3: Fixed Swingby Time . . . . . . . . . . . . . . . . . . 365
Step 4: Optimal Three-Body Solution . . . . . . . . . . . . . 366
7.1.8 Solving the Subproblems . . . . . . . . . . . . . . . . . . . . 366
Is Mesh Refinement Needed? . . . . . . . . . . . . . . . . . . 368
DAE or ODE Formulation? . . . . . . . . . . . . . . . . . . . 370

7.2 Multiple-Pass Aero-Assisted Orbit Transfer . . . . . . . . . . . . . . . 372


7.2.1 Orbital Phases . . . . . . . . . . . . . . . . . . . . . . . . . . 372
7.2.2 Atmospheric Phases . . . . . . . . . . . . . . . . . . . . . . . 373
7.2.3 Boundary Conditions . . . . . . . . . . . . . . . . . . . . . . 375
7.2.4 Initial Guess . . . . . . . . . . . . . . . . . . . . . . . . . . . 377
7.2.5 Numerical Results . . . . . . . . . . . . . . . . . . . . . . . 379
7.3 Delay Differential Equations . . . . . . . . . . . . . . . . . . . . . . . 385
7.4 In-Flight Dynamic Optimization of Wing Trailing Edge Surface
Positions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
7.4.1 Aircraft Dynamics for Drag Estimation . . . . . . . . . . . . 398
7.4.2 Step 1: Reference Trajectory Estimation . . . . . . . . . . . . 400
7.4.3 Step 2: Aerodynamic Drag Model Approximation . . . . . . . 401
7.4.4 Step 3: Optimal Camber Prediction . . . . . . . . . . . . . . 402
7.4.5 Numerical Results . . . . . . . . . . . . . . . . . . . . . . . 402
777-200ER Flight Test . . . . . . . . . . . . . . . . . . . . . 402
Performance Comparison . . . . . . . . . . . . . . . . . . . . 403

8 Epilogue 411

Appendix: Software 413


A.1 Simplified Usage Dense NLP . . . . . . . . . . . . . . . . . . . . . . . 413
A.2 Sparse NLP with Sparse Finite Differences . . . . . . . . . . . . . . . . 413
A.3 Optimal Control Using Sparse NLP . . . . . . . . . . . . . . . . . . . . 414

Bibliography 417

Index 431
Preface

Solving an optimal control or estimation problem is not easy. Pieces of the puzzle
are found scattered throughout many different disciplines. Furthermore, the focus of this
book is on practical methods, that is, methods that I have found actually work! In fact
everything described in this book has been implemented in production software and used to
solve real optimal control problems. Although the reader should be proficient in advanced
mathematics, no theorems are presented.
Traditionally, there are two major parts of a successful optimal control or optimal
estimation solution technique. The first part is the “optimization” method. The second part
is the “differential equation” method. When faced with an optimal control or estimation
problem it is tempting to simply “paste” together packages for optimization and numerical
integration. While naive approaches such as this may be moderately successful, the goal of
this book is to suggest that there is a better way! The methods used to solve the differential
equations and optimize the functions are intimately related.
The first two chapters of this book focus on the optimization part of the problem. In
Chapter 1 the important concepts of nonlinear programming for small dense applications
are introduced. Chapter 2 extends the presentation to problems which are both large and
sparse. Chapters 3 and 4 address the differential equation part of the problem. Chapter
3 introduces relevant material in the numerical solution of differential (and differential-
algebraic) equations. Methods for solving the optimal control problem are treated in some
detail in Chapter 4. Throughout the book the interaction between optimization and integra-
tion is emphasized. Chapter 5 describes how to solve optimal estimation problems. Chapter
6 presents a collection of examples that illustrate the various concepts and techniques. Real
world problems often require solving a sequence of optimal control and/or optimization
problems, and Chapter 7 describes a collection of these “advanced applications.”
While the book incorporates a great deal of new material not covered in Practical
Methods for Optimal Control Using Nonlinear Programming [21], it does not cover every-
thing. Many important topics are simply not discussed in order to keep the overall presen-
tation concise and focused. The discussion is general and presents a unified approach to
solving optimal estimation and control problems. Most of the examples are drawn from
my experience in the aerospace industry. Examples have been solved using a particular
implementation called SOCS. I have tried to adhere to notational conventions from both
optimization and control theory whenever possible. Also, I have attempted to use consistent
notation throughout the book.
The material presented here represents the collective contributions of many peo-
ple. The nonlinear programming material draws heavily on the work of John Dennis,
Roger Fletcher, Phillip Gill, Sven Leyffer, Walter Murray, Michael Saunders, and Margaret
Wright. The material on differential-algebraic equations (DAEs) is drawn from the
work of Uri Ascher, Kathy Brenan, and Linda Petzold. Ray Spiteri graciously shared his
classroom notes on DAEs. I was introduced to optimal control by Stephen Citron, and I
routinely refer to the text by Bryson and Ho [54]. Over the past 20 years I have been for-
tunate to participate in workshops at Oberwolfach, Munich, Minneapolis, Victoria, Banff,
Lausanne, Greifswald, Stockholm, and Fraser Island. I’ve benefited immensely simply
by talking with Larry Biegler, Hans Georg Bock, Roland Bulirsch, Rainer Callies, Kurt
Chudej, Tim Kelley, Bernd Kugelmann, Helmut Maurer, Rainer Mehlhorn, Angelo Miele,
Hans Josef Pesch, Ekkehard Sachs, Gottfried Sachs, Roger Sargent, Volker Schulz, Mark
Steinbach, Oskar von Stryk, and Klaus Well.
Three colleagues deserve special thanks. Interaction with Steve Campbell and his
students has inspired many new results and interesting topics. Paul Frank has played a
major role in the implementation and testing of the large, sparse nonlinear programming
methods described. Bill Huffman, my coauthor for many publications and the SOCS soft-
ware, has been an invaluable sounding board over the last two decades. Finally, I thank
Jennifer for her patience and understanding during the preparation of this book.

John T. Betts
Chapter 1

Introduction to Nonlinear
Programming

1.1 Preliminaries
This book concentrates on numerical methods for solving the optimal control problem.
The fundamental principle of all effective numerical optimization methods is to solve a
difficult problem by solving a sequence of simpler subproblems. In particular, the solution
of an optimal control problem will require the solution of one or more finite-dimensional
subproblems. As a prelude to our discussions on optimal control, this chapter will focus
on the nonlinear programming (NLP) problem. The NLP problem requires finding a finite
number of variables such that an objective function or performance index is optimized
without violating a set of constraints. The NLP problem is often referred to as parameter
optimization. Important special cases of the NLP problem include linear programming
(LP), quadratic programming (QP), and least squares problems.
Before proceeding further, it is worthwhile to establish the notational conventions
used throughout the book. This is especially important since the subject matter covers a
number of different disciplines, each with its own notational conventions. Our goal is to
present a unified treatment of all these fields. As a rule, scalar quantities will be denoted by
lowercase letters (e.g., α). Vectors will be denoted by boldface lowercase letters and will
usually be considered column vectors, as in

x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, (1.1)

where the individual components of the vector are x_k for k = 1, . . ., n. To save space, it will
often be convenient to define the transpose, as in

x^T = (x_1, x_2, . . . , x_n). (1.2)

A sequence of vectors will often be denoted as x_k, x_{k+1}, . . . . Matrices will be denoted by


boldface capital letters, as in

A = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots &        &        & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix}. (1.3)

1.2 Newton’s Method in One Variable


The fundamental approach to most iterative schemes was suggested over 300 years ago by
Newton. In fact, Newton’s method is the basis for all of the algorithms we will describe.
We begin with the simplest form of Newton’s method and then in subsequent sections gen-
eralize the discussion until we have presented one of the most widely used NLP algorithms,
namely the sequential quadratic programming (SQP) method.
Suppose it is required to find the value of the variable x such that the constraint
function
c(x) = 0. (1.4)
Let us denote the solution by x ∗ and let us assume x is a guess for the solution. The basic
idea of Newton’s method is to approximate the nonlinear function c(x) by the first two terms
in a Taylor series expansion about the current point x. This yields a linear approximation
for the constraint function at the new point x̄, which is given by

c(x̄) = c(x) + c′(x)(x̄ − x), (1.5)

where c′(x) = dc/dx is the slope of the constraint at x. Using this linear approximation, it
is reasonable to compute x̄, a new estimate for the root, by solving (1.5) such that c(x̄) = 0,
i.e.,
x̄ = x − [c′(x)]^{-1} c(x). (1.6)
Typically, we denote p ≡ x̄ − x and rewrite (1.6) as

x̄ = x + p, (1.7)

where
p = −[c′(x)]^{-1} c(x). (1.8)
Of course, in general, c(x) is not a linear function of x, and consequently we cannot
expect that c(x̄) = 0. However, we might hope that x̄ is a better estimate for the root x ∗
than the original guess x; in other words we might expect that

|x̄ − x ∗ | ≤ |x − x ∗ | (1.9)

and
|c(x̄)| ≤ |c(x)|. (1.10)
If the new point is an improvement, then it makes sense to repeat the process, thereby
defining a sequence of points x (0) , x (1) , x (2) , . . . with point (k + 1) in the sequence given by

x^{(k+1)} = x^{(k)} − [c′(x^{(k)})]^{-1} c(x^{(k)}). (1.11)



For notational convenience, it usually suffices to present a single step of the algorithm, as in
(1.6), instead of explicitly labeling the information at step k using the superscript notation
x (k) . Nevertheless, it should be understood that the algorithm defines a sequence of points
x (0) , x (1) , x (2) , . . . . The sequence is said to converge to x ∗ if

\lim_{k \to \infty} |x^{(k)} - x^*| = 0. (1.12)

In practice, of course, we are not interested in letting k → ∞. Instead we are satisfied with
terminating the sequence when the computed solution is “close” to the answer. Further-
more, the rate of convergence is of paramount importance when measuring the computa-
tional efficiency of an algorithm. For Newton’s method, the rate of convergence is said to
be quadratic or, more precisely, q-quadratic (cf. [71]). The impact of quadratic conver-
gence can be dramatic. Loosely speaking, it implies that each successive estimate of the
solution will double the number of significant digits!
Example 1.1 N EWTON ’ S M ETHOD —ROOT F INDING. To demonstrate, let us sup-
pose we want to solve the constraint

c(x) = a_1 + a_2 x + a_3 x^2 = 0, (1.13)

where the coefficients a_1, a_2, a_3 are chosen such that c(0.1) = −0.05, c(0.25) = 0, and
c(0.9) = 0.9. Table 1.1 presents the Newton iteration sequence beginning from the initial
guess x = 0.85 and proceeding to the solution at x ∗ = 0.25. Figure 1.1 illustrates the
first three iterations. Notice in Table 1.1 that the error between the computed solution and
the true value, which is tabulated in the third column, exhibits the expected doubling in
significant figures from the fourth iteration to convergence.
So what is wrong with Newton’s method? Clearly, quadratic convergence is a very
desirable property for an algorithm to possess. Unfortunately, if the initial guess is not
sufficiently close to the solution, i.e., within the region of convergence, Newton’s method
may diverge. As a simple example, Dennis and Schnabel [71] suggest applying Newton’s
method to solve c(x) = arctan(x) = 0. This will diverge when the initial guess |x^{(0)}| > a,
converge when |x^{(0)}| < a, and cycle indefinitely if |x^{(0)}| = a, where a = 1.3917452002707.
In essence, Newton’s method behaves well near the solution (locally) but lacks something
permitting it to converge globally. So-called globalization techniques, aimed at correcting
this deficiency, will be discussed in subsequent sections. A second difficulty occurs when

Table 1.1. Newton’s method for root finding.


Iter. c(x) x |x − x*|
1 0.79134615384615 0.85000000000000 0.60000000000000
2 0.18530192382759 0.47448669201521 0.22448669201521
3 3.5942428588261×10−2 0.30910437279376 5.9104372793756×10−2
4 3.6096528286200×10−3 0.25669389900972 6.6938990097217×10−3
5 5.7007630268141×10−5 0.25010744198003 1.0744198002549×10−4
6 1.5161639596584×10−8 0.25000002858267 2.8582665845267×10−8
7 1.0547118733939×10−15 0.25000000000000 1.8873791418628×10−15
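The iteration in Table 1.1 can be reproduced with a short script. The sketch below is illustrative and not from the text: the coefficients are hard-coded as the exact fractions implied by the three interpolation conditions c(0.1) = −0.05, c(0.25) = 0, c(0.9) = 0.9, and the function names are my own.

```python
from fractions import Fraction

# Exact coefficients of c(x) = a1 + a2*x + a3*x^2 satisfying the three
# interpolation conditions of Example 1.1 (derived by hand for this sketch;
# the text does not print them).
a1, a2, a3 = Fraction(-21, 416), Fraction(-79, 624), Fraction(205, 156)

def c(x):
    """The quadratic constraint of Example 1.1."""
    return float(a1 + a2 * x + a3 * x * x)

def c_prime(x):
    """Its slope c'(x) = a2 + 2*a3*x."""
    return float(a2 + 2 * a3 * x)

def newton(x, tol=1e-14, max_iter=25):
    """Newton iteration (1.11): x <- x - c(x)/c'(x) until the step is negligible."""
    for _ in range(max_iter):
        step = c(x) / c_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(0.85)   # same initial guess as Table 1.1; converges to x* = 0.25
```

Printing x inside the loop reproduces the doubling of correct digits visible in the table.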

Figure 1.1. Newton’s method for root finding.

the slope c′(x) = 0. Clearly, the correction defined by (1.6) is not well defined in this case.
In fact, Newton’s method loses its quadratic convergence property if the slope is zero at
the solution, i.e., c′(x*) = 0. Finally, Newton’s method requires that the slope c′(x) can
be computed at every iteration. This may be difficult and/or costly, especially when the
function c(x) is complicated.

1.3 Secant Method in One Variable


Motivated by a desire to eliminate the explicit calculation of the slope, one can consider
approximating it at x_k by the secant

c'(x_k) \approx B = \frac{c(x_k) - c(x_{k-1})}{x_k - x_{k-1}} \equiv \frac{\Delta c}{\Delta x}. (1.14)
Notice that this approximation is constructed using two previous iterations but requires
values only for the constraint function c(x). This expression can be rewritten to give the
so-called secant condition

B Δx = Δc, (1.15)

where B is the (scalar) secant approximation to the slope. Using this approximation, it then
follows that the Newton iteration (1.6) is replaced by the secant iteration

x̄ = x − B^{-1} c(x) = x + p, (1.16)

Figure 1.2. Secant method for root finding.

which is often written as

x_{k+1} = x_k - \frac{x_k - x_{k-1}}{c(x_k) - c(x_{k-1})} \, c(x_k). (1.17)

Figure 1.2 illustrates a secant iteration applied to Example 1.1 described in the pre-
vious section.
Clearly, the virtue of the secant method is that it does not require calculation of the
slope c′(x_k). While this may be advantageous when derivatives are difficult to compute,
there is a downside! The secant method is superlinearly convergent, which, in general, is
not as fast as the quadratically convergent Newton algorithm. Thus, we can expect conver-
gence will require more iterations, even though the cost per iteration is less. A distinguish-
ing feature of the secant method is that the slope is approximated using information from
previous iterates in lieu of a direct evaluation. This is the simplest example of a so-called
quasi-Newton method.
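A minimal sketch of the secant iteration (1.17), with illustrative names; the test function c(x) = x² − 2 is my own choice, not from the text:

```python
def secant(c, x_prev, x, tol=1e-12, max_iter=50):
    """Secant iteration (1.17): the slope is replaced by a difference quotient B."""
    f_prev = c(x_prev)
    for _ in range(max_iter):
        f = c(x)
        if f == 0.0:
            return x
        B = (f - f_prev) / (x - x_prev)   # secant estimate of c'(x_k), eq. (1.14)
        x_prev, f_prev = x, f
        x = x - f / B                     # quasi-Newton step with B in place of c'
        if abs(x - x_prev) < tol:
            return x
    return x

root = secant(lambda x: x * x - 2.0, 1.0, 2.0)   # converges to sqrt(2)
```

Counting iterations here versus the Newton version illustrates the superlinear (rather than quadratic) rate mentioned above.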

1.4 Newton’s Method for Minimization in One Variable


Now let us suppose we want to compute the value x ∗ such that the nonlinear objective
function F(x ∗ ) is a minimum. The basic notion of Newton’s method for root finding is
to approximate the nonlinear constraint function c(x) by a simpler model (i.e., linear) and
then compute the root for the linear model. If we are to extend this philosophy to opti-
mization, we must construct an approximate model of the objective function. Just as in the

development of (1.5), let us approximate F(x) by the first three terms in a Taylor series
expansion about the current point x:
F(x̄) = F(x) + F′(x)(x̄ − x) + ½(x̄ − x)F″(x)(x̄ − x). (1.18)
Notice that we cannot use a linear model for the objective because a linear function does
not have a finite minimum point. In contrast, a quadratic approximation to F(x) is the
simplest approximation that does have a minimum. Now for x̄ to be a minimum of the
quadratic (1.18), we must have
dF/dx̄ ≡ F′(x̄) = 0 = F′(x) + F″(x)(x̄ − x). (1.19)
Solving for the new point yields
x̄ = x − [F″(x)]^{-1} F′(x). (1.20)
The derivation has been motivated by minimizing F(x). Is this equivalent to solving the
slope condition F′(x) = 0? It would appear that the iterative optimization sequence defined
by (1.20) is the same as the iterative root-finding sequence defined by (1.6), provided we
replace c(x) by F′(x). Clearly, a quadratic model for the objective function (1.18) produces
a linear model for the slope F′(x). However, the condition F′(x) = 0 defines only a sta-
tionary point, which can be a minimum, a maximum, or a point of inflection. Apparently
what is missing is information about the curvature of the function, which would determine
whether it is concave up, concave down, or neither.
Figure 1.3 illustrates a typical situation. In the illustration, there are two points
with zero slopes; however, there is only one minimum point. The minimum point is dis-
tinguished from the maximum by the algebraic sign of the second derivative F″(x).

Figure 1.3. Minimization in one variable.

Formally, we have
Necessary Conditions:

F′(x*) = 0, (1.21)
F″(x*) ≥ 0; (1.22)

Sufficient Conditions:

F′(x*) = 0, (1.23)
F″(x*) > 0. (1.24)

Note that the sufficient conditions require that F″(x*) > 0, defining a strong local
minimizer in contrast to a weak local minimizer, which may have F″(x*) = 0. It is also
important to observe that these conditions define a local rather than a global minimizer.
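The one-variable minimization iteration (1.20) differs from root finding only in that it drives F′ to zero and then checks the sign of F″. A small sketch, using a hypothetical test function of my own (not from the text):

```python
def newton_min(Fp, Fpp, x, tol=1e-12, max_iter=50):
    """Newton's method for a 1-D minimizer: x <- x - F'(x)/F''(x), eq. (1.20)."""
    for _ in range(max_iter):
        step = Fp(x) / Fpp(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# F(x) = x^4 - 3x^2 + x has several stationary points; started near x = 1.5
# the iteration lands on one where F'' > 0, i.e., a local minimizer.
Fp  = lambda x: 4 * x ** 3 - 6 * x + 1    # F'(x)
Fpp = lambda x: 12 * x ** 2 - 6           # F''(x)
x_star = newton_min(Fp, Fpp, 1.5)
```

The final checks F′(x*) ≈ 0 and F″(x*) > 0 are exactly the sufficient conditions (1.23)–(1.24).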

1.5 Newton’s Method in Several Variables


The preceding sections have addressed problems involving a single variable. In this section,
let us consider generalizing the discussion to functions of many variables. In particular, let
us consider how to find the n-vector x^T = (x_1, . . . , x_n) such that

c(x) = \begin{pmatrix} c_1(x) \\ \vdots \\ c_m(x) \end{pmatrix} = 0. (1.25)

For the present, let us assume that the number of constraints and variables is the same, i.e.,
m = n. Just as in one variable, a linear approximation to the constraint functions analogous
to (1.5) is given by
c(x̄) = c(x) + G(x̄ − x), (1.26)
where the Jacobian matrix G is defined by

G \equiv \frac{\partial c}{\partial x} =
\begin{pmatrix}
\partial c_1/\partial x_1 & \partial c_1/\partial x_2 & \cdots & \partial c_1/\partial x_n \\
\partial c_2/\partial x_1 & \partial c_2/\partial x_2 & \cdots & \partial c_2/\partial x_n \\
\vdots                    &                           &        & \vdots \\
\partial c_m/\partial x_1 & \partial c_m/\partial x_2 & \cdots & \partial c_m/\partial x_n
\end{pmatrix}. (1.27)

By convention, the m rows of G correspond to constraints and the n columns to variables.


As in one variable, if we require that c(x̄) = 0 in (1.26), we can solve the linear system

Gp = −c (1.28)
8 Chapter 1. Introduction to Nonlinear Programming

for the search direction p, which leads to an iteration of the form

x̄ = x + p. (1.29)

Thus, each Newton iteration requires a linear approximation to the nonlinear con-
straints c, followed by a step from x to x̄, the solution of the linearized constraints. Figure
1.4 illustrates a typical situation when n = m = 2. It is important to remark that the multi-
dimensional version of Newton’s method shares all of the properties of its one-dimensional
counterpart. Specifically, the method is quadratically convergent provided it is within a
region of convergence, and it may diverge unless appropriate globalization strategies are
employed. Furthermore, in order to solve (1.28) it is necessary that the Jacobian G be
nonsingular, which is analogous to requiring that c′(x) ≠ 0 in the univariate case. And,
finally, it is necessary to actually compute G, which can be costly.

Figure 1.4. Newton’s method in two variables.
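A sketch of the multidimensional iteration for the case n = m = 2, solving the linear system (1.28) by Cramer's rule; the test system and all names are illustrative assumptions, not from the text:

```python
def newton_2d(c, G, x, tol=1e-12, max_iter=25):
    """Newton's method for c(x) = 0 with n = m = 2: solve G p = -c, then x <- x + p."""
    for _ in range(max_iter):
        (c1, c2), ((a, b), (d, e)) = c(x), G(x)
        det = a * e - b * d              # G must be nonsingular
        p1 = (-c1 * e + c2 * b) / det    # Cramer's rule for G p = -c
        p2 = (-a * c2 + d * c1) / det
        x = (x[0] + p1, x[1] + p2)
        if max(abs(p1), abs(p2)) < tol:
            break
    return x

# hypothetical test system: x1^2 + x2^2 = 2 and x1 = x2, with root (1, 1)
c = lambda x: (x[0] ** 2 + x[1] ** 2 - 2.0, x[0] - x[1])
G = lambda x: ((2.0 * x[0], 2.0 * x[1]), (1.0, -1.0))   # Jacobian, eq. (1.27)
root = newton_2d(c, G, (2.0, 0.5))
```

For larger n one would of course factor G rather than use Cramer's rule; the 2×2 form just keeps the sketch self-contained.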

1.6 Unconstrained Optimization


Let us now consider the multidimensional unconstrained minimization problem. Suppose
we want to find the n-vector x^T = (x_1, . . . , x_n) such that the function F(x) = F(x_1, . . . , x_n) is
a minimum. Just as in the univariate case (1.18), let us approximate F(x) by the first three
terms in a Taylor series expansion about the point x:

F(x̄) = F(x) + g^T(x)(x̄ − x) + ½(x̄ − x)^T H(x)(x̄ − x). (1.30)

The Taylor series expansion involves the n-dimensional gradient vector

g(x) \equiv \nabla_x F = \begin{pmatrix} \partial F/\partial x_1 \\ \vdots \\ \partial F/\partial x_n \end{pmatrix} (1.31)

and the symmetric n × n Hessian matrix

H \equiv \nabla^2_{xx} F =
\begin{pmatrix}
\partial^2 F/\partial x_1^2            & \partial^2 F/\partial x_1 \partial x_2 & \cdots & \partial^2 F/\partial x_1 \partial x_n \\
\partial^2 F/\partial x_2 \partial x_1 & \partial^2 F/\partial x_2^2            & \cdots & \partial^2 F/\partial x_2 \partial x_n \\
\vdots                                 &                                        &        & \vdots \\
\partial^2 F/\partial x_n \partial x_1 & \partial^2 F/\partial x_n \partial x_2 & \cdots & \partial^2 F/\partial x_n^2
\end{pmatrix}. (1.32)

It is common to define the search direction p = x̄ − x and then rewrite (1.30) as

F(x̄) = F(x) + g^T p + ½ p^T Hp. (1.33)

The scalar term g^T p is referred to as the directional derivative along p and the scalar term
p^T Hp is called the curvature or second directional derivative in the direction p.
It is instructive to examine the behavior of the series (1.33). First, let us suppose
that the expansion is about the minimum point x∗ . Now if x∗ is a local minimum, then the
objective function must be larger at all neighboring points, that is, F(x) > F(x∗ ). In order
for this to be true, the slope in all directions must be zero, that is, (g*)^T p = 0, which implies
we must have

g(x*) = \begin{pmatrix} g_1(x*) \\ \vdots \\ g_n(x*) \end{pmatrix} = 0. (1.34)
This is just the multidimensional analogue of the condition (1.21). Furthermore, if the
function curves up in all directions, the point x* is called a strong local minimum and the
third term in the expansion (1.33) must be positive:

p^T H* p > 0. (1.35)

A matrix that satisfies this condition is said to be positive definite. (Here H* ≡ H(x*),
not the conjugate transpose as in some texts.) If there are some directions with zero
curvature, i.e., p^T H* p ≥ 0, then H* is said to be positive semidefinite. If there are
directions with both positive and negative curvature, the matrix is called indefinite. In
summary, we have

Necessary Conditions:

g(x*) = 0, (1.36)
p^T H* p ≥ 0; (1.37)

Sufficient Conditions:

g(x*) = 0, (1.38)
p^T H* p > 0. (1.39)

The preceding discussion was motivated by an examination of the Taylor series about
the minimum point x∗ . Let us now consider the same quadratic model about an arbitrary
point x. Then it makes sense to choose a new point x̄ such that the gradient at x̄ is zero. The
resulting linear approximation to the gradient is just

ḡ = 0 = g + Hp, (1.40)

which can be solved to yield the Newton search direction

p = −H^{-1} g. (1.41)

Just as before, the Newton iteration is defined by (1.29). Since this iteration is based on
finding a zero of the gradient vector, there is no guarantee that the step will move toward a
local minimum rather than a stationary point or maximum. To preclude this, we must insist
that the step be downhill, which requires satisfying the so-called descent condition

g^T p < 0. (1.42)

It is interesting to note that, if we use the Newton direction (1.41), the descent condition
becomes

g^T p = −g^T H^{-1} g < 0, (1.43)

which can be true only if the Hessian is positive definite, i.e., (1.35) holds.
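The descent test (1.42) can be checked directly once the Newton direction (1.41) is computed. A 2×2 sketch with a hypothetical quadratic objective of my own (not from the text):

```python
def newton_direction_2d(g, H):
    """Solve H p = -g (eq. 1.41) for a 2x2 Hessian H by Cramer's rule."""
    (a, b), (d, e) = H
    det = a * e - b * d
    return ((-g[0] * e + g[1] * b) / det,
            (-a * g[1] + d * g[0]) / det)

# F(x) = x1^2 + 2*x2^2 evaluated at x = (1, 1): a positive definite Hessian
g = (2.0, 4.0)                        # gradient at x
H = ((2.0, 0.0), (0.0, 4.0))          # Hessian
p = newton_direction_2d(g, H)         # p = (-1, -1), the step to the minimizer
descent = g[0] * p[0] + g[1] * p[1]   # g^T p < 0, as (1.43) guarantees when H > 0
```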

1.7 Recursive Updates


Regardless of whether Newton’s method is used for solving nonlinear equations, as in
Section 1.5, or for optimization, as described in Section 1.6, it is necessary to compute
derivative information. In particular, one must compute either the Jacobian matrix (1.27)
or the Hessian matrix (1.32). For many applications, this can be a costly computational
burden. Quasi-Newton methods attempt to construct this information recursively. A brief
overview of the most important recursive updates is included, although a more complete
discussion can be found in [71], [99], and [82].
The basic idea of a recursive update is to construct a new estimate of the Jacobian or
Hessian using information from previous iterates. Most well-known recursive updates are
of the form
B̄ = B + R(Δc, Δx), (1.44)

where the new estimate B̄ is computed from the old estimate B. Typically, this calculation
involves a low-rank modification R(Δc, Δx) that can be computed from the previous step:

Δc = c_k − c_{k−1}, (1.45)
Δx = x_k − x_{k−1}. (1.46)

The usual way to construct the update is to insist that the secant condition

B̄ Δx = Δc (1.47)

hold and then construct an approximation B̄ that is “close” to the previous estimate B. In
Section 1.3, the simplest form of this condition (1.15) led to the secant method. In fact, the
generalization of this formula, proposed in 1965 by Broyden [50], is
\bar{B} = B + \frac{(\Delta c - B \Delta x)\, \Delta x^T}{\Delta x^T \Delta x}, (1.48)
which is referred to as the secant or Broyden update. The recursive formula constructs a
rank-one modification that satisfies the secant condition and minimizes the Frobenius norm
between the estimates.
When a quasi-Newton method is used to approximate the Hessian matrix, as required
for minimization, one cannot simply replace Δc with Δg in the secant update. In particular,
the matrix B̄ constructed using (1.48) is not symmetric. However, there is a rank-one update
that does maintain symmetry, known as the symmetric rank-one (SR1) update:

\bar{B} = B + \frac{(\Delta g - B \Delta x)(\Delta g - B \Delta x)^T}{(\Delta g - B \Delta x)^T \Delta x}, (1.49)
where Δg ≡ g_k − g_{k−1}. While the SR1 update does preserve symmetry, it does not
necessarily maintain a positive definite approximation. In contrast, the update

\bar{B} = B + \frac{\Delta g\, \Delta g^T}{\Delta g^T \Delta x} - \frac{B \Delta x\, \Delta x^T B}{\Delta x^T B \Delta x} (1.50)

is a rank-two positive definite secant update provided Δx^T Δg > 0 is enforced at each
iteration. This update was discovered independently by Broyden [51], Fletcher [81],
Goldfarb [103], and Shanno [159] in 1970 and is known as the BFGS update.
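One can verify numerically that the Broyden update (1.48) reproduces the secant condition (1.47) exactly. A small pure-Python sketch with illustrative data of my own (not from the text):

```python
def broyden_update(B, dc, dx):
    """Broyden's rank-one update (1.48): B' = B + (dc - B dx) dx^T / (dx^T dx)."""
    n = len(dx)
    Bdx = [sum(B[i][j] * dx[j] for j in range(n)) for i in range(n)]
    denom = sum(v * v for v in dx)
    return [[B[i][j] + (dc[i] - Bdx[i]) * dx[j] / denom for j in range(n)]
            for i in range(n)]

# start from the identity and one hypothetical step (dx, dc)
B0 = [[1.0, 0.0], [0.0, 1.0]]
dx, dc = [1.0, 2.0], [3.0, 1.0]
B1 = broyden_update(B0, dc, dx)   # satisfies B1 dx = dc, the secant condition
```

Because the update adds a rank-one term proportional to dx^T, the new matrix maps dx to dc exactly while leaving directions orthogonal to dx unchanged.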
The effective computational implementation of a quasi-Newton update introduces a
number of additional considerations. When solving nonlinear equations, the search
direction from (1.28) is p = −G^{-1} c, and for optimization problems the search direction
given by (1.41) is p = −H^{-1} g. Since the search direction calculation involves the matrix
inverse (either G^{-1} or H^{-1}), one apparent simplification is to apply the recursive update
directly to the inverse. In this case, the search direction can be computed simply by
computing the matrix-vector product. This approach was proposed by Broyden for
nonlinear equations, but has been considerably less successful in practice than the update
given by (1.48), and is known as “Broyden’s bad update.” For unconstrained minimization,
let us make the substitutions Δx → Δg, Δg → Δx, and B → B^{-1} in (1.50). By computing
the inverse of the resulting expression, one obtains

\bar{B} = B + \frac{(\Delta g - B \Delta x)\, \Delta g^T + \Delta g\, (\Delta g - B \Delta x)^T}{\Delta g^T \Delta x} - \sigma\, \Delta g\, \Delta g^T, (1.51)
Discovering Diverse Content Through
Random Scribd Documents
CHAPTER V.

P OOR RUTH! her sky so soon overcast! As the door closed on the
prim, retreating figure of her mother-in-law, she burst into tears.
But she was too sensible a girl to weep long. She wiped her eyes,
and began to consider what was to be done. It would never do to
complain to Harry—dear Harry. He would have to take sides; oh no,
that would never do; she could never complain to him of his own
mother. But why did he bring them together? knowing, as he must
have known, how little likely they were to assimilate. This thought
she smothered quickly, but not before it had given birth to a sigh,
close upon the heels of which love framed this apology: It was so
long since Harry had lived under the same roof with his mother he
had probably forgotten her eccentricities; and then she was so
dotingly fond of him, that probably no points of collision ever came
up between the two.
In the course of an hour, what with cold bathing and philosophy,
Ruth’s eyes and equanimity were placed beyond the suspicion even
of a newly-made husband, and when she held up her lips to him so
temptingly, on his return, he little dreamed of the self-conquest she
had so tearfully achieved for his sake.
CHAPTER VI.

H ARRY’S father began life on a farm in Vermont. Between


handling ploughs, hoes, and harrows, he had managed to pick
up sufficient knowledge to establish himself as a country doctor; well
contented to ride six miles on horseback of a stormy night, to extract
a tooth for some distracted wretch, for twenty-five cents. Naturally
loquacious, and equally fond of administering jalap and gossip, he
soon became a great favorite with the “women folks,” which every
aspiring Esculapius, who reads this, knows to be half the battle.
They soon began to trust him, not only in drawing teeth, but in
cases involving the increase of the village census. Several successes
in this line, which he took no pains to conceal, put him behind a gig
of his own, and enabled his practice to overtake his fame as far as
the next village.
Like many other persons, who revolve all their life in a peck
measure, the doctor’s views of the world in general, and its denizens
in particular, were somewhat circumscribed. Added to this, he was as
persevering as a fly in the dog-days, and as immovable as the old
rusty weather-cock on the village meeting-house, which for twenty
years had never been blown about by any whisking wind of doctrine.
“When he opened his mouth, no dog must bark;” and any dissent
from his opinion, however circumspectly worded, he considered a
personal insult. As his wife entertained the same liberal views,
occasional conjugal collisions, on this narrow track, were the
consequence; the interest of which was intensified by each
reminding the other of their Calvinistic church obligations to keep
the peace. They had, however, one common ground of undisputed
territory—their “Son Harry,” who was as infallible as the Pope, and
(until he got married) never did a foolish thing since he was born.
On this last point, their “Son Harry” did not exactly agree with them,
as he considered it decidedly the most delightful negotiation he had
ever made, and one which he could not even think of without a
sudden acceleration of pulse.
Time wore on, the young couple occupying their own suite of
apartments, while the old people kept house. The doctor, who had
saved enough to lay his saddle-bags with his medical books on the
shelf, busied himself, after he had been to market in the morning, in
speculating on what Ruth was about, or in peeping over the
balustrade, to see who called when the bell rang; or, in counting the
wood-pile, to see how many sticks the cook had taken to make the
pot boil for dinner. The second girl (a supernumerary of the bridal
week) had long since been dismissed; and the doctor and his wife
spent their evenings with the cook, to save the expense of burning
an extra lamp. Consequently, Betty soon began to consider herself
one of the family, and surprised Ruth one day by modestly
requesting the loan of her bridal veil “to wear to a little party;” not
to speak of sundry naps to which she treated herself in Ruth’s
absence, in her damask rocking chair, which was redolent, for some
time after, of a strong odor of dish-water.
Still, Ruth kept her wise little mouth shut; moving, amid these
discordant elements, as if she were deaf, dumb, and blind.
Oh, love! that thy silken reins could so curb the spirit and bridle the
tongue, that thy uplifted finger of warning could calm that bounding
pulse, still that throbbing heart, and send those rebellious tears,
unnoticed, back to their source.
Ah! could we lay bare the secret history of many a wife’s heart, what
martyrs would be found, over whose uncomplaining lips the grave
sets its unbroken seal of silence.
But was Harry blind and deaf? Had the bridegroom of a few months
grown careless and unobservant? Was he, to whom every hair of
that sunny head was dear, blind to the inward struggles, marked
only by fits of feverish gaiety? Did he never see the sudden ruse to
hide the tell-tale blush, or starting tear? Did it escape his notice, that
Ruth would start, like a guilty thing, if a sudden impulse of
tenderness betrayed her into laying her hand upon his forehead, or
leaning her head upon his shoulder, or throwing her arms about his
neck, when the jealous mother was by? Did not his soul bend the
silent knee of homage to that youthful self-control that could repress
its own warm emotions, and stifle its own sorrows, lest he should
know a heart-pang?
Yes; Ruth read it in the magnetic glance of the loving eye as it
lingeringly rested on her, and in the low, thrilling tone of the
whispered, “God bless you, my wife;” and many an hour, when alone
in his counting room, was Harry, forgetful of business, revolving
plans for a separate home for himself and Ruth.
This was rendered every day more necessary, by the increased
encroachments of the old people, who insisted that no visitors
should remain in the house after the old-fashioned hour of nine; at
which time the fire should be taken apart, the chairs set up, the
lights extinguished, and a solemn silence brood until the next
morning’s cock-crowing. It was also suggested to the young couple,
that the wear and tear of the front entry carpet might be saved by
their entering the house by the back gate, instead of the front door.
Meals were very solemn occasions; the old people frowning, at such
times, on all attempts at conversation, save when the doctor
narrated the market prices he paid for each article of food upon the
table. And so time wore on. The old couple, like two scathed trees,
dry, harsh, and uninviting, presenting only rough surfaces to the
clinging ivy, which fain would clothe with brightest verdure their
leafless branches.
CHAPTER VII.

H ARK! to that tiny wail! Ruth knows that most blessed of all
hours. Ruth is a mother! Joy to thee, Ruth! Another outlet for
thy womanly heart; a mirror, in which thy smiles and tears shall be
reflected back; a fair page, on which thou, God-commissioned,
mayst write what thou wilt; a heart that will throb back to thine, love
for love.
But Ruth thinks not of all this now, as she lies pale and motionless
upon the pillow, while Harry’s grateful tears bedew his first-born’s
face. She cannot even welcome the little stranger. Harry thought her
dear to him before; but now, as she lies there, so like death’s
counterpart, a whole life of devotion would seem too little to prove
his appreciation of all her sacrifices.
The advent of the little stranger was viewed through very different
spectacles by different members of the family. The doctor regarded
it as a little automaton, for pleasant Æsculapian experiments in his
idle hours; the old lady viewed it as another barrier between herself
and Harry, and another tie to cement his already too strong
attachment for Ruth; and Betty groaned, when she thought of the
puny interloper, in connection with washing and ironing days; and
had already made up her mind that the first time its nurse used her
new saucepan to make gruel, she would strike for higher wages.
Poor, little, unconscious “Daisy,” with thy velvet cheek nestled up to
as velvet a bosom, sleep on; thou art too near heaven to know a
taint of earth.
CHAPTER VIII.

R UTH’S nurse, Mrs. Jiff, was fat, elephantine, and unctuous.


Nursing agreed with her. She had “tasted” too many bowls of
wine-whey on the stairs, tipped up too many bottles of porter in the
closet, slid down too many slippery oysters before handing them to
“her lady,” not to do credit to her pantry devotions. Mrs. Jiff wore an
uncommonly stiff gingham gown, which sounded, every time she
moved, like the rustle of a footfall among the withered leaves of
autumn. Her shoes were new, thick, and creaky, and she had a
wheezy, dilapidated-bellowsy way of breathing, consequent upon the
consumption of the above-mentioned port and oysters, which was
intensely crucifying to a sick ear.
Mrs. Jiff always “forgot to bring” her own comb and hair brush. She
had a way, too, of opening drawers and closets “by mistake,”
thereby throwing her helpless victim into a state of profuse
perspiration. Then she would go to sleep between the andirons, with
the new baby on the edge of her knee, in alarming proximity to the
coals; would take a pinch of snuff over the bowl of gruel in the
corner, and knock down the shovel, poker, and tongs, every time she
went near the fire; whispering—sh—sh—sh—at the top of her lungs,
as she glanced in the direction of the bed, as if its demented
occupant were the guilty cause of the accident.
Mrs. Jiff had not nursed five-and-twenty years for nothing. She
particularly affected taking care of young mothers, with their first
babies; knowing very well that her chain shortened, with every after
addition to maternal experience: she considered herself, therefore,
quite lucky in being called upon to superintend little Daisy’s advent.
It did occasionally cross Ruth’s mind as she lay, almost fainting with
exhaustion, on the pillow, while the ravenous little Daisy cried, “give,
give,” whether it took Mrs. Jiff two hours to make one cup of tea,
and brown one slice of toast; Mrs. Jiff solacing herself, meanwhile,
over an omelette in the kitchen, with Betty, and pouring into her
ready ears whole histories of “gen’lemen as wasn’t gen’lemen,
whose ladies she nursed,” and how “nobody but herself knew how
late they did come home when their wives were sick, though, to be
sure, she’d scorn to tell of it!” Sometimes, also, Ruth innocently
wondered if it was necessary for the nurse to occupy the same bed
with “her lady;” particularly when her circumference was as
Behemoth-ish, and her nose as musical as Mrs. Jiff’s; and whether
there would be any impropriety in her asking her to take the babe
and keep it quiet part of the night, that she might occasionally get a
nap. Sometimes, too, she considered the feasibility of requesting
Mrs. Jiff not to select the time when she (Ruth) was sipping her
chocolate, to comb out her “false front,” and polish up her artificial
teeth; and sometimes she marvelled why, when Mrs. Jiff paid such
endless visits to the kitchen, she was always as fixed as the North
Star, whenever dear Harry came in to her chamber to have a
conjugal chat with her.
CHAPTER IX.

“H OW do you do this morning, Ruth?” said the old lady, lowering


herself gradually into a softly-cushioned arm chair. “How your
sickness has altered you! You look like a ghost? I shouldn’t wonder if
you lost all your hair; it is no uncommon thing in sickness; or your
teeth either. How’s the baby? She don’t favor our side of the house
at all. She is quite a plain child, in fact. Has she any symptoms, yet,
of a sore mouth? I hope not, because she will communicate it to
your breast, and then you’ll have a time of it. I knew a poor, feeble
thing once, who died of it. Of course, you intend, when Mrs. Jiff
leaves, to take care of the baby yourself; a nursery girl would be
very expensive.”
“I believe Harry has already engaged one,” said Ruth.
“I don’t think he has,” said the old lady, sitting up very straight,
“because it was only this morning that the doctor and I figured up
the expense it would be to you, and we unanimously came to the
conclusion to tell Harry that you’d better take care of the child
yourself. I always took care of my babies. You oughtn’t to have
mentioned a nursery girl, at all, to Harry.”
“He proposed it himself,” replied Ruth; “he said I was too feeble to
have the care of the child.”
“Pooh! pshaw! stuff! no such thing. You are well enough, or will be,
before long. Now, there’s a girl’s board to begin with. Servant girls
eat like boa-constrictors. Then, there’s the soap and oil she’ll waste;
—oh, the thing isn’t to be thought of; it is perfectly ruinous. If you
hadn’t made a fool of Harry, he never could have dreamed of it. You
ought to have sense enough to check him, when he would go into
such extravagances for you, but some people haven’t any sense.
Where would all the sugar, and starch, and soap, go to, I’d like to
know, if we were to have a second girl in the house? How long
would the wood-pile, or pitch-kindlings, or our new copper-boiler
last? And who is to keep the back gate bolted, with such a chit flying
in and out?”
“Will you please hand me that camphor bottle?” said Ruth, laying her
hand upon her throbbing forehead.

“How’s my little snow-drop to-day?” said Harry, entering Ruth’s room
as his mother swept out; “what ails your eyes, Ruth?” said her
husband, removing the little hands which hid them.
“A sudden pain,” said Ruth, laughing gaily; “it has gone now; the
camphor was too strong.”
Good Ruth! brave Ruth! Was Harry deceived? Something ails his
eyes, now; but Ruth has too much tact to notice it.
Oh Love! thou skilful teacher! learned beyond all the wisdom of the
schools.
CHAPTER X.

“You will be happy here, dear Ruth,” said Harry; “you will be
your own mistress.”
Ruth danced about, from room to room, with the careless glee of a
happy child, quite forgetful that she was a wife and a mother; quite
unable to repress the flow of spirits consequent upon her new-found
freedom.
Ruth’s new house was about five miles from the city. The approach
to it was through a lovely winding lane, a little off the main road,
skirted on either side by a thick grove of linden and elms, where the
wild grape-vine leaped, clinging from branch to branch, festooning
its ample clusters in prodigal profusion of fruitage, and forming a
dense shade, impervious to the most garish noon-day heat; while
beneath, the wild brier-rose unfolded its perfumed leaves in the
hedges, till the bees and humming-birds went reeling away, with
their honeyed treasures.
You can scarce see the house, for the drooping elms, half a century
old, whose long branches, at every wind-gust, swept across the
velvet lawn. The house is very old, but Ruth says, “All the better for
that.” Little patches of moss tuft the sloping roof, and swallows and
martins twitter round the old chimney. It has nice old-fashioned
beams, running across the ceiling, which threaten to bump Harry’s
curly head. The doorways, too, are low, with honeysuckle, red and
white, wreathed around the porches; and back of the house there is
a high hill (which Ruth says must be terraced off for a garden),
surmounted by a gray rock, crowned by a tumble-down old summer-
house, where you have as fine a prospect of hill and valley, rock and
river, as ever a sunset flooded with rainbow tints.
It was blessed to see the love-light in Ruth’s gentle eyes; to see the
rose chase the lily from her cheek; to see the old spring come back
to her step; to follow her from room to room, while she draped the
pretty white curtains, and beautified, unconsciously, everything her
fingers touched.
She could give an order without having it countermanded; she could
kiss little Daisy, without being called “silly;” she could pull out her
comb, and let her curls flow about her face, without being
considered “frivolous;” and, better than all, she could fly into her
husband’s arms, when he came home, and kiss him, without feeling
that she had broken any penal statute. Yes; she was free as the
golden orioles, whose hanging nests swayed to and fro amid the
glossy green leaves beneath her window.
But not as thoughtless.
Ruth had a strong, earnest nature; she could not look upon this
wealth of sea, sky, leaf, bud, and blossom; she could not listen to
the little birds, nor inhale the perfumed breath of morning, without a
filling eye and brimming heart, to the bounteous Giver. Should she
revel in all this loveliness,—should her heart be filled to its fullest
capacity for earthly happiness, and no grateful incense go up from
its altar to Heaven?
And the babe? Its wondering eyes had already begun to seek its
mother’s; its little lip to quiver at a harsh or discordant sound. An
unpracticed hand must sweep that harp of a thousand strings;
trembling fingers must inscribe, indelibly, on that blank page,
characters to be read by the light of eternity: the maternal eye must
never sleep at its post, lest the enemy rifle the casket of its gems.
And so, by her child’s cradle, Ruth first learned to pray. The weight
her slender shoulders could not bear, she rolled at the foot of the
cross; and, with the baptism of holy tears, mother and child were
consecrated.
CHAPTER XI.

Time flew on; seasons came and went; and still peace brooded,
like a dove, under the roof of Harry and Ruth. Each bright
summer morning, Ruth and the little Daisy, (who already partook of
her mother’s love for nature,) rambled, hand in hand, through the
woods and fields, with a wholesome disregard of those city bug-
bears, sun, dew, bogs, fences, briers, and cattle. Wherever a flower
opened its blue eye in the rock cleft; wherever the little stream ran,
babbling and sparkling, through the emerald meadow; where the
golden moss piled up its velvet cushion in the cool woods; where the
pretty clematis threw the graceful arms of youth ’round the gnarled
trunk of decay; where the bearded grain, swaying to and fro,
tempted to its death the reaper; where the red and white clover
dotted the meadow grass; or where, in the damp marsh, the whip-
poor-will moaned, and the crimson lobelia nodded its regal crown; or
where the valley smiled in its beauty ’neath the lofty hills, nestling
’mid its foliage the snow-white cottages; or where the cattle dozed
under the broad, green branches, or bent to the glassy lake to drink;
or where, on the breezy hill-tops, the voices of childhood came up,
sweet and clear, as the far-off hymning of angels,—there, Ruth and
her soul’s child loved to linger.
It was beautiful, yet fearful, to mark the kindling eye of the child; to
see the delicate flush come and go on her marble cheek, and to feel
the silent pressure of her little hand, when this alone could tell the
rapture she had no words to express.
Ah, Ruth! gaze not so dotingly on those earnest eyes. Know’st thou
not,

The rose that sweetest doth awake,


Will soonest go to rest?
CHAPTER XII.

“Well,” said the doctor, taking his spectacles from his nose, and
folding them up carefully in their leathern case; “I hope you’ll
be easy, Mis. Hall, now that we’ve toted out here, bag and baggage,
to please you, when I supposed I was settled for the rest of my life.”
“Fathers can’t be expected to have as much natural affection, or to
be as self-sacrificing as mothers,” said the old lady. “Of course, it
was some trouble to move out here; but, for Harry’s sake, I was
willing to do it. What does Ruth know about house-keeping, I’d like
to know? A pretty muss she’ll make of it, if I’m not around to
oversee things.”
“It strikes me,” retorted the doctor, “that you won’t get any thanks
for it—from one side of the house, at least. Ruth never says anything
when you vex her, but there’s a look in her eye which—well, Mis.
Hall, it tells the whole story.”
“I’ve seen it,” said the old lady, while her very cap-strings fluttered
with indignation, “and it has provoked me a thousand times more
than if she had thrown a brick-bat at my head. That girl is no fool,
doctor. She knows very well what she is about: but diamond cut
diamond, I say. Doctor, doctor, there are the hens in the garden. I
want that garden kept nice. I suppose Ruth thinks that nobody can
have flowers but herself. Wait till my china-asters and sweet peas
come up. I’m going over to-day to take a peep round her house; I
wonder what it looks like? Stuck full of gimcracks, of all sorts, I’ll
warrant. Well, I shan’t furnish my best parlor till I see what she has
got. I’ve laid by a little money, and—”
“Better give it to the missionaries, Mis. Hall,” growled the doctor; “I
tell you Ruth don’t care a pin what you have in your parlor.”
“Don’t you believe it,” said the old lady.
“Well, anyhow,” muttered the doctor, “you can’t get the upper hand
of her in that line; i. e., if she has a mind that you shall not. Harry is
doing a very good business; and you know very well, it is no use to
try to blind your eyes to it, that if she wanted Queen Victoria’s
sceptre, he’d manage to get it for her.”
“That’s more than I can say of you,” exclaimed the old lady, fanning
herself violently; “for all that I used to mend your old saddle-bags,
and once made, with my own hands, a pair of leather small-clothes
to ride horseback in. Forty years, doctor, I’ve spent in your service. I
don’t look much as I did when you married me. I was said then to
have ‘woman’s seven beauties,’ including the ‘dimple in the chin,’
which I see still remains;” and the old lady pointed to a slight
indentation in her wrinkled face. “I might have had him that was
Squire Smith, or Pete Packer, or Jim Jessup. There wasn’t one of ’em
who had not rather do the chores on our farm, than on any other in
the village.”
“Pooh, pooh,” said the doctor, “don’t be an old fool; that was
because your father kept good cider.”
Mrs. Hall’s cap-strings were seen flying the next minute through the
sitting-room door; and the doctor was heard to mutter, as she
banged the door behind her, “that tells the whole story!”
CHAPTER XIII.

“A summer house, hey!” said the old lady, as with stealthy, cat-
like steps, she crossed a small piece of woods, between her
house and Ruth’s; “a summer house! that’s the way the money goes,
is it? What have we here? a book;” (picking up a volume which lay
half hidden in the moss at her feet;) “poetry, I declare! the most
frivolous of all reading; all pencil marked;—and here’s something in
Ruth’s own hand-writing—that’s poetry, too: worse and worse.”
“Well, we’ll see how the kitchen of this poetess looks. I will go into
the house the back way, and take them by surprise; that’s the way
to find people out. None of your company faces for me.” And the old
lady peered curiously through her spectacles, on either side, as she
passed along towards the kitchen door, and exclaimed, as her eye
fell on the shining row, “six milkpans!—wonder if they buy their milk,
or keep a cow. If they buy it, it must cost them something; if they
keep a cow, I’ve no question the milk is half wasted.”
The old lady passed her skinny forefinger across one of the pans,
examining her finger very minutely after the operation; and then
applied the tip of her nose to the interior of it. There was no fault to
be found with that milkpan, if it was Ruth’s; so, scrutinizing two or
three dish towels, which were hanging on a line to dry, she stepped
cautiously up to the kitchen door. A tidy, respectable-looking black
woman met her on the threshold; her woolly locks bound with a
gay-striped bandanna, and her ebony face shining with irresistible
good humor.
“Is Ruth in?” said the old lady.
“Who, Missis?” said Dinah.
“Ruth.”
“Missis Hall lives here,” answered Dinah, with a puzzled look.
“Exactly,” said the old lady; “she is my son’s wife.”
“Oh! I beg your pardon, Missis,” said Dinah, curtseying respectfully.
“I never heard her name called Ruth afore: massa calls her ‘bird,’
and ‘sunbeam.’”
The old lady frowned.
“Is she at home?” she repeated, with stately dignity.
“No,” said Dinah, “Missis is gone rambling off in the woods with little
Daisy. She’s powerful fond of flowers, and things. She climbs fences
like a squir’l! it makes this chil’ laf’ to see the ol’ farmers stare at
her.”
“You must have a great deal to do, here;” said the old lady,
frowning; “Ruth isn’t much of a hand at house-work.”
“Plenty to do, Missis, and willin’ hands to do it. Dinah don’t care how
hard she works, if she don’t work to the tune of a lash; and Missis
Hall goes singing about the house so that it makes time fly.”
“She don’t ever help you any, does she?” said the persevering old
lady.
“Lor’ bless you! yes, Missis. She comes right in and makes a pie for
Massa Harry, or cooks a steak jess’ as easy as she pulls off a flower;
and when Dinah’s cooking anything new, she asks more questions
how it’s done than this chil’ kin answer.”
“You have a great deal of company, I suppose; that must make you
extra trouble, I should think; people riding out from the city to
supper, when you are all through and cleared away: don’t it tire
you?”
“No; Missis Hall takes it easy. She laf’s merry, and says to the
company, ‘you get tea enough in the city, so I shan’t give you any;
we had tea long ago; but here’s some fresh milk, and some
raspberries and cake; and if you can’t eat that, you ought to go
hungry.’”
“She irons Harry’s shirts, I suppose?” said the old lady.
“She? s’pose dis chil’ let her? when she’s so careful, too, of ol’
Dinah’s bones?”
“Well,” said the old lady, foiled at all points, “I’ll walk over the house
a bit, I guess; I won’t trouble you to wait on me, Dinah;” and the old
lady started on her exploring tour.
CHAPTER XIV.

“This is the parlor, hey?” soliloquized old Mrs. Hall, as she seated
herself on the sofa. “A few dollars laid out here, I guess.”
Not so fast, my dear madam. Examine closely. Those long, white
curtains, looped up so prettily from the open windows, are plain,
cheap muslin; but no artist could have disposed their folds more
gracefully. The chairs and sofas, also, Ruth covered with her own
nimble fingers: the room has the fragrance of a green-house, to be
sure; but if you examine the flowers, which are scattered so
profusely round, you will find they are wild flowers, which Ruth,
basket in hand, climbs many a stone fence every morning to gather;
and not a country boy in the village knows their hiding-places as well
as she. See how skilfully they are arranged! with what an eye to the
blending of colors! How dainty is that little tulip-shaped vase, with
those half opened wild-rose buds! see that little gilt saucer,
containing only a few tiny green leaves; yet, mark their exquisite
shape and finish. And there are some wood anemonies; some white,
with a faint blush of pink at the petals; and others blue as little
Daisy’s eyes; and see that velvet moss, with its gold-star blossoms!
“Must take a deal of time to gather and fix ’em,” muttered the old
lady.
Yes, my dear madam; but, better pay the shoe-maker’s than the
doctor’s bill; better seek health in hunting live flowers, than ruin it by
manufacturing those German worsted abortions.
You should see your son Harry, as he ushers a visitor in through the
low door-way, and stands back to mark the surprised delight with
which he gazes upon Ruth’s little fairy room. You should see how
Harry’s eyes glisten, as they pass from one flower vase to another,
saying, “Who but Ruth would ever have spied out that tiny little
blossom?”
And little Daisy has caught the flower mania, too; and every day she
must have her vase in the collection; now withdrawing a rose and
replacing it with a violet, and then stepping a pace or two back and
looking at it with her little head on one side, as knowingly as an
artist looks at the finishing touches to a favorite picture.
But, my dear old lady, we beg pardon; we are keeping you too long
from that china closet, which you are so anxious to inspect; hoping
to find a flaw, either in crockery or cake. Not a bit! You may draw
those prying fingers across the shelves till you are tired, and not a
particle of dust will adhere to them. Neither cups, saucers, tumblers,
nor plates, stick to your hands; the sugar-bowl is covered; the cake,
in that tin pail, is fresh and light; the preserves, in those glass jars,
tied down with brandy papers, are clear as amber; and the silver
might serve for a looking-glass, in which you could read your own
vexation.
Never mind! A great many people keep the first floor spick and span;
mayhap you’ll find something wrong up stairs. Walk in; ’tis the “best
chamber.” A gilt arrow is fastened to the wall, and pretty white lace
curtains are thrown (tent fashion) over it; there is a snow-white quilt
and a pair of plump, tempting pillows; the furniture and carpet are
of a light cream color; and there is a vase of honeysuckle on the
little light-stand. Nothing could be more faultless, you see.
Now, step into the nursery; the floor is strewed with play-things;
thank God, there’s a child in the house! There is a broken doll; a
torn picture-book; a little wreath of oak leaves; a dandelion chain;
some willow tassels; a few acorns; a little red shoe, full of parti-
colored pebbles; the wing of a little blue-bird; two little, speckled
eggs, on a tuft of moss; and a little orphan chicken, nestling in a
basket of cotton wool, in the corner. Then, there is a work-basket of
Ruth’s with a little dress of Daisy’s, partly finished, and a dicky of
Harry’s, with the needle still sticking in it, which the little gypsey wife
intends finishing when she comes back from her wood ramble.
The old lady begins to think she must give it up; when, luckily, her
eye falls on a crouching “Venus,” in the corner. Saints and angels!
why, she has never been to the dress-makers! There’s a text, now!
What a pity there is no appreciative audience to see the glow of
indignation with which those half averted eyes regard the undraped
goddess!
“Oh, Harry! is this the end of all my teachings? Well, it is all Ruth’s
doings—all Ruth’s doings. Harry is to be pitied, not blamed;” and the
old lady takes up, at length, her triumphant march for home.
CHAPTER XV.

“Hallo! what are you doing there?” exclaimed the doctor,
looking over the fence at a laborer, at work in one of Harry’s
fields.
“Ploughing this bit o’ ground, sir. Mr. Hall told me to be sure and get
it finished before he came home from the city this afthernoon.”
“Nonsense!” replied the doctor, “I was born sometime before my son
Harry; put up your plough, and lay that bit of stone wall yonder; that
needs to be done first.”
“I’m thinking Masther Hall won’t be afther liking it if I do, sir,” said
Pat; “I had my orders for the day’s work before masther went to the
city, sir, this morning.”
“Pooh, pooh,” said the old man, unchaining the horse from the
plough, and turning him loose in the pasture; “young folks think old
folks are fools; old folks know young folks to be so.”
Pat eyed the doctor, scratched his head, and began slowly to lay the
stone wall.
“What’s that fellow doing over yonder?” said the doctor to Pat.
“Planting corn, yer honor.”
“Corn? ha! ha! city farming! Good. Corn? That’s just the spot for
potatoes. H-a-l-l-o there! Don’t plant any more corn in that spot,
John; it never’ll come to anything—never.”
“But, Mr. Hall?” said John, hesitatingly, leaning on his hoe-handle.
“Harry? Oh, never mind him. He has seen more ledgers than corn.
Corn? Ha! that’s good. You can go cart that load of gravel up the hill.
What a fortunate thing for Harry, that I am here to oversee things.
This amateur farming is pretty play enough; but the way it sinks the
money is more curious than profitable. I wonder, now, if that tree is
grafted right. I’ll take off the ligatures and see. That hedge won’t
grow, I’m certain; the down-east cedars thrive the best for hedges. I
may as well pull these up, and tell Harry to get some of the other
kind;” and the doctor pulled them up by the roots, and threw them
over the fence.
CHAPTER XVI.

“Time for papa to come,” said little Daisy, seating herself on the
low door-step; “the sun has crept way round to the big apple-
tree;” and Daisy shook back her hair, and settling her little elbows on
her knees, sat with her chin in her palms, dreamily watching the
shifting clouds. A butterfly alights on a blade of grass near her:
Daisy springs up, her long hair floating like a veil about her
shoulders, and her tiny feet scarce bending the clover blossoms, and
tiptoes carefully along in pursuit.
He’s gone, Daisy, but never mind; like many other coveted treasures,
he would lose his brilliancy if caught. Daisy has found something
else; she closes her hand over it, and returns to her old watch-post
on the door-step. She seats herself again, and loosing her tiny hold,
out creeps a great, bushy, yellow caterpillar. Daisy places him
carefully on the back of her little, blue-veined hand, and he
commences his travels up the polished arm, to the little round
shoulder. When he reaches the lace sleeve, Daisy’s laugh rings out
like a robin’s carol; then she puts him back, to retravel the same
smooth road again.
“Oh, Daisy! Daisy!” said Ruth, stepping up behind her, “what an ugly
playfellow; put him down, do darling; I cannot bear to see him on
your arm.”
“Why—God made him,” said little Daisy, with sweet, upturned eyes
of wonder.
“True, darling,” said Ruth, in a hushed whisper, kissing the child’s
brow, with a strange feeling of awe. “Keep him, Daisy, dear, if you
like.”