Texas A&M University MEEN 683 Multidisciplinary System Design Optimization (MSADO) Spring 2021
Assignment 3
You are expected to solve Part A individually and Part B in your project team. Each person must
submit their own Part A but you should submit Part B as a group. Please indicate the name(s) of
your teammate(s). Please submit any code you used to answer the questions in Part A.
Part A
(P1) Estimating Derivatives with Finite Differences. In Lecture 14, numerous methods of estimating derivatives were discussed. You are going to test their accuracy on a set of given functions. You should estimate the first derivative using (1) a first-order finite difference, (2) a second-order central difference, and (3) a complex-step estimate. In addition, you should estimate the second derivative using (1) a second-order estimate, and (2) a complex-step second-derivative estimate. Note, for the complex step, the second-derivative estimate is:

f''(x) = (2/∆x^2) [ f(x) − ℜe( f(x + i · ∆x) ) ].
Plot the error between the analytical first derivative and your approximations on one plot, and the error between the analytical second derivative and your approximations on a second plot, both using a log scale. Use step sizes from 1 × 10^−15 to 10.
(a) For the function f(x) = x^2, at x = 1.
(b) For the function f(x) = x^3, at x = 1.
(c) For the function f(x) = e^x, at x = 1.
(d) Comment on the change in the accuracy of the central difference and the complex step between (a) and (b).
Caution: if the error is exactly zero it will not be shown on a log scale, so please take care in plotting your results.
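
For reference, a minimal sketch of how these estimates and error plots might be set up, assuming Python with NumPy and Matplotlib (the function, point, and analytical derivatives below are for part (a) and are easily swapped for parts (b) and (c)):

import numpy as np
import matplotlib.pyplot as plt

f = lambda x: x**2                      # part (a); works for real and complex inputs
x0 = 1.0
exact_f1, exact_f2 = 2.0, 2.0           # analytical f'(1) and f''(1) for x^2

h = np.logspace(-15, 1, 200)            # step sizes from 1e-15 to 10

# First-derivative estimates
fd  = (f(x0 + h) - f(x0)) / h                         # first-order forward difference
cd  = (f(x0 + h) - f(x0 - h)) / (2 * h)               # second-order central difference
cs  = np.imag(f(x0 + 1j * h)) / h                     # complex-step estimate

# Second-derivative estimates
cd2 = (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h**2      # second-order central estimate
cs2 = 2.0 * (f(x0) - np.real(f(x0 + 1j * h))) / h**2  # complex-step estimate (formula above)

# Absolute errors on log-log axes; a small floor keeps exact zeros visible on the log scale.
floor = 1e-20
plt.loglog(h, np.abs(fd - exact_f1) + floor, label='forward difference')
plt.loglog(h, np.abs(cd - exact_f1) + floor, label='central difference')
plt.loglog(h, np.abs(cs - exact_f1) + floor, label='complex step')
plt.xlabel('step size'); plt.ylabel('first-derivative error'); plt.legend(); plt.show()
# The second-derivative error plot is built the same way from cd2, cs2, and exact_f2.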
(P2) Karush-Kuhn-Tucker Conditions. Use the optimality conditions (KKT) to solve the problem:
Solve using the (generalized) reduced gradient method and verify with the KKT conditions. You will need to solve a system of nonlinear equations as part of this problem. There will be four solutions to the nonlinear system. You will need something like a Newton-Raphson solver for this. In Matlab, the function 'fsolve' can achieve this. You will need to consider multi-start (different initial conditions). Please use s1 = x1, s2 = x2, and d1 = x3 as your partition for the reduced gradient method.
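
A minimal multi-start sketch, assuming Python with SciPy's fsolve (the analogue of Matlab's 'fsolve'); the residuals() function below is only a placeholder, since the actual nonlinear system comes from the stationarity and feasibility conditions of the stated problem:

import numpy as np
from scipy.optimize import fsolve

def residuals(z):
    # Placeholder system -- replace with the reduced-gradient / KKT equations
    # of the problem above. Hypothetical equations shown for illustration only.
    x1, x2, x3 = z
    return [x1 + x2 - 1.0,
            x1 * x2 - x3,
            x3**2 - 0.1]

# Multi-start: sweep random initial guesses and keep the distinct converged roots.
rng = np.random.default_rng(0)
roots = []
for _ in range(50):
    z0 = rng.uniform(-5.0, 5.0, size=3)
    z, info, flag, msg = fsolve(residuals, z0, full_output=True)
    if flag == 1 and not any(np.allclose(z, r, atol=1e-6) for r in roots):
        roots.append(z)

for r in roots:
    print(r)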
Part B
In this assignment you are to take the simulation code that you developed for your project in Assignment 2, refine it, and couple it with an optimizer. First you should use a gradient-based technique. If you have non-continuous variables, keep them at fixed values or assume they are continuous. (We will apply heuristic techniques to your project in a later assignment.)
b2.1 Algorithm Selection Select a gradient-based algorithm based on the characteristics of your project and the properties of the available algorithms. Justify in a few sentences why your selection seems most appropriate for the problem at hand.
b2.2 Single objective optimization Select a single (scalar) objective function for which to opti-
mize your system. Describe why you selected this objective. Other potential objectives should
be turned into equality or inequality constraints or ignored (for now). Using the gradient-based
optimization technique identified in (b2.1), try to optimize your system with respect to the one
objective function. Can you get the algorithm to converge? Do you obtain an improvement in the
design compared to your initial starting point? If not, please give some reasons. You may use any
optimization environment of your choice (e.g., Matlab), but please specify in your write-up what
you used. What is the optimal solution x∗ ?
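
A minimal sketch, assuming Python with SciPy (any environment is acceptable, as noted above); objective(), the constraint, the bounds, and the starting point are placeholders for your project's simulation and design variables:

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Placeholder: call your simulation code here and return the scalar objective.
    return np.sum((x - 2.0) ** 2)

# Placeholder inequality constraint in SciPy's "g(x) >= 0" convention: x1 + x2 <= 3.
cons = [{'type': 'ineq', 'fun': lambda x: 3.0 - (x[0] + x[1])}]

x0 = np.array([0.5, 0.5, 0.5])          # initial design (starting point)
bounds = [(0.0, 5.0)] * len(x0)         # simple variable bounds

res = minimize(objective, x0, method='SLSQP', bounds=bounds, constraints=cons)
print(res.success, res.x, res.fun)      # convergence flag, optimal design x*, objective value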
b2.3 Sensitivity analysis Conduct a sensitivity analysis at the optimal point x∗ with respect to
x, and a few of your fixed parameters, p. What design variables seem to be the drivers in your
problem? Does this match the intuition you had beforehand? What are the active constraints at x∗ ?
How can you tell? Try moving the most important active constraint by some amount. Reoptimize and compare the new optimum with the previous one. What do you observe?
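
A minimal finite-difference sensitivity sketch, assuming Python/NumPy; objective(), x_star, and p are placeholders for your model, the optimum from (b2.2), and your fixed parameters:

import numpy as np

def objective(x, p):
    # Placeholder: your simulation evaluated at design x with fixed parameters p.
    return np.sum((x - p) ** 2)

x_star = np.array([2.0, 2.0, 2.0])      # optimum from b2.2 (placeholder values)
p      = np.array([1.5, 2.5, 2.0])      # fixed parameters (placeholder values)

h  = 1e-6
f0 = objective(x_star, p)
for i in range(len(x_star)):
    dx = np.zeros_like(x_star); dx[i] = h
    dJ = (objective(x_star + dx, p) - f0) / h               # forward-difference sensitivity dJ/dx_i
    nJ = dJ * x_star[i] / f0 if f0 != 0 else float('nan')   # normalized (percent-per-percent)
    print(f"dJ/dx{i} = {dJ: .4e}   normalized = {nJ: .4e}")

# The same loop applied to p identifies which fixed parameters the optimum is most
# sensitive to; large normalized values flag the driving variables and parameters.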
b2.4 Global optimum How confident are you that you have found the true global optimum? Explain.
Note: Keep all the results from this assignment handy for A4, where we will extend this work.