
Tutorial-2

(Multivariable optimization)

Date of Submission: 30/11/2020

Q1. Locate and classify the stationary points of the following functions:

(i) f(x1, x2) = x1^2 + 2x2 - 4x1 - 2x1x2
(ii) f(x1, x2) = 10(x2 - x1^2)^2 + (1 - x1)^2
(iii) f(x1, x2, x3) = (x1 + 3x2 + x3)^2 + 4(x1 - x3 - 2)^2

Q2. Explain convex and concave functions and their test conditions.

Q3. In solving the following problem:

Minimize f(x1, x2) = 10 + x1^2 - 5x1x2 + 9x2^2 + x2

using the two points x^(1) = (-1, 1)^T and x^(2) = (1, -1)^T and a search direction d = (1, 1)^T, we
would like to apply the conjugate direction method using the parallel subspace property.

(i) Find the direction s which is C-conjugate to d. Show that s is C-conjugate to d.
(ii) Continue to find the minimum of the above function. Verify this solution by finding
the minimum using the first- and second-order optimality conditions.
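A quick numerical check of the parallel subspace construction (a sketch in Python with NumPy, not the hand working the question asks for; the Hessian below is read off from the stated quadratic):

```python
import numpy as np

# Hessian of f(x1, x2) = 10 + x1^2 - 5x1x2 + 9x2^2 + x2 (read off by hand):
C = np.array([[2.0, -5.0],
              [-5.0, 18.0]])
grad = lambda x: C @ x + np.array([0.0, 1.0])   # gradient of f

def line_minimum(x, d):
    # Exact minimizer of the quadratic f along x + alpha*d
    alpha = -(grad(x) @ d) / (d @ C @ d)
    return x + alpha * d

d = np.array([1.0, 1.0])
y1 = line_minimum(np.array([-1.0, 1.0]), d)   # line minimum from x^(1)
y2 = line_minimum(np.array([1.0, -1.0]), d)   # line minimum from x^(2)
s = y2 - y1                                   # parallel subspace property gives a C-conjugate direction
print("s =", s, "   s^T C d =", s @ C @ d)    # the product should be (numerically) zero
```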

Q4. In order to minimize the unconstrained objective function

f(x1, x2) = x1^3 - 2x1x2^2 + 4x1 + 10

a search direction (cos θ, sin θ)^T is to be used at the point (1, 2)^T. What is the range of θ
for which the resulting search direction is descent? What is the steepest descent direction?
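For a numerical illustration of the descent condition (a sketch only; the analytical range of θ is what the question asks for), the direction (cos θ, sin θ)^T is descent whenever its dot product with the gradient at (1, 2)^T is negative:

```python
import numpy as np

# Gradient of f(x1, x2) = x1^3 - 2x1x2^2 + 4x1 + 10, derived by hand
grad = lambda x1, x2: np.array([3*x1**2 - 2*x2**2 + 4.0, -4.0*x1*x2])

g = grad(1.0, 2.0)                                          # gradient at the point (1, 2)^T
theta = np.linspace(0.0, 2*np.pi, 3600)
descent = (np.cos(theta)*g[0] + np.sin(theta)*g[1]) < 0.0   # descent condition: s . grad f < 0

print("gradient at (1, 2):", g)
print("fraction of angles giving descent:", descent.mean())
print("steepest descent direction (unit vector):", -g / np.linalg.norm(g))
```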

Q5. Consider the following function for minimization:

f(x1, x2, x3) = x1^2 + x2^2 + x3^3 + 2x1x2 + 2x2x3 - 3x1

and a search direction s^(1) = (1, -1, -1)^T. Using the two points (1, 0, 1)^T and (0, 1, 0)^T,
find a new search direction s^(2) conjugate to s^(1) using the parallel subspace property.
Show that the search direction s^(2) is conjugate to s^(1) with respect to the above
function.

Q6. Find whether the given direction s at the point x is descent for the respective
functions:

(i) f(x1, x2) = 2x1^2 + x2^2 - 2x1x2 + 4,

s = (1, 1)^T, x = (2, 3)^T.

(ii) f(x1, x2) = x1^4 + x2^4 - 2x1^2 x2^2 + 10x1^2/x2,

s = (-1, 2)^T, x = (0, 1)^T.
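A small helper like the one below (a sketch using a central-difference gradient, not the analytical check expected in the answer) can be used to verify either part numerically:

```python
import numpy as np

def is_descent(f, x, s, h=1e-6):
    # s is descent at x iff s^T grad f(x) < 0; the gradient here is a central-difference estimate
    x = np.asarray(x, dtype=float)
    g = np.array([(f(x + h*e) - f(x - h*e)) / (2*h) for e in np.eye(len(x))])
    return g @ np.asarray(s, dtype=float) < 0.0

# Part (i): f(x1, x2) = 2x1^2 + x2^2 - 2x1x2 + 4 at x = (2, 3)^T with s = (1, 1)^T
f1 = lambda x: 2*x[0]**2 + x[1]**2 - 2*x[0]*x[1] + 4
print(is_descent(f1, [2.0, 3.0], [1.0, 1.0]))
```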

Q7. Prove that two consecutive search directions obtained in the steepest descent
algorithm are mutually orthogonal.
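The result can be seen numerically before proving it; the sketch below (on an arbitrary sample quadratic, using the exact line-search step a quadratic admits) prints near-zero dot products between consecutive directions:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])                 # any symmetric positive definite matrix
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b                 # gradient of f(x) = 0.5 x^T A x - b^T x

x = np.array([5.0, -3.0])
d_prev = None
for _ in range(5):
    d = -grad(x)                           # steepest descent direction
    alpha = (d @ d) / (d @ A @ d)          # exact line search for this quadratic
    if d_prev is not None:
        print("d_prev . d =", d_prev @ d)  # ~0 at every iteration
    x, d_prev = x + alpha * d, d
```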

Q8. Starting from the point (1, 1)^T, perform two iterations of the DFP method
to find a stationary point of the following function:

f(x1, x2) = 10 - x1 + x1x2 + x2^2
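For reference, the inverse-Hessian update at the heart of each DFP iteration can be sketched as below (the hand iterations, including the unidirectional searches, are what the question asks for):

```python
import numpy as np

def dfp_update(A, dx, dg):
    # DFP rank-two update of the inverse-Hessian approximation A,
    # where dx = x_new - x_old and dg = grad_new - grad_old
    term1 = np.outer(dx, dx) / (dx @ dg)
    term2 = (A @ np.outer(dg, dg) @ A) / (dg @ A @ dg)
    return A + term1 - term2

# Each iteration: search along s = -A @ grad(x), take a line-search step, then update A.
```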

Q9. What are the differences between Cauchy's and Newton's search methods?
Determine for what values of x1 and x2 Newton's search is guaranteed to be
successful for the following unconstrained minimization problem:

f(x1, x2) = x1^3 - 4x1x2 + x2^2
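Newton's search produces a descent step wherever the Hessian is positive definite; a point-by-point numerical check of that condition (a sketch, not the analytical region the question asks for) looks like this:

```python
import numpy as np

def hessian(x1, x2):
    # Hessian of f(x1, x2) = x1^3 - 4x1x2 + x2^2, derived by hand
    return np.array([[6.0*x1, -4.0],
                     [-4.0,    2.0]])

def is_positive_definite(H):
    return bool(np.all(np.linalg.eigvalsh(H) > 0.0))

print(is_positive_definite(hessian(2.0, 0.0)))   # test any trial point
print(is_positive_definite(hessian(1.0, 0.0)))
```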

Q10. Find out whether an exploratory search used in the Hooke-Jeeves pattern
search method at x^(0) = (0.0, 0.5)^T with Δx1 = Δx2 = 0.8 is successful in minimizing
the following function:

f(x1, x2) = 10x1 - x2 + 2x1^2 x2^2 + x2^3
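One common form of the exploratory move can be checked numerically (a sketch of the usual variable-by-variable probe; textbook versions differ slightly in how failed probes are handled):

```python
import numpy as np

def f(x):
    # Function from Q10
    return 10*x[0] - x[1] + 2*x[0]**2*x[1]**2 + x[1]**3

def exploratory_move(f, x, deltas):
    # Probe each variable by +delta, then -delta, keeping any improvement
    x = np.array(x, dtype=float)
    for i, d in enumerate(deltas):
        for trial in (x[i] + d, x[i] - d):
            candidate = x.copy()
            candidate[i] = trial
            if f(candidate) < f(x):
                x = candidate
                break
    return x

x0 = np.array([0.0, 0.5])
x1 = exploratory_move(f, x0, [0.8, 0.8])
print("new point:", x1, "  success:", f(x1) < f(x0))
```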

Q11. In trying to find the maximum of the following function, the point x^(t) = (3, 2, -1)^T is
encountered:

f(x1, x2, x3) = 6x1^2 x2 - 5x3 + 2x1

Determine whether the search direction s^(t) = (-1, 1, -2)^T would be able to find better
solutions locally from x^(t).

Q12. Explain the variable-metric method (DFP method) and the difference between the
conjugate gradient method and the conjugate direction method.

Q13. Explain the algorithm of the conjugate gradient method and why the search
direction used in Cauchy's method is the negative of the gradient.
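A minimal sketch of the Fletcher-Reeves form of the conjugate gradient algorithm is given below (assuming SciPy is available for the unidirectional search; the test function at the end is arbitrary and only illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def conjugate_gradient(f, grad, x0, iters=50, tol=1e-8):
    # Fletcher-Reeves conjugate gradient with a scalar line search at each step
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                    # first direction is Cauchy's (steepest descent) direction
    for _ in range(iters):
        alpha = minimize_scalar(lambda a: f(x + a*s)).x
        x_new = x + alpha*s
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves coefficient
        s = -g_new + beta*s                   # next conjugate direction
        x, g = x_new, g_new
    return x

# Illustrative quadratic: minimum at x = (4/7, -1/7)
f = lambda x: x[0]**2 + 2*x[1]**2 + x[0]*x[1] - x[0]
grad = lambda x: np.array([2*x[0] + x[1] - 1, 4*x[1] + x[0]])
print(conjugate_gradient(f, grad, np.array([3.0, -2.0])))
```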

Q14. Explain why Cauchy's method is followed initially and Newton's method is adopted
thereafter in Marquardt's method.

Q15. Explain why Newton's method uses second-order derivatives to create search
directions, and write the algorithm's termination criteria and steps.
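As a point of reference, one bare Newton step and the usual termination tests can be sketched as follows (practical implementations add a line search and a safeguard for a non-positive-definite Hessian):

```python
import numpy as np

def newton_step(grad, hess, x):
    # Solve H(x) s = -grad(x) for the Newton direction s and take the full step
    s = np.linalg.solve(hess(x), -grad(x))
    return x + s

# Typical termination tests: stop when the gradient norm falls below eps1,
# when the change in x falls below eps2, or when an iteration budget is exhausted.
```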
Q16. Write the steps of the simplex search method and explain the optimality criteria for
multivariable optimization.
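A hand-worked simplex search can be cross-checked against SciPy's Nelder-Mead implementation (the test function below is arbitrary; this call is no substitute for writing out the reflection, expansion and contraction steps the question asks for):

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1)**2 + (x[1] + 2)**2          # arbitrary test function with minimum at (1, -2)
result = minimize(f, x0=np.array([0.0, 0.0]), method='Nelder-Mead')
print(result.x)                                       # should approach (1, -2)
```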

