Written Assignment-2 Optimization Methods (Spring 2023): Instructions

This document provides instructions for a written assignment in an optimization methods course. It includes 10 questions assessing topics like backtracking, convexity of compositions, affine transformations of ellipsoids, necessary conditions for local minimizers, applying Newton's method and gradient descent to the Rosenbrock function, properties of modified Newton's method and DFP algorithm, and examples of Newton's method with fixed step size. Students must submit handwritten solutions by April 25th for a total of 36 marks.

Uploaded by

Anmol Agarwal
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
64 views4 pages

Written Assignment-2 Optimization Methods (Spring 2023) : Instructions

This document provides instructions for a written assignment in an optimization methods course. It includes 10 questions assessing topics like backtracking, convexity of compositions, affine transformations of ellipsoids, necessary conditions for local minimizers, applying Newton's method and gradient descent to the Rosenbrock function, properties of modified Newton's method and DFP algorithm, and examples of Newton's method with fixed step size. Students must submit handwritten solutions by April 25th for a total of 36 marks.

Uploaded by

Anmol Agarwal
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 4

Written Assignment-2

Optimization Methods (Spring 2023)


Submission Deadline: 25th April 2023 (6:00 PM)
Total Marks: 36

Instructions
1. Attempting all questions is mandatory.
2. Marks for each question are indicated alongside the question.
3. Plagiarism is strictly prohibited.
4. Only answer sheets handwritten with pen on paper will be accepted; no
iPad handwritten submissions will be accepted.
5. Students must physically submit the hard copy to either the TAs or the
course instructor by the deadline.

Question 1 - 4 Marks
Suppose f is strongly convex with mI ⪯ ∇²f(x) ⪯ MI. Let d be a descent
direction at x. Show that the backtracking stopping condition holds for

0 < α < −d⊤∇f(x) / (M∥d∥²)

Use this to give an upper bound on the number of backtracking iterations.
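For intuition, the stopping condition can be exercised numerically before proving it. The sketch below runs backtracking with the Armijo sufficient-decrease condition (parameter c = 1/2, halving factor ρ = 1/2) on an illustrative quadratic with m = 1 and M = 10; the function, matrix, and starting point are assumptions for the demonstration, not part of the question.

```python
import numpy as np

def backtracking(f, grad_f, x, d, alpha0=1.0, rho=0.5, c=0.5):
    """Halve alpha until the Armijo condition holds:
    f(x + alpha d) <= f(x) + c * alpha * d^T grad_f(x)."""
    alpha = alpha0
    slope = d @ grad_f(x)          # d^T grad f(x) < 0 for a descent direction
    iters = 0
    while f(x + alpha * d) > f(x) + c * alpha * slope:
        alpha *= rho
        iters += 1
    return alpha, iters

# Illustrative strongly convex quadratic f(x) = 1/2 x^T Q x with
# m = lambda_min(Q) = 1 and M = lambda_max(Q) = 10.
Q = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ Q @ x
grad_f = lambda x: Q @ x

x = np.array([1.0, 1.0])
d = -grad_f(x)                      # steepest-descent direction
alpha, iters = backtracking(f, grad_f, x, d)

# The threshold from the question: the condition is guaranteed to hold once
# alpha < -d^T grad_f(x) / (M ||d||^2), so the accepted step is at least
# rho times this value -- which is what bounds the iteration count.
M = 10.0
threshold = -(d @ grad_f(x)) / (M * (d @ d))
print(alpha, iters, threshold)
```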

Question 2 - 4 marks
Let h : R → R and g : Rn → R. Consider the composition function f as
f = h ◦ g : Rn → R, defined by

f (x) = h(g(x)), domain(f ) = {x ∈ domain(g) | g(x) ∈ domain(h)}

Using the definition of a convex function, show the following.

1. f is convex if h is convex and non-decreasing, and g is convex. [2 marks]
2. f is convex if h is convex and non-increasing, and g is concave. [2 marks]
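Part 1 can be sanity-checked numerically before writing the proof. The sketch below samples the defining inequality of convexity for an illustrative pair h = exp (convex, non-decreasing) and g(x) = ∥x∥² (convex); this specific choice of h and g is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

h = np.exp                       # convex and non-decreasing on R (assumed example)
g = lambda x: np.sum(x ** 2)     # convex on R^n
f = lambda x: h(g(x))            # the composition f = h o g

# Sample the defining inequality of convexity:
# f(t x + (1 - t) y) <= t f(x) + (1 - t) f(y) for t in [0, 1].
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    t = rng.uniform()
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12
print("convexity inequality held at all sampled points")
```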

Question 3 - 4 marks
Let D ∈ Rn×n be a symmetric positive definite matrix and z ∈ Rn. Given a
vector a ∈ Rn, let R be a rotation matrix such that RD^(1/2)a = ∥D^(1/2)a∥₂ e1,
where e1 is the n-dimensional vector whose 1st element is 1 and all other
elements are zero. Let T(·) be the affine transformation defined by

T(x) = RD^(−1/2)(x − z)

Let E′ = E(z̄, D̄) be the ellipsoid determined by

z̄ = z + (1/(n+1)) · Da/√(a⊤Da)

D̄ = (n²/(n²−1)) · ( D − (2/(n+1)) · Daa⊤D/(a⊤Da) )

Let

E0′ = E( e1/(n+1), (n²/(n²−1)) · ( I − (2/(n+1)) e1e1⊤ ) )

Prove that T (E ′ ) = E0′ .
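A numerical sanity check of the claim for the centers is sketched below: it builds R as a Householder reflection (one orthogonal matrix with the required mapping property; the question fixes only this property, not R itself) and verifies that T maps z̄ to e1/(n + 1). The random D, z, and a are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Illustrative symmetric positive definite D, plus z and a (assumed data).
A = rng.normal(size=(n, n))
D = A @ A.T + n * np.eye(n)
z = rng.normal(size=n)
a = rng.normal(size=n)

# Matrix square root D^(1/2) and its inverse via eigendecomposition.
w, V = np.linalg.eigh(D)
D_half = V @ np.diag(np.sqrt(w)) @ V.T
D_inv_half = V @ np.diag(1 / np.sqrt(w)) @ V.T

# Orthogonal R with R D^(1/2) a = ||D^(1/2) a|| e1, built as a Householder
# reflection -- one valid choice satisfying the required mapping property.
v = D_half @ a
e1 = np.zeros(n); e1[0] = 1.0
u = v - np.linalg.norm(v) * e1
R = np.eye(n) - 2 * np.outer(u, u) / (u @ u)

T = lambda x: R @ D_inv_half @ (x - z)

# Center of E': z_bar = z + (1/(n+1)) * D a / sqrt(a^T D a).
z_bar = z + (D @ a) / ((n + 1) * np.sqrt(a @ D @ a))

# T should map z_bar to e1/(n+1), the center of E0'.
print(T(z_bar))
assert np.allclose(T(z_bar), e1 / (n + 1))
```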

Question 4 - 4 marks
Consider the problem

min_{x1,x2} −x2²
s.t. |x2| ≤ x1²
     x1 ≥ 0

where x1, x2 ∈ R.

1. Does the point [x1 x2]⊤ = 0 satisfy the first-order necessary condition
for a minimizer? That is, if f is the objective function, is it true that
d⊤∇f(0) ≥ 0 for all feasible directions d at 0? [2 marks]

2. Is the point [x1 x2]⊤ = 0 a local minimizer, a strict local minimizer, a
local maximizer, or none of the above? [2 marks]

Question 5 - 6 marks
Consider the Rosenbrock function:

f(x1, x2) = 100(x2 − x1²)² + (1 − x1)²

1. Prove that [1 1]⊤ is the unique global minimizer of f over R2 . [2 marks]


2. With a starting point of [0 0]⊤, apply two iterations of Newton's method.
Hint: [a b; c d]⁻¹ = (1/(ad − bc)) [d −b; −c a]. [2 marks]

3. With a starting point of [0 0]⊤, apply two iterations of gradient descent
with a fixed step size of αk = 0.05. [2 marks]
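Parts 2 and 3 can be checked against a short computation. The gradient and Hessian below follow directly from the definition of f; the code simply replays the two requested iterations.

```python
import numpy as np

def grad(x):
    """Gradient of f(x1, x2) = 100(x2 - x1^2)^2 + (1 - x1)^2."""
    x1, x2 = x
    return np.array([-400 * x1 * (x2 - x1 ** 2) - 2 * (1 - x1),
                     200 * (x2 - x1 ** 2)])

def hess(x):
    """Hessian of the Rosenbrock function."""
    x1, x2 = x
    return np.array([[1200 * x1 ** 2 - 400 * x2 + 2, -400 * x1],
                     [-400 * x1, 200.0]])

# Part 2: two Newton iterations from [0, 0]^T.
x = np.array([0.0, 0.0])
for _ in range(2):
    x = x - np.linalg.solve(hess(x), grad(x))
print("Newton after 2 steps:", x)    # lands exactly on [1, 1]

# Part 3: two gradient-descent iterations from [0, 0]^T with alpha_k = 0.05.
y = np.array([0.0, 0.0])
for _ in range(2):
    y = y - 0.05 * grad(y)
print("GD after 2 steps:", y)        # [0.17, 0.1]
```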

Question 6 - 2 marks
Consider the Modified Newton’s Algorithm

x(k+1) = x(k) − αk ∇²f(x(k))⁻¹ ∇f(x(k))

where αk = arg min_{α≥0} f(x(k) − α∇²f(x(k))⁻¹∇f(x(k))). Suppose that we
apply the algorithm to a quadratic function f(x) = (1/2)x⊤Qx − b⊤x, where Q
is a symmetric positive definite matrix. Recall that the standard Newton's
method reaches the point x∗ such that ∇f(x∗) = 0 in just one step starting
from any initial point x(0). Does the above modified Newton's algorithm
possess the same property? Justify your answer.
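As a hint toward the answer, one can verify numerically that on a quadratic the exact line search along the Newton direction returns αk = 1, so the modified method takes exactly the standard Newton step. The random Q, b, and starting point below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# Illustrative quadratic f(x) = 1/2 x^T Q x - b^T x with SPD Q (assumed data).
A = rng.normal(size=(n, n))
Q = A @ A.T + n * np.eye(n)
b = rng.normal(size=n)
x_star = np.linalg.solve(Q, b)       # the unique point with grad f(x*) = 0

x0 = rng.normal(size=n)
g = Q @ x0 - b                       # grad f(x0)
d = -np.linalg.solve(Q, g)           # Newton direction at x0

# Exact line search along d on a quadratic: alpha = -g^T d / (d^T Q d).
# For the Newton direction this evaluates to 1, so one step still reaches x*.
alpha = -(g @ d) / (d @ Q @ d)
x1 = x0 + alpha * d
print("alpha =", alpha)
print("reached minimizer:", np.allclose(x1, x_star))
```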

Question 7 - 2 marks
Let f : R2 → R. Consider the problem

min f (x1 , x2 )
x1 ,x2

s.t. x1 , x2 ≥ 0
Suppose that ∇f(0, 0) ≠ 0, ∂f/∂x1(0, 0) ≤ 0, and ∂f/∂x2(0, 0) ≤ 0. Show
that [0 0]⊤ cannot be a minimizer for this problem.

Question 8 - 2 marks
Consider a function f : Ω → R, where Ω ⊂ Rn is a convex set and f ∈ C¹.
Given x∗ ∈ Ω, suppose there exists c > 0 such that d⊤∇f(x∗) ≥ c∥d∥ for all
feasible directions d at x∗. Show that x∗ is a strict local minimizer of f over Ω.

Question 9 - 4 Marks
Consider the DFP algorithm applied to the quadratic function
f(x) = (1/2) x⊤Qx − b⊤x
where Q is a symmetric and positive definite matrix.
1. Write down the formula for αk in terms of Q, ∇f(x(k)), and d(k). [2 marks]

2. Show that if ∇f (x(k) ) ̸= 0, then αk > 0. [2 marks]
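Both parts can be exercised numerically. The sketch below runs DFP with exact line search on an illustrative random quadratic, computing each αk from the closed-form line-search formula for quadratics and checking that it stays positive; the specific Q, b, and starting point are assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# Illustrative SPD Q and b (assumed data).
A = rng.normal(size=(n, n))
Q = A @ A.T + n * np.eye(n)
b = rng.normal(size=n)
grad = lambda x: Q @ x - b

x = rng.normal(size=n)
H = np.eye(n)                        # initial inverse-Hessian approximation
for k in range(n):
    g = grad(x)
    d = -H @ g                       # DFP search direction
    # Part 1: exact line search on a quadratic gives
    # alpha_k = -grad f(x^(k))^T d^(k) / (d^(k)T Q d^(k)).
    alpha = -(g @ d) / (d @ Q @ d)
    assert alpha > 0                 # part 2: alpha_k > 0 while grad f != 0
    x_new = x + alpha * d
    s = x_new - x                    # Delta x^(k)
    y = grad(x_new) - g              # Delta g^(k)
    # DFP rank-two update of the inverse-Hessian approximation.
    H = H + np.outer(s, s) / (s @ y) - (H @ np.outer(y, y) @ H) / (y @ H @ y)
    x = x_new

# With exact line search, DFP minimizes a quadratic in at most n steps.
print("gradient norm after n steps:", np.linalg.norm(grad(x)))
```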

Question 10 - 4 Marks
Newton’s method with fixed step size α = 1 can diverge if the initial point is
not close to the minimizer x∗ . In this problem we consider two examples.
1. f (x) = log(ex + e−x ) has a unique minimizer x∗ = 0. Run Newton’s
method with fixed step size α = 1, starting at x(0) = 1 and at x(0) = 1.1.
Plot f and f ′ , and show the first few iterates. [2 marks]
2. f (x) = − log x + x has a unique minimizer x∗ = 1. Run Newton’s method
with fixed step size α = 1, starting at x(0) = 3. Plot f and f ′ , and show
the first few iterates. [2 marks]
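The iterates (without the plots) can be reproduced in a few lines. For part 1, f′(x) = tanh(x) and f″(x) = 1 − tanh²(x); for part 2, the Newton update simplifies to x⁺ = x − f′(x)/f″(x) = 2x − x². The step count below is chosen only to show the first few iterates.

```python
import numpy as np

# Part 1: f(x) = log(e^x + e^-x), so f'(x) = tanh(x), f''(x) = 1 - tanh(x)^2.
def newton_iterates(x0, steps=4):
    """First few Newton iterates with fixed step size alpha = 1."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - np.tanh(x) / (1 - np.tanh(x) ** 2))
    return xs

conv = newton_iterates(1.0)    # converges toward x* = 0
div = newton_iterates(1.1)     # |x^(k)| grows: the method diverges
print("x0 = 1.0:", conv)
print("x0 = 1.1:", div)

# Part 2: f(x) = -log(x) + x, so f'(x) = 1 - 1/x and f''(x) = 1/x^2,
# giving the Newton update x+ = 2x - x^2.
x = 3.0
x = 2 * x - x ** 2
print("part 2 first iterate:", x)   # -3.0: the iterate leaves the domain x > 0
```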
