Written Assignment-2 Optimization Methods (Spring 2023) : Instructions
Instructions
1. Attempting all questions is mandatory.
2. Marks for each question are indicated alongside the question itself.
3. Plagiarism is strictly prohibited.
4. Only handwritten answer sheets written with pen on paper will be accepted. No iPad handwritten submissions will be accepted.
5. Students have to physically submit the hard copy to either the TAs or the
course instructor by the deadline.
Question 1 - 4 Marks
Suppose f is strongly convex with mI ⪯ ∇²f(x) ⪯ MI. Let d be a descent direction at x. Show that the backtracking stopping condition holds for
\[
0 < \alpha < -\frac{d^{T}\nabla f(x)}{M\|d\|^{2}}.
\]
Use this to give an upper bound on the number of backtracking iterations.
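For reference, a minimal Python sketch of a backtracking (Armijo) line search; the parameters alpha0, rho, and c, and the quadratic used in the example, are illustrative choices not given in the problem.

```python
import numpy as np

def backtracking(f, grad_f, x, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the backtracking stopping condition
    f(x + alpha*d) <= f(x) + c*alpha*grad_f(x)^T d holds."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ d          # d^T grad f(x), negative for a descent direction
    iters = 0
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho               # each rejected trial multiplies alpha by rho
        iters += 1
    return alpha, iters

# Example on a strongly convex quadratic (Hessian eigenvalues 1 and 10, so m = 1, M = 10)
Q = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ Q @ x
grad_f = lambda x: Q @ x
x = np.array([1.0, 1.0])
d = -grad_f(x)                     # steepest-descent direction
print(backtracking(f, grad_f, x, d))
```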
Question 2 - 4 marks
Let h : ℝ → ℝ and g : ℝⁿ → ℝ. Consider the composition f = h ∘ g : ℝⁿ → ℝ, defined by f(x) = h(g(x)). Show that:
1. f is convex if h is convex and non-decreasing, and g is convex. [2 marks]
2. f is convex if h is convex and non-increasing, and g is concave. [2 marks]
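As a purely numerical illustration of part 1 (not a proof), the sketch below checks midpoint convexity of f = h ∘ g for the hypothetical choice h(t) = eᵗ (convex and non-decreasing) and g(x) = xᵀx (convex).

```python
import numpy as np

rng = np.random.default_rng(0)

h = np.exp                     # convex and non-decreasing
g = lambda x: np.dot(x, x)     # convex
f = lambda x: h(g(x))          # composition f = h o g

# Midpoint-convexity check: f((x+y)/2) <= (f(x)+f(y))/2 on random pairs
ok = True
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    ok &= f((x + y) / 2) <= (f(x) + f(y)) / 2 + 1e-12
print("midpoint convexity held on all samples:", ok)
```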
Question 3 - 4 marks
Let D be an n × n symmetric positive definite matrix and let z ∈ ℝⁿ. Given a vector a ∈ ℝⁿ, let R be a rotation matrix such that RD^{1/2}a = ∥D^{1/2}a∥₂ e₁, where e₁ is the n-dimensional vector whose first element is 1 and whose other elements are zero. Let T(·) be the affine transformation defined by
\[
T(x) = R D^{-1/2}(x - z).
\]
Define
\[
\bar{z} = z + \frac{1}{n+1}\,\frac{Da}{\sqrt{a^{T} D a}},
\qquad
\bar{D} = \frac{n^{2}}{n^{2}-1}\left(D - \frac{2}{n+1}\,\frac{D a a^{T} D}{a^{T} D a}\right).
\]
Let
\[
E_0' = E\!\left(\frac{e_1}{n+1},\; \frac{n^{2}}{n^{2}-1}\left(I - \frac{2}{n+1}\, e_1 e_1^{T}\right)\right).
\]
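The update formulas for z̄ and D̄ above can be instantiated numerically; a small sketch, with D, z, and a generated at random purely for illustration:

```python
import numpy as np

n = 4
rng = np.random.default_rng(1)

A = rng.normal(size=(n, n))
D = A @ A.T + n * np.eye(n)        # symmetric positive definite
z = rng.normal(size=n)
a = rng.normal(size=n)

Da = D @ a
denom = a @ Da                     # a^T D a > 0 since D is positive definite

# Updated center and shape matrix from the formulas in the problem statement
z_bar = z + Da / ((n + 1) * np.sqrt(denom))
D_bar = (n**2 / (n**2 - 1)) * (D - (2 / (n + 1)) * np.outer(Da, Da) / denom)

print("D_bar symmetric:", np.allclose(D_bar, D_bar.T))
print("D_bar positive definite:", np.all(np.linalg.eigvalsh(D_bar) > 0))
```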
Question 4 - 4 marks
Consider the problem
\[
\min_{x_1,\, x_2} \; -x_2^{2}
\]
where x₁, x₂ ∈ ℝ.
1. Does the point [x₁ x₂]ᵀ = 0 satisfy the first-order necessary condition for a minimizer? That is, if f is the objective function, is it true that dᵀ∇f(0) ≥ 0 for all feasible directions d at 0? [2 marks]
Question 5 - 6 marks
Consider the Rosenbrock function.
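A minimal Python sketch, assuming the standard two-dimensional Rosenbrock function f(x₁, x₂) = 100(x₂ − x₁²)² + (1 − x₁)² (the usual choice of coefficients), together with its gradient:

```python
import numpy as np

def rosenbrock(x):
    """Standard 2-D Rosenbrock function (assumed form, coefficients 1 and 100)."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    """Gradient of the assumed Rosenbrock function."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

print(rosenbrock(np.array([1.0, 1.0])))        # 0.0 at the minimizer (1, 1)
print(rosenbrock_grad(np.array([1.0, 1.0])))   # [0, 0]
```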
Question 6 - 2 marks
Consider the Modified Newton's Algorithm
\[
x^{(k+1)} = x^{(k)} - \alpha_k \nabla^{2} f(x^{(k)})^{-1} \nabla f(x^{(k)}),
\]
where αk = arg min_{α ≥ 0} f(x(k) − α ∇²f(x(k))⁻¹ ∇f(x(k))). Suppose that we apply the algorithm to a quadratic function
\[
f(x) = \frac{1}{2} x^{T} Q x - b^{T} x,
\]
where Q is a symmetric positive definite matrix. Recall that the standard Newton's method reaches the point x∗ such that ∇f(x∗) = 0 in just one step starting from
any initial point x(0) . Does the above modified Newton’s algorithm possess the
same property? Justify your answer.
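A small numerical sketch of this algorithm on an illustrative quadratic (Q, b, and x(0) are random choices, not part of the problem); the step size α₀ is found by a one-dimensional numerical minimization, and the question still asks for an analytical justification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
n = 3
A = rng.normal(size=(n, n))
Q = A @ A.T + n * np.eye(n)        # symmetric positive definite
b = rng.normal(size=n)

f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b

x0 = rng.normal(size=n)
newton_dir = np.linalg.solve(Q, grad(x0))          # Hessian^{-1} gradient at x0

# alpha_0 = argmin over alpha >= 0 of f(x0 - alpha * newton_dir), found numerically
res = minimize_scalar(lambda a: f(x0 - a * newton_dir), bounds=(0.0, 10.0), method="bounded")
x1 = x0 - res.x * newton_dir

x_star = np.linalg.solve(Q, b)                     # exact minimizer of the quadratic
print("alpha_0 ≈", res.x)
print("distance of x1 from x*:", np.linalg.norm(x1 - x_star))
```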
Question 7 - 2 marks
Let f : ℝ² → ℝ. Consider the problem
\[
\min_{x_1, x_2} f(x_1, x_2) \quad \text{s.t. } x_1, x_2 \ge 0.
\]
Suppose that ∇f(0, 0) ≠ 0, and ∂f/∂x₁ ≤ 0, ∂f/∂x₂ ≤ 0. Show that [0 0]ᵀ cannot be a minimizer for this problem.
Question 8 - 2 marks
Consider a function f : Ω → ℝ, where Ω ⊂ ℝⁿ is a convex set and f ∈ C¹. Given x∗ ∈ Ω, suppose there exists c > 0 such that dᵀ∇f(x∗) ≥ c∥d∥ for all feasible directions d at x∗. Show that x∗ is a strict local minimizer of f over Ω.
Question 9 - 4 Marks
Consider the DFP algorithm applied to the quadratic function
\[
f(x) = \frac{1}{2} x^{T} Q x - b^{T} x,
\]
where Q is a symmetric and positive definite matrix.
1. Write down the formula for αk in terms of Q, ∇f(x(k)), and d(k). [2 marks]
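A sketch of DFP on an illustrative random quadratic (Q, b, and the starting point are arbitrary choices); the step size is found by a numerical line search rather than the closed-form expression part 1 asks for.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
n = 4
A = rng.normal(size=(n, n))
Q = A @ A.T + n * np.eye(n)            # symmetric positive definite
b = rng.normal(size=n)

f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b

x = rng.normal(size=n)
H = np.eye(n)                          # initial inverse-Hessian approximation

# With an exact line search, DFP terminates in at most n steps on a quadratic;
# the line search below is numerical, so expect the final gradient to be only
# approximately zero.
for k in range(n):
    g = grad(x)
    d = -H @ g                         # quasi-Newton search direction
    alpha = minimize_scalar(lambda a: f(x + a * d), bounds=(0.0, 10.0), method="bounded").x
    x_new = x + alpha * d
    dx = x_new - x                     # change in the iterate
    dg = grad(x_new) - g               # change in the gradient
    # DFP rank-two update of the inverse-Hessian approximation
    H = H + np.outer(dx, dx) / (dx @ dg) - (H @ np.outer(dg, dg) @ H) / (dg @ H @ dg)
    x = x_new

print("gradient norm after n DFP steps:", np.linalg.norm(grad(x)))
```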
Question 10 - 4 Marks
Newton’s method with fixed step size α = 1 can diverge if the initial point is
not close to the minimizer x∗ . In this problem we consider two examples.
1. f(x) = log(eˣ + e⁻ˣ) has a unique minimizer x∗ = 0. Run Newton's method with fixed step size α = 1, starting at x(0) = 1 and at x(0) = 1.1. Plot f and f′, and show the first few iterates. [2 marks]
2. f (x) = − log x + x has a unique minimizer x∗ = 1. Run Newton’s method
with fixed step size α = 1, starting at x(0) = 3. Plot f and f ′ , and show
the first few iterates. [2 marks]
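A sketch of how these runs might be set up, using the closed-form derivatives f′(x) = tanh x, f″(x) = 1/cosh²x for part 1 and f′(x) = 1 − 1/x, f″(x) = 1/x² for part 2; plotting is omitted.

```python
import numpy as np

def newton_iterates(fprime, fsecond, x0, steps=6):
    """Newton's method with fixed step size alpha = 1, returning the iterates."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - fprime(x) / fsecond(x))
    return xs

# Part 1: f(x) = log(e^x + e^{-x}),  f'(x) = tanh(x),  f''(x) = 1/cosh(x)^2
for x0 in (1.0, 1.1):
    print(x0, newton_iterates(np.tanh, lambda x: 1.0 / np.cosh(x) ** 2, x0))

# Part 2: f(x) = -log(x) + x,  f'(x) = 1 - 1/x,  f''(x) = 1/x^2
print(3.0, newton_iterates(lambda x: 1.0 - 1.0 / x, lambda x: 1.0 / x ** 2, 3.0, steps=3))
```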