ODS Model Quiz

The document is a quiz on optimization methods, organised into sections on Newton's Method, Steepest Descent, Directional Derivatives, and Quasi-Newton Methods. It includes multiple choice, true/false, and short answer questions, with a total of 20 marks available, and is designed to assess understanding of key concepts and applications in optimization techniques.


Optimization Methods Quiz

Time: 30 minutes

Total Marks: 20

Section A: Newton's Method

1. What is the primary advantage of Newton's method over gradient descent?

a. Lower computational cost per iteration
b. Quadratic convergence rate
c. No requirement for second derivatives
d. Guaranteed global convergence

2. True/False: Newton's method requires the Hessian matrix to be positive definite for minimization problems.

True False

3. Short Answer: Why might Newton's method fail to converge for non-convex functions?

(2 marks)

Section B: Steepest Descent

4. The steepest descent direction for minimizing f(x) is:

a. ∇f(x)
b. -∇f(x)
c. ∇²f(x)⁻¹∇f(x)
d. ∇f(x) × ∇²f(x)

5. True/False: The steepest descent method converges linearly for quadratic functions.

True False

6. Calculate the steepest descent direction for f(x,y) = x² + 3y² at (1, -1).

(2 marks)
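A hand computation like Question 6 can be sanity-checked numerically. The sketch below estimates the gradient with central differences and negates it to get the steepest descent direction; the helper names here are illustrative, not part of the quiz:

```python
# Numerical check for a steepest-descent direction: the direction of
# steepest descent at a point is the negative gradient, estimated here
# with central differences.

def grad(f, x, y, h=1e-6):
    """Central-difference estimate of (df/dx, df/dy) at (x, y)."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dfdx, dfdy

f = lambda x, y: x**2 + 3 * y**2   # the function from Question 6
gx, gy = grad(f, 1.0, -1.0)        # gradient at (1, -1)
descent = (-gx, -gy)               # steepest descent direction is -grad f
```

For a quadratic function the central difference is exact up to rounding, so `descent` matches the analytic answer to several decimal places.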

Section C: Directional Derivatives

7. The directional derivative of f(x,y) = xy at (2,3) in the direction u = (1/√2, 1/√2) is:

a. 5/√2
b. 3/√2
c. 2.5
d. 6
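Directional derivatives like the one in Question 7 can likewise be checked by stepping a small distance along the unit vector and differencing function values. This is a minimal sketch; the function name is an assumption for illustration:

```python
import math

# Numerical directional derivative:
# D_u f(p) ~ (f(p + h*u) - f(p - h*u)) / (2h), for a unit vector u.

def directional_derivative(f, p, u, h=1e-6):
    x, y = p
    ux, uy = u
    return (f(x + h * ux, y + h * uy) - f(x - h * ux, y - h * uy)) / (2 * h)

f = lambda x, y: x * y                        # the function from Question 7
u = (1 / math.sqrt(2), 1 / math.sqrt(2))      # unit direction
d = directional_derivative(f, (2.0, 3.0), u)  # compare with grad f . u
```

The analytic value is the dot product of the gradient at (2, 3) with u, which the numerical estimate reproduces to within rounding.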

8. True/False: The maximum directional derivative of a function equals the magnitude of its gradient.

True False

Section D: Quasi-Newton Methods

9. BFGS and DFP are examples of:

a. Line search methods
b. Hessian approximation methods
c. Stochastic optimization
d. Penalty function methods

10. Why are quasi-Newton methods preferred over Newton's method for large-scale problems?

(2 marks)

Section E: Application

11. For f(x) = x⁴ - 3x² + 2:

Perform one iteration of Newton's method starting at x₀ = 1.

(4 marks)
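For a one-dimensional problem like Question 11, a Newton iteration for minimisation is x_{k+1} = x_k - f'(x_k)/f''(x_k). The following sketch carries out one such step, with the derivatives written out by hand (the helper name `newton_step` is an assumption):

```python
# One Newton iteration for minimising a 1-D function:
# x_{k+1} = x_k - f'(x_k) / f''(x_k)

def newton_step(fp, fpp, x):
    """Single Newton update, given the first and second derivatives."""
    return x - fp(x) / fpp(x)

# f(x) = x**4 - 3*x**2 + 2, as in Question 11
fp  = lambda x: 4 * x**3 - 6 * x   # f'(x)
fpp = lambda x: 12 * x**2 - 6      # f''(x)

x1 = newton_step(fp, fpp, 1.0)     # one step from x0 = 1
```

Note that `fpp(1.0)` is positive, so the Hessian condition from Question 2 is satisfied at the starting point and the step is a descent step.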
