N.O. Study Guide

This document provides study materials for numerical optimization, including:
- Links to view or download notes on linear programming problems (LPP), the simplex method, and chapters 1, 2, and 9.
- Theory and numerical questions on topics such as constrained vs. unconstrained optimization, convexity, and linear and nonlinear optimization methods.
- Units covering linear and nonlinear programming and optimization algorithms such as gradient descent, Newton's method, and the conjugate gradient method.
- Links to techniques for approximating gradients, Hessians, and Jacobians.


Numerical Optimization

Ma’am Notes: Click Here to View or Download

Unit 1
Study Links

Chapter 1: Click Here to View or Download


Simplex notes: Click Here to View or Download

LPP (Graphically):
Linear Programming 1: Maximization - Extreme/Corner Points (YouTube)
Linear Programming 2: Graphical Solution - Minimization Problem (YouTube)

LPP (Graphically): Special Cases of Graphical Method

Simplex Method:
Intro to Simplex Method | Solve LP | Simplex Tableau (YouTube)
Operation Research | Two Phase Simplex Method | Linear Programming (YouTube)

Simplex Special Cases: Read Here

Theory Questions

1. Define an optimization problem.
2. Define the transportation problem.
3. Compare the following:
a) Continuous and Discrete Optimization
b) Constrained and Unconstrained Optimization
c) Local and Global Optimization
d) Stochastic and Deterministic Optimization
4. Define convexity (a Hessian-based check is sketched after this list).
5. What are the features of an optimization problem?
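
For question 4, one practical test: a twice-differentiable function is convex exactly when its Hessian is positive semidefinite everywhere. A minimal sketch (assuming NumPy is available), applied to the assumed example f(x, y) = 7x² + 6xy + 7y², whose Hessian is constant:

```python
# Convexity check via Hessian eigenvalues for f(x, y) = 7x^2 + 6xy + 7y^2.
# f_xx = 14, f_xy = f_yx = 6, f_yy = 14, so the Hessian is constant.
import numpy as np

H = np.array([[14.0, 6.0],
              [6.0, 14.0]])
print(np.linalg.eigvalsh(H))  # eigenvalues 8 and 20: both positive -> strictly convex
```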

Numerical Questions

1. Solve the following using the graphical method (a SciPy cross-check of part (a) appears after this list):
a) Maximize Z = 8x + y
Constraints are:
x + y <= 40
2x + y <= 60
x >= 0, y >= 0
b) Maximize Z = 50x + 15y
Constraints are:
5x + y <= 100
x + y <= 50
x >= 0, y >= 0
c) Minimize Z = 20x + 10y
Constraints are:
x + 2y <= 40
3x + y >= 30
4x + 3y >= 60
x >= 0, y >= 0
d) Maximize Z = 3x + 5y
Constraints are:
x + 5y <= 10
2x + 2y <= 5
x, y >= 0
2. Solve the linear programming problem (any three):
3. Solve the following using the simplex method:
4. Prove the convexity of the following:
5. Solve the following using the two-phase method:
Min Z = 4x₁ + 3x₂
Such that:
2x₁ + x₂ >= 10
-3x₁ + 2x₂ <= 6
x₁ + x₂ >= 6
x₁, x₂ >= 0
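
A minimal SciPy cross-check for question 1(a), not a replacement for the graphical working (assumes SciPy is installed; linprog minimizes, so the objective Z = 8x + y is negated):

```python
# Check LPP 1(a): maximize Z = 8x + y s.t. x + y <= 40, 2x + y <= 60, x, y >= 0.
from scipy.optimize import linprog

res = linprog(
    c=[-8, -1],                     # minimize -(8x + y)
    A_ub=[[1, 1], [2, 1]],          # coefficient rows of the two <= constraints
    b_ub=[40, 60],
    bounds=[(0, None), (0, None)],  # x >= 0, y >= 0
)
print(res.x, -res.fun)              # expect optimum Z = 240 at the corner (30, 0)
```

The result should match the best extreme point found graphically.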
Unit 2

Study Links

Chapter 2: Click Here to View or Download


Chapter 9: Click Here to View or Download

Theory Questions

1. What is unconstrained optimization? Give an example.
2. What is a non-linear least squares problem?
3. Define the following:
a) Local minimizer
b) Global minimizer
c) Strict local minimizer
d) Isolated local minimizer
4. State Taylor's theorem.
5. What are the necessary and sufficient conditions for a local minimum?
6. What is a stationary point?
7. State the condition for a local minimizer to be a global minimizer, and vice versa.
8. What are the characteristics of a non-smooth problem?
9. What is a unimodal problem? Give an example.
10. What is a solution set?
11. Define the following:
a) Descent Property
b) Quadratic Termination Property
c) Global Convergence
d) Order of Convergence
e) Unimodal min function
12. Explain the line search method for unimodal functions.
13. Explain the Golden Section method. Write its algorithm (a sketch follows this list).
14. Explain the Fibonacci search method. Write its algorithm.
15. Explain the relationship between the Golden Section and Fibonacci search methods.
16. Explain Steepest Descent and write its algorithm.
17. What are the advantages and disadvantages of the Steepest Descent method?
18. Explain Newton's method. Write its algorithm.
19. Explain the modified Newton's method.
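
For question 13, a minimal golden-section sketch (the stopping tolerance is an assumption); it also solves numerical question 2 below:

```python
# Golden-section search for a unimodal f on [a, b]: each step discards the
# subinterval that cannot contain the minimum, shrinking [a, b] by ~0.618.
import math

def golden_section(f, a, b, tol=1e-5):
    r = (math.sqrt(5) - 1) / 2            # golden ratio conjugate, ~0.618
    x1, x2 = b - r * (b - a), a + r * (b - a)
    while b - a > tol:
        if f(x1) < f(x2):                 # minimum lies in [a, x2]
            b, x2 = x2, x1
            x1 = b - r * (b - a)
        else:                             # minimum lies in [x1, b]
            a, x1 = x1, x2
            x2 = a + r * (b - a)
    return (a + b) / 2

print(golden_section(lambda x: 2 * x**2 + 5, 2, 10))  # question 2: expect ~2
```

Fibonacci search (question 14) is analogous but shrinks the interval using ratios of Fibonacci numbers for a fixed budget of n evaluations; as n grows those ratios tend to the golden ratio, which is the relationship asked for in question 15.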
Numerical Questions

1. Solve for all the critical points of the given function. Then, for each critical point, use the Hessian matrix to determine whether the critical point is a local minimum, a local maximum, or neither.
a) 7x² + 6xy + 2x + 7y² - 22y + 23
b) 2x² + 2xy + 2x + y² - 2y + 5
2. Find min 2x² + 5 over [2, 10] by the golden section rule. Take ε = 1.2.
3. Find min x² - 1 over [1, 17] by the golden section rule. Take ε = 1.7.
4. Find the min of x² in the interval [-5, 15] by taking n = 7 using the Fibonacci search method.
5. Find min 2x² + 5 over [2, 10] by Fibonacci search. Take n = 6.
6. Use the steepest descent method to minimize f(x₁,x₂) = 3x₁² - 4x₁x₂ + 2x₂² + 4x₁ + 6 over (x₁,x₂) ∈ ℝ² (see the sketch after this list).
7. Use Newton's method to minimize f(x₁,x₂) = 8x₁² - 4x₁x₂ + 5x₂², (x₁,x₂) ∈ ℝ².
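
A minimal steepest-descent sketch for question 6, using the exact line-search step available for quadratics; the starting point (0, 0) is an assumption:

```python
# Steepest descent on f(x1,x2) = 3x1^2 - 4x1x2 + 2x2^2 + 4x1 + 6,
# written as f = (1/2) x'Qx + b'x + c with gradient Qx + b.
import numpy as np

Q = np.array([[6.0, -4.0], [-4.0, 4.0]])  # Hessian of f
b = np.array([4.0, 0.0])

x = np.zeros(2)                            # assumed start (0, 0)
for _ in range(100):
    g = Q @ x + b                          # gradient at the current iterate
    if np.linalg.norm(g) < 1e-8:           # stop when nearly stationary
        break
    alpha = (g @ g) / (g @ (Q @ g))        # exact minimizing step along -g
    x = x - alpha * g
print(x)                                   # expect approximately (-2, -2)
```

Newton's method (question 7) applied to any strictly convex quadratic reaches the minimizer in a single step, x* = -Q⁻¹b, because the quadratic model it minimizes is the function itself.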

Unit 3
Study Links

Chapter 9: Click Here to View or Download

Theory Questions

1. Explain the conjugate gradient method. Write its algorithm.
2. What are its advantages and disadvantages?
3. Compare the Steepest Descent, Newton's, and Conjugate Gradient methods.

Numerical Questions

1. Use the conjugate gradient method to minimize f(x₁,x₂) = 3x₁² - 4x₁x₂ + 2x₂² + 4x₁ + 6, (x₁,x₂) ∈ ℝ² (a sketch follows this list).
2. Use the conjugate gradient method to minimize f(x₁,x₂) = 7x₁² + 3x₁x₂ + x₂² + 13, (x₁,x₂) ∈ ℝ².
3. Use the conjugate gradient method to minimize f(x₁,x₂) = x₁² - x₁x₂ + 5x₂² + 6x₁ + 2, (x₁,x₂) ∈ ℝ².
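
A minimal conjugate-gradient sketch for question 1 (Fletcher-Reeves form with exact line search; the starting point (0, 0) is an assumption). By the quadratic termination property, n = 2 steps suffice for these two-variable quadratics:

```python
# Conjugate gradient on f(x1,x2) = 3x1^2 - 4x1x2 + 2x2^2 + 4x1 + 6,
# the same quadratic as Unit 2's steepest-descent question.
import numpy as np

Q = np.array([[6.0, -4.0], [-4.0, 4.0]])  # Hessian of f
b = np.array([4.0, 0.0])                   # gradient is Qx + b

x = np.zeros(2)                            # assumed start (0, 0)
g = Q @ x + b
d = -g                                     # first direction: steepest descent
for _ in range(2):                         # at most n = 2 steps for a 2-variable quadratic
    alpha = -(g @ d) / (d @ (Q @ d))       # exact line search along d
    x = x + alpha * d
    g_new = Q @ x + b
    beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves coefficient
    d = -g_new + beta * d                  # new Q-conjugate direction
    g = g_new
print(x)                                   # expect exactly (-2, -2) after two steps
```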
Unit 4
Study Links

Chapter 8 (only 8.1): Click Here to View or Download


Class Notes: View Unit 4 Class Notes Here
Important links:
Gradient, Hessian and Jacobian
Gradient and Gradient Hessian Approximation

Theory Questions

1. Explain the method of approximating a gradient (a finite-difference sketch follows this list).
2. Explain the method of approximating a sparse Jacobian.
3. Explain the method of approximating a Hessian.
4. Explain the method of approximating a sparse Hessian.
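
For question 1, a minimal forward-difference sketch: each component of the gradient is approximated by (f(x + h·eᵢ) - f(x)) / h. The step size h ≈ √(machine epsilon) and the test point are assumptions; the test function is question 1 of the numericals below:

```python
# Forward-difference gradient approximation: one extra f-evaluation per coordinate.
import numpy as np

def approx_gradient(f, x, h=np.sqrt(np.finfo(float).eps)):
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h                      # perturb one coordinate at a time
        g[i] = (f(x + step) - fx) / h    # (f(x + h e_i) - f(x)) / h
    return g

f = lambda v: 5 * v[0]**2 + 3 * v[0] * v[1] + v[1]**3
print(approx_gradient(f, [1.0, 2.0]))    # exact gradient at (1, 2) is (16, 15)
```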

Numerical Questions

1. Compute the gradient and Hessian of the given function:

f(x,y) = 5x² + 3xy + y³

2. Compute the Jacobian of the given function:

f(x,y) = [sin x + y, x + cos y]

3. Construct a quadratic polynomial approximation for:

a) f(x) = xeˣ; x₀ = 0
b) f(x₁,x₂) = x₁e^(x₂); x₀ = (0,0)

4. Construct the Taylor series expansion for the function log(1+x) about the point x₀ = 0.
Unit 5
Study Links

Chapter 12 (only 12.1): Click Here to View or Download


Important links:
Lagrangian Multipliers - Khan Academy
Lagrangian Multiplier - Calcworkshop
Lagrangian Multiplier - tutorial.math.lamar.edu
Lagrangian Inequality Constraints pdf

Theory Questions

1. Compare constrained and unconstrained optimization.
2. State the optimality conditions for the unconstrained minimization problem.
3. Define:
a) Local Solution
b) Strict Local Solution
c) Isolated Local Solution
4. Compare Local and Global Solutions.
5. Why is smoothness of the objective function and constraints an important issue in characterizing solutions? Justify with the help of an example.
6. Define Active Set.

Numerical Questions

1. Find the maximum and minimum of f(x,y) = 5x − 3y subject to the constraint x² + y² = 136 (a symbolic check follows this list).
2. Find the maximum and minimum values of f(x,y,z) = xyz subject to the constraint x + y + z = 1. Assume that x, y, z ≥ 0.
3. Find the maximum and minimum values of f(x,y) = 4x² + 10y² on the disk x² + y² ≤ 4.
4. Find the dimensions of the box with the largest volume if the total surface area is 64 cm².
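
For question 1, a minimal SymPy sketch of the Lagrange-multiplier conditions ∇f = λ∇g together with the constraint g = 0 (assumes SymPy is installed):

```python
# Lagrange multipliers for max/min of f = 5x - 3y on the circle x^2 + y^2 = 136.
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = 5 * x - 3 * y
g = x**2 + y**2 - 136                      # constraint g(x, y) = 0

sols = sp.solve([sp.Eq(sp.diff(f, x), lam * sp.diff(g, x)),
                 sp.Eq(sp.diff(f, y), lam * sp.diff(g, y)),
                 sp.Eq(g, 0)],
                [x, y, lam], dict=True)
for s in sols:
    print(s[x], s[y], f.subs(s))           # candidates and their objective values
```

Solving by hand gives the same candidates: 5 = 2λx and -3 = 2λy force x = 5/(2λ), y = -3/(2λ); the constraint then gives λ = ±1/4, so the maximum is 68 at (10, -6) and the minimum is -68 at (-10, 6).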
