
NON-LINEAR PROGRAMMING PROBLEM (NLP)
Contents

• Classical Optimization Theories (calculus-based)
  – Unconstrained optimization
    • Multi-variable problems
  – Constrained optimization
    • With equality type constraints
    • With inequality type constraints

Multi-variable unconstrained problems

Necessary condition for a stationary point:

If f(x1, x2, ..., xn), or f(X), has an extreme point (maximum or minimum) at X = x* and if the first-order partial derivatives of f(X) exist at x*, then

∂f/∂xi = 0 at x = x*, for every i = 1, 2, ..., n.
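As a quick illustration of this condition (the function here is a made-up example, not one from these slides): for f(x1, x2) = x1² + x2² - 2x1, the conditions ∂f/∂x1 = 2x1 - 2 = 0 and ∂f/∂x2 = 2x2 = 0 give the single stationary point x* = (1, 0).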
Multi-variable unconstrained problems

Sufficient condition for optimality:

1. At an extreme point, if the matrix of second partial derivatives (the Hessian matrix) is positive definite, the point is a local minimum.

2. If the Hessian matrix is negative definite, the point is a local maximum.

3. If the Hessian matrix is indefinite (neither positive nor negative definite), the point is a saddle point.
Hessian Matrix:

The Hessian H of f(x1, x2, ..., xn) is the n x n matrix of second partial derivatives, whose (i, j) entry is ∂²f/∂xi∂xj:

H = [ ∂²f/∂xi∂xj ],  i, j = 1, 2, ..., n.
Terminology

Positive definite: the leading principal minor determinants of H are all greater than zero (H1 > 0, H2 > 0, ..., Hn > 0).

Negative definite: the leading principal minor determinants of H alternate in sign starting with a negative value (H1 < 0, H2 > 0, H3 < 0, ...), i.e., the k-th minor has the sign of (-1)^k.
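A short Python sketch of this leading-principal-minor (Sylvester) test. The helper name classify_by_principal_minors and the 2 x 2 matrix at the bottom are only illustrative choices, not values taken from the example that follows.

import numpy as np

def classify_by_principal_minors(H):
    # k-th leading principal minor = determinant of the top-left k x k block of H
    n = H.shape[0]
    minors = [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]
    if all(m > 0 for m in minors):
        return minors, "positive definite"
    if all((m < 0) if k % 2 == 1 else (m > 0) for k, m in enumerate(minors, start=1)):
        return minors, "negative definite"
    return minors, "indefinite (or test inconclusive)"

# Illustrative 2 x 2 Hessian value
H = np.array([[6.0, 0.0],
              [0.0, 6.0]])
print(classify_by_principal_minors(H))   # minors ~ [6, 36] -> positive definite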


Example-1
Find the extreme points and their nature for the following function.

Necessary condition for stationary point:

Thus, the extreme points are:

(0,0), (0, -2/3), (-2,0) and (-2, -2/3)


Sufficient condition for optimality:

• Principal minors are:


Extreme point x | H1 | H2  | Nature of H       | Nature of extreme point | f(x)
(0, 0)          |  6 |  36 | Positive definite | Local minimum           | 24
(0, -2/3)       |  6 | -36 | Indefinite        | Saddle point            | 76/9
(-2, 0)         | -6 | -36 | Indefinite        | Saddle point            | 28
(-2, -2/3)      | -6 |  36 | Negative definite | Local maximum           | 76/9
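The whole workflow of Example-1 can also be scripted with sympy. Since the original function did not survive extraction here, the sketch below uses a stand-in cubic, f = x1³ + x2³ + 3x1² + x2², chosen only because it has the same four stationary points; it is an assumption, not necessarily the slide's function, and its H2 and f(x) values differ from the table above.

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
# Stand-in objective (assumption, not the slide's function)
f = x1**3 + x2**3 + 3*x1**2 + x2**2

grad = [sp.diff(f, v) for v in (x1, x2)]
points = sp.solve(grad, [x1, x2], dict=True)   # necessary condition: gradient = 0
H = sp.hessian(f, (x1, x2))

for p in points:
    Hp = H.subs(p)
    h1, h2 = Hp[0, 0], Hp.det()                # leading principal minors H1, H2
    if h1 > 0 and h2 > 0:
        nature = "local minimum"
    elif h1 < 0 and h2 > 0:
        nature = "local maximum"
    else:
        nature = "saddle point (or inconclusive)"
    print(p, h1, h2, nature)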
Constrained Optimization – Equality constraints

⮚ Method of Direct Substitution
• Transforms the constrained optimization problem into an unconstrained one by substitution.

⮚ Lagrangian Multiplier Technique
• Transforms the constrained optimization problem into an unconstrained one after introducing a Lagrangian multiplier.
Method of Direct Substitution
• Transforms the constrained optimization problem into an unconstrained one by substitution.

Minimize f(x)

Subject to: gj(x) = bj,  j = 1, 2, 3, ..., m

where x = {x1, x2, x3, ..., xn}T,

i.e., 'n' variables and 'm' equality constraints (with m < n).


Method of Direct Substitution
Step-1
• Using the 'm' constraint relations, eliminate 'm' of the variables so that the original objective function, involving 'n' variables, is re-expressed in terms of only 'n-m' variables.

Step-2
• The original optimization model thus takes an equivalent form: an unconstrained problem in 'n-m' variables (a small illustration follows).
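A tiny made-up illustration of the substitution idea (this is not the example on the next slide): to minimize f = x1² + x2² subject to x1 + x2 = 4, substitute x2 = 4 - x1 to get the unconstrained function F(x1) = x1² + (4 - x1)². Setting dF/dx1 = 4x1 - 8 = 0 gives x1 = 2, hence x2 = 2 and f = 8.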
Example - Method of Direct Substitution
Minimize:

Where,

Therefore,
Example - Method of Direct Substitution

Necessary condition for stationary point:

Possible extreme points:

(2, 2, 4)
Sufficient condition for optimality:

• Principal minors are:

• Hence, (2, 2, 4) is a point of minimum, and the minimum value of the objective function is 24.
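For completeness, here is how the two direct-substitution steps can be carried out with sympy. The problem used is the same made-up one from the illustration above (minimize x1² + x2² subject to x1 + x2 = 4), not the slide's three-variable example.

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + x2**2                              # made-up objective
constraint = sp.Eq(x1 + x2, 4)                 # made-up equality constraint

# Step-1: use the constraint to eliminate x2
x2_expr = sp.solve(constraint, x2)[0]          # x2 = 4 - x1
F = f.subs(x2, x2_expr)                        # unconstrained function of x1 only

# Step-2: unconstrained optimality conditions on F
x1_star = sp.solve(sp.diff(F, x1), x1)[0]      # dF/dx1 = 0  ->  x1 = 2
x2_star = x2_expr.subs(x1, x1_star)
print(x1_star, x2_star, F.subs(x1, x1_star))   # 2 2 8
print(sp.diff(F, x1, 2))                       # 4 > 0, so (2, 2) is a minimum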
Method of Lagrangian Multiplier

Minimize f(x1, x2)

Subject to: g(x1, x2) = b;  x1, x2 > 0

• A constrained problem is converted into an unconstrained problem with the help of certain unspecified parameters known as Lagrangian multipliers.

• Converted to:

Minimize L(x1, x2, λ) = f(x1, x2) + λ [g(x1, x2) - b]

where L(x1, x2, λ) is the Lagrangian function and λ is an unspecified positive or negative constant called the Lagrangian multiplier.

• The new objective is to find the appropriate value of λ that minimizes L(x1, x2, λ).


Method of Lagrangian Multiplier

• This is done by treating λ as a variable, finding the unconstrained minimum of L(x1, x2, λ), and adjusting λ so that g(x1, x2) = b is satisfied.

Necessary condition for a stationary point:

At the extreme point,

∂L/∂x1 = ∂f/∂x1 + λ ∂g/∂x1 = 0
∂L/∂x2 = ∂f/∂x2 + λ ∂g/∂x2 = 0
∂L/∂λ = g(x1, x2) - b = 0
Method of Lagrangian Multiplier
Step-1
• Take the partial derivatives of L(x1, x2, λ) w.r.t. each xi and λ, and set them equal to zero.
Method of Lagrangian Multiplier
Note
• If there are n variables (i.e., x1, x2, ..., xn), you will get (n+1) equations in (n+1) unknowns (the n variables and one Lagrangian multiplier, λ).

Step-2
Express all xi's in terms of λ.
Step-3
Substitute the xi's (in terms of λ) into the constraint g(x1, x2) = b and solve for λ.
Step-4
Calculate all xi's using the value of λ obtained. (A scripted sketch of these steps follows.)
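A minimal sympy sketch of Steps 1-4. The problem is made up for illustration (minimize x1² + x2² subject to x1 + 2x2 = 5); it is not either of the examples that follow.

import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lambda', real=True)
f = x1**2 + x2**2                                      # made-up objective
g, b = x1 + 2*x2, 5                                    # made-up constraint g(x) = b

L = f + lam * (g - b)                                  # Lagrangian function

# Step-1: dL/dx1 = 0 and dL/dx2 = 0, solved simultaneously
sol_x = sp.solve([sp.diff(L, x1), sp.diff(L, x2)], [x1, x2])   # Step-2: xi in terms of lambda
# Step-3: plug into the constraint and solve for lambda
lam_star = sp.solve(sp.Eq(g.subs(sol_x), b), lam)[0]
# Step-4: recover the xi's from the value of lambda
x_star = {v: e.subs(lam, lam_star) for v, e in sol_x.items()}
print(lam_star, x_star, f.subs(x_star))                # -2 {x1: 1, x2: 2} 5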
Example 1 - Method of Lagrangian Multiplier
Minimize:

Subject to the constraint (STC),

Necessary condition,
Example 1- Method of Lagrangian Multiplier
Solving the equations, we get:

At the optimal point,

Continued after the sufficient condition discussion.


Example 2 - Method of Lagrangian Multiplier
Minimize:

STC,

Necessary condition,
Example 2- Method of Lagrangian Multiplier
Solving the equations, we get:

At the optimal point,

H is positive definite. Hence 'L' is minimum at this point.
