Kuhn-Tucker Optimality Conditions

Non-Linear Programming
Constrained optimization (Equality and Inequality constraints)
Constrained Non-linear Programming – General case

• The general non-linear programming problem (NLP), considering both equality and inequality constraints, is:

  Minimize f(X)
  subject to g_j(X) ≤ 0,  j = 1, 2, …, m
             h_k(X) = 0,  k = 1, 2, …, p

  where X = (x1, x2, …, xn)^T is the vector of design variables.

• First we consider the non-linear programming problem (NLP) with only inequality constraints:

  Minimize f(X)
  subject to g_j(X) ≤ 0,  j = 1, 2, …, m

  where X = (x1, x2, …, xn)^T.
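As a minimal numerical illustration (not part of the original slides), the sketch below solves a small problem of this form with SciPy's SLSQP solver; the objective, constraints, and starting point are made-up examples.

```python
# Minimal sketch (hypothetical data): minimize f(X) subject to one
# inequality constraint g(X) <= 0 and one equality constraint h(X) = 0.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2   # objective f(X)
g = lambda x: x[0]**2 + x[1]**2 - 2.0             # g(X) <= 0
h = lambda x: x[0] - 2.0 * x[1]                   # h(X) = 0

# SciPy's 'ineq' convention is fun(x) >= 0, so pass -g for g(X) <= 0.
constraints = [{"type": "ineq", "fun": lambda x: -g(x)},
               {"type": "eq",   "fun": h}]

res = minimize(f, x0=np.array([0.5, 0.5]), method="SLSQP",
               constraints=constraints)
print(res.x, res.fun)   # candidate optimum and objective value
```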
Optimality conditions (Inequality constraints)

Necessary conditions of optimality (specific case - inequality constraints only):

The inequality constraints can be transformed to equality constraints by adding non-negative slack variables y_j^2. The problem becomes

  Minimize f(X)
  subject to G_j(X, Y) = g_j(X) + y_j^2 = 0,  j = 1, 2, …, m,  where Y = (y1, y2, …, ym)^T.

This problem can be easily solved by the method of Lagrange multipliers. We can construct the Lagrange function as

  L(X, Y, λ) = f(X) + Σ_{j=1..m} λ_j G_j(X, Y),

where λ = (λ1, λ2, …, λm)^T is the vector of Lagrange multipliers. The optimality conditions can be written as

  ∂L/∂x_i = ∂f/∂x_i + Σ_{j=1..m} λ_j ∂g_j/∂x_i = 0,  i = 1, 2, …, n   ..(1)
  ∂L/∂λ_j = G_j = g_j(X) + y_j^2 = 0,  j = 1, 2, …, m                 ..(2)
  ∂L/∂y_j = 2 λ_j y_j = 0,  j = 1, 2, …, m                            ..(3)

Eqn (3) implies that, for each j, either λ_j = 0 or y_j = 0:

• If λ_j = 0 and y_j^2 > 0, the jth constraint is inactive and hence can be ignored. In other words, the point X is within the feasible region of g_j, but not on the boundary of the constraint (i.e. g_j(X) < 0).

• If λ_j ≠ 0 and y_j = 0, the jth constraint is active. In other words, the point X is on the boundary of the constraint (i.e. g_j(X) = 0).

Consider the division of the constraints into two subsets, J1 and J2:
- Set J1 indicates the indices of those constraints that are active.
- Set J2 indicates the indices of those constraints that are inactive.
- Together, J1 and J2 cover the total set of m constraints.

Then Eqn (1) can be written as

  ∂f/∂x_i + Σ_{j∈J1} λ_j ∂g_j/∂x_i = 0,  i = 1, 2, …, n

or  ∇f = - Σ_{j∈J1} λ_j ∇g_j

or, assuming the first p constraints are active,

  ∇f = - λ1 ∇g1 - λ2 ∇g2 - … - λp ∇gp.

The above equation indicates that the gradient of the objective function can be expressed as a linear combination of the gradients of the active constraints at the optimum point.
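To make the "linear combination of active-constraint gradients" statement concrete, here is a small numerical check on a made-up problem (not from the slides): at the assumed optimum the single constraint is active and ∇f + λ∇g vanishes for a non-negative λ.

```python
# Hypothetical problem: f(x) = (x1-1.5)^2 + (x2-1.5)^2, g(x) = x1 + x2 - 2 <= 0.
# The constrained minimum is X* = (1, 1), where g is active (g(X*) = 0).
import numpy as np

def grad_f(x):           # gradient of the objective
    return np.array([2 * (x[0] - 1.5), 2 * (x[1] - 1.5)])

def grad_g(x):           # gradient of the (active) constraint
    return np.array([1.0, 1.0])

x_star = np.array([1.0, 1.0])
lam = 1.0                # Lagrange multiplier of the active constraint

# At the optimum, grad f = -lambda * grad g with lambda >= 0.
print(grad_f(x_star) + lam * grad_g(x_star))   # -> [0. 0.]
```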
Optimality conditions (Inequality constraints)

[Figure: geometric interpretation at a constrained optimum. Contours of the objective function with values c1 > c2 > c3 are shown together with the gradient ∇f and the boundaries of the active constraints.]
Optimality conditions (Inequality constraints)

Kuhn-Tucker necessary conditions of optimality:

At a local minimum point X* (minimization case):

  ∂f/∂x_i + Σ_{j=1..m} λ_j ∂g_j/∂x_i = 0,  i = 1, 2, …, n
  λ_j g_j(X*) = 0,  j = 1, 2, …, m
  g_j(X*) ≤ 0,  j = 1, 2, …, m
  λ_j ≥ 0,  j = 1, 2, …, m

If X* satisfies all of these conditions, it is a candidate for a local minimum.

Kuhn-Tucker sufficient conditions of optimality:

If, in addition, the Hessian of the Lagrange function

  ∇²L(X*) = ∇²f(X*) + Σ_{j=1..m} λ_j ∇²g_j(X*),  where  L(X, λ) = f(X) + Σ_{j=1..m} λ_j g_j(X),

is positive definite, then X* is a local minimum.
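As an illustration (not part of the slides), a small Python routine can check these four conditions at a candidate point; the problem data and tolerance below are made-up for the sketch.

```python
# Check the Kuhn-Tucker necessary conditions (inequality case) at a candidate point.
# Hypothetical data: the same made-up problem used in the earlier sketch.
import numpy as np

def kkt_check(grad_f, gs, grad_gs, x, lams, tol=1e-8):
    """Return True if (x, lams) satisfies the KT necessary conditions."""
    stationarity  = grad_f(x) + sum(l * dg(x) for l, dg in zip(lams, grad_gs))
    feasible      = all(g(x) <= tol for g in gs)                 # g_j(x) <= 0
    complementary = all(abs(l * g(x)) <= tol for l, g in zip(lams, gs))
    nonnegative   = all(l >= -tol for l in lams)                 # lambda_j >= 0
    return bool(np.all(np.abs(stationarity) <= tol)) and feasible and complementary and nonnegative

grad_f  = lambda x: np.array([2 * (x[0] - 1.5), 2 * (x[1] - 1.5)])
gs      = [lambda x: x[0] + x[1] - 2.0]                          # g_1(x) <= 0
grad_gs = [lambda x: np.array([1.0, 1.0])]

print(kkt_check(grad_f, gs, grad_gs, x=np.array([1.0, 1.0]), lams=[1.0]))  # True
```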
Optimality conditions (Equality & Inequality constraints)

For the general problem (Minimize f(X) subject to g_j(X) ≤ 0, j = 1, …, m, and h_k(X) = 0, k = 1, …, p, where X = (x1, x2, …, xn)^T):

Kuhn-Tucker necessary conditions of optimality:

At a local minimum point X* (minimization case):

  ∂f/∂x_i + Σ_{j=1..m} λ_j ∂g_j/∂x_i + Σ_{k=1..p} β_k ∂h_k/∂x_i = 0,  i = 1, 2, …, n
  λ_j g_j(X*) = 0,  g_j(X*) ≤ 0,  λ_j ≥ 0,  j = 1, 2, …, m
  h_k(X*) = 0,  k = 1, 2, …, p  (the multipliers β_k are unrestricted in sign)

If X* satisfies all of these conditions, it is a candidate for a local minimum.

Kuhn-Tucker sufficient conditions of optimality:

If, in addition, the Hessian of the Lagrange function

  ∇²L(X*) = ∇²f(X*) + Σ_{j=1..m} λ_j ∇²g_j(X*) + Σ_{k=1..p} β_k ∇²h_k(X*),
  where  L(X, λ, β) = f(X) + Σ_{j=1..m} λ_j g_j(X) + Σ_{k=1..p} β_k h_k(X),

is positive definite, then X* is a local minimum.
Illustrative example-1

Solution:
  f(x1, x2) = …,  g1(x1, x2) = …,  g2(x1, x2) = …
  [the required partial derivatives of f, g1 and g2 follow from the problem data]
  ∂f/∂x1 + λ1 ∂g1/∂x1 + λ2 ∂g2/∂x1 = 0
  ∂f/∂x2 + λ1 ∂g1/∂x2 + λ2 ∂g2/∂x2 = 0

Kuhn-Tucker (KT) conditions:
  g1(X) ≤ 0                                  …(1)
  g2(X) ≤ 0                                  …(2)
  λ1 g1(X) = 0                               …(3)
  λ2 g2(X) = 0                               …(4)
  ∂f/∂x1 + λ1 ∂g1/∂x1 + λ2 ∂g2/∂x1 = 0       …(5)
  ∂f/∂x2 + λ1 ∂g1/∂x2 + λ2 ∂g2/∂x2 = 0       …(6)
  λ1 ≥ 0, λ2 ≥ 0                             …(7)

Case 1: λ1 = 0, λ2 = 0
  Solving equations (5) and (6) we get (x1, x2) = (…, …), which does not satisfy constraint (2); not a feasible point.

Case 2: λ1 = 0, λ2 ≠ 0
  Due to constraint (4) and λ2 ≠ 0, we get g2(X) = 0. Solving equations (5) and (6) we get (x1, x2) = (…, …) and λ2 = …. Since all the KT conditions are satisfied, (x1, x2) = (…, …) is a candidate for a local minimum. The Hessian of the Lagrange function, ∇²L = …, is positive definite, so (x1, x2) = (…, …) is a local minimum.

Case 3: λ1 ≠ 0, λ2 = 0
  Due to constraint (3) and λ1 ≠ 0, we get g1(X) = 0. Solving equations (5) and (6) we get (x1, x2) = (…, …) and λ1 = …, which violates the case-3 condition (λ1 > 0).

Case 4: λ1 ≠ 0, λ2 ≠ 0
  Due to constraints (3) and (4), with λ1 ≠ 0 and λ2 ≠ 0, we get (x1, x2) = (…, …). Solving equations (5) and (6) we get λ1 = … and λ2 = …, which violate the case-4 conditions (λ1 > 0, λ2 > 0).
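The case-by-case procedure used in this example can be automated symbolically. The sketch below uses a made-up objective and constraints (the slide's numerical data is not reproduced here) and enumerates the active-set cases exactly as above.

```python
# Hypothetical 2-variable, 2-inequality-constraint problem solved by
# enumerating the active-set cases of the KT conditions with SymPy.
import itertools
import sympy as sp

x1, x2, l1, l2 = sp.symbols("x1 x2 lambda1 lambda2", real=True)
f    = (x1 - 2)**2 + (x2 - 2)**2          # made-up objective
gs   = [x1 + x2 - 2, -x1]                 # made-up constraints g_j <= 0
lams = [l1, l2]

for active in itertools.product([False, True], repeat=len(gs)):
    # Stationarity: grad f + sum(lambda_j * grad g_j) = 0
    L = f + sum(l * g for l, g in zip(lams, gs))
    eqs = [sp.diff(L, v) for v in (x1, x2)]
    # Active constraints: g_j = 0; inactive constraints: lambda_j = 0
    eqs += [g if a else l for a, g, l in zip(active, gs, lams)]
    for sol in sp.solve(eqs, [x1, x2, l1, l2], dict=True):
        feasible = all(g.subs(sol) <= 0 for g in gs)     # primal feasibility
        dual_ok  = all(sol[l] >= 0 for l in lams)        # lambda_j >= 0
        if feasible and dual_ok:
            print("KT point:", sol)
```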
Illustrative example-2

Minimize f(x1, x2, x3) = …
Subject to g1(X) ≤ 0; g2(X) ≤ 0; g3(X) ≤ 0

Solution:
  f(X) = …,  g1(X) = …,  g2(X) = …,  g3(X) = …
  [the required partial derivatives follow from the problem data]
  ∂f/∂xi + λ1 ∂g1/∂xi + λ2 ∂g2/∂xi + λ3 ∂g3/∂xi = 0,  i = 1, 2, 3

Kuhn-Tucker (KT) conditions:
  g1(X) ≤ 0                                              …(1)
  g2(X) ≤ 0                                              …(2)
  g3(X) ≤ 0                                              …(3)
  λ1 g1(X) = 0                                           …(4)
  λ2 g2(X) = 0                                           …(5)
  λ3 g3(X) = 0                                           …(6)
  ∂f/∂x1 + λ1 ∂g1/∂x1 + λ2 ∂g2/∂x1 + λ3 ∂g3/∂x1 = 0      …(7)
  ∂f/∂x2 + λ1 ∂g1/∂x2 + λ2 ∂g2/∂x2 + λ3 ∂g3/∂x2 = 0      …(8)
  ∂f/∂x3 + λ1 ∂g1/∂x3 + λ2 ∂g2/∂x3 + λ3 ∂g3/∂x3 = 0      …(9)
  λ1 ≥ 0, λ2 ≥ 0, λ3 ≥ 0                                 …(10)

Case 1: λ1 = 0, λ2 = 0, λ3 = 0
  Solving equations (7), (8) and (9) we get (x1, x2, x3) = (0, 0, 0), which does not satisfy constraints (1) and (2); not a feasible point.

Case 2: λ1 ≠ 0, λ2 ≠ 0, λ3 = 0
  Solving the equations g1(X) = 0 and g2(X) = 0, we get (x1, x2, x3) = (…, …, 50). Substituting in equations (7), (8) and (9) and solving, we get λ1 = …, λ2 = …, which satisfies all the KT conditions, so (x1, x2, x3) = (…, …, 50) is a candidate for a local minimum. The Hessian of the Lagrange function, ∇²L = …, is positive definite, so (x1, x2, x3) = (…, …, 50) is a local minimum.

Case 3: λ1 = 0, λ2 = 0, λ3 ≠ 0
  Solving equations (7), (8), (9) and g3(X) = 0 (from eqn (6), since λ3 ≠ 0), we get (x1, x2, x3) = (…, …, 60); constraint (1) is not satisfied, so (x1, x2, x3) = (…, …, 60) is an infeasible point.

Case 4: λ1 = 0, λ2 ≠ 0, λ3 = 0
  Solving equations (7), (8), (9) and g2(X) = 0 (from eqn (5), since λ2 ≠ 0), we get (x1, x2, x3) = (…, …, 0), which does not satisfy constraint (1); not a feasible point.

Case 5: λ1 ≠ 0, λ2 = 0, λ3 = 0
  Solving equations (7), (8), (9) and g1(X) = 0 (from eqn (4), since λ1 ≠ 0), we get (x1, x2, x3) = (…, …, 0), which does not satisfy constraint (2); not a feasible point.
Illustrative example-2 (continued)

Case 6: λ1 = 0, λ2 ≠ 0, λ3 ≠ 0
  Solving the equations g2(X) = 0 and g3(X) = 0, we get … = 50. Solving equations (7), (8) and (9) we get (x1, x2, x3) = (…, …, 50) with λ2 = …, λ3 = …, which does not satisfy constraint (1); not a feasible point.

Case 7: λ1 ≠ 0, λ2 = 0, λ3 ≠ 0
  Solving the equations g1(X) = 0 and g3(X) = 0, we get … = 100. Solving equations (7), (8) and (9) we get (x1, x2, x3) = (…, …, 55) with λ1 = …, λ3 = …, which does not satisfy constraint (2); not a feasible point.

Case 8: λ1 ≠ 0, λ2 ≠ 0, λ3 ≠ 0
  Solving the active-constraint equations, we get … = 50. From equation (9) we get … = 0, which does not satisfy constraint (3); not a feasible point.

Conclusion: the only feasible local minimum for the problem is (x1, x2, x3) = (…, …, 50), with Lagrange multipliers λ1 = …, λ2 = …, λ3 = ….
Illustrative example-3

Minimize f(x1, x2) = … subject to h(x1, x2) = 0 and g(x1, x2) ≤ 0.
(Here λ denotes the multiplier of the equality constraint and μ the multiplier of the inequality constraint.)

Solution:
  f(X) = …,  h(X) = …,  g(X) = …
  [the required partial derivatives follow from the problem data]
  ∂f/∂x1 + λ ∂h/∂x1 + μ ∂g/∂x1 = 0
  ∂f/∂x2 + λ ∂h/∂x2 + μ ∂g/∂x2 = 0

Kuhn-Tucker (KT) conditions:
  h(X) = 0                                   …(1)
  g(X) ≤ 0                                   …(2)
  μ g(X) = 0                                 …(3)
  μ ≥ 0                                      …(4)
  ∂f/∂x1 + λ ∂h/∂x1 + μ ∂g/∂x1 = 0           …(5)
  ∂f/∂x2 + λ ∂h/∂x2 + μ ∂g/∂x2 = 0           …(6)

Case 1: μ = 0
  From equation (6) we get λ = …. Solving equations (5) and (6) we get x1 = …, x2 = …. Substituting in equation (1) we get …. Substituting x1 and x2 in constraint (2) shows that it is satisfied, so μ = 0 satisfies the KT conditions and gives a feasible point. The Hessian of the Lagrange function, ∇²L = …, is positive definite, so (x1, x2) = (…, …) is a local minimizer.

Case 2: μ ≠ 0
  Due to constraint (3) and μ ≠ 0, we get g(X) = 0, which, when considered and solved along with eqn (1), gives x1 = …, x2 = …; this provides μ = 0, which violates the case-2 condition (μ ≠ 0).
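For a problem with one equality and one inequality constraint, the same two-case reasoning (μ = 0 versus g active) can be scripted. The sketch below uses made-up functions, since the slide's problem data is not reproduced here.

```python
# Hypothetical problem with one equality constraint h(X) = 0 (multiplier lam)
# and one inequality constraint g(X) <= 0 (multiplier mu), solved by the
# two KT cases: mu = 0, or g(X) = 0 with mu >= 0.
import sympy as sp

x1, x2, lam, mu = sp.symbols("x1 x2 lambda mu", real=True)
f = x1**2 + x2**2                     # made-up objective
h = x1 + x2 - 2                       # made-up equality constraint, h = 0
g = sp.Rational(1, 2) - x1            # made-up inequality constraint, g <= 0

L = f + lam * h + mu * g
stationarity = [sp.diff(L, x1), sp.diff(L, x2)]

for case, extra in (("mu = 0", [mu]), ("g active", [g])):
    for sol in sp.solve(stationarity + [h] + extra, [x1, x2, lam, mu], dict=True):
        if g.subs(sol) <= 0 and sol[mu] >= 0:     # feasibility and mu >= 0
            print(case, "-> KT point:", sol)
```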
Next session…
• Optimality conditions alone can be used to solve only simple problems
• Algorithms for constrained NLP involving equality and inequality constraints will be discussed
