
Lagrangian and KKT Examples

Dr. Dibyajyoti Guha


February 5, 2025

1 Lagrange Method
Constrained optimization with equality constraints.
Example 1: Max $Z = x_1^2 + 3x_2^2 + 5x_3^2$
s.t. $x_1 + x_2 + 3x_3 = 4$,
$x_1, x_2, x_3 \ge 0$.
Example 2: Max $z = x_1^2 + 3x_2^2 + x_1 - x_2 + 17$
s.t. $x_1 + x_2 = 4$,
$x_1, x_2 \ge 0$.
General NLPP with an equality constraint:
max/min $z = f(x)$
s.t. $g(x) = 0$, $x \ge 0$.
We assume that $f(x)$ and $g(x)$ are differentiable w.r.t. $x$.
Necessary condition: form a new function to be optimized without constraints,
$L(x, \lambda) = f(x) + \lambda g(x).$
The function $L(x, \lambda)$ is called the Lagrangian function and $\lambda$ the Lagrange multiplier.
Necessary conditions: $\frac{\partial L}{\partial x} = 0$ and $\frac{\partial L}{\partial \lambda} = 0$.
Solving these equations gives a stationary point $(x^*, \lambda^*)$.
Example: max $z = x_1^2 + 3x_2^2 + 5x_3^2$
s.t. $x_1 + x_2 + 3x_3 = 4$, $\forall x_i \ge 0$.
Sol: Define the Lagrangian $L$ as
$L = x_1^2 + 3x_2^2 + 5x_3^2 + \lambda(x_1 + x_2 + 3x_3 - 4).$
Setting the partial derivatives to 0, we get:
$2x_1 + \lambda = 0$
$6x_2 + \lambda = 0$
$10x_3 + \lambda = 0$
$x_1 + x_2 + 3x_3 - 4 = 0.$
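This linear system can be solved directly. Below is a minimal sketch, assuming sympy is available (the code and variable names are illustrative, not part of the original notes).

```python
# Minimal sketch (sympy assumed): solve the stationarity system of
# L = x1^2 + 3*x2^2 + 5*x3^2 + lam*(x1 + x2 + 3*x3 - 4).
import sympy as sp

x1, x2, x3, lam = sp.symbols('x1 x2 x3 lam')
L = x1**2 + 3*x2**2 + 5*x3**2 + lam*(x1 + x2 + 3*x3 - 4)

# Necessary conditions: all partial derivatives of L are zero.
eqs = [sp.diff(L, v) for v in (x1, x2, x3, lam)]
sol = sp.solve(eqs, (x1, x2, x3, lam), dict=True)[0]
print(sol)  # {x1: 60/29, x2: 20/29, x3: 12/29, lam: -120/29}
```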
Sufficient conditions for the NLPP at a stationary point $(x^*, \lambda^*)$: 1st Method:
The necessary conditions become sufficient conditions for a maximum (minimum) if

• the objective function $f(x)$ is concave (convex), and

• the constraint is an equality, $g(x) = 0$.

2nd Method:
Define the bordered Hessian matrix $H^B$ as
$H^B = \begin{pmatrix} 0 & U \\ U^T & V \end{pmatrix}$,
which is an $(m + n) \times (m + n)$ matrix, where $m$ is the number of constraints and $n$ is the number of variables,
$U = \frac{\partial g}{\partial x} = \left[ \frac{\partial g}{\partial x_1}, \frac{\partial g}{\partial x_2}, \cdots, \frac{\partial g}{\partial x_n} \right]_{m \times n}$,
and $V = \left[ \frac{\partial^2 L}{\partial x_i \partial x_j} \right]_{n \times n}$ is the Hessian of $L$ with respect to $x$.
At the stationary point $(x^*, \lambda^*)$:
Starting with the principal minor of order $2m + 1$, compute the last $(n - m)$ leading principal minors of $H^B$ at $(x^*, \lambda^*)$. The stationary point is a

• maximum point if these principal minors alternate in sign, starting with the sign of $(-1)^{m+1}$;

• minimum point if these principal minors all have the sign of $(-1)^m$.

A small numerical sketch of this sign check is given below.
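The following helper is only a sketch, assuming numpy is available; the function names are my own and not part of the notes.

```python
# Minimal numerical sketch of the bordered-Hessian sign check (numpy assumed).
import numpy as np

def leading_principal_minors(HB, m, n):
    """Last (n - m) leading principal minors of HB: determinants of the
    top-left blocks of orders 2m+1, ..., m+n."""
    return [np.linalg.det(HB[:k, :k]) for k in range(2*m + 1, m + n + 1)]

def classify(minors, m):
    signs = [np.sign(d) for d in minors]
    if all(s == (-1)**m for s in signs):
        return 'minimum'
    if all(s == (-1)**(m + 1 + i) for i, s in enumerate(signs)):
        return 'maximum'  # alternating signs starting with (-1)^(m+1)
    return 'test inconclusive'
```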

Example: Solve the NLPP
$z = 2x_1^2 - 24x_1 + 2x_2^2 - 8x_2 + 2x_3^2 - 12x_3 + 200$
s.t. $x_1 + x_2 + x_3 = 11$, $\forall x_i \ge 0$.
Solution: Define the Lagrangian function as
$L(x, \lambda) = 2x_1^2 - 24x_1 + 2x_2^2 - 8x_2 + 2x_3^2 - 12x_3 + 200 + \lambda(x_1 + x_2 + x_3 - 11).$
Setting the partial derivatives to 0:
$x_1 = \frac{-\lambda + 24}{4}, \quad x_2 = \frac{-\lambda + 8}{4}, \quad x_3 = \frac{-\lambda + 12}{4}.$
Substituting into the constraint gives $\lambda = 0$. The stationary point is $(6, 2, 3; 0)$.
Sufficient conditions:
Construct the bordered Hessian matrix with $m = 1$, $n = 3$. Then
$U = \frac{\partial g}{\partial x} = \left[ \frac{\partial g}{\partial x_1}, \frac{\partial g}{\partial x_2}, \frac{\partial g}{\partial x_3} \right]_{1 \times 3} = [1, 1, 1]$ and $V = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 4 \end{pmatrix}$.
So $H^B = \begin{pmatrix} 0 & 1 & 1 & 1 \\ 1 & 4 & 0 & 0 \\ 1 & 0 & 4 & 0 \\ 1 & 0 & 0 & 4 \end{pmatrix}$.
How many principal minors do we need to compute? $(n - m) = 2$.
Which two principal minors? Starting from order $2m + 1 = 3$: $D_3$ and $D_4$.
$D_3 = \begin{vmatrix} 0 & 1 & 1 \\ 1 & 4 & 0 \\ 1 & 0 & 4 \end{vmatrix} = -8$ and $D_4 = \begin{vmatrix} 0 & 1 & 1 & 1 \\ 1 & 4 & 0 & 0 \\ 1 & 0 & 4 & 0 \\ 1 & 0 & 0 & 4 \end{vmatrix} = -48.$
Both principal minors have the sign of $(-1)^1$, so the stationary point $(6, 2, 3)$ is a minimum.
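As a quick numerical cross-check of these two minors, here is a small sketch assuming numpy is available.

```python
# Numerical check of D3 and D4 for the bordered Hessian above (numpy assumed).
import numpy as np

HB = np.array([[0, 1, 1, 1],
               [1, 4, 0, 0],
               [1, 0, 4, 0],
               [1, 0, 0, 4]], dtype=float)

D3 = np.linalg.det(HB[:3, :3])
D4 = np.linalg.det(HB)
print(round(D3), round(D4))  # -8 -48: both share the sign of (-1)^1, so (6, 2, 3) is a minimum
```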
2nd Method: Sufficient conditions

Since the objective function is quadratic and the constraint is an equality, it is enough to check whether the objective function is convex (for a minimization problem).
Find the Hessian matrix $H = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 4 \end{pmatrix}$. The principal minors are 4, 16, 64, all greater than 0, so $H$ is positive definite. Thus the objective function is convex and, hence, the stationary point is a minimum point.
Example: Min $z = 2x_1^2 + x_2^2 + 3x_3^2 + 10x_1 + 8x_2 + 6x_3 - 100$
s.t. $x_1 + x_2 + x_3 = 20$.
Solving the Lagrangian necessary conditions we get
$x_1 = \frac{\lambda - 10}{4}, \quad x_2 = \frac{\lambda - 8}{2}, \quad x_3 = \frac{\lambda - 6}{6},$
and substituting into the constraint gives $\lambda = 30$. Hence $x_1 = 5$, $x_2 = 11$, $x_3 = 4$, $\lambda = 30$.
The Hessian matrix is $\begin{pmatrix} 4 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 6 \end{pmatrix}$, with eigenvalues $4, 2, 6 > 0$. So the objective function is convex and the stationary point is a minimum.
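This example can be reproduced end to end in a few lines; a minimal sketch assuming sympy (code and names are illustrative only) is shown below.

```python
# Sketch (sympy assumed): stationary point and convexity check for
# min 2*x1^2 + x2^2 + 3*x3^2 + 10*x1 + 8*x2 + 6*x3 - 100  s.t. x1 + x2 + x3 = 20.
import sympy as sp

x1, x2, x3, lam = sp.symbols('x1 x2 x3 lam')
f = 2*x1**2 + x2**2 + 3*x3**2 + 10*x1 + 8*x2 + 6*x3 - 100
L = f - lam*(x1 + x2 + x3 - 20)

sol = sp.solve([sp.diff(L, v) for v in (x1, x2, x3, lam)], (x1, x2, x3, lam), dict=True)[0]
print(sol)  # {x1: 5, x2: 11, x3: 4, lam: 30}

H = sp.hessian(f, (x1, x2, x3))
print(H.eigenvals())  # eigenvalues 4, 2, 6 are all positive, so f is convex and the point is a minimum
```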
Example: min/max $z = 3e^{2x_1+1} + 2e^{x_2+5}$
s.t. $x_1 + x_2 = 7$ and $x_1, x_2 \ge 0$.
Sol: Check the convexity/concavity of the objective function. Its Hessian matrix is
$H = \begin{pmatrix} 12e^{2x_1+1} & 0 \\ 0 & 2e^{x_2+5} \end{pmatrix}.$
The principal minors are $D_1 = 12e^{2x_1+1} > 0$ and $D_2 = \begin{vmatrix} 12e^{2x_1+1} & 0 \\ 0 & 2e^{x_2+5} \end{vmatrix} = 24e^{2x_1+x_2+6} > 0$. Since all principal minors are positive, $H$ is positive definite, so the objective function is convex and the stationary point is a minimum.
Setting the partial derivatives of the Lagrangian to 0, we get:
$\lambda = 6e^{2x_1+1}$
$\lambda = 2e^{x_2+5}$
$x_2 = 7 - x_1$
Equating,
$6e^{2x_1+1} = 2e^{x_2+5}$
$\log_e 3 + 2x_1 + 1 = 12 - x_1$
$x_1 = \frac{11 - \log_e 3}{3}, \quad x_2 = \frac{10 + \log_e 3}{3}.$
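A quick numeric check of this stationary point, using only Python's math module:

```python
# Numeric check: the computed point satisfies the constraint and both
# stationarity equations give the same multiplier value.
import math

x1 = (11 - math.log(3)) / 3
x2 = (10 + math.log(3)) / 3

print(x1 + x2)                                   # 7.0  (constraint satisfied)
print(6*math.exp(2*x1 + 1), 2*math.exp(x2 + 5))  # identical values of lambda
```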
Example: Min $z = x_1^2 + x_2^2 + x_3^2$
s.t. $4x_1 + x_2^2 + 2x_3 = 14$, $\forall x_i \ge 0$.
Sol: Define the Lagrangian
$L = x_1^2 + x_2^2 + x_3^2 + \lambda(4x_1 + x_2^2 + 2x_3 - 14).$
The necessary conditions are

$x_1 = -2\lambda$
$x_2(1 + \lambda) = 0$
$x_3 = -\lambda$
$4x_1 + x_2^2 + 2x_3 - 14 = 0$

We have either $x_2 = 0$ or $\lambda = -1$.
When $x_2 = 0$:

$4x_1 + 2x_3 = 14$
$\Rightarrow 4(-2\lambda) + 2(-\lambda) = 14$
$\Rightarrow \lambda = -\frac{14}{10}$

Hence, $x_1 = \frac{14}{5}$, $x_3 = \frac{14}{10}$.

When $\lambda = -1$:
$x_1 = 2$, $x_3 = 1$, and

$4(2) + x_2^2 + 2(1) = 14 \Rightarrow x_2 = \pm 2.$

Since $x_2 \ge 0$, the only admissible value is $x_2 = 2$.

Hence the stationary points are $(2, 2, 1; -1)$ and $(14/5, 0, 14/10; -14/10)$.
To check for a maximum or minimum, we construct the bordered Hessian matrix
$H^B = \begin{pmatrix} 0 & 4 & 2x_2 & 2 \\ 4 & 2 & 0 & 0 \\ 2x_2 & 0 & 2(1+\lambda) & 0 \\ 2 & 0 & 0 & 2 \end{pmatrix}.$
Here $m = 1$, $n = 3$, so $n - m = 2$ and $2m + 1 = 3$: we need to check the two principal minors $D_3$ and $D_4$.
At $(2, 2, 1; -1)$ the bordered Hessian turns out to be $H^B = \begin{pmatrix} 0 & 4 & 4 & 2 \\ 4 & 2 & 0 & 0 \\ 4 & 0 & 0 & 0 \\ 2 & 0 & 0 & 2 \end{pmatrix}$, so
$D_3 = -32$ and $D_4 = -64$. Both minors have the sign of $(-1)^1$, so this point is a minimum.
Again we evaluate the bordered Hessian at $(14/5, 0, 14/10; -14/10)$. It turns out to be
$H^B = \begin{pmatrix} 0 & 4 & 0 & 2 \\ 4 & 2 & 0 & 0 \\ 0 & 0 & -4/5 & 0 \\ 2 & 0 & 0 & 2 \end{pmatrix}.$ Therefore,
$D_3 = \begin{vmatrix} 0 & 4 & 0 \\ 4 & 2 & 0 \\ 0 & 0 & -4/5 \end{vmatrix} = \frac{64}{5} > 0$ and $D_4 = \begin{vmatrix} 0 & 4 & 0 & 2 \\ 4 & 2 & 0 & 0 \\ 0 & 0 & -4/5 & 0 \\ 2 & 0 & 0 & 2 \end{vmatrix} = 32 > 0.$
These minors neither all carry the sign of $(-1)^1$ nor alternate in sign, so the sufficient conditions fail here: this point is neither a constrained maximum nor a constrained minimum (it is a saddle point of the constrained problem).
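A numerical check of both bordered Hessians, assuming numpy (the helper name is mine, not from the notes):

```python
# Determinant check of the two bordered Hessians above (numpy assumed).
import numpy as np

def bordered_hessian(x2, lam):
    return np.array([[0,    4, 2*x2,        2],
                     [4,    2, 0,           0],
                     [2*x2, 0, 2*(1 + lam), 0],
                     [2,    0, 0,           2]], dtype=float)

for x2, lam in [(2.0, -1.0), (0.0, -1.4)]:
    H = bordered_hessian(x2, lam)
    D3, D4 = np.linalg.det(H[:3, :3]), np.linalg.det(H)
    print((x2, lam), round(D3, 2), round(D4, 2))
# (2.0, -1.0): D3 = -32.0, D4 = -64.0 -> both have the sign of (-1)^1: minimum
# (0.0, -1.4): D3 =  12.8, D4 =  32.0 -> neither sign pattern holds
```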

1.1 Lagrangian method with 2 constraints


min $z = x_1^2 + x_2^2 + x_3^2$
s.t. $x_1 + x_2 + 3x_3 = 2$
$5x_1 + 2x_2 + x_3 = 5.$
Sol: The Lagrangian is
$L = f(x) - \lambda_1 g_1(x) - \lambda_2 g_2(x) = x_1^2 + x_2^2 + x_3^2 - \lambda_1(x_1 + x_2 + 3x_3 - 2) - \lambda_2(5x_1 + 2x_2 + x_3 - 5).$
The necessary conditions are:
$\frac{\partial L}{\partial x_1} = 0; \quad \frac{\partial L}{\partial x_2} = 0; \quad \frac{\partial L}{\partial x_3} = 0; \quad \frac{\partial L}{\partial \lambda_1} = 0; \quad \frac{\partial L}{\partial \lambda_2} = 0.$
Therefore,
$2x_1 - \lambda_1 - 5\lambda_2 = 0$
$2x_2 - \lambda_1 - 2\lambda_2 = 0$
$2x_3 - 3\lambda_1 - \lambda_2 = 0$
$x_1 + x_2 + 3x_3 - 2 = 0$
$5x_1 + 2x_2 + x_3 - 5 = 0$
Eliminating $x_1, x_2, x_3$, you will get
$11\lambda_1 + 10\lambda_2 = 4$
$10\lambda_1 + 30\lambda_2 = 10,$
so $\lambda_1 = 2/23$, $\lambda_2 = 7/23$, and hence $x_1 = 37/46$, $x_2 = 8/23$, $x_3 = 13/46$.
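The whole system can also be solved in one shot; a minimal sketch assuming sympy:

```python
# Sketch (sympy assumed): solve the two-multiplier stationarity system above.
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols('x1 x2 x3 lam1 lam2')
L = x1**2 + x2**2 + x3**2 - l1*(x1 + x2 + 3*x3 - 2) - l2*(5*x1 + 2*x2 + x3 - 5)

sol = sp.solve([sp.diff(L, v) for v in (x1, x2, x3, l1, l2)],
               (x1, x2, x3, l1, l2), dict=True)[0]
print(sol)  # {x1: 37/46, x2: 8/23, x3: 13/46, lam1: 2/23, lam2: 7/23}
```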

2 KKT
max $z = 3.6x_1 - 0.4x_1^2 + 1.6x_2 - 0.2x_2^2$
s.t. $2x_1 + x_2 \le 10$, $\forall x_i > 0$.
Sol: For the KKT conditions to be necessary and sufficient for a maximum, $f(x)$ should be concave and the constraint $g(x) \le 0$ convex.
The Hessian matrix is $\begin{pmatrix} -0.8 & 0 \\ 0 & -0.4 \end{pmatrix}$. The principal minors are $D_1 = -0.8$ and $D_2 = 0.32$: they alternate in sign with $D_1 < 0$, hence $f$ is concave.
Also, the constraint is linear, and every linear function is convex.
Hence the KKT conditions are sufficient conditions for a maximum.
Define the Lagrangian:

$L = f(x) - \lambda g(x) = (3.6x_1 - 0.4x_1^2 + 1.6x_2 - 0.2x_2^2) - \lambda(2x_1 + x_2 - 10)$

The necessary conditions are:

$\frac{\partial L}{\partial x} = 0; \quad \lambda g = 0; \quad \lambda \ge 0; \quad g \le 0; \quad x \ge 0$

Therefore,
$3.6 - 0.8x_1 - 2\lambda = 0$
$1.6 - 0.4x_2 - \lambda = 0$
$\lambda(2x_1 + x_2 - 10) = 0$
$\lambda \ge 0$
$2x_1 + x_2 \le 10$
$x_1, x_2 \ge 0$
Case I: $\lambda = 0$; Case II: $\lambda \ne 0$. The case analysis is sketched below.
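A minimal sketch of the two cases, assuming sympy (the variable names are illustrative):

```python
# Sketch (sympy assumed) of the two KKT cases for this problem.
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam')
stationarity = [3.6 - 0.8*x1 - 2*lam, 1.6 - 0.4*x2 - lam]

# Case I: lam = 0 (constraint inactive).
sol1 = sp.solve([e.subs(lam, 0) for e in stationarity], (x1, x2), dict=True)[0]
print(sol1, 2*sol1[x1] + sol1[x2])  # x1 = 4.5, x2 = 4, but 2*x1 + x2 = 13 > 10: infeasible

# Case II: lam != 0, so the constraint is active: 2*x1 + x2 = 10.
sol2 = sp.solve(stationarity + [2*x1 + x2 - 10], (x1, x2, lam), dict=True)[0]
print(sol2)  # x1 = 3.5, x2 = 3, lam = 0.4 >= 0: this is the maximizer
```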
Example: min $z = -\log x_1 - \log x_2$
s.t. $x_1 + x_2 \le 2$, $\forall x_i > 0$.
Sol: For the KKT conditions to be necessary and sufficient for a minimum, $f(x)$ should be convex and the constraint $g(x) \le 0$ convex.
The Hessian matrix is $\begin{pmatrix} \frac{1}{x_1^2} & 0 \\ 0 & \frac{1}{x_2^2} \end{pmatrix}$. Its minors are $D_1 = \frac{1}{x_1^2} > 0$ and $D_2 = \frac{1}{x_1^2 x_2^2} > 0$, so $f$ is convex.
Also, the constraint is linear, hence convex.
Hence, the KKT conditions are necessary and sufficient conditions for a minimum.
Define the Lagrangian function: $L = -\log x_1 - \log x_2 - \lambda(x_1 + x_2 - 2)$
The necessary conditions are (note that with $L = f - \lambda g$ for this minimization problem the multiplier satisfies $\lambda \le 0$):

$\frac{\partial L}{\partial x} = 0; \quad \lambda g = 0; \quad \lambda \le 0; \quad g \le 0; \quad x \ge 0$

That is:
$-\frac{1}{x_1} - \lambda = 0$
$-\frac{1}{x_2} - \lambda = 0$
$\lambda(x_1 + x_2 - 2) = 0$
$\lambda \le 0$
$x_1 + x_2 \le 2$
$x_1, x_2 \ge 0$
Case I: $\lambda = 0$; Case II: $\lambda \ne 0$. A short sketch of the case analysis follows.
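A minimal case-analysis sketch, assuming sympy:

```python
# Sketch (sympy assumed) of the KKT cases for the log example.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
lam = sp.symbols('lam')

# Case I (lam = 0) is impossible: -1/x1 = 0 has no solution.
# Case II: lam != 0 forces the constraint to be active, x1 + x2 = 2.
sol = sp.solve([-1/x1 - lam, -1/x2 - lam, x1 + x2 - 2], (x1, x2, lam), dict=True)
print(sol)  # [{x1: 1, x2: 1, lam: -1}]  -> minimizer x1 = x2 = 1 with lam = -1 <= 0
```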
Example: max $z = 8x_1 + 10x_2 - x_1^2 - x_2^2$
s.t. $3x_1 + 2x_2 \le 6$, $\forall x_i > 0$.

Sol: For the KKT conditions to be necessary and sufficient for a maximum, $f(x)$ should be concave and the constraint $g(x) \le 0$ convex.
The Hessian matrix is $\begin{pmatrix} -2 & 0 \\ 0 & -2 \end{pmatrix}$.
The principal minors are $D_1 = -2$ and $D_2 = 4$: they alternate in sign with $D_1 < 0$, hence $f$ is concave.
Also, the constraint is linear, hence convex.
Hence, the KKT conditions are necessary and sufficient conditions for a maximum.
Define the Lagrangian function: $L = 8x_1 + 10x_2 - x_1^2 - x_2^2 - \lambda(3x_1 + 2x_2 - 6)$
The necessary conditions are

$\frac{\partial L}{\partial x} = 0; \quad \lambda g = 0; \quad \lambda \ge 0; \quad g \le 0; \quad x \ge 0$

That is,
$8 - 2x_1 - 3\lambda = 0$
$10 - 2x_2 - 2\lambda = 0$
$\lambda(3x_1 + 2x_2 - 6) = 0$
$\lambda \ge 0$
$3x_1 + 2x_2 \le 6$
$x_1, x_2 \ge 0$
Again, Case I: $\lambda = 0$ and Case II: $\lambda \ne 0$; the case analysis is sketched below.
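A minimal sketch of the two cases, assuming sympy:

```python
# Sketch (sympy assumed) of the KKT cases for the last example.
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam')
stationarity = [8 - 2*x1 - 3*lam, 10 - 2*x2 - 2*lam]

# Case I: lam = 0 gives x1 = 4, x2 = 5, but 3*4 + 2*5 = 22 > 6: infeasible.
# Case II: lam != 0, so the constraint is active: 3*x1 + 2*x2 = 6.
sol = sp.solve(stationarity + [3*x1 + 2*x2 - 6], (x1, x2, lam), dict=True)[0]
print(sol)  # {x1: 4/13, x2: 33/13, lam: 32/13}  -> lam >= 0, so this is the maximizer
```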
