Chapter 9 - Non-Linear Programming
• Introduction
• Unconstrained Optimization: Single Variable
• Unconstrained Optimization: Multiple Variables
• Constrained NLPs: Lagrange Multipliers (reading)
• Constrained NLPs: The Kuhn-Tucker Conditions (reading)
• Ref: Chapter 21, Taha's book
1. Introduction
• Definition of NLP: Let x = (x1, x2, …, xn).
(NLP): Maximize f(x)
subject to gi(x) ≤ bi, i = 1, 2, …, m
where the objective function f(x) and/or the constraints gi(x) are nonlinear.
Nonnegativity conditions xi ≥ 0 can be included by adding the constraints xi = yi² for i = 1, …, n.
• Global vs. local optima: Let x be a feasible solution. Then
x is a global max if f(x) ≥ f(y) for every feasible y;
x is a local max if f(x) ≥ f(y) for every feasible y sufficiently close to x (i.e., xj − ε ≤ yj ≤ xj + ε for all j and some small ε > 0).
Local or Global Optimum?
[Figure: a curve with both a local optimum and a global optimum marked]
Concavity and Convexity
• Convex functions: f(λy + (1−λ)z) ≤ λf(y) + (1−λ)f(z) for all y and z and for 0 ≤ λ ≤ 1. Geometrically, any point on a line segment connecting two points on the graph of f(x) lies on or above the graph. "Strict" convexity ↔ strict inequality for 0 < λ < 1. A numerical check of this inequality appears after this list.
• Concave functions: f(λy + (1−λ)z) ≥ λf(y) + (1−λ)f(z) for all y and z and for 0 ≤ λ ≤ 1. Here any point on a line segment connecting two points on the graph of f(x) lies on or below the graph. "Strict" concavity ↔ strict inequality for 0 < λ < 1.
• A local max (min) of a concave (convex) function on a convex feasible region is also a global max (min); this setting is called convex programming.
• Given this, we can solve exactly maximization (minimization) problems with a concave (convex) objective function and linear constraints, since every local optimum found is guaranteed to be global.
• Strict concavity (convexity) → the global optimum is unique.
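As a quick numerical illustration of the defining inequality, the sketch below samples λ over [0, 1] and checks f(λy + (1−λ)z) ≤ λf(y) + (1−λ)f(z); the choice f(x) = x² and the test points are assumptions for illustration only.

```python
import numpy as np

# Illustrative check of the convexity inequality; f(x) = x**2 is an
# assumed example, not a function from the slides.
f = lambda x: x**2

y, z = -1.0, 3.0                         # two arbitrary test points
for lam in np.linspace(0.0, 1.0, 11):    # sample lambda in [0, 1]
    lhs = f(lam * y + (1 - lam) * z)     # f at a point on the segment
    rhs = lam * f(y) + (1 - lam) * f(z)  # chord value at the same point
    assert lhs <= rhs + 1e-12            # convexity: chord lies above graph
print("convexity inequality holds at all sampled lambda")
```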
Concave or Convex?
[Figure: four example curves of f(x), labeled Concave, Neither, Convex, and Both! (a linear function is both concave and convex)]
2. Unconstrained Optimization: Single-Variable NLP
• Classical approach:
max (min) f(x)
s.t. x ∈ [a, b]
The optimal solution is one of:
A boundary point (x = a or x = b)
A stationary point: a < x < b with f′(x) = 0 and f″(x) < 0 (> 0)
A point where f′(x) does not exist
A worked sketch of this approach follows.
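A minimal sketch of the classical approach, assuming an illustrative objective f(x) = x³ − 6x² + 9x on [0, 5] (not a function from the slides): find the interior stationary points symbolically, classify them with f″, and compare against the boundary points.

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = x**3 - 6*x**2 + 9*x           # assumed illustrative objective
a, b = 0, 5                       # assumed interval [a, b]

fp, fpp = sp.diff(f, x), sp.diff(f, x, 2)
stationary = [s for s in sp.solve(sp.Eq(fp, 0), x) if a < s < b]

# Candidates: boundary points plus interior stationary points
candidates = [a, b] + stationary
for c in candidates:
    label = ""
    if c in stationary:           # second-derivative test at interior points
        label = "local max" if fpp.subs(x, c) < 0 else "local min"
    print(c, f.subs(x, c), label)

best = max(candidates, key=lambda c: f.subs(x, c))
print("global max on [a,b] at x =", best)
```

Here the stationary points are x = 1 (local max) and x = 3 (local min), but the global maximum lies at the boundary x = b, which is why boundary points must always be checked.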
• Direct search method: seeks the optimum of a unimodal function (one with at most one local optimum) by shrinking an interval known to contain x*.
Step 0: initialization; the current interval is I0 = (xL, xR) = (a, b).
Step i: the current interval is Ii-1 = (xL, xR). Define x1, x2 such that xL < x1 < x2 < xR. The next interval Ii is determined as follows:
if f(x1) > f(x2) then xL < x* < x2; set Ii = (xL, x2)
if f(x1) < f(x2) then x1 < x* < xR; set Ii = (x1, xR)
if f(x1) = f(x2) then x1 < x* < x2; set Ii = (x1, x2)
Terminate when the length of Ii is ≤ ε, where ε is a user-defined level of accuracy. A runnable sketch follows.
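A runnable sketch of this interval-reduction scheme for maximization. Here x1 and x2 are placed symmetrically about the midpoint of the current interval (the dichotomous placement named on the next slide); the test function and the values of ε and δ are assumptions for illustration.

```python
def dichotomous_max(f, a, b, eps=1e-6, delta=1e-7):
    """Maximize a unimodal f on (a, b) by dichotomous interval reduction."""
    xL, xR = a, b
    while xR - xL > eps:                   # stop once |I_i| <= eps
        mid = (xL + xR) / 2
        x1, x2 = mid - delta, mid + delta  # two probes around the midpoint
        if f(x1) > f(x2):
            xR = x2                        # x* lies in (xL, x2)
        elif f(x1) < f(x2):
            xL = x1                        # x* lies in (x1, xR)
        else:
            xL, xR = x1, x2                # x* lies in (x1, x2)
    return (xL + xR) / 2

# Assumed unimodal test function with maximum at x = 2
print(dichotomous_max(lambda x: -(x - 2)**2, 0, 5))  # ≈ 2.0
```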
Other Methods
• Dichotomous method: place x1 and x2 symmetrically about the midpoint of the current interval, x1 = (xL + xR)/2 − δ and x2 = (xL + xR)/2 + δ for a small δ > 0, then shrink the interval as above (this is the placement used in the sketch).
3. Unconstrained Optimization: Multiple Variables
Example (the gradient, or steepest ascent, method)
• Maximize f(X)
• The exact optimum is X*
• The gradient is ∇f(X)
• Assume we start at X0 = (1, 1)
• We have the iteration Xk+1 = Xk + r ∇f(Xk), with the step size r chosen to maximize f along the gradient direction
• Thus we repeat the step until ∇f(Xk) ≈ 0; a sketch follows
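A minimal steepest-ascent sketch. The concave quadratic objective below is an assumed illustration (the slide's own function did not survive extraction), and the step size is found by a simple backtracking line search rather than an exact one.

```python
import numpy as np

def f(x):
    # Assumed illustrative concave quadratic; not necessarily the
    # objective from the original slide.
    x1, x2 = x
    return 4*x1 + 6*x2 - 2*x1**2 - 2*x1*x2 - 2*x2**2

def grad_f(x):
    x1, x2 = x
    return np.array([4 - 4*x1 - 2*x2, 6 - 2*x1 - 4*x2])

def steepest_ascent(x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient vanishes
            break
        r = 1.0
        # Backtracking line search: halve r until the ascent condition holds
        while f(x + r * g) < f(x) + 0.5 * r * g.dot(g):
            r *= 0.5
        x = x + r * g                    # move along the gradient
    return x

print(steepest_ascent([1.0, 1.0]))       # converges to (1/3, 4/3) for this f
```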
4. Constrained NLPs: Lagrange Multipliers (Reading)
• Consider the problem:
Minimize Z = f(X)
subject to g(X) = 0
The functions f(X) and g(X) are assumed twice continuously differentiable. Let L(X, λ) = f(X) − λ g(X), where λ = (λ1, …, λm) are the Lagrange multipliers.
• The necessary condition: ∂L/∂X = 0 and ∂L/∂λ = 0
• The sufficient condition is stated in terms of the bordered Hessian matrix
H^B = | 0    P |
      | P^T  Q |
where P = ∂g(X)/∂X is the m×n Jacobian of the constraints and Q is the n×n matrix of second derivatives ∂²L(X, λ)/∂xi∂xj; the signs of the last (n − m) principal minors of H^B determine whether a stationary point is a local max or a local min. A small symbolic example follows.
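A short sympy sketch of the necessary conditions, assuming an illustrative problem (minimize x² + y² subject to x + y = 2, not a problem from the slides): form L, set all of its partial derivatives to zero, and solve the resulting system.

```python
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)

# Assumed illustrative problem: minimize f subject to g = 0
f = x**2 + y**2
g = x + y - 2

L = f - lam * g                             # Lagrangian L(X, lambda)
eqs = [sp.diff(L, v) for v in (x, y, lam)]  # dL/dx = dL/dy = dL/dlambda = 0
sol = sp.solve(eqs, (x, y, lam), dict=True)
print(sol)                                  # x = 1, y = 1, lambda = 2
```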
5. Constrained NLPs: The Kuhn-Tucker Conditions (Reading)
• Consider the problem:
Maximize Z = f(X)
subject to g(X) ≤ 0
Let L(X, S, λ) = f(X) − Σi λi (gi(X) + Si²), where λi is the Lagrange multiplier associated with constraint i and Si² is the nonnegative slack (or surplus) variable associated with constraint i = 1, 2, …, m.
• The necessary conditions:
λ ≥ 0
∇f(X) − λ ∇g(X) = 0
λi gi(X) = 0, i = 1, 2, …, m
g(X) ≤ 0
For the case of minimization, only the first condition changes: it becomes λ ≤ 0.
• The sufficient condition:

Sense of Optimization   Required Objective Function   Required Solution Space
Maximization            Concave                       Convex set
Minimization            Convex                        Convex set
Example
• Minimize f(X) = x1² + x2² + x3²
Subject to
g1(X) = 2x1 + x2 − 5 ≤ 0
g2(X) = x1 + x3 − 2 ≤ 0
g3(X) = 1 − x1 ≤ 0
g4(X) = 2 − x2 ≤ 0
g5(X) = −x3 ≤ 0
• The K-T conditions use L(X, S, λ) = f(X) − Σi λi (gi(X) + Si²)
• The solution is:
x1 = 1, x2 = 2, x3 = 0; λ1 = λ2 = λ5 = 0, λ3 = −2, λ4 = −4
• Since the function f(X) is convex and the solution space g(X) ≤ 0 is also convex, L(X, S, λ) must be convex and the resulting stationary point yields a global constrained minimum. A numerical check of these conditions follows.
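A short numpy sketch that verifies the stated K-T solution against the constraints above: feasibility g(X) ≤ 0, the sign condition λ ≤ 0 for minimization, complementary slackness λi gi(X) = 0, and stationarity ∇f(X) − λ ∇g(X) = 0.

```python
import numpy as np

x = np.array([1.0, 2.0, 0.0])                    # stated solution
lam = np.array([0.0, 0.0, -2.0, -4.0, 0.0])      # stated multipliers

grad_f = 2 * x                                   # gradient of x1²+x2²+x3²

# Constraint values g_i(X) and their gradients (one row per constraint)
g = np.array([2*x[0] + x[1] - 5,
              x[0] + x[2] - 2,
              1 - x[0],
              2 - x[1],
              -x[2]])
grad_g = np.array([[ 2,  1,  0],
                   [ 1,  0,  1],
                   [-1,  0,  0],
                   [ 0, -1,  0],
                   [ 0,  0, -1]], dtype=float)

assert np.all(g <= 1e-12)                        # feasibility: g(X) <= 0
assert np.all(lam <= 0)                          # minimization: lambda <= 0
assert np.allclose(lam * g, 0)                   # complementary slackness
assert np.allclose(grad_f - lam @ grad_g, 0)     # stationarity
print("all K-T conditions hold at the stated point")
```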
The General K-T Sufficient Conditions
• Consider: