Constraint Qualification
1 Constraint qualification
Consider the problem

    max x₁
    subject to  g₁(x) = x₂ − (1 − x₁)³ ≤ 0
                g₂(x) = −x₂ ≤ 0
The solution is obviously (1, 0), but this does not satisfy the Kuhn-Tucker
conditions. The gradients of the two constraints at (1, 0) are (0, 1) and (0, −1),
which are linearly dependent: the constraints are not regular at the optimum
(1, 0). The problem is that the set

    L(x*) = {dx : ∇gⱼ(x*)ᵀ dx ≤ 0, j = 1, 2} = {(dx₁, 0) : dx₁ ∈ ℝ}

is strictly larger than the set of feasible perturbations D(x*) = {(dx₁, 0) : dx₁ ≤ 0}.
If we add the redundant constraint g₃(x) = x₁ − 1 ≤ 0, then

    L(x*) = {dx : ∇gⱼ(x*)ᵀ dx ≤ 0, j = 1, 2, 3} = D(x*)

and the optimum (1, 0) satisfies the Kuhn-Tucker conditions. We seek conditions
on the constraints g such that the set L(x*) correctly represents the set
of feasible perturbations D(x*), which is known as the cone of tangents.
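To make the failure concrete, here is a short numerical sketch (Python with numpy; the variable names are illustrative) checking that at (1, 0) the objective gradient cannot be written as a nonnegative combination of the constraint gradients:

```python
import numpy as np

# Counterexample: max x1  s.t.  g1(x) = x2 - (1 - x1)**3 <= 0,  g2(x) = -x2 <= 0
x_star = np.array([1.0, 0.0])

grad_f  = np.array([1.0, 0.0])                       # gradient of the objective x1
grad_g1 = np.array([3 * (1 - x_star[0])**2, 1.0])    # = (0, 1) at (1, 0)
grad_g2 = np.array([0.0, -1.0])                      # = (0, -1) at (1, 0)

# Kuhn-Tucker would require grad_f = l1*grad_g1 + l2*grad_g2 with l1, l2 >= 0.
A = np.column_stack([grad_g1, grad_g2])
lam, _, rank, _ = np.linalg.lstsq(A, grad_f, rcond=None)
residual = grad_f - A @ lam

print("rank of constraint gradients:", rank)            # 1: linearly dependent
print("best residual norm:", np.linalg.norm(residual))  # > 0: no multipliers exist, KT fails
```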
1.1 Constraint qualification (Theorem 5.4)
Suppose that x* is a local optimum of

    max f(x) subject to g(x) ≤ 0

at which the binding constraints B(x*) satisfy one of the following constraint qualification conditions:

    Concavity: gⱼ is concave for every j ∈ B(x*) and there exists x̂ such that gⱼ(x̂) < 0 for every j ∈ B(x*)

    Regularity: the set {∇gⱼ(x*) : j ∈ B(x*)} is linearly independent

Then the Kuhn-Tucker conditions are necessary for an optimal solution.

The most common constraint qualification conditions encountered are

    linearity: g linear ⟹ concave (e.g. linear programming)

    Slater condition: gⱼ convex for every j ∈ B(x*) and there exists x̂ such that gⱼ(x̂) < 0 for every j ∈ B(x*)
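Of these conditions, regularity is the easiest to verify mechanically: stack the gradients of the binding constraints and check for full row rank. A minimal sketch (Python with numpy; the helper name check_regularity is illustrative, not from the text):

```python
import numpy as np

def check_regularity(binding_gradients):
    """True if the gradients of the binding constraints are linearly independent."""
    G = np.array(binding_gradients, dtype=float)   # one gradient per row
    return np.linalg.matrix_rank(G) == G.shape[0]

# Gradients (0, 1) and (0, -1) from the example above: regularity fails.
print(check_regularity([[0.0, 1.0], [0.0, -1.0]]))   # False

# Gradients such as (0, 1) and (1, 0) would satisfy the regularity condition.
print(check_regularity([[0.0, 1.0], [1.0, 0.0]]))    # True
```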
1.2 Constraint qualification with nonnegative variables (Corollary 5.4.1)
Provided the binding constraints B(x*) in the problem

    max f(x) subject to g(x) ≤ 0,  x ≥ 0

satisfy one of the above constraint qualification conditions, the Kuhn-Tucker conditions are necessary for a local optimum. To see this, write the nonnegativity requirement as an explicit constraint and consider the equivalent problem

    max f(x) subject to g(x) ≤ 0,  h(x) = −x ≤ 0

We note that h is linear, and is therefore both concave and convex. Further h(x̂) < 0 for every x̂ > 0. Therefore, if g satisfies one of the three constraint qualification conditions, so does the combined constraint (g, h). By Theorem 5.4, the Kuhn-Tucker conditions are necessary for a local optimum.
Example: The consumer's problem The problem

    max u(x) subject to p·x ≤ m,  x ≥ 0

has one functional constraint g(x) = p·x − m and n nonnegativity constraints hᵢ(x) = −xᵢ ≤ 0, the gradients of which are

    ∇g = p,    ∇hᵢ = −eᵢ,    i = 1, 2, . . . , n

where eᵢ is the i-th unit vector (Example 1.79). Provided all prices are positive, p > 0, the gradients of the constraints binding at any feasible point are linearly independent, so the regularity condition of Corollary 5.3.2 is always satisfied. However, it is easier to appeal directly to Corollary 5.4.1, and observe that the budget constraint g(x) = p·x − m is linear and therefore concave.
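As a quick check (a sketch in Python with numpy, using an arbitrary positive price vector), the budget gradient p together with the gradients of any binding nonnegativity constraints has full row rank, confirming regularity:

```python
import numpy as np

p = np.array([2.0, 3.0, 5.0])     # illustrative positive price vector, n = 3
n = p.size

# Suppose the budget binds and x1 = x2 = 0 bind (x3 > 0, since p.x = m > 0).
binding = [p] + [-np.eye(n)[i] for i in (0, 1)]   # grad g = p, grad h_i = -e_i
G = np.vstack(binding)

# Full row rank <=> gradients of the binding constraints are linearly independent.
print(np.linalg.matrix_rank(G) == G.shape[0])     # True whenever p > 0
```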
2 Sufficient conditions
We know that the KT conditions are sufficient when the objective f is concave and the constraint function g is convex. The following is a significant generalization, which is useful, for example, in consumer theory.
2.1 Sufficient conditions for a global optimum (Theorem 5.5)
Suppose that x* satisfies the KT conditions and

    f is pseudoconcave

    gⱼ is quasiconvex for every j = 1, 2, . . . , m

Then x* is a global maximum.
Proof. For every j, either λⱼ = 0, which implies λⱼ ∇gⱼ(x*)ᵀ(x − x*) = 0 for every x, or gⱼ(x*) = 0 and therefore gⱼ(x) ≤ 0 = gⱼ(x*) for every x ∈ G = {x : gⱼ(x) ≤ 0, j = 1, 2, . . . , m}. Quasiconvexity implies

    ∇gⱼ(x*)ᵀ(x − x*) ≤ 0 for every x ∈ G

and since λⱼ ≥ 0,

    λⱼ ∇gⱼ(x*)ᵀ(x − x*) ≤ 0 for every x ∈ G

The KT condition is

    ∇f(x*) = Σ_{j=1}^{m} λⱼ ∇gⱼ(x*)

and therefore

    ∇f(x*)ᵀ(x − x*) = Σ_{j=1}^{m} λⱼ ∇gⱼ(x*)ᵀ(x − x*) ≤ 0 for every x ∈ G

Since f is pseudoconcave, this implies f(x) ≤ f(x*) for every x ∈ G, that is, x* is a global maximum.
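For example, a Cobb-Douglas utility is pseudoconcave on the positive orthant (even when the exponents sum to more than one, so that it is not concave), and a linear budget constraint is quasiconvex, so Theorem 5.5 applies to the consumer's problem. A small sketch (Python with numpy, illustrative parameters) verifies the KT conditions at the usual closed-form demands:

```python
import numpy as np

a, b = 0.7, 0.6                  # Cobb-Douglas exponents; a + b > 1, so u is not concave
p = np.array([2.0, 3.0])         # prices
m = 12.0                         # income

# Closed-form Cobb-Douglas demands: the candidate KT point.
x = np.array([a / (a + b) * m / p[0], b / (a + b) * m / p[1]])

u = x[0]**a * x[1]**b
grad_u = np.array([a * u / x[0], b * u / x[1]])

# KT for max u(x) s.t. p.x - m <= 0, x >= 0: grad_u = lam * p with lam >= 0, budget binding.
lam = grad_u / p                 # componentwise candidates for the multiplier
print("budget binds:", np.isclose(p @ x, m))
print("single nonnegative multiplier:", np.allclose(lam, lam[0]) and lam[0] >= 0)
```

By Theorem 5.5, this point is therefore a global maximum of the consumer's problem.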
Example: Linear programming (Section 5.4.4) A linear programming problem is a special case of the general constrained optimization problem
    max f(x) subject to g(x) ≤ 0,  x ≥ 0        (1)

in which both the objective function f and the constraint function g are linear. Consequently, the Kuhn-Tucker conditions are both necessary and sufficient for a global optimum. The simplex algorithm is an efficient algorithm for solving the Kuhn-Tucker conditions.
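A minimal sketch (Python with scipy.optimize.linprog and illustrative data; a reasonably recent scipy is assumed): the solver returns the optimal x together with dual values, which are, up to the solver's sign convention, the Kuhn-Tucker multipliers of the inequality constraints.

```python
from scipy.optimize import linprog

# Illustrative LP: max 3*x1 + 5*x2  s.t.  x1 + 2*x2 <= 14,  3*x1 + x2 <= 18,  x >= 0.
# linprog minimizes, so negate the objective coefficients.
c = [-3.0, -5.0]
A_ub = [[1.0, 2.0],
        [3.0, 1.0]]
b_ub = [14.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")

print("optimal x:", res.x)                      # (4.4, 4.8)
print("optimal value:", -res.fun)               # 37.2
print("dual values:", res.ineqlin.marginals)    # Kuhn-Tucker multipliers (solver sign convention)
```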
3 Homework
1. Suppose that x* is a local solution of

       max f(x) subject to g(x) ≤ 0
        x

   at which the binding constraints B(x*) satisfy one of the above constraint qualification conditions, so that x* satisfies the Kuhn-Tucker conditions

       ∇f(x*) = Σ_{j=1}^{m} λⱼ ∇gⱼ(x*)    and    λⱼ gⱼ(x*) = 0,  j = 1, 2, . . . , m

   Show that, if the binding constraints also satisfy the regularity condition (their gradients are linearly independent), the multipliers λ₁, λ₂, . . . , λₘ are unique.
Solutions 5
1. Since λⱼ = 0 for every j ∉ B(x*), the Kuhn-Tucker conditions imply

       ∇f(x*) = Σ_{j∈B(x*)} λⱼ ∇gⱼ(x*)

   If μ is another set of multipliers satisfying the Kuhn-Tucker conditions, then likewise

       ∇f(x*) = Σ_{j∈B(x*)} μⱼ ∇gⱼ(x*)

   and subtracting,

       Σ_{j∈B(x*)} (λⱼ − μⱼ) ∇gⱼ(x*) = 0

   Since the gradients ∇gⱼ(x*), j ∈ B(x*), are linearly independent, λⱼ = μⱼ for every j: the multipliers are unique.
(a) Let l₁ denote the quantity of union labour and l₂ the quantity of non-union labour hired by the firm. The firm's optimisation problem is

        max f(l₁ + l₂) − w₁ l₁ − w₂ l₂
        l₁, l₂ ≥ 0
        subject to g(l₁) = L̄ − l₁ ≤ 0

    where f is the production function, w₁ and w₂ are the union and non-union wage rates (with w₂ < w₁), and L̄ is the quantity of union labour the firm is obliged to hire. Forming the Lagrangean

        L(l₁, l₂, λ) = f(l₁ + l₂) − w₁ l₁ − w₂ l₂ − λ(L̄ − l₁)

    the Kuhn-Tucker conditions for an optimum are

        ∂L/∂l₁ = f′(l₁ + l₂) − w₁ + λ = 0                                        (1)
        ∂L/∂l₂ = f′(l₁ + l₂) − w₂ ≤ 0,  l₂ ≥ 0,  (f′(l₁ + l₂) − w₂) l₂ = 0        (2)
        λ ≥ 0,  L̄ − l₁ ≤ 0,  λ(L̄ − l₁) = 0

    (Condition (1) holds with equality since l₁ ≥ L̄ > 0.) There are two cases. If l₂ > 0, then (2) implies f′(l₁ + l₂) = w₂, so (1) gives λ = w₁ − w₂ > 0 and complementary slackness implies l₁ = L̄, with l₂ determined by f′(L̄ + l₂) = w₂. If l₂ = 0, then (2) implies f′(l₁) ≤ w₂ < w₁, so (1) gives λ > 0, which again implies l₁ = L̄. In both cases, the firm hires exactly L̄ units of union labour (l₁ = L̄). If f′(L̄) > w₂, then the firm hires additional units of non-union labour (l₂ > 0). (A numerical check of this case analysis is sketched after part (c) below.)
(b) Since the constraint function g(l₁) = L̄ − l₁ is linear, it satisfies the CQ condition (linear ⟹ concave). The Kuhn-Tucker conditions are therefore necessary.
(c) Diminishing marginal product means that the production function f is concave. Therefore the objective function f(l₁ + l₂) − w₁ l₁ − w₂ l₂ is concave. Since the constraint function is convex (linear), the Kuhn-Tucker conditions are also sufficient.
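A short numerical sketch of part (a) (Python with numpy; the production function f(l) = 10√l and the parameter values are purely illustrative) reproduces the conclusion l₁ = L̄ with f′(L̄ + l₂) = w₂ when f′(L̄) > w₂:

```python
import numpy as np

# Illustrative parameters: f(l) = 10*sqrt(l), union wage w1 > non-union wage w2, quota Lbar.
w1, w2, Lbar = 2.0, 1.0, 4.0

def f_prime(l):
    """Marginal product f'(l) for f(l) = 10*sqrt(l)."""
    return 5.0 / np.sqrt(l)

# Since f'(Lbar) = 2.5 > w2, the case l2 > 0 applies: l1 = Lbar and f'(Lbar + l2) = w2.
l1 = Lbar
l2 = (5.0 / w2) ** 2 - Lbar               # solve 5 / sqrt(Lbar + l2) = w2
lam = w1 - w2                             # from condition (1): f'(l1 + l2) - w1 + lam = 0

print("l1 =", l1, " l2 =", l2, " lambda =", lam)
print("f'(l1 + l2) = w2:", np.isclose(f_prime(l1 + l2), w2))     # True
print("lambda >= 0 and l1 = Lbar:", lam >= 0 and l1 == Lbar)     # True
```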