Optimization
4.4
Let A be a symmetric square matrix. Consider the linear programming problem

    minimize    c'x
    subject to  Ax ≥ c
                x ≥ 0.

Its dual is

    maximize    p'c
    subject to  p'A ≤ c'
                p ≥ 0.
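As a numerical sanity check (not part of the exercise), the following scipy sketch solves the displayed primal–dual pair on made-up data. A is a small symmetric matrix and c is chosen so that some x* ≥ 0 satisfies Ax* = c; the two optimal values should then coincide with c'x*.

```python
# Hedged numerical sketch with hypothetical data (A, x_star are made up).
import numpy as np
from scipy.optimize import linprog

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric
x_star = np.array([1.0, 1.0])
c = A @ x_star               # chosen so that A x* = c with x* >= 0

# Primal: min c'x  s.t.  Ax >= c  (rewritten as -Ax <= -c),  x >= 0.
primal = linprog(c, A_ub=-A, b_ub=-c, bounds=(0, None), method="highs")

# Dual: max p'c  s.t.  p'A <= c'  (A symmetric, so A' = A),  p >= 0.
# linprog minimizes, so we minimize -c'p and negate the value.
dual = linprog(-c, A_ub=A.T, b_ub=c, bounds=(0, None), method="highs")

print(primal.fun, -dual.fun, c @ x_star)   # all three agree
```

Here both optimal values equal c'x* = 6, consistent with x* being primal optimal and (taking p = x*) dual feasible.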
(b) In order to obtain the best possible lower bound of the form considered in part (a), we form the linear programming problem

    maximize    p'b
    subject to  p'A = 0'
                p'e = 1
                p ≥ 0.

Formulate the dual. Introducing a variable x for the constraints p'A = 0' and a variable v for p'e = 1, the dual is

    minimize    v
    subject to  Ax + ve ≥ b
                x free, v free.
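A quick scipy check of this pair on made-up data: A below has a nonnegative left null vector, so the primal (max p'b with p'A = 0', p'e = 1, p ≥ 0) is feasible, and the dual (min v with Ax + ve ≥ b, x and v free) should attain the same value.

```python
# Hedged numerical sketch; A and b are hypothetical.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
b = np.array([4.0, 2.0])
e = np.ones(2)

# Primal: max p'b  s.t.  A'p = 0, e'p = 1, p >= 0  (minimize -b'p).
A_eq = np.vstack([A.T, e])
b_eq = np.array([0.0, 0.0, 1.0])
primal = linprog(-b, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")

# Dual: min v over (x, v)  s.t.  Ax + v*e >= b, all variables free.
c_dual = np.array([0.0, 0.0, 1.0])              # objective picks out v
A_ub = np.hstack([-A, -e.reshape(2, 1)])        # -(Ax + v*e) <= -b
dual = linprog(c_dual, A_ub=A_ub, b_ub=-b, bounds=(None, None), method="highs")

print(-primal.fun, dual.fun)   # both optimal values agree
```

With this data the best lower bound is p'b = 3 at p = (1/2, 1/2), matched by v = 3 on the dual side.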
in long-term storage, at the end of period t:

    minimize    ∑_{t=1}^{T} (d_t x_t + c y_t) + d_{T+1} z_T
    subject to  x_t + y_t − w_t = b_t,              t = 1, . . . , T,
                z_t + w_t − z_{t−1} − f y_t = 0,    t = 1, . . . , T,
                z_0 = 0,
                x_t, y_t, w_t, z_t ≥ 0.
(a) Let p_t and q_t be dual variables associated with the first and second equality constraints, respectively. Write down the dual problem.
(b) Assume that 0 < f < 1, b_t ≥ 0, and c ≥ 0. Show that the following formulae provide an optimal solution to the dual problem:
(c) Explain how the result in part (b) can be used to compute an optimal solution to the original problem. Primal and dual nondegeneracy can be assumed.
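The exercise gives no numeric data, but the formulation can be exercised on a tiny hypothetical instance (T = 1, with made-up values of d_t, c, f, b_t). The sketch below builds the LP for scipy's linprog; the equality-constraint marginals reported by the solver play the role of the dual variables p_t and q_t from part (a).

```python
# Hedged sketch of the storage LP with hypothetical data: T = 1,
# d = (d_1, d_2) = (2, 10), c = 0.8, f = 0.5, b_1 = 1.
# Variable order: [x1, y1, w1, z1].
import numpy as np
from scipy.optimize import linprog

d1, d2, c_cost, f, b1 = 2.0, 10.0, 0.8, 0.5, 1.0

obj = np.array([d1, c_cost, 0.0, d2])   # d_1 x_1 + c y_1 + d_2 z_1
A_eq = np.array([
    [1.0, 1.0, -1.0, 0.0],   # x_1 + y_1 - w_1 = b_1
    [0.0, -f,  1.0,  1.0],   # z_1 + w_1 - f*y_1 = 0   (z_0 = 0)
])
b_eq = np.array([b1, 0.0])

res = linprog(obj, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
print(res.fun)               # optimal cost
print(res.eqlin.marginals)   # candidate dual values (p_1, q_1)
```

Here the optimum is 1.6, achieved by producing y_1 = 2 and withdrawing w_1 = 1, since the per-unit cost of serving demand through y (c / (1 − f) = 1.6) beats buying at d_1 = 2.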
Let x ∈ R^n be a nondegenerate basic feasible solution to the primal. Therefore, x satisfies exactly n linearly independent active constraints, as well as all inequality constraints. The complementary slackness conditions are

    p_i (a_i'x − b_i) = 0   for all i,
    (c_j − p'A_j) x_j = 0   for all j.
Let us consider an LP in general form, together with its dual:

    minimize    c'x                       maximize    p'b
    subject to  a_i'x ≥ b_i,  i ∈ M1,     subject to  p_i ≥ 0,       i ∈ M1,
                a_i'x ≤ b_i,  i ∈ M2,                 p_i ≤ 0,       i ∈ M2,
                a_i'x = b_i,  i ∈ M3,                 p_i free,      i ∈ M3,
                x_j ≥ 0,      j ∈ N1,                 p'A_j ≤ c_j,   j ∈ N1,
                x_j ≤ 0,      j ∈ N2,                 p'A_j ≥ c_j,   j ∈ N2,
                x_j free,     j ∈ N3,                 p'A_j = c_j,   j ∈ N3.
Let x ∈ R^n be a nondegenerate basic feasible solution to the primal. Therefore, x satisfies exactly n linearly independent active constraints, as well as all inequality constraints. Hence |M3| = n, and the remaining constraints hold as strict inequalities. By the complementary slackness conditions, p_i = 0 for every i corresponding to a strict inequality constraint, i ∈ M1 ∪ M2. It remains to determine the other p_i, i ∈ M3.

Let us construct a matrix B whose rows are the a_i', i ∈ M3. Then B is a full-rank matrix of dimension n × n. If some x_j = 0, then x is degenerate, because there would be more than n active constraints: the constraint x_j = 0 would be active in addition to those represented in B. Therefore x_j ≠ 0 for every j, and by complementary slackness, c_j − p'A_j = 0, i.e., c_j = p'A_j.

Let A be the matrix with rows a_i'. Without loss of generality, suppose that M3 = {1, . . . , n}, so that p_i = 0 for i > n and the rows a_i' are ordered with the equality constraints as the first n rows of A (these first n rows then form B). This implies that only the first n entries of each column A_j are relevant to the complementary slackness condition c_j = p'A_j, because the remaining m − n entries are multiplied by p_i = 0. Letting q be a vector with n components, the condition reads

    c_j = q'B_j  for all j,   i.e.,   c' = q'B,   so   q' = c'B^{-1}.

Set p_i = q_i for i = 1, . . . , n, and p_i = 0 for all other i. Then p is uniquely defined: its first n components are determined by the system q' = c'B^{-1}, and the remaining m − n components are forced to 0 by the complementary slackness conditions.
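The final step q' = c'B^{-1} is just the linear system B'q = c, which is how one would compute it in practice. A short numpy sketch with made-up B and c (B standing in for the rows a_i', i ∈ M3):

```python
# Hedged sketch with hypothetical data; B is any invertible n x n matrix.
import numpy as np

B = np.array([[2.0, 0.0],
              [1.0, 1.0]])   # rows a_i', i in M3 (n = 2 here)
c = np.array([4.0, 3.0])

# q' = c' B^{-1}  <=>  B'q = c; solve rather than invert explicitly.
q = np.linalg.solve(B.T, c)

print(q)        # the unique multiplier vector on the rows of B
print(q @ B)    # reproduces c', verifying c_j = q'B_j for every j
```

Solving B'q = c directly (rather than forming B^{-1}) is the standard, numerically preferable way to realize this formula.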
4.20
(a) Consider the following linear programming problem and its dual:

    minimize    c'x                maximize    p'b
    subject to  Ax = b             subject to  p'A ≤ c',
                x ≥ 0,

and assume that both problems have an optimal solution. Fix some j. Suppose that every optimal solution to the primal satisfies x_j = 0. Show that there exists an optimal solution p to the dual such that p'A_j < c_j. Hint: Let d be the optimal cost. Consider the problem of minimizing x_j subject to Ax = b, x ≥ 0, and c'x ≤ d, and form its dual.
(b) Show that there exist optimal solutions x and p to the primal and to the dual, respectively, such that for every j we have either x_j > 0 or p'A_j < c_j. Hint: Use part (a) for each j, and then take the average of the vectors obtained.
(c) Consider now the problem and its dual:

    minimize    c'x                maximize    p'b
    subject to  Ax ≥ b             subject to  p'A ≤ c'
                x ≥ 0,                         p ≥ 0.

Assume that both problems have an optimal solution. Show that there exist optimal solutions to the primal and to the dual, respectively, that satisfy strict complementary slackness, that is: (i) for every j, we have either x_j > 0 or p'A_j < c_j; (ii) for every i, we have either a_i'x > b_i or p_i > 0. Hint: Convert the primal to standard form and apply part (b).
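Part (a) can be seen concretely on a tiny hypothetical standard-form instance where x_2 = 0 in every primal optimum; solving the dual directly then exhibits the strict inequality p'A_2 < c_2:

```python
# Hedged sketch; the instance min x1 + 2*x2 s.t. x1 + x2 = 1, x >= 0 is made up.
# Its unique primal optimum is x = (1, 0), so x2 = 0 in every optimal solution.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])

primal = linprog(c, A_eq=A, b_eq=b, bounds=(0, None), method="highs")

# Dual: max p'b  s.t.  p'A <= c'  with p free; linprog minimizes -b'p.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=(None, None), method="highs")
p = dual.x

print(primal.x)             # (1, 0)
print(p @ A[:, 1], c[1])    # p'A_2 < c_2 holds strictly
```

The pair also illustrates part (b): x_1 > 0 pairs with the tight constraint p'A_1 = c_1, while x_2 = 0 pairs with the strict one, so strict complementary slackness holds componentwise.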
4.26
Let A be a given matrix. Show that exactly one of the following alternatives must hold: (a) there exists some x ≠ 0 such that Ax = 0 and x ≥ 0; (b) there exists some vector p such that p'A > 0'.
(¬(a) ⇒ (b)): Suppose that Ax = 0 and x ≥ 0 imply x = 0, i.e., that A has n linearly independent columns. Then A has rank n, so the rows of A span R^n. By setting p_i = 0 for all i ∉ {1, . . . , n}, it is clear that we can find a vector p such that p'A > 0'.
((a) ⇒ ¬(b)): Suppose there exists some x ≠ 0 such that Ax = 0 and x ≥ 0; then the columns of A are linearly dependent. For any p we have p'Ax = 0, whereas p'A > 0' together with x ≥ 0, x ≠ 0 would give p'Ax > 0, a contradiction. Hence no such p exists.
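This theorem-of-the-alternative can be probed numerically: by scaling, alternative (b) holds iff the system A'p ≥ 1 (componentwise) is feasible, which is a plain LP feasibility check. A hedged sketch on two made-up matrices, one for each alternative:

```python
# Hedged sketch; A1 and A2 are hypothetical test matrices.
import numpy as np
from scipy.optimize import linprog

def alternative_b_holds(A):
    """Feasibility of A'p >= 1 (equivalent, after scaling, to p'A > 0')."""
    m, n = A.shape
    res = linprog(np.zeros(m), A_ub=-A.T, b_ub=-np.ones(n),
                  bounds=(None, None), method="highs")
    return res.status == 0    # status 0: feasible; status 2: infeasible

A1 = np.eye(2)                 # Ax = 0, x >= 0 forces x = 0: alternative (a) fails
A2 = np.array([[1.0, -1.0]])   # x = (1, 1) >= 0 is a nonzero solution: (a) holds

print(alternative_b_holds(A1))   # (b) holds for A1
print(alternative_b_holds(A2))   # (b) fails for A2
```

Exactly one alternative holds in each case, as the exercise asserts.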