
IEOR 6613 - Optimization I

HW 4: 4.4, 4.7, 4.9, 4.12, 4.14, 4.20, 4.26


John Min
jcm2199
October 9, 2013

4.4
Let A be a symmetric square matrix. Consider the linear programming problem

minimize c′x
subject to Ax ≥ c
x ≥ 0.

Prove that if x* satisfies Ax* = c and x* ≥ 0, then x* is an optimal solution.

Suppose Ax* = c and x* ≥ 0. Let us formulate the dual:

maximize p′c
subject to p′A ≤ c′
p ≥ 0.

For any dual-feasible p, the constraints p′A ≤ c′ and x* ≥ 0 give p′Ax* ≤ c′x*; since Ax* = c, this reads p′c ≤ c′x*, so c′x* bounds the dual objective from above.

Since the dual objective is to maximize p′c, set p = x*. Since x* ≥ 0, we have p ≥ 0, and since A is square and symmetric, p′A = (Ax*)′ = c′, so p is dual feasible. Thus x* and p are feasible solutions to the primal and dual, and p′b = c′x*, where b = c in this problem under the standard notation. By Corollary 4.2, x* and p are optimal solutions to the primal and dual, respectively.
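As a quick numerical sanity check of this argument (an illustrative sketch: the symmetric matrix, the choice of x*, and the use of scipy's linprog are all made-up assumptions, not part of the exercise):

```python
import numpy as np
from scipy.optimize import linprog

# Made-up symmetric A and nonnegative x_star; choose c = A x_star so that
# the hypothesis A x_star = c, x_star >= 0 holds by construction.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
x_star = np.array([1.0, 2.0])
c = A @ x_star

# Solve: minimize c'x subject to Ax >= c, x >= 0.
# linprog expects A_ub @ x <= b_ub, so negate the >= constraints.
res = linprog(c, A_ub=-A, b_ub=-c, bounds=[(0, None)] * 2)

print(res.fun, c @ x_star)  # the LP optimum equals c'x_star, as claimed
```

Here the solver's optimal cost coincides with c′x*, matching the conclusion of the proof.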

4.7 (Duality in piecewise linear convex optimization)


Consider the problem of minimizing max_{i=1,...,m} (a_i′x − b_i) over all x ∈ R^n. Let v be the value of the optimal cost, assumed finite. Let A be the matrix with rows a_1′, ..., a_m′, and let b be the vector with components b_1, ..., b_m.

(a) Consider any vector p ∈ R^m that satisfies p′A = 0′, p ≥ 0, and Σ_{i=1}^{m} p_i = 1. Show that −p′b ≤ v.

(b) In order to obtain the best possible lower bound of the form considered in part (a), we form the linear programming problem

maximize −p′b
subject to p′A = 0′
p′e = 1
p ≥ 0.
Formulate the dual: attach dual variables x ∈ R^n to the constraints p′A = 0′ and v ∈ R to p′e = 1. The dual is

minimize v
subject to a_i′x + v ≥ −b_i, i = 1, ..., m,
x, v free.

After the change of variables x → −x, the constraints read v ≥ a_i′x − b_i for all i, so the dual is exactly the original min-max problem in epigraph form. By strong duality, the optimal value of the lower-bounding problem equals v.
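The two sides of this duality can be checked on a toy instance (the data and the use of scipy's linprog are illustrative assumptions; the epigraph reformulation is the standard one):

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance (made-up numbers): f(x) = max(x - 0, -x - (-2)) = max(x, 2 - x)
A = np.array([[1.0], [-1.0]])
b = np.array([0.0, -2.0])
m, n = A.shape

# Epigraph form of min_x max_i (a_i'x - b_i): minimize v s.t. Ax - v e <= b
res_primal = linprog(np.r_[np.zeros(n), 1.0],
                     A_ub=np.c_[A, -np.ones(m)], b_ub=b,
                     bounds=[(None, None)] * (n + 1))
v = res_primal.fun

# Best lower bound from part (b): maximize -p'b s.t. p'A = 0', p'e = 1, p >= 0
res_dual = linprog(b,  # minimizing p'b is maximizing -p'b
                   A_eq=np.vstack([A.T, np.ones((1, m))]),
                   b_eq=np.r_[np.zeros(n), 1.0],
                   bounds=[(0.0, None)] * m)
print(v, -res_dual.fun)  # strong duality: the two values coincide
```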

4.9 (Back-propagation of dual variables in a multiperiod problem)


A company makes a product that can be either sold or stored to meet future demand. Let t = 1, ..., T denote the periods of the planning horizon. Let b_t be the production volume during period t, which is assumed to be known in advance. During each period t, a quantity x_t of the product is sold, at a unit price of d_t. Furthermore, a quantity y_t can be sent to long-term storage, at a unit transportation cost of c. Alternatively, a quantity w_t can be retrieved from storage, at zero cost. We assume that when the product is prepared for long-term storage, it is partly damaged, and only a fraction f of the total survives. Demand is assumed to be unlimited. The main question is whether it is profitable to store some of the production in anticipation of higher prices in the future. This leads us to the following problem, where z_t stands for the amount kept in long-term storage at the end of period t:

maximize Σ_{t=1}^{T} α^{t−1}(d_t x_t − c y_t) + α^T d_{T+1} z_T
subject to x_t + y_t − w_t = b_t, t = 1, ..., T,
z_t + w_t − z_{t−1} − f y_t = 0, t = 1, ..., T,
z_0 = 0,
x_t, y_t, w_t, z_t ≥ 0.
(a) Let p_t and q_t be the dual variables associated with the first and second equality constraints, respectively. Write down the dual problem.

(b) Assume that 0 < f < 1, b_t ≥ 0, and c ≥ 0. Show that the following formulae provide an optimal solution to the dual problem:

(c) Explain how the result in part (b) can be used to compute an optimal solution to the original problem. Primal and dual nondegeneracy can be assumed.
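The primal can be sketched as a small LP (every number here, including the choice α = 1, i.e. no discounting, is a made-up illustration; scipy's linprog does the solving):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance; T, f, c, b, d, d_{T+1} and alpha are illustrative.
T, f, c_store, alpha = 3, 0.8, 1.0, 1.0
b = [10.0, 10.0, 10.0]   # production b_t
d = [1.0, 1.0, 5.0]      # sale prices d_t
d_next = 6.0             # d_{T+1}: value of a unit still in storage at the end

n = 4 * T                # variables ordered (x_t, y_t, w_t, z_t) per period

def idx(t, k):           # k: 0 = x, 1 = y, 2 = w, 3 = z
    return 4 * t + k

A_eq = np.zeros((2 * T, n))
b_eq = np.zeros(2 * T)
for t in range(T):
    # sales balance: x_t + y_t - w_t = b_t
    A_eq[t, idx(t, 0)], A_eq[t, idx(t, 1)], A_eq[t, idx(t, 2)] = 1, 1, -1
    b_eq[t] = b[t]
    # storage balance: z_t + w_t - z_{t-1} - f y_t = 0, with z_0 = 0
    A_eq[T + t, idx(t, 3)] = 1
    A_eq[T + t, idx(t, 2)] = 1
    if t > 0:
        A_eq[T + t, idx(t - 1, 3)] = -1
    A_eq[T + t, idx(t, 1)] = -f

# maximize sum_t alpha^{t-1}(d_t x_t - c y_t) + alpha^T d_{T+1} z_T
# (linprog minimizes, so negate the objective)
obj = np.zeros(n)
for t in range(T):
    obj[idx(t, 0)] = -alpha**t * d[t]
    obj[idx(t, 1)] = alpha**t * c_store
obj[idx(T - 1, 3)] = -alpha**T * d_next

res = linprog(obj, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
profit = -res.fun
print(profit)  # with these prices, storing periods 1-2 and selling period 3 wins
```

With the prices above, a unit stored in period 1 or 2 is worth f·d_{T+1} − c = 3.8 instead of 1, so the solver stores early production and sells only period 3 directly.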

4.12 (Degeneracy and uniqueness)


Consider a general linear programming problem and suppose that we have a nondegenerate basic feasible
solution to the primal. Show that the complementary slackness conditions lead to a system of equations for
the dual vector that has a unique solution.

Let x ∈ R^n be a nondegenerate basic feasible solution to the primal. Then exactly n linearly independent constraints are active at x, and every other constraint holds with strict inequality.

The complementary slackness conditions state the following:

p_i(a_i′x − b_i) = 0 for all i,
(c_j − p′A_j)x_j = 0 for all j.
Let us consider an LP in general form, together with its dual:

minimize c′x                        maximize p′b
subject to a_i′x ≥ b_i, i ∈ M1,     subject to p_i ≥ 0, i ∈ M1,
           a_i′x ≤ b_i, i ∈ M2,                p_i ≤ 0, i ∈ M2,
           a_i′x = b_i, i ∈ M3,                p_i free, i ∈ M3,
           x_j ≥ 0, j ∈ N1,                    p′A_j ≤ c_j, j ∈ N1,
           x_j ≤ 0, j ∈ N2,                    p′A_j ≥ c_j, j ∈ N2,
           x_j free, j ∈ N3,                   p′A_j = c_j, j ∈ N3.
At the nondegenerate basic feasible solution x, exactly n linearly independent constraints are active; suppose these are the equality constraints, so that |M3| = n and the remaining constraints hold as strict inequalities. By the complementary slackness conditions, p_i = 0 for every i ∈ M1 ∪ M2, since those constraints are inactive at x. It remains to determine p_i for i ∈ M3.

Construct the matrix B whose rows are the a_i′, i ∈ M3; B is a full-rank n × n matrix. If some x_j = 0, then x would be degenerate, because more than n constraints would be active: the constraint x_j = 0 in addition to those represented in B. Therefore x_j ≠ 0 for every j, and by complementary slackness, c_j − p′A_j = 0, i.e., c_j = p′A_j.

Let A be the matrix with rows a_i′. WLOG suppose that M3 = {1, ..., n}, i.e., the rows are ordered so that the equality constraints occupy the first n rows of A (these rows form B), and p_i = 0 for i > n. Then only the first n components of each column A_j are relevant to the complementary slackness condition c_j = p′A_j, because the remaining m − n components are multiplied by p_i = 0. Letting q ∈ R^n collect the first n components of p, the condition reads c_j = q′B_j for every j, i.e., c′ = q′B, so q′ = c′B^{−1}. Set p_i = q_i for i = 1, ..., n, and p_i = 0 for all other i.

Thus p is uniquely determined: its first n components are given by the nonsingular system q′ = c′B^{−1}, and the remaining m − n components are forced to 0 by the complementary slackness conditions.
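The construction can be illustrated on a small standard-form LP (the instance and the basis choice are made-up; numpy stands in for the algebra with B):

```python
import numpy as np

# Illustrative standard-form instance: minimize c'x s.t. Ax = b, x >= 0
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
b = np.array([4.0, 2.0])
c = np.array([2.0, 3.0, 0.0, 0.0])

basic = [0, 1]                       # basis giving a nondegenerate BFS
B = A[:, basic]
x = np.zeros(4)
x[basic] = np.linalg.solve(B, b)     # x_B = B^{-1} b
assert all(x[basic] > 0)             # nondegenerate: every basic variable positive

# Complementary slackness forces c_j = p'A_j for every j with x_j != 0,
# i.e. p'B = c_B', a square nonsingular system with a unique solution:
p = np.linalg.solve(B.T, c[basic])

# Verify that the complementary slackness conditions hold for this (x, p)
assert np.allclose((c - p @ A) * x, 0)
print(x, p)
```

Because B is invertible, the solve for p cannot have more than one solution, which is exactly the uniqueness claim of the exercise.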

4.20
(a) Consider the following linear programming problem and its dual:

minimize c′x          maximize p′b
subject to Ax = b     subject to p′A ≤ c′,
x ≥ 0,

and assume that both problems have an optimal solution. Fix some j. Suppose that every optimal solution to the primal satisfies x_j = 0. Show that there exists an optimal solution p to the dual such that p′A_j < c_j. Hint: Let d be the optimal cost. Consider the problem of minimizing x_j subject to Ax = b, x ≥ 0, and c′x ≤ d, and form its dual.

(b) Show that there exist optimal solutions x and p to the primal and to the dual, respectively, such that for every j we have either x_j > 0 or p′A_j < c_j. Hint: Use part (a) for each j, and then take the average of the vectors obtained.

(c) Consider now the following LP and its dual:

minimize c′x          maximize p′b
subject to Ax ≥ b     subject to p′A ≤ c′
x ≥ 0,                p ≥ 0.

Assume that both problems have an optimal solution. Show that there exist optimal solutions to the primal and to the dual, respectively, that satisfy strict complementary slackness, that is:
(i) For every j, we have either x_j > 0 or p′A_j < c_j.
(ii) For every i, we have either a_i′x > b_i or p_i > 0.
Hint: Convert the primal to standard form and apply part (b).
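The averaging recipe in the hints of parts (a)-(b) can be sketched numerically (illustrative data; the auxiliary-LP loop is one way to realize the construction the hint suggests, using scipy's linprog):

```python
import numpy as np
from scipy.optimize import linprog

# Small standard-form instance (made-up data): minimize c'x s.t. Ax = b, x >= 0.
# Its optimal face is {(x1, x2, 0) : x1 + x2 = 1}, so x3 = 0 in every optimum.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 1.0, 2.0])
n = A.shape[1]

d = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * n).fun  # optimal cost

# For each j, minimize x_j over the optimal face {Ax = b, x >= 0, c'x <= d},
# then average the minimizers to land in the face's relative interior.
sols = []
for j in range(n):
    e_j = np.zeros(n)
    e_j[j] = 1.0
    r = linprog(e_j, A_eq=A, b_eq=b,
                A_ub=c[None, :], b_ub=[d + 1e-7],  # small numerical slack
                bounds=[(0, None)] * n)
    sols.append(r.x)
x_star = np.mean(sols, axis=0)
print(x_star)  # x1, x2 > 0 while x3 stays 0, as strict complementarity predicts
```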

4.26
Let A be a given matrix. Show that exactly one of the following alternatives must hold:

(a) There exists some x ≠ 0 such that Ax = 0, x ≥ 0.

(b) There exists some p such that p′A > 0.

(¬(a) ⇒ (b)): Suppose alternative (a) fails, so the only x ≥ 0 with Ax = 0 is x = 0. Consider the LP

minimize 0′x
subject to Ax = 0,
e′x = 1,
x ≥ 0,

which is then infeasible: any feasible x would be a nonzero, nonnegative solution of Ax = 0. Its dual,

maximize q
subject to p′A_j + q ≤ 0, j = 1, ..., n,

is feasible (take p = 0, q = 0). An infeasible primal together with a feasible dual forces the dual to be unbounded, so there exist (p, q) with q > 0 and p′A_j ≤ −q < 0 for all j. Setting p̂ = −p gives p̂′A > 0, so (b) holds.

((a) ⇒ ¬(b)): Suppose there exists some x ≠ 0 such that Ax = 0 and x ≥ 0. For any p, p′Ax = p′(Ax) = 0. Since x ≥ 0 with at least one positive component, p′A > 0 would imply p′Ax > 0, a contradiction. Hence (a) and (b) cannot hold simultaneously.
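The dichotomy can be tested numerically on small matrices (a sketch: the LP test for alternative (a) and the two sample matrices are my own illustrations, with scipy's linprog):

```python
import numpy as np
from scipy.optimize import linprog

def which_alternative(A):
    """Decide which alternative of the theorem holds for a matrix A.

    Alternative (a) asks for x != 0 with Ax = 0, x >= 0. We test it by
    maximizing e'x over {Ax = 0, 0 <= x <= 1}: the optimum is positive
    exactly when such an x exists (scaling preserves feasibility).
    """
    m, n = A.shape
    res = linprog(-np.ones(n), A_eq=A, b_eq=np.zeros(m),
                  bounds=[(0.0, 1.0)] * n)
    return "a" if -res.fun > 1e-9 else "b"

print(which_alternative(np.array([[1.0, -1.0]])))  # (a): x = (1, 1) works
print(which_alternative(np.array([[1.0, 1.0]])))   # (b): p = (1) gives p'A > 0
```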