Exam 1 Notes
Daniel Guetta

Linear program terminology
o Decision variables: the variables we need to determine.
o Input parameters: the costs, times taken for things, and other given data.
o Objective function: the quantity that needs to be maximised or minimised.
o Unbounded: feasible choices of the decision variables can produce arbitrarily good objective function values.
o Linear program: the constraints and objective function are linear (i.e. weighted sums of the decision variables).
Solving problems graphically
o Optimal solution: the optimal objective line touches the feasible region only at a corner (extreme) point.
o Plotting a line of the form ax + by = C:
Get x by itself and set y to 0, and vice versa, to find the two intercepts (see the worked example below).
For the objective function, let C equal some arbitrary trial value.
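A quick worked example of the intercept method (the numbers are my own, for illustration): for 2x + 3y = 6, setting y = 0 gives x = 3 and setting x = 0 gives y = 2, so the line passes through (3, 0) and (0, 2).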
Step sizes and directions
o As long as the step size is finite, there's no indication that the program is unbounded.
o To find the conditions required for a feasible direction Δw at a given point:
Find the active (tight) constraints at that point.
Require that Δw · (constraint coefficient vector) is ≥ 0, ≤ 0, or = 0 as needed (≥ for an active ≥ constraint, ≤ for an active ≤ constraint, = for an equality).

o To check whether a direction is improving, dot it with the objective function vector and look at the sign of the result.
o To find the maximum step size λ, assuming you make the change λΔw: update all the variables accordingly, solve for the value of λ at which each constraint would be violated, and take the smallest one (sketched in the snippet below).
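A minimal numeric sketch of these checks in Python (the matrices, the point, the direction, and the assumption that every constraint has been written in ≤ form are my own, not from the notes):

    import numpy as np

    # Hypothetical data: maximise c.x subject to A x <= b.
    # Non-negativity is folded in as the last two rows (-x1 <= 0, -x2 <= 0).
    A = np.array([[ 1.0,  1.0],
                  [ 2.0,  1.0],
                  [-1.0,  0.0],
                  [ 0.0, -1.0]])
    b = np.array([4.0, 6.0, 0.0, 0.0])
    c = np.array([3.0, 2.0])

    x  = np.array([0.0, 4.0])      # a feasible point (rows 1 and 3 are tight)
    dw = np.array([1.0, -1.0])     # candidate direction

    active    = np.isclose(A @ x, b)              # active (tight) constraints at x
    feasible  = np.all(A[active] @ dw <= 1e-9)    # need A_i . dw <= 0 for each active <= row
    improving = c @ dw > 0                        # sign of the dot product with the objective

    # Maximum step size: smallest lambda at which an inactive constraint becomes tight.
    slack  = b - A @ x
    rate   = A @ dw
    ratios = np.where(rate > 1e-9, slack / np.where(rate > 1e-9, rate, 1.0), np.inf)
    print(feasible, improving, ratios.min())      # True True 2.0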
Improving search
o The feasible region of an LP is convex [the line segment between every pair of feasible points falls entirely within the feasible region].

o This means that every local optimal solution is a global optimal solution.
o Algorithm
Initialisation: choose a starting solution
Check for local optimality (no improving solution)
Find improving and feasible direction
Find step size
Advance
Standard-form LP
o All variables must be non-negative, and only equalities are included.
o Add slack variables to turn the inequality constraints into equalities (see the worked example at the end of this topic).

o The objective function is c·x and the constraints are Ax = b, x ≥ 0.


o For a non-positive variable, substitute a new variable x' = -x. For a variable that can be positive or negative, substitute the difference of two new non-negative variables, x = x+ - x-.
o For an absolute value |x|, replace it with a new variable z and add the constraints z ≥ x and z ≥ -x, with z ≥ 0.
o For a maximin (maximise the minimum), introduce a new variable f and maximise f subject to f ≤ [each term inside the minimum].
o n is the number of decision variables, m is the number of main constraints, cj is the objective function coefficient of xj, ai,j is the coefficient of xj in the i-th main constraint, and bi is the RHS of main constraint i.
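A small end-to-end illustration of these conversions in Python (the numbers, the variable ordering, and the use of scipy.optimize.linprog to check the result are my own, not from the notes):

    import numpy as np
    from scipy.optimize import linprog

    # Made-up problem: maximise 3*x1 + 2*x2
    # subject to x1 + x2 <= 4 and x1 <= 3, with x1 >= 0 and x2 free.
    # Standard form: split the free variable x2 into x2p - x2m and add
    # slacks s1, s2. Variable order: [x1, x2p, x2m, s1, s2].
    c    = np.array([3.0, 2.0, -2.0, 0.0, 0.0])
    A_eq = np.array([[1.0, 1.0, -1.0, 1.0, 0.0],
                     [1.0, 0.0,  0.0, 0.0, 1.0]])
    b_eq = np.array([4.0, 3.0])

    # linprog minimises, so negate c; every standard-form variable is >= 0.
    res = linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    x1, x2 = res.x[0], res.x[1] - res.x[2]
    print(x1, x2, -res.fun)   # expect roughly 3.0, 1.0, 11.0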
Types of points
o Interior point: no inequality constraint is active.
o Boundary point: at least one inequality constraint is satisfied as an equality at the given point.
o Extreme points of convex sets: those that do not lie within the line segment between any two other points in the set. Generally the solution of a system of n equations in n variables. The same extreme point can sometimes be determined by several different sets of active constraints.

o Adjacent extreme points: determined by sets of active constraints that differ in only one element.

Simplex intro
o Effectively an improving search algorithm
o Starts at an extreme point, and moves to an adjacent extreme point with a better objective value, until no adjacent extreme point has a better objective value.
o Algorithm
Initialization: choose a starting feasible solution
If no improving feasible direction, stop
Construct improving feasible direction
Choose step size [if no limit, stop]
Advance
Simplex standard display
o A table with one column per variable x1 ... xn and a right-hand-side column b.
o The top row holds the objective coefficients c1 ... cn for max c·x; the rows below hold the constraint matrix A (entries a11 ... amn) with the RHS values b1 ... bm.
o Beneath those come rows recording which variables are currently basic, the current solution x(0) with its value c·x(0), the simplex direction Δx for the chosen nonbasic variable together with the step-size ratios, the new basic variable, and the updated solution x(1) with its value c·x(1).
Simplex basic solutions


o Fix n - m of the variables (the nonbasic variables) to 0. Obtain a unique solution for the remaining system of m variables (the basic variables) and m equations (illustrated below).
o Qualifications
A basic solution is feasible (a BFS) if it satisfies all the non-negativity constraints.
Sometimes we can't even get a basic solution, if the system of equations obtained after setting the nonbasic variables to 0 doesn't have a unique solution.
o For a standard-form LP, the BFSs are exactly the extreme points of the feasible region; the simplex method searches among them.
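A tiny sketch of reading off one basic solution for a standard-form system (the matrices and the choice of basis are made up for illustration):

    import numpy as np

    # Made-up standard form: m = 2 equations in n = 4 variables.
    A = np.array([[1.0, 1.0, 1.0, 0.0],
                  [2.0, 1.0, 0.0, 1.0]])
    b = np.array([4.0, 6.0])

    basis = [0, 1]                  # pick m basic variables; the other n - m stay at 0
    x = np.zeros(A.shape[1])
    # Solve the m-by-m system for the basic variables; np.linalg.solve raises
    # LinAlgError when the basis matrix is singular (no unique basic solution).
    x[basis] = np.linalg.solve(A[:, basis], b)
    print(x, "BFS" if np.all(x >= -1e-9) else "basic but infeasible")   # [2. 2. 0. 0.] BFS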
Simplex first phase
o This phase finds a basic feasible solution by creating an artificial linear program (sketched below).
o Procedure
Multiply constraints by -1 as necessary to make b non-negative.
Add a non-negative artificial variable to each constraint, and set the objective function to minimise the sum of these artificial variables.
This has an easy-to-find BFS: set all the non-artificial variables to 0 (each artificial variable then equals the corresponding right-hand side).
o This is good because
It can't be infeasible (because it has a BFS)
It can't be unbounded (because the artificial variables are ≥ 0, so their sum is bounded below by 0)
o Solve.
o If the optimal solution of the artificial program doesn't make all the artificial variables 0, then the original program is infeasible.
o If the optimal solution of the artificial program has all the artificial variables equal to 0, then what's left over is a BFS for the original program.
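A sketch of how the phase-one program could be assembled and solved (the constraint data and the use of scipy.optimize.linprog are my own, not from the notes):

    import numpy as np
    from scipy.optimize import linprog

    # Made-up standard-form constraints A x = b with b already >= 0.
    A = np.array([[1.0, 1.0, 1.0, 0.0],
                  [2.0, 1.0, 0.0, 1.0]])
    b = np.array([4.0, 6.0])
    m, n = A.shape

    # Phase one: append one artificial variable per constraint and minimise their sum.
    A1 = np.hstack([A, np.eye(m)])
    c1 = np.concatenate([np.zeros(n), np.ones(m)])

    res = linprog(c1, A_eq=A1, b_eq=b, bounds=(0, None), method="highs")
    if res.fun > 1e-8:
        print("original LP is infeasible")
    else:
        print("feasible point for the original LP:", res.x[:n])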
Simplex finding an improving direction
o To find an improving direction, we choose one nonbasic variable and move it into the basis. To decide which, we compute reduced costs (sketched at the end of this topic) and see which one improves our objective the most.
o Procedure
Choose a nonbasic variable xj.
Move in the direction Δx with Δxj = +1 and Δxi = 0 for every other nonbasic variable xi.
We need A(x + λΔx) = b, i.e. AΔx = 0: this gives m equations in the m basic components of Δx, with a unique solution because x is a BFS.
The change in the objective function is c·(x + λΔx) - c·x = λ c·Δx.
So we calculate the reduced cost for variable j: c̄j = c·Δx.
o We calculate those for each nonbasic variable, and then choose the
one with the best reduced cost to move into the basis.
o If there is no improving direction, stop. We're done.
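A small numeric sketch of building the simplex direction and the reduced cost for one candidate variable (the matrices, the basis, and the index choices are made up, not from the notes):

    import numpy as np

    # Made-up standard-form data: max c.x subject to A x = b, x >= 0.
    A = np.array([[1.0, 1.0, 1.0, 0.0],
                  [2.0, 1.0, 0.0, 1.0]])
    b = np.array([4.0, 6.0])
    c = np.array([3.0, 2.0, 0.0, 0.0])

    basis = [2, 3]                    # current basic variables (the two slacks)
    B = A[:, basis]

    j = 0                             # candidate entering (nonbasic) variable
    dx = np.zeros(A.shape[1])
    dx[j] = 1.0                       # the entering variable increases by 1
    dx[basis] = np.linalg.solve(B, -A[:, j])   # chosen so that A dx = 0
    reduced_cost = c @ dx             # > 0 means an improving direction for a max problem
    print(dx, reduced_cost)           # [ 1.  0. -1. -2.]  3.0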
Simplex step size
o This is an LP, so an improving direction keeps improving forever, and we constructed the direction so that the equality constraints stay satisfied; any problem must therefore come from violating non-negativity. We want to increase λ until we are about to violate one of those.
o Increasing λ can only cause trouble for variables with Δxj < 0.
o We therefore use the step size λ = min { xj(t) / (-Δxj) : Δxj < 0 }.
Calculate this ratio for every basic variable with our chosen direction, add the ratios to the standard display, and pick the minimum one (sketched below).
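The ratio test on the toy iterate from the previous sketch (numbers are made up):

    import numpy as np

    x  = np.array([0.0, 0.0, 4.0, 6.0])    # current BFS (the two slacks are basic)
    dx = np.array([1.0, 0.0, -1.0, -2.0])  # simplex direction for entering variable 0

    # Only components with dx < 0 can drive a variable negative.
    neg = dx < -1e-9
    lam = np.min(x[neg] / -dx[neg]) if neg.any() else np.inf   # np.inf => LP is unbounded
    print(lam)                              # min(4/1, 6/2) = 3.0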
Simplex updating the basis

o x(t+1) = x(t) + λΔx
o Nonbasic variable used to generate direction becomes basic.
o Basic variable that determines step size becomes nonbasic.
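Applying the update to the same made-up iterate:

    import numpy as np

    x   = np.array([0.0, 0.0, 4.0, 6.0])   # x(t)
    dx  = np.array([1.0, 0.0, -1.0, -2.0]) # simplex direction for entering variable 0
    lam = 3.0                               # step size from the ratio test
    basis, entering, leaving = [2, 3], 0, 3 # variable 3 hit zero first in the ratio test

    x = x + lam * dx                        # x(t+1) = x(t) + lambda * dx
    basis[basis.index(leaving)] = entering  # swap the leaving and entering variables
    print(x, basis)                         # [3. 0. 1. 0.]  [2, 0]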
Simplex degeneracy

o Happens if more than the required number of constraints are active at a given extreme point.
o In other words, one of the basic variables is 0.
o The simplex method may generate a step size of 0 (if the simplex
direction involves decreasing a variable that is already equal to 0)
and can then get stuck for a few steps.
o Computations will (usually) escape these zero-length moves and
eventually produce a direction where improving progress can be
made.
o Thus, the simplex method does not necessarily move to an adjacent extreme point, but it does move to an adjacent basis.
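A small illustration of degeneracy (my own numbers): with n = 2 decision variables and constraints x1 ≤ 2, x2 ≤ 2, x1 + x2 ≤ 4, the point (2, 2) has three active constraints where only two are needed, so in standard form one of the basic slack variables equals 0 there, and a pivot whose direction would decrease that slack gets step size λ = 0.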
