Linear Programming 2

The document discusses linear programming problems (LPP), including defining the objective function, constraints, and non-negativity principle of LPPs. It describes how to formulate an LPP and solve it using graphic and simplex methods, and covers post-optimality analysis.

Uploaded by mengistu

4/22/24, 8:16 AM Linear programming 2

UNIT 3: LINEAR PROGRAMMING

Contents
3.0 Aim and Objectives
3.1 Introduction
3.2 Definition
3.2.1 The Objective Function
3.2.2 The Constraints
3.2.3 The Non-Negativity Constraint
3.3 Formulation of LPP
3.4 Solving LPP: Graphic Method
3.5 Primal Versus Dual
3.6 The Simplex Method
3.6.1 Converting LPP to SLPP
3.6.2 Constructing the Initial Simplex Table
3.6.3 Getting the Optimal Solution
3.6.4 Ties in the Entering or Leaving Variables
3.6.5 Alternative Optimal Solution
3.6.6 Unbounded Solution
3.6.7 Minimizing LPPs with ≥ Constraints
3.6.8 LPPs with Negative Decision Variables
3.6.9 LPPs with Slack Variables that Affect the Objective Function
3.6.10 Post-Optimality (Sensitivity) Analysis
3.6.11 Finding the Admissible Range of Change
3.6.12 Key Words

3.0 AIM AND OBJECTIVES

The aim of this unit is to enable you to formulate linear programming problems (LPP),
optimize them using the graphic and/or simplex methods, and run post-optimality
analysis.


After completing this unit, you will be able to:


• formulate LPPs
• differentiate the primal and the dual LPPs
• solve LPPs using graphic method
• solve LPPs using simplex method
• run post-optimality analysis

3.1 INTRODUCTION

In this unit, you will be introduced to linear programming problems (LPP). First you will
learn about the major components of an LPP, then you will learn how to correctly
formulate an LPP and a standard linear programming problem (SLPP). Then you will learn
how to solve LPPs using the graphic and simplex methods. Finally, you will learn some
special cases and post-optimality analysis.

Throughout this unit, we employ the techniques of matrix algebra discussed in the
previous unit. Therefore, if you find difficulties, revise Unit Two: Matrix Algebra.

3.2 DEFINITION:

Linear programming is a tool or technique by which one can select the optimal solution
of problems like distribution of commodities, allocation of machines, scheduling of
transport services, scheduling of production, etc. In linear programming models the
objective function and all the constraints are polynomials of degree one; that is why
the name linear.

A typical linear programming problem has three parts: the objective function, the
constraints, and the non-negativity principle.

3.2.1 The Objective Function.


The primary task of the programmer is to formulate a clear and measurable objective. An
objective should be either to maximize or to minimize something. The manager may


want to maximize profit, sales, quality, efficiency, welfare, production, etc., or he or she
may want to minimize cost, inefficiency, time, waste, pollution, etc.

The objective function is of the form:

Maximize π = P1X1 + P2X2 + … + PnXn
or Minimize C = C1X1 + C2X2 + … + CnXn

where π is profit (or any other variable to be maximized),
C is cost (or any other variable to be minimized),
P1, P2, …, Pn represent the contributions of X1, X2, …, Xn to the profit (or the
variable to be maximized), and
C1, C2, …, Cn represent the contributions of X1, X2, …, Xn to the cost (or the
variable to be minimized).

3.2.2 The Constraints.

The constraints define the situations under which the objective function should be
maximized or minimized. It answers questions like: how much money, time, resources,
technical capacity, etc. are available to maximize or minimize the objective function?
How big or narrow is the market demand? All the relevant constraints should be
formulated in linear form. A constraint may be of the <, ≤, =, ≥ or > type.

The constraints:

a11X1 + a12X2 + a13X3 + … + a1nXn (<, ≤, =, ≥, >) b1
a21X1 + a22X2 + a23X3 + … + a2nXn (<, ≤, =, ≥, >) b2
…
am1X1 + am2X2 + am3X3 + … + amnXn (<, ≤, =, ≥, >) bm

The coefficients a11, a12, …, amn are called technical coefficients of the Xi and measure
the contribution of each Xi to the constraint.

The matrix

        a11  a12  a13  …  a1n
A =     a21  a22  a23  …  a2n
        …
        am1  am2  am3  …  amn

is called the technical coefficient matrix.


X1, X2, …, Xn are called decision variables. Managerial decision is expected to be based
on the values of these variables.

Question: List the major types of constraints faced by producers.


Answer.
1. Budget constraint
2. Technical constraint
3. Time constraint
4. Resource constraint
5. Market demand constraint
6. Legal constraint
7. Environmental constraint
8. Cultural constraint
etc.

Note that some of the constraints may be very difficult or even impossible to express in
mathematical form. However, for better results, you should consider all the constraints
you face against your objective.

3.2.3 The Non-Negativity Constraint.

The non-negativity constraint is actually a principle which prohibits any of the
variables from being negative.

In economics in general, and in linear programming in particular, negative numbers are
rarely used. Mathematical economics operates largely in the first quadrant, where all the
variables are greater than or equal to zero.


[Figure: the four quadrants of the X-Y plane. Most economic models operate in
quadrant I, where X and Y ≥ 0.]

Using negative numbers in economics may be misleading. For example, what does it
mean to produce -10 quintals of wheat? One cannot produce a negative quantity. If it is
destruction, call it destruction rather than negative production. Likewise, you cannot pay
a negative price. Therefore, in principle, the decision variables should not be negative. That
is why the non-negativity constraint becomes an integral part of the typical linear
programming problem. It is written as
Xi ≥ 0 for all i
or X1, X2, …, Xn ≥ 0

Note that under specific conditions, negative decision variables may be admissible. In
such cases, the non-negativity principle should be relaxed or totally ignored (see section
3.6.8).

A set of variables X1, X2, X3, …, Xn that satisfies all the constraints is called a feasible
point or a feasible vector. The set of all such points constitutes the feasible area or the
feasible space.

3.3 FORMULATION OF LINEAR PROGRAMMING PROBLEM (LPP)

Example 1) A firm produces two types of commodities, A and B, using two types of
inputs, I1 and I2. Production of one A requires 12 units of I1 and 10 units of I2, and
production of one B requires 7 units of I1 and 15 units of I2. The daily supply of I1 to the
firm cannot exceed 84 units and that of I2 cannot exceed 150 units. Due to limited demand,
the daily production of A cannot exceed 6; however, there is no such limit with respect to the


demand for B. The manager wants to maximize the sum of the two commodities. How
many As and Bs should be produced in order to fulfill the manager’s objectives?

Solution:
Now, the primary task is to properly formulate the problem. We can convert the above
explanation into a table form.

            Input Requirement to     Maximum Daily
            Produce One              Supply
            A          B
Input 1     12         7             84
Input 2     10         15            150

Let X1 be the number of commodity A, and
X2 be the number of commodity B.

The LPP will be


Maximize S = X1 + X2      - The objective function
Subject to
12X1 + 7X2 ≤ 84           - The input 1 constraint
10X1 + 15X2 ≤ 150         - The input 2 constraint
X1 ≤ 6                    - The demand constraint
X1, X2 ≥ 0                - The non-negativity constraint

Check your progress


Three products are processed through three different machines. The time required per unit
of each product, the daily capacity of the machines and the net profit per unit sold for
each product are given below:

Time required per unit (in minutes) and machine capacity (minutes per day):


Machine                  Product I   Product II   Product III   Capacity
M1                       5           2            4             52
M2                       0           4            3             55
M3                       2           3            6             50
Net profit/unit (Birr)   4           6            3

Build up a linear programming model of the above production-planning problem and find
the optimum daily production for the three products that maximizes the net profit.

Solution:
Let X1 be the number of product I,
X2 be the number of product II, and
X3 be the number of product III.

Maximize  = 4X1 + 6X2 + 3X3 The objective function


Subject to:
5X1 + 2X2 + 4X3  52 Machine 1 constraint
4X2 + 3X 3  55 Machine 2 constraint
2X1 + 3X2 + 6X3  50 Machine 3 constraint
X11, X2 , X3  0 The non-negativity constraint
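Although the graphic method of the next section cannot handle three decision variables, the formulation above can still be checked numerically. Each corner of the feasible region is the intersection of three of the constraint planes (the three machine constraints plus the non-negativity constraints), so a brute-force sketch can enumerate them with exact fractions. This Python code is illustrative only; the simplex method of section 3.6 is the systematic tool:

```python
from fractions import Fraction as F
from itertools import combinations

# Constraints written as a.x <= b: three machine constraints plus
# the non-negativity constraints -Xi <= 0.
A = [[5, 2, 4], [0, 4, 3], [2, 3, 6],
     [-1, 0, 0], [0, -1, 0], [0, 0, -1]]
b = [52, 55, 50, 0, 0, 0]
c = [4, 6, 3]                      # net profit per unit of products I, II, III

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

best = None
for rows in combinations(range(6), 3):
    M = [A[i] for i in rows]
    d = det3(M)
    if d == 0:
        continue                   # the three planes do not meet in one point
    # Cramer's rule: replace column j of M by the right-hand sides.
    x = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for k in range(3):
            Mj[k][j] = b[rows[k]]
        x.append(F(det3(Mj), d))
    # Keep the vertex only if it satisfies every constraint.
    if all(sum(a*xi for a, xi in zip(A[i], x)) <= b[i] for i in range(6)):
        z = sum(ci*xi for ci, xi in zip(c, x))
        if best is None or z > best[0]:
            best = (z, x)

print(best[0])                     # maximum net profit
```

The enumeration reports a maximum net profit of 100 Birr per day; the simplex method reaches the same optimum far more efficiently.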

3.4 SOLVING LINEAR PROGRAMMING PROBLEMS (LPP)

Graphic method
If the LPP consists of only two decision variables, then the problem can be solved using
the graphic method.

Example 1.
Maximize Z = 2X + 3Y
Subject to:
6X + 4Y ≤ 24
4X + 5Y ≤ 20


X, Y ≥ 0

In order to solve this LPP, we draw the graphs of the constraints.

[Figure: the lines 6X + 4Y = 24 and 4X + 5Y = 20 bound the feasible area, with
corners O(0, 0), A(4, 0), B and C(0, 4).]

If an optimal solution of an LPP exists, then the solution will be the coordinates of
one of the corners of the feasible area.

Any point in the feasible area satisfies the constraints, but the task is to identify the point
that maximizes the objective function. Checking each and every point in the feasible area
is both unnecessary and impossible. Of course, the maximizing point cannot be inside the
feasible area, since for every point inside the feasible area there is another point that will
improve the objective function. Therefore, the solution should be at a corner. In the
above example, the solution should be at O, A, B or C. Thus, we have to check
each of these points.

Corner Points   X     Y     Z = 2X + 3Y
0 0 0 2 (0) + 3 (0) = 0
A 4 0 2 (4) + 3 (0) = 8
B 20/7 12/7 2 (20/7) + 3 (12/7) = 76/7 = 10.86
C 0 4 2 (0) + 3 (4) = 12

Note: in order to find the coordinates of B, solve the two equations simultaneously
6X + 4Y = 24
4X + 5Y = 20
After solving these simultaneous equations, you will get X = 20/7 and Y = 12/7.

The objective function Z is maximum at point C where X = 0 and Y = 4. Therefore, the


optimal solution is X = 0 and Y = 4, Maximum Z = 12
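The corner-point check in the table above can be scripted directly; a minimal Python sketch (illustrative, not part of the original unit):

```python
# Evaluate Z = 2X + 3Y at each corner of the feasible area.
corners = {"O": (0, 0), "A": (4, 0), "B": (20/7, 12/7), "C": (0, 4)}

def Z(x, y):
    return 2*x + 3*y

best = max(corners, key=lambda p: Z(*corners[p]))
print(best, corners[best], Z(*corners[best]))  # C (0, 4) 12
```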


Example 2.
Solve the LPP discussed in 3.3 above, using graphic method

Solution: The LPP is

Maximize S = X1 + X2
Subject to:
12X1 + 7X2 ≤ 84
10X1 + 15X2 ≤ 150
X1 ≤ 6
X1, X2 ≥ 0

Then, plot the graph of the constraints.

[Figure: the lines 12X1 + 7X2 = 84 (intercepts 7 and 12), 10X1 + 15X2 = 150
(intercepts 15 and 10) and X1 = 6 bound the feasible area, with corners O(0, 0),
A(6, 0), B, C and D(0, 10).]

Then, check the corners


Corner Points   X1      X2      S = X1 + X2
0               0       0       0
A               6       0       6
B               6       12/7    54/7 ≈ 7.71


C               21/11   96/11   117/11 ≈ 10.64
D               0       10      10

Hence, the optimal solution is X1 = 21/11, X2 = 96/11 and Maximum S = 117/11 ≈ 10.64.
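The coordinates of corner C come from solving the two binding constraint equations simultaneously. A quick exact-arithmetic check by Cramer's rule (an illustrative Python sketch):

```python
from fractions import Fraction as F

# Corner C is the intersection of 12X1 + 7X2 = 84 and 10X1 + 15X2 = 150.
a11, a12, b1 = 12, 7, 84
a21, a22, b2 = 10, 15, 150

det = a11*a22 - a12*a21            # 12*15 - 7*10 = 110
x1 = F(b1*a22 - a12*b2, det)       # Cramer's rule for X1
x2 = F(a11*b2 - b1*a21, det)       # Cramer's rule for X2
print(x1, x2, x1 + x2)             # 21/11 96/11 117/11
```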

Example 3. The case of minimization LPP


Minimize Z = 5X + 7Y
Subject to:
X + 2Y ≤ 20
3X + Y ≥ 15
4X + 3Y ≤ 60
X, Y ≥ 0

Solution: We plot the lines. Note that the second constraint is of the ≥ type.

[Figure: constraint lines I, II and III; the feasible area has corners A(5, 0),
B(15, 0), C(12, 4) and D(2, 9).]

Corner Points   X     Y     Z = 5X + 7Y
A 5 0 25
B 15 0 75
C 12 4 88
D 2 9 73

Hence the optimal solution is at A, where X = 5 and Y = 0. The optimal (minimum) Z
equals 25.


Note that if the objective was to maximize Z = 5X + 7Y, then the optimal solution would
have been at point C, where X = 12 and Y = 4, with Max Z = 88.

Example 4. The case of no solution LPP


Maximize Z = 3X + 4Y
Subject to:
-X + Y ≥ 1
-X + Y ≤ 0
X, Y ≥ 0

[Figure: the half-planes of constraints I and II do not overlap, so there is no
feasible area.]
As is clear from the figure, the two constraints do not intersect and therefore there is
no feasible point. Hence, the given LPP has no solution, and consequently no optimal
solution.

Example 5. The case of alternative optimal solutions

Maximize 𝜃 = 4X1 + 18X2

Subject to:
2X1 + 9X2 ≤ 36
4X1 + 3X2 ≤ 42
X1, X2 ≥ 0

[Figure: the lines 4X1 + 3X2 = 42 (intercept 10.5) and 2X1 + 9X2 = 36 (intercept 18)
bound the feasible area, with corners O(0, 0), A(10.5, 0), B(9, 2) and C(0, 4).]

Corner Points   X1      X2      𝜃
0               0       0       0
A               21/2    0       42
B               9       2       72
C               0       4       72

Both points B and C are optimal solutions. Actually, any point on the line segment BC is
optimal. Therefore, there are many alternative optimal solutions. Note that the objective
function and the first constraint are parallel.
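That every point of segment BC is optimal can be verified directly: any convex combination of B and C stays on the binding constraint 2X1 + 9X2 = 36, and since 𝜃 = 2·(2X1 + 9X2), the objective value stays at 72. A small illustrative Python check:

```python
from fractions import Fraction as F

# B = (9, 2) and C = (0, 4) both lie on the binding constraint 2X1 + 9X2 = 36,
# and theta = 4X1 + 18X2 = 2*(2X1 + 9X2) is parallel to it.
def theta(x1, x2):
    return 4*x1 + 18*x2

B, C = (F(9), F(2)), (F(0), F(4))
for t in (F(0), F(1, 4), F(1, 2), F(3, 4), F(1)):
    x1 = (1 - t)*B[0] + t*C[0]     # convex combination of B and C
    x2 = (1 - t)*B[1] + t*C[1]
    assert theta(x1, x2) == 72     # same value everywhere on segment BC
print("theta = 72 along the whole segment BC")
```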

Example 6. The case of unbounded (no finite) solution.

Maximize Z = 4X + 5Y
Subject to:
X + Y ≥ 1
-2X + Y ≤ 1
4X - 2Y ≤ 1
X, Y ≥ 0

[Figure: the feasible area lies above line I (X + Y = 1), below line II (-2X + Y = 1)
and above line III (4X - 2Y = 1); it is unbounded in the maximization direction.]

The feasible area is unbounded and therefore the LPP has no finite solution. There are
solutions that fulfill the constraints, but no optimal (i.e., maximum Z) solution.


Note that if the objective was to minimize Z, the above LPP would have had a finite
solution.

Check your progress


Solve the following LPPs using the graphic method.
1. Maximize π = 2X1 - 3X2
Subject to:
3X1 + 7X2 ≤ 42
X1 + 5X2 ≤ 22
X1, X2 ≥ 0
2. Maximize 𝜃 = 5X1 + 8X2
Subject to:
X1 + X2 ≤ 13
X1 + 2X2 ≤ 22
2X1 + X2 ≤ 20
X1, X2 ≥ 0
3. Maximize Z = X1 + 2X2
Subject to:
X1 + X2 ≤ 5
2X1 + 3X2 ≤ 12
X1, X2 ≥ 0

Answers
1. X1 = 14, X2 = 0, π = 28
2. X1 = 4, X2 = 9, 𝜃 = 92
3. X1 = 0, X2 = 4, Z = 8
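For two-variable LPPs like these, the graphic method's corner search can be automated: intersect every pair of constraint lines and keep the feasible points. An illustrative Python sketch applied to problem 2:

```python
from fractions import Fraction as F
from itertools import combinations

# Problem 2: maximize theta = 5X1 + 8X2 subject to three <= constraints
# and X1, X2 >= 0 (written as -X1 <= 0 and -X2 <= 0).
A = [(1, 1), (1, 2), (2, 1), (-1, 0), (0, -1)]
b = [13, 22, 20, 0, 0]

best = None
for i, j in combinations(range(5), 2):
    (a1, a2), (a3, a4) = A[i], A[j]
    det = a1*a4 - a2*a3
    if det == 0:
        continue                       # parallel lines: no corner here
    x1 = F(b[i]*a4 - a2*b[j], det)     # Cramer's rule
    x2 = F(a1*b[j] - b[i]*a3, det)
    if all(p*x1 + q*x2 <= r for (p, q), r in zip(A, b)):
        theta = 5*x1 + 8*x2
        if best is None or theta > best[0]:
            best = (theta, x1, x2)

print(best)  # (Fraction(92, 1), Fraction(4, 1), Fraction(9, 1))
```

The best corner found is (4, 9) with 𝜃 = 92, matching answer 2 above.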

3.5 PRIMAL VERSUS DUAL

Every linear programming problem has associated with it another LPP called the
dual of the original (or primal) problem. If the objective function of the primal is
maximization, that of the dual will be minimization, and vice versa.


Primal                              Dual
Maximize Z = 3X1 + 4X2              Minimize C = 20Y1 + 24Y2
Subject to                          Subject to
5X1 + 2X2 ≤ 20                      5Y1 + 4Y2 ≥ 3
4X1 + 6X2 ≤ 24                      2Y1 + 6Y2 ≥ 4
X1, X2 ≥ 0                          Y1, Y2 ≥ 0

Note that:
1. A maximization objective function is converted into a minimization objective
function, Z is replaced by C.
2. The decision variables are different; Xs are converted into Ys.
3. The less than or equal to (≤) constraints are converted into greater than or equal
to (≥) constraints.
4. The technical coefficient matrix is transposed, that is, the rows are converted into
columns and the columns into rows. That is,

   5  2     is transposed to become     5  4
   4  6                                 2  6
5. The constant terms of the constraints and the coefficients of the objective function
are transposed. That is, 20 and 24 become coefficients of the objective function of
the dual while 3 and 4 become constant terms of the constraints of the dual.

Theorem: A primal has optimal solution if and only if its dual has optimal solution.
Maximum Z = Minimum C, where Z is the objective function of the primal
and C is the objective function of the dual.

It is always possible to give an economic interpretation to the dual problem. For example,
if the primal is profit maximization, the dual may be cost minimization. If the decision
variables of the primal represent quantities of output, the decision variables of the dual
represent the imputed values or shadow prices or opportunity costs of the inputs.
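The five conversion rules above are mechanical, so they are easy to script. A minimal Python sketch for a maximization LPP with ≤ constraints (the function name is illustrative):

```python
def dual_of(c, A, b):
    """max c.x s.t. Ax <= b, x >= 0  ->  min b.y s.t. (A^T)y >= c, y >= 0."""
    A_T = [list(col) for col in zip(*A)]   # transpose the coefficient matrix
    return b, A_T, c                       # dual objective, matrix, RHS

# The primal above: maximize Z = 3X1 + 4X2
c, A, b = [3, 4], [[5, 2], [4, 6]], [20, 24]
dual_c, dual_A, dual_b = dual_of(c, A, b)
print(dual_c, dual_A, dual_b)  # [20, 24] [[5, 4], [2, 6]] [3, 4]
```

This reproduces the dual shown above: minimize C = 20Y1 + 24Y2 subject to 5Y1 + 4Y2 ≥ 3 and 2Y1 + 6Y2 ≥ 4.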

Example: Solve the primal and the dual LPP given above, using the graphic method.

Maximize Z = 3X1 + 4X2              Minimize C = 20Y1 + 24Y2
Subject to                          Subject to
5X1 + 2X2 ≤ 20                      5Y1 + 4Y2 ≥ 3
4X1 + 6X2 ≤ 24                      2Y1 + 6Y2 ≥ 4
X1, X2 ≥ 0                          Y1, Y2 ≥ 0
[Figure: left, the primal feasible area bounded by 5X1 + 2X2 = 20 and 4X1 + 6X2 = 24,
with corners O(0, 0), A(4, 0), B and C(0, 4); right, the dual feasible area (unbounded)
above 5Y1 + 4Y2 = 3 and 2Y1 + 6Y2 = 4, with corners A(2, 0), B and C(0, 3/4).]

Primal:
Corner Points   X1      X2      Z = 3X1 + 4X2
0               0       0       0
A               4       0       12
B               36/11   20/11   188/11 ≈ 17.09
C               0       4       16

Dual:
Corner Points   Y1      Y2      C = 20Y1 + 24Y2
A               2       0       40
B               1/11    7/11    188/11 ≈ 17.09
C               0       3/4     18

The optimum of the primal is at point B, where X1 = 36/11, X2 = 20/11 and max Z = 188/11.
The optimum of the dual is at point B, where Y1 = 1/11, Y2 = 7/11 and min C = 188/11.
Note that the optimal value in both cases is the same, 188/11.

1. A metallurgy factory produces two types of auto spare parts, X and Y. The net profit
from production of one X is 2 Birr and of one Y is 3 Birr. Each of the two
machines the factory owns works 8 hours a day and 6 days per week. The table
below shows the productivity of the machines in terms of time requirements. One
X requires 19 m2 and one Y requires 7 m2 of space for storage, and the total storage
area available is 133,000 m2. Weekly production of X cannot exceed 6,000 and
that of Y 9,000 due to the limited market demand.
          Time Needed to Produce One
Machine   X         Y
M1        20 sec.   15 sec.
M2        10 sec.   18 sec.

a. Write the linear programming problem, assuming that the manager
wants to maximize profit.
b. Write the dual LPP.

2. A firm produces two types of outputs, X1 and X2, using two types of inputs
according to the table below. The management wants to maximize profit.

                  Inputs Required to          The Available
Inputs            Produce One                 Stock of Inputs
                  X1          X2
Input One         2 units     3 units         12 units
Input Two         2 units     1 unit          8 units
Profit Per Unit   3 Birr      2 Birr

a. Convert the problem into a linear programming problem (LPP).
b. Write the dual LPP and interpret it.
c. Solve both LPPs using the graphic method.

1.
a) Maximize π = 2X + 3Y
Subject to:
20X + 15Y ≤ 172,800 (M1 constraint, in seconds)
10X + 18Y ≤ 172,800 (M2 constraint, in seconds)
19X + 7Y ≤ 133,000 (space constraint)
X ≤ 6,000 (demand constraint)
Y ≤ 9,000 (demand constraint)
X, Y ≥ 0
b) Let the dual decision variables be ai.
Minimize C = 172,800a1 + 172,800a2 + 133,000a3 + 6,000a4 + 9,000a5
Subject to:
20a1 + 10a2 + 19a3 + a4 ≥ 2
15a1 + 18a2 + 7a3 + a5 ≥ 3
ai ≥ 0
2.
a) Maximize π = 3X1 + 2X2
Subject to:
2X1 + 3X2 ≤ 12
2X1 + X2 ≤ 8
X1, X2 ≥ 0

b) Minimize C = 12Y1 + 8Y2
Subject to:
2Y1 + 2Y2 ≥ 3
3Y1 + Y2 ≥ 2
Y1, Y2 ≥ 0
The dual minimizes the cost of production; the Ys are opportunity costs.

c)
For the primal LPP:

[Figure: the lines 2X1 + 3X2 = 12 and 2X1 + X2 = 8 bound the feasible area, with
corners O(0, 0), A(4, 0), B(3, 2) and C(0, 4).]

Corner Points   X1      X2      π = 3X1 + 2X2
0               0       0       0
A               4       0       12
B               3       2       13
C               0       4       8
The optimal solution is X1 = 3, X2 = 2 and Maximum π = 13.

For the dual LPP:

[Figure: the lines 2Y1 + 2Y2 = 3 and 3Y1 + Y2 = 2 bound the feasible area, with
corners A(3/2, 0), B(1/4, 5/4) and C(0, 2).]

Corner Points   Y1      Y2      C = 12Y1 + 8Y2
A               3/2     0       18
B               1/4     5/4     13
C               0       2       16

The optimal solution is Y1 = 1/4, Y2 = 5/4 and minimum C = 13.
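As a check on the duality theorem stated in section 3.5, the two optimal objective values can be compared directly with exact fractions (an illustrative sketch):

```python
from fractions import Fraction as F

# Primal optimum (X1, X2) = (3, 2); dual optimum (Y1, Y2) = (1/4, 5/4).
primal = 3*3 + 2*2                 # pi = 3X1 + 2X2
dual = 12*F(1, 4) + 8*F(5, 4)      # C  = 12Y1 + 8Y2
print(primal, dual)                # 13 13
```

Maximum π = Minimum C = 13, exactly as the theorem requires.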

3.6 THE SIMPLEX METHOD

When the number of variables in an LPP is more than two, the graphic method cannot be used
because of the difficulty, or even impossibility, of plotting graphs of more than two
dimensions on a plane. In such situations, a method known as the simplex method is used.
"Simplex" is a mathematical term; it is not related to the word "simple". Simplex is an
algebraic method of solving LPPs.

In order to apply simplex method all the constraints should be in the form of equations,
not in the form of inequalities.

Definition:

87

about:blan 18/
4/22/24, 8:16 Linear programming

An LPP in which all the constraints are in the form of equations is called a standard linear
programming problem (SLPP).

Every LPP can be converted into a SLPP by applying the following procedure.
1. If a constraint is of the ≤ type, then add a new non-negative variable to the left
side of the inequality to make both sides equal, and then rewrite the constraint as an
equation.

Example:
If 2X1 + 3X2 ≤ 20, add S1 ≥ 0 to the left side of the inequality to make both sides
equal. Thus 2X1 + 3X2 + S1 = 20, where S1 ≥ 0.

Generally, constraints of the form

Σ(i=1 to n) aiXi ≤ b   are rewritten as   Σ(i=1 to n) aiXi + Xn+1 = b

where Xi, Xn+1 ≥ 0.

The newly added variables (S1 or Xn+1 in the above examples) are called slack
variables, meaning weak, loose or inactive variables. The purpose of slack variables is to
make the computation possible. They do not affect the manager's decision and therefore
they are non-decision variables.

2. If a constraint is of the ≥ type, then subtract a new non-negative variable from
the left side of the inequality to make both sides equal. Then, rewrite the constraint as an
equation.

Example:
If 2X1 + 5X2 ≥ 30, subtract S2 ≥ 0 from the left side of the inequality and rewrite
it as:

2X1 + 5X2 - S2 = 30, where S2 ≥ 0


Generally, constraints of the form

Σ(j=1 to n) ajXj ≥ b   are rewritten as   Σ(j=1 to n) ajXj - Xn+1 = b

where Xj, Xn+1 ≥ 0.

The subtracted variables (S2 or Xn+1 in the above examples) are called surplus
variables. The major purpose of these variables is to make the simplex computation
possible. They do not affect managerial decisions and therefore they are also non-decision
variables.

Note that decision variables are those variables upon whose optimal values managerial
decision depends. For example, they determine how much of what should be
produced, how much labor or other resources should be employed, or for how many
hours a machine should work, etc., in order to maximize or minimize the objective
function. The non-decision variables, on the other hand, are those variables whose
optimal values do not directly affect managerial decisions. Typically, they are included to
facilitate the simplex procedure. However, this does not mean that the non-decision
variables are totally irrelevant except for their computational purpose. They have at least
three more advantages: a) if a non-decision variable has a positive optimal value (that is,
if the value of a slack or a surplus variable is greater than zero in the optimal solution),
this indicates that there is excess resource, money or time (or any other thing represented
by the corresponding constraint) that cannot be utilized due to the constraints, and this
may serve as a clue for the manager to reconsider the constraints; b) as will be discussed
in more detail in section 3.6.9, although non-decision variables do not affect managerial
decision, they may nevertheless affect the objective function under certain
circumstances (hence, affecting the objective function does not necessarily mean affecting
managerial decision); and c) as will be explained soon, the non-decision variables of
the primal LPP are decision variables of the dual LPP, and therefore they enable the simplex
method to solve the two LPPs (the primal and its dual) simultaneously.

3.6.1 Converting LPP to SLPP

An LPP given in the form of

Maximize π = P1X1 + P2X2 + P3X3     (1)

Subject to:
a11X1 + a12X2 + a13X3 ≤ b1
a21X1 + a22X2 + a23X3 ≤ b2
a31X1 + a32X2 + a33X3 ≤ b3
X1, X2, X3 ≥ 0

can be converted to a SLPP by adding slack variables to the left sides of the inequalities.
Since, in this case, the slack variables do not affect the objective function, their
coefficients in the objective function are zero. Thus, you may write them as 0Xn+1,
0Xn+2, etc., or ignore them. After making the necessary rearrangement, the SLPP will be

Maximize  = P1 X1 + P2 X2 + P3 X3 + OY1 + OY2 + OY3


Subject to
a11 X1 + a12 X2 + a13 X3 + Y1 =b1
a21 X1 + a22 X2 + a23 X3 + Y2 =b2
a31 X1 + a X + a X + Y =b
3 2 33 3 3 3
2
 Xi , Yi 0

Note that the Xs are decision variables and the Ys are slack or non-decision variables.
Since the slack variables do not affect the objective function, their coefficients in the
objects function are equal to zero. For brevity, you can ignore writing zero values in the
objectives function.

For the sake of convenience, transfer the right-side variables of the objective function to
the left side and then you will get:

Maximize π - P1X1 - P2X2 - P3X3 = 0
Subject to:
a11X1 + a12X2 + a13X3 + Y1 = b1
a21X1 + a22X2 + a23X3 + Y2 = b2
a31X1 + a32X2 + a33X3 + Y3 = b3
Xi, Yi ≥ 0

A linear programming problem (LPP) written in the above form is called a standard
linear programming problem (SLPP).
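The conversion can be expressed compactly as appending an identity matrix of slack columns to the coefficient matrix. A minimal Python sketch using the LPP of section 3.3 (illustrative, with zero objective coefficients for the slacks):

```python
# The <=-type LPP of section 3.3: maximize S = X1 + X2.
A = [[12, 7], [10, 15], [1, 0]]    # input 1, input 2, demand constraints
b = [84, 150, 6]
c = [1, 1]

m = len(A)
# Append one slack column per constraint: 1 on its own row, 0 elsewhere.
A_std = [row + [1 if i == j else 0 for j in range(m)]
         for i, row in enumerate(A)]
c_std = c + [0]*m                  # slack variables carry zero profit

for row, rhs in zip(A_std, b):
    print(row, "=", rhs)
```

Each printed row is one equation of the SLPP, e.g. 12X1 + 7X2 + S1 = 84.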
