Student Guide 2
LECTURE 2
Linear Programming
Model Formulation
Linear programming
Linear programming can be defined as a technique for optimizing a linear function, called the objective function, subject to linear equality and inequality constraints. We obtain the best outcome by maximizing or minimizing the objective function over the set of solutions that satisfy all the constraints.
Constraints
Constraints are conditions or limitations that define the feasible region within which the optimal
solution of the optimization problem must lie. These conditions are typically expressed as linear
inequalities or equations involving the decision variables. Decision variables are the unknown
quantities or choices that must be determined in a mathematical optimization problem.
In mathematical terms, if we have n decision variables (x1, x2, ..., xn), the general form of a constraint can be represented as:

a1x1 + a2x2 + ... + anxn ≤ b

Where:
a1, a2, ..., an are coefficients representing the resource requirements or limitations.
b is the maximum available quantity of the resource.
Constraints can also be equality relationships or greater-than-or-equal inequalities, depending on the problem.
Example of Constraints
Consider a manufacturing company that produces two types of products, A and B. The production
process requires a certain amount of labour and raw material. The constraints could be formulated
as follows:
2A + 3B ≤ 200 (labour constraint)
A + B ≤ 100 (raw material constraint)
A ≥ 0, B ≥ 0 (non-negativity constraints)
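A quick way to see how these constraints act together is to test candidate production plans against them. A minimal sketch in Python (the plans tested below are made-up values for illustration):

```python
def is_feasible(a, b):
    """Check whether producing a units of A and b units of B
    satisfies all three constraints from the example."""
    return (2 * a + 3 * b <= 200      # labour constraint
            and a + b <= 100          # raw material constraint
            and a >= 0 and b >= 0)    # non-negativity constraints

print(is_feasible(40, 40))  # True: labour 2*40 + 3*40 = 200, material 80
print(is_feasible(60, 50))  # False: would need 270 labour hours
```

A plan is feasible only if every constraint holds simultaneously; violating any single one rules it out.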
Objective Function: The objective function is the mathematical expression that represents the goal of the optimization problem. It is usually a linear combination of the decision variables, and the goal is to either maximize or minimize this function. The coefficients of the decision variables in the objective function represent the contribution of each variable to the overall objective. In general form:

Maximize (or Minimize) Z = c1x1 + c2x2 + ... + cnxn

Where c1, c2, ..., cn are the objective coefficients and x1, x2, ..., xn are the decision variables.
Objective functions in linear programming are linear because they involve only first-degree terms,
each consisting of a constant coefficient multiplied by a decision variable. The decision variables
represent the quantities to be determined or optimized.
Feasible Region
The feasible region is the set of all points that satisfy every constraint simultaneously; it is determined by the intersection of the individual constraints. It is typically represented graphically in two-dimensional or three-dimensional space, depending on the number of decision variables involved.
Corner Points
Corner points (also known as vertices or extreme points) are the points where the boundaries of the
feasible region intersect. These points represent specific combinations of values for the decision
variables that satisfy all the constraints simultaneously.
The importance of corner points in linear programming is that the optimal solution, which
maximizes or minimizes the objective function, always occurs at one of the corner points of the
feasible region.
The optimal solution can be determined by evaluating the objective function at each corner point.
Because the objective function in linear programming is linear, an optimal solution (when one exists) will occur at one or more corner points. If two adjacent corner points both yield the optimal value, then every point on the line segment connecting them is also optimal (alternative optimal solutions).
To find the optimal solution, linear programming algorithms systematically evaluate the objective
function at the corner points of the feasible region until the best solution is identified.
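This corner-point procedure can be sketched in Python for the two-variable manufacturing example given earlier (labour and raw-material constraints). The objective, maximize 5A + 4B, is a hypothetical profit function added purely for illustration:

```python
from itertools import combinations

# Constraints from the earlier example, written as a1*A + a2*B <= b:
constraints = [
    ((2.0, 3.0), 200.0),   # labour: 2A + 3B <= 200
    ((1.0, 1.0), 100.0),   # raw material: A + B <= 100
    ((-1.0, 0.0), 0.0),    # A >= 0 written as -A <= 0
    ((0.0, -1.0), 0.0),    # B >= 0 written as -B <= 0
]

def intersect(con1, con2):
    """Intersection of the two boundary lines, or None if parallel."""
    (a1, a2), b1 = con1
    (a3, a4), b2 = con2
    det = a1 * a4 - a2 * a3
    if abs(det) < 1e-9:
        return None
    return ((b1 * a4 - a2 * b2) / det, (a1 * b2 - b1 * a3) / det)

def feasible(p):
    return all(a[0] * p[0] + a[1] * p[1] <= b + 1e-9 for a, b in constraints)

# Corner points: pairwise boundary intersections that satisfy all constraints.
corners = {p for c1, c2 in combinations(constraints, 2)
           if (p := intersect(c1, c2)) is not None and feasible(p)}

# Hypothetical objective (not from the guide): maximize profit 5A + 4B.
best = max(corners, key=lambda p: 5 * p[0] + 4 * p[1])
print(round(best[0]), round(best[1]))  # optimal corner: 100 0
```

Evaluating the objective at each feasible corner and taking the best value is exactly the graphical corner-point method; practical algorithms such as Simplex avoid enumerating every corner.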
The general linear programming model can therefore be written as:

Optimize (maximize or minimize) Z = c1x1 + c2x2 + ... + cnxn

subject to the constraints

ai1x1 + ai2x2 + ... + ainxn (≤, =, ≥) bi,  i = 1, 2, ..., m
xj ≥ 0,  j = 1, 2, ..., n

Where:
§ cj are coefficients representing the per-unit profit (or cost) contribution of decision variable xj to the value of the objective function.
§ The aij's are called technological coefficients (or input-output coefficients). These represent the amount of resource i consumed per unit of variable (activity) xj. These coefficients can be positive, negative or zero.
§ bi represents the total availability of the ith resource. The term resource is used in a very general sense to include any numerical value associated with the right-hand side of a constraint. It is assumed that bi ≥ 0 for all i. However, if any bi < 0, then both sides of constraint i are multiplied by –1 to make bi > 0 and the inequality of the constraint is reversed.
Steps in Formulating a Linear Programming Model
Step 1: Identify the decision variables: Determine the unknown quantities to be decided, such as the number of units of each product to be produced.
Step 2: Write the objective function: The decision variables identified in Step 1 should be used to write an algebraic expression for the quantity we are trying to optimize. In other words, the objective function is a linear expression composed of the decision variables.
Step 3: Identify a set of constraints: Constraints are the limitations in the form of equations or
inequalities on the decision variables. Remember that all the decision variables are non-negative,
i.e., positive or zero.
Step 4: Choose the method for solving the linear programming problem: Multiple techniques can be used to solve a linear programming problem, such as the graphical method, the Simplex method, the Criss-cross method and the Ellipsoid method.
Note: For a problem to be a linear programming problem, the decision variables, objective function
and constraints all have to be linear functions.
Example:
A manufacturing company produces three types of products: A, B and C. The production department
produces, each day, components sufficient to make 50 units of A, 25 units of B and 30 units of C. The
management is confronted with the problem of optimizing the daily production of the products in the
assembly department, where only 100 man-hours are available daily for assembling the products. The
following additional information is available:
Type of Product Profit Contribution per Unit (Rs.) Assembly Time per Product (hrs)
A 12 0.8
B 20 1.7
C 45 2.5
The company has a daily order commitment for 20 units of product A and a total of 15 units of products B and C. Formulate this problem as an LP model to maximize the total profit.
Solution:
Decision Variables: Let x1, x2 and x3 = number of units of products A, B and C to be produced, respectively.
The LP model is:

Maximize (total profit) Z = 12x1 + 20x2 + 45x3

subject to the constraints

0.8x1 + 1.7x2 + 2.5x3 ≤ 100 (assembly man-hours)
x1 ≤ 50, x2 ≤ 25, x3 ≤ 30 (daily component availability)
x1 ≥ 20 (order commitment for product A)
x2 + x3 ≥ 15 (order commitment for products B and C)
x1, x2, x3 ≥ 0 (non-negativity)
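The model for this problem can be encoded and checked numerically. A minimal sketch: the objective 12x1 + 20x2 + 45x3 and the constraints come from the problem statement; the candidate plan (31.25, 0, 30) is the LP optimum found separately and is used here only to illustrate checking a plan against the model:

```python
# Decision variables x1, x2, x3 = daily units of products A, B, C.
def profit(x1, x2, x3):
    return 12 * x1 + 20 * x2 + 45 * x3   # total daily profit (Rs.)

def satisfies_constraints(x1, x2, x3):
    return (0.8 * x1 + 1.7 * x2 + 2.5 * x3 <= 100   # 100 assembly man-hours
            and x1 <= 50 and x2 <= 25 and x3 <= 30  # daily component limits
            and x1 >= 20                            # order commitment for A
            and x2 + x3 >= 15                       # order commitment for B and C
            and x2 >= 0 and x3 >= 0)                # non-negativity (x1 >= 20 covers x1)

plan = (31.25, 0, 30)   # candidate plan, found separately (see note above)
print(satisfies_constraints(*plan), profit(*plan))  # True 1725.0
```

Note that the plan uses all 100 assembly hours (0.8 × 31.25 + 2.5 × 30 = 100), which is typical of an optimal plan: the binding resource is fully consumed.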
Different Methods for Solving Linear Programming Problems
The main algorithmic methods for solving linear programming problems are
1. Simplex
2. Criss-cross
3. Ellipsoid
1. Simplex Method:
The Simplex method is a widely used algorithm to solve linear programming problems. It was
developed by George Dantzig in the late 1940s. The method iteratively moves along the edges of
the feasible region (a convex polytope) toward the optimal solution. At each step, the algorithm
moves to an adjacent vertex that improves the objective function value until an optimal solution
is reached.
Example: Consider a manufacturing company that wants to maximize profit from producing two
product types while considering resource constraints. The Simplex method can be applied to find
the optimal production quantities for these products.
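A minimal tableau implementation of the Simplex method for problems of the form maximize c·x subject to Ax ≤ b, x ≥ 0 with b ≥ 0 is sketched below. This is a teaching sketch with no anti-cycling safeguards, applied to the labour/raw-material constraints from the earlier example with a hypothetical objective (maximize 5A + 4B, not from the guide):

```python
def simplex(c, A, b):
    """Minimal tableau Simplex for: maximize c.x subject to A x <= b, x >= 0,
    assuming b >= 0 (so the all-slack basis is feasible) and a bounded optimum."""
    m, n = len(A), len(c)
    # Tableau rows: [A | I | b]; last row holds the objective as [-c | 0 | 0].
    T = [[float(v) for v in A[i]] + [float(i == j) for j in range(m)] + [float(b[i])]
         for i in range(m)]
    T.append([-float(v) for v in c] + [0.0] * (m + 1))
    basis = list(range(n, n + m))             # start from the slack basis
    while True:
        col = min(range(n + m), key=lambda j: T[-1][j])   # entering variable
        if T[-1][col] >= -1e-9:
            break                             # no negative reduced cost: optimal
        ratios = [(T[i][-1] / T[i][col], i) for i in range(m) if T[i][col] > 1e-9]
        _, row = min(ratios)                  # leaving variable (minimum ratio test)
        piv = T[row][col]
        T[row] = [v / piv for v in T[row]]
        for i in range(m + 1):                # eliminate the pivot column elsewhere
            if i != row and T[i][col]:
                f = T[i][col]
                T[i] = [v - f * w for v, w in zip(T[i], T[row])]
        basis[row] = col
    x = [0.0] * n
    for i, j in enumerate(basis):
        if j < n:
            x[j] = T[i][-1]
    return x, T[-1][-1]

# Earlier example's constraints with a hypothetical objective: maximize 5A + 4B.
x, z = simplex([5, 4], [[2, 3], [1, 1]], [200, 100])
print(x, z)  # [100.0, 0.0] 500.0
```

Each pivot moves to an adjacent vertex of the feasible region that improves the objective, stopping when no reduced cost is negative, which mirrors the description above.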
2. Criss-cross Method:
The Criss-cross method is an alternative pivoting algorithm for linear programming. Unlike the Simplex method, it does not maintain feasibility from one pivot to the next: it may pass through bases that are neither primal feasible nor dual feasible. As a result, it can start from any basis without a separate phase for finding a feasible starting point, and with a suitable pivot rule it terminates after finitely many pivots.
3. Ellipsoid Method:
The Ellipsoid method is another algorithm for solving linear programming problems. It is based on enclosing ellipsoids: starting from an ellipsoid known to contain an optimal solution, the method iteratively replaces it with a smaller ellipsoid that still contains that solution, until the optimum is approximated to the desired accuracy. It was the first algorithm proved to solve linear programming in polynomial time (Khachiyan, 1979), although in practice it is usually much slower than the Simplex method.
Example: In portfolio optimization, an investor wants to allocate funds among various assets to
maximize expected return while staying within a risk tolerance. The ellipsoid method could
potentially be used to find an efficient allocation strategy.
Sensitivity Analysis
Sensitivity analysis in linear programming is a technique used to analyze how changes in the input
parameters of a linear programming model affect the optimal solution and the values of the
decision variables and the objective function. It provides insights into the stability and robustness
of the solution when the problem data or constraints are modified. Typical questions include:
§ How does a change in the objective function coefficients impact the optimal solution?
§ What happens if a constraint is relaxed or tightened?
§ How sensitive is the optimal solution to changes in the right-hand side values of the
constraints?
§ What is the range of values for an input parameter within which the current solution remains
optimal?
Sensitivity analysis involves examining different scenarios by adjusting one or more input
parameters while keeping others fixed. By evaluating the impact on the optimal solution, sensitivity
analysis helps identify critical parameters, determine the range of validity for the current solution,
and make informed decisions.
Common measures used in sensitivity analysis include the shadow price (dual value), which
represents the marginal value of relaxing or tightening a constraint, and the range of optimality,
which shows the range of values over which the current optimal solution remains valid.
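The shadow price of a constraint can be illustrated numerically by re-solving a small LP with that constraint's right-hand side increased by one unit and measuring the change in the optimal value. The sketch below uses the labour/raw-material constraints from the earlier example with a hypothetical objective (maximize 3A + 5B, not from the guide), solving by corner-point evaluation:

```python
def optimum(b_labour, b_material):
    """Optimal value of: maximize 3A + 5B subject to 2A + 3B <= b_labour,
    A + B <= b_material, A, B >= 0, by checking every candidate corner point."""
    candidates = [
        (0.0, 0.0),
        (b_labour / 2, 0.0), (0.0, b_labour / 3),        # labour line meets the axes
        (b_material, 0.0), (0.0, b_material),            # material line meets the axes
        (3 * b_material - b_labour, b_labour - 2 * b_material),  # the two lines cross
    ]
    feasible = [(a, b) for a, b in candidates
                if a >= 0 and b >= 0
                and 2 * a + 3 * b <= b_labour + 1e-9
                and a + b <= b_material + 1e-9]
    return max(3 * a + 5 * b for a, b in feasible)

base = optimum(200, 100)
# Shadow price = marginal change in the optimal value per extra unit of resource.
shadow_labour = optimum(201, 100) - base
shadow_material = optimum(200, 101) - base
print(round(shadow_labour, 4), round(shadow_material, 4))  # 1.6667 0.0
```

The binding labour constraint has a positive shadow price (5/3 per extra hour), while the slack raw-material constraint has a shadow price of zero, exactly as dual theory predicts.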
Overall, sensitivity analysis is a valuable tool in linear programming to assess the robustness and
flexibility of the solution and understand how it responds to changes in the problem's input
parameters.