Linear Programming Models-1

Linear Programming Model

What is an Optimization Model?
An optimization model seeks to find the values of the decision variables that optimize (maximize or minimize) an objective function among the set of all values for the decision variables that satisfy the given constraints.
Its three main components are:

• Objective function: a function to be optimized (maximized or minimized).

• Decision variables: controllable variables that influence the performance of the system.

• Constraints: set of restrictions (i.e. linear inequalities or equalities) on the decision variables. A non-negativity constraint limits the decision variables to non-negative values (e.g. you cannot produce a negative number of items x1, x2 and x3).

The solution of the optimization model is called the optimal feasible solution.
Modeling Steps
Modeling an operations research problem accurately is the most significant, and sometimes the most difficult, task. A wrong model will lead to a wrong solution and thus will not solve the original problem. The following steps should be performed by team members with different areas of expertise to obtain an accurate and more complete view of the model:

•Problem definition: defining the scope of the project and identifying the three elements of the model: a description of the decision variables, the determination of the objective and the determination of the limitations (i.e. constraints).
•Model construction: translating the problem definition into mathematical relationships.
•Model solution: using a standard optimization algorithm. Upon obtaining a solution, a sensitivity
analysis should be performed to find out the behavior of the solution due to changes in some of the
parameters.
•Model validity: checking whether the model behaves as intended.
•Implementation: translating the model and the results into the recommendation of a solution.
Linear Programming
•Linear programming (also referred to as LP) is an operations research
technique used when all the objectives and constraints are linear (in
the variables) and when all the decision variables are continuous.
Among operations research techniques, linear programming is often
considered the simplest.

•The lpSolve package in R contains several functions for solving linear programming problems and obtaining significant statistical analysis.
Sample Question - 1
Find the optimal solution for the problem given below.
•Suppose a company wants to maximize the profit for two products A and B which are sold at $25 and $20
respectively. There are 1800 resource units available every day and product A requires 20 units while B requires 12
units. Both of these products require a production time of 4 minutes and the total available working hours are 8 in
a day. What should be the production quantity for each of the products to maximize profits?

The objective function in this problem is

Max(sales) = max(25a + 20b)

where a is the number of units of product A produced and b is the number of units of product B produced; a and b are also called decision variables.

The constraints are resources and time in this case.


Resource Constraint => 20*a + 12*b <= 1800
Time Constraint => 4*a + 4*b <= 8*60
Constraint Matrix

Create constraint matrix
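The constraint matrix collects the coefficients of a and b in the two constraints. As an illustrative cross-check, here is the model solved with Python's scipy.optimize.linprog (an assumed substitute for the slides' R lpSolve code, which is not shown in the text):

```python
# Illustrative cross-check of Sample Question 1 using SciPy's linprog
# (assumed substitute for R's lpSolve; not the slides' original code).
from scipy.optimize import linprog

# linprog minimises, so negate the profit coefficients to maximise.
c = [-25, -20]                 # profit per unit of A and B
A_ub = [[20, 12],              # resource units used per unit of A and B
        [4, 4]]                # production minutes per unit of A and B
b_ub = [1800, 8 * 60]          # daily resource units, daily minutes

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)       # optimal production quantities [45, 75]
print(-res.fun)    # maximum sales: 2625
```

Under this formulation the optimum is a = 45, b = 75 with sales of $2625, matching the conclusion stated in the slides.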


Conclusion
•From the above output, we can see that
the company should produce 45 units of
product A and 75 units of product B to
get sales of $2625, which is the
maximum sales that the company can
get given the constraints.
Sample Question - 2
Below is a diet chart which gives the calories, protein, carbohydrate and fat content
for 4 food items. Sara wants a diet with minimum cost. The diet chart (per unit of each
food item) is as follows:

        Calories  Protein (g)  Carbs (g)  Fat (g)  Cost ($)
Food1     400         3            2         2       0.50
Food2     200         2            2         4       0.20
Food3     150         0            4         1       0.30
Food4     500         0            4         5       0.80

The chart gives the nutrient content as well as the per-unit cost of each food item. The
diet must be planned in such a way that it contains at least 500 calories, 6 grams of
protein, 10 grams of carbohydrates and 8 grams of fat.
1. Identify Decision Variables: the quantities of the food items are the decision
variables in this example
Food1 = a, Food2 = b, Food3 = c, Food4 = d
2. Objective Function: since we want to minimise the cost of the diet,
Z(min) = 0.5a + 0.2b + 0.3c + 0.8d
3. Define the Constraints:
400a + 200b + 150c + 500d >= 500
3a + 2b + 0c + 0d >= 6
2a + 2b + 4c + 4d >= 10
2a + 4b + 1c + 5d >= 8
4. Non-negative Restrictions: a >= 0 , b >= 0 , c >= 0 , d >= 0
summary(opt)
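The lpSolve call that produces opt is not shown in the text. As an illustrative cross-check, the same diet model can be solved with Python's scipy.optimize.linprog (an assumed substitute for the R code):

```python
# Illustrative solution of Sample Question 2 with scipy.optimize.linprog
# (assumed substitute for the R lpSolve code behind summary(opt)).
from scipy.optimize import linprog

c = [0.5, 0.2, 0.3, 0.8]            # cost per unit of Food1..Food4
# linprog uses <= constraints, so negate the >= nutrient requirements.
A_ub = [[-400, -200, -150, -500],   # calories >= 500
        [-3, -2, 0, 0],             # protein >= 6
        [-2, -2, -4, -4],           # carbohydrates >= 10
        [-2, -4, -1, -5]]           # fat >= 8
b_ub = [-500, -6, -10, -8]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print(res.x)      # units of each food item
print(res.fun)    # minimum cost of the diet: 0.9
```

Under this formulation the minimum cost is $0.90, matching the stated conclusion.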
Conclusion:
•The optimal solution, i.e. the diet with minimum cost, is for Sara to have 0 units of Food1, 3 units of Food2, 1 unit of Food3 and 0 units of Food4. The cost of this diet will be $0.90, and it will provide her with at least 500 calories, 6 grams of protein, 10 grams of carbohydrates and 8 grams of fat.
Sample Question - 3

Farmer Jean has two types of crops, parsnips and kale. Parsnips cost $0.20 per plot to grow and sell for $0.35 per plot. Kale costs $0.70 per plot to grow and sells for $1.10 per plot. She has 200 plots of land and a $100 budget. How many plots of each crop should she plant to maximise her profits?

Decision variables:
X_1 = number of plots of parsnips grown
X_2 = number of plots of kale grown

Objective function:
Maximize Profits = 0.15 X_1 + 0.40 X_2
(the profit per plot is $0.35 - $0.20 = $0.15 for parsnips and $1.10 - $0.70 = $0.40 for kale)

Constraints:
X_1 + X_2 <= 200 (plots of land)
0.20 X_1 + 0.70 X_2 <= 100 (budget)
X_1 >= 0, X_2 >= 0
• The Feasible
Region is the
intersection of all the
constraint regions.
• Objective function parameters, which are just the coefficients of X1 and X2 in the
objective function:
• Profits = 0.15 X1 + 0.40 X2

• Constraints, which are specified by three components:
• the constraint matrix
• the constraint directions
• the constraint values (or constraint RHS, for right-hand side).
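These three components can be assembled and solved as follows, using Python's scipy.optimize.linprog as an assumed stand-in for the slides' R workflow:

```python
# Illustrative solve of Farmer Jean's problem with scipy.optimize.linprog
# (assumed substitute for the slides' R code).
from scipy.optimize import linprog

c = [-0.15, -0.40]                  # negate profits so linprog maximises
A_ub = [[1, 1],                     # land: at most 200 plots
        [0.20, 0.70]]               # budget: at most $100
b_ub = [200, 100]                   # constraint RHS values

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)      # plots of parsnips and kale
print(-res.fun)   # maximum profit
```

Under this formulation the optimum is 80 plots of parsnips and 120 plots of kale, for a profit of $60.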
Sample Question - 4
• A company produces two models of chairs: 4P and 3P. The model 4P
needs 4 legs, 1 seat and 1 back. The model 3P needs 3 legs and 1 seat.
The company has an initial stock of 200 legs, 500 seats and 100 backs.
If the company needs more legs, seats and backs, it can buy standard
wood blocks, at a cost of 80 euro per block. The company can produce
10 seats, 20 legs and 2 backs from a standard wood block. The cost of
producing the model 4P is 30 euro/chair, while the cost of the model 3P
is 40 euro/chair. Finally, the company states that the minimum number
of chairs to produce is 1000 units per month. Define a linear
programming model which minimizes the total cost (the production
costs of the two chairs plus the purchase of new wood blocks).
• Ans - 48680
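One way to check this answer is with a mixed-integer formulation. The sketch below uses Python's scipy.optimize.milp; the model itself (variables, constraints) is my assumed reading of the problem statement, since the slides give only the final answer:

```python
# Illustrative check of Sample Question 4 with SciPy's mixed-integer solver.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Variables: x = [chairs 4P, chairs 3P, wood blocks bought]
c = np.array([30, 40, 80])          # production / purchase costs in euro

# Legs:   4*x4P + 3*x3P <= 200 + 20*blocks
# Seats:  x4P + x3P     <= 500 + 10*blocks
# Backs:  x4P           <= 100 +  2*blocks
# Demand: x4P + x3P     >= 1000
A = np.array([[4, 3, -20],
              [1, 1, -10],
              [1, 0, -2],
              [-1, -1, 0]])
b_ub = np.array([200, 500, 100, -1000])

res = milp(c=c,
           constraints=LinearConstraint(A, -np.inf, b_ub),
           integrality=np.ones(3),          # all variables integer
           bounds=Bounds(0, np.inf))
print(res.x, res.fun)
```

Under this formulation the optimum is 420 4P chairs, 580 3P chairs and 161 wood blocks, for a total cost of 48680 euro, matching the stated answer.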
For further reference:
• https://desmond-ong.github.io/stats-notes/optimization-i-linear-optimization.html
• https://desmond-ong.github.io/stats-notes/exercises-linear-optimization.html
optim()
• The function optim() provides algorithms for general-purpose
optimisations.

optim(par, fn, ...)

• par represents the initial parameter values,


• fn is the objective function to be minimized or maximized,
• ... denotes additional arguments that can be passed to the objective function.
The optim() function requires you to define the objective function
that you want to optimize. The objective function should take a
vector of parameters as input and return a single scalar value.

The optim() function attempts to find the parameter values that


minimize the objective function by iteratively updating the
parameter values based on the chosen optimization algorithm.
In this example, we define an objective function that calculates the squared
distance between a given point (x[1], x[2]) and a target point (3, 4).

We use optim() to find the parameter values that minimize this objective
function.

The optimized parameter values and the minimum value of the objective
function are then printed.
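The R code for this example is not shown in the text. The same idea can be sketched in Python with scipy.optimize.minimize, an assumed stand-in for R's optim():

```python
# Sketch of the squared-distance example, using scipy.optimize.minimize
# as an assumed Python stand-in for R's optim().
from scipy.optimize import minimize

def objective(x):
    # Squared distance from (x[0], x[1]) to the target point (3, 4).
    return (x[0] - 3) ** 2 + (x[1] - 4) ** 2

# The initial guess plays the role of optim()'s `par` argument.
res = minimize(objective, x0=[0.0, 0.0])
print(res.x)     # approximately [3, 4]
print(res.fun)   # approximately 0
```

The solver iteratively updates the parameters until it reaches the point (3, 4), where the squared distance is minimised.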
• The Residual Sum of Squares (RSS), also known as the Sum of Squared Residuals
(SSR), is a metric used to measure the variance or discrepancy between the
observed values and the predicted values in a regression analysis. It quantifies the
difference between the actual dependent variable values and the values predicted
by a regression model.

• In a regression analysis, the goal is to find a line or curve that best fits the observed
data points. The residuals are the differences between the observed values and the
predicted values on the regression line. The RSS is calculated by squaring each
residual, summing up the squared residuals, and obtaining the total sum.

• Mathematically, the RSS is defined as:


RSS = Σ(yᵢ - ŷᵢ)²
Where:
• yᵢ represents the observed values of the dependent variable.
• ŷᵢ represents the predicted values of the dependent variable.
• Σ denotes the summation symbol, indicating that the squared differences are
summed across all data points.
• The RSS is an important metric in regression analysis as it helps assess
how well the model fits the data.
• Lower RSS values indicate a better fit, meaning that the predicted
values are closer to the actual values.
• In contrast, higher RSS values indicate a poorer fit, suggesting that the
model's predictions have more significant discrepancies from the
observed data.
• By minimizing the RSS, regression models can be optimized to find the
best-fitting line or curve that represents the relationship between the
independent and dependent variables.
Demonstrate the calculation of RSS:
• Suppose we have the following dataset:
X = [1, 2, 3, 4, 5] Y = [2, 4, 5, 4, 6]
• We want to fit a simple linear regression model to predict Y based on X.
• The regression model can be represented by the equation Y = β₀ + β₁X, where β₀ and β₁ are the intercept and
slope coefficients, respectively.

• Let's assume that after fitting a linear model we obtain the following predicted values for Y (these correspond to the line Y = 0.6 + 1.6X, used here purely for illustration; it is not necessarily the least-squares fit):
Predicted Y = [2.2, 3.8, 5.4, 7.0, 8.6]

• To calculate the RSS, we need to compute the squared differences between the observed Y values and the
predicted Y values, and then sum them up:
Residuals = [2 - 2.2, 4 - 3.8, 5 - 5.4, 4 - 7.0, 6 - 8.6] = [-0.2, 0.2, -0.4, -3.0, -2.6]

• Squared Residuals = [(-0.2)², 0.2², (-0.4)², (-3.0)², (-2.6)²] = [0.04, 0.04, 0.16, 9.0, 6.76]
RSS = Sum of Squared Residuals = 0.04 + 0.04 + 0.16 + 9.0 + 6.76 = 16.0

• Therefore, the RSS for this linear regression model on the given dataset is 16.0.
• This value quantifies the overall discrepancy or variance between the observed Y values and the predicted Y
values.
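The calculation above can be reproduced in a few lines of Python (illustrative; the slides' own code, if any, is not shown):

```python
# Reproducing the RSS calculation from the worked example.
Y_obs = [2, 4, 5, 4, 6]
Y_pred = [2.2, 3.8, 5.4, 7.0, 8.6]

# Residuals are observed minus predicted; RSS is the sum of their squares.
residuals = [y - yhat for y, yhat in zip(Y_obs, Y_pred)]
rss = sum(r ** 2 for r in residuals)
print(rss)   # 16.0 (up to floating-point rounding)
```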
Minimise residual sum of squares
• The x-y data set has a linear relationship, so we fit y against x by minimising the residual sum of squares.

• Next, create a function that calculates the residual sum of squares of the data against a linear model with two
parameters. Think of y = par[1] + par[2] * x.
• optim() minimises a function by varying its parameters:
• the first argument is the initial parameter values, par in this case;
• the second argument is the function to be minimised, min.RSS.
plot(y ~ x, data = dat, main = "Least square regression")
abline(a = result$par[1], b = result$par[2], col = "red")
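The min.RSS workflow can be sketched in Python with scipy.optimize.minimize (an assumed translation of the R optim() approach), using the X-Y dataset from the RSS demonstration above:

```python
# Python analogue of the R min.RSS / optim() workflow:
# fit a line by minimising the residual sum of squares.
from scipy.optimize import minimize

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 6]

def min_rss(par):
    # RSS for the linear model y = par[0] + par[1] * x
    # (Python indexes from 0 where R's par indexes from 1).
    return sum((yi - (par[0] + par[1] * xi)) ** 2 for xi, yi in zip(x, y))

result = minimize(min_rss, x0=[0.0, 1.0])
print(result.x)    # approximately [1.8, 0.8] (intercept, slope)
print(result.fun)  # minimised RSS, approximately 2.4
```

For this dataset the least-squares line is y = 1.8 + 0.8x, with a minimised RSS of 2.4, noticeably better than the RSS of 16.0 obtained for the assumed line in the earlier demonstration.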
