Linear Programming Models-1
What is an optimization (programming) model?
An optimization model seeks to find the values of the decision variables that optimize (maximize or minimize) an objective function among the set of all values for the decision variables that satisfy the given constraints.
Its three main components are:
•Decision variables
•Objective function
•Constraints
The solution of the optimization model is called the optimal feasible solution.
Modeling Steps
Accurately modeling an operations research problem is the most significant, and sometimes the most difficult, task. A wrong model will lead to a wrong solution and thus will not solve the original problem. The following steps should be performed by team members with different areas of expertise to obtain an accurate and more complete view of the model:
•Problem definition: defining the scope of the project; the result is the identification of three elements: a description of the decision variables, the determination of the objective, and the determination of the limitations (i.e., constraints).
•Model construction: translating the problem definition into mathematical relationships.
•Model solution: solving the model using a standard optimization algorithm. Upon obtaining a solution, a sensitivity analysis should be performed to find out how the solution behaves when some of the parameters change.
•Model validity: checking whether the model behaves as intended.
•Implementation: translating the model and its results into a recommended solution.
Linear Programming
•Linear programming (also referred to as LP) is an operations research
technique used when all the objectives and constraints are linear (in
the variables) and when all the decision variables are continuous. In
the hierarchy of operations research techniques, linear programming
is considered among the easiest.
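In general form, a linear program optimizes a linear objective subject to linear constraints:
Z (min or max) = c_1 x_1 + c_2 x_2 + ... + c_n x_n
subject to a_i1 x_1 + a_i2 x_2 + ... + a_in x_n (<=, =, or >=) b_i for each constraint i, with x_j >= 0 for all j.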
The chart below gives the nutrient content as well as the per-unit cost of each food item. The diet must be planned in such a way that it contains at least 500 calories, 6 grams of protein, 10 grams of carbohydrates, and 8 grams of fat.

Food item   Cost/unit ($)   Calories   Protein (g)   Carbohydrates (g)   Fat (g)
Food1       0.50            400        3             2                   2
Food2       0.20            200        2             2                   4
Food3       0.30            150        0             4                   1
Food4       0.80            500        0             4                   5
1. Identify the decision variables: the quantities of the food items to include are the decision variables in this example:
a = units of Food1, b = units of Food2, c = units of Food3, d = units of Food4
2. Objective function: since we want to minimize the cost of the diet,
Z(min) = 0.5a + 0.2b + 0.3c + 0.8d
3. Define the constraints:
400a + 200b + 150c + 500d >= 500  (calories)
3a + 2b + 0c + 0d >= 6            (protein)
2a + 2b + 4c + 4d >= 10           (carbohydrates)
2a + 4b + 1c + 5d >= 8            (fat)
4. Non-negativity restrictions: a >= 0, b >= 0, c >= 0, d >= 0
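This formulation can be solved directly in R. Below is a minimal sketch assuming the lpSolve package (one common choice for small LPs; the slides do not show which solver was actually used), storing the result in opt:

library(lpSolve)

# Per-unit cost of Food1..Food4 (objective coefficients)
costs <- c(0.5, 0.2, 0.3, 0.8)

# One constraint row per nutrient: calories, protein, carbohydrates, fat
nutrients <- matrix(c(400, 200, 150, 500,
                      3,   2,   0,   0,
                      2,   2,   4,   4,
                      2,   4,   1,   5),
                    nrow = 4, byrow = TRUE)

# Minimum required amount of each nutrient
requirements <- c(500, 6, 10, 8)

# Solve the minimization LP; lp() enforces non-negativity by default
opt <- lp(direction = "min",
          objective.in = costs,
          const.mat = nutrients,
          const.dir = rep(">=", 4),
          const.rhs = requirements)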
opt$solution  # optimal units of Food1..Food4: 0 3 1 0
opt$objval    # minimum diet cost: 0.9
Conclusion:
•The optimal solution, i.e., the minimum-cost diet, is for Sara to have 0 units of Food1, 3 units of Food2, 1 unit of Food3, and 0 units of Food4. The cost of this diet will be $0.90, and it will provide her with at least 500 calories, 6 grams of protein, 10 grams of carbohydrates, and 8 grams of fat.
Sample Question - 3
Decision variables:
X_1 = number of plots of parsnips grown
X_2 = number of plots of kale grown
Objective function:
Maximize Profits = 0.15 X_1 + 0.40 X_2
• The Feasible Region is the intersection of all the constraint regions.
• Objective function parameters are just the coefficients of X_1 and X_2 in the
objective function:
• Profits = 0.15 X_1 + 0.40 X_2
In the example that follows, we use optim() to find the parameter values that
minimise a given objective function. The optimised parameter values and the
minimum value of the objective function are then printed.
• The Residual Sum of Squares (RSS), also known as the Sum of Squared Residuals
(SSR), is a metric that measures the discrepancy between the actual values of the
dependent variable and the values predicted by a regression model.
• In a regression analysis, the goal is to find a line or curve that best fits the observed
data points. The residuals are the differences between the observed values and the
predicted values on the regression line. The RSS is calculated by squaring each
residual and summing the squared residuals.
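Formally, for n observations with observed values y_i and predicted values ŷ_i, the calculation described above is:
RSS = (y_1 − ŷ_1)² + (y_2 − ŷ_2)² + ... + (y_n − ŷ_n)²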
• Let's assume the observed values are Y = [2, 4, 5, 4, 6] and that, after fitting the linear regression model, we obtain the following predicted values for Y:
Predicted Y = [2.2, 3.8, 5.4, 7.0, 8.6]
• To calculate the RSS, we need to compute the squared differences between the observed Y values and the
predicted Y values, and then sum them up:
Residuals = [2 - 2.2, 4 - 3.8, 5 - 5.4, 4 - 7.0, 6 - 8.6] = [-0.2, 0.2, -0.4, -3.0, -2.6]
• Squared Residuals = [(-0.2)², 0.2², (-0.4)², (-3.0)², (-2.6)²] = [0.04, 0.04, 0.16, 9.0, 6.76]
RSS = Sum of Squared Residuals = 0.04 + 0.04 + 0.16 + 9.0 + 6.76 = 16.0
• Therefore, the RSS for this linear regression model on the given dataset is 16.0.
• This value quantifies the overall discrepancy or variance between the observed Y values and the predicted Y
values.
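This arithmetic is easy to verify in R, using the observed and predicted values from the example above:

observed  <- c(2, 4, 5, 4, 6)
predicted <- c(2.2, 3.8, 5.4, 7.0, 8.6)
sum((observed - predicted)^2)  # residual sum of squares: 16.0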
Minimise residual sum of squares
• Suppose an x-y data set has a linear relationship; we can therefore fit y against x by minimising the residual sum of squares.
• Next, create a function that calculates the residual sum of squares of the data against a linear model with two
parameters. Think of y = par[1] + par[2] * x (see the sketch after this list).
• optim() minimises a function by varying its parameters:
• its first argument is the initial parameter values, par in this case;
• its second argument is the function to be minimised, min.RSS.
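Putting the pieces together, a minimal runnable sketch follows; the values in dat are hypothetical example data, since the original data set is not shown in these notes:

# Hypothetical example data with a roughly linear x-y relationship
dat <- data.frame(x = 1:6, y = c(1, 3, 5, 6, 8, 12))

# Residual sum of squares of the data against the linear model
# y = par[1] + par[2] * x
min.RSS <- function(data, par) {
  with(data, sum((par[1] + par[2] * x - y)^2))
}

# optim() varies par to minimise min.RSS;
# c(0, 1) is the initial guess for the (intercept, slope) parameters
result <- optim(par = c(0, 1), fn = min.RSS, data = dat)

result$par    # fitted intercept and slope
result$value  # minimised residual sum of squares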
# Scatter plot of the data with the fitted line overlaid in red
plot(y ~ x, data = dat, main = "Least square regression")
abline(a = result$par[1], b = result$par[2], col = "red")