
Module 4

1. Linear Programming (LP)

 Formulation:
o Linear programming deals with optimizing (maximizing or minimizing) a
linear objective function subject to linear equality and inequality
constraints.
o A general LP problem can be formulated as:

 Objective function: Maximize or Minimize Z = c₁x₁ + c₂x₂ + ... + cₙxₙ
 Subject to:
 a₁₁x₁ + a₁₂x₂ + ... + a₁ₙxₙ ≤ b₁
 a₂₁x₁ + a₂₂x₂ + ... + a₂ₙxₙ ≤ b₂
 ...
 aₘ₁x₁ + aₘ₂x₂ + ... + aₘₙxₙ ≤ bₘ
 x₁, x₂, ..., xₙ ≥ 0 (non-negativity constraints)
o Where:

 xᵢ are the decision variables.
 cᵢ are the objective function coefficients.
 aᵢⱼ are the constraint coefficients.
 bᵢ are the constraint right-hand-side values.
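In matrix form, the same problem is often written compactly; a standard LaTeX rendering of the formulation above, with c the cost vector, A the m × n constraint matrix, and b the right-hand-side vector:

    \max_{x}\; c^{\top} x
    \quad \text{subject to} \quad
    A x \le b, \qquad x \ge 0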
 Solving Methods:
o Simplex Method:

 A classic and widely used algorithm.


 Iteratively moves from one feasible solution (corner point of the
feasible region) to another, improving the objective function at each
step.
 Guaranteed to find an optimal solution if one exists.
o Graphical Method:

 Used for problems with two variables.


 Involves plotting the constraint lines and identifying the feasible
region.
 The optimal solution is found at a corner point of the feasible region.
o Interior-Point Methods:

 Move through the interior of the feasible region rather than along
the edges.
 Often more efficient for large-scale problems.
 Example:
o A factory produces two types of products, A and B.

o Product A requires 2 hours of labor and 1 unit of raw material.

o Product B requires 3 hours of labor and 2 units of raw material.

o The factory has 12 hours of labor and 8 units of raw material available.

o Product A yields a profit of $5 per unit, and Product B yields a profit of $7 per unit.
o Problem: Maximize profit.

o Formulation:

 Let x₁ be the number of units of A, and x₂ be the number of units of B.
 Maximize Z = 5x₁ + 7x₂
 Subject to:
 2x₁ + 3x₂ ≤ 12 (labor constraint)
 x₁ + 2x₂ ≤ 8 (material constraint)
 x₁, x₂ ≥ 0
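A minimal sketch of solving this LP numerically (assuming SciPy is available; linprog minimizes by default, so the profit coefficients are negated):

    from scipy.optimize import linprog

    # Objective: maximize 5*x1 + 7*x2  ->  minimize -5*x1 - 7*x2
    c = [-5, -7]

    # Inequality constraints in the form A_ub @ x <= b_ub
    A_ub = [[2, 3],   # labor:    2*x1 + 3*x2 <= 12
            [1, 2]]   # material:   x1 + 2*x2 <= 8
    b_ub = [12, 8]

    # Non-negativity of the decision variables
    bounds = [(0, None), (0, None)]

    result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print(result.x, -result.fun)  # optimal plan and maximum profit

For these numbers the optimum is x₁ = 6, x₂ = 0 with a profit of $30: the labor constraint binds and two units of raw material are left over. In recent SciPy versions the default 'highs' backend picks the algorithm automatically; method='highs-ds' forces dual simplex and method='highs-ipm' an interior-point method.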
2. Nonlinear Optimization
 Description:
o Deals with optimizing objective functions that are nonlinear and/or
subject to nonlinear constraints.
o More complex than linear programming.

 Methods:
o Gradient Descent:

 Iteratively moves towards the minimum of a function by taking steps proportional to the negative of the gradient.
 Simple and widely used, but can be slow and get stuck in local minima.

 Update rule: xₙ₊₁ = xₙ − α∇f(xₙ), where α is the learning rate and ∇f(xₙ) is the gradient of f at xₙ.
o Newton's Method:

 Uses the second derivative (Hessian matrix) to find the minimum.


 Converges faster than gradient descent, but requires calculating the Hessian, which can be computationally expensive.
 Update rule: xₙ₊₁ = xₙ − [Hf(xₙ)]⁻¹∇f(xₙ), where Hf(xₙ) is the Hessian matrix of f at xₙ.
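In one dimension the Hessian reduces to the second derivative f''(x). A minimal sketch of the update rule, applied to the example f(x) = x² + 4cos(x) used later in this module (newton_minimize is an illustrative helper, not a library function):

    import math

    def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
        # Newton step for 1-D minimization: x <- x - f'(x) / f''(x)
        x = x0
        for _ in range(max_iter):
            step = df(x) / d2f(x)
            x -= step
            if abs(step) < tol:
                break
        return x

    # f(x) = x^2 + 4*cos(x)  ->  f'(x) = 2x - 4*sin(x), f''(x) = 2 - 4*cos(x)
    df = lambda x: 2*x - 4*math.sin(x)
    d2f = lambda x: 2 - 4*math.cos(x)

    print(newton_minimize(df, d2f, x0=2.0))  # ~1.8955, a local minimum (f'' > 0 there)

Note that Newton's method only seeks stationary points: near x = 0 this f has f''(x) < 0, so an iteration started there would be drawn to the local maximum at x = 0 instead.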
o Quasi-Newton Methods:

 Approximate the Hessian, reducing computational cost.


 Examples include BFGS and L-BFGS.
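As a sketch, these methods are available directly through SciPy's general-purpose minimizer (assuming SciPy is installed); applied to the same f(x) = x² + 4cos(x):

    import numpy as np
    from scipy.optimize import minimize

    f = lambda x: x[0]**2 + 4*np.cos(x[0])

    # BFGS builds up a Hessian approximation from successive gradients;
    # the gradient itself is estimated numerically when not supplied.
    result = minimize(f, x0=[2.0], method='BFGS')
    print(result.x, result.fun)  # ~[1.8955], ~2.317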
 Example:
o Minimize f(x) = x² + 4cos(x)

o Gradient descent can be used to find the minimum by iteratively updating x, as in the sketch below.
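A minimal sketch, assuming the hand-computed derivative f'(x) = 2x − 4sin(x) and a fixed learning rate (gradient_descent is an illustrative helper):

    import math

    def gradient_descent(df, x0, alpha=0.1, tol=1e-8, max_iter=10000):
        # Repeatedly step against the gradient: x <- x - alpha * f'(x)
        x = x0
        for _ in range(max_iter):
            step = alpha * df(x)
            x -= step
            if abs(step) < tol:
                break
        return x

    # f(x) = x^2 + 4*cos(x)  ->  f'(x) = 2x - 4*sin(x)
    df = lambda x: 2*x - 4*math.sin(x)

    print(gradient_descent(df, x0=2.0))   # ~1.8955
    print(gradient_descent(df, x0=-2.0))  # ~-1.8955, the symmetric minimum

Since f is even, it has two global minima at x ≈ ±1.8955; which one the iteration finds depends entirely on the starting point, illustrating the sensitivity to initialization noted above.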
3. Applications in Algorithm Design and Resource Allocation
 Algorithm Design:
o Optimization is used to design efficient algorithms.

o Example: Finding the shortest path in a graph (Dijkstra's algorithm, A* search); a small sketch of Dijkstra follows this list.
o Another example is optimizing the weights of a neural network.
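A compact sketch of Dijkstra's algorithm with a binary heap; the adjacency-dict encoding and the node names are illustrative:

    import heapq

    def dijkstra(graph, source):
        # Shortest-path distances from source in a graph given as
        # {node: [(neighbor, weight), ...]} with non-negative weights.
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float('inf')):
                continue  # stale heap entry; a shorter path was already found
            for v, w in graph[u]:
                if d + w < dist.get(v, float('inf')):
                    dist[v] = d + w
                    heapq.heappush(heap, (dist[v], v))
        return dist

    graph = {'a': [('b', 2), ('c', 5)],
             'b': [('c', 1), ('d', 4)],
             'c': [('d', 1)],
             'd': []}
    print(dijkstra(graph, 'a'))  # {'a': 0, 'b': 2, 'c': 3, 'd': 4}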

 Resource Allocation:
o Optimizing the allocation of limited resources (e.g., time, money,
manpower).
o Example: Scheduling tasks to minimize completion time.

o Example: optimizing the quantity of each product to produce, given a limited amount of raw materials (as in the factory LP example above).
o Example: bandwidth allocation in networks.

4. Constrained Optimization
 Description:
o Optimization problems where the solution must satisfy certain constraints.

o Constraints can be equality or inequality constraints.

 Methods:
o Lagrange Multipliers:

 Used to solve equality-constrained optimization problems.


 Introduces Lagrange multipliers to convert the constrained problem
into an unconstrained problem.
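In symbols, for a single equality constraint g(x) = 0 the method seeks stationary points of the Lagrangian; a standard LaTeX statement:

    \mathcal{L}(x, \lambda) = f(x) - \lambda\, g(x),
    \qquad
    \nabla_x \mathcal{L} = 0 \;\Longleftrightarrow\; \nabla f(x) = \lambda \nabla g(x),
    \qquad
    g(x) = 0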
o Karush-Kuhn-Tucker (KKT) Conditions:

 Generalize Lagrange multipliers to handle inequality constraints.


 Provide necessary conditions for optimality.
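For reference, for minimizing f(x) subject to gᵢ(x) ≤ 0 and hⱼ(x) = 0, the KKT conditions read (in LaTeX):

    \nabla f(x) + \sum_i \mu_i \nabla g_i(x) + \sum_j \lambda_j \nabla h_j(x) = 0
    \quad \text{(stationarity)}
    g_i(x) \le 0, \quad h_j(x) = 0 \quad \text{(primal feasibility)}
    \mu_i \ge 0 \quad \text{(dual feasibility)}
    \mu_i\, g_i(x) = 0 \quad \text{(complementary slackness)}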
o Sequential Quadratic Programming (SQP):

 Solves constrained nonlinear optimization problems by iteratively solving quadratic programming subproblems.
o Penalty Methods:

 Adds a penalty term to the objective function when constraints are violated, as in the sketch below.
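A minimal quadratic-penalty sketch, applied to the first example below (maximize xy on the unit circle); the helper name and the value of mu are illustrative, and SciPy is assumed:

    from scipy.optimize import minimize

    # maximize xy subject to x^2 + y^2 = 1, i.e. minimize -xy;
    # mu weights the squared violation of the equality constraint.
    def penalized(v, mu=100.0):
        x, y = v
        return -x * y + mu * (x**2 + y**2 - 1.0)**2

    result = minimize(penalized, x0=[0.5, 0.5])
    print(result.x)  # ~ (0.708, 0.708)

The answer sits slightly outside the circle (the exact constrained maximizer is (1/√2, 1/√2) ≈ (0.707, 0.707)); this small bias is characteristic of penalty methods and shrinks as mu grows.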


 Example:
o Maximize f(x, y) = xy subject to x² + y² = 1

o Using Lagrange multipliers: ∇f = λ∇g gives y = 2λx and x = 2λy, so x² = y², and the constraint then forces x = ±1/√2. The maximum f = 1/2 is attained at (1/√2, 1/√2) and (−1/√2, −1/√2).
o Minimize f(x, y) = (x − 1)² + (y − 2)² subject to x + y ≥ 1. This is a constrained optimization problem with an inequality constraint; here the unconstrained minimum (1, 2) already satisfies x + y ≥ 1, so the constraint is inactive and, by complementary slackness, its KKT multiplier is zero. Both examples are solved numerically in the sketch below.
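Both examples can be checked with SciPy's SLSQP solver (the sequential quadratic programming method listed above); a minimal sketch, with constraints given in SciPy's dictionary form:

    from scipy.optimize import minimize

    # Example 1: maximize xy subject to x^2 + y^2 = 1 (minimize -xy).
    eq_con = {'type': 'eq', 'fun': lambda v: v[0]**2 + v[1]**2 - 1}
    r1 = minimize(lambda v: -v[0] * v[1], x0=[0.5, 0.5],
                  method='SLSQP', constraints=[eq_con])
    print(r1.x, -r1.fun)  # ~(0.707, 0.707), maximum ~0.5

    # Example 2: minimize (x-1)^2 + (y-2)^2 subject to x + y >= 1.
    # SLSQP expects inequality constraints as fun(v) >= 0.
    ineq_con = {'type': 'ineq', 'fun': lambda v: v[0] + v[1] - 1}
    r2 = minimize(lambda v: (v[0] - 1)**2 + (v[1] - 2)**2, x0=[0.0, 0.0],
                  method='SLSQP', constraints=[ineq_con])
    print(r2.x)  # ~(1, 2): the constraint is inactive at the optimum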
