Lecture 3
• Khushbu (2019MT10701)
1 Overview
In the last lecture, we gave a basic overview of linear programming. We also discussed some basic
terminology: decision variables, cost vector, objective function, feasible solution/region, and
optimal solution. In LP, we try to maximize (or minimize) a linear function subject to linear
constraints. We discussed two important problems: the transportation problem and the communication
network problem. In this lecture, we shall introduce piecewise linear convex functions and graphical
solutions of two-variable LP problems.
2 Main Section
A function f : Rn → R is called convex if for every x, y ∈ Rn and every λ ∈ [0, 1], we have
f (λx + (1 − λ)y) ≤ λf (x) + (1 − λ)f (y).
A function f : Rn → R is called concave if for every x, y ∈ Rn and every λ ∈ [0, 1], we have
f (λx + (1 − λ)y) ≥ λf (x) + (1 − λ)f (y).
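The defining inequality can be spot-checked numerically on a grid of sample points. The following sketch is illustrative (the helper name and sample grid are not from the lecture); it is a finite check, not a proof of convexity.

```python
import numpy as np

def is_convex_on_samples(f, xs=None, lambdas=None, tol=1e-9):
    """Spot-check f(λx + (1−λ)y) <= λf(x) + (1−λ)f(y) on sample points."""
    if xs is None:
        xs = np.linspace(-3.0, 3.0, 25)
    if lambdas is None:
        lambdas = np.linspace(0.0, 1.0, 11)
    for x in xs:
        for y in xs:
            for lam in lambdas:
                lhs = f(lam * x + (1 - lam) * y)
                rhs = lam * f(x) + (1 - lam) * f(y)
                if lhs > rhs + tol:
                    return False
    return True

print(is_convex_on_samples(lambda x: x ** 2))      # True: x^2 is convex
print(is_convex_on_samples(lambda x: -(x ** 2)))   # False: -x^2 is concave
```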
Example 2: −x² and √x are concave functions.
Figure 1: a) A convex function. b) A concave function. c) A function that is neither convex nor
concave.
Q2: For a convex function, can there be a local minimum that fails to be a global minimum?
Ans: No, any local minimum of a convex function is also a global minimum.
Suppose f is convex, and let x∗ be a local minimum of f in the convex set χ. Then for some neigh-
borhood N ⊆ χ about x∗ , we have f (x) ≥ f (x∗ ) ∀x ∈ N . Suppose towards a contradiction that
there exists x′ ∈ χ such that f (x′ ) < f (x∗ ).
Consider the line segment x(t) = tx∗ + (1 − t)x′ , t ∈ [0, 1], noting that x(t) ∈ χ by the convexity of
χ. Then by the convexity of f ,
f (x(t)) ≤ tf (x∗ ) + (1 − t)f (x′ ) < tf (x∗ ) + (1 − t)f (x∗ ) = f (x∗ ) for all t ∈ [0, 1).
But x(t) → x∗ as t → 1, so x(t) ∈ N for t close enough to 1, and then f (x(t)) < f (x∗ ) contradicts
the local minimality of x∗ . Hence no such x′ exists.
It follows that f (x∗ ) ≤ f (x) for all x ∈ χ, so x∗ is a global minimum of f in χ.
Theorem 1. Let f1 , ..., fm : Rn → R be convex functions. Then the function f defined by f (x) =
maxi=1,...,m fi (x) is also convex.
A function of the form maxi=1,...,m (cTi x + di ) is called a piecewise linear convex function.
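Theorem 1 can be illustrated numerically for a piecewise linear convex function: the maximum of finitely many affine pieces satisfies the convexity inequality at random sample points. The data C and d below are made-up illustrative values, not from the lecture.

```python
import numpy as np

# Hypothetical data: three affine pieces c_i^T x + d_i on R^2.
C = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
d = np.array([0.0, 0.5, 1.0])

def f(x):
    # Piecewise linear convex function: max over the affine pieces.
    return np.max(C @ x + d)

rng = np.random.default_rng(0)
ok = True
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    lam = rng.uniform()
    # Convexity inequality at a random pair and a random λ.
    ok &= f(lam * x + (1 - lam) * y) <= lam * f(x) + (1 - lam) * f(y) + 1e-9
print(ok)  # True
```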
A generalization of LP, where the objective function is piecewise linear and convex, is:
Minimize maxi=1,...,m (cTi x + di )
Subject to Ax ≥ b
Note that maxi=1,...,m (cTi x + di ) is equal to the smallest number z that satisfies z ≥ (cTi x + di ) ∀i.
The above generalization is equivalent to:
Minimize z
Subject to z ≥ (cTi x + di ), i = 1, ..., m
Ax ≥ b
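This epigraph reformulation can be handed to an LP solver directly. A minimal sketch using scipy.optimize.linprog on a made-up one-dimensional instance (the data C, d, A, b below are hypothetical): minimize max(x − 1, −x − 1) subject to x ≥ 0.5, whose optimum is x = 0.5 with value −0.5.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: minimize max(x - 1, -x - 1) subject to x >= 0.5.
C = np.array([[1.0], [-1.0]])   # rows c_i^T
d = np.array([-1.0, -1.0])
A = np.array([[1.0]])           # constraint Ax >= b
b = np.array([0.5])

m, n = C.shape
# Variables v = (x, z); minimize z.
obj = np.r_[np.zeros(n), 1.0]
# c_i^T x - z <= -d_i, and -Ax <= -b (z does not appear in the Ax >= b rows).
A_ub = np.vstack([
    np.c_[C, -np.ones((m, 1))],
    np.c_[-A, np.zeros((A.shape[0], 1))],
])
b_ub = np.r_[-d, -b]

# linprog defaults to x >= 0, so make all variables free explicitly.
res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
print(res.x[:n], res.fun)   # x ≈ [0.5], optimal value ≈ -0.5
```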
A related problem is minimizing ∑ni=1 ci |xi | subject to Ax ≥ b, where the ci are nonnegative.
Since |xi | is the smallest zi satisfying both xi ≤ zi and −xi ≤ zi , this problem is equivalent to:
Minimize ∑ni=1 ci zi
Subject to Ax ≥ b
xi ≤ zi , i = 1, ..., n
−xi ≤ zi , i = 1, ..., n
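The absolute-value reformulation above can also be solved with an LP solver. A sketch on a made-up instance (the data c, A, b are hypothetical): minimize 2|x1| + |x2| subject to x1 + x2 ≥ 1, whose optimum puts all the weight on the cheaper variable, x = (0, 1) with value 1.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: minimize 2|x1| + |x2| subject to x1 + x2 >= 1.
c = np.array([2.0, 1.0])        # the ci must be nonnegative for the equivalence
A = np.array([[1.0, 1.0]])      # constraint Ax >= b
b = np.array([1.0])
n = len(c)

I = np.eye(n)
obj = np.r_[np.zeros(n), c]                    # variables v = (x, z)
A_ub = np.vstack([
    np.c_[-A, np.zeros((A.shape[0], n))],      # -Ax <= -b
    np.c_[I, -I],                              #  x_i <= z_i
    np.c_[-I, -I],                             # -x_i <= z_i
])
b_ub = np.r_[-b, np.zeros(2 * n)]

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * n))
print(res.x[:n], res.fun)   # x ≈ [0, 1], optimal value ≈ 1
```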
Chebyshev approximation problem: minimize maxi=1,...,k |aTi x − bi |.
Here x ∈ Rn is the variable, and a1 , ..., ak ∈ Rn , b1 , ..., bk ∈ R are parameters that specify the
problem instance. In LP form:
Minimize t
Subject to aTi x − t ≤ bi , i = 1, ..., k,
−aTi x − t ≤ −bi , i = 1, ..., k,
x ∈ Rn and t ∈ R.
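This LP form can be checked on a small made-up instance (the data below are hypothetical): with k = 2 scalar equations x ≈ 0 and x ≈ 4, minimizing max(|x − 0|, |x − 4|) should give the midpoint x = 2 with t = 2.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: n = 1, k = 2, a_i = 1, b = (0, 4),
# i.e. minimize max(|x - 0|, |x - 4|); the optimum is x = 2, t = 2.
Amat = np.array([[1.0], [1.0]])   # rows a_i^T
b = np.array([0.0, 4.0])
k, n = Amat.shape

obj = np.r_[np.zeros(n), 1.0]                 # variables (x, t); minimize t
A_ub = np.vstack([
    np.c_[Amat, -np.ones((k, 1))],            #  a_i^T x - t <= b_i
    np.c_[-Amat, -np.ones((k, 1))],           # -a_i^T x - t <= -b_i
])
b_ub = np.r_[b, -b]

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
print(res.x)   # ≈ [2, 2]
```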
Here the feasible set is the shaded region in Fig 2. For any given scalar z, we consider the set of
all points whose cost c′ x is equal to z; this is the line described by the equation −x1 − x2 = z,
which is perpendicular to the vector c = (−1, −1). Note that different values of z lead to different
lines, all of them parallel to each other. Increasing the value of z corresponds to moving the line
−x1 − x2 = z along the direction of the vector c. Since we want the value of z to be minimal, we
should move the line in the direction of −c. The best we can do is z = −2 (see Figure 2), and the
vector x = (1, 1), a corner point, is an optimal solution (optimal solutions occur at a corner point).
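The constraint set of this example is not reproduced in these notes. Assuming a standard instance with the stated optimum (the constraints below are an assumption, chosen so that the corner point (1, 1) with cost −2 is optimal), the graphical conclusion can be verified with an LP solver:

```python
from scipy.optimize import linprog

# Assumed constraints (not given in these notes):
#   minimize -x1 - x2  s.t.  x1 + 2*x2 <= 3,  2*x1 + x2 <= 3,  x1, x2 >= 0.
res = linprog(c=[-1.0, -1.0],
              A_ub=[[1.0, 2.0], [2.0, 1.0]],
              b_ub=[3.0, 3.0],
              bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # ≈ [1, 1], -2: the corner point (1, 1) is optimal
```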
Figure 2: Graphical solution of the Example 6.