
MTL103: Optimization Methods and Applications Spring 2023

Lecture 3 — 6th Jan, 2023


Lecturer: Prof. Minati De Scribe: Team 3

• Khushbu (2019MT10701)

• Adika Malviya (2019MT10670)

• Ishita Agrawal (2019MT10695)

• Srinija Varma (2019MT10690)

• Shruti Jain (2019MT10726)

• Pradim Siwa (2019MT10712)

1 Overview

In the last lecture, we gave a basic overview of linear programming. We also discussed some basic
terminology: decision variables, cost vector, objective function, feasible solution/region, and
optimal solution. In an LP, we maximize (or minimize) a linear function subject to linear
constraints. We discussed two important problems: the transportation problem and the communication
network problem. In this lecture, we introduce piecewise linear convex functions and graphical
solutions of two-variable LP problems.

2 Main Section

2.1 Convex Function

A function f : Rn → R is called convex if for every x, y ∈ Rn , and every λ ∈ [0, 1], we have
f (λx + (1 − λ)y) ≤ λf (x) + (1 − λ)f (y).

Example 1: − log(x) (for x > 0), e^{ax}

2.2 Concave Function

A function f : Rn → R is called concave if for every x, y ∈ Rn , and every λ ∈ [0, 1], we have
f (λx + (1 − λ)y) ≥ λf (x) + (1 − λ)f (y).

Example 2: −x^2 , √x (for x ≥ 0)

Figure 1: a) A convex function. b) A concave function. c) A function that is neither convex nor
concave.

Q1: Why are convex (or concave) functions important in optimization?


Ans: Convex (or concave) functions play a central role in optimization because they allow for
efficient and effective optimization algorithms. The property of convexity (or concavity) provides
important information about the shape of the function, which can be used to determine the global
minimum (or maximum) of the function. In particular, for a convex function, any local minimum is
also a global minimum, and for a concave function, any local maximum is also a global maximum.

2.3 Local and Global Minimum

A vector x is a local minimum of f if f (x) ≤ f (y) for all y in the vicinity of x.


A vector x is a global minimum of f if f (x) ≤ f (y) for all y.

Q2: For a convex function, can there be a local minimum that fails to be a global minimum?

Ans: No, any local minimum of a convex function is also a global minimum.
Suppose f is convex, and let x∗ be a local minimum of f in the convex set χ. Then for some
neighborhood N ⊆ χ about x∗ , we have f (x) ≥ f (x∗ ) ∀x ∈ N . Suppose towards a contradiction
that there exists x′ ∈ χ such that f (x′ ) < f (x∗ ).

Consider the line segment x(t) = tx∗ + (1 − t)x′ , t ∈ [0, 1], noting that x(t) ∈ χ by the convexity
of χ. Then by the convexity of f,

f (x(t)) ≤ tf (x∗ ) + (1 − t)f (x′ ) < tf (x∗ ) + (1 − t)f (x∗ ) = f (x∗ )

for all t ∈ (0, 1).


We can pick t to be sufficiently close to 1 that x(t) ∈ N ; then f (x(t)) ≥ f (x∗ ) by the definition of
N, but f (x(t)) < f (x∗ ) by the above inequality, a contradiction.

It follows that f (x∗ ) ≤ f (x) for all x ∈ χ , so x∗ is a global minimum of f in χ.

Theorem 1. Let f1 , ..., fm : Rn → R be convex functions. Then the function f defined by f (x) =
maxi=1,...,m fi (x) is also convex.

Proof. Let x, y ∈ Rn and let λ ∈ [0, 1]. We have,


f (λx + (1 − λ)y) = maxi=1,..,m fi (λx + (1 − λ)y)

≤ maxi=1,..,m (λfi (x) + (1 − λ)fi (y))

≤ maxi=1,..,m λfi (x) + maxi=1,..,m (1 − λ)fi (y)

= λf (x) + (1 − λ)f (y).


Hence proved.
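Theorem 1 can be spot-checked numerically. The sketch below samples random points and verifies the convexity inequality for f (x) = max{x^2, e^x}; the choice of the two convex pieces is our own illustrative example, not from the lecture.

```python
import numpy as np

# Spot-check of Theorem 1: f(x) = max_i f_i(x) satisfies the convexity
# inequality when each f_i does. Here f_1(x) = x^2 and f_2(x) = e^x,
# both convex on R.
f = lambda x: max(x**2, np.exp(x))

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.uniform(-3.0, 3.0, size=2)
    lam = rng.uniform(0.0, 1.0)
    # f(lam*x + (1-lam)*y) <= lam*f(x) + (1-lam)*f(y), up to rounding
    assert f(lam * x + (1 - lam) * y) <= lam * f(x) + (1 - lam) * f(y) + 1e-12
```

Of course, passing at sampled points is not a proof; the proof above is what establishes convexity for all x, y, λ.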

2.4 Piecewise Linear Convex Function

A function of the form maxi=1,...,m (cTi x + di ) is called a piecewise linear convex function.

Example 3: The absolute value function f (x) = |x| = max{x, −x}.

2.5 Piecewise Linear Convex Objective Function

A generalization of LP, where the objective function is piecewise linear and convex, is:
Minimize maxi=1,...,m (cTi x + di )
Subject to Ax ≥ b
Note that maxi=1,...,m (cTi x + di ) is equal to the smallest number z that satisfies z ≥ cTi x + di ∀i.
The above generalization is equivalent to:
Minimize z
Subject to z ≥ cTi x + di , i = 1, ..., m
Ax ≥ b
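This epigraph reformulation can be solved with any LP solver. The sketch below assumes SciPy is available; the three affine pieces and the single constraint are a made-up instance of our own, not from the lecture.

```python
import numpy as np
from scipy.optimize import linprog

# Epigraph reformulation: minimize max_i (c_i^T x + d_i) s.t. A x >= b
# becomes: minimize z s.t. c_i^T x - z <= -d_i and -A x <= -b,
# over the stacked variable vector (x, z).
C = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])  # rows are c_i^T
d = np.array([0.0, 0.0, 2.0])
A = np.array([[1.0, 1.0]])  # one constraint: x1 + x2 >= 1
b = np.array([1.0])

m, n = C.shape
cost = np.zeros(n + 1)
cost[-1] = 1.0  # minimize z, the last variable
A_ub = np.vstack([
    np.hstack([C, -np.ones((m, 1))]),          #  c_i^T x - z <= -d_i
    np.hstack([-A, np.zeros((len(b), 1))]),    # -A x <= -b
])
b_ub = np.concatenate([-d, -b])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
x_opt, z_opt = res.x[:n], res.x[-1]
# At the optimum, z equals the pointwise maximum of the affine pieces.
assert abs(z_opt - np.max(C @ x_opt + d)) < 1e-7
```

For this instance the optimal value works out to 2/3, attained where two of the affine pieces are equal, which is typical of min-max problems.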

Example 4: Problems involving absolute values:

Minimize ∑ni=1 ci |xi |
Subject to Ax ≥ b

Here we assume ci ≥ 0 for all i (otherwise the reformulation below is not valid), and |xi | is the
smallest number zi that satisfies xi ≤ zi and −xi ≤ zi .

And this problem is equivalent to:
Minimize ∑ni=1 ci zi
Subject to Ax ≥ b
xi ≤ zi , i = 1, ..., n
−xi ≤ zi , i = 1, ..., n
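The same reformulation in code (a sketch assuming SciPy is available; the cost vector and the single constraint are our own small instance):

```python
import numpy as np
from scipy.optimize import linprog

# minimize sum_i c_i |x_i| s.t. A x >= b, with c >= 0, via the
# auxiliary variables z_i and constraints x_i <= z_i, -x_i <= z_i.
c = np.array([1.0, 2.0])     # c >= 0 is required for the equivalence
A = np.array([[1.0, 1.0]])
b = np.array([1.0])          # feasible region: x1 + x2 >= 1

n = len(c)
cost = np.concatenate([np.zeros(n), c])  # minimize c^T z over (x, z)
I = np.eye(n)
A_ub = np.vstack([
    np.hstack([I, -I]),                    #  x_i - z_i <= 0
    np.hstack([-I, -I]),                   # -x_i - z_i <= 0
    np.hstack([-A, np.zeros_like(A)]),     # -A x <= -b
])
b_ub = np.concatenate([np.zeros(2 * n), -b])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * n))
x_opt, z_opt = res.x[:n], res.x[n:]
# At the optimum the auxiliary variables equal the absolute values.
assert np.allclose(z_opt, np.abs(x_opt), atol=1e-7)
```

Here the solver puts all the weight on the cheaper coordinate, x = (1, 0), giving objective value 1.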

Example 5: Chebyshev approximation problem:

Minimize maxi=1,...,k |aTi x − bi |.

Here x ∈ Rn is the variable, and a1 , ..., ak ∈ Rn , b1 , ..., bk ∈ R are parameters that specify the
problem instance.
Chebyshev approximation problem in LP form:

Minimize t
Subject to aTi x − t ≤ bi , i = 1, ..., k,
−aTi x − t ≤ −bi , i = 1, ..., k,
x ∈ Rn and t ∈ R.
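The Chebyshev LP in code (a sketch assuming SciPy is available; the random data matrix and the vector x_true are our own, chosen so that the data is exactly consistent and the optimal residual is zero):

```python
import numpy as np
from scipy.optimize import linprog

# Chebyshev approximation: minimize t s.t. a_i^T x - t <= b_i
# and -a_i^T x - t <= -b_i, over the stacked variable (x, t).
rng = np.random.default_rng(1)
k, n = 20, 3
Amat = rng.standard_normal((k, n))   # rows are a_i^T
x_true = np.array([1.0, -2.0, 0.5])
b = Amat @ x_true                    # consistent data: max residual 0 is achievable

cost = np.zeros(n + 1)
cost[-1] = 1.0                       # minimize t, the last variable
ones = np.ones((k, 1))
A_ub = np.vstack([
    np.hstack([Amat, -ones]),        #  a_i^T x - t <= b_i
    np.hstack([-Amat, -ones]),       # -a_i^T x - t <= -b_i
])
b_ub = np.concatenate([b, -b])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
x_opt, t_opt = res.x[:n], res.x[-1]
assert t_opt < 1e-7                  # optimal max |a_i^T x - b_i| is zero here
```

With noisy b the optimal t would instead be the smallest achievable worst-case residual, which is exactly what the min-max objective asks for.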

3 Graphical Representation and Solution

Consider the following LP problem with two variables:

Example 6: Minimize −x1 − x2


Subject to x1 + 2x2 ≤ 3
2x1 + x2 ≤ 3
x1 , x2 ≥ 0

Here the feasible set is the shaded region in Figure 2. For any given scalar z, we consider the
set of all points whose cost c′ x is equal to z; this is the line described by the equation −x1 − x2 = z,
and this line is perpendicular to the vector c = (−1, −1). Note that different values of z lead to
different lines, all of them parallel to each other. Increasing the value of z corresponds to
moving the line −x1 − x2 = z along the direction of the vector c. Since we want the value of z to be
minimal, we should move the line in the direction of −c. The best we can do is z = −2 (see Figure
2), and the vector x = (1, 1) (a corner point) is an optimal solution (an optimal solution occurs at
a corner point).

Figure 2: Graphical solution of the Example 6.
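The corner-point reasoning above can be checked numerically. The sketch below (pure NumPy) enumerates all intersections of pairs of constraint boundary lines, keeps the feasible ones, and picks the cheapest; this brute-force approach is only practical for tiny two-variable instances like Example 6.

```python
import itertools
import numpy as np

# Example 6 in the form G x <= h (the sign constraints x_i >= 0
# are written as -x_i <= 0), with cost vector c = (-1, -1).
G = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
h = np.array([3.0, 3.0, 0.0, 0.0])
cost = np.array([-1.0, -1.0])

vertices = []
for i, j in itertools.combinations(range(len(h)), 2):
    M = G[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                          # parallel boundaries: no intersection
    v = np.linalg.solve(M, h[[i, j]])
    if np.all(G @ v <= h + 1e-9):         # keep only feasible corner points
        vertices.append(v)

best = min(vertices, key=lambda v: cost @ v)
# Optimal corner point x = (1, 1) with cost z = -2, matching Figure 2.
assert np.allclose(best, [1.0, 1.0]) and abs(cost @ best - (-2.0)) < 1e-12
```

The feasible corner points found are (0, 0), (1.5, 0), (0, 1.5), and (1, 1), which are exactly the vertices of the shaded region in Figure 2.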
