HW3 Solution

This document contains 4 problems related to optimization. Problem 1 asks about the existence of an optimal solution for minimizing the log function subject to x being non-negative. Problem 2 involves finding the global minimum of an objective function and determining if it is convex. Problem 3 examines the feasibility of a problem with multiple constraints. Problem 4 considers properties of an objective function and whether adding constraints can change the optimal solution.

ISyE 6669 HW 3

1. Consider the following optimization problem

min ln x
s.t. x ≥ 0.

Does this problem have an optimal solution? Explain your answer.

Ans: The problem does not have an optimal solution.

Although the constraint allows x ≥ 0, the objective ln(x) is defined only for x > 0, so the effective feasible set is x > 0. As x decreases toward 0, ln(x) decreases without bound: ln(x) → −∞ as x → 0⁺, so for any feasible point there is always a smaller feasible point with a strictly smaller objective value. The problem is therefore unbounded below: the infimum of the objective is −∞ and is never attained, so no optimal solution exists.
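This unboundedness can be sketched numerically in plain Python (nothing assumed beyond the standard library):

```python
import math

# Evaluate ln(x) at feasible points shrinking toward 0:
# the objective value decreases without bound.
values = [math.log(10.0 ** (-k)) for k in (1, 5, 50, 300)]
print(values)  # roughly [-2.3, -11.5, -115.1, -690.8]
```

Each step toward 0 makes the objective strictly smaller, with no lower limit.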

2. Consider the following optimization problem

min (x^2 − 1)^2
s.t. x ∈ R.

(a) Find all the global minimum solutions. Explain how you find
them. Hint: there may be multiple ones.
(b) Is there any local minimum solution that is not a global
minimum solution?
(c) Is the objective function f(x) = (x^2 − 1)^2 a convex function on R?

Ans:

Find all the global minimum solutions. Explain how you find them. Hint:
there may be multiple ones.
 To find all the global minimum solutions, we need to minimize the
objective function f(x) = (x^2 − 1)^2 over the entire real line x ∈ R.
Since f(x) is non-negative for all x, the minimum value of f(x) occurs
when f(x) = 0.
The function f(x) equals zero when x^2 − 1 = 0, which happens at x = ±1.
At these points, the function takes the value f(1) = f(−1) = 0.
Therefore, the global minimum solutions are x = −1 and x = 1.

Is there any local minimum solution that is not a global minimum
solution?
 No. The critical points of f satisfy f'(x) = 4x(x^2 − 1) = 0, i.e. x = 0,
x = −1, and x = 1. At x = 0 the function has a local maximum (f(0) = 1),
not a minimum. The only local minimum solutions are x = −1 and x = 1, and
since f(−1) = f(1) = 0 is the global minimum value, both are also global
minimum solutions. Hence there is no local minimum solution that is not a
global minimum solution.
Is the objective function f(x) = (x^2 − 1)^2 a convex function on R?

 To determine if the objective function f(x) = (x^2 − 1)^2 is convex on R, we
can check whether its second derivative is non-negative everywhere. Computing
the derivatives:

f(x) = (x^2 − 1)^2

f'(x) = 2(x^2 − 1)(2x) = 4x(x^2 − 1)

f''(x) = 4(x^2 − 1) + 4x(2x) = 4(x^2 − 1 + 2x^2) = 4(3x^2 − 1)

For f''(x) to be non-negative, 3x^2 − 1 must be non-negative, which requires
x^2 ≥ 1/3.

So f''(x) ≥ 0 on (−∞, −1/√3] and [1/√3, ∞), and

f''(x) < 0 on (−1/√3, 1/√3).

Since f''(x) changes sign on R, the function is not convex on R. A direct
counterexample to the convexity inequality: the midpoint of x = −1 and x = 1
gives f(0) = 1, which is larger than (f(−1) + f(1))/2 = 0, so the function
lies above the chord there.
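These calculations can be sanity-checked numerically; the sketch below is plain Python with a helper name of our own choosing:

```python
def f(x):
    # Objective from Problem 2: f(x) = (x^2 - 1)^2
    return (x * x - 1) ** 2

# Global minima: f vanishes exactly at x = -1 and x = 1.
print(f(-1), f(1))  # 0 0

# Convexity fails: Jensen's inequality is violated for x1 = -1, x2 = 1.
midpoint_value = f(0)                   # value at the midpoint of -1 and 1
chord_value = 0.5 * f(-1) + 0.5 * f(1)  # value of the chord at the midpoint
print(midpoint_value > chord_value)     # True, so f is not convex on R
```

The midpoint test is a one-line certificate of non-convexity: a convex function can never rise above any of its chords.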

3. Consider the following optimization problem

min x^2 + y^2
s.t. 2x + y ≥ 3

x − 2y ≤ −1

y ≤ 0.

Does this problem have an optimal solution? Explain your answer.

Ans:
[Figure: the green region represents the constraint 2x + y ≥ 3, the purple
region represents x − 2y ≤ −1, and the grey region represents y ≤ 0.]

The three constraints cannot be satisfied simultaneously. From constraint
(3), y ≤ 0. Substituting into constraint (1): 2x ≥ 3 − y ≥ 3, so x ≥ 3/2.
Substituting into constraint (2): x ≤ 2y − 1 ≤ −1. No x can satisfy both
x ≥ 3/2 and x ≤ −1, a contradiction.

Therefore the feasible region is empty, and the optimization problem has no
optimal solution because there is no feasible point to optimize over.
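The contradiction between the two implied bounds on x can be sketched in a few lines of plain Python (the bound derivation assumes y ≤ 0, exactly as in the argument above):

```python
# Under y <= 0, the constraints imply two incompatible bounds on x:
#   2x + y >= 3   =>  x >= (3 - y)/2 >= 3/2
#   x - 2y <= -1  =>  x <= 2y - 1   <= -1
x_lower = 3 / 2   # tightest lower bound on x (attained at y = 0)
x_upper = -1.0    # tightest upper bound on x (attained at y = 0)
print(x_lower > x_upper)  # True: the bounds conflict, so no x exists

# A brute-force grid scan over a large box confirms no feasible point.
feasible = [
    (x / 10, y / 10)
    for x in range(-100, 101)
    for y in range(-100, 101)
    if 2 * (x / 10) + (y / 10) >= 3
    and (x / 10) - 2 * (y / 10) <= -1
    and (y / 10) <= 0
]
print(len(feasible))  # 0
```

The grid scan is only a spot check over a bounded box, but the algebraic bounds above rule out feasibility everywhere.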

4. Consider the following problem

min x^2 + f(x)
s.t. x ∈ R,

where f(x) = x for −1 < x < 1, f(−1) = f(1) = 2, and f(x) = +∞ for x > 1
or x < −1 (the definition of f is reconstructed from the answer below).
(a) Is the objective function a convex function defined on R? Explain
your answer by checking the definition of convexity.
Ans: We need to check whether g(x) = x^2 + f(x) satisfies the definition of
convexity: g(λx1 + (1−λ)x2) ≤ λg(x1) + (1−λ)g(x2) for all x1, x2 and
λ ∈ [0, 1].
1. For −1 < x < 1:
f(x) = x in this interval, so the objective is x^2 + x, a convex quadratic.

2. For x = −1 and x = 1:
f(x) = 2 at x = −1 and x = 1, so the objective takes the value g(−1) = g(1) = 3.

3. For x > 1 and x < −1:
f(x) = +∞, so g(x) = +∞, and the convexity inequality holds trivially
whenever either endpoint of the segment lies outside [−1, 1].
Piecewise convexity alone does not prove convexity, so we also check chords
whose endpoints x1, x2 lie in [−1, 1]. Every strictly intermediate point
z = λx1 + (1−λ)x2 with λ ∈ (0, 1) and x1 ≠ x2 lies in (−1, 1), where
g(z) = z^2 + z. Since f(x) ≥ x on [−1, 1] (with equality in the interior,
and f(±1) = 2 ≥ ±1), we get
λg(x1) + (1−λ)g(x2) ≥ λ(x1^2 + x1) + (1−λ)(x2^2 + x2) ≥ z^2 + z = g(z),
where the last step uses the convexity of x^2 + x. The upward jump to 3 at
x = ±1 cannot violate convexity, because ±1 is never a strict convex
combination of two other points of [−1, 1]. Therefore the objective function
is convex.
(b) Find an optimal solution, or explain why there is no optimal
solution.

Ans: We minimize g(x) = x^2 + f(x) over each piece of the domain.
1. For −1 < x < 1:
 g(x) = x^2 + x is a convex quadratic with a positive leading
coefficient, so it is minimized at its vertex.
 The vertex occurs at x = −1/2, which lies inside the
interval; the function is strictly decreasing for x < −1/2
and strictly increasing for x > −1/2.
 The minimum value on this piece is g(−1/2) = 1/4 − 1/2 = −1/4.

2. For x = −1 and x = 1:
 g(−1) = g(1) = 1 + 2 = 3, which is larger than −1/4.
3. For x > 1 and x < −1:
 g(x) = +∞, so no candidate arises there.
Comparing the pieces, the smallest value is −1/4, attained at x = −1/2,
which is feasible.
Therefore the optimal solution is x = −1/2, where x^2 + f(x) achieves its
minimum value of −1/4.
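A small numerical check of this answer (plain Python; `g` is our own name for the piecewise objective, and `math.inf` stands in for the +∞ branch):

```python
import math

def g(x):
    # Piecewise objective from Problem 4: x^2 + f(x)
    if -1 < x < 1:
        return x * x + x      # f(x) = x on the open interval
    if x == -1 or x == 1:
        return x * x + 2      # f(-1) = f(1) = 2
    return math.inf           # f(x) = +inf outside [-1, 1]

print(g(-0.5))       # -0.25, the claimed optimal value
print(g(-1), g(1))   # 3 3, worse than -0.25
# A grid search over [-1, 1] agrees that the minimizer is x = -0.5.
grid = [i / 1000 for i in range(-1000, 1001)]
best = min(grid, key=g)
print(best)          # -0.5
```

The grid search only samples the domain, but it matches the closed-form vertex calculation above.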
Ans:
A) True: Yes, adding a new constraint to an optimization problem can
indeed change the solution, because it alters the feasible region.
Here's why:

 Feasible Region Modification: Adding a constraint can only shrink the
feasible region (or leave it unchanged); it never introduces new
feasible solutions. In particular, it may exclude points that were
previously feasible.
 Objective Function Impact: If the previously optimal solution violates
the new constraint, it is cut out of the feasible region, and the
optimum moves to the best remaining feasible point, which has an
equal or worse objective value.
 Optimal Solution Adjustment: The optimization process involves
finding the best solution within the feasible region. When the
feasible region changes, the optimal solution may shift to a
different point.
For example, minimizing x^2 over x ∈ R gives x = 0; adding the constraint
x ≥ 1 moves the optimum to x = 1. Overall, adding a constraint can change
the problem's solution by reshaping the feasible region.
B) True: This follows from the weak duality theorem in optimization.
Suppose the primal problem (P) is max f(x) s.t. g_i(x) ≥ b_i for i ∈ I
(the constraint convention consistent with the dual function below). The
Lagrangian dual problem (D) is obtained by first maximizing the Lagrangian
over x to form the dual function

L(λ) = max_x {f(x) + Σ_{i∈I} λ_i (g_i(x) − b_i)},

and then minimizing L(λ) subject to non-negativity constraints λ ≥ 0 on
the multipliers. For any feasible x and any λ ≥ 0, each term
λ_i(g_i(x) − b_i) is non-negative, so f(x) ≤ L(λ). Taking the best
feasible x on the left and the best λ ≥ 0 on the right gives, with v_P the
optimal objective value of the primal problem (P) and v_D the optimal
objective value of the dual problem (D):

v_P ≤ v_D
This inequality states that the optimal value of the primal problem is
always less than or equal to the optimal value of the dual problem.
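Weak duality can be illustrated on a toy instance of our own (not from the assignment): primal max −(x − 2)^2 s.t. x ≤ 1, i.e. g(x) = −x ≥ −1, whose optimal value is v_P = −1 at x = 1. The sketch below evaluates the dual function on a grid:

```python
def f(x):
    return -(x - 2) ** 2   # concave objective of the toy primal

xs = [i / 100 for i in range(-500, 501)]   # grid on [-5, 5]

# Primal optimal value over the feasible grid points (x <= 1).
v_p = max(f(x) for x in xs if x <= 1)

def dual_fn(lam):
    # L(lambda) = max_x { f(x) + lambda * (g(x) - b) } with g(x) = -x, b = -1
    return max(f(x) + lam * (1 - x) for x in xs)

# Weak duality: v_P <= L(lambda) for every lambda >= 0.
for lam in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(lam, v_p <= dual_fn(lam) + 1e-9)  # always True
```

In this example the dual bound is tight at λ = 2, where L(2) = −1 = v_P, but weak duality only promises the inequality, which holds for every λ ≥ 0.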

C) True: The statement is true. This is the standard local-equals-global
property of convex optimization. Here's the justification:
a. Definition of Convex Set: A set X is convex if, for any two points x1,
x2 in X, the line segment joining x1 and x2 lies entirely within X.
b. Definition of Convex Function: f is convex if
f(λx1 + (1−λ)x2) ≤ λf(x1) + (1−λ)f(x2) for all x1, x2 and λ ∈ [0, 1].
c. Proof sketch: Suppose x* is a local optimal solution of problem (P)
that is not globally optimal, so there exists y ∈ X with f(y) < f(x*).
For λ ∈ (0, 1], the point z(λ) = λy + (1−λ)x* lies in X because X is
convex, and by convexity of f,
f(z(λ)) ≤ λf(y) + (1−λ)f(x*) < f(x*).
d. Contradiction: As λ → 0, z(λ) → x*, so every neighborhood of x*
contains feasible points with strictly smaller objective value,
contradicting the local optimality of x*.
e. Conclusion: Therefore, when f(x) is a convex function and X is a
non-empty closed convex set, every local optimal solution of problem (P)
is also globally optimal.
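The segment argument in the proof sketch can be illustrated numerically on a toy instance of our own: f(x) = (x − 2)^2 on X = [0, 5], with a non-optimal candidate x* = 4 and a better point y = 2:

```python
def f(x):
    return (x - 2) ** 2   # convex objective of the toy instance

x_star, y = 4.0, 2.0      # f(y) = 0 < f(x_star) = 4

# Moving even slightly from x_star toward y already decreases f,
# so x_star cannot be a local minimum unless it is a global one.
for lam in (0.1, 0.01, 0.001):
    z = lam * y + (1 - lam) * x_star   # stays in X = [0, 5] by convexity
    print(lam, f(z) < f(x_star))       # True for every lam in (0, 1]
```

Shrinking λ makes z arbitrarily close to x* while keeping f(z) < f(x*), which is exactly the contradiction used in step d.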
