
MATLAB Optimization Toolbox

S. M. Shafkat Raihan

December 6, 2020

Exercise 1
Question:
Using an optimization problem of your choice, demonstrate the use of the various functions
available in the MATLAB Optimization Toolbox.
Answer:
The Rosenbrock function is a non-convex function, introduced by Howard H. Rosenbrock
in 1960, that is used as a performance test problem for optimization algorithms. It is also
known as Rosenbrock’s valley or Rosenbrock’s banana function. The global minimum lies inside
a long, narrow, parabolic-shaped flat valley. Finding the valley is trivial; converging to the
global minimum, however, is difficult. The function is defined by

𝑓(𝑥, 𝑦) = (𝑎 − 𝑥)² + 𝑏(𝑦 − 𝑥²)²

It has a global minimum at (𝑥, 𝑦) = (𝑎, 𝑎²), where 𝑓(𝑥, 𝑦) = 0. Usually these parameters are set
to 𝑎 = 1, 𝑏 = 100. Only in the trivial case 𝑎 = 0 is the function symmetric, with the
minimum at the origin. To optimize it, we apply various MATLAB optimization functions.
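As a quick check of this definition with the usual parameters 𝑎 = 1, 𝑏 = 100 (a minimal sketch; the helper name rosen is only illustrative), the function evaluates to zero at the known minimum (1, 1):

% Sanity check: with a = 1, b = 100 the Rosenbrock function is zero at (1,1)
rosen = @(x,y) (1 - x)^2 + 100*(y - x^2)^2;
fprintf('f(1,1) = %f\n',rosen(1,1)) % prints f(1,1) = 0.000000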
Unconstrained minimization: fminunc, for unconstrained minimization of a multivariable
function.
%Optimizing the Rosenbrock function
%Unconstrained minimization - fminunc (for unconstrained multivariable
%functions), fminsearch (for unconstrained multivariable functions, derivative-free)

fun = @(x)100*(x(2)-x(1)^2)^2 + (1-x(1))^2;


x0 = [1,1];
[x,fval] = fminunc(fun,x0);
fprintf('Unconstrained Minimum is at %f,%f \n',x(1),x(2))

Unconstrained Minimum is at 1.000000,1.000000.
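fminunc can also use a user-supplied analytic gradient. The following is only a sketch, not part of the original exercise: the function name rosenbrockwithgrad and the starting point [-1,2] are illustrative assumptions, while optimoptions and the 'SpecifyObjectiveGradient' option are standard Optimization Toolbox features.

% Sketch: supplying the analytic gradient to fminunc
opts = optimoptions('fminunc','Algorithm','trust-region', ...
    'SpecifyObjectiveGradient',true);
x0 = [-1,2];                                   % illustrative starting point
[x,fval] = fminunc(@rosenbrockwithgrad,x0,opts);

function [f,g] = rosenbrockwithgrad(x)
% Objective and gradient of the Rosenbrock function (a = 1, b = 100)
f = 100*(x(2)-x(1)^2)^2 + (1-x(1))^2;
g = [-400*x(1)*(x(2)-x(1)^2) - 2*(1-x(1));     % df/dx(1)
      200*(x(2)-x(1)^2)];                      % df/dx(2)
end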


Unconstrained minimization: fminsearch, for unconstrained minimization of a multivariable
function using a derivative-free method.
fun = @(x)100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
x0 = [-1.2,1];

x = fminsearch(fun,x0);
fprintf('Unconstrained Minimum with derivative-free method is %f,%f \n',x(1),x(2))

Unconstrained Minimum with derivative-free method is 1.000022,1.000042
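fminsearch takes its settings through optimset; the sketch below (the tolerance values are illustrative, not from the original run) tightens the stopping tolerances and also requests the objective value and exit flag.

% Sketch: tighter tolerances and extra outputs for fminsearch
opts = optimset('TolX',1e-8,'TolFun',1e-8);
[x,fval,exitflag] = fminsearch(fun,x0,opts);   % exitflag = 1 indicates convergence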


Constrained minimization: fmincon, for constrained nonlinear multivariable problems.
%constrained nonlinear multivariable problems
% With linear constraints
fun = @(x)100*(x(2)-x(1)^2)^2 + (1-x(1))^2; %Anonymous function defining the objective
x0 = [0.5,0]; %Initial point
A = [1,2]; % A is the matrix needed for the inequality constraint A*x <= b
b = 1; % b is the vector needed for the inequality constraint A*x <= b
Aeq = [2,1]; % Aeq is the matrix needed for the equality constraint Aeq*x == beq
beq = 1; % beq is the vector needed for the equality constraint Aeq*x == beq
x = fmincon(fun,x0,A,b,Aeq,beq);
fprintf('Minimum with linear constraints at %f,%f\n',x(1),x(2))

Minimum with linear constraints at 0.414944,0.170111
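Because x is returned as a row vector, the linear constraints can be verified directly at the reported solution; this is only a verification sketch, not part of the original script.

% Verify the linear constraints at the returned point
disp(A*x' - b)     % should be <= 0, since A*x <= b
disp(Aeq*x' - beq) % should be approximately 0, since Aeq*x == beq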


%With bound constraints
fun = @(x)100*(x(2)-x(1)^2)^2 + (1-x(1))^2;
%bound constraints are 0 <= x(1) <= 1, 0 <= x(2) <= 2
lb = [0,0]; %lower bound
ub = [1,2]; %upperbound
A = [];
b = [];
Aeq = [];
beq = [];
x0 = (lb + ub)/2; % start at the midpoint of [lb,ub]
x = fmincon(fun,x0,A,b,Aeq,beq,lb,ub);
fprintf('Minimum with bound constraints at %f,%f\n',x(1),x(2))

Minimum with bound constraints at 0.999640,0.999280
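fmincon can also return the objective value and an exit flag so that convergence can be confirmed; a minimal sketch of the extended call:

% Sketch: requesting the objective value and exit flag from fmincon
[x,fval,exitflag] = fmincon(fun,x0,A,b,Aeq,beq,lb,ub);
fprintf('fval = %f, exitflag = %d\n',fval,exitflag) % exitflag > 0: converged to a feasible minimum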


%With nonlinear constraints
%Find the point where Rosenbrock's function is minimized
%within an ellipse, also subject to bound constraints
fun = @(x)100*(x(2)-x(1)^2)^2 + (1-x(1))^2;
%bound constraints are 0 <= x(1) <= 0.5, 0.2 <= x(2) <= 0.8
lb = [0,0.2]; %lower bound
ub = [0.5,0.8]; %upper bound
% Also look within an ellipse centered at [1/3,1/3] with
% semi-major axis a = 1 and semi-minor axis b = 2/3
nonlcon = @ellipsecon;
x = fmincon(fun,x0,A,b,Aeq,beq,lb,ub,nonlcon);
fprintf('Minimum with nonlinear constraints at %f,%f\n',x(1),x(2))
function [c,ceq] = ellipsecon(x)
c = (x(1)-1/3)^2 + 9*((x(2)-1/3)^2)/4 - 1;
ceq = [];

end

Minimum with nonlinear constraints at 0.500000,0.250000
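The reported point sits on the upper bound x(1) = 0.5. Whether the ellipse constraint is also satisfied can be checked by evaluating ellipsecon at the solution; this verification sketch is not part of the original script (in a script file it must be placed before the local function definition).

% Verification sketch: evaluate the nonlinear constraint at the solution
[c,ceq] = ellipsecon(x);
fprintf('ellipse constraint value c = %f (c <= 0 means x lies inside the ellipse)\n',c)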
