
Global Search Algorithms:

Searching over the entire feasible set

Global Search Algorithms
• When the objective function has multiple local minima, gradient-based methods can fail because they get stuck at whichever local minimum they reach first.

• Instead of relying on gradient information, we can use a stochastic (random) search: even after reaching a minimum, the search occasionally accepts a worse solution and continues exploring.

• In this way the search can escape local minima and keep looking for the global minimum.

Simulated Annealing
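Simulated annealing is a random search: at each iteration a candidate point z_k is drawn at random (here, over the entire search space) and compared with the current point x_k. A better candidate is always accepted; a worse one is accepted with a probability that shrinks as a temperature parameter T_k is cooled, which is what lets the search climb out of local minima. Consistent with the MATLAB example that follows, the acceptance probability and the logarithmic cooling schedule are

$$p_k = \min\!\left(1,\; \exp\!\left(-\frac{f(z_k) - f(x_k)}{T_k}\right)\right), \qquad T_k = \frac{\gamma}{\ln(k+2)}$$

As T_k decreases, worse points are accepted less and less often and the method behaves increasingly like a greedy search.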
Simulated Annealing: Example
%%%%%% Initialize
x1 = linspace(-5,5,1000);   % grid over the search space (used for plotting the objective)
x2 = linspace(-5,5,1000);

gamma = 2;                  % temperature scale
iter = 5000;                % number of iterations

x_val(1,:) = [-1 1];        % starting point

x_best = x_val(1,:);        % best point found so far
f_best = obj(x_best);       % best objective value found so far
for k=1:iter
%%%%% Generate a candidate point z(k,:) at random over the entire search space
z(k,1) = -5 + (5-(-5))*rand;
z(k,2) = -5 + (5-(-5))*rand;
fz(k) = obj(z(k,:)); %%%% Objective value at z
fx(k) = obj(x_val(k,:)); %%%% Objective value at x_val
T(k) = gamma/log(k+2); %%%% Temperature: cooling as the iterations progress

if fz(k)<fx(k) %%%%% If the candidate z is better, accept it
x_val(k+1,:) = z(k,:);
fx(k+1) = fz(k);
else
p_val(k) = prob([fz(k),fx(k),T(k)]); %%%% Otherwise accept z with probability p (coin toss)
if p_val(k)>rand
x_val(k+1,:) = z(k,:); fx(k+1) = fz(k);
else
x_val(k+1,:) = x_val(k,:); fx(k+1) = fx(k); %% If the coin toss fails, keep the current point
end
end
if fx(k+1)<f_best %%%% Keep track of the best result found so far (check the newly accepted point)
x_best = x_val(k+1,:);
f_best = fx(k+1);
end
end

%%%% Objective function
function f_val = obj(x)
% Alternative test functions (commented out):
% f_val = 3*(1-x(1)^2)*exp(-x(1)^2-(x(2)+1)^2) - 10*(x(1)/5-x(1)^3-x(2)^5)*exp(-x(1)^2-x(2)^2) - exp(-(x(1)+1)^2-x(2)^2)/3;
% f_val = (x(1)-3)^2 + (x(2)+2)^2 + x(1)*x(2);
f_val = 0.2*x(1)^2 - 0.1*cos(3*pi*x(1)) + 0.2*x(2)^2 - 0.1*cos(4*pi*x(2));
end
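The active objective is a Rastrigin-style test function with many shallow local minima. Its global minimum is f(0, 0) = -0.2: the quadratic terms vanish and both cosine terms reach their maximum of 1 simultaneously only at the origin. This gives a reference for judging the results reported below.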

%%%% Probability function: input f = [f(z), f(x), T]
function p = prob(f)
% Standard Metropolis rule: p = min([1, exp(-(f(1)-f(2))/f(3))]);
% Since prob is only called when f(1) >= f(2), the exponential is already <= 1,
% so the two forms agree here.
p = exp(-(f(1)-f(2))/f(3));
end

Simulated Annealing: Example
f_best = -0.1615

x_best = [0.0114 0.0706]

found at iteration k = 2165
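The grids x1 and x2 created at initialization are never used by the search itself; presumably they are intended for plotting the objective surface. A minimal sketch under that assumption:

[X1,X2] = meshgrid(x1,x2);                                        % grid for evaluating the objective
F = 0.2*X1.^2 - 0.1*cos(3*pi*X1) + 0.2*X2.^2 - 0.1*cos(4*pi*X2); % same function as obj()
contour(x1,x2,F,30); hold on
plot(x_best(1),x_best(2),'r*')                                    % mark the best point found by the search
xlabel('x_1'); ylabel('x_2');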

Particle Swarm Optimization (PSO)
• Proposed by James Kennedy (a social psychologist) and Russell C. Eberhart (an engineer) in 1995.

• Instead of updating a single candidate solution, we update a population (swarm) of particles.

• We start with a randomly initialized swarm; the particles interact with each other and move toward a common solution (the optimal value).

• Each particle keeps track of the best position it has visited so far (with respect to the value of the objective function), called its personal best (pbest).

• The best position encountered so far by the entire population is called the global best (gbest).

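Each particle i carries a position x_i and a velocity v_i. At every iteration the velocity is pulled both toward the particle's personal best and toward the global best, and the position is advanced by the new velocity. Matching the MATLAB example below, with inertia weight w, acceleration coefficients c_1, c_2, and uniform random numbers r, s in [0, 1]:

$$v_i^{k+1} = w\,v_i^{k} + c_1 r \left(p_i^{\mathrm{best}} - x_i^{k}\right) + c_2 s \left(g^{\mathrm{best}} - x_i^{k}\right), \qquad x_i^{k+1} = x_i^{k} + v_i^{k+1}$$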
PSO: Example
%%%%% Initialize
x1 = linspace(-5,5,1000);   % grid over the search space (used for plotting the objective)
x2 = linspace(-5,5,1000);
d = 10;                     % number of particles
iter = 1000;
va = -0.1;                  % velocity initialization range [va, vb]
vb = 0.1;
x_1(1,:) = -5 + (5-(-5))*rand(1,d);   %%% Initial positions
x_2(1,:) = -5 + (5-(-5))*rand(1,d);
v1(1,:) = va + (vb-va)*rand(1,d);     %%% Initial velocities
v2(1,:) = va + (vb-va)*rand(1,d);
w = 0.9;                    % inertia weight
c1 = 2; c2 = 2;             % acceleration coefficients

%%%% Initial p_best and gbest selection
p1_best(1,:) = x_1(1,:);
p2_best(1,:) = x_2(1,:);

f(1) = obj([x_1(1,1),x_2(1,1)]);
gbest(1,:) = [x_1(1,1),x_2(1,1)]; %%%% running best over all particles
fbest = f(1);
k_best = 0; %%%% iteration of the best value (0 = initialization)
for i=2:d
f(i) = obj([x_1(1,i),x_2(1,i)]);
if f(i)<fbest %%%% keep the best of all particles, not just the last pair compared
gbest(1,:) = [x_1(1,i),x_2(1,i)];
fbest = f(i);
end
end

%%%%% Update the velocities and positions
for k=1:iter
r = rand(1,d);
s = rand(1,d);

v1(k+1,:) = w*v1(k,:)+c1*r.*(p1_best(k,:)-x_1(k,:))+c2*s.*(gbest(k,1)*ones(1,d)-x_1(k,:));
v2(k+1,:) = w*v2(k,:)+c1*r.*(p2_best(k,:)-x_2(k,:))+c2*s.*(gbest(k,2)*ones(1,d)-x_2(k,:));

x_1(k+1,:) = x_1(k,:)+v1(k+1,:);
x_2(k+1,:) = x_2(k,:)+v2(k+1,:);

%%%% Keep each updated particle within the search-space limits
gbest(k+1,:) = gbest(k,:); %%%% carry the global best forward; updated below if a particle improves on it
for i=1:d
if abs(x_1(k+1,i))>5
x_1(k+1,i)=5*sign(x_1(k+1,i));
end
if abs(x_2(k+1,i))>5
x_2(k+1,i)=5*sign(x_2(k+1,i));
end

%%%% Select p_best for iteration k+1: compare against the particle's personal best so far
if obj([x_1(k+1,i) x_2(k+1,i)])<obj([p1_best(k,i) p2_best(k,i)])
p1_best(k+1,i) = x_1(k+1,i);
p2_best(k+1,i) = x_2(k+1,i);
else
p1_best(k+1,i) = p1_best(k,i);
p2_best(k+1,i) = p2_best(k,i);
end

%%%% Select gbest for iteration k+1 (compare against the carried-forward best so that
%%%% an improvement found by an earlier particle is not overwritten)
if obj([p1_best(k+1,i) p2_best(k+1,i)])<obj([gbest(k+1,1) gbest(k+1,2)])
gbest(k+1,:) = [p1_best(k+1,i) p2_best(k+1,i)];
fbest = obj([gbest(k+1,1) gbest(k+1,2)]);
k_best = k; %%%% record the iteration only when the best value actually improves
end
end %%%% end of particle loop

end %%%% end of iteration loop

PSO: Example

gbest = [-0.0091 0.0053]

fbest = -0.1994
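A quick way to inspect convergence is to plot the objective value of the global best over the iterations, using the gbest history stored by the loop above; a minimal sketch:

f_hist = zeros(1,size(gbest,1));
for k=1:size(gbest,1)
f_hist(k) = obj([gbest(k,1) gbest(k,2)]); % objective at the global best of iteration k
end
plot(f_hist); xlabel('iteration'); ylabel('objective at gbest');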
