
Particle Swarm Optimization

Prakash Kotecha
Debasis Maharana & Remya Kommadath
Department of Chemical Engineering
Indian Institute of Technology Guwahati

Swarm Intelligence
“Any attempt to design algorithms or distributed problem-solving devices inspired by the
collective behaviour of social insect colonies and other animal societies”*
Examples of Swarms
• bees swarming around their hive
• an ant colony, with individual ants as agents
• a flock of birds
• the immune system as a swarm of cells
• a crowd as a swarm of people
Properties of swarm intelligent behaviour
• self-organization
  - interactions are executed on the basis of purely local information, without any relation to the global pattern
  - relies on positive feedback, negative feedback, fluctuations and multiple interactions
• division of labour
  - tasks are performed simultaneously by specialized individuals

Swarm Intelligence: From Natural to Artificial Systems, New York, NY: Oxford University Press, 1999
An idea based on honey bee swarm for numerical optimization, Technical Report TR06, Erciyes University, Engineering Faculty, Computer Engineering Department, 2005
Particle Swarm Optimization
Proposed by J. Kennedy and R. Eberhart in 1995

Particle Swarm Optimization, Proceedings of ICNN'95 - International Conference on Neural Networks, Australia, vol. 4, pp. 1942-1948, 1995
[Figure: number of publications per year, 1995-2020, showing the growth of PSO publications and a comparison of yearly publication counts for PSO and TLBO.]
Particle Swarm Optimization (PSO)
Models the social behaviour of bird flocking or fish schooling.
Each particle/bird has a position and a velocity associated with it.
Particles change their position by adjusting their velocity to
• seek food
• avoid predators
• identify optimized environmental parameters
Each particle memorizes the best location identified by it.
Particles communicate the information regarding the best location explored by them.
The velocity of a particle is modified using
• the flying experience of the particle
• the flying experience of the group
Particle Swarm Optimization (PSO)

The initial position and velocity of the particles are generated randomly within the search space.

The velocity of a particle is determined as
  vi = w*vi + c1*r1*(pbest,i - Xi) + c2*r2*(gbest - Xi)
and its position is modified as
  Xi = Xi + vi

where
  vi        velocity of the ith particle
  Xi        position of the ith particle
  w         inertia weight of the particles
  c1, c2    acceleration coefficients
  r1, r2    random numbers in [0, 1] of size (1 x D)
  pbest,i   personal best of the ith particle
  gbest     global best

Evaluate the objective function fi and update the population, irrespective of the fitness.

Update pbest and gbest:
  pbest,i = Xi,    f_pbest,i = fi         if fi < f_pbest,i
  gbest = pbest,i, f_gbest = f_pbest,i    if f_pbest,i < f_gbest
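As a quick illustration, the two update equations can be written in a few lines of NumPy. This is a minimal sketch, not code from the slides: the function name and the assumption that a particle is stored as a 1-D array are mine.

```python
import numpy as np

def update_particle(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO update for a single particle; x, v, pbest, gbest are 1-D arrays."""
    D = x.size
    r1, r2 = np.random.rand(D), np.random.rand(D)                 # random numbers in [0, 1]
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)     # velocity update
    x = x + v                                                      # position update
    return x, v
```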
Velocity of a particle
  vi = w*vi + c1*r1*(pbest,i - Xi) + c2*r2*(gbest - Xi)

Momentum part: w*vi
• serves as the memory of the previous flight direction
• prevents the particle from drastically changing direction
• biases the particle towards its previous direction

Cognitive part: c1*r1*(pbest,i - Xi)
• quantifies the performance of the ith particle relative to its own past performance
• draws the particle back to its own best position
• represents the "nostalgia" of the particle

Social part: c2*r2*(gbest - Xi)
• quantifies the performance of the ith particle relative to its neighbours
• draws the particle towards the best position determined by the group
• resembles the group norm that each particle seeks to attain

A. P. Engelbrecht, Particle Swarm Optimization, Computational Intelligence: An Introduction, Second Edition, chapter 16, 2007
Possible cases

  Case     Better than its personal best   Better than the global best   Remarks
  Case 1   No                              No                            Do not update pbest and gbest
  Case 2   Yes                             No                            Update pbest; do not update gbest
  Case 3   Yes                             Yes                           Update pbest and gbest
  Case 4   No                              Yes                           Cannot happen; no update

Example: min f(x) = Σ xi^2, 0 ≤ xi ≤ 10 (two decision variables), with pbest = [4 5], f_pbest = 41 and gbest = [2 3], f_gbest = 13.

Case 1: X = [5 6], f = 61.  f > f_pbest and f > f_gbest, so neither pbest nor gbest is updated.
Case 2: X = [4 3], f = 25.  f < f_pbest, so pbest = [4 3], f_pbest = 25; f_pbest > f_gbest, so gbest is unchanged.
Case 3: X = [1 3], f = 10.  f < f_pbest, so pbest = [1 3], f_pbest = 10; f_pbest < f_gbest, so gbest = [1 3], f_gbest = 10.
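For a minimization problem the case logic above reduces to two nested comparisons. A small sketch; the helper name and the array-based representation are illustrative assumptions, not from the slides.

```python
def update_bests(x, f, pbest, f_pbest, gbest, f_gbest):
    """Update personal and global bests after evaluating position x with fitness f."""
    if f < f_pbest:                       # Cases 2 and 3: better than the personal best
        pbest, f_pbest = x.copy(), f
        if f_pbest < f_gbest:             # Case 3: also better than the global best
            gbest, f_gbest = pbest.copy(), f_pbest
    # Case 1: nothing is updated. Case 4 cannot occur because gbest is always
    # at least as good as every personal best.
    return pbest, f_pbest, gbest, f_gbest
```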
Working of PSO: Sphere Function

Consider  min f(x) = x1^2 + x2^2 + x3^2 + x4^2,   0 ≤ xi ≤ 10, i = 1, 2, 3, 4
Decision variables: x1, x2, x3 and x4

Step 1: Fix the population size, inertia weight, acceleration coefficients and maximum iterations:
  NP = 5, w = 0.7, c1 = 1.5, c2 = 1.5, T = 10

Step 2: Generate random positions within the domain and evaluate the fitness:

  P = [ 4  0  0  8 ]       f = [  80 ]
      [ 3  1  9  7 ]           [ 140 ]
      [ 0  3  1  5 ]           [  35 ]
      [ 2  1  4  9 ]           [ 102 ]
      [ 6  2  8  3 ]           [ 113 ]
Determine personal and global best solutions

Step 3: Generate random velocities within the domain of the variables:

  v = [ 9  6  1  8 ]
      [ 5  1  3  0 ]
      [ 7  4  1  4 ]
      [ 3  0  2  1 ]
      [ 1  6  8  7 ]

Step 4: Determine the personal and global best of all the solutions:

  Pbest = P,  f_pbest = f = [80 140 35 102 113]
  gbest = [0 3 1 5],  f_gbest = 35
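Steps 2-4 can be reproduced with NumPy using the population and velocity matrices given above. This is a sketch under my own naming; only the numbers come from the slides.

```python
import numpy as np

def sphere(X):
    return np.sum(X**2, axis=1)              # f(x) = x1^2 + x2^2 + x3^2 + x4^2 per row

P = np.array([[4, 0, 0, 8],                   # Step 2: population
              [3, 1, 9, 7],
              [0, 3, 1, 5],
              [2, 1, 4, 9],
              [6, 2, 8, 3]], dtype=float)
v = np.array([[9, 6, 1, 8],                   # Step 3: velocities
              [5, 1, 3, 0],
              [7, 4, 1, 4],
              [3, 0, 2, 1],
              [1, 6, 8, 7]], dtype=float)

f = sphere(P)                                 # [80, 140, 35, 102, 113]
pbest, f_pbest = P.copy(), f.copy()           # Step 4: personal bests
best = np.argmin(f_pbest)
gbest, f_gbest = pbest[best].copy(), f_pbest[best]   # gbest = [0, 3, 1, 5], f_gbest = 35
```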
First Solution: Generation                            (w = 0.7, c1 = 1.5, c2 = 1.5, T = 10)

State: X1 = [4 0 0 8], f1 = 80, v1 = [9 6 1 8], pbest,1 = [4 0 0 8], f_pbest,1 = 80,
       gbest = [0 3 1 5], f_gbest = 35

Step 5: Generate two sets of random vectors:
  r1 = [0.4 0.3 0.9 0.5],  r2 = [0.8 0.2 0.7 0.4]

Step 6: Determine the velocity,  vi = w*vi + c1*r1*(pbest,i - Xi) + c2*r2*(gbest - Xi):
  v1 = 0.7 x [9 6 1 8]
       + 1.5 x [0.4 0.3 0.9 0.5] x ([4 0 0 8] - [4 0 0 8])
       + 1.5 x [0.8 0.2 0.7 0.4] x ([0 3 1 5] - [4 0 0 8])
     = [1.5 5.1 1.75 3.8]

Step 7: Determine the position,  Xi = Xi + vi:
  X1 = [4 0 0 8] + [1.5 5.1 1.75 3.8] = [5.5 5.1 1.75 11.8]
Bounding of solution

If the new position x_new lies within [lb, ub], no bounding is required.
If x_new violates the upper bound, shift x_new to the upper bound.
If x_new violates the lower bound, shift x_new to the lower bound.
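This shift-to-the-violated-bound rule is simply a componentwise clip. A minimal sketch assuming NumPy; the helper name is illustrative.

```python
import numpy as np

def bound(x, lb, ub):
    """Shift any component that leaves [lb, ub] back to the violated bound."""
    return np.clip(x, lb, ub)

# e.g. with 0 <= xi <= 10:
# bound(np.array([5.5, 5.1, 1.75, 11.8]), 0, 10) -> [5.5, 5.1, 1.75, 10.0]
```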
First Solution: Updating

State: v1 = [1.5 5.1 1.75 3.8], X1 = [4 0 0 8], f1 = 80, pbest,1 = [4 0 0 8], f_pbest,1 = 80,
       gbest = [0 3 1 5], f_gbest = 35

Step 8: Check the bounds (0 ≤ xi ≤ 10) and bound on violation:
  X1 = [5.5 5.1 1.75 11.8]  →  X1 = [5.5 5.1 1.75 10]

Step 9: Evaluate the fitness:  f1 = 5.5^2 + 5.1^2 + 1.75^2 + 10^2 = 159.32

Step 10: Update the population:

  Pop = [ 5.5   5.1  1.75  10  ]       f = [ 159.32 ]
        [ 3     1    9     7   ]           [ 140    ]
        [ 0     3    1     5   ]           [  35    ]
        [ 2     1    4     9   ]           [ 102    ]
        [ 6     2    8     3   ]           [ 113    ]

Step 11: No update of pbest,1 as the new solution is not better:  pbest,1 = [4 0 0 8], f_pbest,1 = 80
Step 12: No update of gbest as the new solution is not better:  gbest = [0 3 1 5], f_gbest = 35
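For verification, Steps 5-12 for the first particle can be reproduced numerically with the random vectors given above. A short NumPy sketch; variable names are mine.

```python
import numpy as np

w, c1, c2 = 0.7, 1.5, 1.5
X1     = np.array([4.0, 0.0, 0.0, 8.0]);  v1 = np.array([9.0, 6.0, 1.0, 8.0])
pbest1 = X1.copy();                        gbest = np.array([0.0, 3.0, 1.0, 5.0])
r1 = np.array([0.4, 0.3, 0.9, 0.5]);       r2 = np.array([0.8, 0.2, 0.7, 0.4])

v1 = w * v1 + c1 * r1 * (pbest1 - X1) + c2 * r2 * (gbest - X1)   # [1.5, 5.1, 1.75, 3.8]
X1 = np.clip(X1 + v1, 0, 10)                                      # [5.5, 5.1, 1.75, 10.0]
f1 = np.sum(X1**2)                                                 # 159.3225 (159.32 on the slide)
```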
Second Solution: Generation

State: v2 = [5 1 3 0], X2 = [3 1 9 7], f2 = 140, pbest,2 = [3 1 9 7], f_pbest,2 = 140,
       gbest = [0 3 1 5], f_gbest = 35

Step 1: Generate two sets of random vectors:
  r1 = [0.1 0.4 0.6 0.3],  r2 = [0.7 0.5 0.8 0.2]

Step 2: Determine the velocity:
  v2 = 0.7 x [5 1 3 0]
       + 1.5 x [0.1 0.4 0.6 0.3] x ([3 1 9 7] - [3 1 9 7])
       + 1.5 x [0.7 0.5 0.8 0.2] x ([0 3 1 5] - [3 1 9 7])
     = [0.35 2.2 -7.5 -0.6]

Step 3: Determine the position:
  X2 = [3 1 9 7] + [0.35 2.2 -7.5 -0.6] = [3.35 3.2 1.5 6.4]
Second Solution: Updating

State: v2 = [0.35 2.2 -7.5 -0.6], X2 = [3 1 9 7], f2 = 140, pbest,2 = [3 1 9 7], f_pbest,2 = 140,
       gbest = [0 3 1 5], f_gbest = 35

Step 4: Check the bounds (0 ≤ xi ≤ 10):  X2 = [3.35 3.2 1.5 6.4] is within the bounds.

Step 5: Evaluate the fitness:  f2 = 3.35^2 + 3.2^2 + 1.5^2 + 6.4^2 = 64.67

Step 6: Update the population:

  Pop = [ 5.5   5.1  1.75  10  ]       f = [ 159.32 ]
        [ 3.35  3.2  1.5   6.4 ]           [  64.67 ]
        [ 0     3    1     5   ]           [  35    ]
        [ 2     1    4     9   ]           [ 102    ]
        [ 6     2    8     3   ]           [ 113    ]

Step 7: Update pbest,2 as the new solution is better:  pbest,2 = [3.35 3.2 1.5 6.4], f_pbest,2 = 64.67
Step 8: No update of gbest as the new solution is not better:  gbest = [0 3 1 5], f_gbest = 35
Third Solution: Generation

State: v3 = [7 4 1 4], X3 = [0 3 1 5], f3 = 35, pbest,3 = [0 3 1 5], f_pbest,3 = 35,
       gbest = [0 3 1 5], f_gbest = 35

Step 1: Generate two sets of random vectors:
  r1 = [0.2 0.7 0.4 0.9],  r2 = [0.9 0.2 0.1 0.4]

Step 2: Determine the velocity:
  v3 = 0.7 x [7 4 1 4]
       + 1.5 x [0.2 0.7 0.4 0.9] x ([0 3 1 5] - [0 3 1 5])
       + 1.5 x [0.9 0.2 0.1 0.4] x ([0 3 1 5] - [0 3 1 5])
     = [4.9 2.8 0.7 2.8]

Step 3: Determine the position:
  X3 = [0 3 1 5] + [4.9 2.8 0.7 2.8] = [4.9 5.8 1.7 7.8]
Third Solution: Updating

State: v3 = [4.9 2.8 0.7 2.8], X3 = [0 3 1 5], f3 = 35, pbest,3 = [0 3 1 5], f_pbest,3 = 35,
       gbest = [0 3 1 5], f_gbest = 35

Step 4: Check the bounds (0 ≤ xi ≤ 10):  X3 = [4.9 5.8 1.7 7.8] is within the bounds.

Step 5: Evaluate the fitness:  f3 = 4.9^2 + 5.8^2 + 1.7^2 + 7.8^2 = 121.38

Step 6: Update the population:

  Pop = [ 5.5   5.1  1.75  10  ]       f = [ 159.32 ]
        [ 3.35  3.2  1.5   6.4 ]           [  64.67 ]
        [ 4.9   5.8  1.7   7.8 ]           [ 121.38 ]
        [ 2     1    4     9   ]           [ 102    ]
        [ 6     2    8     3   ]           [ 113    ]

Step 7: No update of pbest,3 as the new solution is not better:  pbest,3 = [0 3 1 5], f_pbest,3 = 35
Step 8: No update of gbest as the new solution is not better:  gbest = [0 3 1 5], f_gbest = 35
Fourth Solution: Generation

State: v4 = [3 0 2 1], X4 = [2 1 4 9], f4 = 102, pbest,4 = [2 1 4 9], f_pbest,4 = 102,
       gbest = [0 3 1 5], f_gbest = 35

Step 1: Generate two sets of random vectors:
  r1 = [0.7 0.5 0.8 0.1],  r2 = [0.8 0.1 0.7 0.9]

Step 2: Determine the velocity:
  v4 = 0.7 x [3 0 2 1]
       + 1.5 x [0.7 0.5 0.8 0.1] x ([2 1 4 9] - [2 1 4 9])
       + 1.5 x [0.8 0.1 0.7 0.9] x ([0 3 1 5] - [2 1 4 9])
     = [-0.3 0.3 -1.75 -4.7]

Step 3: Determine the position:
  X4 = [2 1 4 9] + [-0.3 0.3 -1.75 -4.7] = [1.7 1.3 2.25 4.3]
Fourth Solution: Updating

State: v4 = [-0.3 0.3 -1.75 -4.7], X4 = [2 1 4 9], f4 = 102, pbest,4 = [2 1 4 9], f_pbest,4 = 102,
       gbest = [0 3 1 5], f_gbest = 35

Step 4: Check the bounds (0 ≤ xi ≤ 10):  X4 = [1.7 1.3 2.25 4.3] is within the bounds.

Step 5: Evaluate the fitness:  f4 = 1.7^2 + 1.3^2 + 2.25^2 + 4.3^2 = 28.13

Step 6: Update the population:

  Pop = [ 5.5   5.1  1.75  10  ]       f = [ 159.32 ]
        [ 3.35  3.2  1.5   6.4 ]           [  64.67 ]
        [ 4.9   5.8  1.7   7.8 ]           [ 121.38 ]
        [ 1.7   1.3  2.25  4.3 ]           [  28.13 ]
        [ 6     2    8     3   ]           [ 113    ]

Step 7: Update pbest,4 as the new solution is better:  pbest,4 = [1.7 1.3 2.25 4.3], f_pbest,4 = 28.13
Step 8: Update gbest as the new solution is better (f_pbest,4 = 28.13 < f_gbest = 35):
  gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13
Fifth Solution: Generation

State: v5 = [1 6 8 7], X5 = [6 2 8 3], f5 = 113, pbest,5 = [6 2 8 3], f_pbest,5 = 113,
       gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13

Step 1: Generate two sets of random vectors:
  r1 = [0.3 0.8 0.2 0.1],  r2 = [0.5 0.1 0.2 0.7]

Step 2: Determine the velocity:
  v5 = 0.7 x [1 6 8 7]
       + 1.5 x [0.3 0.8 0.2 0.1] x ([6 2 8 3] - [6 2 8 3])
       + 1.5 x [0.5 0.1 0.2 0.7] x ([1.7 1.3 2.25 4.3] - [6 2 8 3])
     = [-2.52 4.09 3.87 6.26]

Step 3: Determine the position:
  X5 = [6 2 8 3] + [-2.52 4.09 3.87 6.26] = [3.48 6.09 11.87 9.26]
Fifth Solution: Updating

State: v5 = [-2.52 4.09 3.87 6.26], X5 = [6 2 8 3], f5 = 113, pbest,5 = [6 2 8 3], f_pbest,5 = 113,
       gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13

Step 4: Check the bounds (0 ≤ xi ≤ 10) and bound on violation:
  X5 = [3.48 6.09 11.87 9.26]  →  X5 = [3.48 6.09 10 9.26]

Step 5: Evaluate the fitness:  f5 = 3.48^2 + 6.09^2 + 10^2 + 9.26^2 = 234.95

Step 6: Update the population:

  Pop = [ 5.5   5.1   1.75  10   ]       f = [ 159.32 ]
        [ 3.35  3.2   1.5   6.4  ]           [  64.67 ]
        [ 4.9   5.8   1.7   7.8  ]           [ 121.38 ]
        [ 1.7   1.3   2.25  4.3  ]           [  28.13 ]
        [ 3.48  6.09  10    9.26 ]           [ 234.95 ]

Step 7: No update of pbest,5 as the new solution is not better:  pbest,5 = [6 2 8 3], f_pbest,5 = 113
Step 8: No update of gbest as the new solution is not better:  gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13
Completion of first iteration

At the start of the first iteration (rows separated by ";"):
  P = [4 0 0 8; 3 1 9 7; 0 3 1 5; 2 1 4 9; 6 2 8 3]
  f = [80 140 35 102 113]
  Pbest = P,  f_pbest = f
  v = [9 6 1 8; 5 1 3 0; 7 4 1 4; 3 0 2 1; 1 6 8 7]
  gbest = [0 3 1 5],  f_gbest = 35

At the end of the first iteration:
  P = [5.5 5.1 1.75 10; 3.35 3.2 1.5 6.4; 4.9 5.8 1.7 7.8; 1.7 1.3 2.25 4.3; 3.48 6.09 10 9.26]
  f = [159.32 64.67 121.38 28.13 234.95]
  Pbest = [4 0 0 8; 3.35 3.2 1.5 6.4; 0 3 1 5; 1.7 1.3 2.25 4.3; 6 2 8 3]
  f_pbest = [80 64.67 35 28.13 113]
  v = [1.5 5.1 1.75 3.8; 0.35 2.2 -7.5 -0.6; 4.9 2.8 0.7 2.8; -0.3 0.3 -1.75 -4.7; -2.52 4.09 3.87 6.26]
  gbest = [1.7 1.3 2.25 4.3],  f_gbest = 28.13
Second Iteration: Generation of first solution        (w = 0.7, c1 = 1.5, c2 = 1.5, T = 10)

State: P and v as at the end of the first iteration;  pbest,1 = [4 0 0 8], f_pbest,1 = 80;
       gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13

Step 1: Generate two sets of random vectors:
  r1 = [0.7 0.2 0.8 0.1],  r2 = [0.9 0.3 0.2 0.5]

Step 2: Determine the velocity:
  v1 = 0.7 x [1.5 5.1 1.75 11.8]
       + 1.5 x [0.7 0.2 0.8 0.1] x ([4 0 0 8] - [5.5 5.1 1.75 10])
       + 1.5 x [0.9 0.3 0.2 0.5] x ([1.7 1.3 2.25 4.3] - [5.5 5.1 1.75 10])
     = [-5.65 0.33 -0.73 3.68]

Step 3: Determine the position:
  X1 = [5.5 5.1 1.75 10] + [-5.65 0.33 -0.73 3.68] = [-0.15 5.43 1.02 13.68]
First Solution: Updating

State: v1 = [-5.65 0.33 -0.73 3.68], X1 = [5.5 5.1 1.75 10], f1 = 159.32, pbest,1 = [4 0 0 8], f_pbest,1 = 80,
       gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13

Step 4: Check the bounds (0 ≤ xi ≤ 10) and bound on violation:
  X1 = [-0.15 5.43 1.02 13.68]  →  X1 = [0 5.43 1.02 10]

Step 5: Evaluate the fitness:  f1 = 0^2 + 5.43^2 + 1.02^2 + 10^2 = 130.53

Step 6: Update the population (only the first solution changes; the others carry over from the first iteration):
  P = [0 5.43 1.02 10; 3.35 3.2 1.5 6.4; 4.9 5.8 1.7 7.8; 1.7 1.3 2.25 4.3; 3.48 6.09 10 9.26]
  f = [130.53 64.67 121.38 28.13 234.95]

Step 7: No update of pbest,1 as the new solution is not better than pbest,1:  pbest,1 = [4 0 0 8], f_pbest,1 = 80
Step 8: No update of gbest as the new solution is not better:  gbest = [1.7 1.3 2.25 4.3], f_gbest = 28.13
Pseudocode

Input: fitness function, lb, ub, Np, T, w, c1 and c2

1. Initialize a random population (P) and velocity (v) within the bounds
2. Evaluate the objective function value (f) of P                                             (FE = Np)
3. Assign pbest as P and f_pbest as f                                                         (pbest: Np x D, f_pbest: Np x 1)
4. Identify the solution with the best fitness; assign it as gbest and its fitness as f_gbest (gbest: 1 x D, f_gbest: 1 x 1)

for t = 1 to T
    for i = 1 to Np
        Determine the velocity of the ith particle:  vi = w*vi + c1*r1*(pbest,i - Xi) + c2*r2*(gbest - Xi)   } Generation
        Determine the new position of the ith particle:  Xi = Xi + vi
        Bound Xi
        Evaluate the objective function value fi of the ith particle                          (FE = 1)
        Update the population by including Xi and fi
        Update pbest,i and f_pbest,i:  pbest,i = Xi, f_pbest,i = fi          if fi < f_pbest,i               } Memorizing
        Update gbest and f_gbest:      gbest = pbest,i, f_gbest = f_pbest,i  if f_pbest,i < f_gbest
    end
end

Total FE = Np + Np*T
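A compact NumPy translation of this pseudocode is sketched below. It is not the authors' code: the function signature, the random-number generator and the clip-based bounding rule described earlier are my assumptions.

```python
import numpy as np

def pso(fitness, lb, ub, Np=5, D=4, T=10, w=0.7, c1=1.5, c2=1.5, seed=None):
    """Minimize `fitness` with the PSO variant described in the pseudocode."""
    rng = np.random.default_rng(seed)
    lb, ub = np.full(D, lb, dtype=float), np.full(D, ub, dtype=float)

    # 1. random population and velocity within the bounds
    P = lb + (ub - lb) * rng.random((Np, D))
    v = lb + (ub - lb) * rng.random((Np, D))

    # 2-4. evaluate, then initialize personal and global bests (FE = Np)
    f = np.apply_along_axis(fitness, 1, P)
    pbest, f_pbest = P.copy(), f.copy()
    g = np.argmin(f_pbest)
    gbest, f_gbest = pbest[g].copy(), f_pbest[g]

    for t in range(T):
        for i in range(Np):
            r1, r2 = rng.random(D), rng.random(D)
            v[i] = w * v[i] + c1 * r1 * (pbest[i] - P[i]) + c2 * r2 * (gbest - P[i])
            P[i] = np.clip(P[i] + v[i], lb, ub)        # new position, bounded
            f[i] = fitness(P[i])                       # FE = 1
            if f[i] < f_pbest[i]:                      # memorize personal best
                pbest[i], f_pbest[i] = P[i].copy(), f[i]
                if f_pbest[i] < f_gbest:               # memorize global best
                    gbest, f_gbest = pbest[i].copy(), f_pbest[i]
    return gbest, f_gbest                              # total FE = Np + Np*T

# example: the sphere problem from the worked example
best_x, best_f = pso(lambda x: np.sum(x**2), lb=0, ub=10)
```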
User-specified parameter: Acceleration coefficients (c1 and c2)

  vi = w*vi + c1*r1*(pbest,i - Xi) + c2*r2*(gbest - Xi)
  Xi = Xi + vi

  Case                       Outcome
  c1 = c2 = 0                Particles keep moving in their current direction until they hit the search space boundary
  c1 > 0, c2 = 0             Particles are independent hill-climbers and perform a local search
  c1 = 0, c2 > 0             The entire swarm becomes one stochastic hill-climber; all particles are attracted to a single point
  c1 = c2                    Particles are attracted towards the average of pbest,i and gbest
  c1 >> c2                   Particles are attracted towards their own pbest,i, which results in excessive wandering
  c1 << c2                   Particles are attracted towards gbest, which causes premature convergence towards optima
  Low values of c1 and c2    Smooth particle trajectories
  High values of c1 and c2   Abrupt movements

Computational Intelligence: An Introduction, Second Edition, John Wiley & Sons, 2007
Impact of c1 and c2

Np = 10, T = 50
  vi = w*vi + c1*r1*(pbest,i - Xi) + c2*r2*(gbest - Xi)
  Xi = Xi + vi

Test function (Styblinski-Tang):
  f(x) = (1/2) * Σ_{i=1..D} (xi^4 - 16*xi^2 + 5*xi),   -5 ≤ xi ≤ 5, i = 1, 2, ..., D
Global minimum:  f* = -39.16599*D  at  x* = (-2.903534, ..., -2.903534)

Surface plot: https://www.sfu.ca/~ssurjano/stybtang.html
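For reference, the test function used in these parameter studies (the Styblinski-Tang function, per the cited surface-plot page) can be coded as follows; a minimal NumPy sketch.

```python
import numpy as np

def styblinski_tang(x):
    """f(x) = 0.5 * sum(xi^4 - 16*xi^2 + 5*xi); minimum ~ -39.166*D at xi ~ -2.903534."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.sum(x**4 - 16 * x**2 + 5 * x)
```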
User-specified parameter: Inertia weight (w)

Controls the impact of the previous velocity on the new search direction, balancing exploration and exploitation:
a large inertia weight promotes exploration (diverges the swarm), while a small inertia weight promotes exploitation (decelerates the particles).

The value of w can be (the first three options are sketched in code after this slide):
• a constant: w = w
• multiplied by a user-defined damping ratio in every iteration [a]
• linearly decreased between wmax and wmin [b]:  w = wmax - (wmax - wmin)*t/T, where t is the current iteration, T is the maximum number of iterations, and wmin and wmax are user-defined parameters (w is frequently set to decrease linearly from 0.9 to 0.4)
• set using constriction coefficients

Empirical Study of Particle Swarm Optimization, Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), pp. 1945-1950, vol. 3, 1999
[a] Particle Swarm Optimization in MATLAB - Yarpiz Video Tutorial - Part 3/3, https://www.youtube.com/watch?v=ICBYrKsFPqA
[b] Particle swarm optimization: developments, applications and resources, Proceedings of the CEC 2001, pp. 81-86, 2001
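A small sketch of the three inertia-weight schedules listed above; the function name, argument names and the 0-based iteration index are my assumptions.

```python
def inertia(t, T, mode="linear", w0=0.9, w_min=0.4, damping=0.99):
    """Return the inertia weight for iteration t (0-based) under three common schedules."""
    if mode == "constant":
        return w0
    if mode == "damped":              # w is multiplied by the damping ratio every iteration
        return w0 * damping**t
    if mode == "linear":              # linear decrease from w_max to w_min over T iterations
        return w0 - (w0 - w_min) * t / T
    raise ValueError(mode)
```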
Constriction coefficients

Implemented to prevent velocity explosion and to help the particles converge to an optimum.

  χ = 2k / | 2 - φ - sqrt(φ^2 - 4φ) |,   where φ = φ1 + φ2,  φ ≥ 4,  0 ≤ k ≤ 1

Commonly used values: k = 1, φ1 = 2.05, φ2 = 2.05

Constriction coefficient rule (see the sketch below):
  w = χ,   c1 = χ*φ1,   c2 = χ*φ2

The particle swarm - explosion, stability, and convergence in a multidimensional complex space, IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58-73, 2002
Particle Swarm Optimization in MATLAB - Yarpiz Video Tutorial - Part 3/3, https://www.youtube.com/watch?v=ICBYrKsFPqA
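The constriction rule can be evaluated directly. A minimal sketch (the function name is mine) that reproduces the values commonly quoted for k = 1, φ1 = φ2 = 2.05: w ≈ 0.7298 and c1 = c2 ≈ 1.4962.

```python
from math import sqrt

def constriction(k=1.0, phi1=2.05, phi2=2.05):
    """Clerc-Kennedy constriction: returns (w, c1, c2) for the standard velocity update."""
    phi = phi1 + phi2                       # must satisfy phi >= 4
    chi = 2 * k / abs(2 - phi - sqrt(phi**2 - 4 * phi))
    return chi, chi * phi1, chi * phi2      # w = chi, c1 = chi*phi1, c2 = chi*phi2

# constriction() -> (0.7298..., 1.4961..., 1.4961...)
```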
Different values of w

Np = 10, T = 50; update equations and test function (Styblinski-Tang, -5 ≤ xi ≤ 5) as above.

Surface plot: https://www.sfu.ca/~ssurjano/stybtang.html
Varying w

Np = 10, T = 50; w varied using a damping ratio of 0.99, and varied linearly with wmax = 0.9, wmin = 0.4.
Update equations and test function (Styblinski-Tang) as above.

Surface plot: https://www.sfu.ca/~ssurjano/stybtang.html
Constriction coefficients

Np = 10, T = 50; k = 1, φ1 = φ2 = 2.05.
Update equations and test function (Styblinski-Tang) as above.

Surface plot: https://www.sfu.ca/~ssurjano/stybtang.html
TLBO vs PSO

  Aspect                               TLBO                                                PSO
  Phases                               Teacher phase and learner phase                     No phases (position and velocity update)
  Convergence                          Monotonic                                           Monotonic (with gbest and pbest)
  Parameters                           Population size, termination criterion              Population size, termination criterion, inertia weight, acceleration coefficients
  Generation of new solution           Only using other solutions, the mean and the        Using the velocity vector, personal best and
                                       best solution (part of the population)              global best (need not be part of the population)
  Solution updates in one iteration    Twice                                               Once
  Selection                            Greedy                                              New solution is always accepted (µ, λ)
  Number of function evaluations       Np + 2*Np*T                                         Np + Np*T
Further reading

• Particle swarm optimization, Proceedings of ICNN'95 - International Conference on Neural Networks, Perth, WA, Australia, 4, 1942-1948, 1995
• The particle swarm - explosion, stability, and convergence in a multidimensional complex space, IEEE Transactions on Evolutionary Computation, 6(1), 58-73, 2002
• Handling multiple objectives with particle swarm optimization, IEEE Transactions on Evolutionary Computation, 8(3), 256-279, 2004
• A dynamic neighbourhood learning based particle swarm optimizer for global numerical optimization, Information Sciences, 209, 16-36, 2012
Thank You !!!
