
The Particle Swarm Optimization Algorithm

Nebojša Trpković
[email protected]
10th Dec 2010
Problem Definition

• optimization of continuous nonlinear functions

• finding the best solution in the problem space

Nebojša Trpković [email protected] Slide 2 of 18


Example


Importance

• function optimization

• artificial neural network training

• fuzzy system control


Existing Solutions

• Ant Colony Optimization (ACO)

  – designed for discrete problems

• Genetic Algorithms (GA)

  – slow convergence


Particle Swarm Optimization

A very simple characterization:

• a computational method
• that optimizes a problem
• by iteratively trying to improve a candidate solution
• with regard to a given measure of quality


Particle Swarm Optimization
Facts:

• developed by Russell C. Eberhart and James Kennedy in 1995

• inspired by social behavior of bird flocking or fish schooling

• similar to evolutionary techniques such as Genetic Algorithms (GA)


Particle Swarm Optimization
Benefits:

• faster convergence
• fewer parameters to tune

• easier searching in very large problem spaces


Particle Swarm Optimization
Basic principle:

let the particle swarm move
toward the best position in the search space,
with each particle remembering its own best known position
and the swarm’s (global) best known position


Velocity Change
xi – position of particle i
pi – particle’s (personal) best known position
g – swarm’s (global) best known position
vi – particle’s velocity
ω – inertia weight
φp, φg – cognitive and social acceleration coefficients
rp, rg – random numbers drawn uniformly from [0, 1] at each update

vi ← ωvi + φprp(pi − xi) + φgrg(g − xi)

     inertia   cognitive    social
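The update above can be sketched directly in Python with NumPy; the parameter values and function name here are illustrative choices, not taken from the slides:

```python
import numpy as np

def update_velocity(v, x, p_best, g_best, w=0.7, phi_p=1.5, phi_g=1.5, rng=None):
    """One velocity update: inertia term + cognitive pull + social pull."""
    if rng is None:
        rng = np.random.default_rng()
    rp = rng.random(x.shape)  # per-dimension random weights, cognitive term
    rg = rng.random(x.shape)  # per-dimension random weights, social term
    return w * v + phi_p * rp * (p_best - x) + phi_g * rg * (g_best - x)
```

Note that when a particle sits exactly at both its personal best and the global best, the cognitive and social terms vanish and only the damped inertia ω·vi remains.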


Position Change

xi – position of particle i
vi – particle’s velocity

xi ← xi + vi


Algorithm
For each particle
    Initialize particle
End

Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the particle’s best fitness value
        in history, set it as the new personal best
    End

    Choose the particle with the best fitness value of all particles; if that
    value is better than the current global best, set it as the new global best

    For each particle
        Calculate particle velocity according to the velocity change equation
        Update particle position according to the position change equation
    End
While maximum iterations or minimum error criterion is not attained
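The pseudocode above maps to a compact implementation. This sketch minimizes the sphere function f(x) = Σx²; the swarm size, bounds, iteration count, and parameter values are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, phi_p=1.5, phi_g=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    # Initialize positions and velocities randomly within the bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = rng.uniform(-(hi - lo), hi - lo, (n_particles, dim))
    p = x.copy()                            # personal best positions
    p_val = np.apply_along_axis(f, 1, x)    # personal best fitness values
    g = p[p_val.argmin()].copy()            # global best position
    g_val = p_val.min()
    for _ in range(iters):
        rp = rng.random((n_particles, dim))
        rg = rng.random((n_particles, dim))
        # Velocity change equation, then position change equation
        v = w * v + phi_p * rp * (p - x) + phi_g * rg * (g - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        # Update personal bests where the new fitness improved
        improved = val < p_val
        p[improved] = x[improved]
        p_val[improved] = val[improved]
        # Update the global best if any personal best beats it
        if p_val.min() < g_val:
            g_val = p_val.min()
            g = p[p_val.argmin()].copy()
    return g, g_val

best_x, best_f = pso(lambda x: np.sum(x ** 2), dim=3)
```

With these settings the swarm should drive the sphere function very close to its minimum at the origin.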


Single Particle


Parameters selection
Different ways to choose parameters:

• proper balance between exploration and exploitation


(avoiding premature convergence to a local optimum yet still ensuring a good rate of
convergence to the optimum)

• putting all attention on exploitation

  (making searches in vast problem spaces feasible)

• automation by meta-optimization
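As a starting point for manual tuning, values derived from Clerc and Kennedy’s constriction-coefficient analysis are often quoted in the PSO literature; treat these as defaults to adjust per problem, not prescriptions from these slides:

```python
# Typical PSO parameter defaults (constriction-style values)
params = {
    "w": 0.7298,       # inertia weight
    "phi_p": 1.49618,  # cognitive coefficient
    "phi_g": 1.49618,  # social coefficient
}
```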


Avoiding Local Optima

• adding a randomization factor to the velocity calculation

• adding random momentum in specific iterations
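One common way to realize these ideas is a “turbulence” step that occasionally re-randomizes velocity components so particles can escape a local optimum. This is a standard PSO variant sketched here under assumed names and defaults, not a technique spelled out on the slide:

```python
import numpy as np

def add_turbulence(v, prob=0.05, scale=1.0, rng=None):
    """With probability `prob` per component, replace a velocity component
    with a uniform random value in [-scale, scale) to inject momentum."""
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(v.shape) < prob      # which components get perturbed
    noise = rng.uniform(-scale, scale, v.shape)
    return np.where(mask, noise, v)
```

Setting `prob=0` leaves velocities untouched, while `prob=1` fully re-randomizes them, so the parameter directly trades exploitation for exploration.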


Swarm


Conclusion

“This algorithm belongs ideologically to that philosophical school

that allows wisdom to emerge rather than trying to impose it,

that emulates nature rather than trying to control it,

and that seeks to make things simpler rather than more complex.”

James Kennedy, Russell Eberhart


References
• Wikipedia
  http://www.wikipedia.org/
• Swarm Intelligence
  http://www.swarmintelligence.org/
• Jerome Onwunalu and Louis J. Durlofsky, “Application of a particle swarm
  optimization algorithm for determining optimum well location and type,” 2009
• James Kennedy and Russell Eberhart, “Particle Swarm Optimization,” 1995
  http://www.engr.iupui.edu/~shi/Coference/psopap4.html
• thinkfluid, “Robot Swarm driven by Particle Swarm Optimization algorithm”
  http://www.youtube.com/watch?v=RLIA1EKfSys
