Differential evolution

In evolutionary computation, differential evolution (DE) is a method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Such methods are commonly known as metaheuristics as they make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics such as DE do not guarantee an optimal solution is ever found.

DE is used for multidimensional real-valued functions but does not use the gradient of the problem being optimized, which means DE does not require the optimization problem to be differentiable, as is required by classic optimization methods such as gradient descent and quasi-Newton methods. DE can therefore also be used on optimization problems that are not even continuous, are noisy, change over time, etc.[1]

DE optimizes a problem by maintaining a population of candidate solutions and creating new candidate solutions by combining existing ones according to its simple formulae, and then keeping whichever candidate solution has the best score or fitness on the optimization problem at hand. In this way the optimization problem is treated as a black box that merely provides a measure of quality given a candidate solution, and the gradient is therefore not needed.

DE is originally due to Storn and Price.[2][3] Books have been published on theoretical and practical aspects of using DE in parallel computing, multiobjective optimization and constrained optimization, and the books also contain surveys of application areas.[4][5][6][7] Excellent surveys on the multi-faceted research aspects of DE can be found in journal articles.[8][9]

1 Algorithm

A basic variant of the DE algorithm works by having a population of candidate solutions (called agents). These agents are moved around in the search-space by using simple mathematical formulae to combine the positions of existing agents from the population. If the new position of an agent is an improvement it is accepted and forms part of the population; otherwise the new position is simply discarded. The process is repeated and by doing so it is hoped, but not guaranteed, that a satisfactory solution will eventually be discovered.

Formally, let f : R^n → R be the cost function which must be minimized or the fitness function which must be maximized. The function takes a candidate solution as argument in the form of a vector of real numbers and produces a real number as output which indicates the fitness of the given candidate solution. The gradient of f is not known. The goal is to find a solution m for which f(m) ≤ f(p) for all p in the search-space, which would mean m is the global minimum. Maximization can be performed by considering the function h := −f instead.

Let x ∈ R^n designate a candidate solution (agent) in the population. CR denotes the crossover rate. The basic DE algorithm can then be described as follows:

- Initialize all agents x with random positions in the search-space.
- Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
  - For each agent x in the population do:
    - Pick three agents a, b and c from the population at random; they must be distinct from each other as well as from agent x.
    - Pick a random index R ∈ {1, ..., n} (n being the dimensionality of the problem to be optimized).
    - Compute the agent's potentially new position y = [y1, ..., yn] as follows:
      - For each i ∈ {1, ..., n}, pick a uniformly distributed number ri ~ U(0, 1).
      - If ri < CR or i = R, then set yi = ai + F·(bi − ci); otherwise set yi = xi.
      - (In essence, the new position is the outcome of the binary crossover of agent x with the intermediate agent z = a + F·(b − c).)
    - If f(y) < f(x), then replace the agent in the population with the improved candidate solution, that is, replace x with y in the population.
- Pick the agent from the population that has the highest fitness or lowest cost and return it as the best found candidate solution.

Note that F ∈ [0, 2] is called the differential weight and CR ∈ [0, 1] is called the crossover probability; both of these parameters are selectable by the practitioner along with the population size NP ≥ 4 (see below).
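
As an illustration of the per-component crossover rule above, the following is a minimal Java sketch of the step that builds the trial vector y from agents a, b, c and the current agent x. It assumes agents are represented as plain double[] vectors; the class and method names are illustrative assumptions, not part of the article or any standard DE library.

import java.util.Random;

// Minimal sketch of the DE mutation-and-crossover step described above.
// Agents are plain double[] vectors of length n; names are illustrative.
final class DETrialVector {
    static double[] make(double[] x, double[] a, double[] b, double[] c,
                         double F, double CR, Random rng) {
        int n = x.length;
        double[] y = new double[n];
        int R = rng.nextInt(n);              // component i = R always takes the mutant value
        for (int i = 0; i < n; i++) {
            double ri = rng.nextDouble();    // ri ~ U(0, 1)
            y[i] = (ri < CR || i == R)
                    ? a[i] + F * (b[i] - c[i])   // component of the intermediate agent z = a + F*(b - c)
                    : x[i];                      // otherwise keep the current agent's component
        }
        return y;                            // the caller replaces x with y only if f(y) < f(x)
    }
}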

2 Parameter selection

[Figure: Performance landscape showing how the basic DE performs in aggregate on the Sphere and Rosenbrock benchmark problems when varying the two DE parameters NP and F, keeping CR = 0.9 fixed.]

The choice of DE parameters F, CR and NP can have a large impact on optimization performance. Selecting the DE parameters that yield good performance has therefore been the subject of much research. Rules of thumb for parameter selection were devised by Storn et al.[3][4] and Liu and Lampinen.[10] Mathematical convergence analysis regarding parameter selection was done by Zaharie.[11] Meta-optimization of the DE parameters was done by Pedersen[12][13] and Zhang et al.[14]
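
For illustration only, the following small Java sketch bundles the three control parameters. The starting values NP ≈ 10·n, F = 0.8 and CR = 0.9 reflect a rule of thumb often quoted in the DE literature (an assumption to check against the references above, not a prescription from this article); good values remain problem-dependent, as the research cited above shows.

// Illustrative container for the DE control parameters (class and method names are assumptions).
final class DEParameters {
    final int NP;       // population size, NP >= 4
    final double F;     // differential weight, F in [0, 2]
    final double CR;    // crossover probability, CR in [0, 1]

    DEParameters(int NP, double F, double CR) {
        this.NP = NP;
        this.F = F;
        this.CR = CR;
    }

    // Commonly quoted starting point (assumption, tune per problem):
    // NP about ten times the problem dimensionality, F = 0.8, CR = 0.9.
    static DEParameters startingPointFor(int dimensionality) {
        return new DEParameters(Math.max(4, 10 * dimensionality), 0.8, 0.9);
    }
}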

3 Variants

Variants of the DE algorithm are continually being developed in an effort to improve optimization performance. Many different schemes for performing crossover and mutation of agents are possible in the basic algorithm given above, see e.g.[3] More advanced DE variants are also being developed, with a popular research trend being to perturb or adapt the DE parameters during optimization, see e.g. Price et al.,[4] Liu and Lampinen,[15] Qin and Suganthan,[16] Civicioglu[17] and Brest et al.[18] There has also been some work on making a hybrid optimization method using DE combined with other optimizers.[19]

4 Sample code

The following is a specific pseudocode implementation of differential evolution, written similar to the Java language. For more generalized pseudocode, please see the listing in the Algorithm section above.

// definition of one individual in population
public class Individual {
    // normally DifferentialEvolution uses floating point variables
    float data1, data2;
    // but using integers is possible too
    int data3;
}

public class DifferentialEvolution {
    // Variables
    // linked list that has our population inside
    LinkedList<Individual> population = new LinkedList<Individual>();
    // New instance of Random number generator
    Random random = new Random();
    int populationSize = 20;
    // differential weight [0,2]
    float F = 1;
    // crossover probability [0,1]
    float CR = 0.5f;
    // dimensionality of problem, i.e. how many variables the problem has; in this case 3 (data1, data2, data3)
    int N = 3;

    // This function tells how well a given individual performs at the given problem.
    public float fitnessFunction(Individual in) {
        ...
        return fitness;
    }

    // this is the main function of the program
    public Individual Main() {
        // Initialize population with individuals that have been initialized with uniform random noise
        // uniform noise means a random value inside your search space
        int i = 0;
        while (i < populationSize) {
            Individual individual = new Individual();
            individual.data1 = random.UniformNoise();
            individual.data2 = random.UniformNoise();
            // integers can't take floating point values, so they need to be rounded
            individual.data3 = (int) Math.floor(random.UniformNoise());
            population.add(individual);
            i++;
        }

        i = 0;
        int j;
        // main loop of evolution
        while (!StoppingCriteria) {
            i++;
            j = 0;
            while (j < populationSize) {
                // calculate a new candidate solution

                // pick a random point from the population
                int x = (int) Math.floor(random.UniformNoise() % (population.size() - 1));
                int a, b, c;
                // pick three different random points from the population
                do {
                    a = (int) Math.floor(random.UniformNoise() % (population.size() - 1));
                } while (a == x);
                do {
                    b = (int) Math.floor(random.UniformNoise() % (population.size() - 1));
                } while (b == x || b == a);
                do {
                    c = (int) Math.floor(random.UniformNoise() % (population.size() - 1));
                } while (c == x || c == a || c == b);

                // Pick a random index in [0, dimensionality)
                int R = random.nextInt(N);

                // Compute the agent's new position
                Individual original = population.get(x);
                Individual candidate = original.clone();
                Individual individual1 = population.get(a);
                Individual individual2 = population.get(b);
                Individual individual3 = population.get(c);

                // if (i == R || ri < CR)  candidate = a + F*(b - c)
                // else                    candidate = x
                if (0 == R || random.UniformNoise() % 1 < CR) {
                    candidate.data1 = individual1.data1 + F * (individual2.data1 - individual3.data1);
                }
                // else isn't needed because we cloned original to candidate
                if (1 == R || random.UniformNoise() % 1 < CR) {
                    candidate.data2 = individual1.data2 + F * (individual2.data2 - individual3.data2);
                }
                // integers work the same as floating points but they need to be rounded
                if (2 == R || random.UniformNoise() % 1 < CR) {
                    candidate.data3 = (int) Math.floor(individual1.data3 + F * (individual2.data3 - individual3.data3));
                }

                // see if the candidate is better than the original; if so, replace it
                if (fitnessFunction(original) < fitnessFunction(candidate)) {
                    population.remove(original);
                    population.add(candidate);
                }
                j++;
            }
        }

        // find the best candidate solution
        i = 0;
        Individual bestFitness = new Individual();
        while (i < populationSize) {
            Individual individual = population.get(i);
            if (fitnessFunction(bestFitness) < fitnessFunction(individual)) {
                bestFitness = individual;
            }
            i++;
        }
        // your solution
        return bestFitness;
    }
}
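
The body of fitnessFunction is elided in the listing above. As one possible example (an assumption, not part of the original listing), a negated sphere function rewards candidates whose variables are close to zero, matching the listing's convention that a larger fitness value is better:

// Illustrative fitness function (assumption): negated sphere function.
// Larger return values are better, matching the comparisons in the listing above.
public float fitnessFunction(Individual in) {
    float sum = in.data1 * in.data1 + in.data2 * in.data2 + in.data3 * in.data3;
    return -sum;
}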

5 See also

Artificial bee colony algorithm
CMA-ES
Differential search algorithm[17]
Evolution strategy
Genetic algorithm

6 References

[1] Rocca, P.; Oliveri, G.; Massa, A. (2011). "Differential Evolution as Applied to Electromagnetics". IEEE Antennas and Propagation Magazine. 53 (1): 38–49. doi:10.1109/MAP.2011.5773566.

[2] Storn, R.; Price, K. (1997). "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces". Journal of Global Optimization. 11: 341–359. doi:10.1023/A:1008202821328.

[3] Storn, R. (1996). "On the usage of differential evolution for function optimization". Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS). pp. 519–523.

[4] Price, K.; Storn, R.M.; Lampinen, J.A. (2005). Differential Evolution: A Practical Approach to Global Optimization. Springer. ISBN 978-3-540-20950-8.

[5] Feoktistov, V. (2006). Differential Evolution: In Search of Solutions. Springer. ISBN 978-0-387-36895-5.

[6] G. C. Onwubolu and B. V. Babu, New Optimization Techniques in Engineering. Retrieved 17 September 2016.

[7] Chakraborty, U.K., ed. (2008). Advances in Differential Evolution. Springer. ISBN 978-3-540-68827-3.

[8] S. Das and P. N. Suganthan, "Differential Evolution: A Survey of the State-of-the-art", IEEE Transactions on Evolutionary Computation, Vol. 15, No. 1, pp. 4–31, Feb. 2011. doi:10.1109/TEVC.2010.2059031.

[9] S. Das, S. S. Mullick, P. N. Suganthan, "Recent Advances in Differential Evolution - An Updated Survey", Swarm and Evolutionary Computation, 2016. doi:10.1016/j.swevo.2016.01.004.

[10] Liu, J.; Lampinen, J. (2002). "On setting the control parameter of the differential evolution method". Proceedings of the 8th International Conference on Soft Computing (MENDEL). Brno, Czech Republic. pp. 11–18.

[11] Zaharie, D. (2002). "Critical values for the control parameters of differential evolution algorithms". Proceedings of the 8th International Conference on Soft Computing (MENDEL). Brno, Czech Republic. pp. 62–67.

[12] Pedersen, M.E.H. (2010). Tuning & Simplifying Heuristical Optimization (PDF) (PhD thesis). University of Southampton, School of Engineering Sciences, Computational Engineering and Design Group.

[13] Pedersen, M.E.H. (2010). "Good parameters for differential evolution" (PDF). Technical Report HL1002. Hvass Laboratories.

[14] Zhang, X.; Jiang, X.; Scott, P.J. (2011). "A Minimax Fitting Algorithm for Ultra-Precision Aspheric Surfaces". The 13th International Conference on Metrology and Properties of Engineering Surfaces.

[15] Liu, J.; Lampinen, J. (2005). "A fuzzy adaptive differential evolution algorithm". Soft Computing. 9 (6): 448–462. doi:10.1007/s00500-004-0363-x.

[16] Qin, A.K.; Suganthan, P.N. (2005). "Self-adaptive differential evolution algorithm for numerical optimization". Proceedings of the IEEE Congress on Evolutionary Computation (CEC). pp. 1785–1791.

[17] Civicioglu, P. (2012). "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm". Computers & Geosciences. 46: 229–247. doi:10.1016/j.cageo.2011.12.011.

[18] Brest, J.; Greiner, S.; Boskovic, B.; Mernik, M.; Zumer, V. (2006). "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark functions". IEEE Transactions on Evolutionary Computation. 10 (6): 646–657. doi:10.1109/tevc.2006.872133.

[19] Zhang, Wen-Jun; Xie, Xiao-Feng (2003). "DEPSO: hybrid particle swarm with differential evolution operator". IEEE International Conference on Systems, Man, and Cybernetics (SMCC), Washington, DC, USA: 3816–3821.

7 External links

Storn's Homepage on DE, featuring source-code for several programming languages.
Fast DE Algorithm – A Fast Differential Evolution Algorithm using k-Nearest Neighbour Predictor.
MODE Application – Parameter Estimation of a Pressure Swing Adsorption Model for Air Separation Using Multi-objective Optimisation and Support Vector Regression Model.
The runner-root algorithm (RRA)
