Differential Evolution
DE is used for multidimensional real-valued functions but does not use the gradient of the problem being optimized, which means DE does not require the optimization problem to be differentiable, as is required by classic optimization methods such as gradient descent and quasi-Newton methods. DE can therefore also be used on optimization problems that are not even continuous, are noisy, change over time, etc.[1]

DE optimizes a problem by maintaining a population of candidate solutions and creating new candidate solutions by combining existing ones according to its simple formulae, and then keeping whichever candidate solution has the best score or fitness on the optimization problem at hand. In this way the optimization problem is treated as a black box that merely provides a measure of quality given a candidate solution, and the gradient is therefore not needed.

DE is originally due to Storn and Price.[2][3] Books have been published on theoretical and practical aspects of using DE in parallel computing, multiobjective optimization and constrained optimization, and the books also contain surveys of application areas.[4][5][6][7] Excellent surveys on the multi-faceted research aspects of DE can be found in journal articles.[8][9]

1 Algorithm

A basic variant of the DE algorithm works by having a population of candidate solutions (called agents). These agents are moved around in the search-space by using simple mathematical formulae to combine the positions of existing agents from the population. If the new position of an agent is an improvement it is accepted and forms part of the population; otherwise the new position is simply discarded. The process is repeated, and by doing so it is hoped, but not guaranteed, that a satisfactory solution will eventually be discovered.

Formally, let f : Rn → R be the cost function which must be minimized. Let x ∈ Rn designate a candidate solution (agent) in the population. F denotes the differential weight and CR the crossover rate. The basic DE algorithm can then be described as follows:

- Initialize all agents x with random positions in the search-space.
- Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
  - For each agent x in the population do:
    - Pick three agents a, b and c from the population at random; they must be distinct from each other as well as from agent x.
    - Pick a random index R ∈ {1, ..., n}, n being the dimensionality of the problem to be optimized.
    - Compute the agent's potentially new position y = [y1, ..., yn] as follows:
      - For each i ∈ {1, ..., n}, pick a uniformly distributed number ri ~ U(0, 1).
      - If ri < CR or i = R then set yi = ai + F × (bi − ci), otherwise set yi = xi.
      - (In essence, the new position is the outcome of the binary crossover of agent x with the intermediate agent z = a + F × (b − c).)
    - If f(y) < f(x) then replace the agent in the population with the improved candidate solution, that is, replace x with y in the population.
- Pick the agent from the population that has the highest fitness or lowest cost and return it as the best found candidate solution.
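The steps above can be sketched in Java as follows. This is a minimal illustration only: the sphere cost function, the search bounds, the fixed random seed and the particular values of NP, F and CR are assumptions made for the example, not part of the algorithm's definition.

```java
import java.util.Random;

// Minimal sketch of the basic DE/rand/1/bin loop described above.
public class DifferentialEvolution {

    static final Random RNG = new Random(42); // fixed seed for reproducibility

    // Cost function to minimize (sphere function, minimum 0 at the origin).
    static double f(double[] x) {
        double sum = 0.0;
        for (double xi : x) sum += xi * xi;
        return sum;
    }

    static double[] optimize(int n, int np, double F, double CR, int maxIter) {
        // Initialize all agents with random positions in [-5, 5]^n.
        double[][] pop = new double[np][n];
        for (double[] agent : pop) {
            for (int i = 0; i < n; i++) agent[i] = -5.0 + 10.0 * RNG.nextDouble();
        }

        for (int iter = 0; iter < maxIter; iter++) {
            for (int j = 0; j < np; j++) {
                double[] x = pop[j];
                // Pick three agents a, b, c, distinct from each other and from x.
                int ia, ib, ic;
                do { ia = RNG.nextInt(np); } while (ia == j);
                do { ib = RNG.nextInt(np); } while (ib == j || ib == ia);
                do { ic = RNG.nextInt(np); } while (ic == j || ic == ia || ic == ib);
                double[] a = pop[ia], b = pop[ib], c = pop[ic];

                int R = RNG.nextInt(n); // index that always takes the mutant value
                double[] y = new double[n];
                for (int i = 0; i < n; i++) {
                    if (RNG.nextDouble() < CR || i == R) y[i] = a[i] + F * (b[i] - c[i]);
                    else                                 y[i] = x[i];
                }
                // Greedy selection: keep y only if it improves on x.
                if (f(y) < f(x)) pop[j] = y;
            }
        }
        // Return the agent with the lowest cost.
        double[] best = pop[0];
        for (double[] agent : pop) if (f(agent) < f(best)) best = agent;
        return best;
    }

    public static void main(String[] args) {
        double[] best = optimize(5, 50, 0.8, 0.9, 200);
        System.out.println("best cost: " + f(best));
    }
}
```

Note that the greedy one-to-one selection (keeping y only when it improves on x) means the cost of every agent, and hence of the best agent, never increases from generation to generation.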
2 Parameter selection
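The choice of the control parameters NP (population size), F (differential weight) and CR (crossover rate) has a large impact on optimization performance; see the parameter studies by Liu and Lampinen, Zaharie and Pedersen in the references. The constants below encode rule-of-thumb settings commonly quoted in the DE literature; they are illustrative assumptions of this sketch, not universally good values.

```java
// Commonly quoted rule-of-thumb DE settings (illustrative and
// problem-dependent, not a universal recommendation):
// NP around 10 times the dimensionality n, F in [0, 2] (often near 0.8),
// CR in [0, 1] (often near 0.9).
public class DEParameters {
    public static int populationSize(int n) { return 10 * n; }
    public static final double F = 0.8;   // differential weight, F in [0, 2]
    public static final double CR = 0.9;  // crossover rate, CR in [0, 1]
}
```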
3 Variants

Variants of the DE algorithm are continually being developed in an effort to improve optimization performance. Many different schemes for performing crossover and mutation of agents are possible in the basic algorithm given above, see e.g.[3] More advanced DE variants are also being developed, a popular research trend being to perturb or adapt the DE parameters during optimization, see e.g. Price et al.,[4] Liu and Lampinen,[15] Qin and Suganthan,[16] Civicioglu[17] and Brest et al.[18] There is also some work on hybrid optimization methods that combine DE with other optimizers.[19]
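One of the simplest parameter-perturbation schemes of the kind surveyed above is "dither", in which the differential weight F is drawn anew at random for each generation instead of being held fixed. The range [0.5, 1.0] used here is a common choice in the literature, but the snippet is still only an assumed sketch, not a prescribed variant.

```java
import java.util.Random;

// Sketch of generation-level dither: F is resampled uniformly from
// [0.5, 1.0] at the start of each generation and used for all agents
// in that generation.
public class DitherExample {
    public static double ditheredF(Random rng) {
        return 0.5 + 0.5 * rng.nextDouble(); // F ~ U(0.5, 1.0)
    }

    public static void main(String[] args) {
        Random rng = new Random();
        for (int gen = 0; gen < 3; gen++) {
            double F = ditheredF(rng); // one F per generation
            System.out.println("generation " + gen + ": F = " + F);
        }
    }
}
```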
4 Sample code
if (fitnessFunction(original) < fitnessFunction(candidate)) {
    population.remove(original);
    population.add(candidate);
}
j++;
} }
// find best candidate solution
i = 0;
Individual bestFitness = new Individual();
while (i < populationSize) {
    Individual individual = population.get(i);
    if (fitnessFunction(bestFitness) < fitnessFunction(individual)) bestFitness = individual;
    i++;
}
// your solution
return bestFitness;
} }
5 See also

Artificial bee colony algorithm
CMA-ES
Differential search algorithm[17]
Evolution strategy
Genetic algorithm
6 References
[1] Rocca, P.; Oliveri, G.; Massa, A. (2011). "Differential Evolution as Applied to Electromagnetics". IEEE Antennas and Propagation Magazine. 53 (1): 38–49. doi:10.1109/MAP.2011.5773566.
[2] Storn, R.; Price, K. (1997). "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces". Journal of Global Optimization. 11: 341–359. doi:10.1023/A:1008202821328.
[3] Storn, R. (1996). "On the usage of differential evolution for function optimization". Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS). pp. 519–523.
[4] Price, K.; Storn, R.M.; Lampinen, J.A. (2005). Differential Evolution: A Practical Approach to Global Optimization. Springer. ISBN 978-3-540-20950-8.
[5] Feoktistov, V. (2006). Differential Evolution: In Search of Solutions. Springer. ISBN 978-0-387-36895-5.
[6] Onwubolu, G.C.; Babu, B.V. New Optimization Techniques in Engineering. Retrieved 17 September 2016.
[7] Chakraborty, U.K., ed. (2008). Advances in Differential Evolution. Springer. ISBN 978-3-540-68827-3.
[8] Das, S.; Suganthan, P.N. (2011). "Differential Evolution: A Survey of the State-of-the-art". IEEE Transactions on Evolutionary Computation. 15 (1): 4–31. doi:10.1109/TEVC.2010.2059031.
[9] Das, S.; Mullick, S.S.; Suganthan, P.N. (2016). "Recent Advances in Differential Evolution - An Updated Survey". Swarm and Evolutionary Computation. doi:10.1016/j.swevo.2016.01.004.
[10] Liu, J.; Lampinen, J. (2002). "On setting the control parameter of the differential evolution method". Proceedings of the 8th International Conference on Soft Computing (MENDEL). Brno, Czech Republic. pp. 11–18.
[11] Zaharie, D. (2002). "Critical values for the control parameters of differential evolution algorithms". Proceedings of the 8th International Conference on Soft Computing (MENDEL). Brno, Czech Republic. pp. 62–67.
[12] Pedersen, M.E.H. (2010). Tuning & Simplifying Heuristical Optimization (PDF) (PhD thesis). University of
Southampton, School of Engineering Sciences, Computational Engineering and Design Group.
[13] Pedersen, M.E.H. (2010). "Good parameters for differential evolution" (PDF). Technical Report HL1002. Hvass Laboratories.
[14] Zhang, X.; Jiang, X.; Scott, P.J. (2011). A Minimax
Fitting Algorithm for Ultra-Precision Aspheric Surfaces.
The 13th International Conference on Metrology and Properties of Engineering Surfaces.
[15] Liu, J.; Lampinen, J. (2005). "A fuzzy adaptive differential evolution algorithm". Soft Computing. 9 (6): 448–462. doi:10.1007/s00500-004-0363-x.
[16] Qin, A.K.; Suganthan, P.N. (2005). "Self-adaptive differential evolution algorithm for numerical optimization". Proceedings of the IEEE Congress on Evolutionary Computation (CEC). pp. 1785–1791.
[17] Civicioglu, P. (2012). "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm". Computers & Geosciences. 46: 229–247. doi:10.1016/j.cageo.2011.12.011.
[18] Brest, J.; Greiner, S.; Boskovic, B.; Mernik, M.; Zumer, V. (2006). "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark functions". IEEE Transactions on Evolutionary Computation. 10 (6): 646–657. doi:10.1109/tevc.2006.872133.
[19] Zhang, Wen-Jun; Xie, Xiao-Feng (2003). "DEPSO: hybrid particle swarm with differential evolution operator". IEEE International Conference on Systems, Man, and Cybernetics (SMCC). Washington, DC, USA. pp. 3816–3821.
7 External links
Storn's Homepage on DE featuring source-code for several programming languages.
Fast DE Algorithm: A Fast Differential Evolution Algorithm using k-Nearest Neighbour Predictor.
MODE Application: Parameter Estimation of a Pressure Swing Adsorption Model for Air Separation Using Multi-objective Optimisation and Support Vector Regression Model.
The runner-root algorithm (RRA)