2013 - A Conceptual Comparison of The Cuckoo-Search, Particle Swarm Optimization, Differential Evolution and Artificial Bee Colony Algorithms
DOI 10.1007/s10462-011-9276-0
Abstract In this paper, the algorithmic concepts of the Cuckoo-search (CK), Particle
swarm optimization (PSO), Differential evolution (DE) and Artificial bee colony (ABC)
algorithms are analyzed. The numerical optimization problem solving successes of
the mentioned algorithms have also been compared statistically by testing over 50 different
benchmark functions. Empirical results reveal that the problem solving success of the
CK algorithm is very close to that of the DE algorithm. The run-time complexity and the number of
function evaluations required for acquiring the global minimizer by the DE algorithm are generally
smaller than those of the comparison algorithms. The performances of the CK and PSO algorithms are
statistically closer to the performance of the DE algorithm than to that of the ABC algorithm. The CK
and DE algorithms provide more robust and precise results than the PSO and ABC algorithms.
1 Introduction
Optimization is an applied science that seeks the best values that the parameters of a
problem may take under specified conditions (Corne et al. 1999; Horst et al. 2000).
In its simplest form, optimization aims to obtain the parameter values that
make an objective function attain its minimum or maximum value. The design of an
optimization problem generally starts with the design of an objective function (Rashedi et al.
2009; Karaboga and Akay 2009a; Del Valle et al. 2008; Storn and Price 1997). The objective
function must correctly define the related problem mathematically and clearly express the
P. Civicioglu
Department of Aircraft Electrics and Electronics, College of Aviation, Erciyes University, Kayseri, Turkey
e-mail: [email protected]
E. Besdok (B)
Engineering Faculty, Department of Geomatics Engineering, Erciyes University, Kayseri, Turkey
e-mail: [email protected]
relation between the parameters of the problem. Furthermore, any pre-defined constraints
on the parameters of the related problem should be considered during the design of the
optimization problem. Optimization problems are generally classified under many headings;
Linear Programming, Integer Programming, Quadratic Programming, Combinatorial
Optimization and Metaheuristics are the terms most often used to classify the optimization
methods (Del Valle et al. 2008).
The classical optimization methods frequently used in scientific applications consist of
Hessian-matrix based methods (Min-Jea et al. 2009) and gradient based methods (Haupt
1995). In practice, many classical optimization methods require the objective function
intended to be solved to comply with their structural assumptions. However, if the derivative of
the objective function cannot be calculated, it becomes difficult to search for the optimal solution
via classical optimization means (Rashedi et al. 2009). It can be proved by mathematical
methods that a solution obtained by using classical techniques is globally optimal (Nowak
and Cirpka 2004). Nevertheless, it is common to use metaheuristic algorithms in solving
non-differentiable nonlinear objective functions whose solution is either impossible
or extremely difficult by using the classical optimization techniques (Karaboga and Akay
2009a; Nowak and Cirpka 2004; Clerc and Kennedy 2002; Trelea 2003; Dorigo et al. 1996;
Ong et al. 2006; Storn and Price 1997; Yang and Deb 2010; Das et al. 2011; Yang 2005,
2009; Zhang et al. 2007). Since no method has yet been proposed that can prove that a solution
obtained by metaheuristic algorithms is the optimum solution, the solutions obtained via
these methods are referred to as sub-optimal solutions
(Rashedi et al. 2009; Ong et al. 2006; Storn and Price 1997; Yang and Deb 2010). The
metaheuristic optimization algorithms that are most widely used in scientific applications are
Genetic Algorithm (Deb et al. 2002), Particle swarm optimization algorithm (PSO) (Clerc
and Kennedy 2002; Trelea 2003; Yoshida et al. 2000; Juang 2004), Differential evolution
algorithm (DE) (Storn and Price 1997; Ferrante and Ville 2010; Price and Storn 1997; Storn
1999; Liu and Lampinen 2005; Ali and Torn 2004; Shahryar et al. 2008; Das and Suganthan
2009; Kaelo and Ali 2006; Swagatam et al. 2009; Janez et al. 2007), Artificial bee colony
algorithm (Karaboga and Akay 2009a; Karaboga and Basturk 2007a,b; Janez et al. 2007;
Karaboga 2009; Fei et al. 2009), Cuckoo Search Algorithm (Yang and Deb 2009; Deb et al.
2002), Gravitational Search Algorithm (Rashedi et al. 2009; Esmat et al. 2010, 2011; Duman
et al. 2010; Chaoshun and Jianzhong 2011), Harmony Search Algorithm (Geem et al. 2001;
Lee and Geem 2004, 2005; Mahdavi et al. 2007) and their derivatives.
Many metaheuristic algorithms use a pattern matrix, which contains random solutions of
the related problem. Since this enables an exchange of information between
the patterns, the success achieved in the solution of the related problem prominently
improves. The methods that are based on the collective search of the patterns for
the solutions of the related problem are generally known as artificial swarm intelligence
methods (Dorigo et al. 2000, 2004; Martinoli et al. 2004; Sousa et al. 2004; Mahamed and
Mehrdad 2008; Karaboga and Akay 2009b). The basic definitions used in the development
of metaheuristic algorithms are based on the basic concepts used in Pattern Recognition
(such as pattern matrix, pattern and attribute). In the PSO algorithm, the pattern matrix
is referred to as the swarm and each pattern corresponds to an artificial particle. Similarly,
each pattern corresponds to a nectar source in the ABC algorithm. A pattern is considered
an artificial nest in the Cuckoo algorithm. In the genetic algorithm, each pattern corresponds
to an artificial chromosome and the pattern matrix is referred to as the population.
Many metaheuristic algorithms somehow use the basic genetic rules (i.e., mutation, crossover,
selection, and adaptation) while developing the existing random solutions (Juang 2004;
Ferrante and Ville 2010; Das and Suganthan 2009; Karaboga and Akay 2009b). This situation
thus, the generalized system-equation of the randomized mutation operator of the DE/rand/1
algorithm (Storn and Price 1997; Price and Storn 1997) is obtained as,

v ← X_r3 + s · (X_r1 − X_r2)    (4)
v ← X_r3 + F* ⊗ (X_r1 − X_r2)    (5)
In the original ABC algorithm, the scaling factor s is denoted by φ (Karaboga and Akay 2009a).
In this case, the generalized system-equation of the ABC algorithm can be defined by Eq. 6,
where φ* ∈ R^n,

v ← X_r1 + φ* ⊗ (X_r1 − X_r2)    (6)
v ← X_r1,h + F · (X_r2,h − X_r3,h)    (7)
v ← X_r1,h + φ · (X_r1,h − X_r2,h)    (8)
Researchers have proposed many different strategies for determining the value of the
scale factor. Generally, the real-valued range s ∈ [0, 1] is used
in the DE algorithm and its derivatives (Ferrante and Ville 2010; Das and Suganthan 2009),
while the real-valued range s ∈ [−1, 1] is used in the ABC algorithm and its variants
(Karaboga and Akay 2009a). Different researchers have proposed different methods for
determining the F value in the DE algorithm (Karaboga and Akay 2009a; Das and Suganthan
2009; Price et al. 2005).
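The two generalized moves of Eqs. 5-6, together with the two scale-factor ranges just described, can be sketched as follows. This is an illustrative NumPy sketch; the function names, the seeded generator and the toy population are ours, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def de_rand_1_mutation(X, r1, r2, r3, F):
    # Eq. 5: v <- X_r3 + F * (X_r1 - X_r2), with a fixed scalar F.
    return X[r3] + F * (X[r1] - X[r2])

def abc_mutation(X, r1, r2):
    # Eq. 6: v <- X_r1 + phi * (X_r1 - X_r2), with phi drawn per
    # attribute from the ABC range [-1, 1].
    phi = rng.uniform(-1.0, 1.0, size=X.shape[1])
    return X[r1] + phi * (X[r1] - X[r2])

# Toy population: 5 patterns with 3 attributes each.
X = rng.uniform(-5.0, 5.0, size=(5, 3))
v_de = de_rand_1_mutation(X, 0, 1, 2, F=0.9)
v_abc = abc_mutation(X, 0, 1)
```

Note the structural difference made explicit by the sketch: DE perturbs a third random pattern, while ABC perturbs the pattern itself by its difference from one other random pattern.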
A metaheuristic search algorithm may also use a strategy based on the solution of
the pattern that provides the best solution among the pattern-matrix components while
attempting to develop a random solution. In this case, the system-equation of Eq. 9 is obtained
by generalizing Eq. 1 as;
Table 1 The Benchmark functions used to examine the numerical optimization problem solving successes
of the CK, PSO, DE and ABC algorithms and features of these functions
Table 1 continued
FNC NAME TYPE LOW UP DIM
F41 SHEKEL7 MN 0 10 4
F42 SHUBERT MN −10 10 2
F43 SIXHUMPCAMELBACK MN −5 5 2
F44 SPHERE2 US −100 100 30
F45 STEP2 US −100 100 30
F46 STEPINT US −5.12 5.12 5
F47 SUMSQUARES US −10 10 30
F48 TRID UN −36 36 6
F49 TRID UN −100 100 10
F50 ZAKHAROV UN −5 10 10
where the acceleration constants C1, C2 and the inertia weight ω are predefined by the user,
and r1, r2 are uniformly generated random numbers in the range [0, 1]. The Pbest
value in Eq. 11 denotes the best solution found by the pattern r1. The system-equations of
Eq. 11 are the most general system-equations of the PSO algorithm (Del Valle et al. 2008;
Clerc and Kennedy 2002; Trelea 2003). For the ω value used in the velocity system-equation
given in Eq. 11, researchers have suggested different strategies; ω = 0.60
(Karaboga and Akay 2009a), ω = rand (Del Valle et al. 2008). PSO and its variations can be
found in (Del Valle et al. 2008; Clerc and Kennedy 2002; Trelea 2003; Yoshida et al. 2000;
Juang 2004; Dorigo et al. 2004; Martinoli et al. 2004).
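The velocity and position update described around Eq. 11 can be sketched as follows. Since Eq. 11 itself is not reproduced in this excerpt, the sketch follows the standard form of the cited PSO literature, and all names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def pso_step(x, v, pbest, gbest, w=0.6, c1=1.8, c2=1.8):
    # v <- w*v + C1*r1*(Pbest - x) + C2*r2*(Gbest - x);  x <- x + v.
    # r1, r2 are uniform random numbers in [0, 1], drawn per attribute.
    r1 = rng.uniform(0.0, 1.0, size=np.shape(x))
    r2 = rng.uniform(0.0, 1.0, size=np.shape(x))
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new

x0 = np.array([1.0, -2.0])
v0 = np.zeros(2)
x1, v1 = pso_step(x0, v0, pbest=np.array([0.5, -1.0]), gbest=np.zeros(2))
```

The sketch makes visible why the initial values of ω, C1 and C2 matter: ω scales the memory of the previous velocity, while C1 and C2 weight the cognitive (Pbest) and social (Gbest) pulls.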
If the general system-equation given in Eq. 1 is generalized and rearranged, the system-
equation of Eq. 12 is obtained.

v ← v + K · (υ − X_best)    (12)
Table 2 The MeanOpt values of the CK, PSO, DE and ABC algorithms
CK PSO DE ABC
F26 −1.8210436836776824 −1.8210436836776824 −1.8210436836776824 −1.8210436836776824
F27 −4.6934684519571146 −4.6769774171754843 −4.6934684519571146 −4.6934684519571137
F28 −9.6601517156413479 −9.5127918086875667 −9.6184137190412695 −9.6601517156413479
F29 0.0000528002823306 0.0013823741086214 0.0051435833427266 0.0204750133573142
F30 0.0000001506204011 0.0000517665237642 0.0000000000000000 0.0004486249518229
F31 0.0000029552286077 0.0001616061328810 0.0001055710319695 0.0022337034076446
F32 0.0021879618606472 0.0047745343217586 0.0004074540652956 0.0129825734203320
F33 1.2828840539586595 28.0080704889407630 15.3906006782917850 0.0000000000000000
F34 0.0000000000000000 2.3638709258458954 0.0000000000000000 0.0609538747914428
F35 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000
F36 −12,505.4587899889370000 −8,927.9796708997965000 −11,939.3023972921910000 −12,569.4866181730160000
F37 0.0000000000000000 0.0000000007389222 0.0000000000000000 47.4978770095106240
F38 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000004
F39 −10.5364098166920460 −10.5364098166920460 −10.5364098166920460 −10.5364098166920450
F40 −10.1531996790582270 −10.1531996790582270 −10.1531996790582270 −10.1531996790582270
F41 −10.4029405668186680 −10.4029405668186680 −10.4029405668186680 −10.4029405668186680
F42 −186.7309088310239800 −186.7309088235984100 −186.7309088310239500 −186.7309088310239500
F43 −1.0316284534898774 −1.0316284534898774 −1.0316284534898774 −1.0316284534898774
F44 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000004
F45 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000
F46 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000
F47 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000004
F48 −50.0000000000001920 −50.0000000000001920 −50.0000000000002130 −49.9999999999999080
F49 −210.0000000000028700 −210.0000000000023300 −210.0000000000036400 −209.9999999999536200
F50 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000488
Table 3 continued
CK PSO DE ABC
In this case, the strategy for the determination of K value gains importance in order to
efficiently investigate the search space. In the CK algorithm, K value is achieved through a
substantially complex random-walk strategy (Yang and Deb 2009, 2010).
The Cuckoo search (CK) algorithm (Yang and Deb 2009, 2010) is a population-based
stochastic global search algorithm proposed by Yang and Deb (2009). In the CK
algorithm, a pattern corresponds to a nest and, similarly, each individual attribute of a pattern
corresponds to a cuckoo egg. The general system-equation of the CK algorithm is based on
the general system-equation of random-walk algorithms, which is given in Eq. 13;
X_{g+1;i} = X_{g;i} + α ⊗ levy(λ)    (13)

where g indicates the number of the current generation (g = 1, 2, 3, . . . , maxcycle, and
maxcycle denotes the predetermined maximum generation number). In the CK algorithm,
the initial values of the jth attribute of the ith pattern, P_{g=0;i} = [x_{g=0;j,i}], are
determined by using Eq. 14,

X_{g=0;j,i} = rand · (up_j − low_j) + low_j    (14)

where low_j and up_j are the lower and upper search-space limits of the jth attribute, respectively.
The CK algorithm controls the boundary conditions at each computation step. Therefore,
when the value of an attribute overflows the allowed search-space limits, the value of
the related attribute is updated with the value of the limit nearer to the related attribute.
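The initialization rule of Eq. 14 and the boundary-control rule can be sketched together as follows (an illustrative sketch; the function names and the toy limits are ours):

```python
import numpy as np

rng = np.random.default_rng(2)

def init_patterns(n, low, up):
    # Eq. 14: x = rand * (up - low) + low, drawn per attribute.
    low = np.asarray(low, dtype=float)
    up = np.asarray(up, dtype=float)
    return rng.uniform(0.0, 1.0, size=(n, low.size)) * (up - low) + low

def clamp(X, low, up):
    # Boundary control: an attribute that overflows the search-space
    # limits is replaced with the nearer limit value.
    return np.clip(X, low, up)

low = np.array([-5.12, -5.12])
up = np.array([5.12, 5.12])
P = clamp(init_patterns(10, low, up), low, up)
```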
Before starting the iterative search process, the CK algorithm detects the most successful
pattern as the X_best pattern. The iterative evolution phase of the pattern matrix begins with
the computation of φ by using Eq. 15;

φ = ( Γ(1 + β) · sin(π · β / 2) / ( Γ((1 + β) / 2) · β · 2^((β − 1)/2) ) )^(1/β)    (15)

In the standard software implementation of the CK algorithm (Yang and Deb 2010), using
β = 1.50 has been advised. In Eq. 15, Γ denotes the gamma function. The evolution phase
Table 4 The BestOpt values of the CK, PSO, DE and ABC algorithms
CK PSO DE ABC
F26 −1.8210436836776824 −1.8210436836776824 −1.8210436836776824 −1.8210436836776824
F27 −4.6934684519571128 −4.6934684519571128 −4.6934684519571128 −4.6934684519571128
F28 −9.6601517156413497 −9.6601517156413479 −9.6601517156413497 −9.6601517156413497
F29 0.0000000085822797 0.0000088339092326 0.0047965266340616 0.0011011688503096
F30 0.0000000797695218 0.0000211010308969 0.0000000000000000 0.0002717789268690
F31 0.0000000509883999 0.0000000000427674 0.0000000888051571 0.0003305515663585
F32 0.0013909553459754 0.0022391455486327 0.0001538851948172 0.0059038261927937
F33 0.0003806178377204 13.9294167236679410 7.9596724567463761 0.0000000000000000
F34 0.0000000000000000 0.0000700595196601 0.0000000000000000 0.0002227389995635
F35 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000
F36 −12,569.4866180428830000 −10,426.9771785279660000 −12,233.9086141680040000 −12,569.4866181730140000
F37 0.0000000000000000 0.0000000000001282 0.0000000000000000 10.7576179839792520
F38 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000003
F39 −10.5364098166920500 −10.5364098166920500 −10.5364098166920500 −10.5364098166920480
F40 −10.1531996790582310 −10.1531996790582310 −10.1531996790582310 −10.1531996790582310
F41 −10.4029405668186660 −10.4029405668186660 −10.4029405668186660 −10.4029405668186660
F42 −186.7309088310239800 −186.7309088310239800 −186.7309088310239800 −186.7309088310239800
F43 −1.0316284534898774 −1.0316284534898774 −1.0316284534898774 −1.0316284534898774
F44 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000003
F45 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000
F46 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000
F47 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000003
F48 −50.0000000000002270 −50.0000000000001710 −50.0000000000002270 −50.0000000000000570
F49 −210.0000000000036400 −210.0000000000027300 −210.0000000000036400 −209.9999999999763500
F50 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000083
of the X_i pattern begins by defining the donor vector υ, where υ = X_i. After this step, the
required stepsize value is computed by using Eq. 16;

stepsize_j = 0.01 · ( u_j / |v_j|^(1/β) ) · (υ − X_best)    (16)

where u = φ · randn[D] and v = randn[D]; here randn[D] denotes a vector of D standard
normally distributed random numbers. In the next step of the CK algorithm, the donor pattern υ
is randomly mutated by using Eq. 17;
The update process of the X_best pattern in the CK algorithm is defined by Eq. 18.
The unfeasible patterns are manipulated by using the crossover operator given in Eq. 19;

υ_i := { X_i + rand · (X_r1 − X_r2)   if rand_i > p0
       { X_i                          otherwise       (19)
In the last step of the iterative computations, the simple rule given in Eq. 2 is used for the
evolution of the pattern X_i. The algorithmic control parameters of the CK algorithm are the
scale factor (β) and the mutation probability value (p0). In this paper, β = 1.50 and p0 = 0.25
have been used, as in (Yang and Deb 2010).
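Eqs. 15-16 together implement a Mantegna-style Lévy step. A minimal sketch, assuming the u / |v|^(1/β) form used by the standard implementation cited above (the function names are ours):

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def levy_phi(beta=1.5):
    # Eq. 15: phi = ( G(1+b)*sin(pi*b/2) / (G((1+b)/2)*b*2^((b-1)/2)) )^(1/b),
    # where G is the gamma function.
    num = math.gamma(1.0 + beta) * math.sin(math.pi * beta / 2.0)
    den = math.gamma((1.0 + beta) / 2.0) * beta * 2.0 ** ((beta - 1.0) / 2.0)
    return (num / den) ** (1.0 / beta)

def levy_stepsize(donor, x_best, beta=1.5):
    # Eq. 16: stepsize_j = 0.01 * (u_j / |v_j|^(1/beta)) * (donor - x_best),
    # with u ~ N(0, phi^2) and v ~ N(0, 1) drawn per attribute.
    D = donor.size
    u = levy_phi(beta) * rng.standard_normal(D)
    v = rng.standard_normal(D)
    return 0.01 * (u / np.abs(v) ** (1.0 / beta)) * (donor - x_best)

step = levy_stepsize(np.array([1.0, 2.0]), np.zeros(2))
```

A side effect worth noting: because the step is proportional to (υ − X_best), the best pattern itself takes a zero Lévy step.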
DE/rand/1:           v = X_r1 + F · (X_r2 − X_r3)
DE/rand/2:           v = X_r1 + F · (X_r2 − X_r3) + F · (X_r4 − X_r5)
DE/best/1:           v = X_best + F · (X_r1 − X_r2)
DE/best/2:           v = X_best + F · (X_r1 − X_r2) + F · (X_r3 − X_r4)
DE/target-to-best/1: v = X_i + F · (X_best − X_i) + F · (X_r1 − X_r2)    (20)
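The five mutation strategies of Eq. 20 can be written compactly as donor-vector rules. This sketch assumes the caller supplies mutually distinct random indices; all names are ours:

```python
import numpy as np

rng = np.random.default_rng(4)

# Eq. 20 as donor-vector rules. X is the population, F the scale factor,
# i the target index, b the index of the best pattern, r a list of
# mutually distinct random indices, all different from i.
STRATEGIES = {
    "DE/rand/1": lambda X, F, i, b, r:
        X[r[0]] + F * (X[r[1]] - X[r[2]]),
    "DE/rand/2": lambda X, F, i, b, r:
        X[r[0]] + F * (X[r[1]] - X[r[2]]) + F * (X[r[3]] - X[r[4]]),
    "DE/best/1": lambda X, F, i, b, r:
        X[b] + F * (X[r[0]] - X[r[1]]),
    "DE/best/2": lambda X, F, i, b, r:
        X[b] + F * (X[r[0]] - X[r[1]]) + F * (X[r[2]] - X[r[3]]),
    "DE/target-to-best/1": lambda X, F, i, b, r:
        X[i] + F * (X[b] - X[i]) + F * (X[r[0]] - X[r[1]]),
}

X = rng.uniform(-1.0, 1.0, size=(6, 2))
donor = STRATEGIES["DE/rand/1"](X, 0.9, i=0, b=5, r=[1, 2, 3, 4, 5])
```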
Table 5 Comparison of the performances of the CK, PSO, DE and ABC algorithms for the statistical param-
eters of MeanOpt
CK: SAME = 41 (F1, F2, F3, F4, F5, F6, F7, F8, F9, F10, F11, F12, F15, F16, F18, F19, F20, F21, F22, F23, F24, F25, F26, F27, F28, F29, F31, F34, F35, F37, F38, F39, F40, F41, F42, F43, F44, F45, F46, F47, F50); DIFF. = 9 (F13, F14, F17, F30, F32, F33, F36, F48, F49)
PSO: SAME = 26 (F1, F6, F7, F8, F9, F10, F11, F12, F14, F15, F19, F21, F22, F25, F26, F35, F38, F39, F40, F41, F43, F44, F45, F46, F47, F50); DIFF. = 24 (F2, F3, F4, F5, F13, F16, F17, F18, F20, F23, F24, F27, F28, F29, F30, F31, F32, F33, F34, F36, F37, F42, F48, F49)
DE: SAME = 38 (F1, F3, F4, F5, F6, F7, F8, F9, F10, F11, F12, F14, F15, F16, F17, F19, F22, F23, F25, F26, F27, F30, F32, F34, F35, F37, F38, F39, F40, F41, F43, F44, F45, F46, F47, F48, F49, F50); DIFF. = 12 (F2, F13, F18, F20, F21, F24, F28, F29, F31, F33, F36, F42)
ABC: SAME = 23 (F1, F7, F8, F10, F11, F13, F14, F15, F18, F19, F20, F22, F25, F26, F28, F33, F35, F36, F40, F41, F43, F45, F46); DIFF. = 27 (F2, F3, F4, F5, F6, F9, F12, F16, F17, F21, F23, F24, F27, F29, F30, F31, F32, F34, F37, F38, F39, F42, F44, F47, F48, F49, F50)
SAME: the number of benchmark functions for which the algorithm attains the minimum MeanOpt value; DIFF.: the number of benchmark functions for which it provides a value different from the minimum MeanOpt
There are many mutation strategies developed for use with the DE algorithm. The
most common one in the literature is the DE/rand/1/bin strategy, which basically
intends to search for the global optimum by using three elements randomly selected from the
population (Storn and Price 1997; Price and Storn 1997; Price et al. 2005). The weighted
difference of two of the three randomly selected solutions is added to the third solution to
obtain the donor solution, which is then recombined by crossover with a fourth randomly
selected solution; if the objective function value of the obtained mutant solution is better
than the objective function value of the selected original fourth solution, the mutant solution
replaces the mentioned fourth solution in the next population. Detailed
test results have shown that DE/rand/1/bin is highly sensitive to the size
of the population, the total number of iterations, the crossover value (CR) and the weighting value (F).
The DE algorithm has three algorithmic control parameters: the crossover value
Cr ∈ [0, 1], the scale factor F ∈ [0, 2], and the population size NP (Storn and
Price 1997; Price et al. 2005; Das and Suganthan 2009; Swagatam et al. 2009). Taking the
problem size as D, it is suggested that 3D ≤ NP ≤ 10D is used for the DE
algorithm. In the solution of many engineering problems, it has been suggested to use either
F = 0.60 (Karaboga and Akay 2009a) or F = 0.5 + (1 − rand) (Das and Suganthan 2009).
For the DE/rand/1/bin used in this paper, the parameter values have been used as follows:
CR = 0.5, F = 0.90, as in (Karaboga and Akay 2009a; Das and Suganthan 2009; Corne
et al. 1999).
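The "bin" part of DE/rand/1/bin is binomial crossover between the donor and the target solution. A minimal sketch under the usual convention that at least one attribute always comes from the donor (names are ours):

```python
import numpy as np

rng = np.random.default_rng(5)

def binomial_crossover(target, donor, CR=0.5):
    # 'bin' crossover: each trial attribute is taken from the donor with
    # probability CR; one randomly chosen attribute is always taken from
    # the donor, so the trial vector differs from the target.
    D = target.size
    mask = rng.uniform(0.0, 1.0, size=D) < CR
    mask[rng.integers(D)] = True
    return np.where(mask, donor, target)

trial = binomial_crossover(np.zeros(4), np.ones(4), CR=0.5)
```

The greedy selection step then compares the trial vector with the target and keeps whichever has the better objective value.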
The problem solving success of DE algorithm is affected directly by the mutation, size of
population and the crossover strategies used. The success of the DE algorithm in solving a
numeric optimization problem is rather sensitive to the initial values of the Cr, F and N P
(Das and Suganthan 2009; Kaelo and Ali 2006; Swagatam et al. 2009; Janez et al. 2007;
Table 6 Comparison of the performances of the CK, PSO, DE and ABC algorithms for the statistical param-
eters of STD
CK: SAME = 36 (F1, F2, F3, F4, F5, F6, F7, F8, F9, F10, F11, F12, F15, F16, F18, F21, F22, F23, F24, F25, F26, F28, F29, F31, F34, F35, F37, F38, F41, F42, F43, F44, F45, F46, F47, F50); DIFF. = 14 (F13, F14, F17, F19, F20, F27, F30, F32, F33, F36, F39, F40, F48, F49)
PSO: SAME = 26 (F1, F2, F5, F6, F7, F8, F9, F10, F11, F12, F14, F15, F21, F22, F25, F26, F35, F38, F41, F43, F44, F45, F46, F47, F48, F50); DIFF. = 24 (F3, F4, F13, F16, F17, F18, F19, F20, F23, F24, F27, F28, F29, F30, F31, F32, F33, F34, F36, F37, F39, F40, F42, F49)
DE: SAME = 35 (F1, F3, F4, F5, F6, F7, F8, F9, F10, F11, F12, F13, F14, F15, F16, F17, F22, F23, F25, F26, F30, F32, F34, F35, F37, F38, F39, F41, F43, F44, F45, F46, F47, F49, F50); DIFF. = 15 (F2, F18, F19, F20, F21, F24, F27, F28, F29, F31, F33, F36, F40, F42, F48)
ABC: SAME = 21 (F1, F3, F7, F8, F10, F11, F14, F15, F18, F19, F20, F25, F26, F27, F33, F35, F36, F40, F43, F45, F46); DIFF. = 29 (F2, F4, F5, F6, F9, F12, F13, F16, F17, F21, F22, F23, F24, F28, F29, F30, F31, F32, F34, F37, F38, F39, F41, F42, F44, F47, F48, F49, F50)
SAME: the number of benchmark functions for which the algorithm attains the minimum STD value; DIFF.: the number of benchmark functions for which it provides a value different from the minimum STD
Table 7 Comparison of the performances of the CK, PSO, DE and ABC algorithms for the statistical param-
eters of BestOpt
CK: SAME = 43 (F1, F3, F4, F5, F6, F7, F8, F9, F10, F11, F12, F14, F15, F16, F17, F18, F19, F20, F21, F22, F23, F24, F25, F26, F27, F28, F29, F34, F35, F37, F38, F39, F40, F41, F42, F43, F44, F45, F46, F47, F48, F49, F50); DIFF. = 7 (F2, F13, F30, F31, F32, F33, F36)
PSO: SAME = 36 (F1, F3, F4, F6, F7, F8, F9, F10, F11, F12, F14, F15, F16, F18, F19, F20, F21, F22, F23, F24, F25, F26, F27, F31, F35, F38, F39, F40, F41, F42, F43, F44, F45, F46, F47, F50); DIFF. = 14 (F2, F5, F13, F17, F28, F29, F30, F32, F33, F34, F36, F37, F48, F49)
DE: SAME = 45 (F1, F2, F3, F4, F5, F6, F7, F8, F9, F10, F11, F12, F14, F15, F16, F17, F18, F19, F20, F21, F22, F23, F24, F25, F26, F27, F28, F30, F32, F34, F35, F37, F38, F39, F40, F41, F42, F43, F44, F45, F46, F47, F48, F49, F50); DIFF. = 5 (F13, F29, F31, F33, F36)
ABC: SAME = 25 (F1, F7, F8, F9, F10, F11, F13, F14, F15, F18, F19, F22, F25, F26, F27, F28, F33, F35, F36, F40, F41, F42, F43, F45, F46); DIFF. = 25 (F2, F3, F4, F5, F6, F12, F16, F17, F20, F21, F23, F24, F29, F30, F31, F32, F34, F37, F38, F39, F44, F47, F48, F49, F50)
SAME: the number of benchmark functions for which the algorithm attains the minimum BestOpt value; DIFF.: the number of benchmark functions for which it provides a value different from the minimum BestOpt
Price et al. 2005; Vesterstrom and Thomsen 2004; Bin et al. 2010). Moreover, the process of
determining the optimum mutation and crossover strategies for the problem structure in the
DE algorithm is time-consuming (Das and Suganthan 2009).
Readers who wish to obtain more detailed information about the DE algorithm are recommended
to examine (Ferrante and Ville 2010; Kaelo and Ali 2006; Das and Suganthan 2009).
The particle swarm optimization (PSO) algorithm is a population-based, stochastic, multi-agent
parallel global-search technique (Del Valle et al. 2008; Clerc and Kennedy 2002; Trelea
2003; Yoshida et al. 2000; Juang 2004; Sousa et al. 2004). Unlike the genetic algorithm and
the DE algorithm, the PSO algorithm has no crossover and mutation operators. The PSO
algorithm is based on the mathematical modeling of various collective behaviors of living
creatures that display complex social behaviors. In the PSO algorithm, while a pattern (i.e.,
particle) is developing a new position, both the cognitive component of the related particle
and the social component generated by the swarm are used. This enables the PSO
algorithm to effectively develop local solutions into global optimum solutions. However,
the PSO algorithm is significantly affected by the initial values of the parameters used in
weighting the cognitive and social components and by the weighting strategy of the velocity
vector. The Lbest and Gbest topologies are the most commonly used topologies in the standard
PSO algorithm. In this paper, a modernized implementation of the PSO algorithm with the Lbest
topology, known as PSO-2007, has been used (Bin et al. 2010).
The success of the PSO algorithm in finding the global optimum depends strongly on
the initial values of its control parameters (c1, c2, ω), the swarm size, and the maximum
iteration number. In the tests made in this paper, the control parameter values of the PSO
algorithm have been selected to match the values given in (Karaboga and Akay 2009a);
c1,initial = 1.80, c2,initial = 1.80, and ω = (1 + rand)/2, as in (Eberhart and Shi 2001).
For more detailed information on the PSO algorithm, please refer to the study given in
(Del Valle et al. 2008; Clerc and Kennedy 2002; Trelea 2003; Ratnaweera et al. 2004; Eberhart
and Shi 2001).
Fig. 1 Anova tests of the global minimum values, which are computed by using the CK, PSO, DE and ABC algorithms (F1–F10)
Fig. 2 Anova tests of the global minimum values, which are computed by using the CK, PSO, DE and ABC algorithms (F11–F20)
Fig. 3 Anova tests of the global minimum values, which are computed by using the CK, PSO, DE and ABC algorithms (F21–F30)
Fig. 4 Anova tests of the global minimum values, which are computed by using the CK, PSO, DE and ABC algorithms (F31–F40)
Fig. 5 Anova tests of the global minimum values, which are computed by using the CK, PSO, DE and ABC algorithms (F41–F50)
Fnc. P-value Algorithms
CK PSO DE ABC
F1 – * * * *
F2 2.3527E-15 DE, ABC ABC CK, ABC CK, PSO, DE
F3 4.3336E-03 PSO CK, DE, ABC PSO PSO
F4 – * * * *
F5 4.6356E-72 PSO, ABC CK, PSO, ABC PSO, ABC CK, PSO, DE
F6 4.8111E-37 ABC ABC ABC CK, PSO, DE
F7 – * * * *
F8 – * * * *
F9 1.1381E-09 ABC ABC ABC CK, PSO, DE
F10 3.4875E-13 ABC ABC ABC CK, PSO, DE
F11 – * * * *
F12 1.0263E-16 ABC ABC ABC CK, PSO, DE
F13 – * * * *
F14 1.1401E-16 PSO, DE, ABC CK CK CK
F15 4.2246E-16 ABC ABC ABC CK, PSO, DE
F16 2.5259E-03 ABC ABC ABC CK, PSO, DE
F17 2.0879E-05 ABC ABC ABC CK, PSO, DE
F18 6.7357E-07 ABC ABC ABC CK, PSO, DE
F19 – * * * *
F20 3.1494E-05 PSO CK, DE, ABC PSO PSO
F21 6.0150E-14 ABC ABC ABC CK, PSO, DE
F22 – * * * *
F23 – * * * *
F24 9.1529E-05 PSO, ABC CK, DE PSO CK
F25 2.1747E-13 ABC ABC ABC CK, PSO, DE
F26 – * * * *
F27 8.4588E-07 PSO CK, DE, ABC PSO PSO
F28 2.3781E-10 PSO CK, DE, ABC PSO PSO
F29 2.8953E-13 ABC ABC ABC CK, PSO, DE
F30 9.9297E-57 PSO, ABC CK, DE, ABC PSO, ABC CK, PSO, DE
F31 1.9907E-15 ABC ABC ABC CK, PSO, DE
F32 1.8096E-39 PSO, DE, ABC CK, DE, ABC CK, PSO, ABC CK, PSO, DE
F33 1.0385E-32 PSO, DE CK, DE, ABC CK, PSO, ABC PSO, DE
F34 1.3360E-05 ABC ABC ABC CK, PSO, DE
F35 – * * * *
F36 1.5576E-40 PSO, DE CK, DE, ABC CK, PSO, ABC PSO, DE
F37 4.4219E-24 ABC ABC ABC CK, PSO, DE
F38 2.9078E-54 ABC ABC ABC CK, PSO, DE
F39 – * * * *
F40 – * * * *
Table 8 continued
Fnc. P-value Algorithms
CK PSO DE ABC
F41 – * * * *
F42 – * * * *
F43 – * * * *
F44 3.1873E-55 ABC ABC ABC CK, PSO, DE
F45 – * * * *
F46 – * * * *
F47 1.2363E-51 ABC ABC ABC CK, PSO, DE
F48 – * * * *
F49 – * * * *
F50 1.3159E-11 ABC ABC ABC CK, PSO, DE
∗ No algorithm is significantly different for this benchmark function
and Akay 2009a). Following the generation of the initial nectar sources, the ABC algorithm
starts to search for the solution of the numeric optimization problem using the employed-bee,
onlooker-bee, and scout-bee tools. The employed bee tries to improve the nectar source
to which it is assigned, also using the other nectar sources. If the employed bee finds a
better nectar source, it memorizes the new nectar source and uses it instead of the old one. This
process is modeled in Eq. 21;
where φ_{i,j} is a random number generated in the range [−1, 1], and X_{r1,j} and X_{r2,j} indicate
the jth parameters of the r1th and r2th patterns (i.e., nectar sources in the ABC algorithm),
respectively.
If the v_{r1} value has a better objective function value than the X_{r1} value, the X_{r1} value is
updated as X_{r1} := v_{r1} and the failure variable is reset, failure_{r1} = 0. If the v_{r1} value does
not have a better objective function value than X_{r1}, the employed bee continues to go to the
X_{r1} source, and since the X_{r1} solution could not be improved, the failure_{r1} value, which is the
improvement counter related to the nectar source X_{r1}, is increased by one unit.
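The greedy selection and the failure counter described above can be sketched as follows (a toy sphere objective and all names are our assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(6)

def sphere(x):
    # Toy minimization objective, used only for illustration.
    return float(np.sum(x * x))

def employed_bee_step(X, failure, i, f):
    # Build a candidate around nectar source X_i with the ABC move,
    # keep it only if it improves the objective (greedy selection);
    # otherwise increment the failure counter of X_i.
    k = rng.choice([j for j in range(X.shape[0]) if j != i])
    phi = rng.uniform(-1.0, 1.0, size=X.shape[1])
    v = X[i] + phi * (X[i] - X[k])
    if f(v) < f(X[i]):
        X[i] = v
        failure[i] = 0
    else:
        failure[i] += 1
    return X, failure

X = rng.uniform(-5.0, 5.0, size=(4, 2))
failure = np.zeros(4, dtype=int)
before = sphere(X[0])
X, failure = employed_bee_step(X, failure, i=0, f=sphere)
```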
Using the objective function values of all nectar sources, the probability values p_i to be
used by the onlooker bees are obtained by using Eq. 22;

p_i = fitness_i / Σ_{i=1}^{SN} fitness_i    (22)

where

fitness_i = { 1 / (1 + f_i)   if f_i ≥ 0
            { 1 + |f_i|       if f_i < 0      (23)
As the fitness_i value given in Eq. 23 increases, the number of onlooker bees that will
select this nectar-source region increases. The ABC algorithm selects the nectar sources
to be visited by the bees using the roulette-selection technique used in genetic algorithms:
a random number within the range [0, 1] is generated for each nectar source, and if the p_i value
is higher than the generated random number, the onlooker bees search for new nectar sources
Table 9 The MeanFncEvol values of the CK, PSO, DE and ABC algorithms
CK PSO DE ABC
to develop the nectar source X i using the Eqs. 22, 23. If an X i source has a f ailur ei value
higher than a certain threshold value, that X i source is left, and the employed bee assigned
hereto goes to a random nectar source generated newly. The success of the ABC algorithm
in finding the global optimum is sensitive to the control parameters of the algorithm (i.e., the
limit value, the number of employed bees, the population size, and the maximum cycle
value). The ABC algorithm's limit control parameter used in the tests conducted in this
paper is the same as the value used in (Karaboga and Akay 2009a); limit = SN · D, as
in Eq. 9 of (Karaboga and Akay 2009a).
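The onlooker selection rule and the limit-based abandonment described above can be sketched as follows; the helper names and structure are our assumptions for illustration, not the reference ABC implementation:

```python
def onlooker_visits(probabilities, rng):
    """Per the text: source i attracts an onlooker bee whenever its p_i
    exceeds a fresh uniform random number drawn from [0, 1]."""
    return [i for i, p in enumerate(probabilities) if rng() < p]

def update_failure(failure, i, improved, limit):
    """Reset failure_i when source i was improved, otherwise increment it;
    return True when the source should be abandoned to a scout bee."""
    failure[i] = 0 if improved else failure[i] + 1
    return failure[i] > limit
```

With SN = 50 and limit = SN · D, a 30-dimensional problem would abandon a source only after 1,500 consecutive failed improvement attempts.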
The local-search ability of the ABC algorithm is sufficiently strong for various problem
types. The ABC algorithm selects the pattern used in defining the search direction using the
probability values and the roulette-selection rule used in the genetic algorithm (see Eqs. 6
and 7 of Karaboga and Akay 2009a). In the ABC algorithm, the probability values are calculated
using the fitness values (see Eq. 6 of Karaboga and Akay 2009a, Eq. 3 of Akay and
Karaboga 2010, and Eq. 2.1 of Karaboga and Basturk 2007b). The mathematical model used
in the standard ABC algorithm for calculating the fitness values causes it to produce equal
probability values for local solutions that have equal absolute values but different signs
(see Eq. 6 of Karaboga and Akay 2009a). This decreases the probability that a pattern
providing a relatively better solution is selected to define the search direction.
The modified ABC algorithm calculates the fitness values of different local solutions
using different methods (see Eq. 3 of Akay and Karaboga 2010). The probability values
obtained for the pattern matrix can therefore rank the patterns only nonlinearly with respect
to the quality of the solutions they provide. Thus, the strategies used to
calculate the probability values in the ABC algorithm produce pseudo-probability values
that grade the patterns in the pattern matrix only nonlinearly. This significantly affects the
problem-solving success of the ABC algorithm and decreases its local-search
ability (Karaboga and Akay 2009a; Karaboga and Basturk 2007b; Akay and Karaboga 2010).
For more detailed information on the ABC algorithm, please refer to the studies in
(Karaboga and Akay 2009a; Karaboga and Basturk 2007a,b; Karaboga 2009; Fei et al. 2009;
Karaboga and Akay 2009b).
3 Experiments
In this paper, various features of the benchmark functions used for testing the successes of
the CK, PSO, DE and ABC algorithms are given in Table 1. Mathematical descriptions of
the benchmark functions can be found in (Karaboga and Akay 2009a; Karaboga and Basturk
2007a,b). The benchmark functions used in the tests consist of Unimodal (U), Multimodal
(M), Separable (S), and Non-Separable (N) functions with different features. The problem
dimensions of the benchmark functions vary between 2 and 30, as in (Karaboga and
Akay 2009a; Ferrante and Ville 2010; Zhang and Sanderson 2009; Vesterstrom and Thomsen
2004).

Table 10 The MinRuntime values of the CK, PSO, DE and ABC algorithms (columns: CK, PSO, DE, ABC)
In all experiments in this paper, the values of the common control parameters of the
mentioned algorithms, such as the size of the pattern matrix and the maximum function
evaluation number, were chosen to be the same. The size of the pattern matrix has been fixed
at 50 and the maximum function evaluation number was set to 2,000,000. In order to make
the comparison coherent, global minimum values below $10^{-16}$ are assumed to be 0 in all
experiments. The setting values of the algorithmic control parameters of the mentioned
algorithms are given below:
– CK Settings: β = 1.50 and p0 = 0.25 have been used, as recommended in (Yang and Deb
2010).
– DE Settings: In DE, the DE/rand/1 mutation strategy with the binomial crossover operator
has been used. The algorithmic control parameters of DE/rand/1 have been set to
F = 0.50 and Cr = 0.90, as recommended in (Ferrante and Ville 2010; Das and
Suganthan 2009; Karaboga and Akay 2009a).
– PSO Settings: C1 = C2 = 1.80 and ω = 0.60 have been used, as recommended in
(Karaboga and Akay 2009a).
– ABC Settings: limit = 50D has been used, as recommended in (Karaboga and Akay
2009a).
The global minimum of each benchmark function used in this paper has been sought 20 times
by each of the mentioned algorithms, using a different initial population at every run. The
run time and the minimum function evaluation number of the best solution, together with
the final global minimum value, have been recorded during the experiments for further
statistical analysis. Subsequently, the mean of the global minimum values (MeanOpt), the
standard deviation of the recorded global minimum values (STD), and the best solution
(BestOpt) have been computed from the recorded results. The MeanOpt, STD and BestOpt
values are given in Tables 2, 3, and 4. The multiple comparison results for the
minimum-MeanOpt values are given in Table 5, where the minimum-MeanOpt value denotes
the minimum of the MeanOpt values computed by the CK, PSO, DE and ABC algorithms
for a certain benchmark function.

Table 11 Comparison of the performances of the CK, PSO, DE and ABC algorithms for the statistical parameters of MeanFncEvol

SAME: CK 0; PSO 3 (F27, F46, F48); DE 39 (F1–F6, F8–F19, F21–F26, F28, F30, F34, F35, F37–F44, F47, F49, F50); ABC 8 (F7, F20, F29, F31, F32, F33, F36, F45)
DIFF.: CK 50 (F1–F50); PSO 47 (all except F27, F46, F48); DE 11 (F7, F20, F27, F29, F31, F32, F33, F36, F45, F46, F48); ABC 42 (all except F7, F20, F29, F31, F32, F33, F36, F45)

SAME: the number of benchmark functions for which the CK, PSO, DE and ABC algorithms provide equal values of the MeanFncEvol parameter; DIFF.: the number of benchmark functions for which they provide different values of the MeanFncEvol parameter.

As seen from Table 5, the performances of the CK and DE algorithms are better than those
of the PSO and ABC algorithms. The CK and DE algorithms attain the minimum MeanOpt
values in 41 and 38 functions, respectively. The
multiple comparison results for the minimum-STD values are given in Table 6, where the
minimum-STD value denotes the minimum value of the STD values computed by the CK,
PSO, DE and ABC algorithms for a certain benchmark function. The multiple comparison
results for the BestOpt values are given in Table 7, where the BestOpt value denotes the best
solution of the global minimizer values computed by the CK, PSO, DE and ABC algorithms
for a certain benchmark function. As seen from Table 7, the performances of the CK,
DE and PSO algorithms are better than that of the ABC algorithm in terms of the BestOpt
values.
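The summary statistics above can be computed, for one experiment of 20 runs, roughly as follows; this is a sketch under the paper's convention of clamping values below $10^{-16}$ to 0, and the names are ours:

```python
import statistics

def summarize(run_minima, eps=1e-16):
    """MeanOpt, STD and BestOpt of the recorded global-minimum values,
    with values below eps clamped to 0 for comparability."""
    vals = [0.0 if v < eps else v for v in run_minima]
    return {"MeanOpt": statistics.mean(vals),
            "STD": statistics.stdev(vals),
            "BestOpt": min(vals)}

# BestOpt is 0.0 because the two sub-threshold values are clamped to 0.
stats = summarize([3e-17, 1e-3, 2e-3, 5e-17])
```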
In addition to the basic statistical analyses given above (i.e., MeanOpt, STD and BestOpt),
an analysis of variance (Anova) test has also been carried out for multiple comparison of the
performances of the CK, PSO, DE and ABC algorithms. The null hypothesis is that
'there is no difference in the minimum-MeanOpt performances of the CK, PSO, DE and ABC
algorithms', and α = 0.05 (for 95% confidence) has been used in the Anova test. The graphical
analysis results of the Anova test are illustrated in Figs. 1, 2, 3, 4, and 5, and the algorithms
that differ significantly on the benchmark functions are tabulated in Table 8. Mean function
evaluation numbers (MeanFncEvol) have also been analyzed, where MeanFncEvol denotes
the mean of the best function evaluation numbers of an experiment (an experiment involves
20 trials, as mentioned above). The MeanFncEvol values have been tabulated in Table 9.
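For reference, the one-way Anova F statistic underlying this test can be sketched in a few lines of pure Python; this is an illustrative sketch, and comparing F against the critical value at α = 0.05 is left to statistical tables or a library:

```python
def anova_f(groups):
    """One-way Anova F statistic over k groups of recorded results."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```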
Table 12 Comparison of the performances of the CK, PSO, DE and ABC algorithms for the statistical parameters of MinRuntime

SAME: CK 0; PSO 2 (F27, F46); DE 40 (F1–F6, F8–F19, F21–F26, F28, F30, F34, F35, F37–F44, F47–F50); ABC 8 (F7, F20, F29, F31, F32, F33, F36, F45)
DIFF.: CK 50 (F1–F50); PSO 48 (all except F27, F46); DE 10 (F7, F20, F27, F29, F31, F32, F33, F36, F45, F46); ABC 42 (all except F7, F20, F29, F31, F32, F33, F36, F45)

SAME: the number of benchmark functions for which the CK, PSO, DE and ABC algorithms provide equal values of the MinRuntime parameter; DIFF.: the number of benchmark functions for which they provide different values of the MinRuntime parameter.
The run times are given in seconds in Table 10, where MinRuntime denotes the
run time of the best function evaluation number of an experiment. The multiple comparison
results for MeanFncEvol and MinRuntime are given in Tables 11 and 12.
4 Conclusion
In this paper, the numerical optimization problem solving successes of the CK, PSO, DE
and ABC algorithms have been compared statistically. Statistical analysis revealed that the
problem solving successes of the CK and DE algorithms are considerably better than those of
the PSO (i.e., PSO2007) and ABC algorithms. Although there are several improved versions
of PSO in the literature, the PSO2007 implementation has been preferred in the tests due to
its high performance.
The PSO algorithm is successful in solving many benchmark functions, but its
well-known stability problem restricts its success rate against the CK and DE algorithms.
Since the PSO algorithm maintains its stochastic behavior capacity better than the ABC
algorithm while searching for the global optimum, it provides more successful results than
the ABC algorithm. The ABC algorithm basically has a search strategy that is considerably
similar to that of the standard DE algorithm (i.e., DE/rand/1). However, the ABC algorithm
has a very successful decision mechanism that decides which areas within the search space
need to be surveyed in more detail. The strategy used by the ABC algorithm to discover new
nectar sources and to manage the capacity of the discovered nectar sources is also
substantially powerful.
Acknowledgments The authors would like to thank the referees who have contributed to the enhancement
of the technical contents of this paper. The studies in this paper have been supported within the scope of
the scientific research projects 110Y309, supported by TUBITAK, and FBA-9-1131, supported by Erciyes
University.
References
Akay B, Karaboga D (2010) A modified artificial bee colony algorithm for real-parameter optimization. Inf
Sci (in press, online version)
Ali MM, Torn A (2004) Population set-based global optimization algorithms: some modifications and numer-
ical studies. Comput Oper Res 31(10):1703–1725
Bin X, Jie C, Zhi-Hong P, Feng P (2010) An adaptive hybrid optimizer based on particle swarm and differential
evolution for global optimization. Sci China Inf Sci 53(5):980–989
Chaoshun L, Jianzhong Z (2011) Parameters identification of hydraulic turbine governing system using im-
proved gravitational search algorithm. Energy Convers Manag 52(1):374–381
Clerc M, Kennedy J (2002) The particle swarm—explosion, stability, and convergence in a multidimensional
complex space. IEEE Trans Evol Comput 6(1):58–73
Corne D, Dorigo M, Glover F (1999) New ideas in optimization. McGraw-Hill, USA
Das S, Suganthan P (2009) Differential evolution: a survey of the state-of-the-art. IEEE Trans Evol Comput
15(1):4–31
Das S, Mukhopadhyay A, Roy A, Abraham A, Panigrahi BK (2011) Exploratory power of the Harmony search
algorithm: analysis and improvements for global numerical optimization. IEEE Trans Syst Man Cybern
Part B Cybern 41(1):89–106
Deb K, Pratap A, Agarwal S et al (2002) A fast and elitist multiobjective genetic algorithm: Nsga-ii. IEEE
Trans Evol Comput 6(2):182–197
Del Valle Y, Venayagamoorthy GK, Mohagheghi S, Hernandez JC, Harley RG (2008) Particle swarm optimi-
zation: basic concepts, variants and applications in power systems. IEEE Trans Evol Comput 12(2):171–
195
Dorigo M, Maniezzo V, Colorni A (1996) Ant system: optimization by a colony of cooperating agents. IEEE
Trans Syst Man Cybern Part B Cybern 26(1):29–41
Dorigo M, Bonabeau E, Theraulaz G (2000) Ant algorithms and stigmergy. Future Gener Comput Syst
16(8):851–871
Dorigo M, Trianni V, Sahin E et al (2004) Evolving self-organizing behaviors for a swarm-bot. Auton Robots
17(2–3):223–245
Duman S, Guvenc U, Yorukeren N (2010) Gravitational search algorithm for economic dispatch with valve-
point effects. Int Rev Electr Eng Iree 5(6):2890–2895
Eberhart RC, Shi Y (2001) Tracking and optimizing dynamic systems with particle swarms. In: Proceedings
of IEEE congress on evolutionary computation vol 1, pp 94–100
Esmat R, Hossein NP, Saeid S (2010) Bgsa: binary gravitational search algorithm. Nat Comput 9(3):727–745
Esmat R, Hossien NP, Saeid S (2011) Filter modeling using gravitational search algorithm. Eng Appl Artif
Intell 24(1):117–122
Fei K, Junjie L, Qing X (2009) Structural inverse analysis by hybrid simplex artificial bee colony algorithms.
Comput Struct 87(13–14):861–870
Ferrante N, Ville T (2010) Recent advances in differential evolution: a survey and experimental analysis. Artif
Intell Rev 33(1–2):61–106
Geem ZW, Kim JH, Loganathan G (2001) A new heuristic optimization algorithm: Harmony search. Simu-
lation 76(2):60–68
Haupt R (1995) Comparison between genetic and gradient-based optimization algorithms for solving electro-
magnetics problems. IEEE Trans Magn 31(3):1932–1935
Horst R, Pardalos PM, Thoai NV (2000) Introduction to global optimization. Kluwer Academic Publishers,
Dordrecht, The Netherlands
Janez B, Borko B, Saso G et al (2007) Performance comparison of self-adaptive and adaptive differential
evolution algorithms. Soft Comput 11(7):617–629
Juang C (2004) A hybrid of genetic algorithm and particle swarm optimization for recurrent network design.
IEEE Trans Syst Man Cybern Part B Cybern 34(2):997–1006
Kaelo P, Ali MM (2006) A numerical study of some modified differential evolution algorithms. Eur J Oper
Res 169(3):1176–1184
Karaboga D, Akay B (2009a) A comparative study of artificial bee colony algorithm. Appl Math Comput
214(12):108–132
Karaboga D, Akay B (2009b) A survey: algorithms simulating bee swarm intelligence. Artif Intell Rev 31(1–
4):61–85
Karaboga D, Basturk B (2007a) Artificial bee colony (abc) optimization algorithm for solving constrained
optimization problems. Lecture Notes Comput Sci 4529:789–798
Karaboga D, Basturk B (2007b) A powerful and efficient algorithm for numerical function optimization:
artificial bee colony (abc) algorithm. J Glob Optim 39(3):459–471
Karaboga N (2009) A new design method based on artificial bee colony algorithm for digital iir filters. J Frankl
Inst Eng Appl Math 346(4):328–348
Lee K, Geem ZW (2004) A new structural optimization method based on the Harmony search algorithm.
Comput Struct 82(9–10):781–798
Lee K, Geem ZW (2005) A new meta-heuristic algorithm for continuous engineering optimization: Harmony
search theory and practice. Comput Methods Appl Mech Eng 194(36–38):3902–3933
Liu J, Lampinen J (2005) A fuzzy adaptive differential evolution algorithm. Soft Comput 9(6):448–462
Mahamed GO, Mehrdad M (2008) Global-best Harmony search. Appl Math Comput 198(2):643–656
Mahdavi M, Fesanghary M, Damangir E (2007) An improved Harmony search algorithm for solving optimi-
zation problems. Appl Math Comput 188(2):1567–1579
Martinoli A, Easton K, Agassounon W (2004) Modeling swarm robotic systems: a case study in collaborative
distributed manipulation. Int J Robot Res 23(4-5):415–436
Mersha AG, Dempe S (2011) Direct search algorithm for bilevel programming problems. Comput Optim Appl
49(1):1–15
Nowak W, Cirpka OA (2004) A modified levenberg-marquardt algorithm for quasi-linear geostatistical
inversing. Adv Water Resour 27(7):737–750
Ong YS, Lim MH, Zhu N et al (2006) Classification of adaptive memetic algorithms: a comparative study.
IEEE Trans Syst Man Cybern Part B Cybern 36(1):141–152
Price K, Storn R (1997) Differential evolution. Dr Dobbs J 22(4):18–24
Price K, Storn R, Lampinen J (2005) Differential evolution: a practical approach to global optimization.
Springer, Berlin, Germany
Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) Gsa: a gravitational search algorithm. Inf Sci
179(13):2232–2248
Ratnaweera A, Halgamuge SK, Watson HC (2004) Self-organizing hierarchical particle swarm optimizer with
time-varying acceleration coefficients. IEEE Trans Evol Comput 8(3):240–255
Shahryar R, Hamid RT, Magdy MAS (2008) Opposition-based differential evolution. IEEE Trans Evol Com-
put 12(1):64–79
Sousa T, Silva A, Neves A (2004) Particle swarm based data mining algorithms for classification tasks. Comput
Optim Appl 30(5–6):767–783
Storn R (1999) System design by constraint adaptation and differential evolution. IEEE Trans Evol Comput
3(1):22–34
Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over
continuous spaces. J Glob Optim 11(4):341–359
Swagatam D, Ajith A, Uday KC et al (2009) Differential evolution using a neighborhood-based mutation
operator. IEEE Trans Evol Comput 13(3):526–553
Tahk MJ, Park MS, Woo HW, Kim HJ (2009) Hessian approximation algorithms for hybrid optimization
methods. Eng Optim 41(7):609–633
Trelea IC (2003) The particle swarm optimization algorithm: convergence analysis and parameter selection.
Inf Process Lett 85(6):317–325
Vesterstrom J, Thomsen R (2004) A comparative study of differential evolution particle swarm optimiza-
tion and evolutionary algorithms on numerical benchmark problems. Congr Evol Comput, CEC2004
2:1980–1987
Yang X (2005) Engineering optimizations via nature-inspired virtual bee algorithms. Lecture Notes Comput
Sci 3562:317–323
Yang X, Deb S (2009) Cuckoo search via Lévy flights. In: World congress on nature and biologically inspired
computing (NaBIC 2009), vol 4. Coimbatore, pp 210–214
Yang XS (2009) Firefly algorithms for multimodal optimization. Lecture Notes Comput Sci 5792:169–178
Yang XS, Deb S (2010) Engineering optimisation by Cuckoo search. Int J Math Modell Numer Optim
1(4):330–343
Yoshida H, Kawata K, Fukuyama Y et al (2000) A particle swarm optimization for reactive power and voltage
control considering voltage security assessment. IEEE Trans Power Syst 15(4):1232–1239
Zhang J, Sanderson A (2009) JADE: adaptive differential evolution with optional external archive. IEEE Trans
Evol Comput 13(5):945–958
Zhang J, Chung H, Lo W (2007) Clustering-based adaptive crossover and mutation probabilities for genetic
algorithms. IEEE Trans Evol Comput 11(3):326–335
Zhu G, Kwong S (2010) Gbest-guided artificial bee colony algorithm for numerical function optimization.
Appl Math Comput 217(7):3166–3173