Firefly Algorithm
Xin-She Yang
School of Science and Technology,
Middlesex University, The Burroughs, London NW4 4BT, UK.
Xingshi He
School of Science, Xi’an Polytechnic University,
No. 19 Jinhua South Road, Xi’an 710048, P. R. China.
Abstract
Nature-inspired metaheuristic algorithms, especially those based on swarm intelligence, have attracted much attention in the last ten years. The firefly algorithm appeared about five years ago, and its literature has since expanded dramatically with diverse applications. In this paper, we briefly review the fundamentals of the firefly algorithm together with a selection of recent publications. We then discuss the optimality associated with balancing exploration and exploitation, which is essential for all metaheuristic algorithms. By comparing with the intermittent search strategy, we conclude that metaheuristics such as the firefly algorithm are better than the optimal intermittent search strategy. We also analyse algorithms and their implications for higher-dimensional optimisation problems.
1 Introduction
Metaheuristic algorithms form an important part of contemporary global optimisation al-
gorithms, computational intelligence and soft computing. These algorithms are usually
nature-inspired with multiple interacting agents. A subset of metaheuristics is often referred to as swarm intelligence (SI) based algorithms, and these SI-based algorithms have been developed by mimicking the so-called swarm intelligence characteristics of biological agents such as birds, fish, humans and others. For example, particle swarm optimisation
was based on the swarming behaviour of birds and fish [24], while the firefly algorithm was
based on the flashing pattern of tropical fireflies [32, 33] and cuckoo search algorithm was
inspired by the brood parasitism of some cuckoo species [37].
In the last two decades, more than a dozen new algorithms such as particle swarm
optimisation, differential evolution, bat algorithm, firefly algorithm and cuckoo search have
appeared and they have shown great potential in solving tough engineering optimisation
problems [32, 5, 16, 27, 34, 35, 17]. Among these new algorithms, it has been shown that the firefly algorithm is very efficient in dealing with multimodal, global optimisation problems.
In this paper, we will first outline the fundamentals of firefly algorithm (FA), and then
review the latest developments concerning FA and its variants. We also highlight the reasons
why FA is so efficient. Furthermore, as the balance of exploration and exploitation is important to all metaheuristic algorithms, we will then discuss the optimality related to the search landscape and algorithms. Using the intermittent search strategy and numerical experiments, we show that the firefly algorithm is significantly more efficient than the intermittent search strategy.
• Fireflies are unisex so that one firefly will be attracted to other fireflies regardless of
their sex.
• The attractiveness is proportional to the brightness, and both decrease as the distance between two fireflies increases. Thus, for any two flashing fireflies, the less bright one will move towards the brighter one. If no firefly is brighter than a particular firefly, it will move randomly.
• The brightness of a firefly is determined by the landscape of the objective function.
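These three rules lead to the standard firefly position update used in [32, 33]: a firefly i moves towards a brighter firefly j according to x_i ← x_i + β0 exp(−γ r_ij²)(x_j − x_i) + α ε_i, where ε_i is a random vector and α decays over the iterations. The sketch below is our own illustrative Python implementation of that scheme, not the authors' original code; the function name, defaults and bound handling are our assumptions:

```python
import numpy as np

def firefly_minimise(f, bounds, n=25, iters=100, beta0=1.0, gamma=1.0,
                     alpha0=0.01, delta=0.97, seed=0):
    """Minimise f over a box; a sketch of the standard firefly algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    scale = hi - lo
    x = lo + rng.random((n, len(lo))) * scale      # rule 1: random unisex swarm
    light = np.array([f(xi) for xi in x])          # rule 3: brightness from objective
    alpha = alpha0 * scale.mean()                  # alpha_0 = 0.01 L (see Sec. 2)
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:            # rule 2: move towards brighter
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] = x[i] + beta * (x[j] - x[i]) \
                         + alpha * (rng.random(len(lo)) - 0.5)
                    x[i] = np.clip(x[i], lo, hi)
                    light[i] = f(x[i])
        alpha *= delta                             # cool the randomness
    best = int(np.argmin(light))
    return x[best], light[best]
```

For example, minimising the sphere function f(x) = Σ x_i² over [−2, 2]² with a moderate γ drives the best firefly towards the origin; the brightest firefly never moves, so the best objective value found is non-increasing over the run.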
where α0 is the initial randomness scaling factor, and δ is essentially a cooling factor. For
most applications, we can use δ = 0.95 to 0.97 [32].
Regarding the initial α0, simulations show that FA will be more efficient if α0 is associated with the scalings of the design variables. Let L be the average scale of the problem of interest; then we can set α0 = 0.01L initially. The factor 0.01 comes from the fact that a random walk requires a number of steps to reach the target while balancing local exploitation without jumping too far in just a few steps [33, 34].
The parameter β0 controls the attractiveness, and parametric studies suggest that β0 = 1 can be used for most applications. However, γ should also be related to the scaling L. In general, we can set γ = 1/√L. If the scaling variations are not significant, then we can set γ = O(1).
For most applications, we can use the population size n = 15 to 100, though the best
range is n = 25 to 40 [32, 33].
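The recipe above (α0 = 0.01L, δ ≈ 0.95 to 0.97, β0 = 1, γ = 1/√L, n ≈ 25 to 40) can be collected into a small helper. This is a sketch with a hypothetical function name of our own, not part of any published FA code:

```python
import math

def fa_parameters(lower, upper, delta=0.97, n=25):
    """Suggested FA settings derived from the average problem scale L (a sketch)."""
    L = sum(u - l for l, u in zip(lower, upper)) / len(lower)  # average scale
    return {
        "alpha0": 0.01 * L,           # initial randomness: alpha_0 = 0.01 L
        "delta": delta,               # cooling factor, typically 0.95 to 0.97
        "beta0": 1.0,                 # attractiveness at zero distance
        "gamma": 1.0 / math.sqrt(L),  # gamma = 1/sqrt(L)
        "n": n,                       # population size, best range 25 to 40
    }
```

For a problem with all variables in [−10, 10], this gives L = 20, so α0 = 0.2 and γ = 1/√20 ≈ 0.22.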
Classification and clustering form another important application area of FA, with excellent performance [31, 28]. For example, Senthilnath et al. provided an extensive performance study by comparing FA with 11 different algorithms and concluded that the firefly algorithm can be used efficiently for clustering [31]. In most cases, the firefly algorithm outperformed all 11 other algorithms. In addition, the firefly algorithm has also been applied to train neural networks [25].
For optimisation in dynamic environments, FA can also be very efficient as shown by
Farahani et al. [13, 14] and Abshouri et al. [1].
Genetic algorithms required 25412 ± 1237 evaluations to reach an accuracy of 10−5 of the optimal solution, while PSO needed 17040 ± 1123 evaluations. For FA, we achieved the same accuracy with 5657 ± 730 evaluations. This saves about 78% and 67% of the computational cost, compared to GA and PSO, respectively.
For Yang’s forest function
\[
f(\mathbf{x}) = \Big(\sum_{i=1}^{d} |x_i|\Big)\exp\Big[-\sum_{i=1}^{d}\sin(x_i^2)\Big], \quad -2\pi \le x_i \le 2\pi, \tag{5}
\]
GA required 37079 ± 8920 evaluations with a success rate of 88% for d = 16, and PSO required 19725 ± 3204 evaluations with a success rate of 98%. FA obtained a 100% success rate with just 5152 ± 2493 evaluations. Compared with GA and PSO, FA saved about 86% and 74%, respectively, of the overall computational effort.
As an example for automatic subdivision, we now use the FA to find the global maxima
of the following function
\[
f(\mathbf{x}) = \Big(\sum_{i=1}^{d} |x_i|\Big)\exp\Big(-\sum_{i=1}^{d} x_i^2\Big), \tag{6}
\]
with the domain −10 ≤ xi ≤ 10 for all i = 1, 2, ..., d, where d is the number of dimensions. This function has multiple global optima [36]. In the case of d = 2, we have four equal maxima f∗ = 1/√e ≈ 0.6065 at (1/2, 1/2), (1/2, −1/2), (−1/2, 1/2) and (−1/2, −1/2), and a unique global minimum at (0, 0).
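The two benchmarks (5) and (6) are straightforward to code; the sketch below (our own helper names) also checks the stated maximum value f∗ = 1/√e at the four symmetric points:

```python
import numpy as np

def forest(x):
    """Yang's forest function, Eq. (5): (sum |x_i|) * exp(-sum sin(x_i^2))."""
    x = np.asarray(x, float)
    return np.sum(np.abs(x)) * np.exp(-np.sum(np.sin(x ** 2)))

def four_peak(x):
    """Function (6): (sum |x_i|) * exp(-sum x_i^2); four equal maxima for d = 2."""
    x = np.asarray(x, float)
    return np.sum(np.abs(x)) * np.exp(-np.sum(x ** 2))

# Each of the four symmetric points attains f* = 1/sqrt(e) ~ 0.6065,
# e.g. (1/2 + 1/2) * exp(-(1/4 + 1/4)) = exp(-1/2).
for p in [(0.5, 0.5), (0.5, -0.5), (-0.5, 0.5), (-0.5, -0.5)]:
    assert abs(four_peak(p) - 1 / np.sqrt(np.e)) < 1e-12
```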
In the 2D case, this four-peak function is shown in Fig. 1, and these global maxima can be found using the implemented firefly algorithm after about 500 function evaluations.
Figure 2: Initial locations of 25 fireflies (left) and their final locations after 20 iterations
(right).
4 Search Optimality
4.1 Intensification versus Diversification
The main components of any metaheuristic algorithm are intensification and diversification, or exploitation and exploration [9, 39]. Diversification means generating diverse solutions so as to explore the search space on a global scale, while intensification means focusing the search on a local region by exploiting the information that a current good solution has been found in this region. This works in combination with the selection of the best solutions.
Exploration in metaheuristics is often achieved by the use of randomization [9, 32, 33], which enables an algorithm to jump out of any local optimum so as to search globally. Randomization can also be used for local search around the current best solution if the steps are limited to a local region. When the steps are large, randomization can explore the search space on a global scale. Fine-tuning the right amount of randomness and balancing local and global search are crucially important for the performance of any metaheuristic algorithm.
Exploitation is the use of local knowledge of the search and the solutions found so far, so that new search moves can concentrate on the local regions or neighbourhoods where an optimum may lie; however, this local optimum may not be the global optimum. Exploitation tends to use strong local information such as gradients, the shape of the mode such as convexity, and the history of the search process. A classic technique is so-called hill-climbing, which uses local gradients or derivatives intensively.
Empirical knowledge from observations and simulations of the convergence behaviour of common optimisation algorithms suggests that exploitation tends to increase the speed of convergence, while exploration tends to decrease the convergence rate of the algorithm. On the other hand, more exploration increases the probability of finding the global optimum, while strong exploitation tends to make the algorithm become trapped in a local optimum. Therefore, there is a fine balance between the right amount of exploration and the right degree of exploitation. Despite its importance, there is no practical guideline for this balance.
Assuming that the search steps have a uniform velocity u at each step on average, the minimum times required for each phase can be estimated as
\[
\tau_a^{\min} \approx \frac{D\,\ln^2(b/a)}{2u^2\,[2\ln(b/a)-1]}, \tag{10}
\]
and
\[
\tau_b^{\min} \approx \frac{a}{u}\sqrt{\ln(b/a)-\frac{1}{2}}. \tag{11}
\]
When u → ∞, these relationships lead to the above optimal ratio of the two stages. It is worth pointing out that the above result is only valid for 2D cases; there are no general results for higher dimensions, except for some special 3D cases [7]. Now let us use these limited results to help choose possible values of the algorithm-dependent parameters in the firefly algorithm [32, 33], as an example.
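As a numerical illustration of (10) and (11), take the 2D values used later in Section 5 (b = 20, a = π/2, u = 1) together with D = s²/2 for a step length s = a. These particular numbers are our own worked example, not from the intermittent-search literature:

```python
import math

def tau_a_min(D, u, a, b):
    """Eq. (10): minimum time for the slow (exploitation) phase."""
    L = math.log(b / a)
    return D * L ** 2 / (2 * u ** 2 * (2 * L - 1))

def tau_b_min(u, a, b):
    """Eq. (11): minimum time for the fast (exploration) phase."""
    return (a / u) * math.sqrt(math.log(b / a) - 0.5)

a, b, u = math.pi / 2, 20.0, 1.0
D = a ** 2 / 2                    # D ~ s^2/2 with step length s = a (our assumption)
print(tau_a_min(D, u, a, b), tau_b_min(u, a, b))
```

Under these assumptions the ratio τa/τb² comes out close to 0.19, in line with the 0.15 to 0.24 range reported later in the paper.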
For higher-dimensional problems, no result exists. One possible extension is to use extrapolation to get an estimate. Based on the results for the 2D and 3D cases [8], we can estimate that for any d-dimensional case with d ≥ 3,
\[
\frac{\tau_1}{\tau_2^2} \sim O\Big(\frac{D}{a^2}\Big), \qquad \tau_m \sim O\Big(\frac{b}{u}\Big(\frac{b}{a}\Big)^{d-1}\Big), \tag{12}
\]
where τm is the mean search time or average number of iterations. This extension may not be good news for higher-dimensional problems, as the mean number of function evaluations needed to find the optimal solution can increase exponentially as the dimensionality increases. However, in practice, we do not always need to find the guaranteed global optimum: we may be satisfied with a suboptimal solution, and sometimes we may be ‘lucky’ enough to find the global optimum even within a limited/fixed number of iterations. This may indicate that there is a huge gap between theoretical understanding and the observed run-time behaviour in practice. More studies are highly needed to address these important issues.
5 Numerical Experiments
5.1 Landscape-Based Optimality: A 2D Example
If we use 2D simple, isotropic random walks for local exploration to demonstrate landscape-based optimality, then we have
\[
D \approx \frac{s^2}{2}, \tag{13}
\]
where s is the step length of a jump during a unit time interval or each iteration step. From equation (9), the optimal ratio of exploitation to exploration in the special case of b ≈ 10a becomes
\[
\frac{\tau_a}{\tau_b^2} \approx 0.2. \tag{14}
\]
In the case b/a → ∞, we have τa/τb² ≈ 1/8, which implies that more time should be spent on the exploration stage. It is worth pointing out that the naive guess of a 50-50 split between the two stages is not the best choice. More effort should be focused on exploration so that the best solutions found by the algorithm can be globally optimal with possibly the least computing effort. However, this conclusion may be implicitly linked to the implicit assumption that the optimal solutions or search targets are multimodal. Obviously, for a unimodal problem, once we know its modality, we should focus more on exploitation to obtain quick convergence.

Table 1: Variations of Q and its effect on the solution quality.
Q      0.4       0.3       0.2       0.1       0.05
fmin   9.4e-11   1.2e-12   2.9e-14   8.1e-12   9.2e-11
In the case studies described below, we have used the firefly algorithm to find the optimal solutions to the benchmarks. If we set τb = 1 as the reference timescale, then we found that the optimal ratio is between 0.15 and 0.24, which is roughly close to the above theoretical result.
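The values quoted around (14) can be reproduced directly from (10) and (11). With the assumptions D = s²/2 and s = a (our choice, made so the limiting value matches the 1/8 quoted in the text), the ratio reduces to τa/τb² = ln²(b/a) / (8 [ln(b/a) − 1/2]²):

```python
import math

def exploitation_exploration_ratio(b_over_a):
    """tau_a^min / (tau_b^min)^2 from Eqs. (10)-(11), with D = a^2/2 (s = a), u = 1."""
    L = math.log(b_over_a)
    return L ** 2 / (8 * (L - 0.5) ** 2)

print(exploitation_exploration_ratio(10.0))   # b ~ 10 a: close to 0.2
print(exploitation_exploration_ratio(1e12))   # b/a -> infinity: tends to 1/8
```

This confirms the ~0.2 figure for b ≈ 10a and the 1/8 limit as b/a grows.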
which is multimodal with many local peaks and valleys. It has a unique global minimum fmin = 0 at (π, π, ..., π) in the domain −20 ≤ xi ≤ 20, where i = 1, 2, ..., d and β = 15. In this case, we can estimate that R = 20 and a ≈ π/2; this means that R/a ≈ 12.7, and in the case of d = 2 we have
\[
p_e \approx \tau_{\rm optimal} \approx \frac{1}{2[2-1/\ln(R/a)]^2} \approx 0.19. \tag{16}
\]
This indicates that the algorithm should spend 80% of its computational effort on global explorative search, and 20% of its effort on local intensive search.
For the firefly algorithm, we have used n = 15 and 1000 iterations. We have calculated the ratio of iterations/function evaluations for exploitation to exploration, that is, Q = exploitation/exploration; thus Q may affect the quality of solutions. A set of 25 numerical experiments has been carried out for each value of Q, and the results are summarized in Table 1.
This table clearly shows that Q ≈ 0.2 provides the optimal balance of local exploitation and global exploration, which is consistent with the theoretical estimation.
Though there are no direct analytical results for higher dimensions, we can expect that the emphasis on global exploration also holds for higher-dimensional optimisation problems. Let us study this test function for various higher dimensions.
Figure 3: Comparison of the actual number of iterations with the theoretical results of the intermittent search strategy. This clearly shows that the firefly algorithm is better than the intermittent search strategy.
\[
\tau_m\big|_{d=2} = \frac{2b^2}{au}\sqrt{\ln(b/a)}, \tag{18}
\]
\[
\tau_m\big|_{d=3} = 2.2\,\frac{b}{u}\Big(\frac{b}{a}\Big)^2. \tag{19}
\]
For higher dimensions, we can only estimate the main trend based on the intermittent search strategy. That is,
\[
\frac{\tau_1}{\tau_2^2} \sim O\Big(\frac{D}{a^2}\Big), \qquad \tau_m \sim O\Big(\frac{b}{u}\Big(\frac{b}{a}\Big)^{d-1}\Big), \tag{20}
\]
which means that the number of iterations may increase exponentially with the dimension d. It is worth pointing out that the optimal ratio between the two stages should be independent of the dimensionality. In other words, once we find the optimal balance between exploration and exploitation, we can use the algorithm for any number of dimensions.
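The theoretical mean search times in (18) and (19), and the trend in (20), can be evaluated for the setting used below (b = 20, a = π/2, u = 1) to see the exponential growth with dimension. This is our own numerical illustration; for d > 3 only the order-of-magnitude trend is meaningful:

```python
import math

def tau_m(d, b=20.0, a=math.pi / 2, u=1.0):
    """Theoretical mean search time: Eq. (18) for d = 2, Eq. (19) for d = 3,
    and the O((b/u)(b/a)^(d-1)) trend of Eq. (20) for d > 3."""
    if d == 2:
        return 2 * b ** 2 / (a * u) * math.sqrt(math.log(b / a))
    if d == 3:
        return 2.2 * (b / u) * (b / a) ** 2
    return (b / u) * (b / a) ** (d - 1)   # order-of-magnitude trend only

for d in range(2, 11):
    print(d, tau_m(d))
```

Each additional dimension multiplies the estimate by roughly b/a ≈ 12.7, which is the exponential blow-up that the firefly algorithm avoids in Fig. 3.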
Now let us use the firefly algorithm to carry out the search in higher dimensions for the above standing wave function and compare its performance with the implications of the intermittent search strategy. For the case of b = 20, a = π/2 and u = 1, Fig. 3 shows the comparison of the number of iterations suggested by the intermittent search strategy and the actual number of iterations used by the firefly algorithm to obtain the globally optimal solution with a tolerance or accuracy of 5 decimal places. It can be seen clearly that the number of iterations needed by the intermittent search strategy increases exponentially with the number of dimensions, while the actual number of iterations used by the algorithm increases only slightly, seemingly following a low-order polynomial. This suggests that the firefly algorithm is very efficient and requires far fewer function evaluations, often by many orders of magnitude.
6 Conclusions
Nature-inspired metaheuristic algorithms have gained popularity, which is partly due to their ability to deal with nonlinear global optimisation problems. We have reviewed the fundamentals of the firefly algorithm and the latest developments with diverse applications. At the time of writing, a quick Google search suggests that there have been about 323 papers on firefly algorithms since 2008. This review can only cover a fraction of the literature. There is no doubt that the firefly algorithm will be applied to more challenging problems in the near future, and its literature will continue to expand.
On the other hand, we have also highlighted the importance of exploitation and exploration and their effect on the efficiency of an algorithm. We then used intermittent search strategy theory as a preliminary basis for analysing these key components and for finding possibly optimal settings for the algorithm-dependent parameters.
With such insight, we have used the firefly algorithm to find this optimal balance,
and confirmed that firefly algorithm can indeed provide a good balance of exploitation
and exploration. We have also shown that firefly algorithm requires far fewer function
evaluations. However, the huge differences between intermittent search theory and the
behaviour of metaheuristics in practice also suggest there is still a huge gap between our
understanding of algorithms and the actual behaviour of metaheuristics. More studies in
metaheuristics are highly needed.
It is worth pointing out that there are two types of optimality here. One concerns which types of problems a given algorithm can solve best. This is relatively easy to answer, because in principle we can test an algorithm on a wide range of problems and then select the types of problem the algorithm of interest solves best. On the other hand, the other type of optimality concerns which algorithm can find the solutions to a given problem most efficiently. In principle, we can compare a set of algorithms on the same optimisation problem and hope to find the best algorithm(s). In reality, there may be no such algorithm at all, and all the tested algorithms may perform poorly. The search for new algorithms may take substantial research effort.
The theoretical understanding of metaheuristics is still lagging behind. In fact, there is a huge gap between theory and applications. Though theory lags behind, applications in contrast are very diverse and active, with thousands of papers appearing each year. Furthermore, there is another huge gap between small-scale and large-scale problems. As most published studies have focused on small, toy problems, there is no guarantee that a methodology that works well for such toy problems will work for large-scale problems. All these issues remain unresolved, both in theory and in practice.
As further research topics, most metaheuristic algorithms require good modifications in order to solve combinatorial optimisation problems properly. Though there is great interest and many extensive studies, more work is highly needed in the area of combinatorial optimisation using metaheuristic algorithms. In addition, as most current metaheuristic research has focused on small-scale problems, it would be extremely useful if further research could focus on large-scale real-world applications.
References
[1] A. A. Abshouri, M. R. Meybodi and A. Bakhtiary, New firefly algorithm based on
multiswarm and learning automata in dynamic environments, Third Int. Conference on
Signal Processing Systems (ICSPS2011), Aug 27-28, Yantai, China, pp. 73-77 (2011).
[2] Sina K. Azad, Saeid K. Azad, Optimum Design of Structures Using an Improved
Firefly Algorithm, International Journal of Optimisation in Civil Engineering, 1(2),
327-340(2011).
[3] Apostolopoulos T. and Vlachos A., (2011). Application of the Firefly Al-
gorithm for Solving the Economic Emissions Load Dispatch Problem, In-
ternational Journal of Combinatorics, Volume 2011, Article ID 523806.
http://www.hindawi.com/journals/ijct/2011/523806.html
[4] H. Banati and M. Bajaj, Firefly based feature selection approach, Int. J. Computer
Science Issues, 8(2), 473-480 (2011).
[6] B. Basu and G. K. Mahanti, Firefly and artificial bees colony algorithm for synthesis
of scanned and broadside linear array antenna, Progress in Electromagnetic Research
B., 32, 169-190 (2011).
[12] K. Durkota, Implementation of a discrete firefly algorithm for the QAP problem within
the sage framework, BSc thesis, Czech Technical University, (2011).
[15] I. Fister Jr, I. Fister, J. Brest, X. S. Yang, Memetic firefly algorithm for combi-
natorial optimisation, in: Bioinspired Optimisation Methods and Their Applications
(BIOMA2012) edited by B. Filipič and J. Šilc, 24-25 May 2012, Bohinj, Slovenia, pp.
75-86 (2012).
[20] M.-H. Horng, Y.-X. Lee, M.-C. Lee and R.-J. Liou, Firefly metaheuristic algorithm for
training the radial basis function network for data classification and disease diagnosis,
in: Theory and New Applications of Swarm Intelligence (Edited by R. Parpinelli and
H. S. Lopes), pp. 115-132 (2012).
[21] M.-H. Horng, Vector quantization using the firefly algorithm for image compression,
Expert Systems with Applications, 39, pp. 1078-1091 (2012).
[22] M.-H. Horng and R.-J. Liou, Multilevel minimum cross entropy threshold selection
based on the firefly algorithm, Expert Systems with Applications, 38, pp. 14805-14811
(2011).
[23] G. K. Jati and S. Suyanto, Evolutionary discrete firefly algorithm for travelling sales-
man problem, ICAIS2011, Lecture Notes in Artificial Intelligence (LNAI 6943), pp.393-
403 (2011).
[24] J. Kennedy and R. Eberhart, Particle swarm optimisation, in: Proc. of the IEEE Int.
Conf. on Neural Networks, Piscataway, NJ, pp. 1942-1948 (1995).
[28] A. Rajini, V. K. David, A hybrid metaheuristic algorithm for classification using micro
array data, Int. J. Scientific & Engineering Research, 3(2), 1-9 (2012).
[29] B. Rampriya, K. Mahadevan and S. Kannan, Unit commitment in deregulated power system using Lagrangian firefly algorithm, Proc. of IEEE Int. Conf. on Communication Control and Computing Technologies (ICCCCT2010), pp. 389-393 (2010).
[30] Sayadi M. K., Ramezanian R. and Ghaffari-Nasab N., (2010). A discrete firefly
meta-heuristic with local search for makespan minimization in permutation flow shop
scheduling problems, Int. J. of Industrial Engineering Computations, 1, 1–10.
[31] J. Senthilnath, S. N. Omkar, V. Mani, Clustering using firefly algorithm: performance study, Swarm and Evolutionary Computation, 1(3), 164-171 (2011).
[32] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, UK, (2008).
[33] X. S. Yang, Firefly algorithms for multimodal optimisation, Proc. 5th Symposium
on Stochastic Algorithms, Foundations and Applications, (Eds. O. Watanabe and T.
Zeugmann), Lecture Notes in Computer Science, 5792: 169-178 (2009).
[34] X. S. Yang, Engineering Optimisation: An Introduction with Metaheuristic Applica-
tions, John Wiley and Sons, USA (2010).
[35] X. S. Yang, A new metaheuristic bat-inspired algorithm, in: Nature Inspired Cooper-
ative Strategies for Optimisation (NICSO 2010) (Eds. J. R. Gonzalez et al.), Springer,
SCI Vol. 284, 65-74 (2010).
[36] X. S. Yang, Firefly algorithm, stochastic test functions and design optimisation, Int.
J. Bio-Inspired Computation, 2(2), 78-84 (2010).
[37] X. S. Yang and S. Deb, Cuckoo search via Lévy flights, Proceedings of World Congress on Nature & Biologically Inspired Computing (NaBIC 2009, India), IEEE Publications, USA, pp. 210-214 (2009).
[38] X. S. Yang, Chaos-enhanced firefly algorithm with automatic parameter tuning, Int.
J. Swarm Intelligence Research, 2(4), pp. 1-11 (2011).
[39] X. S. Yang, Swarm-based metaheuristic algorithms and no-free-lunch theorems, in:
Theory and New Applications of Swarm Intelligence (Eds. R. Parpinelli and H. S.
Lopes), Intech Open Science, pp. 1-16 (2012).
[40] X. S. Yang, S. Deb and S. Fong, (2011). Accelerated particle swarm optimization and
support vector machine for business optimization and applications, Networked Digital
Technologies (NDT’2011), Communications in Computer and Information Science,
Vol. 136, Part I, pp. 53-66.
[41] X. S. Yang, Multiobjective firefly algorithm for continuous optimization, Engineering with Computers, Online First, DOI: 10.1007/s00366-012-0254-1 (2012).
[42] A. Yousif, A. H. Abdullah, S. M. Nor, A. A. Abdelaziz, Scheduling jobs on grid computing using firefly algorithm, J. Theoretical and Applied Information Technology, 33(2), 155-164 (2011).
[43] M. A. Zaman and M. A. Matin, Nonuniformly spaced linear antenna array design
using firefly algorithm, Int. J. Microwave Science and Technology, Vol. 2012, Article
ID: 256759, (8 pages), 2012. doi:10.1155/2012/256759