Evans BAIDOO*
FIREWORKS ALGORITHM
FOR UNCONSTRAINED
FUNCTION OPTIMIZATION PROBLEMS
Abstract
Many modern real-world science and engineering problems can be classified
as multi-objective optimisation problems, which demand expedient
and efficient stochastic algorithms to respond to the optimisation needs.
This paper presents an object-oriented software application that implements
a fireworks optimisation algorithm for function optimisation problems.
The algorithm, a kind of parallel diffuse optimisation algorithm, is based
on the explosive phenomenon of fireworks. When tested on five standard
benchmark problems, the algorithm presented promising results compared
with other population-based or iterative meta-heuristic algorithms.
The software application was implemented in Java with an interactive
interface that allows easy modification and extended experimentation.
Additionally, this paper examines the effect of runtime on the algorithm's
performance.
1. INTRODUCTION
* Kwame Nkrumah University of Science and Technology, Department of Computer Science, PMB, KNUST, Ghana, Email: [email protected]
In such instances, the constrained function optimisation problem is converted into
an unconstrained one by designing special operators and penalty functions
so that the feasibility of the solution is maintained at all times.
There is a wide range of mathematical programming algorithms offering
various techniques for handling optimisation problems such as numerical,
discrete or combinatorial optimisation problems, but these methods often
fail to return satisfactory results. In operations research, bio-inspired
optimisation algorithms have become very popular as an alternative to the
mathematical programming methods. Over the last decade, researchers have
increasingly turned to nature-inspired heuristics. These algorithms centre on
the cooperative, intelligent behaviour of animal groups, insects, bee or ant
colonies and their problem-solving abilities. Swarm systems, as they are mostly
referred to, have many advantages in finding solutions to optimisation problems
(Bonabeau, Dorigo & Theraulaz, 1999). Applied in many technical fields, such as
data mining, signal processing, network routing and pattern recognition, this
class of random search algorithm simulates biological population evolution and
hence solves complex stochastic optimisation problems through the cooperation
of individuals and competition among species (Yuan, de Oca, Birattari
& Stutzle, 2012). Typical swarm intelligence algorithms include the ant
colony optimisation (ACO) algorithm (Chandra et al., 2012), the artificial bee
colony (ABC) algorithm (Karaboga & Basturk, 2007), the particle swarm
optimisation (PSO) algorithm (Kennedy & Eberhart, 1995), and the genetic
algorithm (GA) (Tang, Man, Kwong & He, 1996) inspired by the Darwinian law.
Among these SI algorithms, PSO is one of the most widely accepted for probing
optimal locations in a search space of arbitrary dimension.
A new swarm intelligence algorithm that aroused worldwide interest in 2010
with excellent optimisation performance was proposed by Ying Tan.
Inspired by the emergent swarm behaviour of fireworks, it is referred to
as the Fireworks Algorithm and follows the tradition of swarm intelligence
(Tan & Zhu, 2010).
In this paper, an object-oriented implementation of the fireworks algorithm
is tested on unconstrained function optimisation problems. A software program
was developed in Java to solve the function optimisation problem, test the results
on benchmark functions, and assess the robustness and performance of the system.
The remainder of the paper is organised as follows: Section 2 introduces the
Fireworks algorithm, followed in Section 3 by a brief explanation of five
standard unconstrained benchmark functions. Section 4 details the software
implementation of the fireworks algorithm, and Section 5 presents the
simulation experiments and results. Finally, the paper concludes with Section 6.
2. FIREWORKS ALGORITHM
2.1. Background
The Fireworks Algorithm (FA) is a novel swarm intelligence algorithm that has,
over the past decade, been employed to solve global optimisation problems,
although surveys of the literature show relatively few implementations.
Proposed by Tan and Zhu, the algorithm mimics the explosion activity
and behaviour of fireworks. Its first implementation demonstrated superior
performance over Standard PSO and Clonal PSO (Tan & Zhu, 2010). Zheng et al. (2012)
put forward a hybrid fireworks-differential evolution (FWA-DE) algorithm
that uses the crossover, mutation and selection operators of the differential
evolution algorithm. The enhanced fireworks algorithm (EFWA) presented
by Zheng et al. (2013) improved the minimal explosion amplitude check, the
mapping rules and the spark selection strategy for the explosion sparks. In 2014,
a hybrid approach combining FWA and differential mutation (FWA-DM)
was formulated; having been successfully tested on the CEC 2014
benchmark functions, it proved to be a good solution. Another proposed variant
is the dynamic search fireworks algorithm (DFWA) (Zheng et al., 2014).
This algorithm works by dividing the fireworks population into core fireworks
with the best fitness value and non-core fireworks, a strategy that enables the
algorithm to undertake local search and global search efficiently and effectively.
As put forward by Ding et al. (2013), the parallelised GPU-based Fireworks
Algorithm (GPU-FWA) proficiently exploits the graphics processing unit (GPU)
to solve large-scale problems. Li et al. (2014) proposed the adaptive fireworks
algorithm (AFWA) to carry out self-tuning of the blast radius: the distance
between the best individual and a selected individual is measured and set as the
next blast radius of the best individual. With this adaptive step-size adjustment,
the improved FWA shows good optimisation performance.
The focal source of inspiration for the FA is the process of setting off
a firework. Whenever a firework is set off, a shower of sparks fills the local
space around it. Tan observes that the explosion process of a firework can be
viewed as a search in the local space around a specific point, namely the point
at which the firework is launched, through the sparks created in the explosion
(Tan & Zhu, 2010). Firework explosions reveal two specific behaviours.
A well-made firework produces numerous sparks, and the sparks tend to be
concentrated around the centre of the explosion. Such a firework typically lies
in an area of the search space that looks promising and may be close to the optimal
solution; it is therefore desirable to produce enough sparks to locate the best spot
in the region around the firework. In contrast, a poor-quality firework demonstrates
divergent behaviour: the sparks generated are relatively few and scattered
in the local space. This behaviour indicates that the best solution
to the problem is far away from the location of the firework, and for that reason
the radius of the search should be larger.
Two important elements of the FA are the number of sparks and the amplitude
of the explosion. To illustrate the first element, assume that the FA is applied
to a minimisation problem. The number of sparks generated by each firework
$x_i$ is defined as

$$S_i = m \cdot \frac{y_{\max} - f(x_i) + \xi}{\sum_{i=1}^{n} \left( y_{\max} - f(x_i) \right) + \xi}, \qquad (2)$$
where m is a parameter that controls the total number of sparks
generated by the n fireworks, $y_{\max} = \max(f(x_i))$ $(i = 1, 2, \dots, n)$ is the
worst value of the objective function among the n fireworks, and $\xi$ is the smallest
machine constant, used to avoid division by zero. Following the experiments
of Tan and Zhu (2010), $S_i$ needs to be bounded so as to avoid overwhelming
effects from exceptionally good or poor firework explosions; bounds on $S_i$ are
therefore defined as expressed in (3):
$$\hat{S}_i = \begin{cases} \mathrm{round}(a \cdot m) & \text{if } S_i < a \cdot m \\ \mathrm{round}(b \cdot m) & \text{if } S_i > b \cdot m \\ \mathrm{round}(S_i) & \text{otherwise,} \end{cases} \qquad a < b < 1. \qquad (3)$$
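As a rough illustration (not the eFireworks source), Eqs. (2) and (3) might be coded in Java as follows; the method name and the use of Double.MIN_VALUE for ξ are assumptions of this sketch.

    // Illustrative sketch: bounded spark counts per Eqs. (2)-(3).
    static int[] sparkCounts(double[] fitness, int m, double a, double b) {
        int n = fitness.length;
        double yMax = Double.NEGATIVE_INFINITY;
        for (double f : fitness) yMax = Math.max(yMax, f);        // worst objective value among the n fireworks
        double eps = Double.MIN_VALUE;                            // smallest machine constant, avoids division by zero
        double denom = eps;
        for (double f : fitness) denom += yMax - f;               // denominator of Eq. (2)
        int[] s = new int[n];
        for (int i = 0; i < n; i++) {
            double si = m * (yMax - fitness[i] + eps) / denom;    // Eq. (2)
            if (si < a * m)      s[i] = (int) Math.round(a * m);  // lower bound of Eq. (3)
            else if (si > b * m) s[i] = (int) Math.round(b * m);  // upper bound of Eq. (3)
            else                 s[i] = (int) Math.round(si);
        }
        return s;
    }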
The amplitude of explosion for each firework is defined analogously as

$$A_i = \hat{A} \cdot \frac{f(x_i) - y_{\min} + \xi}{\sum_{i=1}^{n} \left( f(x_i) - y_{\min} \right) + \xi}, \qquad (4)$$
where 𝐴̂ refers to the value of the highest explosion amplitude with 𝑦𝑚𝑖𝑛 = 𝑚𝑖𝑛
(𝑓(𝑥𝑖 ))( 𝑖 = 1, 2, … , 𝑛) representing best firework of the target function in n
fireworks. In the course of an explosion, the z direction (dimension) of sparks is
affected. The number of randomly affected directions is obtained by
65
where d denotes the optimisation problem number dimension of location x,
with 𝜒 being a uniform distribution of a random number ranging from 0 and 1.
With the aim of determining the 𝑥𝑖 firework location of a spark, a spark location
𝑥𝑗 is first generated. The entire process is made known in pseudo-code 1.
Pseudo-code 1
  Initialise the spark's location: $x^j = x_i$
  Choose z random dimensions of $x^j$ according to Eq. (5)
  Compute the displacement: $h = A_i \cdot \sigma$
  for each chosen dimension $x^j_k$ of $x^j$ do
    $x^j_k = x^j_k + h$
    if $x^j_k < x_k^{\min}$ or $x^j_k > x_k^{\max}$ then
      map $x^j_k$ to the feasible space:
      $x^j_k = x_k^{\min} + |x^j_k| \% (x_k^{\max} - x_k^{\min})$
    end if
  end for
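A minimal Java sketch of Pseudo-code 1 might look as follows; the method name is hypothetical, σ is assumed here to be uniform in [−1, 1], and a dimension may be picked more than once in this simplified version.

    // Illustrative sketch of Pseudo-code 1: generate one explosion spark from firework xi.
    static double[] explosionSpark(double[] xi, double amplitude,
                                   double[] xMin, double[] xMax, java.util.Random rnd) {
        int d = xi.length;
        double[] spark = xi.clone();                               // start from the firework's location
        int z = (int) Math.round(d * rnd.nextDouble());            // Eq. (5): number of affected dimensions
        double h = amplitude * (2 * rnd.nextDouble() - 1);         // displacement, sigma assumed uniform in [-1, 1]
        for (int c = 0; c < z; c++) {
            int k = rnd.nextInt(d);                                // choose a random dimension
            spark[k] += h;
            if (spark[k] < xMin[k] || spark[k] > xMax[k]) {        // map back to the feasible space
                spark[k] = xMin[k] + Math.abs(spark[k]) % (xMax[k] - xMin[k]);
            }
        }
        return spark;
    }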
Pseudo-code 2
  Initialise the spark's location: $\hat{x}^j = x_i$
  Choose z random dimensions of $\hat{x}^j$ according to Eq. (5)
  Compute the coefficient of the Gaussian explosion: $g = \mathrm{Gaussian}(1, 1)$
  for each chosen dimension $\hat{x}^j_k$ of $\hat{x}^j$ do
    $\hat{x}^j_k = \hat{x}^j_k \cdot g$
    if $\hat{x}^j_k < x_k^{\min}$ or $\hat{x}^j_k > x_k^{\max}$ then
      map $\hat{x}^j_k$ to the feasible space:
      $\hat{x}^j_k = x_k^{\min} + |\hat{x}^j_k| \% (x_k^{\max} - x_k^{\min})$
    end if
  end for
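The Gaussian spark of Pseudo-code 2 can be sketched in the same style; again the names are illustrative and not taken from eFireworks.

    // Illustrative sketch of Pseudo-code 2: generate one Gaussian spark from firework xi.
    static double[] gaussianSpark(double[] xi, double[] xMin, double[] xMax, java.util.Random rnd) {
        int d = xi.length;
        double[] spark = xi.clone();
        int z = (int) Math.round(d * rnd.nextDouble());            // Eq. (5)
        double g = 1.0 + rnd.nextGaussian();                       // Gaussian(1, 1) coefficient
        for (int c = 0; c < z; c++) {
            int k = rnd.nextInt(d);
            spark[k] *= g;                                         // scale the chosen coordinate
            if (spark[k] < xMin[k] || spark[k] > xMax[k]) {        // map back to the feasible space
                spark[k] = xMin[k] + Math.abs(spark[k]) % (xMax[k] - xMin[k]);
            }
        }
        return spark;
    }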
At the start of every iteration, n firework locations are chosen. The best
position $x^*$, according to the best objective value $f(x^*)$, is always kept and
carried over to the next iteration. The remaining n−1 locations are then chosen
based on their distance to the other locations, in order to maintain the diversity
of sparks. The distance between a location and the other locations is generally
determined in the FA as

$$R(x_i) = \sum_{j \in K} d(x_i, x_j) = \sum_{j \in K} \lVert x_i - x_j \rVert, \qquad (6)$$

where K is the set of all current locations of both fireworks and sparks.
The probability of selecting location $x_i$ can then be expressed as in (7):
$$P(x_i) = \frac{R(x_i)}{\sum_{j \in K} R(x_j)}, \qquad (7)$$
where $P(x_i)$ denotes the probability that location $x_i$ will be chosen.
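As a rough illustration of Eqs. (6) and (7), the distance-based roulette-wheel selection of one location could be coded as follows; this is a sketch with hypothetical names, not the eFireworks implementation.

    // Illustrative sketch: roulette-wheel selection of a location using Eqs. (6)-(7).
    static int selectByDistance(java.util.List<double[]> locations, java.util.Random rnd) {
        int k = locations.size();
        double[] r = new double[k];
        double total = 0.0;
        for (int i = 0; i < k; i++) {                              // R(x_i): summed Euclidean distance to all locations
            for (int j = 0; j < k; j++) {
                double[] a = locations.get(i), b = locations.get(j);
                double dist = 0.0;
                for (int c = 0; c < a.length; c++) dist += (a[c] - b[c]) * (a[c] - b[c]);
                r[i] += Math.sqrt(dist);
            }
            total += r[i];
        }
        double pick = rnd.nextDouble() * total;                    // P(x_i) = R(x_i) / sum_j R(x_j), Eq. (7)
        double cum = 0.0;
        for (int i = 0; i < k; i++) {
            cum += r[i];
            if (pick <= cum) return i;
        }
        return k - 1;
    }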
Putting it all together, Pseudo-code 3 illustrates the overall structure of the
Fireworks algorithm in a high-level description. In each generation, the FA
performs about $n + m + \hat{m}$ function evaluations. Assuming the optimum
of the objective function can be identified within T generations, the complexity
of the FA is $O(T(n + m + \hat{m}))$. A further illustration of the behaviour
of the algorithm is given in Figure 1, taken from James McCaffrey (2016).
Fig. 1. Fireworks optimization algorithm
In summary, Pseudo-codes 1 and 2 present the two kinds of sparks generated
in each iteration. For the first kind, the number of sparks and the explosion
amplitude depend largely on the quality of the firework. This is in sharp contrast
to the other kind of spark, created using the Gaussian explosion process, which
searches the local Gaussian space around a firework. Subsequently, once the
two kinds of spark positions have been obtained, the n positions for the
succeeding explosion are chosen. Pseudo-code 3 puts together the overall
structure of the FA.
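To make the overall structure concrete, the following Java sketch outlines one plausible shape of the main loop behind Pseudo-code 3, reusing the illustrative helpers above (sparkCounts, explosionSpark, gaussianSpark, selectByDistance); it is a sketch based on the FA as described by Tan and Zhu (2010), not the eFireworks source.

    // Illustrative sketch of the overall FA loop (cf. Pseudo-code 3).
    static double[] runFireworks(java.util.function.ToDoubleFunction<double[]> f,
                                 double[] xMin, double[] xMax, int n, int m, int mHat,
                                 double a, double b, double aHat, int generations,
                                 java.util.Random rnd) {
        int d = xMin.length;
        double[][] fw = new double[n][d];
        for (double[] x : fw)                                      // random initial fireworks
            for (int k = 0; k < d; k++) x[k] = xMin[k] + rnd.nextDouble() * (xMax[k] - xMin[k]);
        double[] best = fw[0].clone();
        double bestVal = f.applyAsDouble(best);
        for (int gen = 0; gen < generations; gen++) {
            double[] fit = new double[n];
            double yMin = Double.POSITIVE_INFINITY, denom = Double.MIN_VALUE;
            for (int i = 0; i < n; i++) { fit[i] = f.applyAsDouble(fw[i]); yMin = Math.min(yMin, fit[i]); }
            for (int i = 0; i < n; i++) denom += fit[i] - yMin;
            int[] s = sparkCounts(fit, m, a, b);                   // Eqs. (2)-(3)
            java.util.List<double[]> pool = new java.util.ArrayList<>();
            for (int i = 0; i < n; i++) {
                pool.add(fw[i]);
                double amp = aHat * (fit[i] - yMin + Double.MIN_VALUE) / denom;   // Eq. (4)
                for (int j = 0; j < s[i]; j++) pool.add(explosionSpark(fw[i], amp, xMin, xMax, rnd));
            }
            for (int j = 0; j < mHat; j++)                         // m-hat special Gaussian sparks
                pool.add(gaussianSpark(fw[rnd.nextInt(n)], xMin, xMax, rnd));
            for (double[] x : pool) {                              // keep the best location found so far
                double v = f.applyAsDouble(x);
                if (v < bestVal) { bestVal = v; best = x.clone(); }
            }
            fw[0] = best.clone();                                  // elitism: best position carried over
            for (int i = 1; i < n; i++)                            // remaining n-1 by distance-based selection
                fw[i] = pool.get(selectByDistance(pool, rnd)).clone();
        }
        return best;
    }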
3. BENCHMARK FUNCTIONS
Numerous benchmark functions have been reported in the literature over the years,
yet there is no single accepted standard list. This paper experiments with a rich
set of five popular benchmark functions with varied characteristics in terms of
valley landscape, separability and modality in order to assess the character of the
adopted solution algorithm, in this case to check the correctness, robustness and
overall performance of the implemented program, as adopted from the Virtual Library
of Simulation Experiments: "Test Functions and Datasets" (2016) and Bacanin et al.
(2014). The unconstrained function optimisation problems used in this paper
are of the form
$$\min_{x} f(x)$$

subject to: $x \in \omega$, $\quad x_j^l \le x_j \le x_j^u$, $\; j = 1, \dots, n$,
where f(x) represents the objective function to be minimised, x is the
continuous vector variable on the domain $\omega \subset \mathbb{R}^n$, and
$f(x): \omega \to \mathbb{R}$ is a continuous real-valued function. The lower
and upper bounds defined for each function dimension determine $\omega$.
The first is the Dixon-Price test function, $f_1$. It is a continuous,
non-separable and multimodal minimisation test function whose search domain lies
in $-10 \le x_i \le 10$, $i = 1, 2, \dots, n$, with its global minimum value of 0.
Second is the Griewank function, $f_2$, which has its global minimum value of 0
and an initialisation range of [−600, 600]. It is a continuous and
differentiable function whose corresponding global optimum solution is
$x_{opt} = (x_1, x_2, \dots, x_n) = (100, 100, \dots, 100)$. Although multimodal,
its multimodality diminishes at high dimensionality (n > 30), so that it
appears unimodal.
Third on the list is the Rosenbrock function, $f_3$. It is a well-known classical
optimisation problem: the two-dimensional form of the function describes a deep
valley with a parabolic shape $x_2 = x_1^2$ that leads to the global
minimum. Owing to the non-linearity of the valley, many algorithms converge
slowly because they change the direction of the search constantly, and for this
reason the problem has been used repeatedly to assess the performance of
gradient-based optimisation algorithms. The valley function is unimodal, with an
initialisation interval of [−30, 30].
The fourth test function is the Schwefel function, $f_4$. This function is
complex, with many local minima, and its initialisation range is [−500, 500].
The surface of the Schwefel function is made up of a large number of peaks
and valleys. It is a deceptive function which possesses two global minima,
geometrically far from the next-best local minima over the parameter space;
a search algorithm is therefore susceptible to converging in the wrong direction.
Its global minima lie at $x^* = \pm[\pi(0.5 + k)]^2$, with $f_4(x^*) = -418.983$.
The difficulty with this test function is that its gradient does not align with the
coordinate axes, owing to the epistasis between its variables; for this reason,
most algorithms that make use of the gradient converge slowly.
The Sphere function concludes the benchmark set. This function is separable,
scalable, continuous and unimodal. Its interval range lies in
$0 \le x_i \le 10$, with its global minimum value of 0
and optimum solution $x_{opt} = (x_1, x_2, \dots, x_n) = (0, 0, \dots, 0)$.
The expressions of the functions, together with their initialisation intervals,
are given in Table 1.
Tab. 1. Benchmark functions expression and initialisation
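As an illustration of how such functions can be coded, the following Java sketch implements the Sphere and Griewank functions from their standard textbook definitions (cf. the Virtual Library of Simulation Experiments); these are illustrative versions rather than the eFireworks code.

    // Illustrative Java versions of two of the benchmarks, using their standard forms.
    static double sphere(double[] x) {                     // f5: sum of squares, global minimum 0 at the origin
        double sum = 0.0;
        for (double xi : x) sum += xi * xi;
        return sum;
    }

    static double griewank(double[] x) {                   // f2: global minimum value 0
        double sum = 0.0, prod = 1.0;
        for (int i = 0; i < x.length; i++) {
            sum += x[i] * x[i] / 4000.0;
            prod *= Math.cos(x[i] / Math.sqrt(i + 1));
        }
        return sum - prod + 1.0;
    }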
4. eFIREWORKS IMPLEMENTATION
The software application relies heavily on tightly coupled abstract classes and
inheritance (a minimal sketch of this kind of design is given below). This concept
enables easy adaptation to new function problems and future extension of the
program. eFireworks was developed using Java 1.7.0_25 and NetBeans 7.1.2 on
a 64-bit Windows 8 operating system. Figure 2 shows a screenshot of the main
graphical user interface (GUI) of eFireworks.
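A minimal sketch of the kind of abstract-class design described above, using hypothetical class names rather than the actual eFireworks classes, might look like this:

    // Hypothetical sketch of the abstract-class design (names are illustrative).
    abstract class BenchmarkFunction {
        final double lowerBound, upperBound;               // initialisation interval of the function
        BenchmarkFunction(double lower, double upper) {
            this.lowerBound = lower;
            this.upperBound = upper;
        }
        abstract double evaluate(double[] x);              // objective value to be minimised
    }

    class RosenbrockFunction extends BenchmarkFunction {
        RosenbrockFunction() { super(-30.0, 30.0); }       // initialisation interval given for f3
        @Override double evaluate(double[] x) {
            double sum = 0.0;
            for (int i = 0; i < x.length - 1; i++)
                sum += 100.0 * Math.pow(x[i + 1] - x[i] * x[i], 2) + Math.pow(1.0 - x[i], 2);
            return sum;
        }
    }

New benchmark problems can then be added by subclassing the abstract function class, which is the extendibility property described above.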
5. SIMULATION EXPERIMENTS AND RESULTS
The parameter settings used in the experiments include the total number of regular
sparks, m = 50, the number of special Gaussian sparks, m̂ = 5, the amplitude,
A = 40, the maximum spark bound of a firework, b = 0.8, and the minimum spark
bound of a firework, a = 0.04. The results of the test runs are shown in Table 2
for a runtime of 10 and in Table 3 for a runtime of 30.
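Using the illustrative runFireworks sketch from Section 2, these settings would translate into a call such as the one below; the number of fireworks, the problem dimension and the generation count are placeholders, not values reported in the paper.

    // Hypothetical invocation with the reported parameter values; n, the dimension
    // and the generation count are placeholders only.
    java.util.Random rnd = new java.util.Random();
    double[] lo = new double[30], hi = new double[30];
    java.util.Arrays.fill(lo, -600.0);                     // Griewank initialisation range
    java.util.Arrays.fill(hi, 600.0);
    double[] best = runFireworks(x -> griewank(x), lo, hi,
            5,      // n: number of fireworks (placeholder)
            50,     // m: total number of regular sparks
            5,      // m-hat: special Gaussian sparks
            0.04,   // a: minimum spark bound
            0.8,    // b: maximum spark bound
            40.0,   // A: explosion amplitude
            1000,   // generations (placeholder)
            rnd);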
As can be seen from Tables 2 and 3, the optimisation obtained with the Fireworks
algorithm yields satisfactory results for all of the listed benchmark problems.
Comparison can be drawn with other well-known heuristic or bio-inspired
algorithms and software systems, such as those of Karaboga and Basturk (2007)
and Bacanin (2014). The algorithm proves to be robust in its operation and
produces near-optimal results.
Comparing Table 2 and Table 3, it can be deduced that as the number of runs
is increased, the results obtained tend to be slightly better. It can therefore be
concluded that the performance of the Fireworks algorithm is only marginally
affected by changing the number of runs; the effect may be disregarded, since
the deviation in the presented results is tiny.
6. CONCLUSIONS
REFERENCES
Bacanin, N., Tuba, M., & Stanarevic, N. (2012). Artificial Fish Swarm Algorithm for Unconstrained
Optimization Problems. Applied Mathematics in Electrical and Computer Engineering, 405–410.
Bonabeau, E., Dorigo, M., & Theraulaz, G. (1999). Swarm Intelligence: From Natural to Artificial
Systems. New York: Oxford University Press Inc.
Ding, K., Zheng, S. Q., & Tan, Y. (2013). A GPU-based Parallel Fireworks Algorithm for Optimization.
GECCO '13: Proceedings of the 2013 Genetic and Evolutionary Computation Conference, 9–16.
Karaboga, D., & Basturk, B. (2007). A powerful and efficient algorithm for numerical function
optimization: artificial bee colony (ABC) algorithm. Journal of Global Optimization, 39(3),
459–471. doi:10.1007/s10898-007-9149-x
Kennedy, J., Eberhart, R. C. (1995). Particle swarm optimization. Proceedings of IEEE International
Conference on Neural Networks, 4, 1942–1948.
Li, J., Zheng, S., & Tan, Y. (2014). Adaptive Fireworks Algorithm. 2014 IEEE Congress on
Evolutionary Computation (CEC), 3214–3221. doi:10.1109/CEC.2014.6900418
McCaffrey, J. (2016, September). Fireworks Algorithm Optimization. Retrieved from
https://msdn.microsoft.com/en-us/magazine/dn857364.aspx
Mohan, B. C., & Baskaran, R. (2012). A survey: Ant Colony Optimization based recent research
and implementation on several engineering domain. Expert Systems with Applications, 39(4),
4618–4627. doi:10.1016/j.eswa.2011.09.076
Ren, Y., & Wu, Y. (2013). An efficient algorithm for high-dimensional function optimization. Soft
Computing, 17, 995–1004. doi:10.1007/s00500-013-0984-z
Tan, Y., & Zhu, Y. (2010). Fireworks Algorithm for Optimization. In: Y. Tan, Y. Shi, & K.C. Tan
(Eds.), Advances in Swarm Intelligence. ICSI 2010. Lecture Notes in Computer Science (vol.
6145, pp. 355–364). Springer.
Tang, K. S., Man, K. F., Kwong, S., & He, Q. (1996). Genetic algorithms and their applications.
IEEE Signal Processing Magazine, 13(6), 22–37. doi:10.1109/79.543973
Virtual Library of Simulation Experiments: Test Functions and Datasets (n.d.). Retrieved August,
2016, from https://www.sfu.ca/~ssurjano/optimization.html
Yuan, Z., de Oca, M. A. M., Birattari, M., & Stutzle, T. (2012). Continuous optimization algorithms for
tuning real and integer parameters of swarm intelligence algorithms. Swarm Intelligence, 6(1),
49–75. doi:10.1007/s11721-011-0065-9
Zheng, S. Q., Janecek, A., Li, J. Z., & Tan, Y. (2014). Dynamic Search in Fireworks Algorithm.
2014 IEEE Congress on Evolutionary Computation (CEC), 3222–3229.
Zheng, S., Janecek, A., & Tan, Y. (2013). Enhanced Fireworks Algorithm. 2013 IEEE Congress
on Evolutionary Computation, 2069–2077. doi:10.1109/CEC.2013.6557813
Zheng, Y. J., Xu, X. L., & Ling, H. F. (2012). A hybrid fireworks optimization method with differential
evolution operators. Neurocomputing, 148, 75–80. doi:10.1016/j.neucom.2012.08.075