A Comparative Study On Prominent Nature Inspired
functions and hence identify the most prominent method. The section also includes the experimental setup and analysis.

II. PROMINENT NATURE INSPIRED ALGORITHMS

A large number of NIAs have been developed in recent years. We have considered the prominent ones to draw a comparison of their performance in function optimization. The following subsections give a brief description of the NIAs to make the paper self-contained.

A. Genetic Algorithm (GA)

GA is a search and optimization technique developed on the basis of Darwin's principle of Natural Selection [6]. GA belongs to the larger class of evolutionary algorithms (EA). It generates solutions to optimization problems using techniques inspired by natural evolution, such as inheritance, mutation, selection, and crossover [7]. The common features of this algorithm are: populations of chromosomes (i.e., solutions), selection according to fitness, crossover to produce new offspring, and random mutation of new offspring.

In GA, solutions are encoded as genes. Several GA operations [7] are performed to evolve the solutions toward better ones. Among various encoding techniques, value encoding is commonly used for function optimization. The evolution process normally starts from a population of randomly generated individuals. After that, the GA operations are repeated [7] through an iterative process, each pass of which is called a generation. In each generation, the fitness of every individual is evaluated; the fitness is usually the value derived from the objective function involved in the optimization problem to be solved. Selection occurs in each generation to emphasize fitter individuals in the population, in the expectation that their offspring will have better fitness. There are different types of selection, such as tournament selection and roulette wheel selection. After selection, crossover is performed on the selected population. The main objective of crossover is to breed new solutions (i.e., offspring) from existing ones. The mutation [6, 9] operation also produces a new solution, by randomly changing an existing one. The best chromosome might be lost while creating a new population by crossover or mutation. For this reason, Elitism [6, 8] is used to keep the best solutions: it copies the best chromosome to the new offspring before crossover and mutation.

B. Particle Swarm Optimization (PSO)

PSO was proposed by Eberhart and Kennedy [10]. The algorithm is built on the idea of swarm intelligence, based on the swarming habits of certain kinds of animals (such as birds and fish). The basic operations of PSO are performed while simultaneously maintaining several candidate solutions in the search space. During each iteration of the algorithm, the fitness of each candidate solution is determined by the objective function being optimized. In PSO, each particle represents a potential solution. At every iteration, each particle moves to a new position (i.e., searches a new point) based on its calculated velocity.

In PSO, each particle updates its position based on a velocity computed from the best solution of the population and its own best solution. The population of particles is distributed uniformly over the multi-dimensional search space of the optimization problem. Equation (1) calculates the velocity of a particle and (2) updates its position:

v_i^{t+1} = w v_i^t + c_1 r_1 (pbest_i - x_i^t) + c_2 r_2 (gbest - x_i^t)    (1)

x_i^{t+1} = x_i^t + v_i^{t+1}    (2)

In (1) the global best location is denoted by gbest, and pbest_i represents the best location ever encountered by particle i. An inertia weight w is included in (1) to avoid the swarm being trapped in a local minimum. Both c_1 and c_2 are learning parameters, and r_1, r_2 are random parameters in the range [0, 1].

C. Artificial Bee Colony (ABC)

The ABC algorithm [12] was developed by Dervis Karaboga in 2005. In the ABC algorithm, three groups of bees are considered: employed bees, onlookers and scouts. An employed bee goes to its food source and visits it. An onlooker bee waits on the dance area to decide which food source to choose. In this algorithm, the number of food sources around the hive is equal to the number of employed bees. An employed bee becomes a scout when its food source is exhausted by the employed and onlooker bees. A scout bee carries out a random search.

In ABC, the position of a food source represents a possible solution to the problem, and the amount of nectar of the source represents its quality (fitness). In the population, the number of employed bees (or onlooker bees) is equal to the number of solutions. Initially, solutions are generated randomly. An onlooker bee chooses a food source i according to the probability p_i associated with that food source, calculated by the expression in (3):

p_i = fit_i / Σ_{n=1}^{N} fit_n    (3)

where fit_i represents the fitness value of the i-th solution, proportional to the nectar amount of the food source at position x_i, and N is the number of food sources. With a view to producing a candidate food position from the old one, ABC uses the expression shown in (4):

v_{i,j} = x_{i,j} + ψ_{i,j} (x_{i,j} - x_{k,j})    (4)

In (4), k and j are chosen randomly, and j ∈ {1, 2, ..., M} denotes the dimension index. In this algorithm k is determined randomly, but it needs to be different from i. ψ_{i,j} is a random number in the range [-1, 1]. It not only controls the production of neighbor food sources around x_{i,j}, but also represents the comparison of two food positions. A food source is abandoned if its position cannot be improved further within a predetermined number of cycles. This predetermined number of cycles is a major control parameter called the "limit", which is used for abandonment. If the abandoned food source is x_i and j ∈ {1, 2, ..., M}, then the scout discovers a new food source to replace x_i, as expressed in (5):

x_i^j = x_min^j + rand[0, 1] (x_max^j - x_min^j)    (5)
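As an illustrative sketch only (not code from the paper), the onlooker selection probability in (3) and the neighbor search in (4) can be written in Python as follows; the function names, the sphere objective and all parameter values are our assumptions:

```python
import random

def onlooker_probabilities(fits):
    """Selection probabilities from Eq. (3): p_i = fit_i / sum_n fit_n."""
    total = sum(fits)
    return [f / total for f in fits]

def abc_neighbor_search(foods, i):
    """Candidate food position from Eq. (4):
    v[i][j] = x[i][j] + psi * (x[i][j] - x[k][j]), psi ~ U(-1, 1), k != i."""
    dims = len(foods[i])
    j = random.randrange(dims)                                   # random dimension j
    k = random.choice([m for m in range(len(foods)) if m != i])  # partner source k != i
    psi = random.uniform(-1.0, 1.0)
    candidate = list(foods[i])
    candidate[j] = foods[i][j] + psi * (foods[i][j] - foods[k][j])
    return candidate

# Toy usage: four random 2-D food sources scored on a sphere objective
# (the objective and the population size are illustrative assumptions).
random.seed(1)
foods = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(4)]
fits = [1.0 / (1.0 + sum(x * x for x in xs)) for xs in foods]
probs = onlooker_probabilities(fits)       # sums to 1 by construction
candidate = abc_neighbor_search(foods, 0)  # a greedy accept/reject step would follow
```

In a full ABC loop, the candidate would replace the old source only if its fitness is better, and a per-source trial counter compared against the "limit" would trigger the scout step of (5).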
D. Firefly Algorithm (FFA)

FFA was developed in 2009 by Xin-She Yang [13]. Fireflies produce short and rhythmic flashes. For a particular species, the pattern of flashes is often unique. The rhythmic flash, the rate of flashing and the amount of time between flashes form part of the signal system that helps to bring male and female fireflies together. Within the same species, females respond to a male's unique pattern of flashing.

There are three idealized rules in FFA: i) all fireflies are unisex, so that one firefly will be attracted to other fireflies regardless of their sex; ii) attractiveness is proportional to brightness, so that the less bright one will move towards the brighter one; and iii) the brightness of a firefly is affected or determined by the landscape of the objective function. The brightness I of a firefly at a particular location X can be chosen as I(X) ∝ F(X). The attractiveness β is relative: it is judged by the other fireflies and will therefore vary with the distance r_{ij} between firefly i and firefly j. A firefly's attractiveness β is proportional to the light intensity seen by the adjacent fireflies and is defined by β = β_0 e^{-γ r^2}. The distance between any two fireflies i and j, at positions x_i and x_j, in Cartesian form is

r_{ij} = ||x_i - x_j|| = sqrt( Σ_{k=1}^{d} (x_{i,k} - x_{j,k})^2 )    (6)

Here, x_{i,k} is the k-th component of the spatial coordinate x_i of the i-th firefly. In the 2-D case the distance is r_{ij} = sqrt((x_i - x_j)^2 + (y_i - y_j)^2). The movement of a firefly i that is attracted to another, more attractive (brighter) firefly j is determined by (7):

x_i^{t+1} = (1 - β) x_i^t + β x_j^t + α (rand - 1/2)    (7)

The parameter α determines the maximum radius of the random step in (7), and β determines the step size towards the better solution.

E. Cuckoo Search Optimization (CSO)

CSO was invented by Xin-She Yang and Suash Deb in 2009 [14]. The algorithm is inspired by the brood parasitism behavior seen in some species of cuckoo, which lay their eggs in other birds' nests. When the host bird finds out, it will either throw away the cuckoo's egg or simply abandon the nest and build a new one. Naturally, the cuckoo's eggs hatch slightly earlier than the host's eggs. When the first cuckoo chick hatches, its first instinctive action is to evict the host eggs: it blindly propels them out of the nest.

The basic rules of CSO [14] are: firstly, each cuckoo lays one egg at a time and dumps it in a randomly chosen nest; secondly, the best nests with the best quality of eggs are carried over to the next generation; and thirdly, the number of host nests is fixed, and the probability that a host bird discovers a cuckoo's egg is p_a ∈ (0, 1). The host bird can throw away the egg or build a new nest. An egg in a nest represents a solution to the problem, and at each time step (8) is used to generate new solutions (cuckoo eggs):

x_i^{t+1} = x_i^t + α ⊕ Lévy(λ)    (8)

The Lévy flight essentially provides a random walk whose random step length is drawn from a Lévy distribution with a heavy tail. In fact, Lévy flights have been observed in the foraging patterns of albatrosses, fruit flies and spider monkeys [14].

F. Group Search Optimization (GSO)

GSO is an optimization technique [15] developed by S. He et al. in 2009, inspired by animal searching behavior. Animal searching behavior may be described as an active movement through which an animal can find resources such as food, mates and nesting sites. One major consequence of living together is the group searching strategy, which allows group members to increase their patch finding rates [16]. This has led to the adoption of two foraging strategies within groups: 1) producing, e.g., searching for food; and 2) scrounging, e.g., joining resources uncovered by others. The second one is also referred to as conspecific attraction or kleptoparasitism [17].

In the GSO algorithm, the population is called a group and each individual animal is called a member. In an n-dimensional search space, the i-th member at the k-th searching bout has a current position x_i^k ∈ R^n and a head angle φ_i^k = (φ_{i1}^k, ..., φ_{i(n-1)}^k). The search direction of the i-th member is a unit vector D_i^k(φ_i^k) = (d_{i1}^k, ..., d_{in}^k), which is obtained from a polar-to-Cartesian coordinate transformation [18]:

d_{i1}^k = Π_{q=1}^{n-1} cos(φ_{iq}^k)

d_{ij}^k = sin(φ_{i(j-1)}^k) · Π_{q=j}^{n-1} cos(φ_{iq}^k)    (j = 2, ..., n-1)

d_{in}^k = sin(φ_{i(n-1)}^k)    (9)

Practically, in a 3-D search space, if at the k-th searching bout the i-th member's head angle is φ_i^k = (π/3, π/4), then the search direction unit vector is D_i^k = (√2/4, √6/4, √2/2). In each iteration, the group member located in the most promising area, i.e., the one with the best fitness value, is chosen as the producer. It then stops and scans the environment to seek resources, that is, the optima. At the k-th iteration the producer x_p scans at zero degrees using (10), and then scans laterally by randomly sampling two other points, one to the right according to (11) and one to the left according to (12) [20]:

x_z = x_p^k + r_1 l_max D_p^k(φ^k)    (10)

x_r = x_p^k + r_1 l_max D_p^k(φ^k + r_2 θ_max / 2)    (11)

x_l = x_p^k + r_1 l_max D_p^k(φ^k - r_2 θ_max / 2)    (12)

Here r_1 is a normally distributed random number with mean 0 and standard deviation 1, and r_2 is a random number sequence in the range (0, 1). If the resource at the best point among the three is better than that at the producer's current position, the producer will fly to that point; otherwise it will stay in its present position and turn its head to a new, randomly generated angle.
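The polar-to-Cartesian transformation in (9) and the three scanning samples in (10)-(12) can be sketched in Python as follows; this is our own illustrative naming (not the paper's code), with l_max and theta_max treated as free parameters:

```python
import math
import random

def search_direction(phi):
    """Eq. (9): map a head angle phi (n-1 angles) to an n-dimensional
    unit search direction via the polar-to-Cartesian transformation."""
    n = len(phi) + 1
    d = [math.prod(math.cos(p) for p in phi)]                     # d_1
    for j in range(2, n):                                         # d_2 .. d_{n-1}
        d.append(math.sin(phi[j - 2]) *
                 math.prod(math.cos(phi[q]) for q in range(j - 1, n - 1)))
    d.append(math.sin(phi[-1]))                                   # d_n
    return d

def producer_scan(x_p, phi, l_max, theta_max):
    """Eqs. (10)-(12): the producer samples one point at zero degrees and
    one point each to the right and left of its current heading."""
    r1 = random.gauss(0.0, 1.0)                # r_1 ~ N(0, 1)
    r2 = [random.random() for _ in phi]        # r_2 uniform in (0, 1)
    def point(angles):
        return [x + r1 * l_max * d for x, d in zip(x_p, search_direction(angles))]
    zero = point(phi)                                              # Eq. (10)
    right = point([p + r * theta_max / 2 for p, r in zip(phi, r2)])  # Eq. (11)
    left = point([p - r * theta_max / 2 for p, r in zip(phi, r2)])   # Eq. (12)
    return zero, right, left

# Usage: the 3-D example from the text, phi = (pi/3, pi/4),
# yields the unit direction (sqrt(2)/4, sqrt(6)/4, sqrt(2)/2).
d = search_direction([math.pi / 3, math.pi / 4])
```

By construction the direction returned by search_direction has unit length, so l_max alone bounds the scan distance.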
TABLE 2: COMPARISON AMONG THE NIAS ON THE BASIS OF FITNESS ACHIEVED FOR THE BENCHMARK FUNCTIONS.
F fmin GA PSO ABC FFA CSO GSO GWO
F1 0 15.7926 4.24E-17 0.000133253 0.00284646 0.0525012 1.05E-08 0
F2 0 1.28491 1.31E-09 5.06E-05 0.217751 3.98673 3.68E-05 3.07E-243
F3 0 4609.02 0.894981 18695.8 968.129 612.874 30.6306 3.42E-216
F4 0 447.935 17.4573 226.237 27.5278 58.1186 22.4814 25.0925
F5 0 18 0 0 0 1 0 0
F6 0 0.0335914 0.0149869 0.0372523 0.0224415 0.0211475 0.0163996 4.49E-05
F7 0 2.63916 1.69E-16 8.83E-06 0.0483066 0.00545648 2.60E-09 0
F8 -12569.5 -12532.7 -8304.92 -6037.71 -7772.46 -9219.58 -12569.5 -8164.71
F9 0 7.54775 32.8336 167.403 24.543 77.0304 2.01486 0
F10 0 2.05604 2.93E-09 0.00439897 0.0218787 5.39665 7.38E-05 7.99E-15
F11 0 1.19676 0 0.00511547 0.00234851 0.146045 4.38E-08 0
F12 0 0.0808211 7.15E-20 6.67666 0.000144563 0.99701 6.34E-11 0.0130216
F13 0 1.06799 3.82E-18 10.5903 0.00129743 0.673448 9.46E-10 1.01E-06
F14 1 0.99801 0.998004 0.998004 1.99203 0.998004 0.998004 0.998004
F15 -1.031 -1.0315 -1.03163 -1.03163 -1.03163 -1.03163 -1.03163 -1.03163
F16 0.398 0.398013 0.397887 0.397887 0.397887 0.397887 0.397887 0.397887
F17 3 3.00366 3 3 3 3 3 3
F18 -3.86 -0.289072 -0.300479 -0.300479 -0.300479 -0.300479 -0.300033 -0.300479
F19 -3.32 -3.32183 -3.322 -3.322 -3.322 -3.322 -3.322 -3.32199
F20 -10.1532 -10.0993 -10.1532 -10.1532 -10.1532 -10.1532 -10.1532 -10.1532
F21 -10.4028 -9.70687 -10.4029 -10.4029 -10.4029 -10.4029 -10.4029 -10.4029
F22 -10.5363 -10.5246 -10.5364 -10.5364 -10.5364 -10.5364 -10.5364 -10.5363
Optimal Count          0        6        7        6        6        8        11
Best Count among NIAs  2        13       9        8        8        10       17
Figure 1. Effect of Population and Iteration on Schwefel 1.2: (a) fitness varying population (iteration = 1000); (b) fitness varying iteration (population = 100), for PSO, ABC, GWO, FFA, GSO, CSO and GA.

Figure 2. Effect of Population and Iteration on Rastrigin: (a) fitness varying population (iteration = 1000); (b) fitness varying iteration (population = 100), for PSO, ABC, GWO, FFA, GSO, CSO and GA.
the other hand, GWO, GSO and CSO are shown to perform well with a relatively small population compared to the others. Under iteration variation, ABC and CSO perform very poorly for small iteration counts with respect to the other NIAs, while GWO and GSO are found to perform well with fewer iterations. Finally, GWO is able to solve a function effectively with a small population as well as few iterations.

REFERENCES

[1] Mathematical optimization. Available: https://en.wikipedia.org/wiki/Mathematical_optimization. [Accessed: 10-Apr-2016].
[2] Optimization problem. Available: https://en.wikipedia.org/wiki/Optimization_problem. [Accessed: 10-Apr-2016].
[3] S. K. Pal, C. S. Rai, and A. P. Singh, "Comparative Study of Firefly Algorithm and Particle Swarm Optimization for Noisy Non-Linear Optimization Problems", International Journal on Intelligent Systems and Applications, 2012, 10, pp. 50-57. DOI: 10.5815/ijisa.2012.10.06.
[4] J. Vesterstrøm and R. Thomsen, "A comparative study of differential evolution, particle swarm optimization, and evolutionary algorithms on numerical benchmark problems", in Proc. Congress on Evolutionary Computation (CEC 2004), vol. 2, IEEE, 2004.
[5] M. N. Ab Wahab, S. Nefti-Meziani, and A. Atyabi, "A Comprehensive Review of Swarm Optimization Algorithms", PLoS ONE, 10(5): e0122827, 2015. doi:10.1371/journal.pone.0122827.
[6] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning. New York: Addison-Wesley, 1989.
[7] Genetic algorithm. Available: https://en.wikipedia.org/wiki/Genetic_algorithm. [Accessed: 12-Apr-2016].
[8] J. H. Holland, "Genetic Algorithms", Scientific American, July 1992.
[9] M. Mitchell, An Introduction to Genetic Algorithms. ISBN 0-262-13316-4 (HB), 0-262-63185-7 (PB), 1996.
[10] J. Kennedy and R. C. Eberhart, "Particle swarm optimization", in Proc. 1995 IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, 1995.
[11] James Blodin, Particle Swarm Optimization: A Tutorial, Sep. 4, 2009.
[12] D. Karaboga, "An Idea Based on Honey Bee Swarm for Numerical Optimization", Technical Report TR06, Erciyes University, Engineering Faculty, Computer Engineering Department, 2005.
[13] X.-S. Yang, "Firefly algorithms for multimodal optimization", in Stochastic Algorithms: Foundations and Applications, SAGA 2009, Lecture Notes in Computer Science, vol. 5792, pp. 169-178, 2009.
[14] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights", in Proc. World Congress on Nature & Biologically Inspired Computing, pp. 210-214, 2009.
[15] S. He, Q. H. Wu, and J. R. Saunders, "Group Search Optimizer: An Optimization Algorithm Inspired by Animal Searching Behavior", IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, October 2009.
[16] H. R. Pulliam and G. E. Millikan, "Social organization in the non-reproductive season", Animal Behavior, vol. 6, pp. 169-197, 1983.
[17] J. Brockmann and C. J. Barnard, "Kleptoparasitism in birds", Animal Behavior, vol. 27, pp. 546-555, 1979.
[18] D. Mustard, "Numerical integration over the n-dimensional spherical shell", Math. Comput., vol. 18, no. 88, pp. 578-589, Oct. 1964.
[19] J. W. Bell, Searching Behavior: The Behavioral Ecology of Finding Resources (Chapman and Hall Animal Behavior Series). London, U.K.: Chapman and Hall, 1990.
[20] W. J. O'Brien, B. I. Evans, and G. L. Howick, "A new view of the predation cycle of a planktivorous fish, white crappie (Pomoxis annularis)", Canadian J. Fisheries Aquatic Sci., vol. 43, pp. 1894-1899, 1986.
[21] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey Wolf Optimizer", Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[22] C. Muro, R. Escobedo, L. Spector, and R. Coppinger, "Wolf-pack (Canis lupus) hunting strategies emerge from simple rules in computational simulations", Behavioural Processes, vol. 88, pp. 192-197, 2011.