
ELEKTROTEHNIŠKI VESTNIK 80(1-2): 1–7, 2013

ORIGINAL SCIENTIFIC PAPER

A Hybrid Bat Algorithm

Iztok Fister Jr.1 , Dušan Fister1 , Xin-She Yang2


1
University of Maribor
Faculty of Electrical Engineering and Computer Science, Smetanova 17, 2000 Maribor, Slovenia
E-mail: [email protected]
2
Middlesex University
School of Science and Technology, London NW4 4BT, United Kingdom
E-mail: [email protected]

Abstract. Swarm intelligence is a powerful technique for optimization. In this paper, we present a new swarm-intelligence algorithm based on the bat algorithm, which we hybridize with differential-evolution strategies. The hybridization shows very promising results on standard benchmark functions and significantly improves on the original bat algorithm.
Keywords: swarm intelligence, bat algorithm, differential evolution, optimization

Hibridni algoritem na osnovi obnašanja netopirjev (Slovenian abstract, translated): Swarm intelligence is becoming a very important optimization technique. In this paper we present a new swarm-intelligence algorithm, which is based on the behaviour of bats (the bat algorithm) and is hybridized with differential-evolution strategies. Besides giving very promising results on benchmark functions, the hybridization also markedly improves the original bat algorithm.

1 INTRODUCTION

Nature has always been an inspiration for researchers. In the past, many nature-inspired algorithms have been developed to solve hard optimization problems. In general, two main concepts have been developed in bio-inspired computation:
1) evolutionary algorithms,
2) swarm-intelligence algorithms.

Evolutionary algorithms are optimization techniques [7] based on Darwin's principle of the survival of the fittest [5], which states that in nature the fittest individuals have the greatest chances to survive. Evolutionary algorithms comprise the following disciplines: genetic algorithms, evolution strategies, genetic programming, evolutionary programming, and differential evolution. Although all these methods were developed independently, they share similar characteristics (such as variation and selection operators) when solving problems. In fact, the evolutionary algorithms are distinguished by their representation of solutions. For example, genetic algorithms [15], [16] use a binary representation of solutions, evolution strategies [2], [11] and differential evolution [29], [3], [6] work on real-valued solutions, genetic programming [21] acts on programs in Lisp, while evolutionary programming [13] operates on finite-state automata. Evolutionary algorithms have been applied to a wide range of areas of optimization, modeling, and simulation. Differential evolution, in particular, has successfully been employed in the following areas of optimization: function optimization [28], large-scale global optimization [4], graph coloring [8], and chemical process optimization [1].

Swarm intelligence is the collective behaviour of decentralized, self-organized systems, either natural or artificial. The term was introduced by Beni in 1989, and many algorithms have been proposed since then. Swarm-intelligence algorithms have been applied to continuous as well as combinatorial optimization problems [25]. The best-known classes of swarm-intelligence algorithms are particle swarm optimization, ant colony optimization, artificial bee colony, the firefly algorithm, cuckoo search, and the bat algorithm.

Particle swarm optimization has been successfully applied to problems of antenna design [17] and electromagnetics [27]. Ant colony algorithms have also been used in many areas of optimization [20], [26], [22]. Artificial bee colony has shown good performance in numerical optimization [18], [19], in large-scale global optimization [10], and also in combinatorial optimization [24], [9], [30]. The cuckoo search algorithm is a very strong method for function optimization and for engineering optimization problems [34], [33]. The firefly algorithm has shown promising results in function optimization and good results also in combinatorial optimization [12].
Received 12 April 2013
Accepted 26 April 2013

Echolocation is an important feature of bat behaviour: bats emit a sound pulse and listen to
the echo bouncing back from obstacles whilst flying. This phenomenon inspired Yang [36] to develop the bat algorithm (BA). The algorithm obtains good results on lower-dimensional optimization problems, but may become problematic on higher-dimensional problems because it tends to converge very fast initially. On the other hand, differential evolution [23] is a typical evolutionary algorithm with differential mutation, crossover, and selection that has successfully been applied to continuous function optimization.

In order to improve the behaviour of the bat algorithm on higher-dimensional problems, in this paper the original bat algorithm is hybridized with differential-evolution strategies. The resulting hybrid bat algorithm (HBA) has been tested on a standard set of benchmark functions taken from the literature. Our numerical experiments show that the proposed HBA can significantly improve the performance of the original bat algorithm, which can be very useful in the future.

The structure of the paper is as follows. Section 2 introduces the original bat algorithm together with some biological foundations of bat behaviour, and Section 3 the differential evolution algorithm. Section 4 describes our proposed approach of hybridizing the bat algorithm with differential-evolution strategies. Section 5 presents the experiments and discusses the results. Section 6 concludes the paper with future directions for the development of the HBA.

2 BAT ALGORITHM

The bat algorithm was developed by Xin-She Yang in 2010 [35]. The algorithm exploits the so-called echolocation of bats. Bats use sonar echoes to detect and avoid obstacles. Sound pulses are emitted at a certain frequency and reflect from obstacles; bats can use the time delay from emission to reflection for navigation. They typically emit short, loud sound impulses, at a pulse rate of usually 10 to 20 pulses per second. After a pulse hits an obstacle and reflects, bats transform it into useful information to gauge how far away the prey is. Bats use wavelengths in the range [0.7, 17] mm, corresponding to frequencies of [20, 500] kHz. In an implementation, the pulse frequency and the pulse rate have to be defined. The pulse rate can simply be taken from the range [0, 1], where 0 means no emission and 1 maximum emission [14], [31], [37].

This behaviour can be used to formulate a new bat algorithm. Yang [35] used three generalized rules for bat algorithms:
1) All bats use echolocation to sense distance, and they also 'know' the difference between food/prey and background barriers in some magical way.
2) Bats fly randomly with velocity v_i at position x_i with a fixed frequency f_min, varying wavelength λ and loudness A_0 to search for prey. They can automatically adjust the wavelength (or frequency) of their emitted pulses and adjust the rate of pulse emission r ∈ [0, 1], depending on the proximity of their target.
3) Although the loudness can vary in many ways, we assume that it varies from a large (positive) A_0 to a minimum constant value A_min.

Algorithm 1: Original Bat Algorithm
1: Objective function f(x), x = (x_1, ..., x_d)^T
2: Initialize the bat population x_i and v_i for i = 1 ... n
3: Define pulse frequency Q_i ∈ [Q_min, Q_max]
4: Initialize pulse rates r_i and the loudness A_i
5: while (t < T_max) // number of iterations
6:   Generate new solutions by adjusting frequency, and
7:   updating velocities and locations/solutions [Eq. (1)]
8:   if (rand(0, 1) > r_i)
9:     Select a solution among the best solutions
10:    Generate a local solution around the best solution
11:  end if
12:  Generate a new solution by flying randomly
13:  if (rand(0, 1) < A_i and f(x_i) < f(best))
14:    Accept the new solutions
15:    Increase r_i and reduce A_i
16:  end if
17:  Rank the bats and find the current best
18: end while
19: Postprocess results and visualization

The original bat algorithm is illustrated in Algorithm 1, where the bat behaviour is captured by the fitness function of the problem to be solved. It consists of the following components:
• initialization (lines 2-4),
• generation of new solutions (lines 6-7),
• local search (lines 8-11),
• generation of a new solution by flying randomly (lines 12-16),
• finding the current best solution (line 17).

Initialization of the bat population is performed randomly. New solutions are generated by moving the virtual bats according to the following equations:

  Q_i^(t) = Q_min + (Q_max − Q_min) U(0, 1),
  v_i^(t+1) = v_i^(t) + (x_i^(t) − best) Q_i^(t),        (1)
  x_i^(t+1) = x_i^(t) + v_i^(t+1),

where U(0, 1) is a random number drawn from the uniform distribution. A random walk with direct exploitation is used for the local search, which modifies the current best solution according to the equation

  x^(t) = best + ε A_i^(t) (2 U(0, 1) − 1),        (2)

where ε is a scaling factor and A_i^(t) the loudness. The local search is launched with a proximity depending on the pulse rate r_i.
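The moves in Eqs. (1) and (2) translate directly into code. The following is a minimal Python sketch, not the authors' implementation (which was in C++); the function names, the frequency bounds q_min = 0.0 and q_max = 2.0 (taken from Section 5), and the scaling factor eps are illustrative assumptions:

```python
import random

def move_bat(x, v, best, q_min=0.0, q_max=2.0):
    """Eq. (1): draw a frequency Q_i, then update velocity and position."""
    q = q_min + (q_max - q_min) * random.random()   # Q_i = Qmin + (Qmax - Qmin) U(0,1)
    v_new = [vj + (xj - bj) * q for vj, xj, bj in zip(v, x, best)]
    x_new = [xj + vj for xj, vj in zip(x, v_new)]
    return x_new, v_new

def local_search(best, loudness, eps=0.1):
    """Eq. (2): a random walk around the current best, scaled by the loudness A_i."""
    return [bj + eps * loudness * (2 * random.random() - 1) for bj in best]
```

In each iteration, every bat is moved with `move_bat`, and with probability depending on the pulse rate the local random walk around the current best is taken instead.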
The operation in line 13 of Algorithm 1 is similar to simulated annealing: the new solution is accepted with a probability depending on the parameter A_i. In line with this, the rate of pulse emission r_i increases and the loudness A_i decreases. Both characteristics imitate natural bats, whose rate of pulse emission increases and whose loudness decreases when they find prey. Mathematically, these characteristics are captured by the following equations:

  A_i^(t+1) = α A_i^(t),    r_i^(t) = r_i^(0) [1 − exp(−γt)],        (3)

where α and γ are constants. The α parameter plays a similar role to the cooling factor in the simulated annealing algorithm, controlling the convergence rate.

3 DIFFERENTIAL EVOLUTION

Differential evolution (DE) [29], [6] is an optimization technique introduced by Storn and Price in 1995. DE optimizes a problem by maintaining a population of candidate solutions, creating new candidates by combining existing ones according to simple formulae, and keeping whichever candidate has the better score (fitness) on the optimization problem at hand.

DE supports differential mutation, differential crossover, and differential selection. In particular, the differential mutation randomly selects two solutions and adds a scaled difference between them to a third solution. This mutation can be expressed as follows:

  u_i^(t) = w_r0^(t) + F (w_r1^(t) − w_r2^(t)),    for i = 1 ... NP,        (4)

where F ∈ [0.1, 1.0] denotes the scaling factor, a positive real number that scales the rate of modification, while r0, r1, r2 are indices of randomly selected vectors from the interval 1 ... NP.

Uniform crossover is employed as the differential crossover in DE. The trial vector is built out of parameter values copied from two different solutions. Mathematically, this crossover can be expressed as follows:

  z_{i,j}^(t) = u_{i,j}^(t)    if rand_j(0, 1) ≤ CR ∨ j = j_rand,
  z_{i,j}^(t) = w_{i,j}^(t)    otherwise,        (5)

where CR ∈ [0.0, 1.0] controls the fraction of parameters that are copied to the trial solution. Note that the condition j = j_rand ensures that the trial vector differs from the original solution w_i^(t) in at least one element.

Mathematically, the differential selection can be expressed as follows:

  w_i^(t+1) = z_i^(t)    if f(z_i^(t)) ≤ f(w_i^(t)),
  w_i^(t+1) = w_i^(t)    otherwise.        (6)

In a technical sense, crossover and mutation can be performed in many ways in differential evolution. Therefore, a specific notation is used to describe the variety of these methods (also called strategies). For example, "DE/rand/1/bin" denotes that the base vector is randomly selected, one vector difference is added to it, and the number of modified parameters in the mutation vector follows a binomial distribution.

4 HYBRID BAT ALGORITHM

As mentioned before, a new bat algorithm, called the hybrid bat algorithm (HBA), is proposed in this paper. That is, the original bat algorithm is hybridized with differential-evolution strategies. The pseudo-code of the HBA is illustrated in Algorithm 2.

Algorithm 2: Hybrid Bat Algorithm
1: Objective function f(x), x = (x_1, ..., x_d)^T
2: Initialize the bat population x_i and v_i for i = 1 ... n
3: Define pulse frequency Q_i ∈ [Q_min, Q_max]
4: Initialize pulse rates r_i and the loudness A_i
5: while (t < T_max) // number of iterations
6:   Generate new solutions by adjusting frequency, and
7:   updating velocities and locations/solutions [Eq. (1)]
8:   if (rand(0, 1) > r_i)
9:     Modify the solution using "DE/rand/1/bin"
10:  end if
11:  Generate a new solution by flying randomly
12:  if (rand(0, 1) < A_i and f(x_i) < f(best))
13:    Accept the new solutions
14:    Increase r_i and reduce A_i
15:  end if
16:  Rank the bats and find the current best
17: end while
18: Postprocess results and visualization

As a result, the HBA differs from the original BA in line 9, where the solution is modified using the "DE/rand/1/bin" strategy.

5 EXPERIMENTS AND RESULTS

The goal of the experiments was to show that the HBA significantly improves on the results of the original BA. In line with this, the two bat algorithms were implemented according to the specifications in Algorithms 1 and 2, and a well-selected set of test functions from the literature was used as the optimization benchmark.

The parameters of both bat algorithms were the same. Since the dimension of the problem has a crucial impact on the results of the optimization, three different dimensions were taken into account, i.e., D = 10, D = 20, and D = 30, in order to test how the dimension influences the results. The functions with dimension D = 10 were limited to a maximum of 1,000 generations, the functions with dimension D = 20 to twice as many, and the functions with dimension D = 30 to 3,000.
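The "DE/rand/1/bin" strategy applied in line 9 of Algorithm 2 combines Eqs. (4) and (5). A minimal Python sketch follows (the original implementation was in C++; the function name and the default values F = 0.5 and CR = 0.9 are illustrative assumptions, not values from the paper):

```python
import random

def de_rand_1_bin(pop, i, F=0.5, CR=0.9):
    """Build a trial vector for the i-th solution: add one scaled vector
    difference to a randomly chosen base vector (Eq. (4)), then apply
    binomial crossover with the i-th solution (Eq. (5))."""
    np_, d = len(pop), len(pop[0])
    r0, r1, r2 = random.sample([r for r in range(np_) if r != i], 3)
    mutant = [pop[r0][j] + F * (pop[r1][j] - pop[r2][j]) for j in range(d)]
    j_rand = random.randrange(d)       # guarantees at least one mutated parameter
    return [mutant[j] if random.random() <= CR or j == j_rand else pop[i][j]
            for j in range(d)]
```

In the HBA, the trial vector produced this way would then replace the bat's position only if it passes the greedy selection of Eq. (6).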
The initial loudness was set to A_0 = 0.5, the same as the initial pulse rate (r_0 = 0.5). The frequency was taken from the interval Q_i ∈ [0.0, 2.0]. The algorithms optimized each function 25 times, and the results were measured according to the best, worst, mean, and median values in these runs. In addition, the standard deviation of the mean values was calculated as well.

5.1 Test suite

The test suite consists of five standard functions taken from the literature [32]; they are presented in the rest of this section.

5.1.1 Griewangk's function: The aim of this function is to defeat strategies that optimize each variable independently. The function is multimodal, since the number of local optima increases with the dimensionality. At sufficiently high dimensionality (n > 30), the multimodality seems to disappear and the problem appears unimodal.

  f1(x) = − ∏_{i=1}^{D} cos(x_i / √i) + ∑_{i=1}^{D} x_i² / 4000 + 1,        (7)

where −600 ≤ x_i ≤ 600. The function has its global minimum at 0.

5.1.2 Rosenbrock's function: The Rosenbrock function, similarly to Rastrigin's, has the value 0 at its global minimum. The global optimum is located inside a narrow, parabolic-shaped flat valley. The variables are strongly dependent on each other, so it is difficult to converge to the global optimum.

  f2(x) = ∑_{i=1}^{D−1} [100 (x_{i+1} − x_i²)² + (x_i − 1)²],        (8)

where −15.00 ≤ x_i ≤ 15.00.

5.1.3 Sphere function:

  f3(x) = ∑_{i=1}^{D} x_i²,        (9)

where −15.00 ≤ x_i ≤ 15.00.

5.1.4 Rastrigin's function: Based on the Sphere function, the Rastrigin function adds a cosine modulation to create many local minima; because of this feature, the function is multimodal. Its global minimum has the value 0.

  f4(x) = 10 D + ∑_{i=1}^{D} (x_i² − 10 cos(2π x_i)),        (10)

where −15.00 ≤ x_i ≤ 15.00.

5.1.5 Ackley's function: The complexity of this function is moderate; an exponential term covers its surface with numerous local minima. Only an algorithm that follows the steepest gradient descent will be trapped in a local optimum; a search strategy that analyzes a wider area will be able to cross the valleys between the optima and achieve better results.

  f5(x) = ∑_{i=1}^{D−1} [20 + e − 20 e^{−0.2 √(0.5 (x_{i+1}² + x_i²))} − e^{0.5 (cos(2π x_{i+1}) + cos(2π x_i))}],        (11)

where −32.00 ≤ x_i ≤ 32.00. The global minimum of this function is at 0.

5.2 PC configuration

The configuration of the PC on which the experiments were executed was as follows:
• HP Pavilion g4,
• processor Intel(R) Core(TM) i5 @ 2.40 GHz,
• memory 8 GB,
• algorithms implemented in C++.

5.3 The results

The results of our numerical experiments are summarized in Table 1. The table presents the results of the BA and HBA algorithms (column 1) on the test suite of five functions (denoted f1, f2, f3, f4, f5) with dimensions D = 10, 20 and 30, according to the best, worst, mean, median, and standard-deviation values.

The results show that the HBA significantly improved on the original BA according to almost all measures, except for the standard deviation in some cases (e.g., on the Ackley function). A statistical analysis of the results was not performed, because the HBA results are evidently better than those of the BA.

In order to observe how the results of both algorithms (i.e., BA and HBA) vary with the dimension of the functions, the mean values of the functions f1, f3 and f5 with dimensions D = 10, D = 20, and D = 30 are presented in Figs. 1-3. A logarithmic scale is used to display the mean value on the y-axis: the higher the mean value, the more difficult the function is to solve.

[Figure 1. Mean value of function f1 with various dimensions (BA vs. HBA, logarithmic scale).]

From Fig. 1 it can be seen that the best results are obtained by optimizing the function f1 with the dimension D = 20, while the worst are obtained by optimizing the same function with the highest dimension D = 30.
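As a sanity check, the benchmark functions above are straightforward to implement; a Python sketch of the Sphere (Eq. (9)) and Rastrigin (Eq. (10)) functions is shown below (the experiments themselves were run in C++):

```python
import math

def sphere(x):
    """Eq. (9): f3, with its global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def rastrigin(x):
    """Eq. (10): the Sphere function plus a cosine modulation that creates
    many local minima; the global minimum 0 is at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)
```

Both return 0 at x = 0, matching the global minima stated above.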
Table 1. The results of experiments


Alg. D Value f1 f2 f3 f4 f5
Best 3.29E+01 1.07E+04 5.33E+01 6.07E+01 1.37E+01
Worst 1.73E+02 1.58E+06 3.11E+02 5.57E+02 2.00E+01
10 Mean 8.30E+01 5.53E+05 1.44E+02 2.27E+02 1.75E+01
Median 3.91E+01 4.69E+05 6.44E+01 1.06E+02 1.68E+00
StDev 6.94E+01 4.71E+05 1.48E+02 2.17E+02 1.73E+01
Best 8.77E+01 3.41E+02 2.24E+02 7.28E+01 2.15E+02
Worst 1.43E+02 1.02E+03 5.72E+02 2.02E+02 5.87E+02
BA 20 Mean 1.46E+00 6.87E+02 3.56E+02 1.82E+02 3.38E+02
Median 1.90E+05 3.16E+06 1.08E+06 9.20E+05 7.50E+05
StDev 1.64E+01 2.00E+01 1.85E+01 1.21E+00 1.80E+01
Best 1.58E+02 4.95E+02 3.29E+02 8.74E+01 3.39E+02
Worst 4.18E+02 1.67E+03 7.80E+02 2.64E+02 7.82E+02
30 Mean 1.51E+02 1.01E+03 5.17E+02 2.12E+02 4.67E+02
Median 4.66E+05 6.23E+06 2.10E+06 1.26E+06 2.06E+06
StDev 1.52E+01 2.00E+01 1.79E+01 1.25E+00 1.76E+01
Best 2.25E-09 6.34E-02 4.83E-09 5.12E+00 6.31E-04
Worst 3.97E-05 5.10E+02 2.89E-03 2.38E+01 2.00E+01
10 Mean 3.18E-06 6.22E+01 1.26E-04 1.55E+01 1.16E+01
Median 8.66E-06 1.15E+02 5.66E-04 4.46E+00 9.26E+00
StDev 1.14E-07 7.73E+00 1.66E-07 1.69E+01 1.78E+01
Best 1.01E-07 9.73E-03 4.83E-04 1.89E-03 3.70E-05
Worst 2.96E+01 9.24E+01 5.47E+01 1.77E+01 5.48E+01
HBA 20 Mean 8.56E-07 1.10E-01 5.87E-03 2.18E-02 3.82E-05
Median 3.60E+01 1.44E+03 2.53E+02 3.10E+02 1.41E+02
StDev 2.17E+00 2.00E+01 1.60E+01 6.18E+00 1.95E+01
Best 6.38E-06 8.28E+00 3.37E-01 1.62E+00 5.43E-04
Worst 3.57E+01 2.17E+02 9.97E+01 3.98E+01 9.85E+01
30 Mean 6.42E-05 6.59E+01 3.09E+00 1.29E+01 2.53E-03
Median 5.99E+01 4.00E+03 7.67E+02 1.26E+03 2.15E+02
StDev 3.12E+00 2.00E+01 1.72E+01 5.03E+00 1.94E+01

[Figure 2. Mean value of function f3 with various dimensions (BA vs. HBA, logarithmic scale).]

Interestingly, the mean value of the function f5 with dimension D = 10 is the most difficult for the HBA algorithm, while the same function with dimension D = 20 is the easiest to solve. In contrast, the results of the original BA algorithm show that the results become worse as the dimension increases.

As can be seen from Fig. 2, the difficulty of solving the function f3 increases with the dimensionality of the problem. As a result, the most difficult function to solve is f3 with dimension D = 30.

[Figure 3. Mean value of function f5 with various dimensions (BA vs. HBA, logarithmic scale).]

When comparing the results of the BA with those of the HBA, it can be observed that the HBA significantly outperformed the original BA on the functions f1, f3, and f5. On the functions f2 and f4 the HBA is also better, but the difference is not as outstanding.

6 CONCLUSION

In this paper, we have improved the bat algorithm by developing a new variant, called the hybrid bat algorithm. This new HBA is a hybrid of the BA with DE strategies.
Experiments have shown that this algorithm significantly improves the original version of the bat algorithm. In the future, the hybrid bat algorithm will be tested on large-scale global optimization. We will also do more extensive testing using more diverse test-function sets, together with a detailed parametric study.

REFERENCES

[1] B.V. Babu and R. Angira. Modified differential evolution (MDE) for optimization of non-linear chemical processes. Computers & Chemical Engineering, 30(6):989–1002, 2006.
[2] Thomas Bäck. Evolutionary Algorithms in Theory and Practice - Evolution Strategies, Evolutionary Programming, Genetic Algorithms. Oxford University Press, 1996.
[3] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer. Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems. IEEE Transactions on Evolutionary Computation, 10(6):646–657, 2006.
[4] J. Brest, A. Zamuda, I. Fister, and M.S. Maučec. Large scale global optimization using self-adaptive differential evolution algorithm. In Evolutionary Computation (CEC), 2010 IEEE Congress on, pages 1–8. IEEE, 2010.
[5] Charles Darwin. The Origin of Species. John Murray, London, UK, 1859.
[6] S. Das and P.N. Suganthan. Differential evolution: A survey of the state-of-the-art. IEEE Transactions on Evolutionary Computation, 15(1):4–31, 2011.
[7] A.E. Eiben and J.E. Smith. Introduction to Evolutionary Computing. Springer-Verlag, Berlin, 2003.
[8] I. Fister and J. Brest. Using differential evolution for the graph coloring. In Differential Evolution (SDE), 2011 IEEE Symposium on, pages 1–7. IEEE, 2011.
[9] I. Fister, I. Fister Jr., and J. Brest. A hybrid artificial bee colony algorithm for graph 3-coloring. Swarm and Evolutionary Computation, pages 66–74, 2012.
[10] I. Fister, I. Fister Jr., J. Brest, and V. Žumer. Memetic artificial bee colony algorithm for large-scale global optimization. In Evolutionary Computation (CEC), 2012 IEEE Congress on, pages 1–8. IEEE, 2012.
[11] I. Fister, M. Mernik, and B. Filipič. Graph 3-coloring with a hybrid self-adaptive evolutionary algorithm. Computational Optimization and Applications, pages 1–32, 2012.
[12] I. Fister Jr., X.S. Yang, I. Fister, and J. Brest. Memetic firefly algorithm for combinatorial optimization. arXiv preprint arXiv:1204.5165, 2012.
[13] L.J. Fogel, A.J. Owens, and M.J. Walsh. Artificial Intelligence through Simulated Evolution. John Wiley, New York, USA, 1966.
[14] A.H. Gandomi, X.S. Yang, A.H. Alavi, and S. Talatahari. Bat algorithm for constrained optimization tasks. Neural Computing & Applications, pages 1–17, 2012.
[15] D. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, MA, 1996.
[16] John H. Holland. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control and Artificial Intelligence. MIT Press, Cambridge, MA, USA, 1992.
[17] N. Jin and Y. Rahmat-Samii. Advances in particle swarm optimization for antenna designs: Real-number, binary, single-objective and multiobjective implementations. IEEE Transactions on Antennas and Propagation, 55(3):556–567, 2007.
[18] D. Karaboga and B. Basturk. A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. Journal of Global Optimization, 39(3):459–471, 2007.
[19] D. Karaboga and B. Basturk. On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing, 8(1):687–697, 2008.
[20] P. Korošec, J. Šilc, and B. Filipič. The differential ant-stigmergy algorithm. Information Sciences, 2010.
[21] John R. Koza. Genetic Programming II - Automatic Discovery of Reusable Programs. Complex Adaptive Systems. MIT Press, 1994.
[22] D. Merkle, M. Middendorf, and H. Schmeck. Ant colony optimization for resource-constrained project scheduling. IEEE Transactions on Evolutionary Computation, 6(4):333–346, 2002.
[23] F. Neri and V. Tirronen. Recent advances in differential evolution: a survey and experimental analysis. Artificial Intelligence Review, 33(1–2):61–106, 2010.
[24] Q.K. Pan, M. Fatih Tasgetiren, P.N. Suganthan, and T.J. Chua. A discrete artificial bee colony algorithm for the lot-streaming flow shop scheduling problem. Information Sciences, 181(12):2455–2468, 2011.
[25] R.S. Parpinelli and H.S. Lopes. New inspirations in swarm intelligence: a survey. International Journal of Bio-Inspired Computation, 3(1):1–16, 2011.
[26] R.S. Parpinelli, H.S. Lopes, and A.A. Freitas. Data mining with an ant colony optimization algorithm. IEEE Transactions on Evolutionary Computation, 6(4):321–332, 2002.
[27] J. Robinson and Y. Rahmat-Samii. Particle swarm optimization in electromagnetics. IEEE Transactions on Antennas and Propagation, 52(2):397–407, 2004.
[28] Y. Shi, H. Teng, and Z. Li. Cooperative co-evolutionary differential evolution for function optimization. Advances in Natural Computation, pages 428–428, 2005.
[29] R. Storn and K. Price. Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341–359, 1997.
[30] M.F. Tasgetiren, Q.K. Pan, P.N. Suganthan, and A.H.L. Chen. A discrete artificial bee colony algorithm for the permutation flow shop scheduling problem with total flowtime criterion. In Evolutionary Computation (CEC), 2010 IEEE Congress on, pages 1–8. IEEE, 2010.
[31] P.W. Tsai, J.S. Pan, B.Y. Liao, M.J. Tsai, and V. Istanda. Bat algorithm inspired algorithm for solving numerical optimization problems. Applied Mechanics and Materials, 148:134–137, 2012.
[32] X.-S. Yang. Appendix A: Test problems in optimization. In X.-S. Yang, editor, Engineering Optimization, pages 261–266. John Wiley & Sons, Inc., Hoboken, NJ, USA, 2010.
[33] Xin-She Yang and Suash Deb. Cuckoo search via Lévy flights. In Nature & Biologically Inspired Computing, 2009. NaBIC 2009. World Congress on, pages 210–214. IEEE, 2009.
[34] Xin-She Yang and Suash Deb. Engineering optimisation by cuckoo search. International Journal of Mathematical Modelling and Numerical Optimisation, 1(4):330–343, 2010.
[35] X.S. Yang. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), pages 65–74, 2010.
[36] X.S. Yang. Bat algorithm for multi-objective optimisation. International Journal of Bio-Inspired Computation, 3(5):267–274, 2011.
[37] X.S. Yang. Review of meta-heuristics and generalised evolutionary walk algorithm. International Journal of Bio-Inspired Computation, 3(2):77–84, 2011.
Iztok Fister Jr. was born in 1989 and received his B.Sc. in Computer Science in 2011. Currently, he is working towards his M.Sc. degree. His research activities encompass swarm intelligence, pervasive computing and programming languages.

Dušan Fister was born in 1993 and is a first-year student of Mechatronics at the University of Maribor. His research activities encompass GPS solutions, swarm intelligence and operating systems.

Xin-She Yang received his DPhil in Applied Mathematics from the University of Oxford. He is now a Reader in Modelling and Simulation at Middlesex University, UK, an Adjunct Professor at Reykjavik University, Iceland, and a Distinguished Guest Professor at Xi'an Polytechnic University, China. He is also the IEEE CIS Chair of the Task Force on Business Intelligence and Knowledge Management, and the Editor-in-Chief of the International Journal of Mathematical Modelling and Numerical Optimisation (IJMMNO).
