An Improved Cuckoo Search Algorithm
Abstract—Swarm intelligence optimization algorithms are a class of stochastic optimization algorithms that connect individuals into groups through cooperation, generating collective intelligence to solve practical problems. The cuckoo search (CS) algorithm is one of the typical swarm intelligence algorithms. It is efficient and simple to implement, but it suffers from slow convergence and low accuracy. To overcome these shortcomings, this paper proposes a cuckoo search algorithm based on a multi-nest update, which uses the three best nests to update the other nests, thereby improving the accuracy of the optimal value and the convergence speed of the algorithm. Experimental results show that the performance of the improved cuckoo search algorithm is significantly better than that of the original algorithm.

I. INTRODUCTION

Swarm intelligence algorithms connect simple individuals through teamwork and organization to generate collective intelligence for solving practical problems [1]. Over the past few decades, many swarm intelligence algorithms have been proposed one after another, such as particle swarm optimization (PSO), ant colony optimization (ACO), the genetic algorithm (GA), and the firefly algorithm (FA). Swarm intelligence algorithms are robust [2] and versatile and can solve complex optimization problems.

The cuckoo search (CS) algorithm simulates the nesting behavior of the cuckoo and introduces Levy flights. It can effectively find the optimal solution of continuous optimization problems and is one of the typical swarm intelligence algorithms. The CS algorithm has few control parameters, a simple structure, strong global search ability, and high efficiency in practical applications. However, it suffers from premature convergence, a slow convergence rate, an imbalanced search, and low population diversity. To address these shortcomings, many researchers have studied the CS algorithm. Wang et al. [3] improved the discovery probability P and added a compression factor, uniformly distributed on [0,1], to the Levy-flight update formula; the resulting algorithm performs better than the standard CS algorithm. Jin et al. [4] adjusted the discovery probability P and the Levy-flight step size dynamically and adaptively, improving the search performance of the original algorithm. Lian et al. [5] added the PSO particle position update to the Levy flight to obtain the NCS algorithm; the experimental results show that NCS is superior to CS, but its solution accuracy is low. Walton et al. [6] proposed an improved version with a random step size in the Levy flight to enhance local search. Tuba et al. [7] proposed an improved version that ranks the population in the preferred random-walk step; its performance is better than that of the standard CS algorithm. Ge et al. [8] integrated the Powell method into the CS algorithm to obtain a Powell cuckoo search algorithm, but its convergence speed and accuracy still cannot meet current optimization requirements. To improve the convergence speed and optimization accuracy, this paper proposes a cuckoo search algorithm based on multi-nest guided updates (Multiple Guides Cuckoo Search, MGCS), which updates the nest positions using feedback from the best solutions in the population. Experiments on standard test functions verify that the MGCS algorithm better addresses the low optimization accuracy and slow late-stage convergence of the CS algorithm.

The main contributions of this paper are as follows: (1) the update mechanism of the original cuckoo algorithm is adjusted to improve the performance of the algorithm; (2) the improved algorithm can assist classification prediction and parameter tuning.

II. CS ALGORITHM

Xin-She Yang and Suash Deb presented the cuckoo search algorithm in "Cuckoo Search via Levy Flights" in 2009 [2]. Based on the nesting habits of the cuckoo, the CS algorithm assumes the following three conditions:

1) A cuckoo lays only one egg at a time and chooses the nest in which to hatch it at random;
2) In each group of nests, the best nests are carried over to the next generation;
3) The number of available nests is fixed, and the host of a nest detects a cuckoo egg with probability P. If the host bird discovers the egg, the egg may be thrown away or a new nest may be built elsewhere; otherwise, the egg hatches successfully and a new hatching site is chosen by Levy flight.

Based on the above three rules, the update formula for the path and position of a nest in the cuckoo search is:

x_i^{t+1} = x_i^t + \alpha \oplus L(\lambda)    (1)

where x_i^t is the position of the i-th nest at iteration t, \alpha is the step-size factor, n is the number of nests, \oplus denotes entry-wise multiplication, and L(\lambda) is the Levy random search path, with L(\lambda) \sim \mu^{-\lambda} (1 < \lambda \le 3), where \mu is the random step obtained by the Levy flight.
In 1937, reference [9] gave the integral form of the symmetric Levy stable distribution as formula (2):

Levy(s) = \frac{1}{\pi} \int_0^{+\infty} \exp(-\beta |k|^{\lambda}) \cos(ks)\, dk    (2)

Since the integral of the Levy distribution is difficult to compute, the Mantegna algorithm was proposed in [10], which yields the following formula (3):

x_i^{t+1} = x_i^t + stepsize, \quad i = 1, 2, ..., n    (3)

where stepsize = a \cdot S \oplus (x_i^t - x_{best}^t), a usually takes the value 0.01, x_{best}^t is the best nest position of generation t, and S is obtained by formula (4):

S = \frac{\mu}{|\nu|^{1/\beta}}, \quad \mu \sim N(0, \sigma_\mu^2), \quad \nu \sim N(0, \sigma_\nu^2)    (4)

where \sigma_\nu = 1, \beta = 1.5, and \sigma_\mu is obtained by formula (5):

\sigma_\mu = \left[ \frac{\Gamma(1+\beta) \cdot \sin(\pi\beta/2)}{\Gamma((1+\beta)/2) \cdot \beta \cdot 2^{(\beta-1)/2}} \right]^{1/\beta}    (5)
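To make the step generation concrete, the following Python sketch (not part of the original paper; it assumes numpy and reads \oplus as entry-wise multiplication) implements Mantegna's algorithm of formulas (4)-(5) and the Levy-flight update of formula (3) with the coefficient a = 0.01 mentioned above.

```python
import numpy as np
from math import gamma, pi, sin

def mantegna_step(dim, beta=1.5):
    """Draw one Levy step S per formulas (4)-(5); beta = 1.5 and sigma_nu = 1 as in the text."""
    sigma_mu = (gamma(1 + beta) * sin(pi * beta / 2)
                / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = np.random.normal(0.0, sigma_mu, dim)   # mu ~ N(0, sigma_mu^2)
    nu = np.random.normal(0.0, 1.0, dim)        # nu ~ N(0, sigma_nu^2) with sigma_nu = 1
    return mu / np.abs(nu) ** (1 / beta)        # S of formula (4)

def levy_update(x, x_best, a=0.01):
    """Position update of formula (3): x + a * S * (x - x_best), with '⊕' taken as entry-wise product."""
    return x + a * mantegna_step(x.size) * (x - x_best)
```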
A random number r uniformly distributed on [0,1] is then drawn and compared with the discovery probability P. If r > P, a new solution is obtained according to formula (6):

x_i^{t+1} = x_i^t + r (x_j^t - x_k^t), \quad i = 1, 2, ..., n    (6)

where x_j^t and x_k^t are two randomly chosen nests of generation t. Conversely, if r \le P, x_i^{t+1} remains unchanged. The new solution is then subjected to a greedy selection operation: after the new solution is obtained, its fitness function value is compared with that of the original solution; if the new solution is better, it replaces the original solution, otherwise the original solution remains unchanged.
Considering the above two update rules of the CS algorithm, its procedure can be summarized as follows (a code sketch of the procedure is given after the list):

1) Initialize the algorithm parameters: the number of nests n, the discovery probability P, the maximum number of iterations T, and n random initial nest positions x_i = (x_{i1}, x_{i2}, ..., x_{id})^T, i = 1, 2, ..., n, where d is the dimension of the problem to be solved; compute the fitness function value of each initial nest position and record the initial best fitness value;
2) Update the current nest positions according to formulas (3)-(5);
3) Compute the fitness values of all current nests; if a new nest has better fitness than the original nest, the new nest replaces the original one;
4) Compare a random number r with the discovery probability P; if r > P, update the current nest position according to formula (6);
5) Obtain the best fitness value of the current iteration, compare it with the best fitness value of the previous generation, keep the nest position with the better fitness, and increase the iteration counter by 1;
6) If the maximum number of iterations has not been reached, return to step 2); otherwise, output the global optimal solution.
III. MGCS ALGORITHM

The cuckoo algorithm has a strong global search ability, but its local search ability is relatively weak, and its convergence slows down in the later stage of the search. To address this late-stage convergence problem, this paper proposes a cuckoo search algorithm based on multi-nest guided updates (MGCS).

In the CS algorithm, the position update of a nest is guided by the current best nest position. This method can find the optimal solution, but its accuracy is not high and its convergence is slow. To solve this problem, this paper extends the position update from one guide nest to three guide nests, namely the three nests with the best fitness values in the current population, which makes the improved cuckoo algorithm more efficient. The improved position update is:

x_i^{t+1} = x_i^t + \frac{ a \cdot S \oplus (x_i^t - x_\alpha^t) + a \cdot S \oplus (x_i^t - x_\beta^t) + a \cdot S \oplus (x_i^t - x_\delta^t) }{3}    (7)

where x_\alpha, x_\beta and x_\delta are the best, second-best and third-best solutions among the nests, respectively. Updating the position of each nest with the three fittest nests strengthens the local search ability of the algorithm and accelerates convergence in the later stage, effectively alleviating the low precision and slow convergence of the cuckoo algorithm. In this paper, the corresponding weights of the three guides are additionally set to 5, 3 and 2, respectively, and the reweighted MGCS algorithm is compared experimentally with the version using the unadjusted weights.
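A sketch of the multi-guide update in (7) is given below (not from the paper). Two details are left open by the text and are therefore assumptions of this sketch: a fresh Mantegna step S is drawn for each of the three guide terms, and the reweighted 5/3/2 variant is normalised by the weight sum rather than by 3.

```python
import numpy as np

def mgcs_update(x, guides, a=0.01, weights=(1, 1, 1)):
    """Multi-guide position update of formula (7).

    guides: the three fittest nests (x_alpha, x_beta, x_delta).
    weights=(1, 1, 1) gives the plain average of (7); weights=(5, 3, 2) is the
    reweighted variant, here normalised by the weight sum (an assumption).
    """
    w = np.asarray(weights, dtype=float)
    terms = [wi * a * mantegna_step(x.size) * (x - g) for wi, g in zip(w, guides)]
    return x + sum(terms) / w.sum()

# In the CS loop above, step 2) would then use the three fittest nests as guides, e.g.:
#   three_best = nests[np.argsort(fit)[:3]]
#   new = np.clip(mgcs_update(nests[i], three_best), lower, upper)
```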
IV. EXPERIMENTAL VERIFICATION

The test functions used in this paper include both simple unimodal functions and complex multimodal functions, including the Sphere, Rosenbrock, and Rastrigin functions. The parameters of each test function are listed in Table 1. The algorithm parameters are set as follows: number of nests n = 30, discovery probability P = 0.25 [11], and number of iterations 1000.

TABLE I
TEST FUNCTION AND PARAMETER SETTING

Function          Search space    Theoretical optimum    Dimension
Sphere            [-100,100]      0                      10/30
Schwefel P2.22    [-10,10]        0                      10/30
Rosenbrock        [-100,100]      0                      10/30
Rastrigin         [-100,100]      0                      10/30
Griewank          [-600,600]      0                      10/30
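The paper does not spell out the test functions; for reference, the standard definitions of the five benchmarks in Table 1 are sketched below (all have a theoretical optimum of 0, with the Rosenbrock minimum at x = (1, ..., 1) and the others at the origin).

```python
import numpy as np

def sphere(x):        return np.sum(x ** 2)
def schwefel_2_22(x): return np.sum(np.abs(x)) + np.prod(np.abs(x))
def rosenbrock(x):    return np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1) ** 2)
def rastrigin(x):     return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))
def griewank(x):      return 1 + np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(np.arange(1, x.size + 1))))
```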
This paper compares the MGCS algorithm with the CS algorithm in different dimensions to verify that the MGCS algorithm has a comparative advantage. Each algorithm is run 30 times independently, and each run terminates when the number of iterations reaches 1000. To evaluate the effectiveness of the algorithms, three criteria are used: 1) the global optimal value (GlobalBest), the best value found over the 30 runs of the algorithm; 2) the average optimal value (MeanBest), the mean of the optimal values over the 30 runs, which measures the average quality of the algorithm; and 3) the standard deviation (Std), the standard deviation of the optimal values around the average optimal value over the 30 runs, which evaluates the optimization stability of the algorithm.

The experimental results in the 10-dimensional case are shown in Table 2. To illustrate the advantages of MGCS in convergence speed and optimization accuracy, convergence curves are also plotted, in which the horizontal axis is the number of iterations and the vertical axis is the fitness value of the objective function.
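As an illustration of this protocol, the sketch below (hypothetical code, reusing cuckoo_search and sphere from the earlier sketches) runs 30 independent trials of 1000 iterations each and reports GlobalBest, MeanBest and Std as defined above.

```python
import numpy as np

def evaluate(search, f, dim, lower, upper, runs=30):
    """30 independent runs; report best, mean and standard deviation of the final values."""
    bests = np.array([search(f, dim, lower, upper, n=30, P=0.25, T=1000)[1]
                      for _ in range(runs)])
    return {"GlobalBest": bests.min(), "MeanBest": bests.mean(), "Std": bests.std()}

print(evaluate(cuckoo_search, sphere, dim=10, lower=-100, upper=100))
```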
TABLE II
TEST INDEX COMPARISON IN 10 DIMENSIONS

Function          Algorithm    GlobalBest      MeanBest        Std
Sphere            CS           6.51E-15        7.00E-15        9.04E-15
Sphere            MGCS         2.68E-15        5.25E-15        6.15E-15
Schwefel P2.22    CS           1.10E-14        1.20E-14        1.94E-14
Schwefel P2.22    MGCS         5.72E-15        6.28E-15        6.34E-15
Rosenbrock        CS           19.99586391     19.99999992     19.99999355
Rosenbrock        MGCS         15.23950366     18.99998315     19.65855967
Rastrigin         CS           10.21958966     12.33431814     12.58039728
Rastrigin         MGCS         3.983805477     7.384826575     8.876382911
Griewank          CS           0.098449154     0.110008284     0.118649313
Griewank          MGCS         0.033432186     0.065601353     0.078803622

For the Rosenbrock function, Tables 2 and 3 reflect its characteristics: the optimization effect of the CS algorithm is not obvious, whereas the MGCS algorithm achieves further optimization.

The Rastrigin and Griewank functions are multimodal functions. These two test functions are typical multimodal nonlinear global optimization functions: their landscapes are undulating and oscillatory, and there are many local minima in the search space, so they are suitable for measuring the global search ability of an algorithm. From Table 2 and Figs. 4 and 5, it can be seen that the MGCS algorithm converges faster and reaches higher optimization accuracy during the iterative process.

The MGCS algorithm proposed in this paper also has advantages in continuous function optimization problems of different dimensions. The experimental results are shown in Table 3.
TABLE III
TEST INDEX COMPARISON IN 30 DIMENSIONS

Table 3 shows the index statistics of the two algorithms in the 30-dimensional case. For the simple unimodal Sphere and Schwefel P2.22 functions, neither algorithm has converged after 1000 iterations, but the optimization accuracy of the MGCS algorithm is better than that of the CS algorithm.

For the Rosenbrock function, both the MGCS and CS algorithms converge quickly, and the optimal solutions they find are similar. The Rastrigin function oscillates strongly, and its global optimum is not easy to obtain; the experimental results show that the optimization curve of the CS algorithm is worse than that of the MGCS algorithm, which has already converged in the late iterations while the CS algorithm has not yet converged. The Griewank function is characterized by becoming closer to unimodal as the dimension increases; the experimental results show that the MGCS algorithm outperforms the CS algorithm in both optimization accuracy and convergence speed.

Fig. 6. Sphere
Fig. 7. Schwefel P2.22

The results of function testing in different dimensions show that the MGCS algorithm has clear advantages in most function optimization problems and better addresses the problems of low optimization accuracy and slow convergence speed. The experimental results of the algorithm with adjusted weights in 10 dimensions are shown in Table 4.

TABLE IV
COMPARISON OF TEST INDICATORS IN 10 DIMENSIONS
Table 4 shows the index statistics before and after the weight modification in the 10-dimensional case. When optimizing the simple unimodal Sphere and Schwefel P2.22 functions, the two settings differ little, although the convergence results after modifying the weights are slightly better than before the modification. For the Rosenbrock function, the weights can be modified further. For the Rastrigin and Griewank functions, the convergence and optimization accuracy of the algorithm with the modified weights are significantly improved.

Fig. 13. Rosenbrock

The experimental results of the algorithm with adjusted weights in 30 dimensions are shown in Table 5.

TABLE V
COMPARISON OF TEST INDICATORS IN 30 DIMENSIONS
V. SUMMARY

In this paper, we study the improvement of the cuckoo search algorithm. The performance of the algorithm is improved by modifying the update mechanism of the population: the improved algorithm guides the updates with the three best solutions, which yields faster convergence and higher optimization accuracy than the original cuckoo algorithm. The effectiveness and advantages of the improved algorithm are demonstrated on standard test functions. In future work, we will hybridize the cuckoo algorithm with other algorithms and apply it to practical systems.
REFERENCES

[1] Yu J P, Zhou X M, Chen M, Research on representative algorithms of swarm intelligence. Harlow, England: Addison-Wesley, 2010, 46(25): 1-4.
[2] Yang X S, Deb S, Cuckoo search via Levy flights. Proceedings of the World Congress on Nature and Biologically Inspired Computing, India. Washington: IEEE Publications, 2009: 210-214.
[3] Wang L J, Yin Y L, Zhong Y W, Cuckoo search with varied scaling factor. Frontiers of Computer Science, 2015, 9(4): 623-635.
[4] Jin Q B, Qi L F, Novel improved cuckoo search for PID controller design. Transactions of the Institute of Measurement and Control, 2014, 37(6): 1-11.
[5] Lian Z G, Lu L H, Chen Y Q, A new cuckoo search. Proceedings of the International Conference on Intelligence Science. Springer Press, 2017: 75-83.
[6] Walton S, Hassan O, Morgan K, et al, Modified cuckoo search: a new gradient free optimisation algorithm. Chaos, Solitons and Fractals, 2011, 44(9): 710-718.
[7] Tuba M, Subotic M, Stanarevic N, Modified cuckoo search algorithm for unconstrained optimization problems. European Computing Conference (ECC '11), 2011.
[8] Ge G Y, Shen K, Yang M Q, et al, Powell-based cuckoo search algorithm. 14th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD). IEEE Press, 2018: 489-493.
[9] Doob J L, Review: P. Lévy, Théorie de l'Addition des Variables Aléatoires. Bulletin of the American Mathematical Society, 1938, 44(1): 19-20.
[10] Yang X S, Nature-inspired metaheuristic algorithms. UK: Luniver Press, 2010: 11-16.
[11] Yang X S, Deb S, Cuckoo search: recent advances and applications. Neural Computing and Applications, 2014, 24(1): 169-174.