0% found this document useful (0 votes)
67 views

A Novel Swarm Intelligence Optimization Approach Sparrow Search Algorithm

Uploaded by

Aref 12345
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
67 views

A Novel Swarm Intelligence Optimization Approach Sparrow Search Algorithm

Uploaded by

Aref 12345
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 14

Systems Science & Control Engineering

An Open Access Journal

ISSN: (Print) (Online) Journal homepage: https://www.tandfonline.com/loi/tssc20

A novel swarm intelligence optimization approach:


sparrow search algorithm

Jiankai Xue & Bo Shen

To cite this article: Jiankai Xue & Bo Shen (2020) A novel swarm intelligence optimization
approach: sparrow search algorithm, Systems Science & Control Engineering, 8:1, 22-34, DOI:
10.1080/21642583.2019.1708830

To link to this article: https://doi.org/10.1080/21642583.2019.1708830

© 2020 The Author(s). Published by Informa


UK Limited, trading as Taylor & Francis
Group

Published online: 03 Jan 2020.

Submit your article to this journal

Article views: 5043

View related articles

View Crossmark data

Citing articles: 23 View citing articles

Full Terms & Conditions of access and use can be found at


https://www.tandfonline.com/action/journalInformation?journalCode=tssc20
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL
2020, VOL. 8, NO. 1, 22–34
https://doi.org/10.1080/21642583.2019.1708830

A novel swarm intelligence optimization approach: sparrow search algorithm


Jiankai Xuea,b and Bo Shena,b
a College of Information Science and Technology, Donghua University, Shanghai, People’s Republic of China; b Engineering Research Center of
Digitalized Textile and Fashion Technology, Ministry of Education, Shanghai, People’s Republic of China

ABSTRACT ARTICLE HISTORY


In this paper, a novel swarm optimization approach, namely sparrow search algorithm (SSA), is Received 2 November 2019
proposed inspired by the group wisdom, foraging and anti-predation behaviours of sparrows. Exper- Accepted 20 December 2019
iments on 19 benchmark functions are conducted to test the performance of the SSA and its KEYWORDS
performance is compared with other algorithms such as grey wolf optimizer (GWO), gravitational Swarm optimization; sparrow
search algorithm (GSA), and particle swarm optimization (PSO). Simulation results show that the search algorithm;
proposed SSA is superior over GWO, PSO and GSA in terms of accuracy, convergence speed, sta- convergence speed; stability
bility and robustness. Finally, the effectiveness of the proposed SSA is demonstrated in two practical and robustness
engineering examples.

1. Introduction and Shi (2001) and Kennedy and Eberhart (1995), which
The optimization problems are common in engineering mimics the cooperation and foraging behaviour of the
applications such as knapsack problems, data cluster- bird flocks.
ing, data classification, path planning, robot control, and On the other hand, due to ACO and PSO were proven
so on. It is well known that the swarm intelligence (SI) to be very competitive and have the strong global search-
optimization algorithms have been used as primary tech- ing ability, the SI optimization algorithm has attracted
niques to solve global optimization problems because of increasing attention from scholars in this area. Many new
its simplicity, flexibility and high efficiency. It should be algorithms started to be proposed, which imitates the
mentioned that, the SI optimization algorithms mainly social behaviour of organisms such as fishs, birds, or
introduce the randomness in the search process, which insects in nature. For example, the bat algorithm (BA)
is different from deterministic approaches. Note that the has been proposed in Yang and He (2013), which mimics
deterministic algorithm is easy to get trapped in local the echolocation behaviour of bats. The grey wolf opti-
optimal solutions in the complex situation. Therefore, it mizer (GWO) algorithm (Mirjalili, Mirjalili, & Lewis, 2014)
is of practical importance to employ the SI optimization is another popular algorithm, mimicking the leadership
algorithm so as to obtain an optimal solution to the global and hunting behaviour of grey wolves. There are four
optimization problem. types of grey wolves in the GWO algorithm: alpha, beta,
In the past decades, the SI optimization algorithm delta, and omega wolves. The alpha wolves are responsi-
has been developed rapidly and becomes a hotspot ble for making decisions to the pack, whereas beta and
in many fields. So far, there have been many different delta wolves should help the alpha wolves in decision
types of optimization algorithms available in the exist- making process. Furthermore, the rest of the recent SI
ing literature. Among various optimization algorithms, optimization algorithms are: artificial bee colony (ABC)
the ant colony optimization (ACO) algorithm and parti- algorithm (Karaboga, 2005; Karaboga & Basturk, 2007),
cle swarm optimization (PSO) algorithm are representa- firefly algorithm (FA) (Yang, 2008, 2010a), cuckoo search
tive that have received considerable attention. For exam- (CS) algorithm (Yang & Deb, 2009), et al.
ple, the ACO algorithm has been proposed in Dorigo, Except for the SI optimization algorithms, some algo-
Maniezzo, and Colorni (1996), which mimics the bio- rithms are inspired by the concept of natural evolu-
logical characteristics of ants in nature that mark the tion or the physical rules. For example, the well-known
path by pheromone. As one of the well-known algo- genetic algorithm (GA) has been presented in Hol-
rithms, the PSO algorithm has been proposed in Eberhart land (1975, 1992), which is a powerful stochastic search

CONTACT Bo Shen [email protected]; [email protected]


© 2020 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use,
distribution, and reproduction in any medium, provided the original work is properly cited.
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL 23

algorithm based on the principles of natural selection 2. Sparrow search algorithm (SSA)
and natural genetics. Generally, there are consists of
In this section, we discuss the inspiration of the SSA. Then,
three operators: selection, reproduction, and mutation,
the mathematical model and the SSA are described in
which makes GA an efficient global optimizer. More-
details.
over, it should be pointed out that there are also
algorithms usually proposed by simulating the physi- 2.1. Biological characteristics
cal rules such as gravitational search algorithm (GSA)
(Rashedi, Nezamabadi-Pour, & Saryazdi, 2009), simulated The sparrows are usually gregarious birds and have many
annealing (SA) (Kirkpatrick, Gelatto, & Vecchi, 1983), species. They are distributed in most parts of the world
etc. and like to live in places where the human life. More-
On the other hand, we found that although each over, they are omnivorous birds and mainly feed on seeds
algorithm has its advantages, there are also shortcom- of grains or weeds. It is well known that the sparrows
ings by a deeper investigation. For instance, the ACO are common resident birds. In contrast with many other
algorithm has the disadvantage of slow search speed, small birds, the sparrow is strongly intelligent and has a
and the PSO algorithm has the disadvantage of easy strong memory. Note that there are two different types
premature convergence. Therefore, it is very impor- of captive house sparrows, both the producer and the
tant to enhance the current optimization algorithm. scrounger (Barnard & Sibly, 1981). The producers actively
According to the no-free-lunch (NFL) theorem (Wolpert search for the food source, while the scroungers obtain
& Macready, 1997), the expected performance of each food by the producers. Furthermore, the evidence shows
algorithm is the same for solving all optimization prob- that the birds usually use behavioural strategies flexibly,
lems. In other words, an optimization algorithm may and switch between producing and scrounging (Barta,
perform well in a series of problems and show poor Liker, & Mónus, 2004; Coolen, Giraldeau, & Lavoie, 2001;
performance in a different series of problems. Obvi- Koops & Giraldeau, 1996; Liker & Barta, 2002). It can also be
ously, we can solve the different problems by propos- said that, in order to find their food, the sparrows usually
ing new optimization algorithms. At the same time, use the strategy of both the producer and the scrounger
the newly proposed optimization algorithm provides a (Barnard & Sibly, 1981; Johnson, Grant, & Giraldeau, 2001;
new solution to solve a complex global optimization Liker & Barta, 2002).
problem. The studies have shown that the individuals monitor
In response to the above discussions, in this paper, the behaviour of the others in the group. Meanwhile, the
we aim to propose a novel swarm intelligence optimiza- attackers in the bird flock, which want to increase their
tion technique which is called sparrow search algorithm own predation rate, are used to compete food resources
(SSA). The main contributions of this paper are summa- of the companions with high intakes (Bautista, Alonso,
rized as follows: (1) a new SI technique, i.e. the SSA is & Alonso, 1998; Lendvai, Barta, Liker, & Bokony, 2004).
proposed inspired by the sparrow population’s forag- In addition, the energy reserves of the individuals may
ing and anti-predation behaviours; (2) by using the pro- play an important role when the sparrow chooses differ-
posed SSA, both the exploration and the exploitation of ent foraging strategies, and the sparrows with low energy
the search space of the optimization are improved to reserves scrounge more (Lendvai et al., 2004). It is worth
some extent; and (3) the proposed SSA is successfully mentioning that the birds, which located on the periph-
applied in two practical engineering problems. Finally, ery of the population, are more likely to be attacked
in order to test the effectiveness and performance of by predators and constantly try to get a better position
the proposed algorithm in this paper, some compara- (Budgey, 1998; Pomeroy & Hepner, 1992). Note that the
tive experiments are carried out. The simulation results animals, which located on the centre, may move closer
show that the proposed SSA is superior to other exist- to their neighbours in order to minimize their domain
ing algorithms in terms of searching precision, conver- of danger (Hamilton, 1971; Pulliam, 1973). We also know
gence rate, stability and the avoidance of local optimal that all sparrows display the natural instinct of curiosity
value. about everything, and at the same time they are always
The remainder of the paper is organized as follows. vigilant. For example, when a bird does detect a predator,
Section 2 introduces the SSA in detail. Section 3 is the ver- one or more individuals give a chirp and the entire group
ification and comparison of the SSA. Section 4 applies the flies away (Pulliam, 1973).
SSA to the two practical engineering problems and fur-
2.2. Mathematical model and algorithm
ther tests the performance of the algorithm. In Section 5,
we draw the conclusion of this paper and discuss the next According to the previous description of the sparrows,
work. we can establish the mathematical model to construct
24 J. XUE AND B. SHEN

the sparrow search algorithm. For simplicity, we idealized where n shows the number of sparrows, and the value of
the following behaviour of the sparrows and formulated each row in FX represents the fitness value of the individ-
corresponding rules. ual. In the SSA, the producers with better fitness values
have the priority to obtain food in the search process.
(1) The producers typically have high levels of energy In addition, because the producers are responsible for
reserves and provide foraging areas or directions for searching food and guiding the movement of the entire
all scroungers. It is responsible for identifying areas population. Therefore, the producers can search for food
where rich food sources can be found. The level of in a broad range of the places than that of the scroungers.
energy reserves depends on the assessment of the According to rules (1) and (2), during each iteration, the
fitness values of the individuals. location of the producer is updated as below:
(2) Once the sparrow detects the predator, the individ-

uals begin to chirp as alarming signals. When the ⎨X t · exp −i
if R2 < ST
alarm value is greater than the safety threshold, the Xi,jt+1 = i,j
α · itermax (3)
producers need to lead all scroungers to the safe area. ⎩ t
Xi,j + Q · L if R2 ≥ ST
(3) Each sparrow can become a producer as long as
it searches for the better food sources, but the
where t indicates the current iteration, j = 1, 2, . . . , d. Xi,jt
proportion of the producers and the scroungers is
represents the value of the jth dimension of the ith spar-
unchanged in the whole population.
row at iteration t. itermax is a constant with the largest
(4) The sparrows with the higher energy would be acted
number of iterations. α ∈ (0, 1] is a random number. R2
as the producers. Several starving scroungers are
(R2 ∈ [0, 1]) and ST (ST ∈ [0.5, 1.0]) represent the alarm
more likely to fly to other places for food in order to
value and the safety threshold respectively. Q is a ran-
gain more energy.
dom number which obeys normal distribution. L shows
(5) The scroungers follow the producer who can provide
a matrix of 1 × d for which each element inside is 1.
the best food to search for food. In the meantime,
When R2 < ST, which means that there are no preda-
some scroungers may constantly monitor the pro-
tors around, the producer enters the wide search mode.
ducers and compete for food in order to increasing
If R2 ≥ ST, it means that some sparrows have discovered
their own predation rate.
the predator, and all sparrows need quickly fly to other
(6) The sparrows at the edge of the group quickly move
safe areas.
toward the safe area to get a better position when
As for the scroungers, they need to enforce the rules (4)
aware of danger, while the sparrows in the middle
and (5). As mentioned above, some scroungers monitor
of the group randomly walk in order to be close to
the producers more frequently. Once they find that the
others.
producer has found good food, they immediately leave
their current position to compete for food. If they win,
In the simulation experiment, we need to use virtual
they can get the food of the producer immediately, oth-
sparrows to find food. The position of sparrows can be
erwise they continue to execute the rules (5). The position
represented in the following matrix:
update formula for the scrounger is described as follows:
⎡ ⎤ ⎧ t
x1,1 x1,2 · · · · · · x1,d
⎨Q · exp Xworst −Xi,j
t
⎢x2,1 x2,2 · · · · · · x2,d ⎥ if i > n/2
⎢ ⎥ Xi,jt+1 = i2 (4)
X=⎢ . .. .. .. .. ⎥ (1) ⎩ t+1
⎣ .. . . . . ⎦ XP + |Xi,jt − XPt+1 | · A+ · L otherwise
xn,1 xn,2 · · · · · · xn,d
where XP is the optimal position occupied by the pro-
where n is the number of sparrows and d shows the ducer. Xworst denotes the current global worst location. A
dimension of the variables to be optimized. Then, the represents a matrix of 1 × d for which each element inside
fitness value of all sparrows can be expressed by the is randomly assigned 1 or −1, and A+ = AT (AAT )−1 . When
following vector: i > n/2, it suggests that the ith scrounger with the worse
fitness value is most likely to be starving.
⎡ ⎤ In the simulation experiment, we assume that these
f ([x1,1 x1,2 · · · · · · x1,d ])
⎢f ([x2,1 sparrows, which are aware of the danger, account for 10%
⎢ x2,2 · · · · · · x2,d ])⎥⎥
FX = ⎢ . .. .. .. .. ⎥ (2) to 20% of the total population. The initial positions of
⎣ .. . . . . ⎦ these sparrows are randomly generated in the popula-
f ([xn,1 xn,2 · · · · · · xn,d ]) tion. According to rules (6), the mathematical model can
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL 25

be expressed as follows: Algorithm 1 The framework of the SSA.


⎧ Input:

⎨Xbest + β· |Xi,j − Xbest |  if fi > fg
t t t
⎪ G: the maximum iterations
Xi,jt+1 = |Xi,jt − Xworst
t | (5) PD: the number of producers

⎪X t +K · if fi = fg
⎩ i,j (fi − fw ) + ε
SD: the number of sparrows who perceive the danger
R2 : the alarm value
n: the number of sparrows
where Xbest is the current global optimal location. β, as Initialize a population of n sparrows and define its
the step size control parameter, is a normal distribution relevant
of random numbers with a mean value of 0 and a vari- parameters.
ance of 1. K ∈ [−1, 1] is a random number. Here fi is the Output:Xbest , fg .
fitness value of the present sparrow. fg and fw are the cur- 1 : while (t < G)
rent global best and worst fitness values, respectively. ε is 2 : Rank the fitness values and find the current best
the smallest constant so as to avoid zero-division-error. individual and the current worst individual.
For simplicity, when fi > fg indicates that the sparrow 3 : R2 = rand(1)
is at the edge of the group. Xbest represents the loca- 4 : for i = 1 : PD
tion of the centre of the population and is safe around 5: Using equation (3) update the sparrow’s location;
it. fi = fg shows that the sparrows, which are in the mid- 6 : end for
dle of the population, are aware of the danger and need 7 : for i = (PD + 1) : n
to move closer to the others. K denotes the direction in 8: Using equation (4) update the sparrow’s location;
which the sparrow moves and is also the step size control 9 : end for
coefficient. 10 : for l = 1 : SD
Based on the idealization and feasibility of the above 11 : Using equation (5) update the sparrow’s loca-
model, the basic steps of the SSA can be summarized as tion;
the pseudo code shown in Algorithm 1. 12 : end for
13 : Get the current new location;
3. Validation and comparison 14 : If the new location is better than before, update it;
15 : t = t + 1
In this part, there are nineteen standard test functions 16 : end while
verifying the feasibility and effectiveness of the proposed 17 : return Xbest , fg .
SSA, and the test results are compared with PSO, GSA
and GWO. Integrated development environment is Mat-
lab 2014a for all the experimentations. Operating Sys-
tem: Window7. The standard test functions (Fateen & multimodal test functions and fixed-dimension test func-
Bonilla-Petriciolet, 2014; Jamil & Yang, 2013; Rashedi tions corresponding to Tables 1–3, respectively. The size
et al., 2009; Yang, 2010b) are unimodal test functions, of each dimension is 30 (DIM = 30) for the functions of
Tables 1 and 2, and the dimension of the fixed-dimension
test functions can be seen in Table 3.
Table 1. Unimodal test functions (Dim = 30).
Figures 1 and 2 show the trajectories of sparrows in the
Function Initial range Fmin different test functions. We can clearly see that most spar-
n
 rows aggregate towards the global optimum in Figure 1.
F1 (x) = xi2 [−100,100] 0
i=1 Nevertheless, it should be pointed out that in the Dama-
n n

F2 (x) = |xi | + |xi | [−10,10] 0
vandi function of Figure 2, although the most sparrows
i=1 i=1 are clustered at the local minimums, some sparrows still
n i

F3 (x) = ( xj )2 [−100,100] 0
are able to avoid local minimums to move towards the
i=1 j=1 global best (2, 2).
F4 (x) = maxi {|xi |, 1 ≤ i ≤ n} [−100,100] 0 In order to make the algorithm more convincing, in all
n−1

F5 (x) = [100(xi+1 − xi2 )2 + (xi − 1)2 ] [−30,30] 0 cases, we run 30 times independent trials on each test
i=1
n
function. The maximum number of the iterations is 1000

F6 (x) = ([xi + 0.5])2 [−100,100] 0 and the population size is set to 100 (n = 100) in each
i=1 trial. The parameters of the GWO are arranged as fol-
n
F7 (x) = ixi4 + random[0, 1) [−1.28,1.28] 0 lows: a value is linearly decreased from 2 to 0 and r1 , r2
i=1
are random vectors in [0, 1]. The parameters of the PSO
26 J. XUE AND B. SHEN

Table 2. Multimodal test functions (Dim = 30).


Function Initial range Fmin
n
 
F8 (x) = −xi sin( |xi |) [−500,500] −418.9829 n
i=1
n
F9 (x) = [xi2 − 10 cos(2πxi ) + 10] [−5.12,5.12] 0
i=1     n 
1 n 2 1
F10 (x) = −20 exp −0.2 x − exp cos(2πx i ) + 20 + e [−32,32] 0
n i=1 i n
i=1
n n
1  2  xi
F11 (x) = xi − cos √ + 1 [−600,600] 0
4000 i
 i=1 i=1
n−1

π 
2 2 2
F12 (x) = 10 sin(πy1 ) + (yi − 1) [1 + 10 sin (πyi+1 )] + (yn − 1) [−50, 50] 0
n
i=1
 n
+ u(xi , 10, 100, 4)
i=1
xi + 1
yi = 1 +
4 ⎧
⎪ m
⎨k(xi − a) xi > a
u(xi , a, k, m) = 0 −a < xi < a

⎩k(−x − a)m
i xi < −a

Table 3. Fixed-dimension test functions.


Function Dim Initial range Fmin
1
F13 (x) = 4x12 − 2.1x14 + x16 + x1 x2 − 4x22 + 4x24 2 [−5,5] −1.0316
  3  
 sin[π(x1 − 2)] sin[π(x2 − 2)] 5
F14 (x) = 1 −    [2 + (x1 − 7)2 + 2(x2 − 7)2 ] 2 [0,14] 0
π 2 (x1 − 2)(x2 − 2) 

|100− x12 +x22 /π |
F15 (x) = −(|e sin(x1 ) sin(x2 )| + 1)−0.1 2 [−10,10] −1.0
n n 2  n
F16 (x) = [e− i=1 (xi /β)2m − 2e− i=1 xi ] cos2 (xi ), β = 15, m = 5 2 [−20,20] −1.0
 2 i=1

11
x1 (b2i + bi x2 )
F17 (x) = ai − 4 [−5,5] 0.000307
b2i + bi x3 + x4
i=1 ⎛ ⎞

4  3
F18 (x) = − ci exp ⎝− aij (xj − pij ) ⎠
2
3 [0,1] −3.86
i=1 j=1
⎛ ⎞

4 
6
F19 (x) = − ci exp ⎝− aij (xj − pij ) 2⎠
6 [0,1] −3.32
i=1 j=1

are c1 = c2 = 1.49445, w = 0.729. The parameters of the are to make the algorithm concentrate to exploit during
GSA are G0 = 100, α = 20. The parameters of the SSA the optimization process to find the global optimum.
are set as follows: the number of the producers and SD
accounts for 20% and 10%, respectively, and ST = 0.8. 3.1.1. Analysis of the convergence accuracy
Finally, we get the best value, the mean value, and the As shown in Table 4, the SSA obtains the optimal value for
standard deviation (Std) of the objective function values. solving F1 –F4 . Although the SSA does not get the opti-
With the same standard test function, the average value mal value when solving F5 , it is also significantly better
represents the convergence accuracy of the algorithm, than GWO, PSO, and GSA. On the F6 test function, the con-
and the standard deviation presents the stability of the vergence accuracy of the PSO is higher than GWO, GSA
algorithm. The solutions of the four different algorithms and SSA. For the F7 test function, the SSA is slightly better
are shown in Table 4. than the other three algorithms from the average values
obtained.

3.1. Unimodal test functions


3.1.2. Analysis of the stability
On the unimodal test functions, it mainly reflects the From the Std in Table 4, we can get the standard devia-
good convergence property and exploitation capability tion of the SSA is zero on the F1 –F4 . This shows that our
of the algorithm. Generally speaking, these test functions algorithm can yield more stable results than those of PSO,
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL 27

Figure 1. The paths of the SSA on the 2-D version of the test functions: (a) Michalewicz function, (b) Drop-wave function, (c) Rastrigin
function.

GWO and GSA. On the F5 test function, the SSA is the and PSO. The SSA gets a good fitness value at the begin-
most stable, and PSO is the most unstable. When deal- ning and converges to a better value after about 400
ing with the F6 test function, the stability of the SSA is iterations for the F5 test function. In addition, It can be
worse than that of GSA and PSO, but it is much better seen from the convergence curves of the F6 and F7 test
than GWO. On the F7 test function, the SSA is similar in functions that the proposed, SSA, not only enhances the
the performance stability to the other three algorithms. In convergence rate but also has strong competitiveness
summary, the simulation experimental results show the compared with other algorithms.
satisfactory performance of the SSA in comparison with From the above, we draw a conclusion that the pro-
the other algorithms that have been reported. posed SSA can quickly find the feasible solution and has
the best performance in terms of efficiency and conver-
gence in dealing with unimodal test functions.
3.1.3. Analysis of the convergence speed
In order to compare the convergence speed of the four
3.2. Multimodal test functions
algorithms intuitively, the fitness curves of the unimodal
test functions are shown in Figure 3. It can be concluded On the multimodal test functions, each of the functions is
that the SSA exhibits an absolute advantage on the F1 –F4 characterized by multiple local optimal solutions, which
test functions, which is obviously better than GWO, GSA makes the algorithm easy to fall into local optimal points.
28 J. XUE AND B. SHEN

Figure 2. The paths of the SSA on the 2-D version of the test functions: (a) Ackley function, (b) Damavandi function.

Therefore, it can be employed to test the local search and SSA has better stability in comparison to GWO, GSA and
global search abilities of the algorithm. PSO. Furthermore, from these results, we clearly know
that the SSA has better performance and strong adapt-
3.2.1. Analysis of the convergence accuracy ability when dealing with multimodal test functions.
From Table 4, it can be seen that there are five test func-
tions, namely F8 –F12 in which the SSA has better out-
performs than the other algorithms. Then, for the F8 test 3.2.3. Analysis of the convergence speed
function, the SSA can achieve the solution which is closest Firstly, we test the convergence speed of the algorithm on
to the optimal value compared to the other algorithms. the multimodal test functions, the results of all the algo-
For the F9 test function, the SSA can successfully find an rithms are shown in Figure 4. Then, for the F8 test function,
excellent solution and it always converges to the global it can be reasonably concluded that the SSA converges to
minimum in each experiment. It is clear that the SSA has a value close to the optimal solution after about 200 iter-
a good global search capability. The search abilities of SSA ations. This also makes it clearer that the SSA highlights
and GWO on the F10 test function are basically the same, its superiority. It can be proved from the F9 test function
followed by GSA, and the worst is PSO. On the F11 test that the SSA converges to the optimal value after about
function, the four algorithms can quickly converge to the 20 iterations and the GWO converges to the optimal value
optimal value of the function. But by obtaining the mean, after about 180 iterations. As a result, the proposed SSA
we can notice that although GWO, GSA and PSO can find has a much faster speed than others. For the remaining
the global optimal value, it is easy to fall into the local opti- functions, the proposed algorithm also obtains very com-
mum value during the iterative process. For the F12 test petitive results. Overall, the figure shows that the SSA has
function, the SSA shows good performance in all aspect. higher search efficiency and convergence rate than the
On the whole, the SSA algorithm has strong exploration other three algorithms for the multimodal functions, and
capability. the processing process is very stable.
After the above analysis, we further conclude that the
3.2.2. Analysis of the stability SSA has global search ability and strong adaptability.
On the F8 test function, we can see that the proposed Note that it is controlled by the mechanism of the SSA
algorithm is better than the other three algorithms in itself. In the SSA, the different behavioural strategies of
terms of solution accuracy, but it is relatively poor in sta- the sparrows have made great contributions to the global
bility. For the remaining functions, i.e. F9 –F12 in which the search.
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL 29

9.7994E−19
1.3844E−09
3.6476E+01
1.3933E−10
1.2923E−01
3.3. Fixed-dimension test function

8.4977E−19
1.6588E−03

1.5989E−10

3.1101E−02
6.5368E−16

7.3655E−02
4.9189E−04
2.6097E−15
1.3323E−15
3.9740
0.0012
404.0800
2.9088

0.3537
Std
In order to more fully test the performance of the SSA, we
selected seven fixed-dimensional functions to verify the
convergence speed, stability and convergence accuracy

−9.3836E−01
3.6298E−18
9.9675E−09
1.1588E+02
1.0016E−09
2.5982E+01
3.8571E−18
5.5783E−03

1.5116E−09

1.0367E−02

1.9607E−03
7.6944

0.1687

−1.0316
1.8186
−0.0085

−3.8628
−3.3220
of the algorithm.

−3059.1
GSA
Ave

3.3.1. Analysis of the convergence accuracy


According to Table 4, for the F13 test function, the simu-
lating results indicate that the four algorithms can search
1.6686E−18
7.6671E−09
5.7431E+01
7.4751E−10
2.5746E+01
2.6556E−18
2.2159E−03

1.2460E−09

1.5659E−20

7.8978E−04
2.9549

−1.0316
0.3394
−0.0114

−3.8628
−3.3220
−3996.8
the optimal value quickly and efficiently. More concretely,

0.0

−1.0
Best

the four algorithms show a good balance between the


exploration and the exploitation on the F13 test function.
For the F14 test function, the accuracy of the SSA is much
7.7381E−24
3.1274E−10

1.8180E−24
2.7000E−03

6.7752E−16

2.6776E−04
2.7101E−15
0.0712
0.0097

0.3651
0.3936
0.4795
35.9017

838.1568
9.8581

0.0220
0.6375

0.1209

0.0592
advantageous all of the comparison algorithms. More-
Std

over, from the optimal value obtained, both GWO and


PSO find a better value but the GSA gets the worst value.
On the two test functions, F15 and F16 , the SSA perfor-
2.3453E−24
9.6805E−11

1.2547E−24
7.0000E−03

5.9429E−04
0.0857
0.0100
46.5122

44.4083

0.0160

−3.8628
0.3429

0.0484
−1.0316
1.9333
−0.5732
−0.6667

−3.2744
mances almost same, but the other three algorithms are
−6960.8
Ave
PSO

easy to fall into local optimum. For the F17 test function,
the four algorithms may trap into local optima in each
independent experiment. On the F18 test function, all four
−8.5487E−14
1.3375E−26
1.2849E−13

8.8377E−04

2.5356E−26
3.2000E−03

7.1942E−14

4.0845E−27

3.0749E−04

algorithms can find the optimal solution, but by analysing


0.0099

8.1702

29.8495

−1.0316

−3.8628
−3.3220
−1.0
−1.0
−8700.4

0.0
Best

the average value we can find that GWO is slightly worse.


The performance of the GSA for solving F19 is better than
the other three algorithms.
8.0928E−85
4.0798E−49
7.3559E−26
9.1734E−22

1.3641E−04

2.5721E−15
3.7560E−03
1.4395E−02
2.0977E−09

3.7200E−04
0.7228
0.1948

1.4976

0.5920
0.1825
0.4901

0.0020
615.6736

0.0659

3.3.2. Analysis of the stability


Std

For the F13 test function in Table 4, the stabilities of SSA,


GSA and PSO are excellent than GWO. On the F14 test
function, it can be found that the standard deviations of
2.9804E−85
4.0755E−49
2.1458E−26
7.5793E−22

2.3525E−04

1.0125E−14
1.4380E−03
3.3584E−02

3.3700E−04
26.0690
0.2325

0.2734

−1.0316
1.7593
−0.0336
−0.3667

−3.8622
−3.2347

GWO, PSO and GSA are relatively large, which indicates


−6347.9
GWO
Ave

that the stability is comparatively poor. And the standard


deviation of the SSA is relatively small, which means that
the stability is better. The standard deviation of the SSA is
−1.03162845
4.1687E−88
4.9914E−50
6.1877E−32
4.6452E−23

3.6952E−06
3.1934E−05

7.9936E−15

6.2991E−03

3.7946E−06

3.0749E−04
25.1280

−3.8628
−3.3220

zero on the F15 and F16 test functions, which shows that
Best

0.0

−1.0
−1.0
−7570.5

0.0

the sparrows can stably gather around the global best


point. On the F17 test function, the stability of the SSA
is slightly better compared with other three algorithms.
For the F18 test function, the GSA, SSA and PSO are better
1.9881E−06
1.5154E−10
1.7597E−04

1.5378E−11
6.5843E−16
1.3466E−09

1.5695E−05
2.7101E−15
698.7294

0.0605

stability, and GWO is the worst stability. On the F19 test


Std
0.0
0.0
0.0
0.0

0.0

0.0
0.0
0.0
0.0

function, the stability of the GSA is the best.

3.3.3. Analysis of the convergence speed


2.1196E−259

5.7257E−278
Table 4. Results of test functions.

9.2495E−07
6.8266E−11
1.3458E−04

8.8818E−16

7.1744E−12

3.9574E−10

3.1035E−04
−1.0316

−3.8628
−3.2625
−7726.67

The convergence curve is illustrated in Figure 5. The four


0.0

0.0

0.0

0.0

−1.0
−1.0
SSA
Ave

algorithms have high convergence speed on the F13 test


function. From the convergence curve of F14 , it is obvi-
ous that the convergence trend of GWO, GSA and PSO is
9.9463E−10
8.1322E−14
9.3993E−06

8.8818E−16

8.6262E−17

1.7097E−13

3.0749E−04
−1.0316

−3.8628
−3.3220

the same during the optimization process. For the F15 test
Best
0.0
0.0
0.0
0.0

−9013.0
0.0

0.0

−1.0
−1.0

function, it can be found that the SSA quickly converges


to the optimal value after about 200 iterations. Thus, the
convergence speed of the SSA is very fast. In addition,
F10
F11
F12
F13
F14
F15
F16
F17
F18
F19
F1
F2
F3
F4
F5
F6
F7
F8
F9

there are four test functions, namely F16 –F19 in which the
F
30 J. XUE AND B. SHEN

Figure 3. The convergence characteristics of the four algorithms on the unimodal test functions.

Figure 4. The convergence characteristics of the four algorithms on the multimodal test functions.
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL 31

Figure 5. The convergence characteristics of the four algorithms on the fixed-dimension test functions.

convergence speed of the SSA is faster than that of GWO, The formula described as follows:
PSO and GSA. This is because the SSA rapidly converges 
p
to a stable value at the beginning of iteration. f̌ (x) = f (x) + φ gκj (x)δ(gj (x)) (6)
The simulation results show that the SSA has strong j=1
optimization ability for the optimization of the uni-
where κ ∈ {1, 2}, and φ  1 is the penalty parameter. f̌ (x)
modal test functions, multimodal test functions and
is the penalized objective function, and f (x) is the original
fixed-dimension test functions. Moreover, it is seen that
fitness function. p denotes the number of the inequality
the SSA has a certain competitiveness with other state-
constraints. gj (x)(j = 1,2, . . . ,p) are the inequality con-
of-the-art algorithms. Therefore, it is believed that the
straints. In addition, δ(gj (x)) is defined as
SSA is able to achieve a certain balance between global

exploration and local exploitation. 1, if gj (x) > 0
δ(gj (x)) =
0, if gj (x) ≤ 0

4. Case studies
4.1. Case I. Himmelblau’s nonlinear optimization
In this section, the two practical engineering problems are problem
chosen to illustrate the competitiveness of the proposed
The Himmelblau’s nonlinear optimization is a well-known
algorithm in solving constrained optimization problems
benchmark problem, which has been applied to many
with mix variables. For the processing of the inequal-
fields. The problem is outlined as
ity constraints in the problem we use penalty functions,
which embeds constraints into the objective function. min : f (x) = 5.3578547x32 + 0.8356891x1 x5
32 J. XUE AND B. SHEN

+ 37.293239x1 − 40, 792.141 stress, transverse deflections of the shafts and stresses in
the shafts. The variables b(x1 ), m(x2 ), z(x3 ), l1 (x4 ), l2 (x5 ),
s.t. g1 (x) = 85.334407 + 0.0056858x2 x5
d1 (x6 ), d2 (x7 ) are the face width, module of teeth, number
+ 0.0006262x1 x4 − 0.0022053x3 x5 of teeth on pinion, length of shaft one between bearings,
g2 (x) = 80.51249 + 0.0071317x2 x5 length of shaft two between bearings and the diameter of
the first and second shafts, repectively. These constraints
+ 0.0029955x1 x2 + 0.0021813x32 and the problem are designed as below:
g3 (x) = 9.300961 + 0.0047026x3 x5
min : f (b; m; z; l1 ; l2 ; d1 ; d2 ) = 0.7854bm2 (3.3333z2
+ 0.0012547x1 x3 + 0.0019085x3 x4
+ 14.9334z − 43.0934) − 1.508b(d12 + d22 )
0 ≤ g1 (x) ≤ 92
+ 7.4777(d13 + d23 ) + 0.7854(l1 d12 + l2 d22 )
90 ≤ g2 (x) ≤ 110
27
20 ≤ g3 (x) ≤ 25 s.t. g1 (x) = − 1 ≤ 0,
bm2 z
78 ≤ x1 ≤ 102 397.5
g2 (x) = −1≤0
bm2 z2
33 ≤ x2 ≤ 45
1.93l13
27 ≤ x3 ≤ 45 g3 (x) = −1≤0
mzd14
27 ≤ x4 ≤ 45
1.93l23
27 ≤ x5 ≤ 45 (7) g4 (x) = −1≤0
mzd24

We independently run 30 times to generate the sta- M2 + 16.9 × 106
g5 (x) = −1≤0
tistical results. Table 5 illustrates the optimized results 110d13
obtained by the SSA for the Himmelblau’s nonlin- √
H2 + 157.5 × 106
ear optimization problem. Moreover, the feasible best g6 (x) = −1≤0
85d23
solution is x = (78, 33, 29.9953, 45, 36.7758) with f (x) =
mz
−30, 665.5387. The constraint values are g = (92, 98.8405, g7 (x) = −1≤0
20). It can be clearly seen that the proposed algorithm is 40
5m
feasible on this issue. g8 (x) = −1≤0
b
b
4.2. Case II. Speed reducer design g9 (x) = −1≤0
12m
The Figure 6 presents the designing of the speed reducer, 1.5d1 + 1.9
g10 (x) = −1≤0
which is to minimize the total weight and subject to con- l1
straints on the bending stress of the gear teeth, surfaces 1.1d2 + 1.9
g11 (x) = −1≤0
l2
Table 5. Results obtained by the SSA for the Himmelblau’s non- 2.6 ≤ b ≤ 3.6
linear optimization problem.
0.7 ≤ m ≤ 0.8
Best Worst Mean Std No. sparrow
17 ≤ z ≤ 28
−30,665.5387 −30,662.8505 −30,665.3808 0.5713 40
7.3 ≤ l1 ≤ 8.3
7.8 ≤ l2 ≤ 8.3
2.9 ≤ d1 ≤ 3.9
(8)
where
745l1 745l2
M= , H=
mz mz
In the SSA, the maximum number of the iterations
is 3000 and the sparrow population size is set to 300.
Figure 6. Schematic of the speed reducer design (Gandomi, We independently run 30 times. Table 6 shows the opti-
Yang, & Alavi, 2013). mization results for the speed reducer problem. In this
SYSTEMS SCIENCE & CONTROL ENGINEERING: AN OPEN ACCESS JOURNAL 33

Table 6. Results obtained by the SSA for the speed reducer prob- References
lem.
Barnard, C. J., & Sibly, R. M. (1981). Producers and scroungers: A
Best Worst Mean Std general model and its application to captive flocks of house
2996.7077 3008.1638 2997.7101 2.8209 sparrows. Animal Behaviour, 29, 543–550.
Barta, Z., Liker, A., & Mónus, F. (2004). The effects of predation
risk on the use of social foraging tactics. Animal Behaviour, 67,
301–308.
case, the optimal solution obtained by our algorithm is Bautista, L. M., Alonso, J. C., & Alonso, J. A. (1998). Foraging site
x = (3.500059, 0.7, 17, 7.3, 7.8, 3.351209, 5.286813) with a displacement in common crane flocks. Animal Behaviour, 56,
1237–1243.
function value of 2996.7077. Here, the constraint val-
Budgey, R. (1998). Three dimensional bird flock structure and
ues are g = (−0.073931, −0.19801, −0.49977, −0.90148, its implications for birdstrike tolerence in aircraft. Stara Lesna:
−0.00089021, −7.374e − 05, −0.7025, −1.6971e − 05, International Bird Strike Committee.
−0.58333, −0.051121, −0.010834). Hence, our result is Coolen, I., Giraldeau, L. A., & Lavoie, M. (2001). Head position as
feasible and verifies the effectiveness of the proposed an indicator of producer and scrounger tactics in a ground-
algorithm. feeding bird. Animal Behaviour, 61, 895–903.
Dorigo, M., Maniezzo, V., & Colorni, A. (1996). The ant system:
Optimization by a colony of cooperating agents. IEEE Trans-
actions on Systems, Man and Cybernetics, Part B (Cybernetics),
5. Conclusion 26(1), 29–41.
Eberhart, R. C., & Shi, Y. (2001). Particle swarm optimiza-
In this paper, we present an effective optimization tech- tion: Developments, applications and resources. In Pro-
nique, the sparrow search algorithm, which simulates ceedings of the IEEE congress on evolutionary computation
the foraging and anti-predation behaviours of sparrows. (pp. 81–86).
Then, it introduces the mathematical model and the Fateen, S. E. K., & Bonilla-Petriciolet, A. (2014). Intelligent firefly
framework of the proposed algorithm. Finally, the per- algorithm for global optimization. In X. S. Yang (Ed.), Cuckoo
search and firefly algorithm, Springer International Publishing,
formance of the SSA is compared with that of the GWO,
Cham (Vol. 516, pp. 315–330).
PSO and GSA on 19 test functions. The results demon- Gandomi, A. H., Yang, X. S., & Alavi, A. H. (2013). Cuckoo search
strate that the proposed SSA can provide highly com- algorithm: A metaheuristic approach to solve structural opti-
petitive results compared with the other state-of-the-art mization problems. Engineering with Computers, 29(1), 17–35.
algorithms in terms of searching precision, convergence Hamilton, W. D. (1971). Geometry for the selfish herd. Journal of
speed, and stability. Moreover, the results of the two prac- Theoretical Biology, 31, 295–311.
Holland, J. H. (1975). Adapation in natural and artificial systems.
tical engineering problems also exhibit that the SSA has Ann Arbor: University of Michigan Press.
high performance in diverse search spaces. As analysis Holland, J. H. (1992). Genetic algorithms. Scientific American, 267,
above, it can be seen that the SSA has a good ability to 66–72.
explore the potential region of the global optimum, and Jamil, M., & Yang, X. S. (2013). A literature survey of bench-
hence the local optimum issue is avoided effectively. mark functions for global optimisation problems. Interna-
tional Journal of Mathematical Modelling and Numerical Opti-
In our further research, we would continue to do more
misation, 4(2), 150–194.
in-depth analysis and research on the SSA. Also, we would Johnson, C., Grant, J. W. A., & Giraldeau, L.-A. (2001). The effect
try to apply this algorithm in more complex practical engi- of handling time on interference among house sparrows for-
neering problems, such as travelling salesman problem aging at different seed densities. Behaviour, 138, 597–614.
(TSP), robot path planning problem, etc. Moreover, we Karaboga, D. (2005). An idea based on honey bee swarm for
would extend the current SSA to deal with the multi- numerical optimization (Technical report-TR06). Erciyes Uni-
versity, Engineering Faculty, Computer Engineering Depart-
objective optimization problem. ment (Vol. 129(2), pp. 2865–2874).
Karaboga, D., & Basturk, B. (2007). A powerful and efficient
algorithm for numerical function optimization: Artificial bee
Disclosure statement colony (ABC) algorithm. Journal of Global Optimization, 39(3),
No potential conflict of interest was reported by the authors. 459–471.
Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. In
IEEE international conference on neural networks proceedings
Funding (pp. 1942–1948).
Kirkpatrick, S., Gelatto, C. D., & Vecchi, M. P. (1983). Optimization
This work was supported in part by the National Natural Science by simulated annealing. Science, 220, 671–680.
Foundation of China [grant numbers 61873059 and 61922024], Koops, M. A., & Giraldeau, L. A. (1996). Producer-scrounger for-
the Program for Professor of Special Appointment (Eastern aging games in starlings: A test of rate-maximizing and risk-
Scholar) at Shanghai Institutions of Higher Learning of China, sensitive models. Animal Behaviour, 51, 773–783.
and the Natural Science Foundation of Shanghai [grant number Lendvai, A. Z., Barta, Z., Liker, A., & Bokony, V. (2004). The
18ZR1401500]. effect of energy reserves on social foraging: Hungry sparrows
34 J. XUE AND B. SHEN

scrounge more. Proceedings of the Royal Society of London. Yang, X. S. (2008). Nature-inspired metaheuristic algorithms.
Series B: Biological Sciences, 271, 2467–2472. Frome: Luniver Press.
Liker, A., & Barta, Z. (2002). The effects of dominance on Yang, X. S. (2010a). Firefly algorithm, stochastic test functions
social foraging tactic use in house sparrows. Behaviour, 139, and design optimisation. International Journal of Bio-Inspired
1061–1076. Computation, 2(2), 78–84.
Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf optimizer. Yang, X. S. (2010b). Test problems in optimization. In Engineering
Advances in Engineering Software, 69, 46–61. optimization: An introduction with metaheuristic applications
Pomeroy, H., & Hepner, F. (1992). Structure of turning in airborne (pp. 261–266).
rock dove (Columba livia) flocks. The Auk, 109(2), 256–267. Yang, X. S., & Deb, S. (2009). Cuckoo search via Lévy flights. In
Pulliam, H. R. (1973). On the advantages of flocking. Journal of Proceedings of the world congress on nature and biologically
Theoretical Biology, 38, 419–422. inspired computing, NaBIC 2009 (pp. 210–214).
Rashedi, E., Nezamabadi-Pour, H., & Saryazdi, S. (2009). GSA: A Yang, X. S., & He, X. (2013). Bat algorithm: Literature review
gravitational search algorithm. Information Sciences, 179(13), and applications. International Journal of Bio-Inspired Compu-
2232–2248. tation, 5(3), 141–149.
Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems
for optimization. IEEE Transactions on Evolutionary Computa-
tion, 1(1), 67–82.

You might also like