A Novel Swarm Intelligence Optimization Approach: Sparrow Search Algorithm
To cite this article: Jiankai Xue & Bo Shen (2020) A novel swarm intelligence optimization
approach: sparrow search algorithm, Systems Science & Control Engineering, 8:1, 22-34, DOI:
10.1080/21642583.2019.1708830
1. Introduction

Optimization problems are common in engineering applications such as knapsack problems, data clustering, data classification, path planning, robot control, and so on. It is well known that swarm intelligence (SI) optimization algorithms have been used as primary techniques for solving global optimization problems because of their simplicity, flexibility and high efficiency. It should be mentioned that SI optimization algorithms mainly introduce randomness into the search process, which differs from deterministic approaches. Note that deterministic algorithms easily get trapped in local optimal solutions in complex situations. Therefore, it is of practical importance to employ an SI optimization algorithm so as to obtain an optimal solution to a global optimization problem.

In the past decades, SI optimization algorithms have developed rapidly and become a research hotspot in many fields. So far, many different types of optimization algorithms are available in the existing literature. Among them, the ant colony optimization (ACO) algorithm and the particle swarm optimization (PSO) algorithm are representative ones that have received considerable attention. For example, the ACO algorithm was proposed in Dorigo, Maniezzo, and Colorni (1996); it mimics the biological characteristics of ants in nature, which mark their paths with pheromone. As one of the well-known algorithms, the PSO algorithm was proposed in Eberhart and Shi (2001) and Kennedy and Eberhart (1995); it mimics the cooperation and foraging behaviour of bird flocks.

On the other hand, because ACO and PSO were proven to be very competitive and to have strong global searching ability, SI optimization algorithms have attracted increasing attention from scholars in this area. Many new algorithms have been proposed that imitate the social behaviour of organisms such as fish, birds, or insects in nature. For example, the bat algorithm (BA) was proposed in Yang and He (2013); it mimics the echolocation behaviour of bats. The grey wolf optimizer (GWO) algorithm (Mirjalili, Mirjalili, & Lewis, 2014) is another popular algorithm, mimicking the leadership and hunting behaviour of grey wolves. There are four types of grey wolves in the GWO algorithm: alpha, beta, delta, and omega wolves. The alpha wolves are responsible for making decisions for the pack, whereas the beta and delta wolves help the alpha wolves in the decision-making process. Further recent SI optimization algorithms include the artificial bee colony (ABC) algorithm (Karaboga, 2005; Karaboga & Basturk, 2007), the firefly algorithm (FA) (Yang, 2008, 2010a), the cuckoo search (CS) algorithm (Yang & Deb, 2009), and so on.

Apart from the SI optimization algorithms, some algorithms are inspired by the concept of natural evolution or by physical rules. For example, the well-known genetic algorithm (GA) was presented in Holland (1975, 1992); it is a powerful stochastic search algorithm based on the principles of natural selection and natural genetics. Generally, it consists of three operators, namely selection, reproduction, and mutation, which make the GA an efficient global optimizer. Moreover, it should be pointed out that there are also algorithms proposed by simulating physical rules, such as the gravitational search algorithm (GSA) (Rashedi, Nezamabadi-Pour, & Saryazdi, 2009), simulated annealing (SA) (Kirkpatrick, Gelatto, & Vecchi, 1983), etc.

On the other hand, a deeper investigation shows that although each algorithm has its advantages, it also has shortcomings. For instance, the ACO algorithm suffers from slow search speed, and the PSO algorithm suffers from easy premature convergence. Therefore, it is very important to improve the current optimization algorithms. According to the no-free-lunch (NFL) theorem (Wolpert & Macready, 1997), the expected performance of every algorithm is the same over all optimization problems. In other words, an optimization algorithm may perform well on one series of problems and poorly on a different series. Obviously, we can address different problems by proposing new optimization algorithms, and a newly proposed optimization algorithm may provide a new way to solve a complex global optimization problem.

In response to the above discussion, in this paper we aim to propose a novel swarm intelligence optimization technique called the sparrow search algorithm (SSA). The main contributions of this paper are summarized as follows: (1) a new SI technique, the SSA, is proposed, inspired by the foraging and anti-predation behaviours of sparrow populations; (2) by using the proposed SSA, both the exploration and the exploitation of the search space are improved to some extent; and (3) the proposed SSA is successfully applied to two practical engineering problems. Finally, in order to test the effectiveness and performance of the proposed algorithm, some comparative experiments are carried out. The simulation results show that the proposed SSA is superior to other existing algorithms in terms of searching precision, convergence rate, stability and the avoidance of local optima.

The remainder of the paper is organized as follows. Section 2 introduces the SSA in detail. Section 3 presents the verification and comparison of the SSA. Section 4 applies the SSA to two practical engineering problems and further tests the performance of the algorithm. In Section 5, we draw the conclusions of this paper and discuss future work.

2. Sparrow search algorithm (SSA)

In this section, we discuss the inspiration of the SSA. Then, the mathematical model and the SSA are described in detail.

2.1. Biological characteristics

Sparrows are usually gregarious birds and comprise many species. They are distributed in most parts of the world and like to live where humans live. Moreover, they are omnivorous birds and mainly feed on grain or weed seeds. It is well known that sparrows are common resident birds. In contrast with many other small birds, the sparrow is highly intelligent and has a strong memory. Note that there are two different types of captive house sparrows, the producer and the scrounger (Barnard & Sibly, 1981). The producers actively search for food sources, while the scroungers obtain food from the producers. Furthermore, the evidence shows that birds usually use behavioural strategies flexibly and switch between producing and scrounging (Barta, Liker, & Mónus, 2004; Coolen, Giraldeau, & Lavoie, 2001; Koops & Giraldeau, 1996; Liker & Barta, 2002). In other words, in order to find their food, sparrows usually use both the producer and the scrounger strategies (Barnard & Sibly, 1981; Johnson, Grant, & Giraldeau, 2001; Liker & Barta, 2002).

Studies have shown that individuals monitor the behaviour of the others in the group. Meanwhile, attackers in a bird flock, which want to increase their own predation rate, compete for the food resources of companions with high intake (Bautista, Alonso, & Alonso, 1998; Lendvai, Barta, Liker, & Bokony, 2004). In addition, the energy reserves of the individuals may play an important role when a sparrow chooses a foraging strategy: sparrows with low energy reserves scrounge more (Lendvai et al., 2004). It is worth mentioning that birds located on the periphery of the population are more likely to be attacked by predators and constantly try to get a better position (Budgey, 1998; Pomeroy & Hepner, 1992). Note that animals located in the centre may move closer to their neighbours in order to minimize their domain of danger (Hamilton, 1971; Pulliam, 1973). We also know that all sparrows display a natural instinct of curiosity about everything while remaining constantly vigilant. For example, when a bird detects a predator, one or more individuals give a chirp and the entire group flies away (Pulliam, 1973).

2.2. Mathematical model and algorithm

According to the previous description of sparrows, we can establish a mathematical model from which to construct the sparrow search algorithm. For simplicity, we idealize the behaviours of the sparrows and formulate the following rules.

(1) The producers typically have high levels of energy reserves and provide foraging areas or directions for all scroungers. They are responsible for identifying areas with rich food sources. The level of energy reserves depends on the assessment of the fitness values of the individuals.
(2) Once a sparrow detects the predator, the individuals begin to chirp as alarming signals. When the alarm value is greater than the safety threshold, the producers need to lead all scroungers to the safe area.
(3) Each sparrow can become a producer as long as it finds better food sources, but the proportion of producers and scroungers is unchanged in the whole population.
(4) The sparrows with higher energy act as the producers. Starving scroungers are more likely to fly to other places for food in order to gain more energy.
(5) The scroungers follow the producer that can provide the best food. In the meantime, some scroungers may constantly monitor the producers and compete for food in order to increase their own predation rate.
(6) The sparrows at the edge of the group quickly move toward the safe area to get a better position when aware of danger, while the sparrows in the middle of the group walk randomly in order to stay close to the others.

In the simulation experiments, we use virtual sparrows to find food. The position of the sparrows can be represented by the following matrix:

    X = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,d} \\ x_{2,1} & x_{2,2} & \cdots & x_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n,1} & x_{n,2} & \cdots & x_{n,d} \end{bmatrix}    (1)

where n is the number of sparrows and d is the dimension of the variables to be optimized. Then, the fitness values of all sparrows can be expressed by the following vector:

    F_X = \begin{bmatrix} f([x_{1,1}\; x_{1,2}\; \cdots\; x_{1,d}]) \\ f([x_{2,1}\; x_{2,2}\; \cdots\; x_{2,d}]) \\ \vdots \\ f([x_{n,1}\; x_{n,2}\; \cdots\; x_{n,d}]) \end{bmatrix}    (2)

where n is the number of sparrows and the value of each row of F_X is the fitness value of the corresponding individual. In the SSA, the producers with better fitness values have priority to obtain food in the search process. In addition, because the producers are responsible for searching for food and guiding the movement of the entire population, they can search for food over a broader range of places than the scroungers. According to rules (1) and (2), during each iteration the location of the producer is updated as follows:

    X_{i,j}^{t+1} = \begin{cases} X_{i,j}^{t} \cdot \exp\left(\dfrac{-i}{\alpha \cdot iter_{\max}}\right), & \text{if } R_2 < ST \\ X_{i,j}^{t} + Q \cdot L, & \text{if } R_2 \ge ST \end{cases}    (3)

where t indicates the current iteration and j = 1, 2, ..., d. X_{i,j}^t represents the value of the jth dimension of the ith sparrow at iteration t. iter_max is a constant denoting the largest number of iterations. α ∈ (0, 1] is a random number. R_2 (R_2 ∈ [0, 1]) and ST (ST ∈ [0.5, 1.0]) represent the alarm value and the safety threshold, respectively. Q is a random number drawn from a normal distribution. L is a 1 × d matrix in which each element is 1. When R_2 < ST, meaning that there are no predators around, the producer enters the wide search mode. If R_2 ≥ ST, some sparrows have discovered the predator, and all sparrows need to fly quickly to other safe areas.

As for the scroungers, they follow rules (4) and (5). As mentioned above, some scroungers monitor the producers frequently. Once they find that a producer has found good food, they immediately leave their current position to compete for the food. If they win, they get the producer's food immediately; otherwise they continue to follow rule (5). The position update formula for the scrounger is as follows:

    X_{i,j}^{t+1} = \begin{cases} Q \cdot \exp\left(\dfrac{X_{worst}^{t} - X_{i,j}^{t}}{i^{2}}\right), & \text{if } i > n/2 \\ X_{P}^{t+1} + \left|X_{i,j}^{t} - X_{P}^{t+1}\right| \cdot A^{+} \cdot L, & \text{otherwise} \end{cases}    (4)

where X_P is the optimal position occupied by the producer, X_worst denotes the current global worst location, and A is a 1 × d matrix in which each element is randomly assigned 1 or −1, with A^+ = A^T (A A^T)^{-1}. When i > n/2, the ith scrounger with the worse fitness value is most likely to be starving.

In the simulation experiments, we assume that the sparrows that are aware of the danger account for 10% to 20% of the total population. The initial positions of these sparrows are randomly generated in the population. According to rule (6), their mathematical model can be established.
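Putting Eqs. (1)–(4) together, the producer and scrounger updates can be sketched as below. This is a minimal illustration and not the authors' reference implementation: the update for the danger-aware sparrows of rule (6) is omitted because its formula falls outside this excerpt, and the parameter choices (n = 30, 20% producers, ST = 0.8) simply mirror the experimental settings quoted later in the paper.

```python
import numpy as np

def sparrow_search(f, dim, n=30, iter_max=100, lb=-5.0, ub=5.0,
                   pd_frac=0.2, ST=0.8, seed=0):
    """Minimal SSA sketch: producer update (Eq. 3) and scrounger
    update (Eq. 4) only; the rule-(6) vigilance update is omitted."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n, dim))      # Eq. (1): n x d positions
    n_prod = max(1, int(pd_frac * n))           # number of producers
    fit = np.apply_along_axis(f, 1, X)          # Eq. (2): fitness vector
    best_x, best_f = X[np.argmin(fit)].copy(), fit.min()

    for _ in range(iter_max):
        order = np.argsort(fit)                 # best sparrows first
        X, fit = X[order], fit[order]
        # --- producers, Eq. (3) ---
        R2 = rng.random()                       # alarm value in [0, 1]
        for i in range(n_prod):
            if R2 < ST:                         # no predator: wide search
                alpha = rng.uniform(1e-8, 1.0)
                X[i] = X[i] * np.exp(-(i + 1) / (alpha * iter_max))
            else:                               # predator: move by Q * L
                X[i] = X[i] + rng.normal() * np.ones(dim)
        # --- scroungers, Eq. (4) ---
        xp = X[0].copy()                        # best producer position
        worst = X[np.argmax(fit)].copy()
        for i in range(n_prod, n):
            if i + 1 > n / 2:                   # starving: fly elsewhere
                X[i] = rng.normal() * np.exp((worst - X[i]) / (i + 1) ** 2)
            else:                               # compete near the producer
                A = rng.choice([-1.0, 1.0], size=dim)
                # A+ = A^T (A A^T)^-1 reduces to A / dim for a +-1 row vector
                X[i] = xp + (np.abs(X[i] - xp) @ (A / dim)) * np.ones(dim)
        X = np.clip(X, lb, ub)
        fit = np.apply_along_axis(f, 1, X)
        if fit.min() < best_f:                  # track the global best
            best_f, best_x = fit.min(), X[np.argmin(fit)].copy()
    return best_x, best_f
```

On a simple sphere function the multiplicative producer update of Eq. (3) steadily contracts the best individuals toward the origin, so the tracked best fitness improves over the iterations.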
… the parameters of the PSO are c1 = c2 = 1.49445 and w = 0.729. The parameters of the GSA are G0 = 100 and α = 20. The parameters of the SSA are set as follows: the numbers of producers and of danger-aware sparrows (SD) account for 20% and 10% of the population, respectively, and ST = 0.8. Finally, we record the best value, the mean value, and the standard deviation (Std) of the objective function values. For the same standard test function, the average value represents the convergence accuracy of the algorithm, and the standard deviation reflects its stability. The solutions of the four different algorithms are shown in Table 4.

Figure 1. The paths of the SSA on the 2-D version of the test functions: (a) Michalewicz function, (b) Drop-wave function, (c) Rastrigin function.

Figure 2. The paths of the SSA on the 2-D version of the test functions: (a) Ackley function, (b) Damavandi function.

… are to make the algorithm concentrate on exploitation during the optimization process so as to find the global optimum.

3.1.1. Analysis of the convergence accuracy

As shown in Table 4, the SSA obtains the optimal value when solving F1–F4. Although the SSA does not reach the optimal value when solving F5, it is still significantly better than GWO, PSO, and GSA. On the F6 test function, the convergence accuracy of the PSO is higher than that of GWO, GSA and SSA. For the F7 test function, the SSA is slightly better than the other three algorithms in terms of the average values obtained.

3.1.2. Analysis of the stability

… GWO and GSA. On the F5 test function, the SSA is the most stable and PSO is the most unstable. When dealing with the F6 test function, the stability of the SSA is worse than that of GSA and PSO, but much better than that of GWO. On the F7 test function, the SSA is similar in stability to the other three algorithms. In summary, the simulation results show the satisfactory performance of the SSA in comparison with the other reported algorithms.

3.1.3. Analysis of the convergence speed

In order to compare the convergence speed of the four algorithms intuitively, the fitness curves on the unimodal test functions are shown in Figure 3. It can be concluded that the SSA exhibits an absolute advantage on the F1–F4 test functions and is obviously better than GWO, GSA and PSO. The SSA obtains a good fitness value at the beginning and converges to a better value after about 400 iterations on the F5 test function. In addition, it can be seen from the convergence curves of the F6 and F7 test functions that the proposed SSA not only improves the convergence rate but is also strongly competitive with the other algorithms.

From the above, we conclude that the proposed SSA can quickly find feasible solutions and has the best performance in terms of efficiency and convergence on the unimodal test functions.

3.2. Multimodal test functions

Each of the multimodal test functions is characterized by multiple local optimal solutions, which makes it easy for an algorithm to fall into local optima. Therefore, they can be employed to test the local and global search abilities of the algorithm.

3.2.1. Analysis of the convergence accuracy

From Table 4, it can be seen that there are five test functions, namely F8–F12, on which the SSA outperforms the other algorithms. For the F8 test function, the SSA achieves the solution closest to the optimal value compared with the other algorithms. For the F9 test function, the SSA successfully finds an excellent solution and always converges to the global minimum in each experiment. It is clear that the SSA has a good global search capability. The search abilities of the SSA and GWO on the F10 test function are basically the same, followed by GSA, with PSO the worst. On the F11 test function, all four algorithms quickly converge to the optimal value of the function, but from the mean values we can notice that although GWO, GSA and PSO can find the global optimal value, they easily fall into local optima during the iterative process. For the F12 test function, the SSA shows good performance in all aspects. On the whole, the SSA has strong exploration capability.

3.2.2. Analysis of the stability

On the F8 test function, the proposed algorithm is better than the other three algorithms in terms of solution accuracy, but it is relatively poor in stability. For the remaining functions, i.e. F9–F12, the SSA has better stability in comparison with GWO, GSA and PSO. Furthermore, from these results we clearly see that the SSA has better performance and strong adaptability when dealing with multimodal test functions.

3.2.3. Analysis of the convergence speed

Firstly, we test the convergence speed of the algorithms on the multimodal test functions; the results are shown in Figure 4. For the F8 test function, it can be reasonably concluded that the SSA converges to a value close to the optimal solution after about 200 iterations, which again highlights its superiority. The F9 test function shows that the SSA converges to the optimal value after about 20 iterations, whereas the GWO does so only after about 180 iterations. As a result, the proposed SSA is much faster than the others. For the remaining functions, the proposed algorithm also obtains very competitive results. Overall, the figure shows that the SSA has higher search efficiency and a faster convergence rate than the other three algorithms on the multimodal functions, and the search process is very stable.

After the above analysis, we further conclude that the SSA has strong global search ability and adaptability. Note that this is governed by the mechanism of the SSA itself: the different behavioural strategies of the sparrows contribute greatly to the global search.
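The reporting protocol above (Best, Mean and Std of the final objective values over repeated independent runs) can be sketched as follows. The optimizer here is a deliberately simple random-search stand-in, not the SSA, and the sphere function stands in for a benchmark:

```python
import random
import statistics

def random_search(f, dim, lb, ub, n_evals=2000, seed=None):
    # Illustrative stand-in optimizer (NOT the SSA): uniform random sampling.
    rng = random.Random(seed)
    return min(f([rng.uniform(lb, ub) for _ in range(dim)])
               for _ in range(n_evals))

def sphere(x):
    # A typical unimodal benchmark with global minimum 0 at the origin.
    return sum(v * v for v in x)

# 30 independent runs, then Best / Mean (Ave) / Std of the final values.
finals = [random_search(sphere, dim=5, lb=-5.0, ub=5.0, seed=s)
          for s in range(30)]
best = min(finals)                  # searching precision
mean = statistics.mean(finals)      # convergence accuracy
std = statistics.stdev(finals)      # stability
```

Fixing a distinct seed per run keeps the 30 runs independent yet reproducible, which is the usual way such comparison tables are regenerated.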
Table 4. Results of test functions (Best, Ave and Std values of the SSA, GWO, PSO and GSA on each test function).

3.3. Fixed-dimension test functions

In order to more fully test the performance of the SSA, we selected seven fixed-dimension test functions to verify the convergence speed, stability and convergence accuracy of the algorithm. … the optimal value quickly and efficiently. More concretely, … advantageous over all of the comparison algorithms. Moreover, … performances are almost the same, but the other three algorithms are easy to fall into local optima. For the F17 test function, the four algorithms may trap into local optima in each independent experiment. On the F18 test function, all four … zero on the F15 and F16 test functions, which shows that … the same during the optimization process. For the F15 test … there are four test functions, namely F16–F19, in which the …
Figure 3. The convergence characteristics of the four algorithms on the unimodal test functions.
Figure 4. The convergence characteristics of the four algorithms on the multimodal test functions.
Figure 5. The convergence characteristics of the four algorithms on the fixed-dimension test functions.
… the convergence speed of the SSA is faster than that of GWO, PSO and GSA. This is because the SSA rapidly converges to a stable value at the beginning of the iterations.

The simulation results show that the SSA has strong optimization ability on the unimodal, multimodal and fixed-dimension test functions. Moreover, the SSA is competitive with other state-of-the-art algorithms. Therefore, it is believed that the SSA achieves a good balance between global exploration and local exploitation.

4. Case studies

In this section, two practical engineering problems are chosen to illustrate the competitiveness of the proposed algorithm in solving constrained optimization problems with mixed variables. To handle the inequality constraints in these problems, we use penalty functions, which embed the constraints into the objective function. The formula is described as follows:

    \check{f}(x) = f(x) + \phi \sum_{j=1}^{p} g_j^{\kappa}(x)\, \delta(g_j(x))    (6)

where κ ∈ {1, 2} and φ ≫ 1 is the penalty parameter. f̌(x) is the penalized objective function and f(x) is the original fitness function. p denotes the number of inequality constraints, and g_j(x) (j = 1, 2, ..., p) are the inequality constraints. In addition, δ(g_j(x)) is defined as

    \delta(g_j(x)) = \begin{cases} 1, & \text{if } g_j(x) > 0 \\ 0, & \text{if } g_j(x) \le 0 \end{cases}

4.1. Case I. Himmelblau's nonlinear optimization problem

Himmelblau's nonlinear optimization problem is a well-known benchmark that has been applied in many fields. The problem is outlined as

    min  f(x) = 5.3578547 x_3^2 + 0.8356891 x_1 x_5 + 37.293239 x_1 - 40792.141
    s.t. g_1(x) = 85.334407 + 0.0056858 x_2 x_5 + 0.0006262 x_1 x_4 - 0.0022053 x_3 x_5
         g_2(x) = 80.51249 + 0.0071317 x_2 x_5 + 0.0029955 x_1 x_2 + 0.0021813 x_3^2
         g_3(x) = 9.300961 + 0.0047026 x_3 x_5 + 0.0012547 x_1 x_3 + 0.0019085 x_3 x_4
         0 ≤ g_1(x) ≤ 92
         90 ≤ g_2(x) ≤ 110
         20 ≤ g_3(x) ≤ 25
         78 ≤ x_1 ≤ 102
         33 ≤ x_2 ≤ 45
         27 ≤ x_3 ≤ 45
         27 ≤ x_4 ≤ 45
         27 ≤ x_5 ≤ 45    (7)

We independently run the algorithm 30 times to generate statistical results. Table 5 illustrates the optimized results obtained by the SSA for Himmelblau's nonlinear optimization problem. The feasible best solution is x = (78, 33, 29.9953, 45, 36.7758) with f(x) = −30,665.5387, and the constraint values are g = (92, 98.8405, 20). It can be clearly seen that the proposed algorithm is feasible on this problem.

Table 5. Results obtained by the SSA for Himmelblau's nonlinear optimization problem.

Best          Worst         Mean          Std     No. sparrows
−30,665.5387  −30,662.8505  −30,665.3808  0.5713  40

4.2. Case II. Speed reducer design

Figure 6 presents the design of the speed reducer, where the goal is to minimize the total weight subject to constraints on the bending stress of the gear teeth, the surface stress, the transverse deflections of the shafts and the stresses in the shafts. The variables b (x_1), m (x_2), z (x_3), l_1 (x_4), l_2 (x_5), d_1 (x_6), d_2 (x_7) are the face width, module of teeth, number of teeth on the pinion, length of the first shaft between bearings, length of the second shaft between bearings, and the diameters of the first and second shafts, respectively. The constraints and the problem are designed as below:

    min  f(b; m; z; l_1; l_2; d_1; d_2) = 0.7854 b m^2 (3.3333 z^2 + 14.9334 z - 43.0934)
         - 1.508 b (d_1^2 + d_2^2) + 7.4777 (d_1^3 + d_2^3) + 0.7854 (l_1 d_1^2 + l_2 d_2^2)
    s.t. g_1(x) = 27/(b m^2 z) - 1 ≤ 0
         g_2(x) = 397.5/(b m^2 z^2) - 1 ≤ 0
         g_3(x) = 1.93 l_1^3/(m z d_1^4) - 1 ≤ 0
         g_4(x) = 1.93 l_2^3/(m z d_2^4) - 1 ≤ 0
         g_5(x) = \sqrt{M^2 + 16.9 \times 10^6}/(110 d_1^3) - 1 ≤ 0
         g_6(x) = \sqrt{H^2 + 157.5 \times 10^6}/(85 d_2^3) - 1 ≤ 0
         g_7(x) = m z / 40 - 1 ≤ 0
         g_8(x) = 5 m / b - 1 ≤ 0
         g_9(x) = b/(12 m) - 1 ≤ 0
         g_10(x) = (1.5 d_1 + 1.9)/l_1 - 1 ≤ 0
         g_11(x) = (1.1 d_2 + 1.9)/l_2 - 1 ≤ 0
         2.6 ≤ b ≤ 3.6
         0.7 ≤ m ≤ 0.8
         17 ≤ z ≤ 28
         7.3 ≤ l_1 ≤ 8.3
         7.8 ≤ l_2 ≤ 8.3
         2.9 ≤ d_1 ≤ 3.9    (8)

where M = 745 l_1/(m z) and H = 745 l_2/(m z).
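Under the penalty-function treatment of Eq. (6), the speed reducer problem of Eq. (8) can be sketched in code as follows. This is a minimal illustration: φ and κ are chosen arbitrarily within the ranges stated above, since the paper does not report its exact values.

```python
import math

# Penalty parameters: the paper only requires kappa in {1, 2} and phi >> 1;
# these particular values are illustrative choices.
PHI, KAPPA = 1.0e6, 2

def reducer_weight(x):
    # Objective of Eq. (8): total weight of the speed reducer.
    b, m, z, l1, l2, d1, d2 = x
    return (0.7854 * b * m**2 * (3.3333 * z**2 + 14.9334 * z - 43.0934)
            - 1.508 * b * (d1**2 + d2**2)
            + 7.4777 * (d1**3 + d2**3)
            + 0.7854 * (l1 * d1**2 + l2 * d2**2))

def reducer_constraints(x):
    # g_1 ... g_11 of Eq. (8), each written in the form g_j(x) <= 0.
    b, m, z, l1, l2, d1, d2 = x
    M = 745.0 * l1 / (m * z)
    H = 745.0 * l2 / (m * z)
    return [
        27.0 / (b * m**2 * z) - 1.0,
        397.5 / (b * m**2 * z**2) - 1.0,
        1.93 * l1**3 / (m * z * d1**4) - 1.0,
        1.93 * l2**3 / (m * z * d2**4) - 1.0,
        math.sqrt(M**2 + 16.9e6) / (110.0 * d1**3) - 1.0,
        math.sqrt(H**2 + 157.5e6) / (85.0 * d2**3) - 1.0,
        m * z / 40.0 - 1.0,
        5.0 * m / b - 1.0,
        b / (12.0 * m) - 1.0,
        (1.5 * d1 + 1.9) / l1 - 1.0,
        (1.1 * d2 + 1.9) / l2 - 1.0,
    ]

def penalized(x):
    # Eq. (6): f_check(x) = f(x) + phi * sum_j g_j(x)**kappa * delta(g_j(x))
    f = reducer_weight(x)
    for g in reducer_constraints(x):
        if g > 0.0:                 # delta(g_j(x)) = 1 only when violated
            f += PHI * g**KAPPA
    return f
```

At the best solution reported in Section 4.2, x = (3.500059, 0.7, 17, 7.3, 7.8, 3.351209, 5.286813), all eleven constraints evaluate as non-positive, so the penalized objective coincides with the raw weight of about 2996.7.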
Figure 6. Schematic of the speed reducer design (Gandomi, Yang, & Alavi, 2013).

In the SSA, the maximum number of iterations is 3000 and the sparrow population size is set to 300. We independently run the algorithm 30 times.
Table 6 shows the optimization results for the speed reducer problem.

Table 6. Results obtained by the SSA for the speed reducer problem.

Best       Worst      Mean       Std
2996.7077  3008.1638  2997.7101  2.8209

In this case, the optimal solution obtained by our algorithm is x = (3.500059, 0.7, 17, 7.3, 7.8, 3.351209, 5.286813) with a function value of 2996.7077. Here, the constraint values are g = (−0.073931, −0.19801, −0.49977, −0.90148, −0.00089021, −7.374e−05, −0.7025, −1.6971e−05, −0.58333, −0.051121, −0.010834). Hence, our result is feasible and verifies the effectiveness of the proposed algorithm.

5. Conclusion

In this paper, we present an effective optimization technique, the sparrow search algorithm, which simulates the foraging and anti-predation behaviours of sparrows. We then introduce the mathematical model and the framework of the proposed algorithm. Finally, the performance of the SSA is compared with that of the GWO, PSO and GSA on 19 test functions. The results demonstrate that the proposed SSA provides highly competitive results compared with the other state-of-the-art algorithms in terms of searching precision, convergence speed, and stability. Moreover, the results on the two practical engineering problems also show that the SSA performs well in diverse search spaces. As analysed above, the SSA has a good ability to explore the potential region of the global optimum, and hence the local-optimum issue is avoided effectively.

In our further research, we will continue with more in-depth analysis of the SSA. We will also try to apply the algorithm to more complex practical engineering problems, such as the travelling salesman problem (TSP) and robot path planning. Moreover, we will extend the current SSA to deal with multi-objective optimization problems.

Disclosure statement

No potential conflict of interest was reported by the authors.

Funding

This work was supported in part by the National Natural Science Foundation of China [grant numbers 61873059 and 61922024], the Program for Professor of Special Appointment (Eastern Scholar) at Shanghai Institutions of Higher Learning of China, and the Natural Science Foundation of Shanghai [grant number 18ZR1401500].

References

Barnard, C. J., & Sibly, R. M. (1981). Producers and scroungers: A general model and its application to captive flocks of house sparrows. Animal Behaviour, 29, 543–550.

Barta, Z., Liker, A., & Mónus, F. (2004). The effects of predation risk on the use of social foraging tactics. Animal Behaviour, 67, 301–308.

Bautista, L. M., Alonso, J. C., & Alonso, J. A. (1998). Foraging site displacement in common crane flocks. Animal Behaviour, 56, 1237–1243.

Budgey, R. (1998). Three dimensional bird flock structure and its implications for birdstrike tolerance in aircraft. Stara Lesna: International Bird Strike Committee.

Coolen, I., Giraldeau, L. A., & Lavoie, M. (2001). Head position as an indicator of producer and scrounger tactics in a ground-feeding bird. Animal Behaviour, 61, 895–903.

Dorigo, M., Maniezzo, V., & Colorni, A. (1996). The ant system: Optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), 26(1), 29–41.

Eberhart, R. C., & Shi, Y. (2001). Particle swarm optimization: Developments, applications and resources. In Proceedings of the IEEE congress on evolutionary computation (pp. 81–86).

Fateen, S. E. K., & Bonilla-Petriciolet, A. (2014). Intelligent firefly algorithm for global optimization. In X. S. Yang (Ed.), Cuckoo search and firefly algorithm (Vol. 516, pp. 315–330). Cham: Springer International Publishing.

Gandomi, A. H., Yang, X. S., & Alavi, A. H. (2013). Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Engineering with Computers, 29(1), 17–35.

Hamilton, W. D. (1971). Geometry for the selfish herd. Journal of Theoretical Biology, 31, 295–311.

Holland, J. H. (1975). Adaptation in natural and artificial systems. Ann Arbor: University of Michigan Press.

Holland, J. H. (1992). Genetic algorithms. Scientific American, 267, 66–72.

Jamil, M., & Yang, X. S. (2013). A literature survey of benchmark functions for global optimisation problems. International Journal of Mathematical Modelling and Numerical Optimisation, 4(2), 150–194.

Johnson, C., Grant, J. W. A., & Giraldeau, L.-A. (2001). The effect of handling time on interference among house sparrows foraging at different seed densities. Behaviour, 138, 597–614.

Karaboga, D. (2005). An idea based on honey bee swarm for numerical optimization (Technical report TR06). Erciyes University, Engineering Faculty, Computer Engineering Department.

Karaboga, D., & Basturk, B. (2007). A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. Journal of Global Optimization, 39(3), 459–471.

Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. In IEEE international conference on neural networks proceedings (pp. 1942–1948).

Kirkpatrick, S., Gelatto, C. D., & Vecchi, M. P. (1983). Optimization by simulated annealing. Science, 220, 671–680.

Koops, M. A., & Giraldeau, L. A. (1996). Producer–scrounger foraging games in starlings: A test of rate-maximizing and risk-sensitive models. Animal Behaviour, 51, 773–783.

Lendvai, A. Z., Barta, Z., Liker, A., & Bokony, V. (2004). The effect of energy reserves on social foraging: Hungry sparrows scrounge more. Proceedings of the Royal Society of London. Series B: Biological Sciences, 271, 2467–2472.

Liker, A., & Barta, Z. (2002). The effects of dominance on social foraging tactic use in house sparrows. Behaviour, 139, 1061–1076.

Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf optimizer. Advances in Engineering Software, 69, 46–61.

Pomeroy, H., & Hepner, F. (1992). Structure of turning in airborne rock dove (Columba livia) flocks. The Auk, 109(2), 256–267.

Pulliam, H. R. (1973). On the advantages of flocking. Journal of Theoretical Biology, 38, 419–422.

Rashedi, E., Nezamabadi-Pour, H., & Saryazdi, S. (2009). GSA: A gravitational search algorithm. Information Sciences, 179(13), 2232–2248.

Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1), 67–82.

Yang, X. S. (2008). Nature-inspired metaheuristic algorithms. Frome: Luniver Press.

Yang, X. S. (2010a). Firefly algorithm, stochastic test functions and design optimisation. International Journal of Bio-Inspired Computation, 2(2), 78–84.

Yang, X. S. (2010b). Test problems in optimization. In Engineering optimization: An introduction with metaheuristic applications (pp. 261–266).

Yang, X. S., & Deb, S. (2009). Cuckoo search via Lévy flights. In Proceedings of the world congress on nature and biologically inspired computing, NaBIC 2009 (pp. 210–214).

Yang, X. S., & He, X. (2013). Bat algorithm: Literature review and applications. International Journal of Bio-Inspired Computation, 5(3), 141–149.