SCA with source code
Knowledge-Based Systems
Article history: Received 8 June 2015; Revised 19 December 2015; Accepted 27 December 2015; Available online 6 January 2016

Keywords: Optimization; Stochastic optimization; Constrained optimization; Meta-heuristic; Population-based algorithm

Abstract

This paper proposes a novel population-based optimization algorithm called Sine Cosine Algorithm (SCA) for solving optimization problems. The SCA creates multiple initial random candidate solutions and requires them to fluctuate outwards or towards the best solution using a mathematical model based on sine and cosine functions. Several random and adaptive variables are also integrated into this algorithm to emphasize exploration and exploitation of the search space at different milestones of optimization. The performance of SCA is benchmarked in three test phases. Firstly, a set of well-known test cases including unimodal, multi-modal, and composite functions is employed to test the exploration, exploitation, local optima avoidance, and convergence of SCA. Secondly, several performance metrics (search history, trajectory, average fitness of solutions, and the best solution during optimization) are used to qualitatively observe and confirm the performance of SCA on shifted two-dimensional test functions. Finally, the cross-section of an aircraft's wing is optimized by SCA as a real challenging case study to verify and demonstrate the performance of this algorithm in practice. The results on the test functions and performance metrics prove that the proposed algorithm is able to explore different regions of a search space, avoid local optima, converge towards the global optimum, and exploit promising regions of a search space effectively during optimization. The SCA algorithm obtains a smooth airfoil shape with very low drag, which demonstrates that this algorithm can be highly effective in solving real problems with constrained and unknown search spaces. Note that the source codes of the SCA algorithm are publicly available at http://www.alimirjalili.com/SCA.html.

© 2015 Elsevier B.V. All rights reserved. http://dx.doi.org/10.1016/j.knosys.2015.12.022
Fig. 2. Example of a search space with two variables and several constraints.
Without loss of generality, a single-objective optimization problem can be formulated as a minimization problem as follows:

Minimize: f(x1, x2, . . . , xn)   (2.1)
Subject to: gi(x1, x2, . . . , xn) ≤ 0, i = 1, 2, . . . , m   (2.2)
hi(x1, x2, . . . , xn) = 0, i = 1, 2, . . . , p   (2.3)
lbi ≤ xi ≤ ubi, i = 1, 2, . . . , n   (2.4)

where n is the number of variables, m indicates the number of inequality constraints, p shows the number of equality constraints, lbi is the lower bound of the i-th variable, and ubi is the upper bound of the i-th variable.

As can be seen in Eqs. (2.2) and (2.3), there are two types of constraints: inequality and equality. The set of variables, constraints, and the objective constructs a search space for a given problem. Unfortunately, it is usually impossible to draw the search space due to the high dimensionality of the variables. However, an example of a search space constructed by two variables and several constraints is shown in Fig. 2.

It may be observed in Fig. 2 that the search space can have multiple local optima, but one of them is the global optimum (or more than one in the case of a flat landscape). The constraints create gaps in the search space and occasionally split it into various separated regions. In the literature, infeasible regions refer to the areas of the search space that violate the constraints.

The search space of a real problem can be very challenging. Some of the difficulties of real search spaces are discontinuity, a large number of local optima, a large number of constraints, a global optimum located on the boundaries of constraints, deceptive valleys towards local optima, and isolation of the global optimum. An optimization algorithm should be equipped with suitable operators for handling all these difficulties to find the global optimum.

With a problem formulated, an optimizer is able to tune its variables based on the outputs and constraints. As mentioned in Section 1, one of the advantages of stochastic algorithms is that they consider a system as a black box. Fig. 3 shows that the optimizer only provides the system with variables and observes the outputs. The optimizer then iteratively and stochastically changes the inputs of the system based on the feedback (outputs) obtained so far until an end criterion is satisfied. The process of changing the variables based on the history of outputs is defined by the mechanism of an algorithm. For instance, PSO saves the best solutions obtained so far and encourages new solutions to relocate around them.

Fig. 3. Stochastic population-based optimizers consider the system as a black box.
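To make Eqs. (2.1)–(2.4) and the black-box view of Fig. 3 concrete, the following sketch poses a small constrained minimization problem in exactly this form: the optimizer only ever calls the objective and the constraint checks, never their gradients. The specific functions, bounds, and tolerance are illustrative and not taken from the paper.

```python
import numpy as np

# Illustrative two-variable problem in the form of Eqs. (2.1)-(2.4):
# one inequality constraint, one equality constraint, and box bounds.
def f(x):                      # Eq. (2.1): objective to minimize (hypothetical)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def g(x):                      # Eq. (2.2): inequality constraint, g(x) <= 0
    return x[0] + x[1] - 1.0

def h(x):                      # Eq. (2.3): equality constraint, h(x) = 0
    return x[0] - 2.0 * x[1]

lb = np.array([-5.0, -5.0])    # Eq. (2.4): lower bounds
ub = np.array([5.0, 5.0])      #            upper bounds

def is_feasible(x, tol=1e-6):
    """A point is feasible if it satisfies the bounds and all constraints."""
    return (np.all(x >= lb) and np.all(x <= ub)
            and g(x) <= 0.0 and abs(h(x)) <= tol)

# A stochastic optimizer treats f, g, h as a black box: sample, evaluate, repeat.
x = lb + np.random.rand(2) * (ub - lb)
print(x, f(x), is_feasible(x))
```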
In the field of optimization, a revolutionary idea was proposed by Holland in 1977, where evolutionary concepts in nature were simulated in a computer for solving optimization problems [44]. The GA algorithm came into existence and opened a new way of tackling challenging problems in different fields of study. The general idea of the GA algorithm was very simple: it mimicked the selection, recombination, and mutation of genes in nature. In fact, Darwin's theory of evolution was the main inspiration of this algorithm. In GA, the optimization process starts by creating a set of random solutions as candidate solutions (individuals) for a given optimization problem. Each variable of the problem is considered as a gene, and the set of variables is analogous to a chromosome. Similarly to nature, a cost function defines the fitness of each chromosome. The whole set of solutions is considered as a population. Once the fitness of the chromosomes has been calculated, the best chromosomes are randomly selected for creating the next population. The main inspiration of the GA algorithm lies here, in which the fittest individuals have a higher probability of being selected and of participating in creating the next population, similarly to what happens in nature. The next step is the combination of the selected individuals. In this step the genes from pairs of individuals are randomly merged to produce new individuals. Eventually, some of the individuals' genes in the population are changed randomly to mimic mutation.

The GA algorithm proved that nature-inspired paradigms can be very simple yet powerful in optimizing problems. After the proposal of the GA algorithm, the field of stochastic optimization techniques received much attention. The PSO algorithm [52] is the outcome of this popularity, proposed several years after the invention of the GA algorithm. The PSO algorithm mimics the social and individual behavior of herds of animals, schools of fish, or flocks of birds in foraging. Similarly to the GA algorithm, the optimization process starts with a set of randomly created solutions. In addition to the set of solutions, there is another set, called the velocity set, which is responsible for storing and defining the amount of movement of particles. During optimization, the velocity of a particle is updated based on the best solution that it has obtained so far as well as the best solution that the swarm has found. There are three random components defining the tendency towards the previous velocity, the effect of the personal best, and the impact of the global best.
Since the best solutions are saved in the PSO algorithm, there is always a high possibility of finding better solutions when searching around them. This is the key reason for the success of the PSO algorithm.

After the development of the GA and PSO algorithms, several other algorithms were developed and proposed as well. As mentioned in the introduction, they can be divided into two main classes: individual-based versus population-based algorithms. An individual-based algorithm creates only a single solution and evolves/improves it over the course of iterations. A population-based algorithm, however, initializes the optimization process with more than one solution. The solutions in this set are then enhanced over the course of iterations. The way that these two families perform optimization is illustrated in Fig. 4. The advantage of individual-based algorithms is the need for a low number of function evaluations, because a single solution only needs one function evaluation per iteration. Therefore, such optimization techniques require 1 × T function evaluations, where T is the maximum number of iterations. However, a high probability of local optima stagnation and the lack of information sharing are the main drawbacks of these algorithms, which is due to the low number of solutions. Fig. 4(a) shows that the single candidate solution becomes entrapped in a local optimum which is very close to the global optimum.

In contrast, population-based algorithms benefit from high local optima avoidance since they employ multiple solutions. Fig. 4(b) illustrates how the collection of candidate solutions results in finding the global optimum. Multiple solutions also assist a population-based algorithm to collect information from different regions of the search space easily. This is done by information exchange between the search agents during the optimization process. Therefore, the search agents are able to explore and exploit search spaces better and faster. However, the main drawback of these methods is the large number of function evaluations. Such optimization techniques require n × T function evaluations, where n is the number of solutions (search agents) and T is the maximum number of iterations.

2.3. Motivation of this work

Despite the need for more function evaluations, the literature shows that population-based algorithms are highly suitable for solving real challenging problems since they are able to avoid local optima, explore the search space, and exploit the global optimum more reliably compared to individual-based algorithms. In addition, the NFL theorem states that all algorithms perform equally when averaged over all optimization problems. Therefore, there are still problems that have not yet been solved, or that can be solved better by new algorithms. These two reasons are the main motivations of this work, in which a novel population-based optimization algorithm is proposed and compared to the current well-known algorithms in the literature.

3. Sine Cosine Algorithm (SCA)

Generally speaking, population-based optimization techniques start the optimization process with a set of random solutions. This random set is evaluated repeatedly by an objective function and improved by a set of rules that is the core of an optimization technique. Since population-based optimization techniques look for the optima of optimization problems stochastically, there is no guarantee of finding a solution in a single run. However, with a sufficient number of random solutions and optimization steps (iterations), the probability of finding the global optimum increases.

Regardless of the differences between algorithms in the field of stochastic population-based optimization, what they have in common is the division of the optimization process into two phases: exploration versus exploitation [62]. In the former phase, an optimization algorithm combines the random solutions in the set of solutions abruptly, with a high rate of randomness, to find the promising regions of the search space. In the exploitation phase, however, there are gradual changes in the random solutions, and the random variations are considerably smaller than those in the exploration phase.

In this work, the following position updating equations are proposed for both phases:

X_i^(t+1) = X_i^t + r1 × sin(r2) × |r3 P_i^t − X_i^t|   (3.1)

X_i^(t+1) = X_i^t + r1 × cos(r2) × |r3 P_i^t − X_i^t|   (3.2)

where X_i^t is the position of the current solution in the i-th dimension at the t-th iteration, r1/r2/r3 are random numbers, P_i^t is the position of the destination point in the i-th dimension, and | | indicates the absolute value.

These two equations are combined to be used as follows:

X_i^(t+1) = X_i^t + r1 × sin(r2) × |r3 P_i^t − X_i^t|,  if r4 < 0.5
X_i^(t+1) = X_i^t + r1 × cos(r2) × |r3 P_i^t − X_i^t|,  if r4 ≥ 0.5   (3.3)

where r4 is a random number in [0, 1].
Fig. 5. Effects of sine and cosine in Eqs. (3.1) and (3.2) on the next position.
r1 = a − t (a/T)   (3.4)

where t is the current iteration, T is the maximum number of iterations, and a is a constant.
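The position update in Eqs. (3.1)–(3.4) is compact enough to sketch in a few lines. The authors' source code is available at the URL given in the abstract; the Python sketch below is only an independent illustration. In particular, the ranges r2 ∈ [0, 2π], r3 ∈ [0, 2], r4 ∈ [0, 1] and the constant a = 2 are common choices assumed here, since the excerpt above does not state them explicitly.

```python
import numpy as np

def sca(objective, lb, ub, n_agents=30, dim=20, max_iter=500, a=2.0):
    """Minimal sketch of the SCA position update in Eqs. (3.1)-(3.4).

    Assumed (not given in the excerpt): r2 ~ U(0, 2*pi), r3 ~ U(0, 2),
    r4 ~ U(0, 1), and a = 2 for the linearly decreasing r1.
    """
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + np.random.rand(n_agents, dim) * (ub - lb)   # random initial solutions
    fitness = np.apply_along_axis(objective, 1, X)
    best_idx = fitness.argmin()
    P, best_f = X[best_idx].copy(), fitness[best_idx]    # destination point (best so far)

    for t in range(max_iter):
        r1 = a - t * (a / max_iter)                      # Eq. (3.4): decreases from a to 0
        r2 = 2 * np.pi * np.random.rand(n_agents, dim)
        r3 = 2 * np.random.rand(n_agents, dim)
        r4 = np.random.rand(n_agents, dim)
        sine_move = X + r1 * np.sin(r2) * np.abs(r3 * P - X)    # Eq. (3.1)
        cosine_move = X + r1 * np.cos(r2) * np.abs(r3 * P - X)  # Eq. (3.2)
        X = np.where(r4 < 0.5, sine_move, cosine_move)          # Eq. (3.3)
        X = np.clip(X, lb, ub)                                  # keep solutions in bounds

        fitness = np.apply_along_axis(objective, 1, X)
        if fitness.min() < best_f:                              # update the destination point
            best_f = fitness.min()
            P = X[fitness.argmin()].copy()
    return P, best_f

# Example: minimize the sphere function (F1 in the benchmark set)
best_x, best_val = sca(lambda x: np.sum(x**2), lb=[-100]*20, ub=[100]*20)
```

The defaults (30 search agents, 500 iterations) match the experimental setup used later in the paper for the benchmark functions.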
Fig. 7. Sine and cosine with the range in [−2,2] allow a solution to go around (inside the space between them) or beyond (outside the space between them) the destination.
• The SCA algorithm smoothly transits from exploration to exploitation using the adaptive range in the sine and cosine functions.
• The best approximation of the global optimum is stored in a variable as the destination point and is never lost during optimization.
• Since the solutions always update their positions around the best solution obtained so far, there is a tendency towards the best regions of the search space during optimization.
• Since the proposed algorithm considers optimization problems as black boxes, it is readily incorporable into problems in different fields, subject to proper problem formulation.

The next sections employ a wide range of test problems and one real case study to investigate, analyse, and confirm the effectiveness of the SCA algorithm.

4. Results and discussion

In the field of optimization using meta-heuristics and evolutionary algorithms, several test cases should be employed to confirm the performance of an algorithm. This is due to the stochastic nature of these algorithms, in which a proper and sufficient set of test functions and case studies should be employed to confidently make sure that superior results have not happened by chance. However, there is no clear definition of suitability for a set of benchmark case studies. Therefore, researchers try to test their algorithms on as many test cases as possible. This paper also employs several test functions with different characteristics. Later, a real challenging Computational Fluid Dynamics (CFD) problem is solved by the SCA algorithm as well.

The set of case studies employed includes three families of test functions: unimodal, multi-modal, and composite test functions [63–66].
Table 1
Results on benchmark functions (normalized average (ave) and standard deviation (std) over 30 independent runs; the first ave/std pair corresponds to SCA and the following pairs to the comparison algorithms).
ave std ave std ave std ave std ave std ave std ave
F1 0.0000 0.0000 0.0003 0.0011 0.8078 0.4393 1.0000 1.0000 0.2111 0.0717 0.0004 0.0002 0.0000
F2 0.0000 0.0001 0.0693 0.2164 0.5406 0.2363 1.0000 1.0000 0.9190 0.7804 0.0177 0.0179 0.0100
F3 0.0371 0.1372 0.0157 0.0158 0.5323 0.2423 1.0000 1.0000 0.2016 0.1225 0.0000 0.0004 0.0016
F4 0.0965 0.5823 0.0936 0.4282 0.8837 0.7528 1.0000 1.0000 0.8160 0.5618 0.0000 0.0107 0.1177
F5 0.0005 0.0017 0.0000 0.0000 0.6677 0.4334 1.0000 1.0000 0.0813 0.0426 0.0000 0.0000 0.0000
F6 0.0002 0.0001 0.0004 0.0033 0.7618 0.7443 1.0000 1.0000 0.2168 0.1742 0.0004 0.0002 0.0000
F7 0.0000 0.0014 0.0398 0.0634 0.5080 0.1125 1.0000 1.0000 0.3587 0.2104 0.0009 0.0022 0.0021
F8 1.0000 0.0036 1.0000 0.0036 1.0000 0.0055 0.0000 1.0000 1.0000 0.0029 1.0000 0.0168 1.0000
F9 0.0000 0.7303 0.3582 0.8795 1.0000 0.6881 0.4248 1.0000 0.8714 0.8665 0.0190 0.3298 0.0222
F10 0.3804 1.0000 0.1045 0.0541 0.8323 0.0686 0.8205 0.0796 1.0000 0.0162 0.0000 0.0079 0.1569
F11 0.0000 0.0051 0.0521 0.0448 0.7679 0.2776 1.0000 1.0000 0.2678 0.0706 0.0074 0.0001 0.4011
F12 0.0000 0.0000 0.0000 0.0000 0.4573 0.4222 1.0000 1.0000 0.0008 0.0015 0.0000 0.0000 0.0000
F13 0.0000 0.0000 0.0000 0.0000 0.6554 0.8209 1.0000 1.0000 0.0187 0.0375 0.0000 0.0000 0.0000
F14 0.3908 0.1924 0.1816 1.0000 0.4201 0.1610 1.0000 0.6977 0.3786 0.1716 0.0000 0.9571 0.0961
F15 0.0230 0.0676 0.3016 1.0000 0.0000 0.0779 1.0000 0.7614 0.2235 0.4252 0.4395 0.9135 0.2926
F16 0.0497 0.4921 0.0427 0.7228 0.0000 0.2422 0.3572 0.7629 0.2652 0.6012 0.5298 1.0000 1.0000
F17 0.0000 0.1105 0.0249 1.0000 0.1093 0.1873 0.8189 0.7754 0.5197 0.4847 0.7093 0.8842 0.7887
F18 0.0129 0.0134 0.1772 0.4289 0.0000 0.0538 1.0000 0.2855 0.1310 0.0429 0.0723 0.2069 0.8018
F19 0.0000 0.2001 0.7727 1.0000 0.0192 0.0312 1.0000 0.2142 0.3192 0.4635 0.8176 0.7924 0.9950
Sum 1.9911 3.5379 3.2346 6.8619 9.9634 5.9972 16.4214 15.5767 7.8004 5.1479 3.6143 5.1403 5.6858
Table 2
p-values of the Wilcoxon rank-sum test over all runs (p ≥ 0.05 have been underlined).
The mathematical formulation of these test functions is available in the appendix. The first family of test functions has no local optima and there is only one global optimum. This makes them highly suitable for testing the convergence speed and exploitation of algorithms. The second group of test functions, however, has multiple local solutions in addition to the global optimum. These characteristics are beneficial for testing the local optima avoidance and explorative ability of an algorithm. Finally, the composite test functions are rotated, shifted, biased, and combined versions of several unimodal and multi-modal test functions.

For solving the aforementioned test functions, a total of 30 search agents are allowed to determine the global optimum over 500 iterations. The SCA algorithm is compared to the Firefly Algorithm (FA) [67], Bat Algorithm (BA) [68], Flower Pollination Algorithm (FPA) [69], Gravitational Search Algorithm (GSA) [54], PSO, and GA for verification of the results. Since the results of a single run might be unreliable due to the stochastic nature of meta-heuristics, all of the algorithms are run 30 times and the statistical results (mean and standard deviation) are collected and reported in Table 1. Note that the results are normalized in [0, 1] to make the results comparable across all test functions. To decide about the significance of the results, a non-parametric statistical test called the Wilcoxon rank-sum test is conducted as well. The p-values obtained from this statistical test are reported in Table 2.
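The statistical protocol just described (30 independent runs per algorithm, normalization of the per-function statistics, and a rank-sum test between SCA and each competitor) can be reproduced with standard tools. The sketch below uses scipy.stats.ranksums and assumes the raw best-objective values of each run are already collected per algorithm; the min–max normalization shown is one plausible reading of "normalized in [0, 1]", since the exact normalization is not spelled out in the excerpt.

```python
import numpy as np
from scipy.stats import ranksums

# Hypothetical raw data: best objective value of each of the 30 runs,
# per algorithm, for one benchmark function.
runs = {
    "SCA": np.random.rand(30) * 1e-3,
    "PSO": np.random.rand(30) * 1e-2,
    "GA":  np.random.rand(30) * 1e-2,
}

# Mean and standard deviation over the 30 runs (the "ave" and "std" in Table 1).
ave = {name: vals.mean() for name, vals in runs.items()}
std = {name: vals.std() for name, vals in runs.items()}

# Min-max normalization to [0, 1] so different functions can be compared.
lo, hi = min(ave.values()), max(ave.values())
ave_norm = {name: (v - lo) / (hi - lo) for name, v in ave.items()}

# Wilcoxon rank-sum test of SCA against every other algorithm (as in Table 2):
# p < 0.05 indicates a statistically significant difference between the run sets.
for name in ("PSO", "GA"):
    stat, p = ranksums(runs["SCA"], runs[name])
    print(f"SCA vs {name}: p = {p:.4f}")
```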
The results in Table 1 show that the SCA algorithm outperforms the others on the majority of the test cases. Firstly, the SCA algorithm shows superior results on 3 out of 6 unimodal test functions. The p-values in Table 2 show that this superiority is statistically significant. Due to the characteristics of the unimodal test functions, these results strongly show that the SCA algorithm has high exploitation and convergence. Secondly, Table 1 shows that the SCA algorithm outperforms all of the algorithms employed on the majority of the multi-modal test functions (F7, F9, F11, and F12). The p-values in Table 2 also support the better results of SCA statistically. Inspecting the results of this table, the SCA algorithm provides p-values greater than 0.05 for the rest of the test functions, showing that this algorithm is very competitive. These results prove that the SCA algorithm benefits from high exploration and local optima avoidance. Finally, the results of the proposed algorithm on the composite test functions in Tables 1 and 2 demonstrate the merits of SCA in solving problems with challenging search spaces. Due to the normalization of the results, the overall performance of the algorithms can be compared as well.
Fig. 10. Search history of search agents when solving the test problems.
The last row of Table 1 presents the summation of the averages and standard deviations of the algorithms on all test functions. It is evident that SCA shows the minimum values for both ave and std, proving that this algorithm reliably outperforms the others in total.

Although the above-discussed results prove and verify the high performance of the SCA algorithm, there are several other experiments that need to be done to confidently confirm the performance of this algorithm in solving real problems. In other words, the behavior of the search agents during optimization should be monitored to observe: how they move around the search space, whether they face abrupt changes in the initial stages of optimization to explore the search space, whether they undergo small changes in the final steps of iteration to exploit the search space, how they converge towards the promising regions of the search space, how they improve their initial random solutions, and how they improve their fitness values over the course of iterations. In order to observe the behaviour of the search agents, the two-dimensional version of the test functions is solved by 4 search agents. Note that the optima of some of the test functions are shifted to locations other than the origin to provide more challenging test beds. The search history of the search agents is illustrated in Fig. 10. This figure shows that the SCA algorithm searches around the promising regions of the search space. The density of the sampled points around the global optima is substantially high, which shows that the SCA algorithm exploits the most promising region of the search space in addition to performing exploration. However, it is not clear from this figure whether the search agents first start with exploration or exploitation. To observe this, Fig. 11 is provided, which illustrates the fluctuations of the first dimension of the first search agent.

Fig. 11 shows that the search agents face abrupt fluctuations in the early steps of optimization. However, the sudden changes decrease gradually over the course of iterations. This confirms that the search agents first explore the search space and then converge around the best solution obtained in the exploration phase. A question here is how to make sure that all of the search agents are improved during optimization despite the rapid and steady changes in Fig. 11. In order to confirm the improvement of all solutions, the average fitness of all search agents during optimization is illustrated in Fig. 12.

This figure shows that the average fitness of all search agents tends to decrease over the course of iterations. The interesting pattern that can be observed in this figure is the high fluctuation of the average fitness in the exploration phase (until nearly the 50th iteration) and the low changes in the average fitness in the exploitation phase (after the 50th iteration). Deterioration of the fitness of some of the search agents is unavoidable in the exploration phase, where the SCA algorithm should discover the promising regions of the search space. However, the observed patterns in Fig. 12 show that the fitness of the search agents has a descending behavior over the course of iterations. This proves that the SCA algorithm is able to eventually improve the fitness of the initial random solutions for a given optimization problem.

In the previous paragraphs, it was claimed that the search agents of the SCA algorithm tend to explore the promising regions of the search space and eventually exploit the best one. However, the convergence behavior of the algorithm was not observed and verified. Although this can be inferred indirectly from the trajectory and average fitness, the convergence curves of SCA are depicted in Fig. 13.

This figure illustrates the best solution obtained so far during optimization. The descending trend is quite evident in the convergence curves of SCA on all of the test functions investigated. This strongly evidences the ability of the SCA algorithm to obtain a better approximation of the global optimum over the course of iterations.
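The four qualitative metrics used above (search history, trajectory of one variable, average fitness, and the convergence curve of the best solution) amount to simple bookkeeping inside any population-based loop. A minimal sketch follows; the run_optimizer generator is only a stand-in for an actual optimizer loop (e.g., the SCA sketch shown earlier), yielding the population X and its fitness values at every iteration.

```python
import numpy as np

def run_optimizer(n_agents=4, dim=2, max_iter=100):
    """Stand-in for any population-based loop: yields (population, fitness)."""
    X = np.random.uniform(-100, 100, (n_agents, dim))
    for _ in range(max_iter):
        X = np.clip(X + np.random.normal(0, 1, X.shape), -100, 100)  # dummy update
        fit = np.sum(X ** 2, axis=1)                                  # sphere objective
        yield X, fit

search_history = []   # positions of all agents at every iteration (cf. Fig. 10)
trajectory = []       # first variable of the first agent (cf. Fig. 11)
avg_fitness = []      # mean objective value of the population (cf. Fig. 12)
convergence = []      # best objective value obtained so far (cf. Fig. 13)

best_so_far = np.inf
for X, fit in run_optimizer():
    search_history.append(X.copy())
    trajectory.append(X[0, 0])
    avg_fitness.append(fit.mean())
    best_so_far = min(best_so_far, float(fit.min()))
    convergence.append(best_so_far)
```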
The results and discussions in this section prove that the proposed SCA algorithm is able to determine the global optima of the test functions. Although it could be claimed here that this algorithm would also be able to approximate the global optima of real problems, there is a main difference between real problems and benchmark functions.
Fig. 11. Trajectory of the first variable of the first search agent when solving the test problems.
Fig. 13. Convergence curve (best solution in each iteration) of the SCA algorithm.
The shape of the search space and the location of the global optimum of the test functions are known, while those of real problems are mostly unknown. In addition, real problems are accompanied by a large number of equality and inequality constraints. Therefore, there is a need to investigate the performance of the SCA algorithm in solving at least one real challenging constrained problem with an unknown global optimum and search space. This is the motivation of the next section, in which the two-dimensional cross-section of an aircraft's wing is optimized by SCA to confirm its performance in practice.

5. Airfoil design using SCA

The problem investigated in this section is airfoil design. There are two objectives in this problem: lift versus drag. These two forces are shown in Fig. 14. It may be observed that lift arises when the thrust force is converted to a vertical force, which causes a plane to fly. However, drag is the opposing force that is applied to the wing and causes the speed to decrease. Lift and drag are in conflict, meaning that increasing one results in decreasing the other. In a real airplane both of these forces are desirable on different occasions. When the airplane is taking off, ascending, and cruising, maximum lift and minimum drag are fruitful. When descending, landing, and touching down, drag becomes important to slow down the speed of the vehicle. In this section only the drag is considered, so the main objective is to minimize this force. In other words, this section employs the SCA algorithm to find the best shape for the wing to minimize drag.

To design an aircraft wing, several components should be considered: the shape of the cross-section of the wing (the airfoil), the overall shape of the wing, the flaps, the internal frames, and the position of the engines. This paper only concentrates on designing a 2D airfoil, which is the main and essential component of a wing. The shape of a 2D airfoil is illustrated in Fig. 15.

There are different versions of this problem in the literature in terms of the design parameters. In this work, a B-spline is utilized to define the shape of the airfoil. As shown in Fig. 16, there are eight controlling points, of which one of the leading points is fixed. The rest of the controlling points, however, are allowed to move along both the x and y axes.
Fig. 17. Convergence curve of the SCA on the airfoil design problem, initial aifoil, and optimized airfoil.
Therefore, there is a total of 14 (7 × 2) parameters, which are the x and y positions of the seven free controlling points. The problem of airfoil design is formulated for the SCA algorithm as follows:

Minimize: F(x, y) = Cd(x, y)
Subject to: −1 ≤ x, y ≤ 1, satisfaction of the CO set   (5.1)

where x = {x1, x2, . . . , x7}, y = {y1, y2, . . . , y7}, and CO includes many constraints such as minimum thickness, maximum thickness, etc.

A freeware tool called XFoil is used for calculating the drag [75]. It may be seen in Eq. (5.1) that the problem is subject to several constraints. Generally speaking, Computational Fluid Dynamics (CFD) problems are highly constrained, which makes them very challenging. For solving such problems, an optimization algorithm should be equipped with a proper constraint handling method. There are different approaches in the literature to cope with constraints, of which penalty functions are the simplest ones. In such methods, the main objective function is penalized by a penalty function with respect to the level of constraint violation.
Other powerful constraint handling methods can be found in [70–73]. Interested readers are referred to the comprehensive literature review by Coello Coello [74]. In this work the following penalty function is utilized, which penalizes F proportionally to the level of violation:

F(x, y) = F(x, y) + p ∑_{i=1}^{3} Pi   (5.2)

where p is a constant and Pi is the violation size of the i-th constraint in the CO set in Eq. (5.1).
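To illustrate how the penalty scheme of Eq. (5.2) can be wired to an external drag evaluator, the sketch below penalizes a drag coefficient by the summed constraint violations. The functions drag_coefficient and thickness, the thickness limits, and the penalty constant are hypothetical placeholders: the paper uses XFoil for the actual Cd computation, and the exact contents of the CO set are not listed in the excerpt.

```python
import numpy as np

P_CONST = 1e3               # penalty constant p in Eq. (5.2); value assumed for illustration
T_MIN, T_MAX = 0.02, 0.25   # hypothetical thickness limits from the CO constraint set

def drag_coefficient(x, y):
    """Placeholder for the real evaluator (the paper calls XFoil [75])."""
    return float(np.sum(x ** 2 + y ** 2))          # dummy smooth surrogate

def thickness(x, y):
    """Hypothetical thickness measure of the B-spline airfoil."""
    return float(np.max(y) - np.min(y))

def penalized_objective(params):
    """Eq. (5.2): F = Cd + p * sum of constraint violations P_i."""
    x, y = params[:7], params[7:]                  # 14 = 7 x-coords + 7 y-coords
    violations = [
        max(0.0, T_MIN - thickness(x, y)),         # minimum-thickness constraint
        max(0.0, thickness(x, y) - T_MAX),         # maximum-thickness constraint
        float(np.sum(np.maximum(0.0, np.abs(params) - 1.0))),  # -1 <= x, y <= 1
    ]
    return drag_coefficient(x, y) + P_CONST * sum(violations)

# The 14-dimensional vector of control-point coordinates is what SCA optimizes.
print(penalized_objective(np.random.uniform(-1, 1, 14)))
```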
For solving this problem, 30 search agents are employed and allowed to determine the optimal shape for the airfoil over 1000 iterations. The algorithm is run 4 times and the best results are illustrated in Fig. 17.

This figure clearly shows that the SCA algorithm improves the initial random shape for the airfoil to minimize drag. The improvement is quite significant, in which the drag was reduced from 0.009 to 0.0061. These results demonstrate that the SCA algorithm is able to solve real problems with unknown, challenging, and constrained search spaces. This is due to several reasons. Firstly, SCA is a population-based algorithm, so it intrinsically benefits from high exploration and local optima avoidance. This assists the SCA algorithm to avoid the large number of local solutions in a real search space and to explore different regions extensively. Secondly, SCA smoothly transits from exploration to exploitation using the adaptive mechanism for the range of the sine and cosine functions. This causes local optima avoidance at the beginning of optimization and quick convergence towards the most promising region of the search space in the final steps of optimization. Thirdly, SCA obliges the solutions to update their positions around the best solution obtained so far as the destination point. Therefore, there is always a tendency towards the best regions of the search space during optimization, and the chances of improving the solutions are considerably high. Finally, the SCA algorithm considers optimization problems as black boxes, so it is readily incorporable into problems in different fields subject to the proper formulation of the problem. In addition, this problem independency means the algorithm does not need gradient information of the search space and can work with any type of penalty function for solving constrained problems.

6. Conclusion
In this paper a novel population-based optimization algorithm was proposed as an alternative to the current techniques in the literature for solving optimization problems. In the proposed SCA algorithm, the solutions were required to update their positions with respect to the best solution obtained so far as the destination point. The mathematical model of position updating fluctuated the solutions outwards or towards the destination point to guarantee exploration and exploitation of the search space, respectively. Several random and adaptive variables also facilitated divergence and convergence of the search agents in the SCA algorithm. To benchmark the performance of SCA, several experiments were done. Firstly, a set of well-known test cases including unimodal, multi-modal, and composite test functions was employed to test the exploration, exploitation, local optima avoidance, and convergence of the proposed algorithm. Secondly, the two-dimensional versions of some of the test functions were chosen and re-solved by SCA. Several performance metrics (search history, trajectory, average fitness of solutions, and best solution during optimization) were employed to qualitatively observe and confirm the performance of SCA. Finally, the shape of a two-dimensional airfoil (the cross-section of an aircraft's wing) was optimized by SCA as a real challenging case study to verify and demonstrate the performance of the algorithm in practice.

The results of the unimodal test functions showed that the SCA algorithm converged substantially faster than FA, BA, FPA, GSA, PSO and GA. A similar behavior was observed in the multi-modal test functions, which proved the high exploration and local optima avoidance of the proposed algorithm. As per the results of the composite test functions, SCA outperformed the other algorithms occasionally, which showed that this algorithm was also able to successfully balance exploration and exploitation to determine the global optima of challenging test functions. The results of the performance metrics proved that SCA requires its search agents to change abruptly in the initial stages of optimization and gradually in the final steps of optimization. The results showed that this behavior caused extensive exploration of the search space and exploitation of the most promising region. The average fitness of solutions and the convergence curves also evidenced and confirmed the improvement of the initial random population and of the best solution obtained so far by SCA. The results of the first two test phases proved that SCA is able to successfully solve test problems for which the shape of the search space is known. The results of SCA on the airfoil design problem also showed that this algorithm has the potential to solve challenging real problems as well. The airfoil design problem was a highly constrained case study with a completely unknown search space. Therefore, the results of the real case study demonstrated and confirmed the merits of SCA in solving real problems.

As per the findings of this paper, and referring to the NFL theorem, it can be concluded that SCA can be a very suitable alternative compared to the current algorithms in the literature for solving different optimization problems. On the other hand, this algorithm might not be able to outperform other algorithms on specific sets of problems, but it is definitely worth testing and applying to problems in different fields. Therefore, the SCA algorithm is offered to researchers in different fields. The source codes of this algorithm are publicly available at http://www.alimirjalili.com/SCA.html.

This paper opens up several research directions for future studies. Firstly, binary and multi-objective versions of this algorithm can be proposed to solve problems with binary and multiple objectives, respectively. Secondly, Lévy flight, mutation, and other evolutionary operators can be integrated into this algorithm to improve its performance. Thirdly, the SCA algorithm can be hybridized with other algorithms in the field of stochastic optimization to improve its performance. Finally, investigation of the application of SCA in different fields would be a valuable contribution.

Appendix A

See Tables A.1, A.2 and A.3.

Table A.1
Unimodal benchmark functions.

Function | Dim | Range | Shift position | fmin
f1(x) = Σ_{i=1}^{n} x_i² | 20 | [−100,100] | [−30, −30, .., −30] | 0
f3(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)² | 20 | [−100,100] | [−30, −30, .., −30] | 0
f4(x) = max_i {|x_i|, 1 ≤ i ≤ n} | 20 | [−100,100] | [−30, −30, .., −30] | 0
f5(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i²)² + (x_i − 1)²] | 20 | [−30,30] | [−15, −15, .., −15] | 0
f6(x) = Σ_{i=1}^{n} ([x_i + 0.5])² | 20 | [−100,100] | [−750, .., −750] | 0
Table A.2
Multimodal benchmark functions.

Function | Dim | Range | Shift position | fmin
F8(x) = Σ_{i=1}^{n} −x_i sin(√|x_i|) | 20 | [−500,500] | [−300, .., −300] | −418.9829 × 5
F9(x) = Σ_{i=1}^{n} [x_i² − 10 cos(2π x_i) + 10] | 20 | [−5.12,5.12] | [−2, −2, .., −2] | 0
F10(x) = −20 exp(−0.2 √((1/n) Σ_{i=1}^{n} x_i²)) − exp((1/n) Σ_{i=1}^{n} cos(2π x_i)) + 20 + e | 20 | [−32,32] | | 0
F11(x) = (1/4000) Σ_{i=1}^{n} x_i² − Π_{i=1}^{n} cos(x_i/√i) + 1 | 20 | [−600,600] | [−400, .., −400] | 0
F12(x) = (π/n) {10 sin(π y_1) + Σ_{i=1}^{n−1} (y_i − 1)² [1 + 10 sin²(π y_{i+1})] + (y_n − 1)²} + Σ_{i=1}^{n} u(x_i, 10, 100, 4), where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a < x_i < a; k(−x_i − a)^m if x_i < −a | 20 | [−50,50] | [−30, −30, .., −30] | 0
F13(x) = 0.1 {sin²(3π x_1) + Σ_{i=1}^{n} (x_i − 1)² [1 + sin²(3π x_i + 1)] + (x_n − 1)² [1 + sin²(2π x_n)]} + Σ_{i=1}^{n} u(x_i, 5, 100, 4) | 20 | [−50,50] | [−100, .., −100] | 0
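For convenience, a few of the benchmark functions from Tables A.1 and A.2 are sketched below in Python. Applying the shift as z = x − o, so that the listed shift position becomes the new optimum location, is one common convention and is assumed here, since the tables only list the shift vectors.

```python
import numpy as np

def f1_sphere(x, shift=-30.0):
    z = np.asarray(x) - shift                 # move the optimum to [shift, ..., shift]
    return np.sum(z ** 2)

def f5_rosenbrock(x, shift=-15.0):
    z = np.asarray(x) - shift
    return np.sum(100.0 * (z[1:] - z[:-1] ** 2) ** 2 + (z[:-1] - 1.0) ** 2)

def f9_rastrigin(x, shift=-2.0):
    z = np.asarray(x) - shift
    return np.sum(z ** 2 - 10.0 * np.cos(2.0 * np.pi * z) + 10.0)

def f10_ackley(x):                            # no shift listed for F10 in Table A.2
    z = np.asarray(x, dtype=float)
    n = z.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(z ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * z)) / n) + 20.0 + np.e)

def f11_griewank(x, shift=-400.0):
    z = np.asarray(x) - shift
    i = np.arange(1, z.size + 1)
    return np.sum(z ** 2) / 4000.0 - np.prod(np.cos(z / np.sqrt(i))) + 1.0

print(f1_sphere(np.full(20, -30.0)))          # 0.0 at the shifted optimum
```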
Table A.3
Composite benchmark functions.

Function | Dim | Range | fmin
F14 (CF1): f1, f2, f3, . . . , f10 = Sphere Function; [σ1, σ2, σ3, . . . , σ10] = [1, 1, 1, .., 1]; [λ1, λ2, λ3, . . . , λ10] = [5/100, 5/100, 5/100, .., 5/100] | 10 | [−5,5] | 0
F15 (CF2): f1, f2, f3, . . . , f10 = Griewank's Function; [σ1, σ2, σ3, . . . , σ10] = [1, 1, 1, .., 1]; [λ1, λ2, λ3, . . . , λ10] = [5/100, 5/100, 5/100, .., 5/100] | 10 | [−5,5] | 0
F16 (CF3): f1, f2, f3, . . . , f10 = Griewank's Function; [σ1, σ2, σ3, . . . , σ10] = [1, 1, 1, .., 1]; [λ1, λ2, λ3, . . . , λ10] = [1, 1, 1, .., 1] | 10 | [−5,5] | 0
F17 (CF4): f1, f2 = Ackley's Function; f3, f4 = Rastrigin's Function; f5, f6 = Weierstrass Function; f7, f8 = Griewank's Function; f9, f10 = Sphere Function; [σ1, σ2, σ3, . . . , σ10] = [1, 1, 1, .., 1]; [λ1, λ2, λ3, . . . , λ10] = [5/32, 5/32, 1, 1, 5/0.5, 5/0.5, 5/100, 5/100, 5/100, 5/100] | 10 | [−5,5] | 0
F18 (CF5): f1, f2 = Rastrigin's Function; f3, f4 = Weierstrass Function; f5, f6 = Griewank's Function; f7, f8 = Ackley's Function; f9, f10 = Sphere Function; [σ1, σ2, σ3, . . . , σ10] = [1, 1, 1, .., 1]; [λ1, λ2, λ3, . . . , λ10] = [1/5, 1/5, 5/0.5, 5/0.5, 5/100, 5/100, 5/32, 5/32, 5/100, 5/100] | 10 | [−5,5] | 0
F19 (CF6): f1, f2 = Rastrigin's Function; f3, f4 = Weierstrass Function; f5, f6 = Griewank's Function; f7, f8 = Ackley's Function; f9, f10 = Sphere Function; [σ1, σ2, σ3, . . . , σ10] = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]; [λ1, λ2, λ3, . . . , λ10] = [0.1 ∗ 1/5, 0.2 ∗ 1/5, 0.3 ∗ 5/0.5, 0.4 ∗ 5/0.5, 0.5 ∗ 5/100, 0.6 ∗ 5/100, 0.7 ∗ 5/32, 0.8 ∗ 5/32, 0.9 ∗ 5/100, 1 ∗ 5/100] | 10 | [−5,5] | 0
References

[1] A.R. Simpson, G.C. Dandy, L.J. Murphy, Genetic algorithms compared to other techniques for pipe optimization, J. Water Resour. Plan. Manag. 120 (1994) 423–443.
[2] C. James, Introduction to Stochastic Search and Optimization, Wiley-Interscience, New Jersey, 2003.
[3] I. Boussaïd, J. Lepagnot, P. Siarry, A survey on optimization metaheuristics, Inf. Sci. 237 (2013) 82–117.
[4] J.A. Parejo, A. Ruiz-Cortés, S. Lozano, P. Fernandez, Metaheuristic optimization frameworks: a survey and benchmarking, Soft Comput. 16 (2012) 527–561.
[5] A. Zhou, B.-Y. Qu, H. Li, S.-Z. Zhao, P.N. Suganthan, Q. Zhang, Multiobjective evolutionary algorithms: a survey of the state of the art, Swarm Evol. Comput. 1 (2011) 32–49.
[6] S. Droste, T. Jansen, I. Wegener, Upper and lower bounds for randomized search heuristics in black-box optimization, Theory of Comput. Syst. 39 (2006) 525–544.
[7] H.H. Hoos, T. Stützle, Stochastic Local Search: Foundations & Applications, Elsevier, 2004.
[8] R.S. Parpinelli, H.S. Lopes, New inspirations in swarm intelligence: a survey, Int. J. Bio-Inspired Comput. 3 (2011) 1–16.
[9] C.M. Fonseca, P.J. Fleming, An overview of evolutionary algorithms in multiobjective optimization, Evol. Comput. 3 (1995) 1–16.
[10] A. Biswas, K. Mishra, S. Tiwari, A. Misra, Physics-inspired optimization algo- [43] G.-G. Wang, A.H. Gandomi, A.H. Alavi, An effective krill herd algorithm with
rithms: a survey, J. Optim. 2013 (2013). migration operator in biogeography-based optimization, Appl. Math. Model. 38
[11] A. Gogna, A. Tayal, Metaheuristics: review and application, J. Exp. Theor. Artif. (2014) 2454–2462.
Intell. 25 (2013) 503–526. [44] J.H. Holland, J.S. Reitman, Cognitive systems based on adaptive algorithms,
[12] X.-S. Yang, Z. Cui, R. Xiao, A.H. Gandomi, M. Karamanoglu, Swarm intelligence ACM SIGART Bull. (63) (1977) 49–49.
and bio-inspired computation: theory and applications, Newnes (2013). [45] R. Storn, K. Price, Differential evolution–a simple and efficient heuristic for
[13] S. Saremi, S. Mirjalili, A. Lewis, Biogeography-based optimisation with chaos, global optimization over continuous spaces, J. Global Optim. 11 (1997) 341–
Neural Comput. Appl. 25 (2014) 1077–1097. 359.
[14] G.-G. Wang, L. Guo, A.H. Gandomi, G.-S. Hao, H. Wang, Chaotic krill herd algo- [46] Y. Wang, H.-X. Li, T. Huang, L. Li, Differential evolution based on covariance
rithm, Inf. Sci. 274 (2014) 17–34. matrix learning and bimodal distribution parameter setting, Appl. Soft Com-
[15] G.-G. Wang, A. Hossein Gandomi, A. Hossein Alavi, A chaotic particle-swarm put. 18 (2014) 232–247.
krill herd algorithm for global numerical optimization, Kybernetes 42 (2013) [47] Y. Wang, Z. Cai, Q. Zhang, Differential evolution with composite trial vector
962–978. generation strategies and control parameters, IEEE Trans. Evol. Comput. 15
[16] G.G. Wang, S. Deb, A.H. Gandomi, Z. Zhang, A.H. Alavi, A novel cuckoo search (2011) 55–66.
with chaos theory and elitism scheme, in: Proceedings of 2014 International [48] Y. Wang, Z. Cai, Q. Zhang, Enhancing the search ability of differential evolution
Conference on Soft Computing and Machine Intelligence (ISCMI), 2014, pp. 64– through orthogonal crossover, Inf. Sci. 185 (2012) 153–177.
69. [49] D. Simon, Biogeography-based optimization, IEEE Trans. Evol. Comput. 12
[17] G.-G. Wang, S. Deb, A.H. Gandomi, Z. Zhang, A.H. Alavi, Chaotic cuckoo search, (2008) 702–713.
Soft Comput. (1726) 1–14. [50] I. Rechenberg, Evolutionsstrategien, in: B. Schneider, U. Ranft (Eds.), Simula-
[18] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, J. Li, Incorporating mutation scheme tionsmethoden in der Medizin und Biologie, 8, Springer, Berlin Heidelberg,
into krill herd algorithm for global numerical optimization, Neural Comput. 1978, pp. 83–114.
Appl. 24 (2014) 853–871. [51] M. Dorigo, M. Birattari, Ant colony optimization, Encyclopedia of Machine
[19] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, A bat algorithm with mutation for Learning, Springer, 2010, pp. 36–39.
UCAV path planning, Sci. World J. 2012 (2012). [52] R.C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in:
[20] J.W. Zhang, G.G. Wang, Image matching using a bat algorithm with mutation, Proceedings of the Sixth International Symposium on Micro Machine and Hu-
Appl. Mech. Mater. 203 (2012) 88–93. man Science, 1995, pp. 39–43.
[21] H.-R. Li, Y.-L. Gao, Particle swarm optimization algorithm with exponent de- [53] D. Karaboga, B. Basturk, A powerful and efficient algorithm for numerical func-
creasing inertia weight and stochastic mutation, in: Proceedings of Second In- tion optimization: artificial bee colony (ABC) algorithm, J. Global Optim. 39
ternational Conference on Information and Computing Science, 2009 (ICIC’09), (2007) 459–471.
2009, pp. 66–69. [54] E. Rashedi, H. Nezamabadi-Pour, S. Saryazdi, GSA: a gravitational search algo-
[22] S. Chen, Particle swarm optimization with pbest crossover, in: Proceedings of rithm, Inf. Sci. 179 (2009) 2232–2248.
2012 IEEE Congress on Evolutionary Computation (CEC), 2012, pp. 1–6. [55] A. Kaveh, V. Mahdavi, Colliding Bodies Optimization method for optimum dis-
[23] Q. Zhu, Z. Yang, An ant colony optimization algorithm based on mutation and crete design of truss structures, Comput. Struct. 139 (2014) 43–53.
dynamic pheromone updating, J. Softw. 15 (2004) 185–192. [56] A. Hatamlou, Black hole: a new heuristic optimization approach for data clus-
[24] J.J. Liang, P.N. Suganthan, Dynamic multi-swarm particle swarm optimizer with tering, Inf. Sci. 222 (2013) 175–184.
local search, in: Proceedings of 2005 IEEE Congress on Evolutionary Computa- [57] A.H. Kashan, League Championship Algorithm (LCA): an algorithm for global
tion, 2005, pp. 522–528. optimization inspired by sport championships, Appl. Soft Comput. 16 (2014)
[25] K. Premalatha, A. Natarajan, A new approach for data clustering based on PSO 171–200.
with local search, Comput. Inf. Sci. 1 (2008) 139. [58] A. Sadollah, A. Bahreininejad, H. Eskandar, M. Hamdi, Mine blast algorithm: a
[26] N. Noman, H. Iba, Accelerating differential evolution using an adaptive local new population based algorithm for solving constrained engineering optimiza-
search, IEEE Trans. Evol. Comput. 12 (2008) 107–125. tion problems, Appl. Soft Comput. 13 (2013) 2592–2612.
[27] J. Levine, F. Ducatelle, Ant colony optimization and local search for bin packing [59] R.V. Rao, V.J. Savsani, D. Vakharia, Teaching–learning-based optimization:
and cutting stock problems, J. Oper. Res. Soc. 55 (2004) 705–716. a novel method for constrained mechanical design optimization problems,
[28] C. Blum, A. Roli, Hybrid metaheuristics: an introduction, Hybrid Metaheuris- Comput.-Aided Des. 43 (2011) 303–315.
tics, Springer, 2008, pp. 1–30. [60] D.H. Wolpert, W.G. Macready, No free lunch theorems for optimization, IEEE
[29] M. Ehrgott, X. Gandibleux, Hybrid metaheuristics for multi-objective combina- Trans. Evol. Comput. 1 (1997) 67–82.
torial optimization, Hybrid metaheuristics, Springer, 2008, pp. 221–259. [61] S. Mirjalili, Moth-flame optimization algorithm: A novel nature-inspired
[30] G. Wang, L. Guo, A novel hybrid bat algorithm with harmony search for global heuristic paradigm, Knowl.-Based Syst. 89 (2015) 228–249.
numerical optimization, J. Appl. Math. 2013 (2013). [62] M. Črepinšek, S.-H. Liu, M. Mernik, Exploration and exploitation in evolution-
[31] G.-G. Wang, A.H. Gandomi, A.H. Alavi, G.-S. Hao, Hybrid krill herd algorithm ary algorithms: a survey, ACM Comput. Surv. 45 (2013) 35.
with differential evolution for global numerical optimization, Neural Comput. [63] X. Yao, Y. Liu, G. Lin, Evolutionary programming made faster, IEEE Trans. Evol.
Appl. 25 (2014) 297–308. Comput. 3 (1999) 82–102.
[32] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, M. Shao, A hybrid metaheuristic [64] J. Digalakis, K. Margaritis, On benchmarking functions for genetic algorithms,
DE/CS algorithm for UCAV three-dimension path planning, Sci. World J. 2012 Int. J. Comput. Math. 77 (2001) 481–506.
(2012). [65] M. Molga, C. Smutnicki, Test functions for optimization needs, Test Funct. Op-
[33] G.-g. Wang, L. Guo, H. Duan, H. Wang, L. Liu, M. Shao, Hybridizing harmony tim. Needs (2005).
search with biogeography based optimization for global numerical optimiza- [66] X.-S. Yang, Test problems in optimization, 2010. Available from arXiv:1008.
tion, J. Comput. Theor. Nanosci. 10 (2013) 2312–2322. 0549.
[34] H. Duan, W. Zhao, G. Wang, X. Feng, Test-sheet composition using analytic hi- [67] X.-S. Yang, Firefly algorithm, stochastic test functions and design optimisation,
erarchy process and hybrid metaheuristic algorithm TS/BBO, Math. Probl. Eng. Int. J. Bio-Inspired Comput. 2 (2010) 78–84.
2012 (2012). [68] X.-S. Yang, A new metaheuristic bat-inspired algorithm, Nature
[35] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, B. Wang, A hybrid meta-heuristic Inspired Cooperative Strategies for Optimization (NICSO 2010), Springer,
DE/CS algorithm for UCAV path planning, J. Inf. Comput. Sci. 5 (2012) 4811– 2010, pp. 65–74.
4818. [69] X.-S. Yang, M. Karamanoglu, X. He, Flower pollination algorithm: a novel ap-
[36] X. Shi, Y. Liang, H. Lee, C. Lu, L. Wang, An improved GA and a novel PSO-GA- proach for multiobjective optimization, Eng. Optim. 46 (2014) 1222–1237.
based hybrid algorithm, Inf. Process. Lett. 93 (2005) 255–261. [70] Y. Wang, Z. Cai, Combining multiobjective optimization with differential evo-
[37] N. Holden, A.A. Freitas, A hybrid PSO/ACO algorithm for discovering classifica- lution to solve constrained optimization problems, IEEE Trans. Evol. Comput.
tion rules in data mining, J. Artif. Evol. Appl. 2008 (2008) 2. 16 (2012) 117–134.
[38] S. Nemati, M.E. Basiri, N. Ghasem-Aghaee, M.H. Aghdam, A novel ACO–GA hy- [71] S.H.R. Pasandideh, S.T.A. Niaki, A. Gharaei, Optimization of a multiproduct eco-
brid algorithm for feature selection in protein function prediction, Expert Syst. nomic production quantity problem with stochastic constraints using sequen-
Appl. 36 (2009) 12086–12094. tial quadratic programming, Knowl.-Based Syst. 84 (2015) 98–107.
[39] W.-Y. Lin, A GA–DE hybrid evolutionary algorithm for path synthesis of four- [72] S. Jalali, M. Seifbarghy, J. Sadeghi, S. Ahmadi, Optimizing a bi-objective reli-
bar linkage, Mech. Mach. Theory 45 (2010) 1096–1107. able facility location problem with adapted stochastic measures using tuned-
[40] B. Niu, L. Li, A novel PSO-DE-based hybrid algorithm for global optimization, parameter multi-objective algorithms, Knowl.-Based Syst. (2015) in press.
Advanced Intelligent Computing Theories and Applications. With Aspects of [73] H. Salimi, Stochastic fractal search: a powerful metaheuristic algorithm,
Artificial Intelligence, Springer, 2008, pp. 156–163. Knowl.-Based Syst. 75 (2015) 1–18.
[41] H. Duan, Y. Yu, X. Zhang, S. Shao, Three-dimension path planning for UCAV [74] C.A. Coello Coello, Theoretical and numerical constraint-handling techniques
using hybrid meta-heuristic ACO-DE algorithm, Simul. Model. Pract. Theory 18 used with evolutionary algorithms: a survey of the state of the art, Comput.
(2010) 1104–1115. Methods Appl. Mech. Eng. 191 (2002) 1245–1287.
[42] G.-G. Wang, A.H. Gandomi, X.-S. Yang, A.H. Alavi, A new hybrid method based [75] M. Drela, XFOIL: An analysis and design system for low Reynolds number air-
on krill herd and cuckoo search for global optimization tasks, Int. J. Bio- foils, in: Low Reynolds number aerodynamics, Springer, 1989, pp. 1–12.
Inspired Comput. (2013).