
Computers & Industrial Engineering 153 (2021) 107086


A novel enhanced whale optimization algorithm for global optimization


Sanjoy Chakraborty a,b, Apu Kumar Saha c,*, Sushmita Sharma c, Seyedali Mirjalili d,e,g,1, Ratul Chakraborty f

a Department of Computer Science and Engineering, Iswar Chandra Vidyasagar College, Belonia, Tripura, India
b Department of Computer Science and Engineering, National Institute of Technology Agartala, Tripura, India
c Department of Mathematics, National Institute of Technology Agartala, Tripura, India
d Centre for Artificial Intelligence Research and Optimisation, Torrens University Australia, Fortitude Valley, Brisbane 4006, QLD, Australia
e YFL (Yonsei Frontier Lab), Yonsei University, Seoul, South Korea
f Department of Statistics, Maharaja Bir Bikram College, Agartala, Tripura, India
g King Abdulaziz University, Jeddah, Saudi Arabia

ARTICLE INFO

Keywords: Whale Optimization Algorithm; Mutualism phase; Meta-heuristic; Benchmark function; IEEE CEC 2019 functions; Real-world problem

ABSTRACT

One of the main issues with heuristics and meta-heuristics is the local optima stagnation phenomenon. It is often called premature convergence, which refers to the acceptance of a locally optimal solution as the best solution for an optimization problem and the failure to find the global optimum. The Whale Optimization Algorithm (WOA) has demonstrated its merits in the optimization area. Though WOA is an effective algorithm, it may suffer from low exploration of the search space. In this work, an enhanced WOA (WOAmM) is proposed. The mutualism phase from Symbiotic Organisms Search (SOS) is modified and integrated with WOA to alleviate the inherent drawback of premature convergence. The addition of the modified mutualism phase makes the algorithm more balanced, exploring the search space more extensively and avoiding the waste of computational resources on excessive exploitation. The proposed WOAmM method is tested on 36 benchmark functions and the IEEE CEC 2019 function suite and compared with a wide range of algorithms, including improved WOA variants and other meta-heuristics. Statistical analyses and convergence analysis are performed as well to examine its effectiveness and convergence speed. In addition, six real-world engineering optimization problems are solved by the proposed method, and the performance is compared with a wide range of existing algorithms to inspect the problem-solving capability of WOAmM. The results demonstrate the performance improvement of WOAmM and its superiority over the compared algorithms.

1. Introduction

Engineering and scientific research problems are of great importance today, as the world is changing very fast in technological aspects. Many of these problems are high-dimensional and challenging global optimization problems (Mohapatra et al., 2017). Such problems are difficult to solve because an increase in the problem's dimension exponentially increases the search space (Sun et al., 2019). Classical and gradient-based optimization methods are inefficient in solving these complicated modern-day problems for various reasons, and they are also trapped by the curse of dimensionality. Moreover, searching with gradient-based methods in a problem with local solutions is very difficult because the gradient search relies on the position of an initial point (Luo and Shi, 2019). All these issues led researchers to develop optimization techniques that can solve real-world optimization problems (Kaur and Arora, 2018).

One of the solutions they received, in the form of an algorithm, is based on some natural law, with answers obtained through a heuristic search. A heuristic method generates a reasonable solution to a problem by understanding and exploiting it wisely at the least computational cost. Meta-heuristics are approximate methods derived from diverse concepts of classical heuristics: biological evolution, artificial intelligence, natural phenomena, neural systems, and statistical mechanics (Smith et al., 1996). Nature-inspired meta-heuristics are designed to solve optimization problems based on some physical or biological phenomenon. These algorithms work on randomly generated values.

* Corresponding author.
E-mail addresses: [email protected] (S. Chakraborty), [email protected] (A. Kumar Saha), [email protected] (S. Mirjalili).
1 https://seyedalimirjalili.com/

https://doi.org/10.1016/j.cie.2020.107086
Received 7 August 2020; Received in revised form 22 December 2020; Accepted 23 December 2020
Available online 28 December 2020

The next generation of individuals is enumerated by combining the best individual's value in the present generation, allowing the population to be optimized over the iterations. These methods are efficient in solving global optimization problems of higher dimension and can handle multiple local optima simultaneously. These characteristics have made such algorithms popular, and they are being used massively to solve global optimization problems. Nature-inspired optimization algorithms are associated with concepts from different fields of research, such as (i) evolutionary methods, based on the natural laws of evolution, for example genetic programming (GP) (Angeline, 1994) and differential evolution (DE) (Storn and Price, 1997); (ii) methods based on the laws of physics and chemistry, which replicate physical rules of the cosmos, for example water evaporation optimization (WEO) (Kaveh and Bakhshpoori, 2016), atom search optimization (ASO) (Zhao et al., 2018), and the equilibrium optimizer (EO) (Faramarzi et al., 2019); (iii) methods based on human behavior, which study behavioral aspects of human beings, for example the teaching-learning-based optimization (TLBO) algorithm (Rao et al., 2011), the mine blast algorithm (MBA) (Sadollah et al., 2013), and the social engineering optimizer (SEO) (Fathollahi-Fard et al., 2018); (iv) swarm-based methods, developed by mimicking the social or foraging behavior of animals or insects, for example symbiotic organisms search (SOS) (Cheng and Prayogo, 2014), the salp swarm algorithm (SSA) (Mirjalili et al., 2017), and the squirrel search algorithm (SSA) (Jain et al., 2018); and (v) methods based on the biological immune system, considering features of the immune system, for example the dendritic cell algorithm (DCA) (Anandita et al., 2015) and the neural network algorithm (NNA) (Sadollah et al., 2018). A more systematic classification of meta-heuristic search algorithms is given by Mirjalili and Lewis (Mirjalili and Lewis, 2016).

These methods may exhibit improved performance compared to regular optimization procedures, mainly when applied to non-convex, multi-dimensional, and higher-dimensional optimization problems. Though these algorithms emerge from diverse fields of research, they share a common feature despite their differences: the whole search process is divided into two phases, exploration and exploitation. In exploration, the algorithm must perturb variables to explore the search space; as much random movement as possible is desirable in this phase. The exploitation stage is identified with the assessment of the promising regions discovered during the exploration stage; therefore, the exploitation phase is related to the local search process. Balancing these two steps is one of the most challenging tasks and also the reason for the success of an algorithm (Kaur and Arora, 2018).

One of the exciting developments in numerical optimization is the No Free Lunch (NFL) theorem (Wolpert and Macready, 1997). The theorem logically proves that there is no optimization algorithm general enough to solve all optimization problems. This is why many algorithms, together with their modifications, improved versions, and hybrids, are being studied by a large number of researchers worldwide. It is also observed that the same algorithm gives different results on a problem depending on the chosen set of parameter values.

Recently, a nature-inspired meta-heuristic algorithm, namely the Whale Optimization Algorithm (WOA), was proposed by Mirjalili and Lewis (Mirjalili and Lewis, 2016). This algorithm mimics the unique foraging behavior of humpback whales catching prey. WOA has a simple yet effective mechanism with a small number of control parameters. The authors established its efficiency by comparing it with state-of-the-art algorithms and by solving several challenging real-world optimization problems. Despite having some limitations, the efficiency of WOA is better than several other well-known algorithms in terms of exploitation and avoiding local solutions (Mohammed et al., 2019). Considering its efficiency, WOA has been used to solve different types of problems from various fields, such as engineering, science, multi-objective problems, and binary problems (Reddy and Kumar, 2017; Horng et al., 2017; Kumar et al., 2018; El Aziz et al., 2018; Hussien et al., 2019). A brief literature review on WOA is presented in Section 2.

Symbiotic organisms search (SOS), proposed by Cheng and Prayogo (Cheng and Prayogo, 2014), is another well-known swarm-based algorithm. The SOS algorithm works on the concept of a symbiotic relationship between organisms. The word "symbiotic" describes the relationship between two or more distinct species in nature. Mutualism, commensalism, and parasitism are common symbiotic relationships in nature. Mutualism is the relationship in which both organisms benefit from each other, either fully or partially. In commensalism, one of the two organisms benefits from the relationship, and the other organism may or may not benefit. When one organism benefits from the other and harms it, the relationship is known as parasitism. The algorithm's mutualism and commensalism phases generate new individuals with respect to the present global best solution. The selection of random individuals during these phases enables the search process to explore the search region, and the exploitation capability of SOS improves because these phases update individuals by using the global best individual. The algorithm's parasitism phase helps the search process avoid getting trapped in local solutions and can also eliminate inferior solutions (Ezugwu and Prayogo, 2018). SOS has been used to solve many real-world optimization problems in different disciplines of engineering and science, in both single-objective and multi-objective scenarios (Chakraborty et al., 2019; Kumar et al., 2018; Dinh-Cong et al., 2020; Zheng et al., 2020). A brief literature review on SOS is presented in Section 2.

Many nature-inspired algorithms are found in the literature, and some of them work efficiently on a varied range of problems. But every method has some limitations, so researchers work on improving these algorithms by using another metaheuristic or a component of a metaheuristic. Thus, it is common to enhance an algorithm's performance in this way to make it more efficient and robust. A brief discussion of such improvements is given in Section 2.

Motivated by these performance-enhancing techniques, in this work an enhanced WOA method, WOAmM, is proposed by integrating a 'modified mutualism phase' with the original WOA. Although WOA is an efficient algorithm, it suffers from low exploration ability, slow convergence speed, and being trapped easily in a local solution (Ling et al., 2017). Moreover, WOA uses a parameter A, known as a coefficient vector, within the interval [−2, 2], and the value of |A| decreases from 2 to 0 with the iterations. The exploration phase is selected in WOA only when |A| ≥ 1, which ensures the execution of only the exploitation phase in the second half of the iteration process and weakens exploration (Sun et al., 2019). On the other hand, Do and Lee (Do and Lee, 2017) modified the SOS algorithm to balance the search process and increase the solution's accuracy. In the modified mutualism phase of SOS, updating the individuals is carried out with a random individual instead of the global best individual, which increases the solution's diversity. Hence, the algorithm's exploration ability is increased, helping it bypass local optimal solutions. The integration proposed here therefore aims to enhance the global exploration capacity, increase the convergence speed, and eventually balance the exploration and exploitation capacity of the basic WOA. The newly developed method's performance is tested on a set of thirty-six benchmark test functions and contrasted with five well-known algorithms of the meta-heuristic field. Among the algorithms compared, two are the component algorithms of WOAmM, i.e., WOA and SOS. The other three are DE (Storn and Price, 1997), PSO (Kennedy and Eberhart, 1995), and BOA (Arora and Singh, 2018). To benchmark the algorithm's performance in detail, it is also compared with five recently improved algorithms. Three of those are recently modified versions of WOA, namely LWOA (Ling et al., 2017), IWOA (Mostafa Bozorgi and Yazdani, 2019), and QIWOA (Sun et al., 2019), and one is a modified version of SOS, i.e., mSOS (Do and Lee, 2017). Another benchmarked method is mMBOA (Sharma and Saha, 2019), proposed by Sharma et al. by merging the 'mutualism phase' of the SOS algorithm with BOA to increase BOA's exploitation ability. The IEEE CEC 2019 benchmark function suite, a set of ten highly challenging and complicated functions, is also evaluated using WOAmM. The results are compared with five recent basic algorithms


and five modified algorithms. Statistical analyses, like Friedman's rank test and box plot analysis, have been performed to evaluate the effectiveness of the proposed algorithm WOAmM. The speed of finding the optimal solution is tested through convergence graphs and their analysis. Finally, WOAmM has been used to solve six real-world engineering design problems, and its performance is compared with a wide range of algorithms. The performance analysis shows that the proposed algorithm is efficient in finding the optimal solution, in terms of both numerical results and convergence speed, in most cases.

The rest of this paper is organized as follows. In Section 2, some previous improvement works done on WOA, SOS, and a few basic algorithms are reviewed. In Section 3, the working of the Whale Optimization Algorithm is described. A brief description of the mutualism phase of the SOS algorithm is given in Section 4. Section 5 describes the newly proposed Whale Optimization Algorithm with a modified mutualism phase (WOAmM). The simulation results, convergence analysis, the application to six real-world problems, and a brief discussion of the outcome are given in Section 6. Section 7 finally concludes the study and suggests future directions.

2. Literature review

WOA is a well-regarded meta-heuristic compared to other such algorithms in finding optimal solutions for optimization problems (Ling et al., 2017), but it suffers from certain limitations. WOA suffers from entrapment at local optima and a slow convergence rate (Mohammed et al., 2019). Moreover, the exploitation process in WOA gets more preference than exploration, as exploration is only selected for |A| ≥ 1, where A is a coefficient (Sun et al., 2019). To improve its performance, a decent number of works have been carried out by many researchers; some of them are mentioned here. The Levy flight trajectory based WOA (LWOA) (Ling et al., 2017) used Levy flight to increase diversity in the solution for local optima avoidance, balancing exploration and exploitation in WOA, and applied it to infinite impulse response model identification. A modified WOA (MWOA) (Xu et al., 2018) was proposed by Xu et al., introducing a non-linear dynamic strategy based on a cosine function; Levy flight and quadratic interpolation are also added to avoid local optima and make solutions more accurate. The algorithm, with improved global and local search capability, is used to solve large-scale global optimization problems. Balanced WOA (BWOA) (Chen et al., 2019) was developed by synchronously introducing Levy flight and chaotic local search to alleviate premature convergence and local optima stagnation in WOA; it was used to solve complex constrained engineering design problems. In WOA with quadratic interpolation (QIWOA) (Sun et al., 2019), the authors proposed a direction vector to increase exploration and control the early convergence of WOA; quadratic interpolation is used to enhance the exploitation ability and solution accuracy while solving problems of higher dimension. Lamarckian learning-based WOA (WOALam) (Zhang and Liu, 2019) uses good point set theory to initialize the population. The development potential of each individual is calculated using the upper confidence bound algorithm, and individuals with better development potential are selected for performing local search. This process strengthens the local search, improves the algorithm's efficiency, and allows solving high-dimension optimization problems.

Reinforced WOA (RDWOA) (Chen et al., 2019) was proposed using a double adaptive weight strategy. This strategy's inclusion increases exploration during the early phases of the algorithm, while exploitation gets preference in later stages. A random replacement strategy is incorporated to enhance the convergence speed of the algorithm. The overall improvement significantly boosts performance when solving benchmark functions and engineering problems. WOA was hybridized with DE to form a new variant, Improved WOA (IWOA) (Mostafa Bozorgi and Yazdani, 2019). The high exploration ability of DE is utilized to alleviate the suffering of WOA from early convergence and stagnation in local optima; the integration of DE increases the global search ability of WOA and thereby balances exploration and exploitation of the algorithm while solving global optimization problems. The Nelder-Mead local search algorithm was merged with WOA to develop a hybrid variant (HWOANM) (Yildiz, 2019); the integration aimed to increase the convergence speed of WOA for solving design and manufacturing problems. Hybrid algorithms incorporating WOA as one component algorithm have also been developed and used by their authors to solve specific problems (Natesan and Chokkalingam, 2019; Xiong et al., 2019; Memarzadeh et al., 2020).

SOS is another effective algorithm, governed by interactive relationships in the ecosystem. Although SOS's performance is very competitive, the algorithm adopts greediness while updating the solutions, so there is a chance of trapping into local optima. Though SOS has only two control parameters, it is hard to tune them (Nama et al., 2016). To overcome these disadvantages, many improvements and hybridizations have been performed on SOS; some of them follow. Modified SOS (mSOS) (Do and Lee, 2017) is a modification of the basic SOS algorithm to increase diversity within the search process and balance local and global search strategies. Here, both the mutualism and commensalism phases are altered, and the parasitism phase is eliminated from the basic SOS algorithm. The modified mutualism phase executes by selecting a random individual Pk and updating the current individual Pi and another random individual Pj using Pk instead of the best individual Pbest. In the commensalism phase, a narrower range for the random value is recommended to speed up convergence. Five complex truss weight minimization problems were solved using the method. Another modified variant of SOS (Kumar et al., 2018) employs an adaptive benefit factor in the mutualism phase and a modified parasitism phase; the modified algorithm possesses enhanced efficacy and a proper balance between local and global search, and solves structural optimization problems.

Quasi-opposition-based learning (QOBL) was coupled with SOS to develop a new algorithm, QOSOS (Nama and Saha, 2018). Quasi-opposition-based learning is used to increase exploration ability, hence balancing the local search and global search phases of the SOS algorithm. The new algorithm is used to solve unconstrained optimization problems; it is applied to radial distribution networks and is also used to solve structural optimization problems. Perturbed global crossover operator SOS (PGCSOS) (Zhao and Liu, 2019) assimilates a perturbed crossover scheme in the SOS algorithm's parasitism phase to maintain balance between global search and local search when solving global optimization problems.

In Improved SOS (ISOS) (Çelik, 2020), the author used quasi-oppositional-based learning while generating the initial population and in the algorithm's parasitism phase. A significant and alternate parasitism phase is also offered; the unnecessary exploration issue of the SOS algorithm is refined by using these two parasitism approaches. A chaotic local search based on a linear chaotic map is also employed to increase local search, and the algorithm is used to solve global optimization problems. Self-adaptive benefit factor-based improved SOS (SaISOS) (Nama et al., 2020) is a modified SOS algorithm with self-adaptive benefit factors and a modified mutualism phase. It also includes a random weighted reflection coefficient and a new control operator. The overall arrangement improves the performance of the basic SOS algorithm; hence, it was used for solving complex optimization problems.

Similarly, many works can be found in the literature where existing algorithms are modified to improve their efficiency. One such work is the PSO algorithm based on fitness performance (PSOFAP) (Li and Cheng, 2017), which was introduced with a fitness-varying ideal velocity function called a transformation cosine function. Parameters are evaluated using the transformed cosine as the objective function, and the absolute velocity of a particle is represented using fitness performance grades. This arrangement enhanced the convergence speed and the quality of the solution, and the parameters were adapted effectively.

In another study, an improved particle swarm optimization (Wang et al., 2019) was introduced with a machine-unavailability constraint-


based decoding scheme and a transformation mechanism for population initialization. A new particle movement method was also presented using position changes and random inertia weight. Finally, it was used to solve the job scheduling problem with unexpected job arrivals. In N-state switching PSO (NS-SPSO) (Rahman et al., 2020), the authors calculated the population diversity and the average distance of a particle from the global best particle through the Euclidean distance and used them to update the velocity of the particle. Inertia weight and acceleration coefficients were calculated from local search, global search, convergence speed, and local optima avoidance ability, and assigned in N steps. The improvement was useful on low-dimensional problems.

A number of well-known modifications of Differential Evolution (DE) include Success History based parameter Adaptation for Differential Evolution (SHADE) (Tanabe and Fukunaga, 2013), in which a historical memory was introduced to store parameter values of previous generations. Parameters for every individual in a generation were calculated using the successful pairs of parameters stored in the historical memory. This modification enhanced the ability of DE, and the authors proved its efficiency in evaluating complex CEC functions. SHADE with linear population size reduction (LSHADE) (Tanabe and Fukunaga, 2014) was introduced by reducing the population size according to a linear schedule; the decline of the population decreased the complexity of the algorithm and hence increased the efficiency. Later on, LSHADE was modified by researchers introducing various techniques. A recent variant of the SHADE family is self-adaptive LSHADE-cnEpSIN (SALSHADE-cnEpSIN) (Salgotra et al., 2019), where parameters are adapted using a Weibull-distribution-based scaling factor and an exponentially decreasing crossover rate; the authors also proved its efficiency in evaluating recent complex CEC functions.

The cuckoo search algorithm was hybridized with DE (CSDE) (Xia and Zhengb, 2020). A control parameter is introduced to estimate population diversity in the cuckoo search algorithm's local search phase; when diversity in the population is at a minimum, DE is used to increase the strength of the solution. The algorithm's global search phase uses mutation and random-walk-based population generation with a self-adaptive crossover rate, which allows the algorithm to climb out of local optima. A hybrid teaching-learning-based optimization algorithm (HTLBO) (Nama et al., 2020) was proposed by hybridizing quadratic approximation (QA) with an improved TLBO algorithm. The modification of TLBO is done by introducing an adaptive teaching factor, and the integration of QA boosts the global as well as the local search capacity of the TLBO algorithm. The performance of HTLBO was confirmed by optimizing benchmark functions and solving a real-life problem. An adaptive control parameter based improved backtracking search algorithm (IBSA) (Nama et al., 2017) was proposed by making the parameter F adaptive. Parameter F is used in BSA to control the direction of the search process; for each individual of the population, F is generated automatically between [0.45, 1.99]. The introduction of an adaptive F automatically controls the probability of mutation in BSA. IBSA was used to determine the active earth pressure on a retaining wall supporting c-ϕ backfill using the pseudo-dynamic method.

From the above discussion, it is apparent that, like other algorithms, WOA, SOS, TLBO, DE, and PSO have been improved, modified, and hybridized in many studies since their inception. The upgraded variants have also been used by researchers to optimize different problems efficiently.

3. Whale optimization algorithm (WOA)

WOA is a metaheuristic developed by Mirjalili and Lewis (Mirjalili and Lewis, 2016; see also https://seyedalimirjalili.com/woa), which mimics the humpback whales' foraging behavior. Like other optimization methods, WOA's optimization process starts with the initialization of a random population. The whole search process is divided into three phases: searching the prey, encircling the target, and the spiral bubble-net feeding maneuver. As an algorithm's performance depends on its balance between global and local search mechanisms, WOA uses these three phases to balance exploration and exploitation depending on some criteria. The process finally terminates based on a pre-defined criterion.

3.1. Searching the prey

Depending on its present location, a whale randomly searches for the target. This characteristic of the humpback whale is used here to amplify the exploration capability of the algorithm. The behavior can be written mathematically as:

D = |C · P_rnd^(k) − P^(k)|    (1)

P^(k+1) = P_rnd^(k) − A · D    (2)

where P indicates a position vector of the population, P_rnd is a vector chosen randomly from the present population, the current iteration is denoted by k, D is the distance between the random and current individual of the population, the dot (·) operator indicates element-by-element multiplication, and | | denotes the absolute value.

The two parameters known as coefficient vectors, A and C, are calculated as follows:

A = 2 a1 × rnd − a1    (3)

C = 2 × rnd    (4)

where the value of a1 is decreased linearly from 2 to 0 during the iterations, and rnd is a random number within the interval [0, 1] (Mohapatra et al., 2017).

3.2. Encircling the prey

During this phase, the best solution found so far is assumed to be close to the optimal value. The remaining individuals in the population update their positions near the present best solution. This behavior can be mathematically defined as:

D = |C · P_best^(k) − P^(k)|    (5)

P^(k+1) = P_best^(k) − A · D    (6)

where P_best is the current best solution.

3.3. Bubble-net attacking strategy

Humpback whales follow a bubble-net attack in which the whales move on a helix-shaped path. In WOA, while searching, the bubble-net strategy is implemented as:

D* = P_best^(k) − P^(k)    (7)

P^(k+1) = D* · e^(bl) · cos(2πl) + P_best^(k)    (8)

where b is a constant used to define the shape of the logarithmic spiral, and l is a random number calculated according to the following equation:

l = (a2 − 1) · rnd + 1    (9)

where the value of a2 is decreased linearly from −1 to −2 during the iteration process, and rnd is an arbitrary number in the range [0, 1].

During the search process, switching between the exploration and exploitation process is chosen depending on the value of |A|. If |A| ≥ 1, the exploration process comes into effect, which enables global search by using Eqs. (1) and (2). When |A| < 1, the positions of individuals are updated using Eq. (6) or Eq. (8). Depending on a probability value β, which is 0.5 for each strategy, the WOA process switches between encircling the prey and the bubble-net attacking strategy. This characteristic can be formulated as:

P^(k+1) = P_best^(k) − A · D                          if β < 0.5
P^(k+1) = D* · e^(bl) · cos(2πl) + P_best^(k)         if β ≥ 0.5    (10)
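As a compact illustration of the update rules above, the following Python sketch performs one WOA move for a single agent. It is a minimal sketch under stated assumptions: A and C are drawn as scalars, rng is a NumPy random Generator, and names such as woa_update are illustrative choices rather than identifiers from the paper.

import numpy as np

def woa_update(pos, best, pop, a1, a2, b, rng):
    """One WOA position update for a single agent `pos`, following Eqs. (1)-(10)."""
    A = 2.0 * a1 * rng.random() - a1      # Eq. (3), a1 decreases linearly 2 -> 0
    C = 2.0 * rng.random()                # Eq. (4)
    beta = rng.random()                   # strategy-switch probability, Eq. (10)
    if beta < 0.5:
        if abs(A) >= 1.0:                 # exploration: move w.r.t. a random agent
            p_rnd = pop[rng.integers(len(pop))]
            D = np.abs(C * p_rnd - pos)   # Eq. (1)
            return p_rnd - A * D          # Eq. (2)
        D = np.abs(C * best - pos)        # Eq. (5): encircling the current best
        return best - A * D               # Eq. (6)
    l = (a2 - 1.0) * rng.random() + 1.0   # Eq. (9), a2 decreases from -1 to -2
    D_star = best - pos                   # Eq. (7)
    return D_star * np.exp(b * l) * np.cos(2.0 * np.pi * l) + best   # Eq. (8)

In a full run, a1 and a2 would be recomputed at every iteration (a1 from 2 to 0, a2 from −1 to −2), and b is a constant, set to 1 in Table 1.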


Fig. 1. Pseudo-code of the WOAmM algorithm.
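The pseudo-code figure itself is not reproduced in this text-only version, so the following minimal Python sketch outlines the same loop structure described later in Section 5: at every iteration the modified mutualism phase (Eqs. (15)-(18)) runs first, the global best is refreshed, and the usual WOA update then moves each agent. Function and variable names (woamm, sphere objective, etc.) are illustrative assumptions, not identifiers from the paper, and the WOA move is simplified to scalar A and C.

import numpy as np

def woamm(objective, dim, lb, ub, pop_size=30, max_iter=500, b=1.0, seed=0):
    """Sketch of one possible WOAmM loop: modified mutualism phase + WOA step."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, size=(pop_size, dim))
    fit = np.array([objective(p) for p in pop])

    for k in range(max_iter):
        # ---- modified mutualism phase, Eqs. (15)-(18) ----
        for i in range(pop_size):
            m, n = rng.choice([x for x in range(pop_size) if x != i], 2, replace=False)
            guide, other = (m, n) if fit[m] < fit[n] else (n, m)  # fitter random partner leads
            mv = (pop[i] + pop[other]) / 2.0                      # MV = Mean(Pi, P_other)
            bf1, bf2 = rng.integers(1, 3, size=2)                 # benefit factors in {1, 2}
            for idx, bf in ((i, bf1), (other, bf2)):
                cand = np.clip(pop[idx] + rng.random(dim) * (pop[guide] - mv * bf), lb, ub)
                f = objective(cand)
                if f < fit[idx]:                                  # greedy acceptance
                    pop[idx], fit[idx] = cand, f

        best = pop[np.argmin(fit)].copy()                         # refreshed global best

        # ---- standard WOA position update, Eqs. (1)-(10) ----
        a1 = 2.0 * (1.0 - k / max_iter)       # decreases 2 -> 0
        a2 = -1.0 - k / max_iter              # decreases -1 -> -2
        for i in range(pop_size):
            A = 2.0 * a1 * rng.random() - a1
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                ref = pop[rng.integers(pop_size)] if abs(A) >= 1.0 else best
                new = ref - A * np.abs(C * ref - pop[i])
            else:
                l = (a2 - 1.0) * rng.random() + 1.0
                new = (best - pop[i]) * np.exp(b * l) * np.cos(2.0 * np.pi * l) + best
            pop[i] = np.clip(new, lb, ub)
            fit[i] = objective(pop[i])

    i_best = np.argmin(fit)
    return pop[i_best], fit[i_best]

# usage example: minimize the Sphere function in 30 dimensions
best_x, best_f = woamm(lambda x: float(np.sum(x * x)), dim=30, lb=-100.0, ub=100.0)
print(best_f)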

4. Mutualism phase of the symbiotic organisms search (SOS) algorithm

Cheng and Prayogo (Cheng and Prayogo, 2014) introduced SOS, a population-based meta-heuristic technique inspired by the natural ecosystem. SOS uses the symbiotic relationship between two distinct species. The symbiotic relationships common in the real world are mutualism, commensalism, and parasitism. As the present work considers only the modified mutualism phase of SOS, in this section only the mutualism part of SOS is discussed.

Mutualism can be explained as the inter-dependent relationship between two organisms in which both benefit from the interaction. The relationship between bees and flowers is an example of mutualism: bees move among the flowers, collect nectar, and turn it into honey, and this activity also benefits the flowers because it helps them in the pollination process. The process can be written mathematically as:

P_i^(k+1) = P_i^k + rnd · (P_best − MV · BF1)    (11)

P_j^(k+1) = P_j^k + rnd · (P_best − MV · BF2)    (12)

where P_i is the ith member of the population and P_j is an organism selected randomly to interact with P_i; both organisms work on a mutual basis for survival in the ecosystem. rnd is a random number with a uniform distribution between [0, 1], MV is the mutual vector, BF is the benefit factor, k is the generation, and P_best is the best individual organism obtained in the kth generation. MV and BF are calculated as follows:

MV = (P_i + P_j) / 2    (13)

BF = round(1 + rnd)    (14)

The round function is used to set the value of BF to one or two. BF is used to identify whether an organism partially or fully benefits from the interaction among individuals of the population.
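For contrast with the modified phase used in WOAmM (sketched after Fig. 1), the snippet below is a minimal rendering of the standard SOS mutualism update of Eqs. (11)-(14), in which both interacting organisms move towards the current best individual. The function name and the greedy acceptance step are illustrative assumptions rather than details taken from the paper.

import numpy as np

def sos_mutualism(pop, fit, objective, lb, ub, rng):
    """Standard SOS mutualism phase, Eqs. (11)-(14)."""
    n, dim = pop.shape
    best = pop[np.argmin(fit)]                     # P_best of this generation
    for i in range(n):
        j = rng.choice([x for x in range(n) if x != i])
        mv = (pop[i] + pop[j]) / 2.0               # mutual vector, Eq. (13)
        bf1, bf2 = rng.integers(1, 3, size=2)      # benefit factors, Eq. (14)
        new_i = np.clip(pop[i] + rng.random(dim) * (best - mv * bf1), lb, ub)  # Eq. (11)
        new_j = np.clip(pop[j] + rng.random(dim) * (best - mv * bf2), lb, ub)  # Eq. (12)
        for idx, cand in ((i, new_i), (j, new_j)):
            f = objective(cand)
            if f < fit[idx]:                       # keep the fitter organism
                pop[idx], fit[idx] = cand, f
    return pop, fit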


5. Proposed whale optimization algorithm with a modified mutualism phase (WOAmM)

According to the new search process, after initializing the population, each individual's fitness is evaluated to find the global best solution. Afterward, at every iteration, the modified mutualism phase is executed first. In this phase, for every individual P_i, two random individuals P_m and P_n are selected from the population such that i ≠ m ≠ n. The individual with the minimum fitness among these two randomly chosen individuals is used to enumerate the new value of the present individual P_i and of the other random individual. If P_m is the individual with the minimum fitness between the two, the updating process is as follows:

P_i^(k+1) = P_i^k + rnd(0, 1) × (P_m − MV × BF1)    (15)

P_n^(k+1) = P_n^k + rnd(0, 1) × (P_m − MV × BF2)    (16)

Otherwise,

P_i^(k+1) = P_i^k + rnd(0, 1) × (P_n − MV × BF1)    (17)

P_m^(k+1) = P_m^k + rnd(0, 1) × (P_n − MV × BF2)    (18)

where MV is computed as Mean(P_i, P_n) in the first case and Mean(P_i, P_m) in the second case. BF1 and BF2 are benefit factors with a randomly generated value of either one or two, and rnd is a random number in the range [0, 1]. The individual with the better fitness is accepted. Updating two individuals of the population at a time, using the better of the two randomly selected individuals, enables higher diversity in the solution, while the selection of the best one among the two random individuals during the modified mutualism phase guides the solution towards the global best solution. The global best solution is evaluated after the modified mutualism phase and updated if it is better than the present best solution, and the updated global best solution is then used to carry out the WOA process. Therefore, the lack of exploration ability of WOA is managed here by integrating the modified mutualism phase. The increase of diversity in the solution due to the modified mutualism phase also reduces the chance of the WOA process being trapped in local optima, and the selection of the best random individual during the modified mutualism phase and of the global best individual during the local search phase of WOA increases the algorithm's convergence speed. The process terminates, giving an optimal solution as output, when the termination criterion is satisfied. The proposed method's pseudo-code is shown in Fig. 1, in which lines 6 to 17 make the proposed WOAmM different from the original WOA.

6. Experimental study

The newly developed WOAmM algorithm's efficiency is evaluated by applying it to optimize 36 (thirty-six) well-known classical benchmark functions comprising unimodal, fixed-dimensional unimodal, multimodal, and fixed-dimensional multimodal functions, and the CEC 2019 test suite functions (Price et al., 2018). The CEC 2019 data set contains ten highly challenging and complex 100-digit composition functions. Unimodal functions, having only one local optimum, are used to test the algorithm's exploitation and convergence ability. Multimodal functions have several local optima and are used to test an algorithm's exploration and local optima avoidance ability. F1 to F16, listed in Table (a) of Appendix A, are unimodal functions; F17 to F36, listed in Table (b) of Appendix A, are multimodal functions; and F37 to F46 are the CEC 2019 test suite functions listed in Table (c) of Appendix A. The evaluated results of WOAmM are compared with five well-known basic algorithms as well as with the same number of modified algorithms. While displaying the results, evaluated values that are less than or equal to 10^-10 are represented as ≈0, but the comparison of the results has been done with the actual values. In the experimental setup, MATLAB R2015a with an Intel i3 processor and 8 GB of RAM is used under the Windows 10 operating system. For a fair comparison, a fixed population size (NP) of 30 and a maximum of 500 iterations are used as the termination criterion for all the algorithms. QIWOA (Sun et al., 2019) and WOALam (Zhang and Liu, 2019) are two enhanced variants of WOA; the termination criterion of the proposed algorithm is chosen similar to these algorithms.

Table 1
Parameters of the basic algorithms.
Method: Parameter values
WOA: a = 2 to 0, p = rand[0, 1], b = 1, l = −1 to 1
SOS: BF1 = either 1 or 2, BF2 = either 1 or 2
BOA: p = 0.8, c = 0.01, a = 0.1 to 0.3
DE: F = 0.5, CR = 0.9, NP = 30
PSO: w = 0.9 to 0.4, C1 = C2 = 2
WOAmM: parameters of WOA, BF1 = either 1 or 2, BF2 = either 1 or 2

6.1. Comparison of WOAmM with other basic algorithms

The efficiency of WOAmM has been measured by comparing it to the component algorithms WOA and SOS and three other well-known algorithms, DE, PSO, and BOA. All the algorithms are executed 30 times for each function; the dimension value for the variable-dimension (D) benchmark functions is fixed at 30. The associated parameter values used for each of the algorithms are given in Table 1. The parameter settings of the compared algorithms are taken from the literature: the settings of WOA, SOS, BOA, DE, and PSO are taken from (Mirjalili and Lewis, 2016; Cheng and Prayogo, 2014; Arora and Singh, 2018; Rahnamayan et al., 2008; Kennedy and Eberhart, 1995), respectively. According to those studies, these algorithms work better with these parameter settings, which are again given in Table 1. In this study, we compare the performance of the proposed method with the above five basic methods. Results are calculated and tabulated showing the average (mean), stdv (standard deviation), and the best value found during the search, and are given in Tables 2a-2c.

In Tables 2a-2c, boldface indicates the better values; boldface with the symbol ≈ is used to show similar values. Table 3 shows the number of occasions where WOAmM performs better than, worse than, or similar to the algorithms WOA, SOS, BOA, DE, and PSO, considering the mean value of each function. Table 3 depicts that WOAmM works better on 31, 17, 34, 28, and 28 occasions than WOA, SOS, BOA, DE, and PSO, respectively. Also, it works similarly on 5, 10, 1, 4, and 4 benchmark functions and performs worse than WOA, SOS, BOA, DE, and PSO on 0, 9, 1, 4, and 4 occasions, respectively. The numerical results of WOAmM on functions F1, F2, F3, F4, F8, F10, F11, F12, F13, F14, F15, F16, F19, and F25 are better than those of the other compared methods. On functions F6, F18, F20, F23, F24, F29, F30, F31, F34, and F36, WOAmM shares optimal results with at least one algorithm from the algorithms compared.

Table 2a
Optimization experimental results for functions F1 to F12 using different metaheuristics with NP = 30 & D = 30.
Function ID WOAmM WOA SOS BOA DE PSO
− 01 − 01
F1 Average 0 ≈
0 ≈
0 1.67 × 10 9.19 × 10 3.95 × 10− 03
Stdv 0 ≈
0 ≈
0 8.63 × 10− 03 2.51 × 100 3.40 × 10− 03
Best 0 ≈
0 ≈
0 1.51 × 10− 01 1.19 × 10− 04 4.33 × 10− 04
F2 Average ≈
0 ≈
0 ≈
0 2.24 × 10− 01 2.05 × 10− 02 9.54 × 10− 03
Stdv 0 ≈
0 ≈
0 9.92 × 10− 02 7.80 × 10− 02 5.46 × 10− 03
Best ≈
0 ≈
0 ≈
0 3.13 × 10− 02 1.56 × 10− 03 2.91 × 10− 03
F3 Average 0 4.51 × 1004 ≈
0 1.98 × 10− 01 4.29 × 1002 4.83 × 1001
Stdv 0 1.18 × 1004 ≈
0 1.32 × 10− 02 2.23 × 1002 1.91 × 1001
Best 0 1.81 × 1004 ≈
0 1.76 × 10− 01 1.09 × 1002 1.85 × 1001
F4 Average ≈
0 4.96 × 1001 ≈
0 5.89 × 10− 01 1.80 × 1001 7.78 × 10− 01
Stdv 0 2.97 × 1001 ≈
0 2.59 × 10− 02 5.06 × 100 1.28 × 10− 01
Best ≈
0 6.99 × 10− 01 ≈
0 5.15 × 10− 01 9.83 × 100 5.33 × 10− 01
F5 Average 2.66 × 1001 2.81 × 1001 2.09 × 1001 2.89 × 1001 8.52 × 1002 4.91 × 1001
Stdv 3.54 × 10− 01 4.52 × 10− 01 1.06 × 100 3.02 × 10− 02 2.85 × 1003 4.33 × 1001
Best 2.60 × 1001 2.73 × 1001 1.86 × 1001 2.88 × 1001 2.83 × 1001 2.14 × 1001
F6 Average 0≈ 0 0 0 7.20 × 100 2.67 × 10− 01
Stdv 0 0 0 0 5.26 × 100 5.83 × 10− 01
Best 0≈ 0 0 0 1.00 × 100 0
F7 Average 1.91 × 10− 03 4.64 × 10− 03
6.03 × 10− 04 4.83 × 10− 04
5.27 × 10− 02 2.66 × 10− 02
Stdv 5.33 × 10− 03 5.41 × 10− 03
2.96 × 10− 04 2.81 × 10− 04
3.05 × 10− 02 8.85 × 10− 03
05
Best 1.89 £ 10¡05 8.16 × 10− 1.86 × 10− 04 7.99 × 10− 05
1.69 × 10− 02 1.40 × 10− 02
01
F8 Average 0 ≈
0 ≈
0 4.05 × 10− 1.27 × 1008 9.47 × 1004
02
Stdv 0 ≈
0 ≈
0 2.41 × 10− 6.60 × 1008 6.46 × 1004
01
Best 0 ≈
0 ≈
0 3.54 × 10− 2.25 × 1004 1.93 × 1004
05 02
F9 Average ≈
0 2.44 × 10− ≈
0 2.58 × 10− ≈
0 3.22 × 10− 07
05 03
Stdv ≈
0 1.62 × 10− ≈
0 8.45 × 10− ≈
0 3.15 × 10− 07
03
Best ≈
0 ≈
0 ≈
0 8.45 × 10− ≈
0 ≈
0
02
F10 Average 0 ≈
0 ≈
0 7.28 × 10− 8.06 × 10− 04 ≈
0
02
Stdv 0 ≈
0 0 3.35 × 10− 2.87 × 10− 03 ≈
0
03
Best 0 ≈
0 ≈
0 3.06 × 10− 3.10 × 10− 08 ≈
0
01
F11 Average 0 ≈
0 ≈
0 3.89 × 10− 3.29 × 1007 1.17 × 1004
02
Stdv 0 ≈
0 ≈
0 2.19 × 10− 1.27 × 1008 1.07 × 1004
01
Best 0 ≈
0 ≈
0 3.46 × 10− 1.06 × 1003 1.83 × 1003
02
F12 Average 0 ≈
0 ≈
0 2.47 × 10− 1.04 × 10− 02 7.25 × 10− 06
02
Stdv 0 ≈
0 ≈
0 1.45 × 10− 3.21 × 10− 02 5.35 × 10− 06
03
Best 0 ≈
0 ≈
0 4.49 × 10− 5.20 × 10− 06 6.87 × 10− 07

The performance validates that WOAmM is superior in searching for an optimal solution on both unimodal and multimodal functions. Though the evaluated average results of WOAmM on F27 and F33 are the same as those of other algorithms, the standard deviation value is greater. Convergence analysis (Fig. 5) reveals that the proposed algorithm is faster in searching for optimal solutions. The result of Friedman's test (Table 4) exposes the proposed algorithm's supremacy over the other compared algorithms, and the boxplot analysis (Fig. 2) confirms the consistency of the search for the optimal solution. These results show that the proposed WOAmM algorithm is strong enough to escape from local minima and yields better results overall.

6.2. Comparison of WOAmM with other modified algorithms

The performance of WOAmM is measured exhaustively by comparing it to three modified versions of WOA, namely QIWOA, LWOA, and IWOA; to mSOS, a modification of one of WOAmM's component algorithms; and to mMBOA, a modified version of BOA in which BOA has been enhanced using the mutualism phase of SOS. Every algorithm is executed 30 times for each function listed in Table (a) and Table (b) of Appendix A. The dimension value for the variable-dimension (D) benchmark functions is fixed at 100 to test efficiency on higher-dimensional problems. The associated parameter values used for each of the algorithms are presented in Table 5; the parameter values used here are as suggested in the respective studies. Results are calculated and tabulated showing the average (mean), stdv (standard deviation), and the best value, and are presented in Tables 6a-6c. Some of the results of QIWOA and LWOA are taken from (Sun et al., 2019). In the tables, boldface indicates the proposed method's better performance, and boldface with the symbol ≈ is used to show similar values. Table 7 shows the number of functions where WOAmM performs better than, worse than, or similar to the methods QIWOA, LWOA, IWOA, mSOS, and mMBOA, considering the mean value of each function. It can be seen that WOAmM is better on 15, 23, 26, 17, and 19 benchmark functions, respectively, and similar results are found on 14, 9, 5, 14, and 9 benchmark functions. On 7, 4, 5, 5, and 8 occasions, WOAmM performs worse than the respective algorithms.

The results of WOAmM on functions F1, F3, F6, F8, F10, F11, F12, F13, F14, F15, F16, F18, F19, F20, F23, F24, F29, F30, F31, F35, and F36 are optimal or close to the optimal value. On functions F1 to F6, QIWOA is the best among the algorithms, though on F1, F3, and F6 it shares similar results with the proposed WOAmM. An increase in the dimension slightly degrades the performance of WOAmM on functions F2, F4, and F5. The performance of WOAmM on functions F5, F7, F17, F27, F32, and F35 is the worst among most of the compared algorithms. The result recorded for function F29 is the same for all the algorithms. Convergence analysis (Fig. 6) shows the competency of the algorithm in finding an optimal solution. The newly developed algorithm's statistical performance is verified by employing a non-parametric statistical test, the Friedman rank test. The mean values of the functions evaluated by all the algorithms are used to find the compared algorithms' ranks, shown in Table 8. It is observed from the test results that the mean rank of WOAmM is the minimum among all methods. All these findings confirm the competitiveness of WOAmM.

6.3. Comparison of results of CEC 2019 functions with basic algorithms

The CEC 2019 benchmark functions are evaluated using WOAmM, and the results are compared with those of WOA (Mirjalili and Lewis, 2016), MFO (Mirjalili, 2015), BOA (Arora and Singh, 2018), SCA (Mirjalili, 2016), and JAYA (Venkata Rao, 2016). For all the algorithms, a population size of 50 is used, and a maximum of 1000 iterations is used as the termination criterion. Every algorithm is executed 30 times for each function listed in Table (c) of Appendix A.
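As a brief illustration of the statistical treatment mentioned above, the snippet below shows how Friedman's rank test and the mean ranks could be computed with SciPy and NumPy for a matrix of mean results (rows = benchmark functions, columns = algorithms). The array contents are placeholder values for illustration only, not the paper's data.

import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# placeholder mean-result matrix: one row per benchmark function,
# one column per algorithm (e.g. WOAmM, WOA, SOS, BOA, DE, PSO)
means = np.array([
    [0.0,    1e-8,   0.0,    1.7e-1, 9.2e-1, 4.0e-3],
    [2.7e-3, 3.8e-2, 0.0,    8.4e-1, 7.6e-1, 4.3e-1],
    [2.1e-1, 2.4e0,  2.1e-6, 1.4e1,  3.2e-1, 5.3e-1],
])

# Friedman test: do the algorithms' results come from the same distribution?
stat, p_value = friedmanchisquare(*means.T)

# mean rank per algorithm (rank 1 = best on a function); lower is better
ranks = np.vstack([rankdata(row) for row in means])
mean_ranks = ranks.mean(axis=0)

print(f"Friedman statistic = {stat:.3f}, p-value = {p_value:.3g}")
print("mean ranks:", np.round(mean_ranks, 3))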


Table 2b
Optimization experimental results for functions F13 to F24 using different metaheuristics with NP = 30 & D = 30.
Function ID WOAmM WOA SOS BOA DE PSO
− 02 03
F13 Average 0 ≈
0 ≈
0 1.15 × 10 1.16 × 10 2.77 × 10− 05
Stdv 0 ≈
0 0 4.90 × 10− 04 5.71 × 1003 3.44 × 10− 05
Best 0 ≈
0 ≈
0 1.08 × 10− 02 7.90 × 10− 07 4.09 × 10− 07
F14 Average 0 ≈
0 ≈
0 2.26 × 10− 02 3.35 × 10− 04 5.32 × 10− 07
Stdv 0 ≈
0 ≈
0 7.18 × 10− 04 9.82 × 10− 04 4.66 × 10− 07
Best 0 ≈
0 ≈
0 2.09 × 10− 02 1.91 × 10− 07 6.20 × 10− 08
F15 Average 0 ≈
0 ≈
0 8.62 × 10− 03 ≈
0 ≈
0
Stdv 0 0 0 3.67 × 10− 03 ≈
0 ≈
0
Best 0 ≈
0 ≈
0 1.62 × 10− 03 ≈
0 ≈
0
08 05
F16 Average 7.95 × 10− 6.43 × 10− ≈
0 4.67 × 10− 02 6.23 × 10− 03 ≈
0
07 04
Stdv 1.33 × 10− 1.47 × 10− ≈
0 7.69 × 10− 02 1.72 × 10− 02 ≈
0
Best ≈
0 ≈
0 ≈
0 1.73 × 10− 05 0 ≈
0
F17 Average 0≈ 0 0 2.32 × 10− 01 1.65 × 1002 3.62 × 1001
Stdv 0 0 0 2.14 × 10− 02 3.36 × 1001 1.50 × 1001
Best 0≈ 0 0 1.99 × 10− 01 2.82 × 1001 1.59 × 1001
F18 Average ≈
0 ≈
0 ≈
0 7.39 × 10− 01 7.30 × 10− 01 1.27 × 10− 02
Stdv 0 ≈
0 ≈
0 2.69 × 10− 02 8.92 × 10− 01 4.54 × 10− 03
Best ≈
0≈ ≈
0 ≈
0 6.68 × 10− 01 1.74 × 10− 03 5.78 × 10− 03
03
F19 Average 0≈ 5.07 × 10− 0 1.01 × 100 1.29 × 10− 01 2.95 × 10− 02
02
Stdv 0 2.78 × 10− 0 9.19 × 10− 03 1.74 × 10− 01 3.81 × 10− 02
Best 0≈ 0 0 9.84 × 10− 01 7.68 × 10− 04 1.93 × 10− 03
03 02
F20 Average 2.72 × 10− 3.81 × 10− ≈
0 8.38 × 10− 01 7.64 × 10− 01 4.32 × 10− 01
03 02
Stdv 3.07 × 10− 4.59 × 10− ≈
0 1.77 × 10− 01 1.03 × 100 5.15 × 10− 01
04 03
Best 2.88 × 10− 5.78 × 10− ≈
0 3.91 × 10− 01 3.67 × 10− 05 2.68 × 10− 06
01 01 02
F21 Average 1.27 × 10− 4.96 × 10− 1.71 × 10− 3.24 × 100 9.43 × 1002 2.71 × 10− 03
02 01 02
Stdv 9.13 × 10− 2.70 × 10− 3.40 × 10− 2.55 × 10− 01 4.17 × 1003 5.44 × 10− 03
02 01
Best 1.03 × 10− 1.21 × 10− ≈
0 2.57 × 100 4.03 × 10− 04 1.51 × 10− 05
F22 Average 0≈ ≈
0 0 4.99 × 10− 04 2.08 × 10− 04 ≈
0
Stdv 0 ≈
0 0 2.71 × 10− 05 5.78 × 10− 04 ≈
0
Best 0≈ ≈
0 0 4.43 × 10− 04 1.31 × 10− 07 ≈
0
F23 Average 0≈ 0 0 6.76 × 10− 02 9.87 × 10− 02 3.50 × 10− 01
Stdv 0 0 0 2.51 × 10− 03 1.05 × 10− 01 2.25 × 10− 01
Best 0≈ 0 0 6.16 × 10− 02 5.93 × 10− 07 3.22 × 10− 06
01 02
F24 Average 0 1.23 × 10− 9.99 × 10− 2.99 × 10− 01 9.11 × 10− 01 3.60 × 10− 01
02 09
Stdv 0 7.74 × 10− 1.82 × 10− 3.91 × 10− 03 2.78 × 10− 01 5.63 × 10− 02
02
Best 0 ≈
0 9.99 × 10− 2.78 × 10− 01 5.00 × 10− 01 3.00 × 10− 01

Results are calculated and tabulated showing the average (mean), stdv (standard deviation), and the best value, and are presented in Table 9. In the table, boldface indicates the superiority of the proposed method. Table 10 shows the number of occasions where WOAmM is better than, worse than, or similar to WOA, MFO, BOA, SCA, and JAYA, considering each function's mean value. It can be seen that WOAmM is better on 10, 9, 10, 9, and 10 functions, respectively, and no similar result is found. On 0, 1, 0, 1, and 0 occasions, WOAmM performs worse than the respective algorithms.

The average values evaluated by WOAmM on functions F38 to F46 are better than those of all other competitors. A similar average cost is obtained by MFO, BOA, SCA, and WOAmM on function F37; nevertheless, the higher standard deviation makes WOAmM worse than MFO and SCA on function F37. The result of Friedman's test (Table 11) exposes the supremacy of the proposed algorithm, and the boxplot analysis (Fig. 3) confirms the stability of the algorithm while finding the results. All these results support the enhanced ability of WOAmM in solving complicated optimization problems.

6.4. Comparison of results of CEC 2019 functions with modified algorithms

The CEC 2019 benchmark functions are evaluated using WOAmM, and the results are compared with those of OBSCA (Elaziz et al., 2017), AWOA (Sun and Zhang, 2018), ACWOA (Khashan et al., 2018), OWOA (Alamri et al., 2018), and HIWOA (Tang et al., 2019). For all the algorithms, a population size of 50 is used, and a maximum of 1000 iterations is used as the termination criterion. Every algorithm is executed 30 times for each function listed in Table (c) of Appendix A. Results are calculated and tabulated showing the average (mean), stdv (standard deviation), and the best value, and are presented in Table 12. In the table, boldface indicates the superiority of the proposed method. Table 13 shows the number of occasions where WOAmM is better than, worse than, or similar to the methods HIWOA, OWOA, ACWOA, AWOA, and OBSCA, considering the mean value of each function. It can be seen that WOAmM is better on 9, 10, 10, 8, and 10 functions, respectively, and no similar result is found. On 1, 0, 0, 2, and 0 occasions, WOAmM performs worse than the respective algorithms.

The average value evaluated by WOAmM on functions F38, F39, F40, F41, F42, F43, F44, and F46 is better than that of the other algorithms. A similar average value is evaluated by the algorithms HIWOA, AWOA, and WOAmM on function F37; however, WOAmM gives a higher standard deviation than these two algorithms on F37. AWOA and WOAmM evaluate average results close to the optimal value on function F45, but AWOA is slightly better. The result of Friedman's test (Table 14) exposes the supremacy of the proposed algorithm, and the boxplot analysis (Fig. 4) confirms the algorithm's consistency in finding the optimal solution. All these results confirm the efficiency of WOAmM in solving complex optimization problems.

6.5. Convergence analysis

Fig. 5 (a, b, c, d, e, f) presents some of the convergence graphs of WOAmM with its component algorithms WOA and SOS, as well as with the three other well-known algorithms BOA, DE, and PSO. The unimodal functions for which convergence graphs are given are Sphere, Cigar, and Schwefel 1.2, and convergence graphs of the multimodal functions inverted cosine mixture, Rastrigin, and Salomon are also included. Fig. 6 (a, b, c, d, e, f) presents the convergence graphs of WOAmM and the other modified algorithms included for comparison, i.e., QIWOA, LWOA, IWOA, mSOS, and mMBOA. The unimodal functions for which convergence graphs are given are Cigar and Sphere. Convergence graphs of the multimodal functions Salomon and inverted cosine mixture are also


Table 2c
Optimization experimental results for functions F25 to F36 using different metaheuristics with NP = 30 & D = 30.
Function ID WOAmM WOA SOS BOA DE PSO
02 − 02 01
F25 Average ≈
0 5.43 × 10 ≈
0 7.36 × 10 1.96 × 10 8.57 × 10− 01
Stdv 0 9.23 × 1001 ≈
0 3.25 × 10− 03 9.13 × 100 1.04 × 100
Best 0 3.28 × 1002 ≈
0 6.59 × 10− 02 4.98 × 100 1.46 × 10− 01
F26 Average 4.23 × 10− 04 7.01 × 10− 04 3.07 × 10− 04 1.02 × 10− 02 1.04 × 10− 03 4.30 × 10− 04
Stdv 2.76 × 10− 04 4.69 × 10− 04 ≈
0 2.28 × 10− 02 3.65 × 10− 03 3.45 × 10− 04
Best 3.08 × 10− 04 3.08 × 10− 04 3.07 × 10− 04 4.43 × 10− 04 3.07 × 10− 04 3.07 × 10− 04
F27 Average 3.00 × 100 3.00 × 100 3.00 × 100 8.65 × 100 3.00 × 100 4.80 × 100
Stdv 2.42 × 10− 06 2.48 × 10− 04 ≈
0 8.03 × 100 ≈
0 6.85 × 100
Best 3.00 £ 100≈ 3.00 × 100 3.00 × 100 3.04 × 100 3.00 × 100 3.00 × 100
F28 Average ¡2.81 £ 100 − 3.00 × 10− 01 − 3.00 × 10− 01 − 4.19 × 10− 01 − 3.00 × 10− 01 − 2.28 × 10− 01
Stdv 5.41 × 10− 01 ≈
0 ≈
0 1.51 × 10− 02 ≈
0 2.91 × 10− 02
Best ¡3.71 £ 100 − 3.00 × 10− 01 − 3.00 × 10− 01 − 4.41 × 10− 01 − 3.00 × 10− 01 − 2.74 × 10− 01
F29 Average 0≈ 0 0 2.88 × 10− 01 0 0
Stdv 0 0 0 4.33 × 10− 02 0 0
Best 0≈ 0 0 1.85 × 10− 01 0 0
F30 Average 0≈ 1.46 × 10− 02 0 2.06 × 10− 01 0 0
Stdv 0 5.54 × 10− 02 0 3.49 × 10− 02 0 0
Best 0≈ 0 0 6.19 × 10− 02 0 0
F31 Average 0≈ ≈
0 0 9.15 × 10− 02 0 0
Stdv 0 ≈
0 0 1.96 × 10− 02 0 0
Best 0≈ 0 0 5.70 × 10− 02 0 0
F32 Average 2.11 × 10− 01 2.43 × 100 2.12 × 10− 06 1.42 × 1001 3.19 × 10− 01 5.27 × 10− 01
Stdv 2.40 × 10− 01 3.10 × 100 4.22 × 10− 06 7.71 × 100 1.05 × 100 1.52 × 100
Best 1.76 × 10− 05 1.20 × 10− 03 7.73 × 10− 09 1.37 × 100 0 1.27 × 10− 03
F33 Average 8.84 £ 1001≈ 8.84 × 1001 8.84 × 1001 8.85 × 1001 8.84 × 1001 8.84 × 1001
Stdv 4.10 × 10− 06 4.31 × 10− 04 6.81 × 10− 05 1.22 × 10− 01 ≈
0 ≈
0
Best 8.84 £ 1001≈ 8.84 × 1001 8.84 × 1001 8.84 × 1001 8.84 × 1001 8.84 × 1001
F34 Average 1≈ 1 1 1.07 1 1
Stdv 0 ≈
0 0 2.40 × 10− 02 0 0
Best 0≈ 1 0 1.03 0 0
F35 Average − 1.06 × 1002 − 1.06 × 1002 − 1.07 × 1002 − 9.55 × 1001 − 1.07 × 1002 − 1.07 × 1002
Stdv 3.55 3.55 ≈
0 1.33 × 1001 ≈
0 ≈
0
Best − 1.07 × 1002 − 1.06 × 1002 − 1.07 × 1002 − 1.07 × 1002 − 1.07 × 1002 − 1.07 × 1002
F36 Average ¡1≈ − 9.87 × 10− 01 − 1 − 9.49 × 10− 01 − 1 − 1
Stdv 0 2.59 × 10− 02 0 1.95 × 10− 02 0 0
Best ¡1≈ − 1 − 1 9.49 × 10− 01 − 1 − 1

6.6. Description of real-life problems


Table 3
Comparison of experimental results, where the numbers represent the number of The effectiveness of any optimization algorithm depends on its effi­
functions. ciency in solving the real-life optimization problem. Real-life optimi­
WOAmM WOA SOS BOA DE PSO zation problems can be without constraints or constraints. Here we have
Superior to 31 17 34 28 28
solved one problem from the unconstrained category and five other
Similar to 5 9 1 4 4 problems from the constrained class. The description of the problems is
Inferior to 0 10 1 4 4 as follows:

6.6.1. Gear train design problem


This design problem is introduced by Sandgren (Sandgren, 1990) and
Table 4
is unconstrained in nature. Four decision variables namely, y1 ; y2 ; y3 ;
Friedman’s rank test with basic algorithms.
andy4 which indicates the number of teeth in each wheel of the gear. All
Method Rank Average Rank P-value the variables are positive integers and lie in the boundary (Sadollah
sum rank
et al., 2013; Price et al., 2018). The gear train is shown in Fig. 7. The gear
WOAmM 67 1.86 1 P-value (4.34E-19 < 0.01) indicates ratio for reducing a gear train is defined as the output shaft’s angular
WOA 129 3.583 3 that H0 is rejected at 1% level of
velocity and input shaft ratio. This design problem aims to minimize the
SOS 74.5 2.069 2 significance. i.e., there is a significant ( )
BOA 182.5 5.069 6 difference between the performance cost of gear ratio as close as possible to 6.931 1
. The mathematical
DE 160.5 4.458 5 of different methods at a 1% level of
PSO 142.5 3.958 4 significance. formulation of this problem is given by
Objective function:
[( ) ]
incorporated here. The best solution’s fitness value in each iteration is 1
Minf (x) = − (y3 y2 /y1 y4 )2
considered to draw convergence curves in these figures. From these 6.931
figures, it can be noted that the proposed WOAmM algorithm converges
faster for both types of functions, which establish that the exploration Subject to:
and exploitation capacities of the proposed algorithm are more balanced 12 ≤ yk ≤ 60, k = 1, 2, ⋯., 4.
than the other chosen algorithms.
6.6.2. Cantilever beam design problem
The cantilever beam is composed of five hollow square blocks with
constant thickness. The beam is rigidly supported at the first block, and


Fig. 2. Box plot comparison of function mean values of WOAmM with the basic methods. (N.B. Images of the box plot analysis with the basic methods for all the functions are given in Appendix B.)

Table 5 Min.f (x) = 0.06224{x1 + x2 + x3 + x4 + x5 },


Parameters of the modified algorithms. Subject to:
Method Parameter values
61 37 19 7 1
QIWOA a = (rand − 0.5)**1(iter − maxiter ), V = exp(10*a), p = rand[0, 1], q = h(x) = + + + + ≤ 1,
x31 x32 x33 x34 x35
rand[0, 1]
LWOA a = 2(1 − (iter − maxiter ) ), p = rand[0, 1], b = 1, β = 1.5 where 0.01 ≤ x(k) ≤ 100, k = 1, 2⋯⋯5.
( )
IWOA 2
a = 2 − iter , A = 2*a*rand[0, 1]*a, C = 2*rand[0, 1], p =
maxiter
(
iter
) 6.6.3. Three-bar truss design problem
rand[0, 1], l = rand[ − 1, 1]λ = 1 − The problem, depicted in Fig. 9, minimizes a statically loaded three-
maxiter
mSOS BF1 = 1, BF2 = 1 bar truss volume while satisfying three constraints on stress, deflection,
m-MBOA p = 0.8, c = 0.01, a = 0.1to0.3 and buckling. This problem needs to optimize two variables (x1 and x2 )
WOAmM Parameters of WOA, BF1 = Either1or2,BF2 = Either1or2 to adjust the sectional areas. The problem contains a difficult, con­
strained search space. The mathematical formulation for this problem is
given below:
there is a vertical force acting at the free end of the fifth block. This
problem needs to minimize the beam’s weight and only meet the →
x = {x1 , x2 , }
constraint requirement on an upper limit on the open end’s vertical
Objective function:
displacement. Five decision variables of the problem are, x1 , x2 , x3 , x4 , x5
{ √̅̅̅ }
which are the lengths of the different blocks, respectively. The pictorial Min.f (x) = L x2 + 2 2 x1 ,
representation of the problem is given in Fig. 8. The mathematical
formulation for this problem is given below: Subject to:
Objective function:

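A similar sketch for the cantilever beam problem of Section 6.6.2 is given below. The penalty treatment of the single constraint is our own illustrative choice, not a detail taken from the paper:

def cantilever_weight(x, penalty=1e6):
    # x = (x1, ..., x5): block lengths, 0.01 <= x_k <= 100
    weight = 0.06224 * sum(x)
    g = 61/x[0]**3 + 37/x[1]**3 + 19/x[2]**3 + 7/x[3]**3 + 1/x[4]**3 - 1.0   # feasible when g <= 0
    return weight + penalty * max(0.0, g)            # penalised objective for infeasible points

# the best design reported for WOAmM in Table 16, x = (6.03, 5.24, 4.60, 3.41, 2.21),
# gives a weight close to the tabulated 1.3374
print(cantilever_weight([6.03, 5.24, 4.60, 3.41, 2.21]))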

Table 6a
Optimization experimental results for functions F1 to F12 using different modified algorithms with NP = 30 & D = 100.
Function ID        WOAmM            QIWOA            LWOA             IWOA             mSOS             mMBOA
F1   Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F2   Average       0                0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0                0                0                0                0                0
F3   Average       0≈               0                0                9.23 × 10^04     0                0
     Stdv          0                0                0                2.62 × 10^04     0                0
     Best          0≈               0                0                4.34 × 10^04     0                0
F4   Average       0                0                0                2.18 × 10^01     0                0
     Stdv          0                0                0                3.68 × 10^0      0                0
     Best          0                0                0                1.35 × 10^01     0                0
F5   Average       9.72 × 10^01     0                3.51 × 10^−01    9.53 × 10^01     9.89 × 10^01     9.84 × 10^01
     Stdv          4.65 × 10^−01    0                1.17 × 10^0      1.61 × 10^0      3.21 × 10^−02    2.59 × 10^−01
     Best          9.63 × 10^01     0                5.30 × 10^01     9.30 × 10^01     9.88 × 10^01     9.76 × 10^01
F6   Average       0≈               0                5.16 × 10^0      8.33 × 10^−01    0                0
     Stdv          0                0                2.12 × 10^0      1.44 × 10^0      0                0
     Best          0≈               0                1.39 × 10^0      0                0                0
F7   Average       1.24 × 10^−03    3.19 × 10^−04    9.95 × 10^−05    7.88 × 10^−02    5.29 × 10^−02    9.17 × 10^−04
     Stdv          4.50 × 10^−03    2.55 × 10^−04    1.15 × 10^−04    5.96 × 10^−02    3.75 × 10^−02    3.66 × 10^−04
     Best          1.66 × 10^−06    9.02 × 10^−06    1.29 × 10^−05    9.83 × 10^−03    1.19 × 10^−03    3.08 × 10^−04
F8   Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F9   Average       0                2.13 × 10^0      3.15 × 10^−05    0                0                0
     Stdv          0                1.17 × 10^01     2.38 × 10^−05    0                0                1.10 × 10^−09
     Best          0                0                0                0                0                0
F10  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F11  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F12  Average       0                2.83 × 10^−01    1.43 × 10^01     0                0                0
     Stdv          0                6.1 × 10^−01     2.89             0                0                0
     Best          0≈               0                8                0                0                0

Table 6b
Optimization experimental results for functions F13 to F24 using different modified algorithms with NP = 30 & D = 100.
Function ID        WOAmM            QIWOA            LWOA             IWOA             mSOS             mMBOA
F13  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F14  Average       0                0                1.93 × 10^02     0                0                0
     Stdv          0                0                3.04 × 10^01     0                0                0
     Best          0                0                1.35 × 10^02     0                0                0
F15  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F16  Average       7.95 × 10^−08    2.41 × 10^−04    2.82 × 10^−06    0                0                2.23 × 10^−09
     Stdv          1.33 × 10^−07    9.19 × 10^−04    7.55 × 10^−06    0                0                2.97 × 10^−09
     Best          0                0                0                0                0                0
F17  Average       0≈               0                0                8.01 × 10^01     0                0
     Stdv          0                0                0                5.88 × 10^01     0                0
     Best          0≈               0                0                0                0                0
F18  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F19  Average       0≈               0                0                3.53 × 10^−03    0                0
     Stdv          0                0                0                8.00 × 10^−03    0                0
     Best          0≈               0                0                0                0                0
F20  Average       1.44 × 10^−02    5.44 × 10^−04    1.76 × 10^−02    1.13 × 10^0      9.43 × 10^−01    3.21 × 10^−01
     Stdv          5.33 × 10^−03    6.13 × 10^−04    3.96 × 10^−03    1.22 × 10^0      9.91 × 10^−02    6.25 × 10^−02
     Best          7.78 × 10^−03    1.43 × 10^−05    7.28 × 10^−03    5.99 × 10^−02    7.33 × 10^−01    2.12 × 10^−01
F21  Average       1.45 × 10^0      6.64 × 10^−04    1.83 × 10^0      4.31 × 10^0      9.99 × 10^0      9.81 × 10^0
     Stdv          5.10 × 10^−01    7.87 × 10^−04    5.89 × 10^−01    8.08 × 10^−01    3.88 × 10^−03    5.60 × 10^−01
     Best          5.57 × 10^−01    7.39 × 10^−06    7.72 × 10^−01    2.17 × 10^0      9.98 × 10^0      7.49 × 10^0
F22  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F23  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F24  Average       0≈               0                0                0                0                6.66 × 10^−03
     Stdv          0                0                0                0                0                2.53 × 10^−02
     Best          0≈               0                0                0                0                0

Table 6c
Optimization experimental results for functions F25 to F36 using different modified algorithms with NP = 30 & D = 100.
Function ID        WOAmM            QIWOA            LWOA             IWOA             mSOS             mMBOA
F25  Average       0                6.07 × 10^09     1.81 × 10^03     8.21 × 10^02     0                3.51 × 10^0
     Stdv          0                3.18 × 10^10     2.51 × 10^02     1.52 × 10^02     0                8.97 × 10^0
     Best          0                2.63 × 10^02     1.41 × 10^03     4.46 × 10^02     0                0
F26  Average       4.23 × 10^−04    1.47 × 10^−03    7.27 × 10^−04    3.75 × 10^−03    1.73 × 10^−03    3.55 × 10^−04
     Stdv          2.76 × 10^−04    2.79 × 10^−03    3.47 × 10^−04    7.56 × 10^−03    5.07 × 10^−03    1.82 × 10^−04
     Best          3.08 × 10^−04    3.08 × 10^−04    3.11 × 10^−04    3.07 × 10^−04    3.07 × 10^−04    3.07 × 10^−04
F27  Average       3.00 × 10^0≈     4.90 × 10^0      3.00 × 10^0      7.50 × 10^0      3.00 × 10^0      3.00 × 10^0
     Stdv          2.42 × 10^−06    7.19 × 10^0      1.34 × 10^−07    1.60 × 10^01     0                0
     Best          3.00 × 10^0≈     3.00 × 10^0      3.00 × 10^0      3.00 × 10^0      3.00 × 10^0      3.00 × 10^0
F28  Average       −2.81 × 10^0     −3.00 × 10^−01   −3.54 × 10^0     −3.00 × 10^−01   −3.00 × 10^−01   −3.86 × 10^0
     Stdv          5.14 × 10^−01    0                3.78 × 10^−02    0                0                0
     Best          −3.71 × 10^0     −3.00 × 10^−01   −3.70 × 10^0     −3.00 × 10^−01   −3.00 × 10^−01   −3.86 × 10^0
F29  Average       0≈               0                0                0                0                0
     Stdv          0                0                0                0                0                0
     Best          0≈               0                0                0                0                0
F30  Average       0≈               2.58 × 10^−04    2.18 × 10^−02    0                0                0
     Stdv          0                1.11 × 10^−03    6.66 × 10^−02    0                0                0
     Best          0≈               8.06 × 10^−09    0                0                0                0
F31  Average       0≈               1.25 × 10^−03    0                0                0                0
     Stdv          0                2.34 × 10^−03    3.45 × 10^−09    0                0                0
     Best          0≈               1.08 × 10^−07    0                0                0                0
F32  Average       2.11 × 10^−01    3.44 × 10^−02    7.44 × 10^−01    0                5.42 × 10^−02    9.47 × 10^−02
     Stdv          2.40 × 10^−01    4.25 × 10^−02    9.19 × 10^−01    0                1.71 × 10^−01    5.09 × 10^−01
     Best          1.76 × 10^−05    1.36 × 10^−06    6.18 × 10^−03    0                0                4.32 × 10^−06
F33  Average       8.84 × 10^01≈    8.87 × 10^01     8.84 × 10^01     8.84 × 10^01     8.84 × 10^01     0
     Stdv          4.10 × 10^−06    1.50 × 10^0      3.74 × 10^−05    0                1.13 × 10^−03    0
     Best          8.84 × 10^01≈    8.84 × 10^01     8.84 × 10^01     8.84 × 10^01     8.84 × 10^01     0
F34  Average       1≈               1                1                1                1                1
     Stdv          0                7.14 × 10^−03    0                0                0                0
     Best          0≈               1                0                0                0                0
F35  Average       −1.06 × 10^02    −1.02 × 10^02    −1.06 × 10^02    −1.07 × 10^02    −1.07 × 10^02    −1.07 × 10^02
     Stdv          3.55             8.81             3.55             0                0                0
     Best          −1.07 × 10^02    −1.07 × 10^02    −1.07 × 10^02    −1.07 × 10^02    −1.07 × 10^02    −1.07 × 10^02
F36  Average       −1≈              −9.91 × 10^−01   −9.87 × 10^−01   −9.83 × 10^−01   −1               −1
     Stdv          0                1.92 × 10^−02    2.59 × 10^−02    2.87 × 10^−02    0                0
     Best          −1≈              −1               −1               9.49 × 10^−01    −1               −1

h1(x) = [x2 / (√2 x1^2 + 2 x1 x2)] P − σ ≤ 0,
h2(x) = [(√2 x1 + x2) / (√2 x1^2 + 2 x1 x2)] P − σ ≤ 0,
h3(x) = [1 / (x1 + √2 x2)] P − σ ≤ 0,
where 0 ≤ x1, x2 ≤ 1, and P = 2, L = 100 & σ = 2.

Table 7
Comparison of experimental results, where the numbers represent the number of functions.
WOAmM          QIWOA   LWOA   IWOA   mSOS   mMBOA
Superior to     15      23     26     17     19
Similar to      14       9      5     14      9
Inferior to      7       4      5      5      8

Table 8
Friedman's rank test with modified algorithms.
Method   Rank sum   Average rank   Rank
WOAmM    90.5       2.514          1
QIWOA    122        3.389          2
LWOA     132.5      3.681          4
IWOA     160        4.444          6
mSOS     116.5      3.236          3
mMBOA    134.5      3.736          5
P-value (0.000116 < 0.01) indicates that H0 is rejected at the 1% level of significance, i.e., there is a significant difference between the performance of the different methods at the 1% level of significance.

6.6.4. Gas transmission compressor design problem
Beightler and Phillips designed the problem (Beightler and Phillips, 1976). It is a mechanical design problem with four variables. The task is to determine the values of the variables so that the system delivers 100 million cu. ft. of gas per day at the lowest cost for the gas pipeline transmission system. Here, variable x1 is the length between the compressor stations (in miles) and x2 is the compression ratio, found by dividing the compressor discharge pressure by the flow rate. The other two variables, x3 and x4, are the diameter of the pipe (in inches) and the volume flow rate (ft3/sec), respectively. The problem has one inequality constraint. The mathematical formulation of the problem is given below:
x = {x1, x2, x3, x4}
Objective function:
Min f(x) = 8.61 × 10^5 x1^(1/2) x2 x3^(−2/3) x4^(−1/2) + 3.69 × 10^4 x3 + 7.72 × 10^8 x1^(−1) x2^(0.219) − 765.43 × 10^6 x1^(−1),
Subject to:
h(x) = x4 x2^(−2) + x2^(−2) − 1 ≤ 0,
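For completeness, a small Python sketch of this objective and constraint is given below; it is our own illustrative code, not the implementation used in the experiments, and the variable bounds quoted later in this section apply:

def gas_compressor_cost(x):
    # x = (x1, x2, x3, x4): station spacing, compression ratio, pipe diameter, flow rate
    x1, x2, x3, x4 = x
    return (8.61e5 * x1**0.5 * x2 * x3**(-2.0/3.0) * x4**(-0.5)
            + 3.69e4 * x3
            + 7.72e8 * x2**0.219 / x1
            - 765.43e6 / x1)

def gas_compressor_constraint(x):
    # feasibility requires h(x) <= 0
    _, x2, _, x4 = x
    return x4 / x2**2 + 1.0 / x2**2 - 1.0

# a design of roughly x = (50, 1.18, 24.83, 0.3831) evaluates to about 2.98e6,
# close to the 2.96e6 cost reported later for WOAmM, SOS and DE in Table 18
x = [50.0, 1.18, 24.83, 0.3831]
print(gas_compressor_cost(x), gas_compressor_constraint(x))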


Table 9
Results for comparison of CEC 2019 functions F37 to F46 with different basic algorithms.
Function ID WOAmM WOA MFO BOA SCA JAYA
F37 Average 1 3.35 × 10^6 1 1 1 8.39 × 10^6
Stdv 0 3.94 × 10^6 0 4.13 × 10^−8 0 3.12 × 10^6
Best 1 4.54 × 102 1 1 1 3.87 × 106
F38 Average 4.31 7.71 × 1003 5 5.01 5 5.35 × 1003
Stdv 9.04 × 10− 2 2.73 × 1003 0 3.47 × 10− 03 0 9.02 × 1002
Best 4.23 3.56 × 1003 5 5 5 3.71 × 1003
F39 Average 2.88 4.11 7.43 4.77 6.12 9.24
Stdv 1.11 1.82 5.02 × 10− 02 9.38 × 10− 01 1.45 9.91 × 10− 01
Best 1.41 1.43 7.33 3.28 4.41 6.40
F40 Average 3.42 × 1001 5.04 × 1001 7.80 × 1001 9.88 × 1001 7.52 × 1001 3.71 × 1001
Stdv 8.42 2.09 × 1001 7.84 1.17 × 1001 1.52 × 1001 5.05
Best 1.89 × 1001 1.30 × 1001 7.11 × 1001 7.85 × 1001 4.69 × 1001 2.61 × 1001
F41 Average 1.60 2.05 5.48 × 1001 1.18 × 1002 1.49 × 1001 2.70
Stdv 2.48 × 10− 01 4.55 × 10− 01 1.09 × 1001 2.19 × 1001 9.61 2.38 × 10− 01
Best 1.25 1.45 3.79 × 1001 7.42 × 1001 4.33 2.22
F42 Average 5.55 8.13 1.05 × 1001 1.12 × 1001 8.62 7.67
Stdv 1.56 1.83 3.87 × 10− 01 1.22 1.54 1.06
Best 3.33 3.85 1.04 × 1001 6.66 6.54 5.81
F43 Average 1.14 × 1003 1.30 × 1003 1.69 × 1003 2.81 × 1003 2.6 × 1003 1.22 × 1003
Stdv 2.62 × 1002 3.5 × 1002 1.81 × 1002 2.07 × 1002 3.88 × 1002 1.82 × 1002
Best 4.9 × 1002 5.42 × 1002 1.59 × 1003 1.65 × 1003 1.44 × 1003 9.03 × 1002
F44 Average 4.19 4.53 4.49 4.96 5.16 4.45
Stdv 3.67 × 10− 01 3.29 × 10− 01 4.66 × 10− 04 2.08 × 10− 01 1.57 × 10− 01 1.58 × 10− 01
Best 3.23 3.93 4.49 4.42 4.74 4.08
F45 Average 1.34 1.36 2.32 4.52 1.93 1.6
Stdv 1.2 × 10− 01 1.78 × 10− 01 9.3 × 10− 01 4.62 × 10− 01 2.15 × 10− 01 1.17 × 10− 01
Best 1.06 1.11 1.52 3.23 1.48 1.36
F46 Average 2.11 × 1001 2.12 × 1001 2.13 × 1001 2.15 × 1001 2.20 × 1001 2.14 × 1001
Stdv 7.36 × 1002 1.05 × 10− 01 8.55 × 10− 02 8.73 × 10− 02 1.78 × 10− 01 8.64 × 10− 02
Best 2.1 × 1001 2.1 × 1001 2.13 × 1001 2.13 × 1001 2.16 × 1001 2.12 × 1001

where,
20 ≤ x1 ≤ 50,
1 ≤ x2 ≤ 10,
20 ≤ x3 ≤ 50,
0.1 ≤ x4 ≤ 60.

Table 10
Result of comparison of CEC 2019 functions with basic algorithms.
WOAmM          WOA   MFO   BOA   SCA   JAYA
Superior to     10     9    10     9    10
Similar to       0     0     0     0     0
Inferior to      0     1     0     1     0

Table 11
Friedman's rank test with basic algorithms using results of CEC19 functions.
Method   Rank sum   Average rank   Rank
WOAmM    11.5       1.15           1
WOA      32         3.2            2
MFO      40         4              4
BOA      48.5       4.85           6
SCA      43         4.3            5
JAYA     35         3.5            3
P-value (0.00017 < 0.01) indicates that H0 is rejected at the 1% level of significance, i.e., there is a significant difference between the performance of the different methods at the 1% level of significance.

6.6.5. Pressure vessel design problem
In this problem, the goal is to achieve the minimum cost of a cylindrical pressure vessel whose decision variables are the thickness of the shell (Tshl), the thickness of the head (Thd), the inner radius (Ri), and the length of the cylindrical section without the head (Lc). The vessel is related to welding, structure, and material, and its ends are covered. The variables of the problem are represented by x1, x2, x3 and x4, respectively. This problem is one of the most well-known benchmark design tests with mixed (continuous/discrete) variables. A cylindrical pressure vessel capped at both ends with hemispherical heads should work under a pressure of 3,000 psi (2.1 × 10^7 Pa) and a minimum volume of 750 ft3 (21.24 m3) according to the ASME boiler code requirement. The total cost of welding, material, and forming defines the objective function to be minimized. Both thickness variables (Tshl, Thd) must be integer multiples of 0.0625 in., which is the available thickness of rolled steel plates. Mirjalili and Lewis (Mirjalili and Lewis, 2016) solved this problem. A graphical illustration of the problem is given in Fig. 10. Mathematically, this problem can be represented as:
x = {x1, x2, x3, x4}
Objective function:
Min f(x) = 0.6224 x1 x3 x4 + 1.7781 x2 x3^2 + 3.1661 x1^2 x4 + 19.84 x1^2 x3,
Subject to:
h1(x) = −x1 + 0.0193 x3 ≤ 0,
h2(x) = −x3 + 0.00954 x3 ≤ 0,
h3(x) = −π x3^2 x4 − (4/3) π x3^3 + 1296000 ≤ 0,
h4(x) = x4 − 240 ≤ 0,
where,
0 ≤ x1 ≤ 99,
0 ≤ x2 ≤ 99,
10 ≤ x3 ≤ 200,
10 ≤ x4 ≤ 200.

6.6.6. Car side impact design problem
This problem was initially proposed by Gu et al. (2001). The car is
13
S. Chakraborty et al. Computers & Industrial Engineering 153 (2021) 107086

Fig. 3. Box plot comparison of results of WOAmM with the basic methods using CEC19 functions. (N.B. Images of box plot analysis with basic methods for all the CEC
functions are given in Appendix B.)

Table 12
Results for comparison of CEC 2019 functions F37 to F46 with different modified methods.
Function ID WOAmM HIWOA OWOA ACWOA AWOA OBSCA

F37 Average 1 1 3.02 × 106 1 1 1


Stdv 0 0 3.71 × 10^6 0 0 0
Best 1 1 7.17 × 102 1 1 1
F38 Average 4.31 5 7.24 × 103 5 5 5
Stdv 9.04 × 10− 2 3.29 × 10− 04 2.59 × 103 0 1.89 × 10− 6 0
Best 4.23 5 2.88 × 103 5 5 5
F39 Average 2.88 4.64 3.76 6.39 3.56 5.84
Stdv 1.11 1.31 1.74 1.13 1.62 1.42
Best 1.41 2.38 1.42 3.93 1.32 3.28
F40 Average 3.42 × 1001 7.43 × 1001 4.30 × 1001 7.99 × 1001 5.41 × 1001 6.06 × 1001
Stdv 8.42 1.14 × 1001 1.64 × 1001 1.63 × 1001 1.58 × 1001 1.75 × 1001
Best 1.89 × 1001 5.57 × 1001 2.20 × 1001 5.79 × 1001 3.04 × 1001 3.09 × 1001
F41 Average 1.60 4.18 × 1001 1.90 4.34 × 1001 2.33 3 × 1001
Stdv 2.48 × 10− 01 1.77 × 1001 4.88 × 10− 01 1.57 × 1001 4.94 × 10− 01 1.55 × 1001
Best 1.25 1.30 × 1001 1.28 1.54 × 1001 1.64 3.95
F42 Average 5.55 8.69 8.37 1.03 × 1001 7.66 8.74
Stdv 1.56 1.12 1.37 1.32 1.43 1.31
Best 3.33 6.69 5.93 7.91 4.69 5.22
F43 Average 1.14 × 1003 1.53 × 1003 1.24 × 1003 1.67 × 1003 1.24 × 1003 1.58 × 1003
Stdv 2.62 × 1002 2.81 × 1002 2.67 × 1002 2.76 × 1002 2.76 × 1002 3.04 × 1002
Best 4.9 × 1002 7.14 × 1002 5.36 × 1002 1.15 × 1003 6.78 × 1002 1.06 × 1003
F44 Average 4.19 4.61 4.48 4.77 4.55 4.60
Stdv 3.67 × 10− 01 2.52 × 10− 01 3.54 × 10− 01 2.65 × 10− 01 2.81 × 10− 01 2.54 × 10− 01
Best 3.23 4.05 3.54 3.88 3.81 4.08
F45 Average 1.34 1.58 1.43 2.03 1.33 1.68
Stdv 1.2 × 10− 01 9.32 × 10− 02 2.04 × 10− 01 8.19 × 10− 01 1.72 × 10− 01 5.48 × 10− 01
Best 1.06 1.49 1.16 1.33 1.13 1.22
F46 Average 2.11 × 1001 2.14 × 1001 2.12 × 1001 2.14 × 1001 2.12 × 1001 2.14 × 1001
Stdv 7.36 × 1002 1.52 × 10− 01 1.22 × 10− 01 9.31 × 10− 02 1.23 × 10− 01 6.9 × 10− 02
Best 2.1 × 1001 2.1 × 1001 2.1 × 1001 2.12 × 1001 2.1 × 1001 2.13 × 1001


exposed to a side impact according to the European Enhanced Vehicle Safety Committee (EEVC) procedures. The objective is to minimize the car's total weight using eleven mixed variables while maintaining the safety performance required by the standard. These variables represent the thickness and material of critical parts of the car. The 8th and the 9th variables are discrete material design variables, while the rest of the variables are continuous and represent thickness design variables.
The symbols a1, a2, a3, a4, a5, a6, a7, a8, a9, a10, a11 are used here to represent, respectively, the thickness of the B-pillar inner, the thickness of the B-pillar reinforcement, the thickness of the floor side inner, the thickness of the cross members, the thickness of the door beam, the thickness of the door beltline reinforcement, the thickness of the roof rail, the material of the B-pillar inner, the material of the floor side inner, the barrier height, and the barrier hitting position. The problem is subjected to ten inequality constraints. The car side impact design is considered a real case of a mechanical optimization problem with mixed discrete and continuous design variables. This problem can be mathematically described as:
a = {a1, a2, a3, a4, a5, a6, a7, a8, a9, a10, a11}
Objective function:
Min f(a) = 1.98 + 4.90a1 + 6.67a2 + 6.98a3 + 4.01a4 + 1.78a5 + 2.73a7,
Subject to:
h1(a) = 1.16 − 0.3717a2a4 − 0.00931a2a10 − 0.484a3a9 + 0.01343a6a10 ≤ 1,
h2(a) = 0.261 − 0.0159a1a2 − 0.188a1a8 − 0.019a2a7 + 0.0144a3a5 + 0.0008757a5a10 + 0.080405a6a9 + 0.00139a8a11 + 0.00001575a10a11 ≤ 0.32,
h3(a) = 0.214 + 0.00817a5 − 0.131a1a8 − 0.0704a1a9 + 0.03099a2a6 − 0.018a2a7 + 0.0208a3a8 + 0.121a3a9 − 0.00364a5a6 + 0.0007715a5a10 − 0.0005354a6a10 + 0.00121a8a11 ≤ 0.32,

h4(a) = 0.074 − 0.061a2 − 0.163a3a8 + 0.001232a3a10 − 0.166a7a9 + 0.227a2^2 ≤ 0.32,
h5(a) = 28.98 + 3.818a3 − 4.2a1a2 + 0.0207a5a10 + 6.63a6a9 − 7.7a7a8 + 0.32a9a10 ≤ 32,
h6(a) = 33.86 + 2.95a3 + 0.1792a10 − 5.05a1a2 − 11.0a2a8 − 0.0215a5a10 − 9.98a7a8 + 22.0a8a9 ≤ 32,
h7(a) = 46.36 − 9.9a2 − 12.9a1a8 + 0.1107a3a10 ≤ 32,
h8(a) = 4.72 − 0.5a4 − 0.19a2a3 − 0.0122a4a10 + 0.009325a6a10 + 0.000191a11^2 ≤ 4,

Table 13
Result of comparison of CEC 2019 functions with modified methods.
WOAmM          HIWOA   OWOA   ACWOA   AWOA   OBSCA
Superior to      9      10     10       8     10
Similar to       0       0      0       0      0
Inferior to      1       0      0       2      0

Table 14
Friedman's rank test with modified algorithms using results of CEC19 functions.
Method   Rank sum   Average rank   Rank
WOAmM    13         1.3            1
HIWOA    42.5       4.25           4
OWOA     32         3.2            3
ACWOA    53.5       5.35           6
AWOA     25.5       2.55           2
OBSCA    43.5       4.35           5
P-value (2.90E-06 < 0.01) indicates that H0 is rejected at the 1% level of significance, i.e., there is a significant difference in the performance of the different methods at the 1% level of significance.
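Friedman rank tests such as those reported in Tables 4, 8, 11 and 14 can be reproduced with standard statistical tooling. A minimal sketch follows; the data below are placeholders only, not the paper's raw results:

from scipy.stats import friedmanchisquare

# one list per algorithm, one entry per benchmark function (e.g., mean errors);
# the values here are made up purely for illustration
woamm = [0.0, 0.1, 0.3]
woa   = [0.2, 0.4, 0.5]
sos   = [0.1, 0.2, 0.4]

stat, p_value = friedmanchisquare(woamm, woa, sos)
# H0 (all algorithms perform equally) is rejected when p_value < 0.01
print(stat, p_value)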

Fig. 4. Box plot comparison of results of WOAmM with the modified methods using CEC19 functions. (N.B. Images of box plot analysis with modified methods for all
the CEC functions are given in Appendix B.)


Fig. 5. Convergence graphs of WOAmM compared with WOA, SOS, BOA, DE, PSO.

h9(a) = 10.58 − 0.674a1a2 − 1.95a2a8 + 0.02054a3a10 − 0.0198a4a10 + 0.028a6a10 ≤ 9.9,
h10(a) = 16.45 − 0.489a3a7 − 0.843a5a6 + 0.0432a9a10 − 0.0556a9a11 − 0.000786a11^2 ≤ 15.7,
where,
0.5 ≤ ai ≤ 1.5, i = 1, 2, 3, 4, 5, 6, 7,
a8, a9 ∈ (0.192, 0.345),
−30 ≤ a10, a11 ≤ 30.

6.7. Performance analysis of WOAmM using real-life optimization problems

WOAmM is applied to solve a total of six problems: gear train design (RLP-1), cantilever beam design (RLP-2), three-bar truss design (RLP-3), gas transmission compressor design (RLP-4), pressure vessel design (RLP-5), and car side impact design (RLP-6). Among them, RLP-1, RLP-2, and RLP-3 are taken from the literature, Sharma et al. (Sharma and Saha, 2019). RLP-4 is found in the literature (Kumar et al., 2020). RLP-5 is solved by Mirjalili & Lewis (Mirjalili and Lewis, 2016); and RLP-6 is taken from Yildiz et al. (Yildiz et al., 2019).

6.7.1. Gear train design problem
The result of RLP-1 is compared with 13 other well-known algorithms, including the works cited in (Sharma and Saha, 2019), and given in Table 15. The abbreviations of the different algorithms used for comparison are as follows: BA - Bat algorithm, CSA - Cuckoo search algorithm, DA - Dragonfly algorithm, FA - Firefly algorithm, FPA - Flower pollination algorithm, MFO - Moth-flame optimization algorithm, PBO - Polar bear optimization algorithm, WWO - Water wave optimization algorithm, SA - Simulated annealing, GA - Genetic algorithm, PSO - Particle swarm optimization, ACO - Ant colony optimization, mMBOA - Butterfly optimization algorithm modified with mutualism. Table 15 portrays the minimum cost of the gear ratio and the corresponding values of the variables needed to attain the minimum cost. The best result found by each algorithm is used for the comparison. The minimum gear ratio cost f(x) evaluated by the algorithms reveals that the proposed WOAmM provides the best result compared with the other algorithms, whereas mMBOA is the second best algorithm. The best result is marked in bold in the table.

6.7.2. Cantilever beam design problem
The result of RLP-2 is compared with the results of nine other well-known algorithms found in the literature (Sharma and Saha, 2019; Nama et al., 2020), Sharma et al. & Nama et al. The corresponding
16
S. Chakraborty et al. Computers & Industrial Engineering 153 (2021) 107086

Fig. 6. Convergence graphs of WOAmM compared with QIWOA, LWOA, IWOA, mSOS, m-MBOA.

values of all the five variables needed to obtain the minimum weight of the cantilever, as evaluated by every individual algorithm, are given in Table 16. The best value estimated by each algorithm is used for the comparison. The algorithms ALO, SOS, CS, MFO, GOA, and SSA evaluate the second lowest minimum weight of the beam. The cantilever weights calculated by the algorithms MMA, CGA-I, and CSA-II are very close to the second lowest minimum weight. Although the minimum cantilever weights calculated by all the algorithms on this problem are very competitive, WOAmM finds the lowest weight of the cantilever.

Fig. 7. Gear train design (RLP-1).

6.7.3. Three-bar truss design problem
The result found by WOAmM is compared with nine other algorithms from the literature of Yildirim et al. (2018) & Sharma et al. (2019). Table 17 contains the results found by the algorithms. The optimal values of the variables estimated by the algorithms MBA, PSO-DE, CS, DE-DS, BAT, CA, and AAA give the minimum volume of the bar truss. Although the TSA algorithm evaluates a result close to the optimal value, it does not satisfy the constraints; thus, its solution is infeasible. Like a few other algorithms used in this comparison, WOAmM gives an optimal solution to this problem.

6.7.4. Gas transmission compressor design problem
Results are evaluated using a population size of 30 and 500 iterations as the termination criterion. The best result is compared with six other well-performing algorithms of the metaheuristic family. The variable values with the minimum calculated cost are displayed in Table 18. SOS, DE, and WOAmM calculate an optimal solution to this problem. WOA, the component algorithm of WOAmM, estimates a cost close to the optimal solution, but WOAmM finds a better result in terms of the minimum value.

6.7.5. Pressure vessel design problem
WOAmM found the result on evaluating the objective function using a population size of 30 and 500 iterations as the termination criterion, compared


Table 15
Comparison of results on the gear train design problem.
Method y1 y2 y3 y4 f (x)
− 2 − 2 − 2 − 2
BA 5745 × 10 1948 × 10 1859 × 10 4369 × 10 1.53 × 10− 11
CSA 5563 × 10− 2 1672 × 10− 2 2127 × 10− 2 4431 × 10− 2 3.09 × 10− 13
DA 5244 × 10− 2 1700 × 10− 2 2299 × 10− 2 5167 × 10− 2 3.02 × 10− 11
FA 5005 × 10− 2 2439 × 10− 2 1402 × 10− 2 4635 × 10− 2 6.52 × 10− 13
FPA 5115 × 10− 2 2246 × 10− 2 1798 × 10− 2 5591 × 10− 2 4.83 × 10− 11
MFO 4422 × 10− 2 1877 × 10− 2 2114 × 10− 2 5703 × 10− 2 1.44 × 10− 14
PBO 5005 × 10− 2 2332 × 10− 2 1482 × 10− 2 4789 × 10− 2 1.37 × 10− 15
WWO 5527 × 10− 2 2431 × 10− 2 1503 × 10− 2 4583 × 10− 2 5.67 × 10− 12
SA 5134 × 10− 2 2133 × 10− 2 1498 × 10− 2 4743 × 10− 2 1.71 × 10− 04
GA 5257 × 10− 2 2310 × 10− 2 1693 × 10− 2 4824 × 10− 2 1.13 × 10− 04
PSO 5133 × 10− 2 2102 × 10− 2 1479 × 10− 2 4782 × 10− 2 3.08 × 10− 04
ACO 5149 × 10− 2 2136 × 10− 2 1583 × 10− 2 4734 × 10− 2 2.87 × 10− 05
mMBOA 3345 × 10− 2 1224 × 10− 2 1509 × 10− 2 4066 × 10− 2 3.36 × 10− 16
WOAmM 3536 × 10^−2 1267 × 10^−2 1267 × 10^−2 1267 × 10^−2 2.77 × 10^−17

Table 16
Comparison of results on cantilever beam design problem.
Method   x1            x2            x3            x4            x5            f(x)
WOAmM    603 × 10^−2    524 × 10^−2    460 × 10^−2    341 × 10^−2    221 × 10^−2    1.3374
ALO      602 × 10^−2    531 × 10^−2    449 × 10^−2    350 × 10^−2    216 × 10^−2    1.3399
SOS      602 × 10^−2    530 × 10^−2    449 × 10^−2    350 × 10^−2    215 × 10^−2    1.3399
CS       601 × 10^−2    530 × 10^−2    450 × 10^−2    351 × 10^−2    215 × 10^−2    1.3399
MMA      601 × 10^−2    530 × 10^−2    449 × 10^−2    349 × 10^−2    215 × 10^−2    1.3400
CGA-I    601 × 10^−2    530 × 10^−2    449 × 10^−2    350 × 10^−2    215 × 10^−2    1.3400
CSA-II   601 × 10^−2    530 × 10^−2    449 × 10^−2    349 × 10^−2    215 × 10^−2    1.3400
MFO      598 × 10^−2    532 × 10^−2    450 × 10^−2    351 × 10^−2    216 × 10^−2    1.3399
GOA      601 × 10^−2    531 × 10^−2    448 × 10^−2    350 × 10^−2    216 × 10^−2    1.3399
SSA      601 × 10^−2    531 × 10^−2    449 × 10^−2    350 × 10^−2    216 × 10^−2    1.3399

Fig. 8. Cantilever beam design (RLP-2).

with the best results cited by Mirjalili & Lewis (Mirjalili and Lewis, 2016), and the comparison is given in Table 19. In the table, the optimal values of the variables Tshl, Thd, Ri, and Lc are represented by x1, x2, x3, x4, respectively, and f(x) represents the optimal cost. The GSA algorithm performs the worst among the algorithms on this problem. The ACO algorithm is not able to find a solution satisfying the constraints, which is why its answer is marked as infeasible. Algorithms such as GA, ES, and DE estimate solutions close to each other. The solution found by WOAmM outperforms all the other solutions.
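To make results such as those in Table 19 easy to re-check, a short Python sketch of the objective and constraints of the pressure vessel problem (Section 6.6.5) is given below. It is our own illustrative code, not the implementation used in the experiments, and the constraints are transcribed as printed in the formulation above:

import math

def pressure_vessel_cost(x):
    x1, x2, x3, x4 = x        # shell thickness, head thickness, inner radius, length
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)

def pressure_vessel_constraints(x):
    x1, x2, x3, x4 = x
    return [                  # every entry must be <= 0 for a feasible design
        -x1 + 0.0193 * x3,
        -x3 + 0.00954 * x3,
        -math.pi * x3**2 * x4 - (4.0/3.0) * math.pi * x3**3 + 1296000.0,
        x4 - 240.0,
    ]

# the design (0.8125, 0.4375, 42.09, 176.64) from Table 19 evaluates to
# roughly the 6.06e3 cost reported there for ES and DE
x = [0.8125, 0.4375, 42.09, 176.64]
print(pressure_vessel_cost(x), pressure_vessel_constraints(x))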

6.7.6. Car side impact design problem


Fig. 9. Three-bar truss design (RLP-3).

Tables 20 and 21 show the results found by WOAmM and the comparison with the results cited by Yildiz et al. (Yildiz et al., 2019). Results are taken using a population size of 35 and a maximum of 850 iterations in one run. Here f(a) represents the optimal weight of the car, calculated from the objective function using the variable values while satisfying the constraints. PSO, MFO, ALO, and ER-WCA estimate similar car weights on this problem. The calculated optimal values of the algorithms GWO, WCA, and MBA are close to each other. WOA and SSA are the second worst, and ABC is the worst performer on this problem. WOAmM finds the minimum weight among all the methods used for comparison here.
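The paper does not spell out how constraint violations are handled when WOAmM is run on constrained problems such as the car side impact design; a common, simple choice in this literature is a static penalty added to the objective, sketched below with our own illustrative names:

def penalized(objective, constraints, x, penalty=1e6):
    # constraints(x) returns a list of g_i(x) values that must all be <= 0
    violation = sum(max(0.0, g) for g in constraints(x))
    return objective(x) + penalty * violation

# any objective/constraint pair defined earlier (cantilever, pressure vessel, ...)
# can be plugged in and the penalised value handed to an unconstrained optimizer, e.g.:
# penalized(pressure_vessel_cost, pressure_vessel_constraints, [0.8125, 0.4375, 42.09, 176.64])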

Table 17
Comparison of results on the three-bar truss design problem.
Method x1 x2 f (x)

WOAmM 78.94 × 10^−2 40.61 × 10^−2 2.6389 × 10^2


MBA 78.86 × 10− 2 40.86 × 10− 2 2.6389 × 102
TSA 78.8 × 10− 2 40.80 × 10− 2 2.6368 × 102 (infeasible)
PSO-DE 78.87 × 10− 2 40.82 × 10− 2 2.6389 × 102
CS 78.87 × 10− 2 40.90 × 10− 2 2.6389 × 102
Ray & Sain 79.5 × 10− 2 39.50 × 10− 2 2.643 × 102
DE-DS 78.88 × 10− 2 40.82 × 10− 2 2.6389 × 102
Fig. 10. Pressure vessel design (RLP-5). BAT 78.86 × 10− 2 40.84 × 10− 2 2.6389 × 102
CA 78.86 × 10− 2 40.84 × 10− 2 2.6389 × 102
AAA 78.87 × 10− 2 40.81 × 10− 2 2.6389 × 102


Table 18 7. Conclusion
Comparison of results on gas transmission compressor design problem.
Method x1 x2 x3 x4 f (x) In this study, a new hybrid optimization algorithm called WOAmM is
2 6
proposed. It uses the stronger exploitation ability of WOA and explo­
WOAmM 0.50 £ 10 118 £ 10 ¡2
2483 £ 10 ¡2
38.31 £ 10 ¡2
2.9649 £ 10
WOA 0.50 × 102 118 × 10− 2
2458 × 10− 2
38.83 × 10− 2 2.9658 × 106
ration ability of modified SOS algorithm for creating an efficient and
SSA 2619 × 10− 2
110 × 10− 2
2147 × 10− 2
21.19 × 10− 2 3.0341 × 106 powerful meta-heuristic algorithm. In WOAmM, the approach has been
BOA 3319 × 10− 2
110 × 10− 2
2648 × 10− 2
21.62 × 10− 2 3.007 × 106 set up so that each population will explore and exploit during the
SOS 0.50 × 102 118 × 10− 2
2458 × 10− 2
38.83 × 10− 2 2.9649 × 106 searching process, which makes the proposed algorithm well-balanced
2 2 2
PSO 3179 × 10− 110 × 10− 3157 × 10− 22.24 × 10− 2 3.0509 × 106
DE 0.50 × 102 118 × 10− 2
2459 × 10− 2
38.84 × 10− 2 2.9649 × 106
in exploration and exploitation. It modifies and improves the popula­
tion in each iteration and is different from other algorithms like GA and
PSO. This algorithm’’s benefit is the need for less memory during

Table 19
Comparison of results on the pressure vessel design problem.
Method x1 x2 x3 x4 f (x)
− 2 − 2 − 2 2
WOAmM 125.89 × 10 10.0 × 10 6522 × 10 0.1000 × 10 3.3685 × 103
WOA 81.25 × 10− 2 43.75 × 10− 2 4209 × 10− 2
1.76638 × 102 6.05974 × 103
Improved HS 112.5 × 10− 2 62.5 × 10− 2 5829 × 10− 2
0.43692 × 102 7.19773 × 103
GSA 112.5 × 10− 2 62.5 × 10− 2 5598 × 10− 2
0.84454 × 102 8.53883 × 103
PSO (He & Wang) 81.25 × 10− 2 43.75 × 10− 2 4209 × 10− 2
1.7674 × 102 6.06107 × 103
GA (Coello) 81.25 × 10− 2 43.75 × 10− 2 4032 × 10− 2
2.0000 × 102 6.28874 × 103
GA (Coello and Montes) 81.25 × 10− 2 43.75 × 10− 2 4209 × 10− 2
1.7665 × 102 6.05994 × 103
GA (Deb & Gene) 93.75 × 10− 2 50.00 × 10− 2 4832 × 10− 2
1.1267 × 102 6.41038 × 103
ES (Montes & Coello) 81.25 × 10− 2 43.75 × 10− 2 4209 × 10− 2
1.7664 × 102 6.05974 × 103
DE (Huang et al.) 81.25 × 10− 2 43.75 × 10− 2 4209 × 10− 2
1.7663 × 102 6.05973 × 103
ACO (Kaveh & Talataheri) 81.25 × 10− 2 43.75 × 10− 2 4210 × 10− 2
1.7657 × 102 6.05908 × 103 (infeasible)
Lagrangian multiplier (Kannan) 112.5 × 10− 2 62.5 × 10− 2 5829 × 10− 2
0.4369 × 102 7.19804 × 103
Branch-bound (Sandgren) 112.5 × 10− 2 62.5 × 10− 2 4770 × 10− 2
1.1770 × 102 8.12910 × 103

Table 20
Comparison of results on the car side impact design problem.
Variables   WOAmM             ABC               PSO               MFO               ALO               ER-WCA
a1          50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a2          96.01 × 10^−2      106.24 × 10^−2     111.65 × 10^−2     111.65 × 10^−2     111.59 × 10^−2     111.86 × 10^−2
a3          86.74 × 10^−2      51.48 × 10^−2      50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a4          50 × 10^−2         144.91 × 10^−2     130.18 × 10^−2     130.19 × 10^−2     130.28 × 10^−2     129.84 × 10^−2
a5          64.33 × 10^−2      50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a6          182.95 × 10^−2     150 × 10^−2        150 × 10^−2        150 × 10^−2        150 × 10^−2        150 × 10^−2
a7          50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a8          34.50 × 10^−2      34.50 × 10^−2      34.50 × 10^−2      34.50 × 10^−2      34.50 × 10^−2      34.50 × 10^−2
a9          34.50 × 10^−2      19.2 × 10^−2       34.50 × 10^−2      34.50 × 10^−2      19.2 × 10^−2       19.2 × 10^−2
a10         −0.3000 × 10^2     −0.2934 × 10^2     −0.1952 × 10^2     −0.1953 × 10^2     −0.1963 × 10^2     −0.1914 × 10^2
a11         −48.066 × 10^−1    7.4109 × 10^−1     −0.1929 × 10^−1    0.00006 × 10^−1    0.2364 × 10^−1     −0.1527 × 10^−1
f(a)        2.14034 × 10^1     2.3175 × 10^1      2.2842 × 10^1      2.2842 × 10^1      2.2842 × 10^1      2.2842 × 10^1

Table 21
Comparison of results on the car side impact design problem.
Variables   WOAmM             GWO               WCA               MBA               SSA               WOA
a1          50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a2          96.01 × 10^−2      111.14 × 10^−2     111.55 × 10^−2     111.72 × 10^−2     110.93 × 10^−2     110.80 × 10^−2
a3          86.74 × 10^−2      50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         53.44 × 10^−2
a4          50 × 10^−2         131.22 × 10^−2     130.34 × 10^−2     130.08 × 10^−2     131.48 × 10^−2     130.57 × 10^−2
a5          64.33 × 10^−2      50.12 × 10^−2      50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a6          182.95 × 10^−2     150 × 10^−2        150 × 10^−2        149.99 × 10^−2     149.99 × 10^−2     147.38 × 10^−2
a7          50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2         50 × 10^−2
a8          34.50 × 10^−2      34.50 × 10^−2      34.50 × 10^−2      34.50 × 10^−2      34.5050 × 10^−2    34.50 × 10^−2
a9          34.50 × 10^−2      19.2 × 10^−2       19.2 × 10^−2       34.50 × 10^−2      19.2 × 10^−2       19.2 × 10^−2
a10         −0.3000 × 10^2     −0.2060 × 10^2     −0.1969 × 10^2     −0.1940 × 10^2     −0.2082 × 10^2     −0.1969 × 10^2
a11         −48.066 × 10^−1    −2.5531 × 10^−1    −0.2385 × 10^−1    −3.7320 × 10^−1    4.4129 × 10^−1     34.8169 × 10^−1
f(a)        2.14034 × 10^1     2.2852 × 10^1      2.2843 × 10^1      2.2846 × 10^1      2.3042 × 10^1      2.3042 × 10^1


optimization. It does not store many promising solutions obtained so far, and it has fewer control parameters to be tuned. However, its computational complexity might be higher than that of the other algorithms.
The proposed WOAmM is applied to solve thirty-six standard classical benchmark problems consisting of unimodal and simple multimodal functions of variable and fixed dimensions. Also, for further analysis, we have used it to solve the recent IEEE CEC 2019 suite. For the performance analysis, the simulation results are compared with a wide range of state-of-the-art algorithms consisting of basic, modified, and hybrid algorithms. The Friedman rank test and boxplot analysis have been used to examine the proposed algorithm's statistical superiority over the compared algorithms. The convergence graphs demonstrate the faster convergence of the proposed algorithm compared with the algorithms considered in this study's assessment. Further, to judge the performance of the technique on real-world optimization problems, six real-world engineering optimization problems from both categories of optimization problems (unconstrained and constrained) are also solved. The results are compared with a wide range of well-known algorithms. It is observed that the recommended WOAmM is more effective than the compared algorithms. From all the above analysis, it can be said that the proposed method is efficient and robust in avoiding premature convergence and in converging quickly. Thus it can be recommended for solving optimization problems from academia and industry.
Some future research directions are mentioned below:
(i) The work can be further enhanced by considering adaptive settings of the different parameters of WOA and the benefit factors of SOS.
(ii) The present work can be extended to multi-objective optimization problems.
(iii) The proposed algorithm can be applied to various problems from the humanities, science, engineering, and industry.

CRediT authorship contribution statement

Sanjoy Chakraborty: Conceptualization, Methodology, Investigation, Writing - original draft. Apu Kumar Saha: Conceptualization, Methodology, Writing - review & editing, Supervision. Sushmita Sharma: Methodology, Validation. Seyedali Mirjalili: Writing - review & editing, Supervision. Ratul Chakraborty: Methodology.

Acknowledgment

The authors are extremely thankful to the editor and the reviewers for their valuable suggestions and comments, which helped improve the manuscript.

Appendix I

Table-(a): Variable & fixed dimension unimodal functions.

Function ID Name Equation Search Space Dimension (D) Optimal value



F1 Sphere F(x)= Dk=1 x2k [− 100,100] 30 0
∑ ∏
F2 Schwefel 2.22 F(x)= Dk=1 |xk | + Dk=1 |xk | [− 10,10] 30 0
F3 Schwefel 1.12 ∑D (∑k )2
[− 100,100] 30 0
F(x)= k=1 l=1 xl

F4 Schwefel 2.21 F(x)=max[|xk |, 1 ≤ k ≤ D ] [− 100,100] 30 0


k
( )
F5 Rosenbrock ∑ 1
F(x)= D− 2 2
+ (xk − 1)2 ] [− 30,30] 30 0
k=1 [100( xk+1 − xk
∑D
F6 Step F(x)= k=1 (|xk + 0.5| )2 [− 100,100] 30 0

F7 Quartic F(x)= Dk=1 x4k + random(0, 1) [− 1.28,1.28] 30 0

F8 Cigar F(x)=x21 + 106 Dk− 2 x6k [− 100,100] 30 0
F9 Powell ∑D/4
F(x) = k=1 [(x4k− 3 + 10x4k− 2 )2 + 5(x4k− 1 + x4k )2 + (x4k− 2
4
+ 2x4k− 1 ) + [− 4,5] 30 0
10(x4k− + 10x4k )4 ]
3
F10 Tablet 6 2

F(x)=10 x1 + Dk− 2 x6k [− 1,1] 30 0
F11 Elliptic ∑D ( 6 )(k− 1)(D− 1) 2
F(x)= k=2 10 ⋅xk [− 100,100] 30 0
F12 Brown ∑ ( ) x2 +1 ( ) x2 +1
F(x)= n− 1 x2 ( k+1 ) + x2 ( k ) [− 1, 4] 30 0
k=1 k k+1
F13 Chung Reynolds ∑D ( 2 2
)
[− 100,100] 30 0
F(x) = k= xk
∑D
F14 Powell sum F(x) = k+1 [− 1,1] 30 0
k=1 |xk |
Fixed dimension unimodal functions
( )
F15 Matyas F(x)=0.26( x21 + x22 − 0.48x1 x2 [− 10,10] 2 0
( )
F16 Leon 2
F(x)=100 x2 − x31 + (1 − x1 )2 [− 1.2,1.2] 2 0

Table− (b): Variable & fixed dimension multimodal functions.

Function Name Equation Search Space Dimension Optimal


ID (D) value
∑D
F17 Rastrigin F(x)= 2
− 10cos(2Πx) + 10]
k=1 [xk
[− 5.12,5.12] 30 0
√̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅ ( )
F18 Ackley 1 ∑D 2 1 ∑D [–32,32] 30 0
F(x) = − 20exp(− 0.2 x k − exp cos2Πx k ) + 20 + e
D k=1 D k=1
F19 Griewank 1 ∑D ∏D x k [− 600,600] 30 0
F(x)= x2 − cos(√̅̅̅ ) + 1
4000 K=1 k k=1
k
(continued on next page)


(continued )
Function Name Equation Search Space Dimension Optimal
ID (D) value
{ }
F20 Penalized 1.1 Π ∑D− 1 ( )2 [ ( )] ( )2 ∑ [− 50,50] 30 0
F(x)= 10sin(Πy1 ) + jk − 1 1 + 10sin2 Πyk+1 + yk − 1 + Dk=1 μ(xk ,
D k=1

10, 100, 4)

xk + 1 ⎨ p(xk − a)m > a
yk = 1 + μ(xk , p, a, m) 0 − a < xk < a
4 ⎩
p( − xk − a)m xk < − a
∑ [ ] [ ]
F21 Penalized 1.2 F(x)=0.1{sin2 (3πx1 ) + Dk=1 (xk − 1)2 1 + sin2 (3πxk + 1) + (xk − 1)2 1 + sin2 (2πxk ) + [− 50,50] 30 0
∑D
k=1 μ(xk , 5, 100, 4)}
F22 Csendes ∑ 1 [− 1,1] 30 0
F(x)= Dk=1 x6k (2 + sin )
xk
∑D ∑D 2
F23 Inverted Cosine F(x)=0.1D − (0.1 k=1 cos(5Πxk ) − k=1 xk )
[− 1,1] 30 0
Mixture
( √̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅ ) √∑
̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅
F24 Salomon ∑D 2 D [− 100,100] 30 0
F(x)=1 − cos 2Π k=1 xk + 0.1 k=1 xk
2

(∑ )2 (∑ )4
F25 Zakharov ∑D 2 D D [− 5,10] 30 0
F(x)= k=1 xk + k=1 0.5kxk + k=1 0.5kxk

Fixed dimension multimodal functions


( ) ⎤2
F26 Kawalik ⎡ [− 5,5] 4 0.0003
∑ x1 b2k + bk x2
F(x)= 11k=1
⎣a k − ⎦
b2k + bk x3 + x4
F27 Gold Stein & F(x) = [1+((x, +x2 + 1)2 (19 − 14x1 + 3x21 − 14x2 + 6x1 x2 + 3x22 )] × [30 + (2x1 − 3x2 )2 × (18 − [− 2,2] 2 3.0
Price 32x1 + 12x21 + 48x2 − 36x1 x2 + 27x22 )]
∑4 ∑3
F28 Hartman3 F(x) =− c exp(− 2
l=1 akl (xl − pkl ) )
[1,3] 3 − 3.86
k̇=1 k
F29 Bohachevsky1 F(x) =x21 + 2x22 − 0.3cos(3Πx1 ) − 0.4cos(4Πx2 ) + 0.7 [− 100,100] 2 0
F30 Bohachevsky2 F(x) =x21 + 2x22 − 0.3cos(3Πx1 ).0.4cos(4Πx2 ) + 0.3 [− 100,100] 2 0
F31 Bohachevsky3 F(x) =x21 + 2x22 − 0.3cos(3Πx1 + 4Πx2 ) + 0.3 [− 50,50] 2 0
( )
F32 Colville 2
F(x) = 100(x1 − x2 )2 + (1 − x1 )2 + 90(x4 − x23 ) + (1 − x3 )2 + 10.1 (x2 − 1)2 + (x4 − 1)2 + [− 10,10] 4 0

19.8(x2 − 1)(x4 − 1)
∏ ∑
F33 Shubert F (x)= 2k=1 ( 5l=1 cos((l + 1)xk + l) [− 10,10] 2 − 186.73
⃒ 2 ⃒
F34 Bartels Conn F(x)=⃒x1 + x22 + x1 x2 ⃒ + |sin(x1 ) | + |cos(x2 ) | [− 500,500] 02 1
F35 Bird F(x)=sin(x1 )e(1− cos(x2 ) )2
+ cos(x2 )e(1− sin(x1 ) )2 [− 2Π, 2Π ] 02 − 106.764
+ (x1 − x2 )2
( √̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅̅ )
F36 Drop wave 1 + cos 12 x21 + x22 [− 5.12, 02 0
F(x) = − ( ) 5.12]
0.5 x21 + x22 + 2

Table- (c): CEC 2019 benchmark functions.

Function ID Function Dimension Search space Optimum value

F37 Storn’s Chebyshev Polynomial Fitting Problem 9 [− 8192, 8192] 1


F38 Inverse Hilbert Matrix Problem 16 [− 16384, 16384] 1
F39 Lennard-Jones Minimum Energy Cluster 18 [− 4,4] 1
F40 Rastrigin’s Function 10 [− 100,100] 1
F41 Griewangk’s Function 10 [− 100,100] 1
F42 Weierstrass Function 10 [− 100,100] 1
F43 Modified Schwefel’s Function 10 [− 100,100] 1
F44 Expanded Schaffer’s F6 Function 10 [− 100,100] 1
F45 Happy Cat Function 10 [− 100,100] 1
F46 Ackley Function 10 [− 100,100] 1
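For readers who want to reproduce the test bed, a small Python sketch of three of the classical functions from Tables (a) and (b) is given below. It is our own code following the standard definitions of these benchmarks:

import numpy as np

def sphere(x):                      # F1, optimum 0 at x = 0
    return np.sum(x**2)

def rastrigin(x):                   # F17, optimum 0 at x = 0
    return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def ackley(x):                      # F18, optimum 0 at x = 0
    d = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d) + 20.0 + np.e)

x = np.zeros(30)
print(sphere(x), rastrigin(x), ackley(x))   # all three are approximately 0 at the optimum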


Appendix B

See Figs. 1–3

Fig. 1. Box plot comparison with basic methods using classical benchmark functions.






Fig. 2. Box plot comparison with basic methods using CEC 2019 functions.


Fig. 3. Box plot comparison with modified methods using CEC 2019 functions.


References Chakraborty, F., Nandi, D., & Roy, P. K. (2019). Oppositional symbiotic organisms search
optimization for multilevel thresholding of color image. Applied Soft Computing, 82,
Article 105577. https://doi.org/10.1016/j.asoc.2019.105577
Mohapatra, P., Das, K. N., & Roy, S. (2017). A modified competitive swarm optimizer for
Kumar, S., Tejani, G. G., & Mirjalili, S. (2018). Modified symbiotic organisms search for
large scale optimization problems. Applied Soft Computing, 59, 340–362.
structural optimization. Engineering with Computers. https://doi.org/10.1007/
Sun, Y., Yang, T., & Liu, Z. (2019). A whale optimization algorithm based on quadratic
s00366-018-0662-y
interpolation for high-dimensional global optimization problems. Applied Soft
Dinh-Cong, D., Nguyen, Thoi T., & Nguyen, D. T. (2020). A FE model updating technique
Computing, 85, Article 105744. https://doi.org/10.1016/j.asoc.2019.105744
based on SAP2000-OAPI and enhanced SOS algorithm for damage assessment of full-
Luo, J., & Shi, B. (2019). A hybrid whale optimization algorithm based on modified
scale structures. Applied Soft Computing, 106100. https://doi.org/10.1016/j.
differential evolution for global optimization problems. Applied Intelligence, 49(5),
asoc.2020.106100
1982–2000.
Zheng, X., Lai, W., Chen, H., & Fang, S. (2020). Data Prediction of Mobile Network Trac
Kaur, G., & Arora, S. (2018). Chaotic whale optimization algorithm. Journal of
in Public Scenes by SOS-vSVR Method. Sensors (Basel). https://doi.org/10.3390/
Computational Design and Engineering, 5(3), 275–284.
s20030603
Smith, R. V., Osman, I., Colin, R., & Simth, G. (1996). Modern Heuristic Search Methods.
Ling, Y., Zhou, Y., & Luo, Q. (2017). Lévy flight trajectory-based whale optimization
Angeline, P. J. (1994). Genetic programming: On the programming of computers by
algorithm for global optimization. IEEE Access, 5(99), 6168–6186. https://doi.org/
means of natural selection. Biosystems, 33(1), 69–73. https://doi.org/10.1016/0303-
10.1109/access.2017.2695498
2647(94)90062-0
Do, D. T. T., & Lee, J. (2017). A modified symbiotic organism search (mSOS) algorithm
Storn, R., & Price, K. (1997). Journal of Global Optimization, 11(4), 341–359. https://doi.
for optimization of pin-jointed structures. Applied Soft Computing, 61, 683–699.
org/10.1023/a:1008202821328
https://doi.org/10.1016/j.asoc.2017.08.002
Kaveh, A., & Bakhshpoori, T. (2016). A new metaheuristic for continuous structural
Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. Proceedings of ICNN’95—
optimization: Water evaporation optimization. Structural and Multidisciplinary
international conference on neural networks, 4, 1942–1948. https://doi.org/10.1109/
Optimization, 54(1), 23–43. https://doi.org/10.1007/s00158-015-1396-8
ICNN.1995.488968
Zhao, W., Wang, L., & Zhang, Z. (2018). Atom search optimization and its application to
Arora, S., & Singh, S. (2018). Butterfly optimization algorithm: A novel approach for
solve a hydrogeologic parameter estimation problem. Knowledge-Based Systems.
global optimization. Soft Computing, 23, 715. https://doi.org/10.1007/s00500-018-
https://doi.org/10.1016/j.knosys.2018.08.030
3102-4
Faramarzi, A., Heidarinejad, M., Stephens, B., & Mirjalili, S. (2019). Equilibrium
Mostafa Bozorgi, S., & Yazdani, S. (2019). IWOA: An Improved whale optimization
optimizer: A novel optimization algorithm. Knowledge-Based Systems. https://doi.
algorithm for optimization problems. Journal of Computational Design and
org/10.1016/j.knosys.2019.105190
Engineering. https://doi.org/10.1016/j.jcde.2019.02.002
Rao, R. V., Savsani, V. J., & Vakharia, D. P. (2011). Teaching–learning-based
Sharma, S., & Saha, A. K. (2019). m-MBOA: A novel butterfly optimization algorithm
optimization: A novel method for constrained mechanical design optimization
enhanced with mutualism scheme. Soft Computing. https://doi.org/10.1007/s00500-
problems. Computer-Aided Design, 43(3), 303–315. https://doi.org/10.1016/j.
019-04234-6
cad.2010.12.015
Xu, Z., Yu, Y., Yachi, H., Ji, J., Todo, Y., & Gao, S. (2018). A Novel Memetic Whale
Sadollah, A., Bahreininejad, A., Eskandar, H., & Hamdi, M. (2013). Mine blast algorithm:
Optimization Algorithm for Optimization. Advances in Swarm Intelligence, 384–396.
A new population-based algorithm for solving constrained engineering optimization
https://doi.org/10.1007/978-3-319-93815-8_37
problems. Applied Soft Computing, 13(5), 2592–2612. https://doi.org/10.1016/j.
Chen, H., Xu, Y., Wang, M., & Zhao, X. (2019). A Balanced Whale Optimization
asoc.2012.11.026
Algorithm for Constrained Engineering Design Problems. Applied Mathematical
Fathollahi-Fard, A. M., Hajiaghaei-Keshteli, M., & Tavakkoli-Moghaddam, R. (2018). The
Modelling. https://doi.org/10.1016/j.apm.2019.02.004
Social Engineering Optimizer (SEO). Engineering Applications of Artificial Intelligence,
Zhang, Q., & Liu, L. (2019). Whale Optimization Algorithm based on Lamarckian
72, 267–293. https://doi.org/10.1016/j.engappai.2018.04.009
learning for global optimization problems. IEEE. Access, 1(1). https://doi.org/
Cheng, M. Y., & Prayogo, D. (2014). Symbiotic organisms search: A new metaheuristic
10.1109/access.2019.2905009
optimization algorithm. Computers & Structures, 139, 98–112. https://doi.org/
Chen, H., Yang, C., Heidari, A. A., & Zhao, X. (2019). An Efficient Double Adaptive
10.1016/j.compstruc.2014.03.007
Random Spare Reinforced Whale Optimization Algorithm. Expert Systems with
Mirjalili, S., Gandomi, A. H., Mirjalili, S. Z., Saremi, S., Faris, H., & Mirjalili, S. M. (2017).
Applications, 113018. https://doi.org/10.1016/j.eswa.2019.113018
Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems.
Yildiz, A. R. (2019). A novel hybrid whale–Nelder–Mead algorithm for optimization of
Advances in Engineering Software, 114, 163–191. https://doi.org/10.1016/j.
design and manufacturing problems. The International Journal of Advanced
advengsoft.2017.07.002
Manufacturing Technology. https://doi.org/10.1007/s00170-019-04532-1
Jain, M., Singh, V., & Rani, A. (2018). A novel nature-inspired algorithm for
Natesan, G., & Chokkalingam, A. (2019). Multi-Objective Task Scheduling Using Hybrid
optimization: Squirrel search algorithm. Swarm and Evolutionary Computation, 44.
Whale Genetic Optimization Algorithm in Heterogeneous Computing Environment.
https://doi.org/10.1016/j.swevo.2018.02.013
Wireless Personal Communications, 110. https://doi.org/10.1007/s11277-019-06817-
Anandita, S., Rosmansyah, Y., Dabarsyah, B., & Choi, J. U. (2015). Implementation of
w
dendritic cell algorithm as an anomaly detection method for port scanning attack. In
Xiong, G., Zhang, J., Shi, D., Zhu, L., Yuan, X., & Yao, G. (2019). Modified Search
2015 International Conference on Information Technology Systems and Innovation
Strategies Assisted Crossover Whale Optimization Algorithm with Selection Operator
(ICITSI). https://doi.org/10.1109/icitsi.2015.7437688
for Parameter Extraction of Solar Photovoltaic Models. Remote Sens, 11, 2795.
Sadollah, A., Sayyaadi, H., & Yadav, A. (2018). A dynamic metaheuristic optimization
https://doi.org/10.3390/rs11232795
model inspired by biological nervous systems: Neural network algorithm. Applied
Memarzadeh, R., Ghayoumizadeh, H., Dehghani, M., Madvar, H. R., Seifi, A. & Mortazav,
Soft Computing, 71, 747–782. https://doi.org/10.1016/j.asoc.2018.07.039
S. M. (2020). A Novel Equation for Longitudinal Dispersion Coefficient Prediction
Mirjalili, S., & Lewis, A. (2016). The whale optimization algorithm. Advances in
Based on the Hybrid of SSMD and Whale Optimization Algorithm. Science of The
Engineering Software, 95, 51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008
Total Environment, 716, in press. doi: 10.1016/j.scitotenv.2020.137007.
Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE
Nama, S., Saha, A. K., & Ghosh, S. (2016). A Hybrid Symbiosis Organisms Search
Trans Evolutionary Computation, 1(1), 67–82. https://doi.org/10.1109/4235.585893
algorithm and its application to real world problems. Memetic Computing, 9(3), 0194-
Mohammed, H. M., Umar, S. U., & Rashid, T. A. (2019). A systematic and meta-analysis
1. https://doi.org/10.1007/s12293-016-
survey of whale optimization algorithm. Computational Intelligence and Neuroscience.
Nama, S., & Saha, A. K. (2018). An ensemble symbiosis organisms search algorithm and
https://doi.org/10.1155/2019/8718571
its application to real world problems. Decision Science Letters, 7(2), 103–118.
Reddy, G. N., & Kumar, S. P. (2017). Multi objective task scheduling algorithm for cloud
Zhao, P., & Liu, S. (2019). An enhanced symbiotic organisms search algorithm with
computing using whale optimization technique. International Conference on Next
perturbed global crossover operator for global optimization. Journal of Intelligent &
Generation Computing Technologies, 286–297. https://doi.org/10.1007/978-981-10-
Fuzzy Systems, 1–15. https://doi.org/10.3233/jifs-190546
8657-1_22
Çelik, E. (2020). A powerful variant of symbiotic organisms search algorithm for global
Horng, M. F., Kien, D., Shieh, C. S., & Nguyen, T. T. (2017). A Multi-objective optimal
optimization. Engineering Applications of Artificial Intelligence, 87, Article 103294.
vehicle fuel consumption based on whale optimization algorithm. Advances in
https://doi.org/10.1016/j.engappai.2019.103294
Intelligent Information Hiding and Multimedia Signal Processing, 371–380. https://doi.
Nama, S., Saha, A. K., & Sharma, S. (2020). A novel improved symbiotic organisms
org/10.1007/978-3-319-50212-0_44
search algorithm. Computational Intelligence. https://doi.org/10.1111/coin.12290
Kumar, A., Bhalla, V., Kumar, P., Bhardwaj, T., & Jangir, N. (2018). Whale optimization
Li, S.-F., & Cheng, C.-Y. (2017). Particle Swarm Optimization with Fitness Adjustment
algorithm for constrained economic load dispatch problems—a cost optimization.
Parameters. Computers & Industrial Engineering. https://doi.org/10.1016/j.
Ambient Communications and Computer Systems, 353–366. https://doi.org/10.1007/
cie.2017.06.006
978-981-10-7386-1_31
Wang, Z., Zhang, J., & Yang, S. (2019). An improved particle swarm optimization
El Aziz, M. A., Ewees, A. A., Hassanien, A. E., Mudhsh, M., & Xiong, S. (2018). Multi-
algorithm for dynamic job shop scheduling problems with random job arrivals.
objective whale optimization algorithm for multilevel thresholding segmentation.
Swarm and Evolutionary Computation, 100594. https://doi.org/10.1016/j.
Advances in Software Computing and Machine Learning in Image Processing, 23–39.
swevo.2019.100594
https://doi.org/10.1007/978-3-319-63754-9_2
Rahman, I. U., Zakarya, M., Raza, M., & Khan, R. (2020). An n-state switching PSO
Hussien, A., Hassanien, A. E., Houssein, E., & Bhattacharyya, S. (2019). S-shaped binary
algorithm for scalable optimization. Soft Computing. https://doi.org/10.1007/
whale optimization algorithm for feature selection. Recent Trends in Signal and Image
s00500-020-05069-2
Processing, 79–87. https://doi.org/10.1007/978-981-10-8863-6_9
Tanabe, R., & Fukunaga, A. (2013). Success-history based parameter adaptation for
Ezugwu, A. E., & Prayogo, D. (2018). Symbiotic Organisms Search Algorithm: Theory,
Differential Evolution. 2013 IEEE Congress on. Evolutionary Computation.
recent Advances and applications. Expert Systems with Applications, 119, 184–209.
Tanabe, R., & Fukunaga, A. S. (2014). Improving the search performance of SHADE using
https://doi.org/10.1016/j.eswa.2018.10.045
linear population size reduction. 2014 IEEE Congress on Evolutionary Computation
(CEC).


Salgotra, R., Singh, U., Saha, S., & Nagar, A. (2019). New Improved SALSHADE-cnEpSin Sun, W., & Zhang, C. (2018). Analysis and forecasting of the carbon price using
Algorithm with Adaptive Parameters. In 2019 IEEE Congress on Evolutionary multi—resolution singular value decomposition and extreme learning machine
Computation (CEC). https://doi.org/10.1109/cec.2019.8789983 optimized by adaptive whale optimization algorithm. Applied Energy, 231. https://
Xia, J., & Zhengb, L. (2020). A hybrid algorithm based on cuckoo search and differential doi.org/10.1016/j.apenergy.2018.09.118
evolution for numerical optimization. Soft Computing, 4, 1–8. Khashan, N., El-Hosseini, M., Haikal, A., & Badawy, M. (2018). Biped Robot Stability
Nama, S., Saha, A. K., & Sharma, S. (2020). A Hybrid TLBO Algorithm by Quadratic Based on an A-C parametric Whale Optimization Algorithm. Journal of Computational
Approximation for Function Optimization and Its Application. Recent Trends and Science., 31. https://doi.org/10.1016/j.jocs.2018.12.005
Advances in Artificial Intelligence and Internet of Things, 291–341. Alamri, H. S., Alsariera, Y. A., & Zamli, K. Z. (2018). Opposition-based Whale
Nama, S., Saha, A. K., & Ghosh, S. (2017). Improved backtracking search algorithm for optimization algorithm. Advanced Science Letters, 24, 7461–7464. https://doi.org/
pseudo dynamic active earth pressure on retaining wall supporting c-Ф backfill. 10.1166/asl.2018.12959
Applied Soft Computing, 52, 885–897. Tang C., Sun W., Wu W. and Xue M. (2019), A hybrid improved whale optimization
Price, K. V., Awad, N. H., Ali, M. Z., & Suganthan, P. N. (2018). Problem definitions and Algorithm. 2019 IEEE 15th International Conference on Control and Automation
evaluation criteria for the 100-digit challenge special session and competition on (ICCA), Edinburgh, United Kingdom, 362-367. doi:10.1109/ICCA.2019.8900003.
single objective numerical optimization. Technical Report. Nanyang Technological Sandgren, E. (1990). Nonlinear integer and discrete programming in mechanical design
University. optimization.
Rahnamayan, S., Tizhoosh, H. R., & Salama, M. M. (2008). Opposition-based differential Beightler, C. S., & Phillips, D. T. (1976). Applied Geometric Programming. New York:
evolution. IEEE Transactions on Evolutionary computation, 12(1), 64–79. Wiley.
Mirjalili, S. (2015). Moth-flame optimization algorithm: A novel nature-inspired Gu, L., Yang, R. J., Tho, C. H., Makowskit, M., Faruquet, O., & Li, Y. (2001). Optimisation
heuristic paradigm. Knowledge-Based Systems, 89, 228–249. https://doi.org/ and robustness for crashworthiness of side impact. International Journal of Vehicle
10.1016/j.knosys.2015.07.006 Design, 26(4), 348. doi:10.1504/ijvd.2001.005210.6789.
Mirjalili, Seyedali (2016). SCA: A sine cosine algorithm for solving optimization Kumar, A., Wu, G., Ali, M. Z., Mallipeddi, R., Suganthan, P. N., & Das, S. (2020). A test-
problems. Knowl. Based Syst., 96, 120–133. https://doi.org/10.1016/j. suite of non-convex constrained optimization problems from the real-world and
knosys.2015.12.022 some baseline results. Swarm and Evolutionary Computation, 100693. https://doi.org/
Venkata Rao, R. (2016). Jaya: A simple and new optimization algorithm for solving 10.1016/j.swevo.2020.100693
constrained and unconstrained optimization problems. International Journal of Yildiz, A., Abderazek, H., & Mirjalili, S. (2019). A Comparative Study of Recent Non-
Industrial Engineering Computations, 19–34. https://doi.org/10.5267/j. traditional Methods for Mechanical Design Optimization. Archives of Computational
ijiec.2015.8.004 Methods in Engineering. https://doi.org/10.1007/s11831-019-09343-x
Elaziz, M. A., Oliva, D., & Xiong, S. (2017). An Improved Opposition-Based Sine Cosine
Algorithm for Global Optimization. Expert Systems with Applications, 90. https://doi.
org/10.1016/j.eswa.2017.07.043
