
Artificial Intelligence Review (2024) 57:123
https://doi.org/10.1007/s10462-024-10729-y

Secretary bird optimization algorithm: a new metaheuristic for solving global optimization problems

Youfa Fu1 · Dan Liu1 · Jiadui Chen1 · Ling He1

Accepted: 10 February 2024 / Published online: 23 April 2024
© The Author(s) 2024

Abstract
This study introduces a novel population-based metaheuristic algorithm called secretary
bird optimization algorithm (SBOA), inspired by the survival behavior of secretary birds
in their natural environment. Survival for secretary birds involves continuous hunting for
prey and evading pursuit from predators. This information is crucial for proposing a new
metaheuristic algorithm that utilizes the survival abilities of secretary birds to address real-
world optimization problems. The algorithm’s exploration phase simulates secretary birds
hunting snakes, while the exploitation phase models their escape from predators. During
this phase, secretary birds observe the environment and choose the most suitable way to
reach a secure refuge. These two phases are iteratively repeated, subject to termination cri-
teria, to find the optimal solution to the optimization problem. To validate the performance
of SBOA, experiments were conducted to assess convergence speed, convergence behavior,
and other relevant aspects. Furthermore, we compared SBOA with 15 advanced algorithms
using the CEC-2017 and CEC-2022 benchmark suites. All test results consistently dem-
onstrated the outstanding performance of SBOA in terms of solution quality, convergence
speed, and stability. Lastly, SBOA was employed to tackle 12 constrained engineering
design problems and perform three-dimensional path planning for Unmanned Aerial Vehi-
cles. The results demonstrate that, compared with competing optimizers, the proposed SBOA
can find better solutions at a faster pace, showcasing its significant potential in addressing
real-world optimization problems.

Keywords Secretary bird optimization algorithm · Nature-inspired optimization · Heuristic algorithm · Exploration and exploitation · Engineering design problems

* Dan Liu
[email protected]
Youfa Fu
[email protected]; [email protected]
Jiadui Chen
[email protected]
Ling He
[email protected]
1 Key Laboratory of Advanced Manufacturing Technology, Ministry of Education, Guizhou University, Guiyang 550025, Guizhou, China


1 Introduction

With the continuous development of society and technology, optimization problems have
become increasingly complex and challenging in various domains. The nature of these
problems encompasses a wide range of areas, including manufacturing, resource alloca-
tion, path planning, financial portfolio optimization, and others. They involve multiple
decision variables, numerous constraints, and diverse objective functions. In the face of
real-world constraints such as resource scarcity, cost control, and efficiency requirements,
finding optimal solutions has become an urgent imperative (Zhou et al. 2011).
Traditional mathematical optimization methods, while performing well in certain
cases, often exhibit limitations when dealing with complex, high-dimensional, nonlinear,
and multimodal problems. Issues such as local optima, slow convergence rates, difficulty
in parameter tuning, high-dimensional problems, and computational costs have been per-
sistent challenges for researchers and practitioners in the field of optimization (Faramarzi
et al. 2020b). Therefore, people are seeking new methods and technologies to address these
challenges. In this context, metaheuristic algorithms have emerged. They belong to a cat-
egory of intelligent search algorithms inspired by natural phenomena and mechanisms,
designed to find solutions to optimization problems through stochastic methods. Unlike tra-
ditional mathematical optimization methods, metaheuristic algorithms are better suited for
complex, multimodal, high-dimensional, and nonlinear optimization problems. These algo-
rithms, by simulating processes like evolution, swarm intelligence, and simulated anneal-
ing found in nature, exhibit robustness and global search capabilities, and as a result, they
have gained widespread popularity in various practical applications.
Despite the significant progress that metaheuristic algorithms have made in various
fields, they also face challenges, including susceptibility to getting stuck in local optima,
slow convergence rates, insufficient robustness, and high computational costs
(Agrawal et al. 2021). Moreover, the No-Free-Lunch (NFL) theorem (Wolpert and Macready 1997)
explicitly states that the strong performance of an algorithm on a specific set of
optimization problems does not guarantee that it will perform equally well on other
optimization problems. Therefore, no single algorithm excels in all optimization
applications. The NFL theorem drives researchers to design innovative algorithms that
solve optimization problems more effectively by providing better solutions.
In certain scenarios, algorithmic convergence to local
optima may occur due to an imbalance between exploitation and exploration. To address
this issue, various approaches have been proposed. Nama et al. introduced a novel
integrated algorithm, denoted e-mPSOBSA (Nama et al. 2023), based on the Backtracking
Search Algorithm (BSA) and Particle Swarm Optimization (PSO); this integration aims
to mitigate the imbalance between exploitation and exploration. Nama et al. also proposed
an improved version of the Backtracking Search Algorithm, named gQR-BSA (Nama et al.
2022b), to address the same imbalance, and presented a bio-inspired multi-population
self-adaptive Backtracking Search Algorithm, referred to as ImBSA (Nama and Saha 2022).
Nama also presented an enhanced symbiotic organism search algorithm, mISOS (Nama 2021),
to overcome the exploitation-exploration imbalance.
Chakraborty introduced an enhanced version of the Symbiotic Organism Search
algorithm, namely nwSOS (Chakraborty et al. 2022a), designed for solving optimization
problems in higher dimensions. Saha combined the exploration capability of SQI with the
exploitation potential of SOS, proposing the hybrid symbiotic organism search (HSOS)
algorithm (Saha et al. 2021); this combination enhances the algorithm's robustness and
overall performance. Nama and Saha (2020) also proposed a new parameter-setting-based
modified differential evolution for function optimization.
Engineering optimization problems have consistently posed significant challenges
within the field of engineering. These problems involve achieving specific objectives,
typically cost minimization, efficiency maximization, or performance optimization,
under limited resources. The challenges in engineering optimization arise from their
diversity and complexity. Problems can be either discrete or continuous, involve multiple
decision variables, are subject to various constraints, and may incorporate elements of ran-
domness and uncertainty. Consequently, selecting appropriate optimization algorithms is
crucial for solving these problems. In recent decades, researchers have developed various
types of optimization algorithms, including traditional mathematical programming meth-
ods, heuristic algorithms, and evolutionary algorithms, among others. However, these algo-
rithms still exhibit certain limitations when addressing such optimization problems. For
example, when dealing with more complex problems, they tend to have slow convergence
rates, limited search precision, and are prone to becoming trapped in local optima.
To overcome these challenges, this paper introduces a new optimization algorithm—the
secretary bird optimization algorithm (SBOA). Our motivation is to address the shortcom-
ings of existing algorithms in tackling complex engineering optimization problems. Spe-
cifically, our focus lies in improving the convergence speed of optimization algorithms,
enhancing optimization precision, and effectively avoiding local optima. By introducing
the secretary bird optimization algorithm, we aim to provide a more efficient and reliable
solution for the engineering domain, fostering new breakthroughs in the research and prac-
tical application of optimization problems. The primary contributions of this study are as
follows:

• Introduced a novel population-based metaheuristic algorithm named the secretary bird
optimization algorithm (SBOA), which is designed to simulate the survival capabilities
of secretary birds in their natural environment.
• SBOA is inspired by the hunting and evading abilities of secretary birds in dealing
with predators, where survival adaptability is divided into two phases: exploration and
exploitation.
• The exploration phase of the algorithm simulates the behavior of secretary birds cap-
turing snakes, while the exploitation phase simulates their behavior of evading preda-
tors.
• Mathematical modeling is employed to describe and analyze each stage of SBOA.
• Evaluated the effectiveness and robustness of SBOA by solving 42 benchmark functions,
including unimodal, multimodal, hybrid, and composition functions defined in CEC-2017
and CEC-2022.
• The performance of SBOA in solving practical optimization problems is tested on
twelve engineering optimization design problems and three-dimensional path planning
for unmanned aerial vehicles (UAVs).
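For readers who want a concrete picture of the two-phase scheme summarized above, the following Python sketch shows a generic population loop of this shape. The update rules are illustrative placeholders chosen for this sketch, not the actual SBOA equations, which are derived in Sect. 3:

```python
import numpy as np

def sboa_skeleton(objective, dim, bounds, pop_size=30, max_iter=200, seed=0):
    """Generic two-phase population loop in the spirit of SBOA.

    NOTE: the update rules below are illustrative placeholders,
    NOT the actual SBOA equations (those are derived in Sect. 3).
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])
    best = pop[fitness.argmin()].copy()

    for t in range(max_iter):
        for i in range(pop_size):
            if t < max_iter // 2:
                # Exploration ("hunting"): move relative to a random peer.
                j = rng.integers(pop_size)
                cand = pop[i] + rng.uniform(-1, 1, dim) * (pop[j] - pop[i])
            else:
                # Exploitation ("escape to refuge"): contract toward the best.
                cand = pop[i] + rng.normal(0.0, 0.1, dim) * (best - pop[i])
            cand = np.clip(cand, lo, hi)
            f = objective(cand)
            if f < fitness[i]:  # greedy replacement
                pop[i], fitness[i] = cand, f
                if f < objective(best):
                    best = cand.copy()
    return best, objective(best)
```

A call such as `sboa_skeleton(lambda x: float(np.sum(x * x)), dim=5, bounds=(-10.0, 10.0))` drives the population toward the sphere function's minimum at the origin; the termination criterion here is simply a fixed iteration budget.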

The structure of this paper is arranged as follows: the literature review is presented in
Sect. 2. Section 3 introduces the proposed SBOA and models it. Section 4 presents a
simulation analysis of the convergence behavior, the exploration-exploitation balance
ratio, and the search-agent distribution of SBOA when dealing with optimization
problems. Section 5 uses twelve engineering optimization problems to verify the
efficiency of SBOA in solving practical engineering optimization problems and addresses
a path planning scenario for unmanned aerial vehicles (UAVs). Section 6 provides the
conclusion and outlines several directions for future research.

2 Literature review

Metaheuristic algorithms, as a branch of bio-inspired methods, have been widely employed
to solve problems in various domains. These algorithms can be classified into several
categories based on their underlying principles, including evolutionary algorithms (EA),
physics-inspired algorithms (PhA), human behavior-based algorithms, and swarm intelli-
gence (SI) algorithms (Abualigah et al. 2021). EA draws inspiration from natural evolu-
tionary processes, with genetic algorithms (GA) (Holland 1992) and differential evolution
(DE) (Storn and Price 1997) as two of its most representative members. These algorithms
are rooted in Darwin’s theory of evolution. On the other hand, PhA are often inspired by
physical phenomena. A notable example is the simulated annealing algorithm (SA) (Kirk-
patrick et al. 1983), which simulates the annealing process of solids. It uses this physi-
cal analogy to find optimal solutions. Human behavior-based algorithms are based on the
principles of human social learning. A typical example is the social evolution and learning
optimization algorithm (SELOA) (Kumar et al. 2018). SI algorithms are inspired by the
collective behavior of biological populations in nature. Particle swarm optimization (PSO)
(Kennedy and Eberhart 1995) is one such algorithm that mimics the foraging behavior of
birds. In PSO, particles represent birds searching for food (the objective function) in the
forest (search space). During each iteration, particle movement is influenced by both indi-
vidual and group-level information. Ultimately, particles converge toward the best solution
in their vicinity.
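The individual- and group-level influences described here correspond to the cognitive and social terms of the canonical PSO velocity update; below is a minimal sketch with standard textbook parameter values, not values taken from this paper:

```python
import numpy as np

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical PSO: each velocity blends inertia, a pull toward the
    particle's personal best (cognitive term), and a pull toward the
    swarm's global best (social term)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f                 # update personal bests
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()   # update global best
    return g, pbest_f.min()
```

On a convex test function such as the sphere, the swarm contracts around the global best within a few hundred iterations.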
The laws of physics play a crucial role in PhA algorithms, providing essential guidance
for optimizing problem solutions. For instance, the simulated annealing algorithm
(SA) (Kirkpatrick et al. 1983) draws inspiration from the physical phenomenon of metal
melting and its cooling and solidification process. The homonuclear molecular orbital
optimization (HMO) (Mahdavi-Meymand and Zounemat-Kermani 2022) is proposed based on
the Bohr atomic model and the electron arrangement behavior around the atomic nucleus
in the context of the homonuclear molecular structure. Additionally, the
special relativity search (SRS) (Goodarzimehr et al. 2022) is introduced by applying con-
cepts from electromagnetism and the theory of special relativity. The gravity search algo-
rithm (GSA) (Rashedi et al. 2009) is based on Newton’s universal law of gravitation. The
subtraction-average-based optimizer (SABO) (Trojovský and Dehghani 2023) is inspired
by mathematical concepts such as the mean, differences in search agent positions, and dif-
ferences in the values of the objective function. The big bang-big crunch algorithm
(BB-BC) (Erol and Eksin 2006) is inspired by the big bang and big crunch theories of
the evolution of the universe. The multi-verse optimizer (MVO) (Mirjalili et al. 2016),
water cycle algorithm (WCA) (Eskandar et al. 2012), nuclear reaction optimization (NRO)
(Wei et al. 2019), and ray optimization (RO) (Kaveh and Khayatazad 2012) likewise draw
on physical laws. The Sinh Cosh optimizer (SCHO) (Bai
et al. 2023) is inspired by the mathematical characteristics of the hyperbolic sine (Sinh) and
hyperbolic cosine (Cosh) functions. The great wall construction algorithm (GWCA) (Guan
et al. 2023) is inspired by the competitive and elimination mechanisms among workers
during the construction of the Great Wall. The exponential distribution optimizer (EDO)
(Abdel-Basset et al. 2023a) is based on the mathematical model of the exponential
probability distribution. The snow ablation optimizer (SAO) (Deng and Liu 2023) is
inspired by the sublimation and melting behavior of snow in the natural world. The optical microscope algorithm
(OMA) (Cheng and Sholeh 2023) is inspired by the magnification capability of optical
microscopes when observing target objects. It involves a preliminary observation with the
naked eye and is further inspired by the simulated magnification process using objective
and eyepiece lenses. The rime optimizer (RIME) (Su et al. 2023), inspired by the
mechanism of rime-ice growth in haze, likewise draws insight from a physical phenomenon.
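To make the physical analogy concrete, the following minimal sketch implements the Metropolis acceptance rule at the heart of SA for a one-dimensional problem, with a geometric cooling schedule chosen purely for illustration:

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0,
                        cooling=0.995, iters=2000, seed=0):
    """Minimal SA: worse moves are accepted with probability
    exp(-delta/T), so the search can escape local optima while the
    temperature T is high and settles into a basin as T cools."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)   # random neighbor
        fc = objective(cand)
        delta = fc - fx
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc                  # accept the move
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                          # geometric cooling
    return best, fbest
```

With the quadratic `objective = lambda x: (x - 3.0) ** 2` and a distant start such as `x0 = 10.0`, the walk drifts into the basin at x = 3 as the temperature decays.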
The MH algorithm, which is based on human behavior, aims to solve optimization
problems by simulating natural human behaviors. For instance, the teaching-learning-based
optimization algorithm (TLBO) (Rao et al. 2011) simulates the transfer of knowledge from
a teacher to learners in a classroom, while the social evolution and learning optimization
algorithm (SELOA) (Kumar et al. 2018) is developed by simulating the social learning
behaviors of humans in societal settings, particularly in family organizations. Nama et al.
proposed an improved approach to Teaching-Learning-Based Optimization, referred to as
Hybrid Teaching-Learning-Based Optimization (Nama et al. 2020). The human evolutionary
optimization algorithm (HEOA) (Lian and Hui 2024) draws inspiration from human
evolution. HEOA divides the global search process into two distinct stages: human explo-
ration and human development. Logistic chaotic mapping is used for initialization. During
the human exploration stage, an initial global search is conducted. This is followed by the
human development stage, where the population is categorized into leaders, explorers, fol-
lowers, and failures, each employing different search strategies. The partial reinforcement
optimizer (PRO) (Taheri et al. 2024) is proposed based on the partial reinforcement extinc-
tion (PRE) theory. According to this theory, learners intermittently reinforce or strengthen
specific behaviors during the learning and training process. In the context of optimization,
PRO incorporates intermittent reinforcement or strengthening of certain behaviors based
on the PRE theory. The soccer league competition algorithm (SLC) (Moosavian and Rood-
sari 2014) is inspired by the competitive interactions among soccer teams and players in
soccer leagues. The student psychology-based optimization algorithm (SPBO) (Das et al.
2020) is inspired by the psychological motivation of students who endeavor to exert extra
effort to enhance their exam performance, with the goal of becoming the top student in
the class. The dynamic hunting leadership optimization (DHL) (Ahmadi et al. 2023) is
proposed based on the notion that effective leadership during the hunting process can sig-
nificantly improve efficiency. This optimization algorithm is motivated by the idea that
proficient leadership in the hunting context can yield superior outcomes. Lastly, the gold
rush optimizer (GRO) (Zolf 2023) is inspired by the behavior of prospectors searching for
gold during the gold-rush era.
The simulation of the biological evolution concept and the principle of natural selection
has provided significant guidance for the development of evolutionary-based algorithms. In
this regard, genetic algorithms (GA) (Holland 1992), and differential evolution (DE) (Storn
and Price 1997) are widely recognized as the most popular evolutionary algorithms. When
designing GAs and DEs, the concepts of natural selection and reproductive processes are
employed, including stochastic operators such as selection, crossover, and mutation. Fur-
thermore, there are other evolutionary-inspired approaches, such as genetic programming
(GP) (Angeline 1994), cultural algorithms (CA) (Reynolds 1994), and evolution strategies
(ES) (Asselmeyer et al. 1997). These methods have garnered widespread attention and have
been applied in various applications, including but not limited to facial recognition (Liu
and IEEE 2014), feature selection (Chen et al. 2021), image segmentation (Chakraborty

et al. 2022b), network anomaly detection (Choura et al. 2021), data clustering (Manjarres
et al. 2013), scheduling problems (Attiya et al. 2022), and many other engineering applica-
tions and issues.
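The stochastic selection, crossover, and mutation operators mentioned above take a particularly compact form in the classic DE/rand/1/bin scheme; the sketch below uses standard control parameters (F = 0.8, CR = 0.9) chosen for illustration, not values taken from this paper:

```python
import numpy as np

def de_rand_1_bin(objective, dim, bounds, pop_size=30, iters=200,
                  f_weight=0.8, cr=0.9, seed=0):
    """DE/rand/1/bin: differential mutation, binomial crossover,
    and greedy one-to-one selection."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([objective(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # Three mutually distinct random indices, all different from i.
            r1, r2, r3 = rng.choice(
                [j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = pop[r1] + f_weight * (pop[r2] - pop[r3])  # mutation
            # Binomial crossover with one gene guaranteed from the mutant.
            mask = rng.random(dim) < cr
            mask[rng.integers(dim)] = True
            trial = np.clip(np.where(mask, mutant, pop[i]), lo, hi)
            ft = objective(trial)
            if ft <= fit[i]:                 # greedy selection
                pop[i], fit[i] = trial, ft
    b = fit.argmin()
    return pop[b], fit[b]
```

The one-to-one greedy selection is what distinguishes DE from GA-style tournament or roulette selection: a trial vector only ever replaces its own parent.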
In the field of biology, the simulation of collective behaviors of animals, aquatic life,
birds, and other organisms has long been a source of inspiration for the development of
swarm intelligence algorithms. For example, the ant colony optimization (ACO) algorithm
(Dorigo et al. 2006) was inspired by the intelligent behavior of ants in finding the shortest
path to their nest and food sources. The particle swarm optimization (PSO) algorithm
(Kennedy and Eberhart 1995) was inspired by the collective behaviors and movement pat-
terns of birds or fish in their natural environments while searching for food. The grey wolf
optimizer (GWO) (Mirjalili et al. 2014) is based on the social hierarchy and hunting behav-
iors observed in wolf packs. The nutcracker optimizer (NOA) (Abdel-Basset et al. 2023b)
is proposed based on the behavior of nutcrackers, specifically inspired by Clark’s nut-
cracker, which locates seeds and subsequently stores them in appropriate caches. The algo-
rithm also considers the use of various objects or markers as reference points and involves
searching for hidden caches marked from different angles. The sea-horse optimizer (SHO)
(Zhao et al. 2023) draws inspiration from the natural behaviors of seahorses, encompassing
their movements, predatory actions, and reproductive processes. The SHO algorithm incor-
porates principles observed in seahorse behavior for optimization purposes. The African
vulture optimization algorithm (AVOA) (Abdollahzadeh et al. 2021a) draws inspiration
from the foraging and navigation behaviors of African vultures. The fox optimization algo-
rithm (FOX) (Mohammed and Rashid 2023) models the hunting behavior of foxes in the
wild when pursuing prey. The Chameleon swarm algorithm (CSA) (Braik 2021) simulates
the dynamic behaviors of chameleons as they search for food in trees, deserts, and swamps.
The Golden Jackal optimization algorithm (GJO) (Chopra and Mohsin Ansari 2022) is
inspired by the cooperative hunting behavior of golden jackals. The Chimpanzee optimiza-
tion algorithm (ChOA) (Khishe and Mosavi 2020) is based on the hunting behavior of
chimpanzee groups. Finally, the whale optimization algorithm (WOA) (Mirjalili and Lewis
2016) takes inspiration from the behavior of whales when hunting for prey by encircling
them. Additionally, there are several other nature-inspired optimization algorithms in the
field, each drawing inspiration from the foraging and collective behaviors of various animal
species. These algorithms include the marine predator algorithm (MPA) (Faramarzi et al.
2020a), which is inspired by the hunting behavior of marine predators; the rat swarm opti-
mization algorithm (RSO) (Dhiman et al. 2021), inspired by the population behavior of rats
when chasing and attacking prey; The artificial rabbits optimization (ARO) algorithm
(Wang et al. 2022), introduced by Wang et al. in 2022, draws inspiration from the survival
strategies of rabbits in nature. This includes their foraging behavior, which involves detour-
ing and randomly hiding. The detour-foraging strategy specifically entails rabbits eating
grass near the nests of other rabbits, compelling them to consume vegetation around differ-
ent burrows. The starling murmuration optimizer (SMO) (Zamani et al. 2022) is a novel
bio-inspired algorithm drawing inspiration from the astonishing murmuration behavior of
starling flocks. On the other hand, the spider wasp optimizer (SWO) (Abdel-Basset
et al. 2023c) takes inspiration from the hunting, nesting, and mating behaviors of female
spider wasps in the natural world. The SWO algorithm incorporates principles observed in
the activities of female spider wasps for optimization purposes. White shark optimizer
(WSO) (Braik et al. 2022), which takes inspiration from the exceptional auditory and olfac-
tory abilities of great white sharks during navigation and hunting; the tunicate swarm algo-
rithm (TSA) (Kaur et al. 2020), inspired by jet propulsion and clustering behavior in tuni-
cates; the honey badger algorithm (HBA) (Hashim et al. 2022), which simulates the
digging and honey-searching dynamic behaviors of honey badgers; the dung beetle optimi-
zation algorithm (DBO) (Xue and Shen 2022), inspired by the rolling, dancing, foraging,
stealing, and reproductive behaviors of dung beetles; and the salp swarm algorithm (SSA)
(Mirjalili et al. 2017), inspired by the collective behavior of salps in the ocean. The fennec
fox optimization (FFO) (Zervoudakis and Tsafarakis 2022) is inspired by the survival
strategies of fennec foxes in desert environments. The quantum-based avian navigation algo-
rithm (QANA) (Zamani et al. 2021) is proposed, inspired by the extraordinary precision
navigation behavior of migratory birds along long-distance aerial routes; the Northern
Goshawk Optimization (NGO) (Dehghani et al. 2021), which draws inspiration from the
hunting process of northern goshawks; the pathfinder algorithm (PFA) (Yapici and Cetin-
kaya 2019), inspired by animal group collective actions when searching for optimal food
areas or prey; the snake optimizer (SO) (Hashim and Hussien 2022), which models unique
snake mating behaviors. The crested porcupine optimizer (CPO) (Abdel-Basset et al.
2024), introduced by Abdel-Basset et al. (2024), is inspired by the four defense behaviors
of the crested porcupine, encompassing visual, auditory, olfactory, and physical attack
mechanisms. The CPO algorithm incorporates these defensive strategies observed in
crested porcupines for optimization purposes. On the other hand, the Genghis Khan Shark
Optimizer (GKSO) (Hu et al. 2023) is proposed based on the hunting, movement, foraging
(from exploration to exploitation), and self-protection mechanisms observed in the Geng-
his Khan shark. This optimizer draws inspiration from the diverse behaviors exhibited by
Genghis Khan sharks, integrating them into optimization algorithms. The slime mould
algorithm (SMA) (Li et al. 2020) is inspired by slime mold foraging and diffusion
behaviors; Chakraborty combined second-order quadratic approximation with SMA,
presenting the hybrid algorithm HSMA (Chakraborty et al. 2023) to enhance
the algorithm's exploitation capabilities for achieving global optimality. Nama introduced a
novel quasi-reflective slime mould algorithm (QRSMA) (Nama 2022). The evolutionary
crow search algorithm (ECSA) (Zamani and Nadimi-Shahraki 2024) is introduced by Zam-
ani to optimize the hyperparameters of artificial neural networks for diagnosing chronic
diseases. ECSA successfully addresses issues such as reduced population diversity and
slow convergence speed commonly encountered in the Crow Search Algorithm. Sahoo pro-
posed an improved Moth Flame Optimization algorithm (Sahoo et al. 2023) based on a
dynamic inverse learning strategy. Combining binary opposition algorithm (BOA) with
moth flame optimization (MFO), a new hybrid algorithm h-MFOBOA (Nama et al. 2022a)
was introduced. Fatahi proposes an improved binary quantum avian navigation algorithm
(IBQANA) (Fatahi et al. 2023) in the context of fuzzy set system (FSS) for medical data
preprocessing. This aims to address suboptimal solutions generated by the binary version
of heuristic algorithms. Chakraborty introduced an enhancement to the Butterfly Optimiza-
tion Algorithm, named mLBOA (Chakraborty et al. 2022b), utilizing Lagrange interpola-
tion formula and embedding Levy flight search strategy; Sharma proposed an improved
Lagrange interpolation method for global optimization of butterfly optimization algorithm
(Sharma et al. 2022). Zamani proposed a conscious crow search algorithm (CCSA)
(Zamani et al. 2019) based on neighborhood awareness to address global optimization and
engineering design problems. CCSA successfully tackles issues related to the unbalanced
search strategy and premature convergence often encountered in the Crow Search
Algorithm. The gorilla troop optimizer (GTO) (Abdollahzadeh et al. 2021b) is based
on the social behaviors of gorilla groups, and the Walrus Optimization Algorithm (WOA)
(Trojovský and Dehghani 2022), inspired by walrus behaviors in feeding, migration, escap-
ing, and confronting predators. Regardless of the differences between these metaheuristic
algorithms, they share a common characteristic of dividing the search process into two
phases: exploration and exploitation. "Exploration" represents the algorithm's ability to
comprehensively search the entire search space to locate promising areas, while
"Exploitation" guides the algorithm to perform precise searches within local spaces,
gradually converging toward better solutions (Wu et al. 2019). Therefore, in the design of optimization
algorithms, a careful balance between “exploration” and “exploitation” must be achieved
to demonstrate superior performance in seeking suitable solutions (Liu et al. 2013). The
proposed Secretary Bird Optimization algorithm effectively balances exploration and
exploitation in two phases, making it highly promising for solving engineering optimiza-
tion problems. Table 1 below lists some popular optimization algorithms and analyzes the
advantages and disadvantages of different algorithms when applied to engineering optimi-
zation problems.
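Many of the surveyed algorithms realize this balance with a control coefficient that decays over the iterations, so early steps are large (exploration) and late steps are small (exploitation). A minimal illustration, modeled on the linearly decreasing parameter a used by GWO (Mirjalili et al. 2014):

```python
def control_coefficient(t, max_iter, a0=2.0):
    """Linearly decaying control coefficient: values near a0 early in
    the run favor long, exploratory steps; values near 0 late in the
    run favor short, exploitative steps (the scheme used, e.g., by
    GWO's parameter a)."""
    return a0 * (1.0 - t / max_iter)

# Sample the schedule at the start, midpoint, and end of a 100-iteration run.
schedule = [control_coefficient(t, 100) for t in (0, 50, 100)]
```

Algorithms then scale their position-update step by this coefficient, smoothly shifting the search from global to local as the run progresses.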
Based on the literature, the hunting strategies of Secretary Birds and their behavior
when evading predators are considered intelligent activities, offering a fresh perspective
on problem-solving and optimization. These insights can potentially form the basis for the
design of optimization algorithms. Therefore, in this research, inspired by the Secretary
Bird’s hunting strategy and predator evasion behavior, a promising optimization algorithm
is proposed for solving engineering optimization problems.

3 Secretary bird optimization algorithm

In this section, the proposed secretary bird optimization algorithm (SBOA) is described
and the behavior of the secretary bird is modeled mathematically.

3.1 Secretary bird optimization algorithm inspiration and behavior

The Secretary Bird (Scientific name: Sagittarius serpentarius) is a striking African raptor
known for its distinctive appearance and unique behaviors. It is widely distributed in grass-
lands, savannas, and open riverine areas south of the Sahara Desert in Africa. Secretary
birds typically inhabit tropical open grasslands, savannas with sparse trees, and open areas
with tall grass, and they can also be found in semi-desert regions or wooded areas with
open clearings. The plumage of secretary birds is characterized by grey-brown feathers on
their backs and wings, while their chests are pure white, and their bellies are deep black
(De Swardt 2011; Hofmeyr et al. 2014), as shown in Fig. 1.
The Secretary Bird is renowned for its unique hunting style, characterized by its long
and sturdy legs and talons that enable it to run and hunt on the ground (Portugal et al.
2016). It typically traverses grasslands by walking or trotting, mimicking the posture of a
“secretary at work” by bowing its head and attentively scanning the ground to locate prey
hidden in the grass. Secretary birds primarily feed on insects, reptiles, small mammals,
and other prey. Once they spot prey, they swiftly charge towards it and capture it with their
sharp talons. They then strike the prey against the ground, ultimately killing it and consum-
ing it (Portugal et al. 2016).
The remarkable aspect of Secretary Birds lies in their ability to combat snakes, mak-
ing them a formidable adversary to these reptiles. When hunting snakes, the Secretary
Bird displays exceptional intelligence. It takes full advantage of its height by looking
down on the slithering snakes on the ground, using its sharp eyes to closely monitor
their every move. Drawing from years of experience in combat with snakes, the Sec-
retary Bird can effortlessly predict the snake’s next move, maintaining control of the

Table 1  Advantages and disadvantages of different algorithms

Evolutionary algorithms (EA)
- Genetic algorithm (GA) (Holland 1992). Advantages: global optimization and parallel search capability; easily handles complex problems. Disadvantages: performance depends on parameter settings; slow convergence.
- Differential evolution (DE) (Storn and Price 1997). Advantages: strong robustness and global search ability; not prone to local optima. Disadvantages: requires large computational resources; slow convergence.
- Genetic programming (GP) (Angeline 1994). Advantages: automatic search; wide applicability; strong adaptability. Disadvantages: computationally expensive; parameters are difficult to tune; prone to local optima.
- Cultural algorithm (CA) (Reynolds 1994). Advantages: knowledge sharing speeds up the search and improves efficiency; strong adaptability. Disadvantages: high complexity; dependent on parameter settings; high computational cost.
- Evolution strategy (ES) (Asselmeyer et al. 1997). Advantages: strong adaptability and robustness. Disadvantages: computationally expensive; prone to local optima.

Physical phenomena-based algorithms (PhA)
- Simulated annealing (SA) (Kirkpatrick et al. 1983). Advantages: strong global search ability; strong adaptability; simple and easy to implement. Disadvantages: slow convergence; low solution accuracy; prone to local optima.
- Gravitational search algorithm (GSA) (Rashedi et al. 2009). Advantages: simple structure; strong adaptability and global search ability. Disadvantages: high computational cost; unstable convergence rate; prone to local optima.
- Big bang–big crunch algorithm (BB-BC) (Erol and Eksin 2006). Advantages: simple and easy to understand; few parameters; strong global search ability. Disadvantages: high computational cost; unstable convergence rate; prone to local optima.
- Multiverse optimizer (MVO) (Mirjalili et al. 2016). Advantages: relatively few parameters; strong adaptability and global search ability. Disadvantages: performs poorly on large-scale problems; lacks the ability to escape local minima.
- Water cycle algorithm (WCA) (Eskandar et al. 2012). Advantages: simple and easy to implement; few parameter settings. Disadvantages: slow convergence; high computational cost.
- Nuclear reaction optimization (NRO) (Wei et al. 2019). Advantages: strong global search ability; strong robustness. Disadvantages: slow convergence; low solution accuracy; prone to local optima.
- Snow ablation optimizer (SAO) (Deng and Liu 2023). Advantages: simple and easy to implement; strong global search ability. Disadvantages: slow convergence; prone to local optima.
- Rime optimization algorithm (RIME) (Su et al. 2023). Advantages: fast convergence; strong global search ability. Disadvantages: poor robustness; prone to local optima.

Human behavior-based algorithms
- Teaching–learning-based optimization (Rao et al. 2011). Advantages: simple structure, easy to understand; few parameters; strong convergence and global search ability. Disadvantages: prone to local optima; performs poorly on high-dimensional problems.
- Social learning optimization algorithm paradigm (Kumar et al. 2018). Advantages: strong global search ability; fast convergence. Disadvantages: high computational cost; inherent randomness; low convergence accuracy.
- Soccer league competition (Moosavian and Roodsari 2014). Advantages: strong global search ability; simple and easy to implement. Disadvantages: slow convergence; low solution accuracy; prone to local optima.
- Gold rush optimizer (GRO) (Zolf 2023). Advantages: simple structure; relatively fast convergence. Disadvantages: inherent randomness; low convergence accuracy; prone to local optima.

Swarm intelligence (SI)
- Particle swarm optimization (PSO) (Kennedy and Eberhart 1995). Advantages: simple and easy to implement; strong global search ability; strong parallelism; gradient-free optimization. Disadvantages: tends to converge to local optima; poor convergence on high-dimensional problems.
- Ant colony optimization (ACO) (Dorigo et al. 2006). Advantages: strong global search ability; strong interpretability. Disadvantages: premature convergence; time-consuming pheromone updates; prone to local optima.
- Grey wolf optimizer (GWO) (Mirjalili et al. 2014). Advantages: simple and easy to implement; few parameters; strong global search ability; fast convergence. Disadvantages: sensitive to parameters; prone to local optima; unsuitable for high-dimensional problems.
- African vultures optimization algorithm (AVOA) (Abdollahzadeh et al. 2021a). Advantages: simple structure, easy to implement; fast convergence; strong robustness. Disadvantages: prone to local optima on high-dimensional problems.
- Fox hunting algorithm (FOX) (Mohammed and Rashid 2023). Advantages: few parameters; easy implementation; strong global search ability. Disadvantages: sensitive to parameters; prone to local optima; unsuitable for high-dimensional problems.
- Chameleon swarm algorithm (CSA) (Braik 2021). Advantages: strong global search ability and adaptability. Disadvantages: difficult parameter selection; computationally intensive.
- Whale optimization algorithm (WOA) (Mirjalili and Lewis 2016). Advantages: strong global search ability; does not rely on gradient information. Disadvantages: slow convergence makes the optimal result difficult to find; high computation time on multidimensional problems.
- Chimp optimization algorithm (ChOA) (Khishe and Mosavi 2020). Advantages: simple structure, easy to implement; good robustness; strong global search ability. Disadvantages: prone to local optima; slow convergence; low optimization accuracy.
- Golden jackal optimization algorithm (GJO) (Chopra and Mohsin Ansari 2022). Advantages: few parameters; adaptable; strong global search ability. Disadvantages: heavy computation; prone to local optima.
- Marine predator algorithm (MPA) (Faramarzi et al. 2020a). Advantages: strong global search ability, diversity, and convergence speed. Disadvantages: difficult parameter selection; the heavy computational load makes the algorithm complex.
- Rat swarm optimization (RSO) (Dhiman et al. 2021). Advantages: global search capability; does not rely on gradient information. Disadvantages: high computational complexity; slow convergence; prone to local optima.
- Tunicate swarm algorithm (TSA) (Kaur et al. 2020). Advantages: simple operation; few parameters to adjust; strong ability to escape local optima. Disadvantages: slow convergence; the optimal result is difficult to find.
- White shark optimizer (WSO) (Braik et al. 2022). Advantages: strong global search ability; diversity. Disadvantages: slow convergence; prone to local optima.
- Honey badger algorithm (HBA) (Hashim et al. 2022). Advantages: fast convergence; good robustness. Disadvantages: prone to local optima.
- Salp swarm algorithm (SSA) (Mirjalili et al. 2017). Advantages: simple structure; few parameters to adjust; strong adaptability. Disadvantages: prone to local optima; slow convergence.
- Dung beetle optimization algorithm (DBO) (Xue and Shen 2022). Advantages: fast convergence. Disadvantages: less robust; many parameter settings.
- Fennec fox optimization (FFO) (Trojovska et al. 2022). Advantages: fast convergence; high optimization accuracy. Disadvantages: poor robustness; prone to local optima.
- Northern goshawk algorithm (NGO). Advantages: strong global search capability; easy to implement and understand. Disadvantages: may fall into local optima; slow convergence.
- Snake optimizer (SO) (Hashim and Hussien 2022). Advantages: global optimization and parallel search capability; easily handles complex problems. Disadvantages: slow convergence in later stages; prone to local optima.
- Slime mold algorithm (SMA) (Li et al. 2020). Advantages: simple structure, easy to implement and understand; global optimization and parallel search capability. Disadvantages: slow convergence; low solution accuracy; prone to local optima.
- Artificial gorilla troops optimizer (GTO) (Abdollahzadeh et al. 2021b). Advantages: strong global optimization ability; excellent performance on high-dimensional problems. Disadvantages: prone to local optima; slow convergence.
- Walrus optimization algorithm (WOA) (Trojovský and Dehghani 2022). Advantages: fast convergence; strong global search ability. Disadvantages: prone to local optima; poor robustness.

Fig. 1  Secretary bird

Fig. 2  Secretary bird hunting and escape strategies correspond to SBOA

situation. It gracefully hovers around the snake, leaping, and provoking it. It behaves
like an agile martial arts master, while the snake, trapped within its circle, struggles in
fear. The relentless teasing exhausts the snake, leaving it weakened. At this point, the
Secretary Bird avoids the snake’s frontal attacks by jumping behind it to deliver a lethal
blow. It uses its sharp talons to grasp the snake’s vital points, delivering a fatal strike.
Dealing with larger snakes can be challenging for the Secretary Bird, as large snakes
possess formidable constriction and crushing power. In such cases, the Secretary Bird
may lift the snake off the ground, either by carrying it in its beak or gripping it with its
talons. It then soars into the sky before releasing the snake, allowing it to fall to the hard
ground, resulting in a predictable outcome (Feduccia and Voorhies 1989; De Swardt
2011).
Furthermore, the intelligence of the Secretary Bird is evident in its strategies for evad-
ing predators, which encompass two distinct approaches. The first strategy involves the
bird’s ability to camouflage itself when it detects a nearby threat. If suitable camouflage


surroundings are available, the Secretary Bird will blend into its environment to evade
potential threats. The second strategy comes into play when the bird realizes that the sur-
rounding environment is not conducive to camouflage. In such cases, it will opt for flight
or rapid walking as a means to swiftly escape from the predator (Hofmeyr et al. 2014). The
correspondence between the Secretary Bird’s behavior and the secretary bird optimization
algorithm (SBOA) is illustrated in Fig. 2. In this context, the preparatory hunting behav-
ior of the secretary bird corresponds to the initialization stage of secretary bird optimiza-
tion algorithm (SBOA). The subsequent stages of the secretary bird’s hunting process align
with the three exploration stages of SBOA. The two strategies employed by the secretary
bird to evade predators correspond to C1 and C2, the two strategies in the exploitation stage
of SBOA.

3.2 Mathematical modeling of the secretary bird optimization algorithm

In this subsection, the natural behavior of secretary birds, hunting snakes and evading
natural enemies, is modeled mathematically to form the basis of SBOA.

3.2.1 Initial preparation phase

The secretary bird optimization algorithm (SBOA) method belongs to the category of pop-
ulation-based metaheuristic approaches, where each Secretary Bird is considered a mem-
ber of the algorithm’s population. The position of each Secretary Bird in the search space
determines the values of decision variables. Consequently, in the SBOA method, the posi-
tions of the Secretary Birds represent candidate solutions to the problem at hand. In the
initial implementation of the SBOA, Eq. (1) is employed for the random initialization of
the Secretary Birds’ positions in the search space.
X_{i,j} = lb_j + r × (ub_j − lb_j),  i = 1, 2, …, N,  j = 1, 2, …, Dim    (1)

where X_{i,j} denotes the position of the ith secretary bird in the jth dimension, lb_j and
ub_j are the lower and upper bounds, respectively, and r denotes a random number between
0 and 1.
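As a concrete illustration, Eq. (1) maps directly onto a few lines of code. The sketch below is a Python/NumPy re-expression rather than the authors' implementation (the paper's experiments use MATLAB); the function name `initialize_population` and the use of a seeded generator are our own choices:

```python
import numpy as np

def initialize_population(N, dim, lb, ub, seed=None):
    """Randomly place N secretary birds in [lb, ub]^dim, following Eq. (1)."""
    rng = np.random.default_rng(seed)
    lb = np.broadcast_to(np.asarray(lb, dtype=float), (dim,))
    ub = np.broadcast_to(np.asarray(ub, dtype=float), (dim,))
    r = rng.random((N, dim))       # one r ~ U(0, 1) per bird and per dimension
    return lb + r * (ub - lb)      # X[i, j] = lb_j + r * (ub_j - lb_j)

X = initialize_population(N=30, dim=5, lb=-100.0, ub=100.0, seed=0)
print(X.shape)  # (30, 5)
```

Scalar bounds are broadcast to every dimension, so per-dimension bound vectors can be passed in unchanged.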
The secretary bird optimization algorithm (SBOA) is a population-based approach in
which optimization starts from a population of candidate solutions, as shown in Eq. (2).
These candidate solutions X are randomly generated within the upper-bound (ub) and
lower-bound (lb) constraints of the given problem. In each iteration, the best solution
obtained so far is treated as an approximation of the optimal solution.

    ⎡ x_{1,1}  x_{1,2}  ⋯  x_{1,j}  ⋯  x_{1,Dim} ⎤
    ⎢ x_{2,1}  x_{2,2}  ⋯  x_{2,j}  ⋯  x_{2,Dim} ⎥
    ⎢    ⋮        ⋮     ⋱     ⋮     ⋱      ⋮     ⎥
X = ⎢ x_{i,1}  x_{i,2}  ⋯  x_{i,j}  ⋯  x_{i,Dim} ⎥    (2)
    ⎢    ⋮        ⋮     ⋱     ⋮     ⋱      ⋮     ⎥
    ⎣ x_{N,1}  x_{N,2}  ⋯  x_{N,j}  ⋯  x_{N,Dim} ⎦ N×Dim

Here, X denotes the secretary bird population, X_i the ith secretary bird, x_{i,j} the value
of the jth problem variable held by the ith secretary bird, N the number of population
members (secretary birds), and Dim the number of problem variables.


Each secretary bird represents a candidate solution to optimize the problem. Therefore,
the objective function can be evaluated based on the values proposed by each secretary bird
for the problem variables. The resulting objective function values are then compiled into a
vector using Eq. (3).

    ⎡ F_1 ⎤        ⎡ F(X_1) ⎤
    ⎢  ⋮  ⎥        ⎢    ⋮   ⎥
F = ⎢ F_i ⎥     =  ⎢ F(X_i) ⎥    (3)
    ⎢  ⋮  ⎥        ⎢    ⋮   ⎥
    ⎣ F_N ⎦ N×1    ⎣ F(X_N) ⎦ N×1

Here, F represents the vector of objective function values, and Fi represents the objec-
tive function value obtained by the ith secretary bird. By comparing the obtained objec-
tive function values, the quality of the corresponding candidate solutions is effectively
analyzed, determining the best candidate solution for a given problem. In minimization
problems, the secretary bird with the lowest objective function value is the best candidate
solution, whereas in maximization problems, the secretary bird with the highest objective
function value is the best candidate solution. Since the positions of the secretary birds and
the values of the objective function are updated in each iteration, it is necessary to deter-
mine the best candidate solution in each iteration as well.
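The fitness vector of Eq. (3) and the per-iteration bookkeeping of the best candidate can be sketched as follows (an illustrative Python re-expression, not the authors' code; `sphere` is a stand-in objective used only for demonstration):

```python
import numpy as np

def evaluate_population(X, objective):
    """Build the fitness vector F of Eq. (3): F[i] = objective(X[i])."""
    return np.array([objective(x) for x in X])

def best_candidate(X, F):
    """For minimization, the bird with the lowest objective value is best."""
    i = int(np.argmin(F))
    return X[i].copy(), float(F[i])

sphere = lambda x: float(np.sum(x ** 2))          # stand-in objective function
X = np.array([[3.0, 4.0], [0.0, 1.0], [2.0, 2.0]])
F = evaluate_population(X, sphere)                # [25.0, 1.0, 8.0]
x_best, f_best = best_candidate(X, F)             # ([0.0, 1.0], 1.0)
```

For a maximization problem, `np.argmin` would simply be replaced by `np.argmax`, matching the text's distinction between the two cases.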
Two distinct natural behaviors of the secretary bird have been utilized for updating the
SBOA members. These two types of behaviors encompass:

(a) The secretary bird’s hunting strategy;


(b) The secretary bird’s escape strategy.

Thus, in each iteration, each member of the secretary bird colony is updated in two dif-
ferent stages.

3.2.2 Hunting strategy of secretary bird (exploration phase)

The hunting behavior of secretary birds when feeding on snakes is typically divided into
three stages: searching for prey, consuming prey, and attacking prey. The hunting behavior
of the secretary bird is shown in Fig. 3.
Based on the biological statistics of the secretary bird’s hunting phases and the time
durations for each phase, we have divided the entire hunting process into three equal time
intervals, namely t < (1/3)T, (1/3)T < t < (2/3)T, and (2/3)T < t < T, corresponding to the three phases
of the secretary bird’s predation: searching for prey, consuming prey, and attacking prey.
Therefore, the modeling of each phase in SBOA is as follows:
Stage 1 (Searching for Prey): The hunting process of secretary birds typically begins
with the search for potential prey, especially snakes. Secretary birds have incredibly sharp
vision, allowing them to quickly spot snakes hidden in the tall grass of the savannah.
They use their long legs to slowly sweep the ground while paying attention to their sur-
roundings, searching for signs of snakes (Feduccia and Voorhies 1989). Their long legs
and necks enable them to maintain a relatively safe distance to avoid snake attacks. This
situation arises in the initial iterations of optimization, where exploration is crucial. There-
fore, this stage employs a differential evolution strategy. Differential evolution uses differ-
ences between individuals to generate new solutions, enhancing algorithm diversity and
global search capabilities. By introducing differential mutation operations, diversity can


Fig. 3  The hunting behavior of secretary birds

help avoid getting trapped in local optima. Individuals can explore different regions of the
solution space, increasing the chances of finding the global optimum. Therefore, updating
the secretary bird’s position in the Searching for Prey stage can be mathematically modeled
using Eqs. (4) and (5).
While t < (1/3)T:  x_{i,j}^{new,P1} = x_{i,j} + (x_{random_1} − x_{random_2}) × R_1    (4)

X_i = { X_i^{new,P1}, if F_i^{new,P1} < F_i;  X_i, otherwise }    (5)

where t represents the current iteration number, T the maximum iteration number,
X_i^{new,P1} the new state of the ith secretary bird in the first stage, and x_{random_1} and
x_{random_2} random candidate solutions in the first-stage iteration. R_1 represents a
randomly generated array of dimension 1 × Dim with entries drawn from the interval [0, 1],
where Dim is the dimensionality of the solution space. x_{i,j}^{new,P1} denotes the value of
its jth dimension, and F_i^{new,P1} its objective function fitness value.
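Under the same illustrative assumptions (Python/NumPy, a minimization objective, function names of our own choosing), one pass of the Stage-1 update of Eqs. (4) and (5) might look like:

```python
import numpy as np

def search_for_prey_step(X, F, objective, rng):
    """One Stage-1 pass (t < T/3): differential move, Eq. (4), then greedy selection, Eq. (5)."""
    N, dim = X.shape
    for i in range(N):
        r1, r2 = rng.choice(N, size=2, replace=False)   # indices of x_random_1, x_random_2
        R1 = rng.random(dim)                            # R1 ~ U(0, 1)^Dim
        x_new = X[i] + (X[r1] - X[r2]) * R1             # Eq. (4)
        f_new = objective(x_new)
        if f_new < F[i]:                                # accept only improvements, Eq. (5)
            X[i], F[i] = x_new, f_new
    return X, F

sphere = lambda x: float(np.sum(x ** 2))                # stand-in objective
rng = np.random.default_rng(1)
X = rng.uniform(-5.0, 5.0, size=(20, 10))
F = np.array([sphere(x) for x in X])
f0 = float(F.min())
X, F = search_for_prey_step(X, F, sphere, rng)
```

Because the selection of Eq. (5) is greedy, the best fitness in the population can never get worse after a pass.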
Stage 2 (Consuming Prey): After a secretary bird discovers a snake, it engages in a dis-
tinctive method of hunting. Unlike other raptors that immediately dive in for combat, the
secretary bird employs its agile footwork and maneuvers around the snake. The secretary


bird stands its ground, observing every move of the snake from a high vantage point. It
uses its keen judgment of the snake’s actions to hover, jump, and provoke the snake gradu-
ally, thereby wearing down its opponent’s stamina (Hofmeyr et al. 2014). In this stage, we
introduce Brownian motion (RB) to simulate the random movement of the secretary bird.
Brownian motion can be mathematically modeled using Eq. (6). This "peripheral combat"
strategy gives the secretary bird a significant physical advantage. Its long legs make it dif-
ficult for the snake to entangle its body, and the bird’s talons and leg surfaces are covered
with thick keratin scales, like a layer of thick armor, making it impervious to the fangs of
venomous snakes. During this stage, the secretary bird may frequently pause to lock onto
the snake’s location with its sharp eyesight. Here, we use the concept of “x_best” (the individual
historical best position) and Brownian motion. By using “x_best,” individuals can perform
local searches towards the best positions they have previously found, better exploring the
surrounding solution space. Additionally, this approach not only helps individuals avoid
premature converging to local optima but also accelerates the algorithm’s convergence to
the best positions in the solution space. This is because individuals can search based on
both global information and their own historical best positions, increasing the chances of
finding the global optimum. The introduction of the randomness of Brownian motion ena-
bles individuals to explore the solution space more effectively and provides opportunities
to avoid being trapped in local optima, leading to better results when addressing complex
problems. Therefore, updating the secretary bird’s position in the Consuming Prey stage
can be mathematically modeled using Eqs. (7) and (8).
RB = randn(1, Dim) (6)

While (1/3)T < t < (2/3)T:  x_{i,j}^{new,P1} = x_best + exp((t/T)^4) × (R_B − 0.5) × (x_best − x_{i,j})    (7)

X_i = { X_i^{new,P1}, if F_i^{new,P1} < F_i;  X_i, otherwise }    (8)

where randn(1, Dim) represents a randomly generated array of dimension 1 × Dim from a
standard normal distribution (mean 0, standard deviation 1), and xbest represents the current
best value.
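The Stage-2 update of Eqs. (6)-(8) can be sketched the same way (an illustrative Python re-expression, not the paper's implementation; the driver code at the bottom is ours):

```python
import numpy as np

def consume_prey_step(X, F, x_best, t, T, objective, rng):
    """One Stage-2 pass (T/3 < t < 2T/3): Brownian move around x_best, Eqs. (6)-(8)."""
    N, dim = X.shape
    for i in range(N):
        RB = rng.standard_normal(dim)                # Brownian motion, Eq. (6)
        x_new = x_best + np.exp((t / T) ** 4) * (RB - 0.5) * (x_best - X[i])  # Eq. (7)
        f_new = objective(x_new)
        if f_new < F[i]:                             # greedy selection, Eq. (8)
            X[i], F[i] = x_new, f_new
    return X, F

sphere = lambda x: float(np.sum(x ** 2))             # stand-in objective
rng = np.random.default_rng(2)
X = rng.uniform(-5.0, 5.0, size=(20, 10))
F = np.array([sphere(x) for x in X])
x_best = X[np.argmin(F)].copy()
f0 = float(F.min())
X, F = consume_prey_step(X, F, x_best, t=400, T=1000, objective=sphere, rng=rng)
```

Note that exp((t/T)^4) stays close to 1 during this stage, so the step size is governed mainly by each agent's distance from x_best.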
Stage 3 (Attacking Prey): When the snake is exhausted, the secretary bird perceives the
opportune moment and swiftly takes action, using its powerful leg muscles to launch an
attack. This stage typically involves the secretary bird’s leg-kicking technique, where it
rapidly raises its leg and delivers accurate kicks using its sharp talons, often targeting the
snake’s head. The purpose of these kicks is to quickly incapacitate or kill the snake, thereby
avoiding being bitten in return. The sharp talons strike at the snake’s vital points, leading to
its demise. Sometimes, when the snake is too large to be immediately killed, the secretary
bird may carry the snake into the sky and release it, causing it to fall to the hard ground and
meet its end. In the random search process, we introduce a Levy flight strategy to enhance the
optimizer’s global search capabilities, reduce the risk of SBOA getting stuck in local solu-
tions, and improve the algorithm’s convergence accuracy. Levy flight is a random move-
ment pattern characterized by short, continuous steps and occasional long jumps in a short
amount of time. It is used to simulate the flight ability of the secretary bird, enhancing its
exploration of the search space. Large steps help the algorithm explore the global range
of the search space, bringing individuals closer to the best position more quickly, while
small steps contribute to improving optimization accuracy. To make SBOA more dynamic,


adaptive, and flexible during the optimization process—achieving a better balance between
exploration and exploitation, avoiding premature convergence, accelerating convergence,
and enhancing algorithm performance—we introduce a nonlinear perturbation factor represented
as (1 − t/T)^(2 × t/T). Therefore, updating the secretary bird’s position in the Attacking
Prey stage can be mathematically modeled using Eqs. (9) and (10).
While t > (2/3)T:  x_{i,j}^{new,P1} = x_best + (1 − t/T)^(2 × t/T) × x_{i,j} × R_L    (9)

X_i = { X_i^{new,P1}, if F_i^{new,P1} < F_i;  X_i, otherwise }    (10)

To enhance the optimization accuracy of the algorithm, we use a weighted Levy flight,
denoted as R_L:

R_L = 0.5 × Levy(Dim)    (11)
Here, Levy(Dim) represents the Levy flight distribution function. It is calculated as
follows:
Levy(D) = s × (u × σ) / |v|^(1/η)    (12)

Here, s is a fixed constant of 0.01 and 𝜂 is a fixed constant of 1.5. u and v are random
numbers in the interval [0, 1]. The formula for σ is as follows:
σ = ( (Γ(1 + η) × sin(πη/2)) / (Γ((1 + η)/2) × η × 2^((η−1)/2)) )^(1/η)    (13)

Here, Γ denotes the gamma function and 𝜂 has a value of 1.5.
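Eqs. (11)-(13) translate directly into code. The sketch below follows the text literally in drawing u and v uniformly from [0, 1]; note that some Levy-flight implementations draw them from normal distributions instead, so this is one possible reading rather than the definitive one:

```python
import math
import numpy as np

def levy(dim, rng, s=0.01, eta=1.5):
    """Levy flight step of Eq. (12), with sigma from Eq. (13)."""
    sigma = (math.gamma(1 + eta) * math.sin(math.pi * eta / 2)
             / (math.gamma((1 + eta) / 2) * eta * 2 ** ((eta - 1) / 2))) ** (1 / eta)
    u = rng.random(dim)                       # u ~ U(0, 1), scaled by sigma per Eq. (12)
    v = rng.random(dim)                       # v ~ U(0, 1)
    return s * (u * sigma) / np.abs(v) ** (1 / eta)

rng = np.random.default_rng(3)
RL = 0.5 * levy(30, rng)                      # weighted Levy flight of Eq. (11)
```

The heavy tail of |v|^(-1/η) produces the occasional long jumps the text describes, while s = 0.01 keeps typical steps small.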

3.2.3 Escape strategy of secretary bird (exploitation stage)

The natural enemies of secretary birds are large predators such as eagles, hawks, foxes, and
jackals, which may attack them or steal their food. When encountering these threats, sec-
retary birds typically employ various evasion strategies to protect themselves or their food.
These strategies can be broadly categorized into two main types. The first strategy involves
flight or rapid running. Secretary birds are known for their exceptionally long legs, ena-
bling them to run at remarkable speeds. They can cover distances of 20 to 30 km in a
single day, earning them the nickname “marching eagles”. Additionally, secretary birds are
skilled flyers and can swiftly take flight to escape danger, seeking safer locations (Feduccia
and Voorhies 1989). The second strategy is camouflage. Secretary birds may use the colors
or structures in their environment to blend in, making it harder for predators to detect them.
Their evasion behaviors when confronted with threats are illustrated in Fig. 4. In the design
of the SBOA, it is assumed that one of the following two conditions occurs with equal
probability:


i. C1: Camouflage by environment;


ii. C2: Fly or run away.

In the first strategy, when secretary birds detect the proximity of a predator, they initially
search for a suitable camouflage environment. If no suitable and safe camouflage environment
is found nearby, they will opt for flight or rapid running to escape. In this context, we
introduce a dynamic perturbation factor, denoted as (1 − t/T)^2. This dynamic perturbation
factor helps the algorithm strike a balance between exploration (searching for new solu-
tions) and exploitation (using known solutions). By adjusting these factors, it is possible
to increase the level of exploration or enhance exploitation at different stages. In summary,
both evasion strategies employed by secretary birds can be mathematically modeled using
Eq. (14), and this updated condition can be expressed using Eq. (15).
x_{i,j}^{new,P2} = { C_1: x_best + (2 × R_B − 1) × (1 − t/T)^2 × x_{i,j},  if rand < r
                     C_2: x_{i,j} + R_2 × (x_random − K × x_{i,j}),         otherwise }    (14)

X_i = { X_i^{new,P2}, if F_i^{new,P2} < F_i;  X_i, otherwise }    (15)

Here, r = 0.5, R_2 represents a randomly generated array of dimension 1 × Dim drawn from
the normal distribution, x_random represents a random candidate solution of the current
iteration, and K represents a randomly selected integer, 1 or 2, which can be calculated by
Eq. (16).

Fig. 4  The escape behavior of the secretary bird


K = round(1 + rand(1, 1)) (16)


Here, rand(1, 1) denotes a randomly generated number in the interval (0, 1).
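The two evasion strategies of Eqs. (14)-(16) can be sketched as a single update pass (illustrative Python with our own function and variable names; drawing R2 from a normal distribution follows the text above):

```python
import numpy as np

def escape_step(X, F, x_best, t, T, objective, rng, r=0.5):
    """One exploitation pass: camouflage (C1) or flee (C2), Eqs. (14)-(16)."""
    N, dim = X.shape
    for i in range(N):
        if rng.random() < r:                                  # C1: blend in near x_best
            RB = rng.standard_normal(dim)                     # Brownian perturbation
            x_new = x_best + (2 * RB - 1) * (1 - t / T) ** 2 * X[i]
        else:                                                 # C2: fly or run away
            R2 = rng.standard_normal(dim)                     # R2 ~ N(0, 1)^Dim
            K = int(round(1 + rng.random()))                  # K in {1, 2}, Eq. (16)
            x_rand = X[rng.integers(N)]                       # random member this iteration
            x_new = X[i] + R2 * (x_rand - K * X[i])
        f_new = objective(x_new)
        if f_new < F[i]:                                      # greedy selection, Eq. (15)
            X[i], F[i] = x_new, f_new
    return X, F

sphere = lambda x: float(np.sum(x ** 2))                      # stand-in objective
rng = np.random.default_rng(4)
X = rng.uniform(-5.0, 5.0, size=(20, 10))
F = np.array([sphere(x) for x in X])
x_best = X[np.argmin(F)].copy()
f0 = float(F.min())
X, F = escape_step(X, F, x_best, t=800, T=1000, objective=sphere, rng=rng)
```

As in the exploration phases, the greedy acceptance of Eq. (15) guarantees the best fitness is monotonically non-increasing.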
In summary, the flowchart of SBOA is shown in Fig. 5, and the pseudocode is shown in
Algorithm 1.

Algorithm 1  Pseudo code of SBOA

3.3 Algorithm complexity analysis

Different algorithms take varying amounts of time to optimize the same problem, and
assessing the computational complexity of an algorithm is an essential way to evaluate its
execution time. In this paper, we utilize Big O notation (Tallini et al. 2016) to analyze the time
complexity of SBOA. Let N represent the population size of secretary birds, Dim the
dimensionality, and T the maximum number of iterations. Following the rules of operation
for the Big O time-complexity notation, the time complexity for randomly initializing the
population is O(N). During the solution update process, the computational complexity is
O(T × N) + O(T × N × Dim), which encompasses both finding the best positions and updat-
ing the positions of all solutions. Therefore, the total computational complexity of the pro-
posed SBOA can be expressed as O(N × (T × Dim + 1)).


4 Analysis of experimental results

In this section, to assess the effectiveness of the proposed algorithm SBOA in optimization
and providing optimal solutions, we conducted a series of experiments. First, we designed
experiments to evaluate the convergence as well as exploration vs. exploitation capabili-
ties of the algorithm. Secondly, we compared this algorithm with 14 other algorithms in
the context of CEC-2017 and CEC-2022 to validate its performance. Finally, we subjected
it to a rank-sum test to determine whether there were significant performance differences
between the SBOA algorithm and the other algorithms. The algorithms are executed using
a consistent system configuration, implemented on a desktop computer featuring a 13th-generation
Intel(R) Core(TM) i5-13400 processor (16 CPUs, ~2.5 GHz) and 16 GB of RAM. These
experiments were conducted utilizing the MATLAB 2022b platform.

4.1 Qualitative analysis

In this section, the CEC-2017 test set is used to validate the SBOA in terms of exploration
and exploitation balance and convergence behavior in 30 dimensions. The CEC-2017 test
set functions are shown in Table 3.

4.1.1 Exploration and exploitation

Among metaheuristic algorithms, exploration and exploitation are considered two crucial
factors. Exploration involves searching for new solutions in the solution space, aiming to
discover better solutions in unknown regions. Exploitation, on the other hand, focuses on
known solution spaces and conducts searches within the local neighborhoods of solutions
to find potentially superior solutions. A well-balanced combination of exploration and
exploitation not only helps the algorithm converge quickly to optimal solutions, enhancing
search efficiency, but also allows for flexibility in addressing diverse optimization problems
and complexities, showcasing exceptional adaptability and robustness (Morales-Castaneda
et al. 2020). A high-quality algorithm should strike a good balance between these two fac-
tors. Therefore, we use Eqs. (17) and (18) to calculate the percentages of exploration and
exploitation, respectively, allowing us to assess the algorithm’s balance between these two
factors. Div(t) is a measure of dimensional diversity calculated by Eq. (19). Here, x_i^d
represents the position of the ith search agent in the dth dimension, and Div_max denotes
the maximum diversity observed throughout the entire iteration process.
Exploration(%) = (Div(t) / Div_max) × 100    (17)

Exploitation(%) = (|Div(t) − Div_max| / Div_max) × 100    (18)

Div(t) = (1/D) Σ_{d=1}^{D} (1/N) Σ_{i=1}^{N} |median(x^d(t)) − x_i^d(t)|    (19)
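The diversity measure and the two percentages of Eqs. (17)-(19) are straightforward to compute from a run's position history; a small illustrative sketch (function names are our own):

```python
import numpy as np

def diversity(X):
    """Div(t) of Eq. (19): mean absolute deviation from the per-dimension median."""
    med = np.median(X, axis=0)               # median(x^d(t)) for each dimension d
    return float(np.mean(np.abs(med - X)))   # average over N agents and D dimensions

def exploration_exploitation(div_history):
    """Percentages of Eqs. (17) and (18) from a run's diversity history."""
    div = np.asarray(div_history, dtype=float)
    div_max = div.max()
    exploration = div / div_max * 100.0
    exploitation = np.abs(div - div_max) / div_max * 100.0
    return exploration, exploitation

explo, expt = exploration_exploitation([4.0, 2.0, 1.0])
# explo = [100.0, 50.0, 25.0], expt = [0.0, 50.0, 75.0]
```

Since Div(t) ≤ Div_max, the two percentages always sum to 100, which is why the curves in a balance plot mirror each other.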

Figure 6 intuitively illustrates the balance between exploration and exploitation in
SBOA using the 30-dimensional CEC-2017 test functions. From the graph, it is evident
that the intersection point of the exploration and exploitation ratio in the SBOA algorithm


Fig. 5  Flowchart of SBOA


primarily occurs during the mid-iterations of the problem search process. In the initial
stages, there is a comprehensive exploration of the global search space, gradually tran-
sitioning into the phase of local exploitation. It’s worth noting that the SBOA algorithm
maintains a relatively high exploitation ratio in the later iterations across all functions, con-
tributing to enhanced problem convergence speed and search precision. The SBOA algo-
rithm maintains a dynamic equilibrium between exploration and exploitation throughout
the iteration process. Therefore, SBOA exhibits outstanding advantages in avoiding local
optima and premature convergence.

4.1.2 Convergence behavior analysis

To validate the convergence characteristics of SBOA, we designed experiments to provide a detailed analysis of its convergence behavior. As shown in Fig. 7, the first column pro-
vides an intuitive representation of the two-dimensional search space of the test function,
vividly showcasing the complex nature of the objective function. The second column offers
a detailed view of the search trajectories of the agents. It is evident that the majority of agents
closely converge around the optimal solution and are distributed across the entire search
space, demonstrating SBOA’s superior ability to avoid falling into local optima. The third
column illustrates the changes in the average fitness values of the search agents. Initially,
this value is relatively high, highlighting that intelligent agents are extensively exploring
the search space. It then rapidly decreases, emphasizing that most search agents possess the
inherent potential to find the optimal solution. The fourth column displays the search tra-
jectories of individual agents, transitioning from initial fluctuations to stability, revealing a
smooth shift from global exploration to local exploitation, facilitating the acquisition of the
global optimum. The last column shows the convergence curves of SBOA. For unimodal
functions, the curve continuously descends, indicating that with an increasing number of
iterations, the algorithm gradually approaches the optimal solution. For multimodal func-
tions, the curve exhibits a stepwise descent, signifying that SBOA can consistently escape
local optima and eventually reach the global optimum.
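The bookkeeping behind these diagnostics (the per-iteration mean fitness and the best-so-far convergence curve) can be sketched as follows. A toy random-sampling loop stands in for SBOA's update rules, which this section does not reproduce; any population update could be slotted into the marked line.

```python
import random

def track_convergence(objective, dim, pop_size, iters, lower, upper, seed=0):
    """Record the two diagnostics described above: the mean population fitness
    and the best-so-far value at every iteration. The population update here
    is a placeholder, not the SBOA position-update equations."""
    rng = random.Random(seed)
    mean_curve, best_curve = [], []
    best = float("inf")
    for _ in range(iters):
        # resample the population (stand-in for the real position update)
        pop = [[rng.uniform(lower, upper) for _ in range(dim)]
               for _ in range(pop_size)]
        fits = [objective(x) for x in pop]
        best = min(best, min(fits))
        mean_curve.append(sum(fits) / pop_size)
        best_curve.append(best)
    return mean_curve, best_curve
```

By construction the best-so-far curve is monotonically non-increasing, which is why the unimodal-function curves in the last column of Fig. 7 descend continuously.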

4.1.3 Search agent distribution analysis

To thoroughly analyze the optimization performance of SBOA, we conducted iterative experiments on the search history of the algorithm's search agents. Figure 8 illustrates the historical distribution of search agents and targets at different iteration counts when SBOA tackles selected functions of CEC2005. The red points represent the targets, while the black points represent the search agents.
From the graph, it can be observed that at iteration 1 most search agents are distant from the target, with only a few scattered around it. By iteration 100, the search agents are closer to the target, but some also cluster in other directions, indicating convergence towards local optima and suggesting that SBOA temporarily falls into local optima, as seen in F1, F7, and F10. By iteration 300, certain functions (such as F1 and F9) break out of local optima, and the search agents begin converging towards the global optimum. At iterations 400 and 500, the search agents are even closer to the target and clustered together, and more of them are distributed around the target, indicating that SBOA successfully escapes local optima and progresses towards the global optimum.


Fig. 6  Balance between exploration and exploitation

These results demonstrate that, although SBOA may temporarily fall into local optima
in certain situations, with an increase in the number of iterations, it is capable of break-
ing out of local optima and gradually approaching and converging on the global optimal
solution.
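One way to quantify what Fig. 8 shows qualitatively is to count, at each iteration snapshot, how many search agents fall within a given radius of the target. A minimal sketch (the radius threshold and function name are our illustrative choices, not from the paper):

```python
def agents_near_target(population, target, radius):
    """Count the search agents lying within `radius` of the target point,
    i.e. the clustering around the red point that Fig. 8 visualizes."""
    def dist_sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return sum(1 for agent in population
               if dist_sq(agent, target) <= radius ** 2)
```

Applied to the stored search history, this count rising from iteration 1 to iteration 500 is the numerical counterpart of the convergence described above.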

4.2 Quantitative analysis

4.2.1 Competing algorithms and parameter settings

In this section, to validate the effectiveness of SBOA, we conducted comparisons with 15 advanced algorithms on the CEC-2017 and CEC-2022 test functions. The compared
algorithms fall into three categories: (1) High-performance algorithms: The winning
algorithms of the CEC2017 competition, LSHADE_cnEpSin (Awad et al. 2017), and
LSHADE_SPACMA (Mohamed et al. 2017), as well as the outstanding variant of the differential evolution algorithm, MadDE (Biswas et al. 2021). (2) Highly-Cited Algo-
rithms: DE (Storn and Price 1997), Grey wolf optimizer (GWO) (Mirjalili et al. 2014),
Whale optimization algorithm (WOA) (Mirjalili and Lewis 2016), CPSOGSA (Rather
and Bala 2021) and African vultures optimization algorithm (AVOA) (Abdollahzadeh
et al. 2021a); (3) advanced algorithms: snake optimizer (SO) (Hashim and Hussien
2022), Artificial gorilla troops optimizer (GTO) (Abdollahzadeh et al. 2021b), crayfish
optimization algorithm (COA) (Jia et al. 2023), Rime optimization algorithm (RIME)
(Su et al. 2023), Golden jackal optimization (GJO) (Chopra and Mohsin Ansari 2022),
dung beetle optimizer (DBO) (Xue and Shen 2022) and nutcracker optimization algo-
rithm (NOA) (Abdel-Basset et al. 2023b). The parameter settings for the compared algo-
rithms are detailed in Table 2. We set the maximum number of iterations and population
size for all algorithms to 500 and 30, respectively. Each algorithm is independently run
30 times, and the experimental results will be presented in the following text. The best
results for each test function and its corresponding dimension are highlighted in bold.
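Under the protocol above (30 independent runs per algorithm, reporting the average and standard deviation), the aggregation step might look like the hypothetical harness below; `optimizer` is a placeholder that maps a random seed to the best objective value of one run.

```python
import statistics

def benchmark(optimizer, runs=30, base_seed=0):
    """Aggregate one algorithm/function pair the way Tables 4-6 report it:
    the average (Ave) and sample standard deviation (Std) of the best
    objective values found over `runs` independent runs."""
    results = [optimizer(base_seed + r) for r in range(runs)]
    return statistics.fmean(results), statistics.stdev(results)
```

Seeding each run from `base_seed + r` keeps the 30 runs independent yet reproducible, which is what allows the paired statistical tests reported later in the paper.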

4.2.2 CEC‑2017 experimental results

In this section, for a more comprehensive assessment of SBOA’s performance, we con-


ducted a comparative validation using the CEC-2017 test functions. As shown in Table 3,
the CEC-2017 test functions are categorized into four types: unimodal functions, multi-
modal functions, hybrid functions, and composite functions. Unimodal functions have only
one global optimum and no local optima, making them suitable for evaluating algorithm
development performance. Multimodal test functions contain multiple local optima and are
primarily used to assess the algorithm’s ability to find the global optimum and escape from
local optima. Hybrid functions and composite functions are employed to gauge the algo-
rithm’s capability in handling complex, continuous problems.
During the validation using the CEC2017 test suite, we conducted tests in dimensions
of 30, 50, and 100. The results for each dimension are presented in Tables 4, 5, and 6. The three symbols in the last row of each table (W|T|L) indicate the number of wins, ties, and losses of SBOA against each competitor. Convergence curves for selected functions are shown in Figs. 9, 10, and 11, and box plots in Figs. 13, 14, and 15.
Figure 12 displays the ranking of different algorithms across various dimensions. To
better illustrate this, we employed radar charts to depict the ranking of different algorithms
on the test set. From the figure, it is evident that the SBOA algorithm consistently main-
tains a leading position in dimensions of 30, 50, and 100. For 30-dimensional tests, SBOA
achieved the best average ranking for 22 functions, the second-best for 3 functions, and
the third-best for 4 functions. As the dimensionality increases, SBOA continues to deliver
strong optimization results. In the 50-dimensional tests, SBOA maintains the best average ranking for 20 functions, the second-best for 6 functions, and the third-best for 1 function. In the 100-dimensional tests, SBOA achieved the best average results for 20 test functions and the second-best for 5 functions. Notably, SBOA never ranked worst in any of the three dimensions. Although its performance was less ideal on functions F1, F2, F3, F6, F7, and F18 in the 100-dimensional tests, SBOA still delivered strong results, significantly outperforming most other algorithms in those cases.
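The average rankings plotted in the radar charts can be reproduced from the per-function mean errors. The sketch below uses competition-style average ranks (ties share the mean rank, 1 = best); the function names are ours, not the paper's.

```python
def rank_row(values):
    """Average ranks of one row (one function): 1 = best, ties share
    the mean of the tied positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1      # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def average_ranks(table):
    """Mean rank of each algorithm over all benchmark functions; `table` is
    a list of per-function rows of mean errors, one column per algorithm."""
    n_alg = len(table[0])
    sums = [0.0] * n_alg
    for row in table:
        for a, r in enumerate(rank_row(row)):
            sums[a] += r
    return [s / len(table) for s in sums]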
Fig. 7  The convergence behavior of SBOA

Fig. 8  Search agent distribution with different iterations

From the convergence plots in different dimensions, we observe that algorithms such as SO, GTO, DBO, and WOA often encounter local optima in the later stages of iteration and struggle to escape from them. In contrast, SBOA maintains strong exploration capabilities even in the later iterations. Although SBOA briefly settles into local optima on functions F5, F8, F10, F12, F16, F20, and F22 during certain periods, it breaks free of them as the iteration count increases and continues to explore more deeply, ultimately achieving higher convergence accuracy. This suggests that the introduced differential evolution strategy, Brownian motion strategy, and Levy flight strategy are effective: these strategies not only help the algorithm escape local optima but also enhance its convergence speed and accuracy.


Table 2  Compare the parameter settings of the algorithms


Algorithms Name of the parameter Value of the parameter

LSHADE_cnEpSin pb; ps; freq_init 0.4; 0.5; 0.5


LSHADE_SPACMA L_Rate; val_2_reach 0.8; 10^(− 8)
MadDE q_cr_rate; p_best_rate; arc_rate 0.01; 0.18; 2.3
DE F; CR; 0.8; 0.1
GWO a [0,2]
WOA a, a2, b [0,2], [− 1, − 2], 1
CPSOGSA φ1, φ2 2.05, 2.05
AVOA L1, L2, w, p1, p2, p3 0.8, 0.2, 2.5, 0.6, 0.4, 0.6
SO Q, T, c1, c2, c3 0.25, 0.6, 0.5, 0.05, 2
GTO P, r1, r2, r3 ∈ [0,1]
COA temp [20,35]
RIME W 5
GJO E0 [− 1,1]
DBO P_percent 0.2
NOA Delta, Prp, Pa2 0.05, 0.2, 0.4

Figures 13, 14, and 15 present the results of the 16 algorithms on three dimensions of the CEC2017 test set in the form of box plots. It is evident from the figures that the majority of SBOA's box plots are the narrowest, indicating that SBOA exhibits higher robustness than the other algorithms. Furthermore, the boxes are positioned near the optimal function values, suggesting that SBOA solves the problems with higher precision than the other algorithms.
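The "narrowness" of a box plot can be quantified as the interquartile range (IQR) of an algorithm's 30 run results; a smaller IQR corresponds to the higher robustness claimed above. A minimal sketch, using the exclusive quartile method of the Python standard library:

```python
import statistics

def iqr_width(samples):
    """Interquartile range of one algorithm's run results: the height of
    the box in the box plots, used here as a robustness proxy."""
    q1, _median, q3 = statistics.quantiles(samples, n=4)
    return q3 - q1
```

Comparing `iqr_width` across algorithms for the same function reproduces, numerically, the visual comparison made in Figs. 13–15.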

4.2.3 CEC‑2022 experimental results

To assess the scalability of SBOA, in this section, we compare it with the 15 benchmark
algorithms using the CEC-2022 test functions in both 10 and 20 dimensions. Similar to
the CEC-2017 test functions, the CEC-2022 test functions consist of unimodal functions,
multimodal functions, hybrid functions, and composite functions (Luo et al. 2022). The
specific details can be found in Table 7.
The test results for SBOA and the 15 other algorithms on the CEC-2022 test suite are
presented in Tables 8 and 9; the convergence curves can be seen in Figs. 16 and 17, and box plots are shown in Figs. 19 and 20. The results indicate that SBOA outperforms the other algorithms on 6 of the functions in both the 10-dimensional and 20-dimensional CEC-2022 test sets.
algorithms, a stacked ranking chart is presented in Fig. 18. Rankings are divided into five
categories: average best ranking, average second-best ranking, average third-best ranking,
average worst ranking, and other rankings. From the chart, it is evident that SBOA does not
have the worst ranking in any of the test functions, demonstrating its strong scalability and
effectiveness. When considering the overall ranking, SBOA ranks the highest among the
15 algorithms and significantly outperforms the others. Figure 17 displays the convergence
curves of SBOA and the 15 benchmark algorithms. From the figure, we can observe that
SBOA exhibits higher convergence speed and accuracy compared to the other algorithms.
The results indicate that our proposed SBOA demonstrates faster convergence speed and


Table 3  CEC-2017 test functions


Type ID CEC2017 function name Range Dimension fmin

Unimodal F1 Shifted and rotated bent cigar function [− 100,100] 30/50/100 100
F2 Shifted and rotated sum of different power function [− 100,100] 30/50/100 200
F3 Shifted and rotated Zakharov function [− 100,100] 30/50/100 300
Multimodal F4 Shifted and rotated Rosenbrock’s function [− 100,100] 30/50/100 400
F5 Shifted and rotated Rastrigin’s Function [− 100,100] 30/50/100 500
F6 Shifted and rotated expanded Scaffer’s F6 Function [− 100,100] 30/50/100 600
F7 Shifted and rotated Lunacek bi-Rastrigin function [− 100,100] 30/50/100 700
F8 Shifted and rotated non-continuous Rastrigin’s function [− 100,100] 30/50/100 800
F9 Shifted and rotated levy function [− 100,100] 30/50/100 900
F10 Shifted and rotated Schwefel’s function [− 100,100] 30/50/100 1000
Hybrid F11 Hybrid function 1 (N = 3) [− 100,100] 30/50/100 1100
F12 Hybrid function 2 (N = 3) [− 100,100] 30/50/100 1200
F13 Hybrid function 3 (N = 3) [− 100,100] 30/50/100 1300
F14 Hybrid function 4 (N = 4) [− 100,100] 30/50/100 1400
F15 Hybrid function 5 (N = 4) [− 100,100] 30/50/100 1500
F16 Hybrid function 6 (N = 4) [− 100,100] 30/50/100 1600
F17 Hybrid function 7 (N = 5) [− 100,100] 30/50/100 1700
F18 Hybrid function 8 (N = 5) [− 100,100] 30/50/100 1800
F19 Hybrid function 9 (N = 5) [− 100,100] 30/50/100 1900
F20 Hybrid function 10 (N = 6) [− 100,100] 30/50/100 2000
Composition F21 Composition function 1 (N = 3) [− 100,100] 30/50/100 2100
F22 Composition function 2 (N = 3) [− 100,100] 30/50/100 2200
F23 Composition function 3 (N = 4) [− 100,100] 30/50/100 2300
F24 Composition function 4 (N = 4) [− 100,100] 30/50/100 2400
F25 Composition function 5 (N = 5) [− 100,100] 30/50/100 2500
F26 Composition function 6 (N = 5) [− 100,100] 30/50/100 2600
F27 Composition function 7 (N = 6) [− 100,100] 30/50/100 2700
F28 Composition function 8 (N = 6) [− 100,100] 30/50/100 2800
F29 Composition function 9 (N = 3) [− 100,100] 30/50/100 2900
F30 Composition function 10 (N = 3) [− 100,100] 30/50/100 3000

higher convergence accuracy compared to the other 15 algorithms. As depicted in Figs. 19 and 20, SBOA exhibits superior robustness relative to the other 15 algorithms.

4.3 Statistical test

In this section, we use the Wilcoxon test and the Friedman test to statistically analyze the differences between SBOA and the other comparison algorithms.
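For reference, the Friedman test statistic applied here can be computed directly from the per-function ranks. The sketch below is our own illustration, assuming no tied values within a row; it returns the chi-square statistic together with the average ranks (the statistic is compared against a χ² distribution with k − 1 degrees of freedom to obtain the p-value).

```python
def friedman_statistic(table):
    """Friedman chi-square over N problems (rows) and k algorithms
    (columns); rank 1 = best (smallest error), no within-row ties assumed."""
    n, k = len(table), len(table[0])
    rank_sums = [0.0] * k
    for row in table:
        order = sorted(range(k), key=lambda j: row[j])
        for pos, j in enumerate(order, start=1):
            rank_sums[j] += pos
    avg = [s / n for s in rank_sums]
    chi2 = (12.0 * n / (k * (k + 1))) * sum(r * r for r in avg) - 3.0 * n * (k + 1)
    return chi2, avg
```

When one algorithm is consistently best, second-best, and so on across all problems, the statistic reaches its maximum, signalling that the ranking differences are systematic rather than random.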

Table 4  The experimental results of 16 algorithms in CEC-2017 (Dim = 30)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 Ave 9.87E + 04 9.10E + 04 9.67E + 05 2.68E + 07 4.56E + 06 3.39E + 09 4.83E + 09 1.00E + 09


Std 1.40E + 05 1.68E + 05 6.24E + 05 7.35E + 06 1.29E + 07 2.33E + 09 1.67E + 09 9.13E + 08
F2 Ave 1.95E + 22 1.20E + 18 4.40E + 22 1.37E + 28 9.22E + 25 2.07E + 32 1.79E + 35 3.46E + 37
Std 1.06E + 23 5.21E + 18 2.41E + 23 2.94E + 28 4.85E + 26 1.04E + 33 6.43E + 35 1.29E + 38
F3 Ave 2.89E + 04 5.99E + 04 8.45E + 04 1.30E + 05 5.54E + 04 6.29E + 04 2.46E + 05 1.65E + 05
Std 2.01E + 04 5.59E + 04 1.09E + 04 2.37E + 04 8.27E + 03 1.22E + 04 6.20E + 04 5.32E + 04
F4 Ave 5.02E + 02 5.07E + 02 5.13E + 02 6.41E + 02 5.45E + 02 6.54E + 02 1.25E + 03 9.24E + 02
Std 2.43E + 01 2.74E + 01 1.41E + 01 1.95E + 01 4.03E + 01 1.01E + 02 4.23E + 02 2.78E + 02
F5 Ave 5.70E + 02 5.79E + 02 6.78E + 02 6.91E + 02 7.35E + 02 6.19E + 02 8.71E + 02 8.17E + 02
Std 2.43E + 01 2.25E + 01 1.72E + 01 1.47E + 01 4.67E + 01 2.92E + 01 5.03E + 01 4.54E + 01
F6 Ave 6.05E + 02 6.01E + 02 6.11E + 02 6.04E + 02 6.57E + 02 6.14E + 02 6.78E + 02 6.71E + 02
Std 2.94E + 00 7.41E − 01 4.16E + 00 6.34E − 01 8.12E + 00 5.17E + 00 9.77E + 00 9.40E + 00
F7 Ave 8.43E + 02 8.38E + 02 9.71E + 02 9.46E + 02 1.18E + 03 9.17E + 02 1.31E + 03 1.61E + 03

Std 2.88E + 01 3.69E + 01 3.17E + 01 1.60E + 01 8.64E + 01 5.62E + 01 8.40E + 01 2.23E + 02


F8 Ave 8.79E + 02 8.80E + 02 9.47E + 02 9.86E + 02 9.69E + 02 9.02E + 02 1.07E + 03 1.03E + 03
Std 1.60E + 01 2.11E + 01 1.53E + 01 1.34E + 01 3.53E + 01 1.84E + 01 5.65E + 01 4.98E + 01
F9 Ave 1.43E + 03 1.24E + 03 3.54E + 03 3.43E + 03 5.28E + 03 2.53E + 03 1.23E + 04 7.99E + 03
Std 3.60E + 02 2.94E + 02 8.75E + 02 4.38E + 02 8.24E + 02 8.31E + 02 4.02E + 03 1.39E + 03
F10 Ave 5.13E + 03 5.04E + 03 5.80E + 03 6.89E + 03 5.35E + 03 5.12E + 03 7.57E + 03 5.56E + 03
Std 8.81E + 02 9.53E + 02 4.00E + 02 3.21E + 02 8.29E + 02 1.34E + 03 8.13E + 02 5.78E + 02
F11 Ave 1.31E + 03 1.26E + 03 1.33E + 03 2.35E + 03 1.31E + 03 2.99E + 03 1.02E + 04 1.72E + 03
Std 6.38E + 01 4.54E + 01 3.95E + 01 7.89E + 02 1.19E + 02 1.37E + 03 5.89E + 03 5.61E + 02
F12 Ave 7.28E + 05 4.40E + 05 2.12E + 06 3.38E + 07 1.66E + 07 1.16E + 08 5.77E + 08 2.83E + 07
Std 1.01E + 06 4.00E + 05 7.34E + 05 1.20E + 07 1.64E + 07 1.21E + 08 3.23E + 08 4.21E + 07
F13 Ave 2.44E + 04 1.73E + 04 8.51E + 04 6.37E + 06 1.37E + 05 1.79E + 07 1.34E + 07 5.32E + 04
Std 1.72E + 04 1.02E + 04 5.60E + 04 2.04E + 06 7.27E + 04 5.37E + 07 1.09E + 07 2.74E + 04
Table 4  (continued)

Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F14 Ave 1.60E + 03 1.60E + 03 1.23E + 05 6.45E + 05 8.22E + 05 6.58E + 05 2.01E + 06 4.39E + 05
Std 5.72E + 01 5.99E + 01 1.04E + 05 4.59E + 05 9.33E + 05 6.67E + 05 2.49E + 06 4.42E + 05

F15 Ave 3.15E + 03 3.28E + 03 7.23E + 03 1.67E + 06 4.22E + 04 2.13E + 06 7.40E + 06 2.94E + 04
Std 1.60E + 03 2.99E + 03 3.44E + 03 1.09E + 06 2.43E + 04 7.42E + 06 1.33E + 07 1.44E + 04
F16 Ave 2.35E + 03 2.59E + 03 2.62E + 03 2.98E + 03 3.13E + 03 2.77E + 03 4.36E + 03 3.29E + 03
Std 2.36E + 02 2.01E + 02 2.23E + 02 1.82E + 02 3.83E + 02 2.16E + 02 6.96E + 02 4.69E + 02
F17 Ave 1.95E + 03 2.02E + 03 2.01E + 03 2.17E + 03 2.63E + 03 2.06E + 03 2.78E + 03 2.69E + 03
Std 1.22E + 02 1.57E + 02 1.20E + 02 1.30E + 02 2.25E + 02 1.63E + 02 2.55E + 02 2.79E + 02
F18 Ave 3.60E + 04 1.06E + 05 5.45E + 05 2.10E + 06 2.64E + 06 2.68E + 06 1.80E + 07 9.50E + 05
Std 2.17E + 04 3.56E + 05 3.09E + 05 1.22E + 06 3.07E + 06 5.43E + 06 2.37E + 07 1.35E + 06
F19 Ave 3.70E + 03 2.20E + 03 1.51E + 04 1.25E + 06 1.01E + 05 3.15E + 06 2.29E + 07 4.25E + 04
Std 8.22E + 03 1.64E + 02 2.26E + 04 5.59E + 05 1.16E + 05 7.35E + 06 2.04E + 07 5.71E + 04
F20 Ave 2.35E + 03 2.47E + 03 2.37E + 03 2.52E + 03 2.80E + 03 2.45E + 03 2.91E + 03 2.92E + 03
Std 9.76E + 01 1.50E + 02 1.05E + 02 1.24E + 02 2.09E + 02 1.55E + 02 2.05E + 02 2.95E + 02
F21 Ave 2.38E + 03 2.38E + 03 2.45E + 03 2.48E + 03 2.53E + 03 2.41E + 03 2.64E + 03 2.58E + 03
Std 2.06E + 01 2.08E + 01 1.92E + 01 1.97E + 01 4.74E + 01 2.80E + 01 5.85E + 01 6.43E + 01
F22 Ave 4.08E + 03 2.31E + 03 2.51E + 03 3.44E + 03 5.76E + 03 5.27E + 03 7.89E + 03 6.59E + 03
Std 1.94E + 03 4.46E + 00 9.62E + 02 2.58E + 02 2.21E + 03 1.91E + 03 2.10E + 03 1.38E + 03
F23 Ave 2.75E + 03 2.73E + 03 2.83E + 03 2.84E + 03 2.96E + 03 2.80E + 03 3.19E + 03 3.12E + 03
Std 2.50E + 01 2.58E + 01 2.95E + 01 9.13E + 00 9.58E + 01 5.57E + 01 1.13E + 02 1.67E + 02
F24 Ave 2.91E + 03 2.90E + 03 3.00E + 03 3.05E + 03 3.15E + 03 2.97E + 03 3.28E + 03 3.35E + 03
Std 2.53E + 01 2.07E + 01 2.22E + 01 1.69E + 01 1.06E + 02 7.30E + 01 1.01E + 02 9.74E + 01
F25 Ave 2.90E + 03 2.90E + 03 2.93E + 03 2.98E + 03 2.95E + 03 3.03E + 03 3.21E + 03 3.02E + 03
Std 1.28E + 01 1.43E + 01 1.56E + 01 1.59E + 01 3.16E + 01 7.97E + 01 7.87E + 01 7.71E + 01
F26 Ave 4.58E + 03 4.31E + 03 3.56E + 03 5.35E + 03 6.88E + 03 5.04E + 03 8.28E + 03 7.46E + 03
Table 4  (continued)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

Std 3.43E + 02 4.81E + 02 9.37E + 02 4.75E + 02 9.46E + 02 4.01E + 02 1.18E + 03 1.12E + 03


F27 Ave 3.24E + 03 3.23E + 03 3.25E + 03 3.27E + 03 3.30E + 03 3.27E + 03 3.51E + 03 3.56E + 03
Std 2.26E + 01 1.67E + 01 1.13E + 01 7.78E + 00 6.02E + 01 2.99E + 01 1.32E + 02 1.67E + 02
F28 Ave 3.26E + 03 3.25E + 03 3.28E + 03 3.38E + 03 3.32E + 03 3.48E + 03 3.92E + 03 3.35E + 03
Std 2.44E + 01 3.24E + 01 2.55E + 01 2.55E + 01 2.30E + 01 8.78E + 01 2.58E + 02 6.33E + 01
F29 Ave 3.67E + 03 3.65E + 03 3.89E + 03 4.03E + 03 4.42E + 03 3.97E + 03 5.24E + 03 4.65E + 03
Std 1.35E + 02 1.30E + 02 1.14E + 02 1.33E + 02 3.11E + 02 2.45E + 02 5.90E + 02 3.58E + 02
F30 Ave 3.02E + 04 2.20E + 04 3.40E + 05 1.52E + 06 1.75E + 06 1.38E + 07 6.19E + 07 6.47E + 05
Std 2.37E + 04 1.19E + 04 1.69E + 05 7.64E + 05 1.28E + 06 9.88E + 06 5.31E + 07 6.16E + 05
(W|T|L) (24 |1 |5) (24 |0 |6) (29 |0 |1) (30 |0 |0) (30 |0 |0) (30 |0 |0) (30 |0 |0) (30 |0 |0)
Function SO GTO COA RIME GJO DBO NOA SBOA

F1 Ave 1.02E + 07 1.69E + 05 7.04E + 08 4.30E + 06 1.37E + 10 1.94E + 08 8.31E + 06 1.79E + 04



Std 8.75E + 06 2.06E + 05 9.16E + 08 1.67E + 06 4.64E + 09 1.36E + 08 5.47E + 06 1.18E + 04


F2 Ave 1.09E + 27 2.57E + 24 8.55E + 25 3.91E + 17 2.22E + 34 5.15E + 33 2.09E + 20 2.42E + 17
Std 5.83E + 27 1.15E + 25 3.91E + 26 9.09E + 17 7.22E + 34 2.45E + 34 9.79E + 20 4.66E + 17
F3 Ave 7.20E + 04 1.60E + 04 1.24E + 05 4.99E + 04 6.13E + 04 9.24E + 04 3.02E + 04 2.56E + 04
Std 7.93E + 03 6.20E + 03 3.15E + 04 1.42E + 04 1.01E + 04 2.54E + 04 7.37E + 03 7.87E + 03
F4 Ave 5.56E + 02 5.18E + 02 6.16E + 02 5.31E + 02 1.59E + 03 7.20E + 02 5.28E + 02 5.01E + 02
Std 3.37E + 01 3.24E + 01 6.91E + 01 3.48E + 01 8.48E + 02 2.28E + 02 3.87E + 01 2.81E + 01
F5 Ave 5.97E + 02 6.96E + 02 7.63E + 02 6.08E + 02 7.18E + 02 7.74E + 02 7.45E + 02 5.61E + 02
Std 1.54E + 01 4.34E + 01 4.72E + 01 3.29E + 01 3.67E + 01 6.99E + 01 4.60E + 01 1.97E + 01
F6 Ave 6.16E + 02 6.48E + 02 6.52E + 02 6.15E + 02 6.40E + 02 6.49E + 02 6.37E + 02 6.00E + 02
Std 7.15E + 00 9.67E + 00 1.32E + 01 5.36E + 00 9.75E + 00 1.19E + 01 1.24E + 01 1.91E + 00
F7 Ave 9.04E + 02 1.10E + 03 1.22E + 03 8.65E + 02 1.07E + 03 1.03E + 03 9.88E + 02 8.19E + 02
Std 3.62E + 01 7.88E + 01 1.18E + 02 3.90E + 01 6.77E + 01 7.17E + 01 7.21E + 01 3.43E + 01
Table 4  (continued)

Function SO GTO COA RIME GJO DBO NOA SBOA

F8 Ave 8.94E + 02 9.58E + 02 9.85E + 02 9.11E + 02 9.81E + 02 1.03E + 03 9.79E + 02 8.77E + 02
Std 2.45E + 01 3.48E + 01 2.17E + 01 2.57E + 01 4.66E + 01 5.49E + 01 4.45E + 01 1.55E + 01

F9 Ave 2.35E + 03 4.47E + 03 7.84E + 03 2.70E + 03 5.40E + 03 7.51E + 03 4.80E + 03 1.54E + 03


Std 7.21E + 02 8.57E + 02 1.94E + 03 1.53E + 03 1.23E + 03 2.49E + 03 1.75E + 03 4.45E + 02
F10 Ave 4.09E + 03 5.91E + 03 6.33E + 03 4.59E + 03 7.21E + 03 6.54E + 03 5.20E + 03 3.98E + 03
Std 7.08E + 02 8.46E + 02 9.47E + 02 4.99E + 02 1.50E + 03 1.18E + 03 4.97E + 02 5.04E + 02
F11 Ave 1.46E + 03 1.26E + 03 1.78E + 03 1.34E + 03 3.85E + 03 2.10E + 03 1.32E + 03 1.21E + 03
Std 1.31E + 02 5.61E + 01 7.70E + 02 5.99E + 01 1.84E + 03 9.73E + 02 5.82E + 01 4.50E + 01
F12 Ave 3.78E + 06 1.57E + 06 1.29E + 07 2.38E + 07 1.08E + 09 3.37E + 07 2.87E + 06 1.09E + 06
Std 4.86E + 06 1.46E + 06 1.19E + 07 1.77E + 07 9.83E + 08 3.68E + 07 3.03E + 06 1.47E + 06
F13 Ave 3.96E + 04 2.29E + 04 1.52E + 05 1.81E + 05 1.48E + 08 3.36E + 07 1.45E + 04 3.39E + 03
Std 2.44E + 04 2.03E + 04 1.11E + 05 2.33E + 05 1.67E + 08 9.75E + 07 1.67E + 04 2.48E + 04
F14 Ave 7.75E + 04 4.38E + 03 3.86E + 05 9.41E + 04 1.05E + 06 4.40E + 05 3.11E + 03 3.19E + 03
Std 9.48E + 04 2.84E + 03 4.03E + 05 7.95E + 04 1.16E + 06 9.00E + 05 2.82E + 03 3.95E + 04
F15 Ave 1.44E + 04 9.21E + 03 3.81E + 04 1.94E + 04 7.66E + 06 7.53E + 04 9.74E + 03 2.31E + 03
Std 1.19E + 04 9.34E + 03 3.84E + 04 1.33E + 04 1.18E + 07 1.03E + 05 1.02E + 04 1.10E + 04
F16 Ave 2.60E + 03 2.86E + 03 3.24E + 03 2.83E + 03 3.12E + 03 3.44E + 03 2.78E + 03 2.35E + 03
Std 2.66E + 02 2.84E + 02 3.64E + 02 3.44E + 02 4.03E + 02 4.61E + 02 3.49E + 02 3.20E + 02
F17 Ave 2.23E + 03 2.27E + 03 2.34E + 03 2.21E + 03 2.32E + 03 2.68E + 03 2.51E + 03 1.80E + 03
Std 1.75E + 02 2.72E + 02 2.26E + 02 2.12E + 02 2.25E + 02 2.73E + 02 2.57E + 02 1.59E + 02
F18 Ave 1.35E + 06 1.13E + 05 1.97E + 06 1.45E + 06 6.69E + 06 6.36E + 06 1.60E + 05 5.42E + 05
Std 1.57E + 06 6.77E + 04 1.73E + 06 1.19E + 06 1.72E + 07 6.70E + 06 1.49E + 05 4.96E + 05
F19 Ave 1.66E + 04 9.40E + 03 4.17E + 04 1.86E + 04 1.75E + 07 1.71E + 06 7.46E + 03 2.08E + 03
Std 1.19E + 04 7.48E + 03 4.76E + 04 1.54E + 04 3.85E + 07 3.27E + 06 6.17E + 03 1.38E + 04
F20 Ave 2.48E + 03 2.63E + 03 2.64E + 03 2.54E + 03 2.68E + 03 2.65E + 03 2.68E + 03 2.23E + 03
Table 4  (continued)
Function SO GTO COA RIME GJO DBO NOA SBOA

Std 1.82E + 02 2.10E + 02 2.08E + 02 2.27E + 02 2.64E + 02 2.13E + 02 2.32E + 02 1.02E + 02


F21 Ave 2.40E + 03 2.44E + 03 2.46E + 03 2.41E + 03 2.49E + 03 2.55E + 03 2.51E + 03 2.37E + 03
Std 2.10E + 01 4.02E + 01 4.79E + 01 2.96E + 01 3.86E + 01 4.46E + 01 5.53E + 01 1.51E + 01
F22 Ave 3.91E + 03 3.22E + 03 3.61E + 03 5.02E + 03 5.86E + 03 5.61E + 03 6.52E + 03 2.30E + 03
Std 1.85E + 03 1.87E + 03 2.25E + 03 2.16E + 03 2.60E + 03 2.23E + 03 1.57E + 03 8.60E + 02
F23 Ave 2.81E + 03 2.88E + 03 2.88E + 03 2.80E + 03 2.91E + 03 3.01E + 03 2.90E + 03 2.72E + 03
Std 4.67E + 01 6.41E + 01 6.31E + 01 4.12E + 01 6.11E + 01 9.43E + 01 1.12E + 02 2.31E + 01
F24 Ave 2.96E + 03 3.06E + 03 3.02E + 03 2.95E + 03 3.11E + 03 3.16E + 03 3.14E + 03 2.89E + 03
Std 3.95E + 01 1.19E + 02 6.16E + 01 4.15E + 01 7.21E + 01 8.52E + 01 1.54E + 02 1.73E + 01
F25 Ave 2.94E + 03 2.91E + 03 2.97E + 03 2.92E + 03 3.27E + 03 3.02E + 03 2.92E + 03 2.90E + 03
Std 3.21E + 01 1.58E + 01 3.42E + 01 3.03E + 01 2.29E + 02 1.30E + 02 2.03E + 01 2.16E + 01
F26 Ave 5.53E + 03 5.10E + 03 6.91E + 03 4.94E + 03 6.32E + 03 6.94E + 03 5.92E + 03 4.26E + 03
Std 5.97E + 02 1.64E + 03 1.30E + 03 8.63E + 02 6.40E + 02 8.95E + 02 1.49E + 03 7.51E + 02

F27 Ave 3.30E + 03 3.33E + 03 3.30E + 03 3.25E + 03 3.36E + 03 3.34E + 03 3.29E + 03 3.22E + 03
Std 4.12E + 01 8.07E + 01 4.59E + 01 1.85E + 01 6.42E + 01 8.21E + 01 6.75E + 01 1.26E + 01
F28 Ave 3.36E + 03 3.28E + 03 3.36E + 03 3.28E + 03 3.85E + 03 3.74E + 03 3.27E + 03 3.25E + 03
Std 4.53E + 01 3.16E + 01 4.62E + 01 3.00E + 01 3.40E + 02 9.15E + 02 1.78E + 01 2.97E + 01
F29 Ave 4.11E + 03 4.39E + 03 4.17E + 03 4.10E + 03 4.33E + 03 4.48E + 03 4.09E + 03 3.63E + 03
Std 1.93E + 02 4.52E + 02 3.07E + 02 2.05E + 02 3.77E + 02 3.43E + 02 3.02E + 02 1.78E + 02
F30 Ave 2.88E + 05 2.35E + 04 1.14E + 06 6.15E + 05 4.32E + 07 4.63E + 06 2.61E + 04 1.20E + 04
Std 3.23E + 05 1.64E + 04 9.77E + 05 3.40E + 05 4.33E + 07 9.50E + 06 1.57E + 04 1.07E + 05
(W|T|L) (30 |0 |0) (28 |0 |2) (30 |0 |0) (30 |0 |0) (30 |0 |0) (30 |0 |0) (28 |0 |2)
Table 5  The experimental results of 16 algorithms in CEC-2017 (Dim = 50)

Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 Ave 7.63E + 07 6.86E + 07 7.54E + 08 2.10E + 09 3.17E + 08 1.09E + 10 2.06E + 10 7.73E + 09
Std 5.04E + 07 4.96E + 07 3.57E + 08 3.25E + 08 4.35E + 08 4.24E + 09 4.75E + 09 4.18E + 09

F2 Ave 1.00E + 30 1.00E + 30 1.00E + 30 8.51E + 58 5.74E + 55 2.73E + 62 1.24E + 82 6.83E + 71


Std 1.43E + 14 1.43E + 14 1.43E + 14 1.60E + 59 3.07E + 56 1.50E + 63 5.98E + 82 3.21E + 72
F3 Ave 1.26E + 05 1.67E + 05 1.88E + 05 2.91E + 05 1.85E + 05 1.61E + 05 3.07E + 05 3.55E + 05
Std 5.90E + 04 8.51E + 04 2.36E + 04 3.40E + 04 3.92E + 04 2.68E + 04 1.06E + 05 7.48E + 04
F4 Ave 6.55E + 02 6.77E + 02 7.94E + 02 1.33E + 03 7.99E + 02 1.62E + 03 4.20E + 03 3.68E + 03
Std 6.79E + 01 6.38E + 01 8.43E + 01 1.24E + 02 7.52E + 01 5.18E + 02 1.28E + 03 2.15E + 03
F5 Ave 7.28E + 02 7.26E + 02 9.08E + 02 9.41E + 02 8.64E + 02 7.94E + 02 1.14E + 03 1.04E + 03
Std 3.54E + 01 3.51E + 01 2.41E + 01 1.93E + 01 3.87E + 01 6.37E + 01 8.39E + 01 8.76E + 01
F6 Ave 6.19E + 02 6.07E + 02 6.37E + 02 6.17E + 02 6.63E + 02 6.25E + 02 6.96E + 02 6.76E + 02
Std 5.33E + 00 2.13E + 00 4.68E + 00 1.68E + 00 6.07E + 00 4.65E + 00 1.34E + 01 6.14E + 00
F7 Ave 1.12E + 03 1.09E + 03 1.37E + 03 1.29E + 03 1.66E + 03 1.17E + 03 1.94E + 03 2.81E + 03
Std 5.75E + 01 5.71E + 01 5.45E + 01 2.99E + 01 1.09E + 02 1.06E + 02 1.22E + 02 3.12E + 02
F8 Ave 1.03E + 03 1.02E + 03 1.22E + 03 1.24E + 03 1.17E + 03 1.05E + 03 1.44E + 03 1.26E + 03
Std 4.24E + 01 4.01E + 01 2.30E + 01 2.14E + 01 5.00E + 01 3.56E + 01 7.31E + 01 6.16E + 01
F9 Ave 6.72E + 03 4.98E + 03 1.84E + 04 1.72E + 04 1.38E + 04 1.20E + 04 3.87E + 04 1.97E + 04
Std 2.99E + 03 2.43E + 03 2.33E + 03 2.15E + 03 1.70E + 03 5.37E + 03 7.93E + 03 3.19E + 03
F10 Ave 9.56E + 03 8.99E + 03 1.11E + 04 1.28E + 04 8.98E + 03 8.40E + 03 1.35E + 04 8.25E + 03
Std 1.83E + 03 6.84E + 02 4.58E + 02 4.71E + 02 1.13E + 03 1.92E + 03 9.63E + 02 9.22E + 02
F11 Ave 1.59E + 03 2.70E + 03 3.62E + 03 9.04E + 03 2.02E + 03 6.93E + 03 8.87E + 03 9.49E + 03
Std 1.93E + 02 2.85E + 03 7.25E + 02 2.57E + 03 5.43E + 02 2.79E + 03 2.05E + 03 5.16E + 03
F12 Ave 1.56E + 07 1.30E + 07 3.18E + 07 5.80E + 08 1.05E + 08 1.61E + 09 4.32E + 09 1.48E + 09
Std 1.06E + 07 8.84E + 06 1.70E + 07 1.44E + 08 6.48E + 07 1.86E + 09 1.44E + 09 1.26E + 09
F13 Ave 4.07E + 04 2.32E + 04 1.39E + 05 3.26E + 07 2.31E + 05 2.09E + 08 4.86E + 08 6.65E + 07
Std 1.80E + 04 1.13E + 04 1.81E + 05 1.47E + 07 3.19E + 05 1.62E + 08 2.85E + 08 1.35E + 08
Table 5  (continued)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F14 Ave 2.32E + 04 1.81E + 04 1.23E + 06 3.67E + 06 2.79E + 06 1.35E + 06 6.19E + 06 2.02E + 06
Std 3.25E + 04 1.77E + 04 8.55E + 05 1.81E + 06 1.72E + 06 2.33E + 06 5.58E + 06 1.99E + 06
F15 Ave 1.18E + 04 8.59E + 03 4.68E + 04 5.77E + 06 6.18E + 04 1.98E + 07 5.69E + 07 5.79E + 04
Std 6.30E + 03 5.95E + 03 3.87E + 04 3.67E + 06 3.83E + 04 2.37E + 07 8.44E + 07 2.56E + 04
F16 Ave 3.16E + 03 3.43E + 03 3.53E + 03 4.56E + 03 4.27E + 03 3.35E + 03 6.42E + 03 4.30E + 03
Std 3.83E + 02 3.92E + 02 3.31E + 02 3.01E + 02 4.79E + 02 5.62E + 02 6.95E + 02 5.92E + 02
F17 Ave 2.97E + 03 3.26E + 03 3.12E + 03 3.60E + 03 3.91E + 03 3.21E + 03 4.48E + 03 3.74E + 03
Std 3.24E + 02 2.41E + 02 1.28E + 02 2.29E + 02 3.65E + 02 5.03E + 02 5.53E + 02 5.02E + 02
F18 Ave 2.10E + 05 2.06E + 05 5.89E + 06 1.46E + 07 5.89E + 06 1.03E + 07 4.98E + 07 3.47E + 06
Std 1.64E + 05 1.37E + 05 4.35E + 06 7.13E + 06 5.78E + 06 9.40E + 06 3.37E + 07 3.02E + 06
F19 Ave 2.23E + 04 1.77E + 04 3.62E + 04 1.24E + 06 3.63E + 05 5.88E + 06 2.54E + 07 5.45E + 05
Std 1.86E + 04 1.09E + 04 1.24E + 04 8.63E + 05 3.75E + 05 1.21E + 07 2.39E + 07 7.76E + 05
F20 Ave 3.06E + 03 3.48E + 03 3.17E + 03 3.50E + 03 3.49E + 03 3.35E + 03 3.82E + 03 3.55E + 03

Std 2.76E + 02 2.66E + 02 2.33E + 02 1.90E + 02 3.65E + 02 5.03E + 02 2.91E + 02 3.46E + 02


F21 Ave 2.53E + 03 2.52E + 03 2.69E + 03 2.74E + 03 2.81E + 03 2.57E + 03 3.09E + 03 2.96E + 03
Std 3.60E + 01 2.95E + 01 2.91E + 01 1.97E + 01 8.45E + 01 7.51E + 01 1.14E + 02 1.18E + 02
F22 Ave 1.09E + 04 1.03E + 04 1.30E + 04 1.36E + 04 1.07E + 04 1.01E + 04 1.47E + 04 1.07E + 04
Std 2.23E + 03 3.32E + 03 1.00E + 03 2.36E + 03 9.46E + 02 1.98E + 03 1.05E + 03 7.98E + 02
F23 Ave 3.03E + 03 3.01E + 03 3.18E + 03 3.18E + 03 3.43E + 03 3.06E + 03 3.89E + 03 3.62E + 03
Std 6.10E + 01 3.55E + 01 3.29E + 01 2.64E + 01 1.44E + 02 8.50E + 01 1.60E + 02 1.27E + 02
F24 Ave 3.16E + 03 3.17E + 03 3.35E + 03 3.41E + 03 3.64E + 03 3.21E + 03 3.92E + 03 3.97E + 03
Std 4.29E + 01 5.71E + 01 4.71E + 01 2.45E + 01 1.44E + 02 1.10E + 02 1.78E + 02 1.34E + 02
F25 Ave 3.14E + 03 3.15E + 03 3.37E + 03 3.74E + 03 3.27E + 03 4.08E + 03 5.24E + 03 4.47E + 03
Std 4.83E + 01 6.02E + 01 8.99E + 01 1.08E + 02 6.37E + 01 4.45E + 02 6.69E + 02 7.02E + 02
F26 Ave 6.67E + 03 6.21E + 03 7.04E + 03 8.42E + 03 1.07E + 04 7.04E + 03 1.54E + 04 1.26E + 04
Table 5  (continued)

Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

Std 6.77E + 02 6.03E + 02 2.27E + 03 2.44E + 02 1.91E + 03 6.38E + 02 1.58E + 03 1.48E + 03
F27 Ave 3.57E + 03 3.55E + 03 3.71E + 03 3.88E + 03 3.90E + 03 3.74E + 03 5.03E + 03 4.79E + 03

Std 1.02E + 02 1.24E + 02 7.85E + 01 5.90E + 01 1.98E + 02 1.16E + 02 7.48E + 02 5.14E + 02


F28 Ave 3.47E + 03 3.50E + 03 3.80E + 03 4.49E + 03 3.93E + 03 4.86E + 03 6.24E + 03 4.37E + 03
Std 5.79E + 01 8.77E + 01 1.05E + 02 2.42E + 02 1.92E + 02 4.38E + 02 5.64E + 02 4.40E + 02
F29 Ave 4.47E + 03 4.27E + 03 5.04E + 03 5.26E + 03 5.53E + 03 4.90E + 03 9.25E + 03 7.06E + 03
Std 3.11E + 02 3.04E + 02 2.38E + 02 2.32E + 02 6.79E + 02 4.12E + 02 1.50E + 03 1.18E + 03
F30 Ave 6.36E + 06 3.37E + 06 2.36E + 07 4.53E + 07 2.97E + 07 1.67E + 08 3.32E + 08 1.43E + 08
Std 2.82E + 06 1.46E + 06 7.17E + 06 1.17E + 07 1.51E + 07 7.52E + 07 1.18E + 08 4.44E + 07
(W|T|L) (25 |2 |3) (23 |1 |6) (29 |0 |1) (29 |1 |0) (30 |0 |0) (30 |0 |0) (30 |0 |0) (30 |0 |0)
Function SO GTO COA RIME GJO DBO NOA SBOA

F1 Ave 7.62E + 08 1.94E + 08 1.05E + 10 4.31E + 07 3.82E + 10 6.41E + 09 5.45E + 08 3.82E + 07


Std 4.60E + 08 1.78E + 08 5.73E + 09 1.31E + 07 8.55E + 09 9.48E + 09 2.85E + 08 8.80E + 07
F2 Ave 2.28E + 56 4.81E + 54 2.57E + 60 2.03E + 42 8.87E + 62 4.55E + 69 7.88E + 47 1.83E + 33
Std 1.23E + 57 2.63E + 55 1.41E + 61 1.10E + 43 2.51E + 63 2.49E + 70 3.79E + 48 2.41E + 41
F3 Ave 1.69E + 05 7.54E + 04 3.32E + 05 2.15E + 05 1.41E + 05 2.68E + 05 1.42E + 05 1.03E + 05
Std 1.81E + 04 1.56E + 04 4.93E + 04 4.09E + 04 1.87E + 04 7.46E + 04 3.70E + 04 2.12E + 04
F4 Ave 8.76E + 02 7.45E + 02 2.00E + 03 7.10E + 02 5.04E + 03 1.49E + 03 7.87E + 02 6.39E + 02
Std 1.49E + 02 1.02E + 02 7.99E + 02 7.30E + 01 1.95E + 03 1.20E + 03 9.48E + 01 6.93E + 01
F5 Ave 7.10E + 02 8.47E + 02 8.92E + 02 7.58E + 02 9.39E + 02 9.72E + 02 9.02E + 02 7.15E + 02
Std 2.37E + 01 3.94E + 01 3.17E + 01 4.76E + 01 6.95E + 01 1.09E + 02 6.75E + 01 3.89E + 01
F6 Ave 6.29E + 02 6.59E + 02 6.68E + 02 6.30E + 02 6.56E + 02 6.66E + 02 6.60E + 02 6.12E + 02
Std 7.27E + 00 7.10E + 00 4.69E + 00 6.20E + 00 9.67E + 00 1.16E + 01 1.11E + 01 4.91E + 00
F7 Ave 1.19E + 03 1.52E + 03 1.76E + 03 1.12E + 03 1.43E + 03 1.40E + 03 1.41E + 03 1.11E + 03
Std 6.43E + 01 9.67E + 01 8.63E + 01 7.09E + 01 9.36E + 01 1.69E + 02 1.21E + 02 7.55E + 01
Y. Fu et al.
F8 Ave 1.02E + 03 1.13E + 03 1.24E + 03 1.06E + 03 1.23E + 03 1.28E + 03 1.19E + 03 1.01E + 03


Std 2.36E + 01 5.37E + 01 2.51E + 01 4.84E + 01 6.50E + 01 1.06E + 02 5.99E + 01 3.91E + 01
F9 Ave 6.82E + 03 1.22E + 04 3.17E + 04 1.14E + 04 2.38E + 04 2.94E + 04 1.60E + 04 1.84E + 03
Std 2.62E + 03 1.65E + 03 4.73E + 03 4.06E + 03 4.70E + 03 7.94E + 03 3.69E + 03 2.36E + 03
F10 Ave 9.81E + 03 9.61E + 03 1.37E + 04 8.06E + 03 1.12E + 04 1.18E + 04 8.70E + 03 7.33E + 03
Std 2.78E + 03 1.77E + 03 8.02E + 02 9.93E + 02 2.20E + 03 2.33E + 03 9.74E + 02 7.75E + 02
F11 Ave 3.27E + 03 1.47E + 03 6.08E + 03 1.77E + 03 1.03E + 04 5.20E + 03 1.69E + 03 1.45E + 03
Std 1.09E + 03 8.87E + 01 2.88E + 03 1.51E + 02 2.78E + 03 4.40E + 03 1.52E + 02 1.28E + 02
F12 Ave 5.56E + 07 2.45E + 07 4.12E + 08 1.90E + 08 9.82E + 09 8.57E + 08 6.61E + 07 1.23E + 07
Std 4.86E + 07 3.01E + 07 6.85E + 08 9.78E + 07 5.08E + 09 5.25E + 08 4.11E + 07 1.16E + 07
F13 Ave 3.22E + 05 3.42E + 04 2.55E + 06 4.73E + 05 2.87E + 09 1.30E + 08 5.40E + 04 1.48E + 04
Std 3.02E + 05 1.86E + 04 4.37E + 06 2.20E + 05 3.44E + 09 2.02E + 08 2.92E + 04 1.32E + 04
F14 Ave 7.08E + 05 1.23E + 05 1.65E + 06 7.73E + 05 2.16E + 06 3.01E + 06 7.32E + 04 2.10E + 05
Std 7.19E + 05 9.86E + 04 2.08E + 06 4.61E + 05 2.36E + 06 3.55E + 06 4.70E + 04 1.35E + 05


F15 Ave 3.56E + 04 1.80E + 04 8.92E + 04 1.04E + 05 2.58E + 08 2.13E + 07 1.42E + 04 1.06E + 04
Std 2.63E + 04 9.04E + 03 4.27E + 04 5.16E + 04 4.17E + 08 6.63E + 07 7.69E + 03 8.18E + 03
F16 Ave 3.34E + 03 3.79E + 03 3.94E + 03 3.58E + 03 4.24E + 03 4.87E + 03 3.73E + 03 2.95E + 03
Std 3.91E + 02 6.16E + 02 6.33E + 02 5.01E + 02 6.43E + 02 6.51E + 02 3.66E + 02 4.38E + 02
F17 Ave 3.27E + 03 3.43E + 03 3.47E + 03 3.44E + 03 3.68E + 03 4.25E + 03 3.38E + 03 2.71E + 03
Std 2.84E + 02 4.38E + 02 4.26E + 02 2.84E + 02 5.32E + 02 5.02E + 02 3.41E + 02 2.66E + 02
F18 Ave 4.75E + 06 8.72E + 05 9.49E + 06 5.62E + 06 1.74E + 07 1.01E + 07 8.93E + 05 2.28E + 06
Std 3.99E + 06 7.36E + 05 9.26E + 06 3.88E + 06 1.86E + 07 9.63E + 06 5.61E + 05 1.94E + 06
F19 Ave 9.46E + 04 1.66E + 04 4.63E + 05 3.38E + 05 2.59E + 08 6.46E + 06 1.67E + 04 1.82E + 04
Std 1.12E + 05 1.16E + 04 8.33E + 05 2.85E + 05 3.80E + 08 7.61E + 06 1.23E + 04 1.38E + 04
F20 Ave 3.22E + 03 3.33E + 03 3.74E + 03 3.29E + 03 3.61E + 03 3.80E + 03 3.35E + 03 2.89E + 03
Std 3.24E + 02 2.87E + 02 2.40E + 02 2.97E + 02 4.10E + 02 3.57E + 02 3.69E + 02 3.54E + 02
F21 Ave 2.51E + 03 2.64E + 03 2.70E + 03 2.57E + 03 2.74E + 03 2.88E + 03 2.79E + 03 2.48E + 03
Std 3.99E + 01 5.68E + 01 8.11E + 01 5.08E + 01 7.94E + 01 1.03E + 02 1.11E + 02 4.11E + 01


F22 Ave 1.13E + 04 1.13E + 04 1.36E + 04 9.66E + 03 1.35E + 04 1.33E + 04 1.08E + 04 7.66E + 03
Std 2.26E + 03 1.75E + 03 2.54E + 03 9.34E + 02 2.31E + 03 2.28E + 03 8.72E + 02 3.07E + 03
F23 Ave 3.09E + 03 3.29E + 03 3.32E + 03 3.07E + 03 3.36E + 03 3.59E + 03 3.42E + 03 2.94E + 03
Std 6.33E + 01 1.35E + 02 1.26E + 02 7.04E + 01 1.00E + 02 1.77E + 02 2.01E + 02 5.82E + 01
F24 Ave 3.23E + 03 3.39E + 03 3.39E + 03 3.18E + 03 3.50E + 03 3.75E + 03 3.58E + 03 3.10E + 03
Std 6.74E + 01 1.48E + 02 1.02E + 02 7.01E + 01 8.20E + 01 1.83E + 02 2.62E + 02 5.04E + 01
F25 Ave 3.36E + 03 3.22E + 03 3.98E + 03 3.14E + 03 6.00E + 03 3.57E + 03 3.26E + 03 3.10E + 03
Std 1.33E + 02 5.70E + 01 4.26E + 02 3.97E + 01 7.79E + 02 6.46E + 02 6.30E + 01 4.37E + 01
F26 Ave 7.94E + 03 9.81E + 03 1.19E + 04 6.91E + 03 1.02E + 04 1.06E + 04 1.04E + 04 6.04E + 03
Std 8.08E + 02 2.18E + 03 1.91E + 03 6.85E + 02 9.77E + 02 1.41E + 03 1.82E + 03 1.42E + 03
F27 Ave 3.81E + 03 3.94E + 03 3.86E + 03 3.62E + 03 4.14E + 03 3.90E + 03 3.76E + 03 3.38E + 03
Std 1.31E + 02 2.92E + 02 2.11E + 02 9.98E + 01 2.14E + 02 2.39E + 02 2.12E + 02 6.97E + 01
F28 Ave 4.21E + 03 3.66E + 03 4.47E + 03 3.41E + 03 6.26E + 03 6.09E + 03 3.69E + 03 3.46E + 03
Std 4.49E + 02 1.24E + 02 3.32E + 02 5.10E + 01 8.19E + 02 2.14E + 03 1.22E + 02 5.66E + 01
F29 Ave 5.01E + 03 5.45E + 03 5.65E + 03 5.26E + 03 6.27E + 03 6.17E + 03 5.35E + 03 4.26E + 03
Std 3.46E + 02 8.76E + 02 6.36E + 02 3.21E + 02 7.17E + 02 6.00E + 02 3.81E + 02 4.03E + 02
F30 Ave 1.59E + 07 2.45E + 06 3.44E + 07 6.15E + 07 4.69E + 08 3.81E + 07 4.61E + 06 1.22E + 06
Std 6.77E + 06 9.62E + 05 1.98E + 07 2.25E + 07 2.70E + 08 3.37E + 07 1.72E + 06 3.54E + 05
(W|T|L) (29 |1 |0) (26 |0 |4) (30 |0 |0) (29 |0 |1) (30 |0 |0) (30 |0 |0) (27 |0 |3)
Table 6  The experimental results of 16 algorithms in CEC-2017 (Dim = 100)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 Ave 7.28E + 09 9.19E + 09 4.52E + 10 4.25E + 10 1.18E + 10 5.57E + 10 1.07E + 11 7.33E + 10


Std 2.02E + 09 2.87E + 09 8.15E + 09 3.95E + 09 3.83E + 09 9.88E + 09 1.02E + 10 2.15E + 10
F2 Ave 1.00E + 30 1.00E + 30 1.00E + 30 2.72E + 146 6.94E + 139 1.33E + 131 3.32E + 182 5.64E + 168
Std 1.43E + 14 1.43E + 14 1.43E + 14 1.40E + 147 1.97E + 140 4.03E + 131 NaN NaN
F3 Ave 3.93E + 05 4.18E + 05 4.83E + 05 7.09E + 05 4.41E + 05 5.44E + 05 9.04E + 05 7.93E + 05
Std 8.13E + 04 1.37E + 05 4.79E + 04 4.42E + 04 1.68E + 05 8.13E + 04 1.02E + 05 6.48E + 04
F4 Ave 1.76E + 03 1.88E + 03 4.77E + 03 8.34E + 03 2.53E + 03 6.04E + 03 2.02E + 04 1.71E + 04
Std 3.29E + 02 4.59E + 02 8.99E + 02 8.32E + 02 3.93E + 02 1.76E + 03 3.22E + 03 5.88E + 03
F5 Ave 1.30E + 03 1.28E + 03 1.68E + 03 1.75E + 03 1.41E + 03 1.23E + 03 1.98E + 03 1.82E + 03
Std 1.10E + 02 8.49E + 01 4.31E + 01 3.49E + 01 7.82E + 01 8.43E + 01 1.18E + 02 1.26E + 02
F6 Ave 6.42E + 02 6.28E + 02 6.71E + 02 6.48E + 02 6.68E + 02 6.47E + 02 7.09E + 02 6.87E + 02
Std 7.16E + 00 3.27E + 00 4.35E + 00 2.00E + 00 3.74E + 00 4.79E + 00 1.15E + 01 6.22E + 00
F7 Ave 2.54E + 03 2.23E + 03 3.05E + 03 2.87E + 03 3.20E + 03 2.20E + 03 3.83E + 03 6.88E + 03
Std 2.43E + 02 2.41E + 02 1.27E + 02 9.33E + 01 1.60E + 02 1.66E + 02 1.95E + 02 5.20E + 02


F8 Ave 1.60E + 03 1.61E + 03 2.07E + 03 2.05E + 03 1.83E + 03 1.56E + 03 2.37E + 03 2.15E + 03
Std 9.74E + 01 1.13E + 02 5.16E + 01 4.53E + 01 9.28E + 01 6.92E + 01 1.26E + 02 1.07E + 02
F9 Ave 3.10E + 04 3.37E + 04 5.88E + 04 8.05E + 04 3.05E + 04 4.43E + 04 7.74E + 04 4.76E + 04
Std 7.51E + 03 9.36E + 03 4.96E + 03 8.88E + 03 3.88E + 03 1.28E + 04 1.74E + 04 5.25E + 03
F10 Ave 2.33E + 04 2.42E + 04 2.70E + 04 3.02E + 04 1.80E + 04 2.09E + 04 2.94E + 04 1.76E + 04
Std 2.25E + 03 3.32E + 03 8.53E + 02 6.51E + 02 1.46E + 03 5.88E + 03 1.44E + 03 1.51E + 03
F11 Ave 5.00E + 04 5.95E + 04 1.06E + 05 1.27E + 05 1.04E + 05 9.21E + 04 2.91E + 05 2.17E + 05
Std 3.42E + 04 4.03E + 04 1.34E + 04 1.68E + 04 2.81E + 04 2.43E + 04 1.16E + 05 5.60E + 04
F12 Ave 7.33E + 08 9.49E + 08 2.77E + 09 7.87E + 09 1.11E + 09 1.03E + 10 2.95E + 10 1.55E + 10
Std 3.26E + 08 4.36E + 08 1.22E + 09 9.34E + 08 5.11E + 08 4.91E + 09 5.54E + 09 6.80E + 09
F13 Ave 2.57E + 05 7.50E + 05 1.51E + 07 8.87E + 07 5.47E + 05 1.61E + 09 2.92E + 09 8.84E + 08
Std 1.00E + 06 1.11E + 06 1.83E + 07 3.01E + 07 1.83E + 06 1.64E + 09 1.21E + 09 8.83E + 08
F14 Ave 5.81E + 05 8.58E + 05 7.97E + 06 2.91E + 07 8.83E + 06 9.78E + 06 2.23E + 07 6.51E + 06
Std 3.55E + 05 4.74E + 05 1.86E + 06 6.60E + 06 3.34E + 06 4.56E + 06 9.20E + 06 2.97E + 06
F15 Ave 5.69E + 04 2.70E + 04 1.01E + 05 1.97E + 07 1.04E + 05 3.58E + 08 5.09E + 08 2.21E + 07
Std 2.02E + 04 1.23E + 04 9.05E + 04 8.92E + 06 1.96E + 05 5.50E + 08 2.06E + 08 3.94E + 07
F16 Ave 6.83E + 03 7.29E + 03 9.55E + 03 1.12E + 04 8.17E + 03 6.77E + 03 1.75E + 04 8.61E + 03
Std 9.36E + 02 8.84E + 02 7.56E + 02 4.70E + 02 9.45E + 02 8.00E + 02 2.92E + 03 6.93E + 02
F17 Ave 5.21E + 03 5.67E + 03 6.09E + 03 7.95E + 03 6.32E + 03 5.60E + 03 3.20E + 04 7.09E + 03
Std 5.22E + 02 7.85E + 02 3.82E + 02 4.19E + 02 6.92E + 02 5.51E + 02 2.80E + 04 8.40E + 02
F18 Ave 1.08E + 06 9.47E + 05 7.14E + 06 4.23E + 07 7.40E + 06 1.06E + 07 1.90E + 07 7.84E + 06
Std 6.87E + 05 3.64E + 05 1.85E + 06 1.24E + 07 3.82E + 06 6.15E + 06 1.12E + 07 3.93E + 06
F19 Ave 3.72E + 05 1.70E + 05 2.50E + 05 2.91E + 07 2.84E + 06 2.04E + 08 5.06E + 08 6.81E + 07
Std 2.71E + 05 2.54E + 05 1.35E + 05 1.50E + 07 2.08E + 06 2.29E + 08 1.93E + 08 1.31E + 08
F20 Ave 6.12E + 03 6.68E + 03 6.25E + 03 6.91E + 03 6.11E + 03 5.46E + 03 7.14E + 03 6.10E + 03
Std 7.27E + 02 3.72E + 02 3.82E + 02 3.54E + 02 6.60E + 02 1.10E + 03 5.98E + 02 6.19E + 02
F21 Ave 3.15E + 03 3.13E + 03 3.49E + 03 3.59E + 03 3.67E + 03 3.15E + 03 4.46E + 03 4.23E + 03
Std 1.20E + 02 1.13E + 02 8.85E + 01 3.89E + 01 1.96E + 02 1.74E + 02 2.02E + 02 2.10E + 02
F22 Ave 2.62E + 04 2.61E + 04 3.00E + 04 3.27E + 04 2.21E + 04 2.16E + 04 3.22E + 04 1.96E + 04
Std 2.78E + 03 2.79E + 03 8.01E + 02 4.04E + 02 1.64E + 03 3.11E + 03 1.35E + 03 1.63E + 03
F23 Ave 3.80E + 03 3.64E + 03 3.98E + 03 3.99E + 03 4.43E + 03 3.75E + 03 5.32E + 03 5.19E + 03
Std 1.37E + 02 6.84E + 01 8.71E + 01 3.57E + 01 3.38E + 02 6.94E + 01 2.24E + 02 3.65E + 02
F24 Ave 4.56E + 03 4.45E + 03 4.65E + 03 4.67E + 03 5.35E + 03 4.47E + 03 6.87E + 03 6.68E + 03
Std 1.86E + 02 1.83E + 02 1.13E + 02 6.18E + 01 2.50E + 02 1.49E + 02 4.14E + 02 4.23E + 02
F25 Ave 4.35E + 03 4.36E + 03 6.61E + 03 1.10E + 04 4.93E + 03 7.23E + 03 1.08E + 04 1.06E + 04
Std 2.70E + 02 2.51E + 02 6.35E + 02 7.68E + 02 4.33E + 02 8.41E + 02 1.23E + 03 2.06E + 03
F26 Ave 1.78E + 04 1.78E + 04 2.51E + 04 1.99E + 04 2.73E + 04 1.73E + 04 3.84E + 04 3.37E + 04
Std 1.54E + 03 2.33E + 03 2.06E + 03 5.55E + 02 2.55E + 03 1.79E + 03 3.44E + 03 2.73E + 03


F27 Ave 4.19E + 03 3.96E + 03 4.38E + 03 5.02E + 03 4.38E + 03 4.27E + 03 6.33E + 03 6.43E + 03
Std 2.18E + 02 1.94E + 02 1.27E + 02 1.49E + 02 3.12E + 02 1.77E + 02 8.42E + 02 8.75E + 02
F28 Ave 5.51E + 03 5.75E + 03 8.73E + 03 1.40E + 04 6.15E + 03 1.04E + 04 1.44E + 04 1.01E + 04
Std 8.03E + 02 9.62E + 02 8.39E + 02 1.05E + 03 7.30E + 02 1.92E + 03 8.96E + 02 1.80E + 03
F29 Ave 8.27E + 03 8.13E + 03 1.08E + 04 1.10E + 04 9.86E + 03 9.44E + 03 2.13E + 04 1.38E + 04
Std 7.68E + 02 5.50E + 02 4.70E + 02 4.50E + 02 9.11E + 02 8.51E + 02 5.36E + 03 2.06E + 03
F30 Ave 1.73E + 07 7.37E + 06 6.87E + 07 6.10E + 07 5.77E + 07 1.56E + 09 2.86E + 09 1.08E + 09
Std 1.18E + 07 3.68E + 06 5.54E + 07 1.38E + 07 3.50E + 07 1.23E + 09 1.16E + 09 7.32E + 08
(W|T|L) (25 |1 |4) (24 |0 |6) (29 |0 |1) (29 |1 |0) (29 |0 |1) (28 |1 |1) (30 |0 |0) (29 |0 |1)
Function SO GTO COA RIME GJO DBO NOA SBOA

F1 Ave 1.44E + 10 1.84E + 10 7.14E + 10 9.45E + 08 1.37E + 11 1.11E + 11 1.75E + 10 1.22E + 10


Std 3.23E + 09 9.52E + 09 1.00E + 10 2.57E + 08 1.13E + 10 8.01E + 10 3.45E + 09 7.25E + 09


F2 Ave 3.61E + 136 5.16E + 148 5.59E + 146 9.83E + 120 5.83E + 150 6.83E + 160 9.57E + 130 9.29E + 115
Std 1.77E + 137 2.82E + 149 3.06E + 147 4.08E + 121 3.19E + 151 NaN 3.63E + 131 5.08E + 116
F3 Ave 3.72E + 05 2.53E + 05 7.63E + 05 7.07E + 05 4.00E + 05 6.20E + 05 3.74E + 05 3.37E + 05
Std 2.63E + 04 2.05E + 04 1.02E + 05 1.23E + 05 5.65E + 04 2.31E + 05 7.99E + 04 1.95E + 04
F4 Ave 2.97E + 03 2.61E + 03 9.47E + 03 1.25E + 03 2.11E + 04 1.72E + 04 2.79E + 03 1.21E + 03
Std 6.00E + 02 6.03E + 02 3.34E + 03 1.16E + 02 4.91E + 03 1.43E + 04 4.25E + 02 3.46E + 02
F5 Ave 1.18E + 03 1.39E + 03 1.52E + 03 1.32E + 03 1.58E + 03 1.79E + 03 1.63E + 03 1.10E + 03
Std 6.42E + 01 5.40E + 01 5.49E + 01 9.46E + 01 1.06E + 02 1.83E + 02 1.16E + 02 7.18E + 01
F6 Ave 6.47E + 02 6.67E + 02 6.75E + 02 6.56E + 02 6.72E + 02 6.80E + 02 6.79E + 02 6.41E + 02
Std 4.17E + 00 3.99E + 00 3.32E + 00 6.15E + 00 4.24E + 00 8.60E + 00 5.47E + 00 6.26E + 00
F7 Ave 2.24E + 03 3.08E + 03 3.42E + 03 2.16E + 03 3.02E + 03 3.03E + 03 3.21E + 03 2.39E + 03
Std 1.20E + 02 1.68E + 02 1.06E + 02 1.97E + 02 1.93E + 02 3.00E + 02 2.28E + 02 1.51E + 02
F8 Ave 1.51E + 03 1.82E + 03 2.03E + 03 1.61E + 03 1.94E + 03 2.20E + 03 2.02E + 03 1.49E + 03
Std 7.22E + 01 6.05E + 01 6.82E + 01 7.35E + 01 8.29E + 01 2.04E + 02 1.07E + 02 7.89E + 01
F9 Ave 2.77E + 04 2.89E + 04 5.08E + 04 4.73E + 04 6.15E + 04 7.66E + 04 4.17E + 04 2.73E + 04


Std 5.86E + 03 2.44E + 03 1.12E + 04 1.67E + 04 1.11E + 04 1.05E + 04 5.42E + 03 3.08E + 03
F10 Ave 3.02E + 04 1.90E + 04 2.54E + 04 1.93E + 04 2.70E + 04 2.80E + 04 2.09E + 04 1.59E + 04
Std 2.89E + 03 3.16E + 03 3.58E + 03 1.38E + 03 5.11E + 03 4.89E + 03 1.72E + 03 1.75E + 03
F11 Ave 1.40E + 05 3.40E + 04 3.49E + 05 4.22E + 04 1.04E + 05 2.32E + 05 7.16E + 04 2.28E + 04
Std 2.51E + 04 9.99E + 03 7.93E + 04 1.71E + 04 1.89E + 04 6.52E + 04 2.71E + 04 9.14E + 03
F12 Ave 1.82E + 09 7.72E + 08 1.34E + 10 1.23E + 09 5.09E + 10 7.25E + 09 1.59E + 09 3.49E + 08
Std 6.82E + 08 2.89E + 08 7.28E + 09 5.13E + 08 1.50E + 10 1.82E + 09 4.73E + 08 3.60E + 08
F13 Ave 2.94E + 06 1.83E + 05 9.96E + 08 2.32E + 06 8.62E + 09 2.74E + 08 9.20E + 06 5.68E + 04
Std 2.19E + 06 1.96E + 05 1.53E + 09 1.03E + 06 3.08E + 09 2.02E + 08 2.11E + 07 1.02E + 06
F14 Ave 7.25E + 06 1.29E + 06 1.21E + 07 6.72E + 06 1.61E + 07 1.63E + 07 1.50E + 06 3.07E + 06
Std 4.33E + 06 7.09E + 05 7.04E + 06 3.29E + 06 6.56E + 06 1.14E + 07 9.39E + 05 1.50E + 06
F15 Ave 5.33E + 05 2.96E + 04 4.76E + 07 4.57E + 05 3.19E + 09 6.64E + 07 1.19E + 05 7.78E + 03
Std 5.39E + 05 1.93E + 04 2.28E + 08 2.68E + 05 2.91E + 09 9.90E + 07 8.46E + 04 3.98E + 03
F16 Ave 7.15E + 03 6.90E + 03 9.48E + 03 7.56E + 03 9.57E + 03 9.33E + 03 7.63E + 03 5.70E + 03
Std 1.29E + 03 6.79E + 02 1.84E + 03 7.76E + 02 7.57E + 02 1.49E + 03 7.51E + 02 8.62E + 02
F17 Ave 5.73E + 03 6.13E + 03 7.40E + 03 6.03E + 03 1.87E + 04 9.58E + 03 5.78E + 03 4.97E + 03
Std 4.76E + 02 7.47E + 02 1.51E + 03 6.27E + 02 1.64E + 04 1.79E + 03 5.92E + 02 6.07E + 02
F18 Ave 1.12E + 07 2.11E + 06 9.22E + 06 9.86E + 06 2.05E + 07 2.08E + 07 2.03E + 06 2.84E + 06
Std 4.51E + 06 9.95E + 05 7.11E + 06 4.94E + 06 1.12E + 07 1.44E + 07 7.04E + 05 2.59E + 06
F19 Ave 2.75E + 06 1.29E + 05 2.75E + 07 1.64E + 07 2.54E + 09 9.37E + 07 1.11E + 06 6.72E + 03
Std 1.77E + 06 1.46E + 05 4.54E + 07 1.33E + 07 1.72E + 09 9.68E + 07 9.40E + 05 4.24E + 03
F20 Ave 7.32E + 03 5.44E + 03 7.08E + 03 5.87E + 03 6.27E + 03 7.06E + 03 6.13E + 03 4.89E + 03
Std 3.02E + 02 6.34E + 02 4.10E + 02 4.75E + 02 9.58E + 02 6.99E + 02 5.13E + 02 5.07E + 02


F21 Ave 3.09E + 03 3.46E + 03 3.76E + 03 3.17E + 03 3.57E + 03 4.02E + 03 3.73E + 03 2.99E + 03
Std 7.29E + 01 1.67E + 02 1.68E + 02 1.13E + 02 1.23E + 02 1.93E + 02 2.82E + 02 7.55E + 01
F22 Ave 3.23E + 04 2.32E + 04 3.01E + 04 2.19E + 04 2.78E + 04 2.94E + 04 2.34E + 04 1.98E + 04
Std 2.36E + 03 2.91E + 03 2.62E + 03 1.50E + 03 4.20E + 03 4.52E + 03 2.07E + 03 3.43E + 03
F23 Ave 3.74E + 03 4.19E + 03 4.26E + 03 3.75E + 03 4.45E + 03 4.81E + 03 4.16E + 03 3.39E + 03
Std 1.04E + 02 2.03E + 02 1.64E + 02 1.41E + 02 1.44E + 02 2.65E + 02 3.31E + 02 7.97E + 01
F24 Ave 4.79E + 03 5.23E + 03 5.33E + 03 4.22E + 03 5.98E + 03 6.05E + 03 4.80E + 03 4.07E + 03
Std 1.91E + 02 5.52E + 02 4.54E + 02 1.68E + 02 2.44E + 02 5.32E + 02 2.06E + 02 1.56E + 02
F25 Ave 5.55E + 03 4.73E + 03 8.17E + 03 3.99E + 03 1.28E + 04 7.83E + 03 5.04E + 03 4.04E + 03
Std 4.43E + 02 3.84E + 02 1.16E + 03 1.54E + 02 1.90E + 03 3.53E + 03 3.69E + 02 4.09E + 02
F26 Ave 1.94E + 04 2.60E + 04 3.17E + 04 1.59E + 04 2.81E + 04 2.61E + 04 2.51E + 04 1.45E + 04
Std 1.84E + 03 4.11E + 03 3.18E + 03 1.95E + 03 2.38E + 03 3.20E + 03 4.63E + 03 3.52E + 03
F27 Ave 4.35E + 03 4.37E + 03 4.47E + 03 4.02E + 03 5.84E + 03 4.75E + 03 4.26E + 03 3.71E + 03
Std 1.63E + 02 3.84E + 02 3.27E + 02 1.68E + 02 4.62E + 02 5.08E + 02 2.81E + 02 1.05E + 02
F28 Ave 1.02E + 04 5.37E + 03 1.09E + 04 4.24E + 03 1.60E + 04 1.85E + 04 6.19E + 03 4.85E + 03
Std 1.48E + 03 5.43E + 02 1.33E + 03 3.08E + 02 1.61E + 03 6.07E + 03 6.78E + 02 5.10E + 02
F29 Ave 9.29E + 03 8.95E + 03 1.18E + 04 9.57E + 03 1.77E + 04 1.20E + 04 1.00E + 04 7.26E + 03
Std 7.91E + 02 9.47E + 02 1.72E + 03 6.09E + 02 1.03E + 04 2.36E + 03 7.46E + 02 6.63E + 02
F30 Ave 1.85E + 07 7.36E + 06 5.91E + 08 1.76E + 08 7.98E + 09 3.02E + 08 4.77E + 07 9.40E + 05
Std 9.10E + 06 3.75E + 06 7.39E + 08 9.30E + 07 3.24E + 09 1.36E + 08 2.28E + 07 4.78E + 05
(W|T|L) (28 |1 |1) (27 |0 |3) (30 |0 |0) (26 |0 |4) (30 |0 |0) (30 |0 |0) (28 |0 |2)
Fig. 9  CEC-2017 test function convergence curve (Dim = 30)

4.3.1 Wilcoxon rank-sum test

To comprehensively demonstrate the superiority of the proposed algorithm, in this section we use the Wilcoxon rank-sum test to assess whether the results of each run of SBOA differ significantly from those of the other algorithms at a significance level of P = 5% (Dao 2022). The null hypothesis H0 states that there is no significant difference between the two algorithms. If P < 5%, we reject the null hypothesis, indicating a significant difference between the two algorithms; if P > 5%, we fail to reject it, meaning the two algorithms perform similarly. A "NaN" value indicates that the two sets of results are essentially identical, so the test cannot distinguish them. Tables 10, 11 and 12 present the test results of SBOA and the comparison algorithms on CEC-2017 for dimensions 30, 50, and 100, respectively.
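As a concrete illustration of this decision rule, the sketch below runs a Wilcoxon rank-sum test on two synthetic samples of 30 best-fitness values (the data is illustrative only, not the paper's actual run results; `scipy.stats.ranksums` is one standard implementation of the test):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
# Synthetic best-fitness values from 30 independent runs of two algorithms
# (illustrative only; not the paper's actual data).
sboa_runs = rng.normal(loc=2.95e3, scale=4.4e2, size=30)
rival_runs = rng.normal(loc=3.34e3, scale=3.9e2, size=30)

stat, p = ranksums(sboa_runs, rival_runs)
alpha = 0.05  # significance level P = 5%
if p < alpha:
    verdict = "reject H0: significant difference"
else:
    verdict = "fail to reject H0: algorithms perform similarly"
print(f"p = {p:.4g} -> {verdict}")
```

The same per-pair comparison, repeated over every benchmark function, yields the P-value tables discussed here.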


Fig. 10  CEC-2017 test function convergence curve (Dim = 50)

The results for CEC-2022 can be found in Tables 13 and 14. To highlight the comparison results, P-values exceeding 0.05 are shown in bold.
The absence of "NaN" values in the CEC-2017 results and their near absence in the CEC-2022 results suggest that SBOA's optimization results are generally distinct from those of the other algorithms. Furthermore, the tables show that DE has no bolded entries in the CEC-2017 results, and the other comparison algorithms have few bolded entries, particularly in the 100-dimensional CEC-2017 results. Thus, SBOA shows significant differences from DE and the other compared metaheuristic algorithms.
In conclusion, based on the analysis results presented in Sects. 4.1 and 4.2, it is evident that SBOA exhibits the best overall performance among the compared metaheuristic algorithms. This underscores the effectiveness of the strategies employed in SBOA, such as the differential evolution strategy, the Levy flight strategy, the dynamic perturbation factor, and other associated components. These findings highlight the competitive advantage of SBOA in solving optimization problems across a range of dimensions and test functions.

Fig. 11  CEC-2017 test function convergence curve (Dim = 100)

Fig. 12  CEC-2017 test function ranking statistics


Fig. 13  Boxplot of SBOA and competitor algorithms in optimization of the CEC-2017 test suite (Dim = 30)

4.3.2 Friedman’s test

By employing the non-parametric Friedman average rank test to rank the experimental
results of the SBOA algorithm and other algorithms on the CEC-2017 and CEC-2022 test
sets, we obtained the rankings presented in Table 15. Clearly, SBOA consistently ranks
first, indicating that our proposed optimizer outperforms the other benchmark algorithms
on the considered test sets.
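The mean-rank computation behind this ranking can be sketched as follows, using a few Ave values for SBOA, GTO, and SO taken from Table 5 (Dim = 50). Here `scipy.stats.friedmanchisquare` supplies the test statistic, and each algorithm's average rank is its per-function rank (1 = best, since these are minimization problems) averaged over functions:

```python
import numpy as np
from scipy.stats import rankdata, friedmanchisquare

# Ave values from Table 5 (Dim = 50): rows = functions, columns = [SBOA, GTO, SO].
results = np.array([
    [3.82e7, 1.94e8, 7.62e8],  # F1
    [6.39e2, 7.45e2, 8.76e2],  # F4
    [7.15e2, 8.47e2, 7.10e2],  # F5
    [6.12e2, 6.59e2, 6.29e2],  # F6
])
algos = ["SBOA", "GTO", "SO"]

# Friedman test over the per-function rankings
stat, p = friedmanchisquare(*results.T)

# Average rank per algorithm (rank 1 = best on a function)
avg_rank = rankdata(results, axis=1).mean(axis=0)
for name, r in sorted(zip(algos, avg_rank), key=lambda t: t[1]):
    print(f"{name}: mean rank {r:.2f}")  # SBOA ranks first here with 1.25
```

On this small excerpt SBOA attains the lowest mean rank, mirroring its first-place ranking over the full test sets in Table 15.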

5 Application of SBOA

5.1 SBOA is used for real‑world engineering optimization problems

After conducting the experiments and analysis in the fourth section, we have confirmed that SBOA exhibits superior optimization performance on the test functions. However, the primary objective of metaheuristic algorithms is to address real-world problems. Therefore, in this section we further validate the effectiveness and applicability of SBOA by applying it to practical problems. To assess the practical applicability and scalability of the proposed algorithm, we applied it to twelve typical real engineering problems (Kumar et al. 2020a, b): three-bar truss design (TBTD), pressure vessel design (PVD), tension/compression spring design (TCPD (case 1)), welded beam design (WBD), weight minimization of speed reducer design (WMSRD), rolling element bearing design (REBD), gear train design (GTD), hydrostatic thrust bearing design (HSTBD), single cone pulley design (SCPD), gas transmission compressor design (GTCD), planetary gear train design (PGTD), and four-stage gear box design (F-sGBD). Furthermore, to demonstrate the superiority of SBOA, we compared its results with the optimization results obtained from the fifteen state-of-the-art algorithms mentioned earlier in this study.

Fig. 14  Boxplot of SBOA and competitor algorithms in optimization of the CEC-2017 test suite (Dim = 50)


Fig. 15  Boxplot of SBOA and competitor algorithms in optimization of the CEC-2017 test suite (Dim = 100)

5.1.1 Three‑bar truss design (TBTD)

The three-bar truss design problem originates from the field of civil engineering. Its objective is to minimize the overall weight of the structure by controlling two parameter variables (the cross-sectional areas A1 and A2). The structure is depicted in Fig. 21, and its mathematical model is described by Eq. (20).


Table 7  CEC-2022 test functions


Type ID Description Range Dimension fmin

Unimodal F1 Shifted and full rotated Zakharov function [− 100,100] 20 300


Multimodal F2 Shifted and full rotated Rosenbrock’s function [− 100,100] 20 400
F3 Shifted and full rotated Rastrigin’s function [− 100,100] 20 600
F4 Shifted and full rotated non-continuous Rastrigin’s function [− 100,100] 20 800
F5 Shifted and full rotated Levy function [− 100,100] 20 900
Hybrid F6 Hybrid function 1 (N = 3) [− 100,100] 20 1800
F7 Hybrid function 2 (N = 6) [− 100,100] 20 2000
F8 Hybrid function 3 (N = 5) [− 100,100] 20 2200
Composition F9 Composition function 1 (N = 5) [− 100,100] 20 2300
F10 Composition function 2 (N = 4) [− 100,100] 20 2400
F11 Composition function 3 (N = 5) [− 100,100] 20 2600
F12 Composition function 4 (N = 6) [− 100,100] 20 2700

$$
\begin{aligned}
&\text{Consider}\quad \vec{x} = \left[x_1\ x_2\right] = \left[A_1\ A_2\right],\\
&\text{Minimize}\quad f(\vec{x}) = \left(2\sqrt{2}\,x_1 + x_2\right) l,\\
&\text{Subject to}\quad g_1(\vec{x}) = \frac{\sqrt{2}\,x_1 + x_2}{\sqrt{2}\,x_1^2 + 2x_1 x_2}\,P - \sigma \le 0,\\
&\phantom{\text{Subject to}\quad} g_2(\vec{x}) = \frac{x_2}{\sqrt{2}\,x_1^2 + 2x_1 x_2}\,P - \sigma \le 0,\\
&\phantom{\text{Subject to}\quad} g_3(\vec{x}) = \frac{1}{\sqrt{2}\,x_2 + x_1}\,P - \sigma \le 0,\\
&\text{Parameter range}\quad 0 \le x_1, x_2 \le 1,\\
&\text{where}\quad l = 100\ \text{cm},\quad P = 2\ \text{kN/cm}^2,\quad \sigma = 2\ \text{kN/cm}^2.
\end{aligned}
\tag{20}
$$

Table 16 shows the optimization results of SBOA and the other comparison algorithms for the three-bar truss design problem. As seen in the table, SBOA, LSHADE_cnEpSin, LSHADE_SPACMA, MadDE, and GTO all achieve the optimal cost of 2.64E + 02, while producing different solutions.
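Equation (20) can be checked directly in code. The sketch below (a hypothetical helper, not the paper's implementation) evaluates the weight and the three stress constraints at a near-optimal design widely reported in the truss literature (x1 ≈ 0.7887, x2 ≈ 0.4082), reproducing the 2.64E + 02 cost:

```python
import math

# Three-bar truss design (Eq. 20): constants from the formulation above.
L_BAR, P, SIGMA = 100.0, 2.0, 2.0  # cm, kN/cm^2, kN/cm^2

def tbtd_objective(x1: float, x2: float) -> float:
    """Structure weight f(x) = (2*sqrt(2)*x1 + x2) * l."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * L_BAR

def tbtd_constraints(x1: float, x2: float) -> list:
    """Stress constraints; g_i(x) <= 0 means the design is feasible."""
    denom = math.sqrt(2.0) * x1 ** 2 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / denom * P - SIGMA
    g2 = x2 / denom * P - SIGMA
    g3 = 1.0 / (math.sqrt(2.0) * x2 + x1) * P - SIGMA
    return [g1, g2, g3]

x = (0.7887, 0.4082)  # near-optimal design from the literature
print(round(tbtd_objective(*x), 2))                  # 263.9, i.e. 2.64E+02
print(all(g <= 1e-6 for g in tbtd_constraints(*x)))  # True (feasible)
```

Any of the compared metaheuristics minimizes `tbtd_objective` subject to these constraints, typically via a penalty on constraint violations.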

5.1.2 Pressure vessel design (PVD)

The pressure vessel design problem features the structure shown in Fig. 22. The design objective is to minimize the manufacturing cost while meeting usage requirements. The four optimization parameters are the shell thickness (Ts), head thickness (Th), inner radius (R), and length of the cylindrical section (L). Equation (21) provides its mathematical model.
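Assuming Eq. (21) follows the formulation commonly used for this problem in the literature (an assumption, since notation varies between papers), the cost function and constraint set can be sketched with x = [Ts, Th, R, L]:

```python
import math

# Pressure vessel design: cost and constraints in the standard form commonly
# used in the literature (assumed here; not copied from the paper's Eq. 21).
def pvd_cost(ts: float, th: float, r: float, l: float) -> float:
    return (0.6224 * ts * r * l + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l + 19.84 * ts ** 2 * r)

def pvd_constraints(ts: float, th: float, r: float, l: float) -> list:
    """g_i(x) <= 0 means feasible (standard constraint set)."""
    g1 = -ts + 0.0193 * r             # minimum shell thickness vs. radius
    g2 = -th + 0.00954 * r            # minimum head thickness vs. radius
    g3 = (-math.pi * r ** 2 * l       # minimum enclosed volume
          - (4.0 / 3.0) * math.pi * r ** 3 + 1_296_000)
    g4 = l - 240.0                    # maximum length
    return [g1, g2, g3, g4]

# A well-known near-optimal design from the literature:
x = (0.8125, 0.4375, 42.0984, 176.6366)
print(f"{pvd_cost(*x):.1f}")  # ~6059.7, the best cost widely reported
```

Here g1 and g2 enforce code-mandated minimum wall thicknesses, g3 the required working volume, and g4 the length limit.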

Table 8  Experimental results of 16 algorithms in CEC-2022 (Dim = 10)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 Ave 3.00E + 02 1.33E + 03 8.90E + 02 6.32E + 03 4.37E + 02 3.89E + 03 2.68E + 04 2.58E + 03


Std 3.06E − 06 2.19E + 03 5.54E + 02 2.20E + 03 1.25E + 02 3.19E + 03 1.02E + 04 2.55E + 03
F2 Ave 4.05E + 02 4.06E + 02 4.02E + 02 4.09E + 02 4.28E + 02 4.26E + 02 4.82E + 02 4.35E + 02
Std 2.47E + 00 3.15E + 00 3.51E + 00 3.75E + 00 3.12E + 01 2.44E + 01 8.36E + 01 3.37E + 01
F3 Ave 6.00E + 02 6.00E + 02 6.00E + 02 6.00E + 02 6.24E + 02 6.02E + 02 6.37E + 02 6.42E + 02
Std 1.86E − 01 1.00E − 02 2.53E − 02 2.52E − 03 1.12E + 01 2.32E + 00 1.36E + 01 1.79E + 01
F4 Ave 8.06E + 02 8.08E + 02 8.14E + 02 8.28E + 02 8.33E + 02 8.16E + 02 8.45E + 02 8.40E + 02
Std 2.54E + 00 3.59E + 00 2.85E + 00 5.25E + 00 9.64E + 00 7.71E + 00 1.76E + 01 1.24E + 01
F5 Ave 9.00E + 02 9.00E + 02 9.04E + 02 9.48E + 02 1.31E + 03 9.18E + 02 1.56E + 03 1.63E + 03
Std 3.06E − 01 4.93E − 01 6.01E + 00 2.82E + 01 1.88E + 02 3.21E + 01 4.12E + 02 3.91E + 02
F6 Ave 1.83E + 03 1.83E + 03 1.96E + 03 9.88E + 03 4.36E + 03 6.41E + 03 5.51E + 03 3.51E + 03
Std 2.28E + 01 2.52E + 01 2.89E + 02 1.36E + 04 2.06E + 03 2.29E + 03 2.79E + 03 1.74E + 03
F7 Ave 2.01E + 03 2.01E + 03 2.01E + 03 2.01E + 03 2.05E + 03 2.03E + 03 2.08E + 03 2.09E + 03
Std 8.62E + 00 8.07E + 00 5.17E + 00 1.68E + 00 2.03E + 01 1.26E + 01 4.05E + 01 4.10E + 01


F8 Ave 2.21E + 03 2.22E + 03 2.22E + 03 2.22E + 03 2.23E + 03 2.23E + 03 2.23E + 03 2.29E + 03
Std 9.34E + 00 7.70E + 00 4.07E + 00 3.87E + 00 1.10E + 01 2.99E + 01 8.30E + 00 6.68E + 01
F9 Ave 2.52E + 03 2.53E + 03 2.53E + 03 2.53E + 03 2.55E + 03 2.58E + 03 2.61E + 03 2.54E + 03
Std 6.69E + 00 1.46E − 13 5.39E − 06 1.60E + 00 4.43E + 01 2.99E + 01 4.41E + 01 3.73E + 01
F10 Ave 2.53E + 03 2.50E + 03 2.52E + 03 2.49E + 03 2.57E + 03 2.56E + 03 2.61E + 03 2.66E + 03
Std 4.61E + 01 1.91E + 01 3.80E + 01 1.84E + 01 6.59E + 01 5.73E + 01 2.16E + 02 3.44E + 02
F11 Ave 2.60E + 03 2.69E + 03 2.60E + 03 2.71E + 03 2.71E + 03 2.78E + 03 2.85E + 03 2.75E + 03
Std 1.39E + 02 1.12E + 02 1.91E − 02 3.10E + 01 9.03E + 01 1.38E + 02 1.57E + 02 1.74E + 02
F12 Ave 2.86E + 03 2.86E + 03 2.86E + 03 2.87E + 03 2.87E + 03 2.88E + 03 2.89E + 03 2.92E + 03
Std 2.73E + 00 1.37E + 00 1.48E + 00 8.93E − 01 2.42E + 01 3.20E + 01 3.73E + 01 8.25E + 01
(W|T|L) (6 |1 |5) (10 |1 |1) (12 |0 |0) (11 |0 |1) (12 |0 |0) (12 |0 |0) (12 |0 |0) (12 |0 |0)

Function SO GTO COA RIME GJO DBO NOA SBOA

F1 Ave 1.06E + 03 3.00E + 02 3.22E + 03 3.01E + 02 2.72E + 03 1.91E + 03 3.00E + 02 3.00E + 02
Std 7.15E + 02 3.49E − 04 2.32E + 03 6.06E − 01 1.93E + 03 2.23E + 03 1.01E + 00 1.17E − 01
F2 Ave 4.09E + 02 4.08E + 02 4.15E + 02 4.17E + 02 4.49E + 02 4.31E + 02 4.09E + 02 4.00E + 02


Std 1.69E + 01 1.31E + 01 2.55E + 01 2.54E + 01 2.48E + 01 3.31E + 01 1.54E + 01 1.25E + 01
F3 Ave 6.01E + 02 6.07E + 02 6.06E + 02 6.00E + 02 6.11E + 02 6.10E + 02 6.00E + 02 6.00E + 02
Std 2.06E + 00 3.70E + 00 1.08E + 01 1.45E-01 9.48E + 00 7.67E + 00 2.62E − 01 2.41E − 03
F4 Ave 8.14E + 02 8.24E + 02 8.31E + 02 8.26E + 02 8.34E + 02 8.31E + 02 8.36E + 02 8.05E + 02
Std 4.66E + 00 8.66E + 00 3.52E + 00 1.01E + 01 1.11E + 01 1.10E + 01 2.08E + 01 4.91E + 00
F5 Ave 9.52E + 02 1.00E + 03 1.08E + 03 9.01E + 02 9.70E + 02 9.83E + 02 1.06E + 03 9.00E + 02
Std 4.59E + 01 8.27E + 01 2.40E + 02 1.13E + 00 5.16E + 01 8.02E + 01 1.78E + 02 1.66E-01
F6 Ave 3.22E + 03 2.08E + 03 4.93E + 03 4.26E + 03 1.37E + 04 5.86E + 03 1.88E + 03 1.94E + 03
Std 1.39E + 03 9.67E + 02 2.11E + 03 1.96E + 03 9.61E + 03 2.23E + 03 2.78E + 02 2.02E + 03
F7 Ave 2.03E + 03 2.03E + 03 2.02E + 03 2.02E + 03 2.06E + 03 2.04E + 03 2.02E + 03 2.00E + 03
Std 2.34E + 01 1.19E + 01 9.64E + 00 1.97E + 01 3.10E + 01 2.70E + 01 2.42E + 01 8.36E + 00
F8 Ave 2.22E + 03 2.22E + 03 2.24E + 03 2.22E + 03 2.23E + 03 2.23E + 03 2.22E + 03 2.22E + 03
Std 1.80E + 00 4.78E + 00 3.65E + 01 4.91E + 00 3.29E + 00 1.13E + 01 7.60E + 00 8.84E + 00
F9 Ave 2.53E + 03 2.53E + 03 2.53E + 03 2.53E + 03 2.59E + 03 2.54E + 03 2.53E + 03 2.53E + 03
Std 2.26E − 01 1.01E + 00 1.42E − 03 1.23E − 03 3.21E + 01 2.29E + 01 5.96E − 09 1.89E − 13
F10 Ave 2.56E + 03 2.53E + 03 2.55E + 03 2.52E + 03 2.60E + 03 2.53E + 03 2.57E + 03 2.50E + 03
Std 1.12E + 02 5.29E + 01 6.18E + 01 4.53E + 01 2.11E + 02 5.74E + 01 8.30E + 01 5.51E + 01
F11 Ave 2.69E + 03 2.64E + 03 2.81E + 03 2.73E + 03 2.91E + 03 2.76E + 03 2.69E + 03 2.60E + 03
Std 1.01E + 02 1.17E + 02 1.37E + 02 1.49E + 02 2.04E + 02 1.75E + 02 1.49E + 02 1.36E + 02
F12 Ave 2.87E + 03 2.87E + 03 2.87E + 03 2.87E + 03 2.88E + 03 2.87E + 03 2.87E + 03 2.86E + 03
Std 1.18E + 01 1.55E + 01 1.02E + 01 5.12E + 00 1.67E + 01 1.13E + 01 1.04E + 01 2.07E + 00
(W|T|L) (12 |0 |0) (11 |0 |1) (12 |0 |0) (12 |0 |0) (12 |0 |0) (12 |0 |0) (10 |1 |1)
Table 9  Experimental Results of 16 algorithms in CEC-2022 (Dim = 20)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 Ave 7.96E + 03 1.53E + 04 1.86E + 04 3.48E + 04 1.89E + 04 1.44E + 04 4.04E + 04 3.70E + 04


Std 1.21E + 04 1.84E + 04 4.96E + 03 7.73E + 03 8.01E + 03 4.87E + 03 1.80E + 04 1.36E + 04
F2 Ave 4.64E + 02 4.48E + 02 4.57E + 02 4.82E + 02 4.83E + 02 5.17E + 02 6.32E + 02 4.87E + 02
Std 2.28E + 01 1.99E + 01 1.13E + 01 1.17E + 01 3.71E + 01 4.28E + 01 5.76E + 01 3.83E + 01
F3 Ave 6.09E + 02 6.01E + 02 6.02E + 02 6.01E + 02 6.50E + 02 6.06E + 02 6.71E + 02 6.63E + 02
Std 5.72E + 00 5.31E − 01 5.24E − 01 1.50E − 01 9.38E + 00 2.83E + 00 1.67E + 01 1.29E + 01
F4 Ave 8.67E + 02 8.38E + 02 8.73E + 02 9.27E + 02 8.89E + 02 8.62E + 02 9.50E + 02 9.17E + 02
Std 1.70E + 01 1.46E + 01 9.74E + 00 1.31E + 01 1.47E + 01 3.44E + 01 4.10E + 01 2.76E + 01
F5 Ave 1.35E + 03 9.25E + 02 1.60E + 03 2.19E + 03 2.53E + 03 1.20E + 03 4.21E + 03 3.61E + 03
Std 3.95E + 02 2.69E + 01 3.19E + 02 3.88E + 02 3.37E + 02 2.00E + 02 1.76E + 03 1.08E + 03
F6 Ave 7.31E + 03 2.18E + 03 3.77E + 04 3.16E + 06 5.75E + 03 4.36E + 06 7.56E + 06 6.66E + 03
Std 5.95E + 03 4.76E + 02 2.13E + 04 1.32E + 06 4.99E + 03 7.86E + 06 8.86E + 06 6.14E + 03
F7 Ave 2.10E + 03 2.05E + 03 2.07E + 03 2.06E + 03 2.16E + 03 2.10E + 03 2.24E + 03 2.21E + 03
Std 2.43E + 01 8.74E + 00 1.16E + 01 9.11E + 00 5.68E + 01 5.14E + 01 7.23E + 01 1.03E + 02


F8 Ave 2.23E + 03 2.23E + 03 2.23E + 03 2.23E + 03 2.26E + 03 2.25E + 03 2.31E + 03 2.43E + 03
Std 3.82E + 00 4.78E + 00 1.08E + 00 2.03E + 00 4.78E + 01 4.48E + 01 6.70E + 01 1.37E + 02
F9 Ave 2.48E + 03 2.48E + 03 2.48E + 03 2.48E + 03 2.49E + 03 2.53E + 03 2.61E + 03 2.49E + 03
Std 1.58E + 00 1.12E − 03 8.81E − 02 4.48E − 01 8.62E + 00 3.32E + 01 5.28E + 01 5.88E + 00
F10 Ave 3.10E + 03 2.52E + 03 2.52E + 03 2.51E + 03 3.47E + 03 3.68E + 03 5.13E + 03 4.91E + 03
Std 5.30E + 02 6.02E + 01 5.25E + 01 7.47E + 00 7.56E + 02 7.26E + 02 1.16E + 03 6.89E + 02
F11 Ave 2.99E + 03 2.94E + 03 2.91E + 03 3.18E + 03 3.00E + 03 3.51E + 03 3.95E + 03 3.09E + 03
Std 1.26E + 02 1.07E + 02 3.14E + 01 1.53E + 02 1.40E + 02 3.76E + 02 6.12E + 02 4.05E + 02
F12 Ave 2.91E + 03 2.96E + 03 2.96E + 03 2.97E + 03 3.00E + 03 2.99E + 03 3.09E + 03 3.33E + 03
Std 1.09E + 01 1.29E + 01 5.59E + 00 6.35E + 00 4.44E + 01 3.73E + 01 1.31E + 02 1.95E + 02
(W|T|L) (10 |0 |2) (7 |1 |4) (11 |0 |1) (11 |0 |1) (11 |0 |1) (12 |0 |0) (12 |0 |0) (11 |0 |1)
Table 9  (continued)

Function SO GTO COA RIME GJO DBO NOA SBOA

F1 Ave 1.68E + 04 2.41E + 03 4.41E + 04 1.27E + 03 1.88E + 04 3.46E + 04 2.29E + 03 1.21E + 03
Std 5.05E + 03 9.23E + 02 1.40E + 04 5.36E + 02 6.20E + 03 1.00E + 04 1.22E + 03 1.37E + 03
F2 Ave 4.60E + 02 4.69E + 02 5.04E + 02 4.57E + 02 6.18E + 02 5.02E + 02 4.53E + 02 4.49E + 02


Std 1.58E + 01 1.89E + 01 5.20E + 01 1.57E + 01 8.59E + 01 6.14E + 01 2.10E + 01 1.22E + 01
F3 Ave 6.10E + 02 6.31E + 02 6.35E + 02 6.05E + 02 6.28E + 02 6.37E + 02 6.16E + 02 6.00E + 02
Std 5.86E + 00 1.30E + 01 1.99E + 01 3.11E + 00 9.34E + 00 1.23E + 01 1.19E + 01 3.18E − 01
F4 Ave 8.48E + 02 8.77E + 02 8.90E + 02 8.63E + 02 8.97E + 02 9.18E + 02 9.02E + 02 8.40E + 02
Std 1.83E + 01 1.39E + 01 9.10E + 00 1.90E + 01 2.78E + 01 2.82E + 01 2.93E + 01 1.16E + 01
F5 Ave 1.28E + 03 1.90E + 03 2.67E + 03 1.41E + 03 2.16E + 03 2.24E + 03 2.19E + 03 9.15E + 02
Std 1.83E + 02 4.71E + 02 4.07E + 02 1.01E + 03 4.68E + 02 6.56E + 02 7.08E + 02 1.09E + 02
F6 Ave 6.92E + 03 6.39E + 03 5.07E + 03 1.31E + 04 2.47E + 07 4.07E + 05 7.22E + 03 7.44E + 03
Std 5.59E + 03 6.40E + 03 5.04E + 03 7.55E + 03 4.66E + 07 6.85E + 05 5.48E + 03 6.50E + 03
F7 Ave 2.09E + 03 2.12E + 03 2.13E + 03 2.09E + 03 2.13E + 03 2.14E + 03 2.10E + 03 2.04E + 03
Std 3.94E + 01 4.24E + 01 1.03E + 02 5.17E + 01 4.52E + 01 6.11E + 01 5.89E + 01 1.24E + 01
F8 Ave 2.24E + 03 2.26E + 03 2.29E + 03 2.25E + 03 2.24E + 03 2.30E + 03 2.29E + 03 2.23E + 03
Std 2.37E + 01 5.00E + 01 7.47E + 01 4.26E + 01 3.08E + 01 6.97E + 01 7.94E + 01 2.13E + 00
F9 Ave 2.48E + 03 2.48E + 03 2.48E + 03 2.48E + 03 2.59E + 03 2.51E + 03 2.48E + 03 2.48E + 03
Std 7.61E − 01 4.92E − 02 2.46E + 00 6.30E − 01 4.16E + 01 2.82E + 01 1.11E − 01 2.88E − 03
F10 Ave 3.08E + 03 3.34E + 03 3.47E + 03 2.91E + 03 4.09E + 03 3.44E + 03 3.06E + 03 2.69E + 03
Std 4.52E + 02 1.23E + 03 1.22E + 03 3.00E + 02 1.50E + 03 1.08E + 03 3.21E + 02 2.73E + 02
F11 Ave 2.98E + 03 2.91E + 03 3.13E + 03 2.95E + 03 4.61E + 03 3.12E + 03 3.02E + 03 2.91E + 03
Std 1.21E + 02 1.39E + 02 3.19E + 02 9.46E + 01 6.15E + 02 1.61E + 02 4.06E + 02 1.53E + 02
F12 Ave 3.02E + 03 3.03E + 03 2.98E + 03 2.98E + 03 3.03E + 03 3.03E + 03 2.98E + 03 2.95E + 03
Std 3.64E + 01 6.81E + 01 2.44E + 01 3.54E + 01 4.45E + 01 6.30E + 01 5.54E + 01 4.64E + 00
(W|T|L) (11 |0 |1) (11 |0 |1) (11 |0 |1) (12 |0 |0) (12 |0 |0) (12 |0 |0) (11 |0 |1)

Fig. 16  CEC-2022 test function convergence curve (Dim = 10)

$$
\begin{aligned}
&\text{Consider } \vec{x} = \left[x_1\ x_2\ x_3\ x_4\right] = \left[T_s\ T_h\ R\ L\right],\\
&\text{Minimize } f(\vec{x}) = 0.6224x_1x_3x_4 + 1.7781x_2x_3^2 + 3.1661x_1^2x_4 + 19.84x_1^2x_3,\\
&\text{Subject to } g_1(\vec{x}) = -x_1 + 0.0193x_3 \le 0,\\
&\qquad g_2(\vec{x}) = -x_2 + 0.00954x_3 \le 0,\\
&\qquad g_3(\vec{x}) = -\pi x_3^2x_4 - \frac{4}{3}\pi x_3^3 + 1296000 \le 0,\\
&\qquad g_4(\vec{x}) = x_4 - 240 \le 0,\\
&\text{Parameter range } 0 \le x_1, x_2 \le 99,\quad 10 \le x_3, x_4 \le 200.
\end{aligned}
\tag{21}
$$

From the results in Table 17, it is evident that SBOA, LSHADE_cnEpSin, MadDE, and
NOA outperform all other competitors, each achieving a minimum cost of 5.89E + 03.
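Eq. (21) is straightforward to evaluate directly. The sketch below (a minimal Python check, not the authors' code) computes the cost and the four constraints at a design close to those in Table 17 (Ts ≈ 0.778, Th ≈ 0.385, R ≈ 40.3, L = 200; the digits are adjusted slightly so that all constraints hold at this rounding). A design is feasible when every g ≤ 0:

```python
import math

def pressure_vessel(x):
    """Cost and constraint values of Eq. (21); x = [Ts, Th, R, L].
    A design is feasible when every entry of g is <= 0."""
    Ts, Th, R, L = x
    cost = (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)
    g = [-Ts + 0.0193 * R,                                    # shell thickness
         -Th + 0.00954 * R,                                   # head thickness
         -math.pi * R**2 * L - (4.0 / 3.0) * math.pi * R**3
             + 1296000.0,                                     # volume
         L - 240.0]                                           # length limit
    return cost, g

# a design close to the Table 17 optima (digits slightly adjusted
# so all constraints are satisfied at this rounding)
cost, g = pressure_vessel([0.7782, 0.3847, 40.32, 200.0])
print(round(cost), all(c <= 0 for c in g))
```

The cost lands at the 5.89E + 03 level reported in the table, with the two thickness constraints essentially active.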

5.1.3 Tension/compression spring design (TCPD (case 1))

This design problem aims to minimize the weight of tension/compression springs by
optimizing three critical parameters: the wire diameter (d), the coil diameter (D), and the number of


Fig. 17  CEC-2022 test function convergence curve (Dim = 20)

Fig. 18  CEC-2022 test function ranking statistics


Fig. 19  Boxplot of SBOA and competitor algorithms in optimization of the CEC-2022 test suite (Dim = 10)

coils (N). The structure of this engineering problem is illustrated in Fig. 23, and the mathematical model is presented in Eq. (22).
$$
\begin{aligned}
&\text{Consider } \vec{x} = \left[x_1\ x_2\ x_3\right] = \left[d\ D\ N\right],\\
&\text{Minimize } f(\vec{x}) = \left(x_3 + 2\right)x_2x_1^2,\\
&\text{Subject to } g_1(\vec{x}) = 1 - \frac{x_2^3x_3}{71785x_1^4} \le 0,\\
&\qquad g_2(\vec{x}) = \frac{4x_2^2 - x_1x_2}{12566\left(x_2x_1^3 - x_1^4\right)} + \frac{1}{5108x_1^2} - 1 \le 0,\\
&\qquad g_3(\vec{x}) = 1 - \frac{140.45x_1}{x_2^2x_3} \le 0,\\
&\qquad g_4(\vec{x}) = \frac{x_1 + x_2}{1.5} - 1 \le 0,\\
&\text{Parameter range } 0.05 \le x_1 \le 2,\quad 0.25 \le x_2 \le 1.3,\quad 2 \le x_3 \le 15.
\end{aligned}
\tag{22}
$$


Fig. 20  Boxplot of SBOA and competitor algorithms in optimization of the CEC-2022 test suite (Dim = 20)

Table 18 presents the optimization results of SBOA and fifteen competing algorithms on the tension/compression spring design problem. It is evident from
the table that SBOA outperforms the other algorithms, achieving the optimal value of
1.27E − 02.
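As with the pressure vessel, Eq. (22) can be checked numerically. The sketch below (my own illustration, not the authors' code) evaluates the weight and constraints at the well-known near-optimal design d ≈ 0.05169, D ≈ 0.35672, N ≈ 11.289, which matches the 1.27E − 02 column of Table 18; the first two constraints are active there, so a small tolerance is used when checking feasibility:

```python
def spring(x):
    """Weight and constraint values of Eq. (22); x = [d, D, N].
    Feasible when every g <= 0 (up to rounding at active constraints)."""
    d, D, N = x
    weight = (N + 2.0) * D * d**2
    g = [
        1.0 - D**3 * N / (71785.0 * d**4),                    # deflection
        (4.0 * D**2 - d * D) / (12566.0 * (D * d**3 - d**4))
            + 1.0 / (5108.0 * d**2) - 1.0,                    # shear stress
        1.0 - 140.45 * d / (D**2 * N),                        # surge frequency
        (d + D) / 1.5 - 1.0,                                  # outer diameter
    ]
    return weight, g

# the well-known near-optimal design from the literature
w, g = spring([0.051689, 0.356718, 11.288966])
print(round(w, 6), max(g))
```

The weight evaluates to about 0.012665, i.e. the 1.27E − 02 reported for the best-performing algorithms.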

5.1.4 Welded beam design (WBD)

Welded beam design represents a typical nonlinear programming problem, aiming to minimize the manufacturing cost of a welded beam by controlling the weld
thickness (h), weld length (l), beam height (t), and beam width (b). The structure of the optimization
problem is depicted in Fig. 24, and its mathematical model is described by Eq. (23).

Table 10  P-value on CEC-2017 (Dim = 30)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 4.80E − 07 1.26E − 01 3.02E − 11 3.02E − 11 1.31E − 08 3.02E − 11 3.02E − 11 3.02E − 11


F2 2.84E − 04 2.34E − 01 4.69E − 08 3.02E − 11 8.99E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F3 4.46E − 01 9.23E − 01 3.02E − 11 3.02E − 11 3.34E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F4 4.04E − 01 2.84E − 01 1.15E − 01 3.02E − 11 8.88E − 06 1.46E − 10 3.02E − 11 3.02E − 11
F5 9.00E − 01 5.30E − 01 3.02E − 11 3.02E − 11 3.02E − 11 2.49E − 06 3.02E − 11 3.02E − 11
F6 1.68E − 04 7.51E − 01 1.33E − 10 7.66E − 05 3.02E − 11 4.98E − 11 3.02E − 11 3.02E − 11
F7 3.71E − 01 7.98E − 02 4.98E − 11 8.15E − 11 3.02E − 11 7.74E − 06 3.02E − 11 3.02E − 11
F8 7.39E − 01 9.35E − 01 3.02E − 11 3.02E − 11 3.02E − 11 1.61E − 06 3.02E − 11 3.02E − 11
F9 3.87E − 01 4.64E − 03 1.61E − 10 3.34E − 11 3.02E − 11 3.09E − 06 3.02E − 11 3.02E − 11
F10 2.50E − 03 3.85E − 03 1.33E − 10 3.02E − 11 2.49E − 06 3.39E − 02 3.02E − 11 1.56E − 08
F11 1.25E − 07 7.66E − 05 1.61E − 10 3.02E − 11 4.12E − 06 3.02E − 11 3.02E − 11 7.39E − 11
F12 5.27E − 05 2.00E − 06 2.24E − 02 3.02E − 11 8.10E − 10 6.12E − 10 3.02E − 11 3.65E − 08
F13 7.17E − 01 2.06E − 01 7.74E − 06 3.02E − 11 3.20E − 09 3.69E − 11 3.02E − 11 1.86E − 03
Secretary bird optimization algorithm: a new metaheuristic…

F14 3.02E − 11 3.02E − 11 2.43E − 05 8.15E − 11 2.38E − 07 2.88E − 06 1.09E − 10 4.74E − 06


F15 1.68E − 04 7.20E − 05 6.95E − 01 3.02E − 11 6.01E − 08 1.09E − 10 3.02E − 11 6.53E − 07
F16 8.42E − 01 3.85E − 03 2.38E − 03 1.41E − 09 5.46E − 09 9.51E − 06 3.34E − 11 5.07E − 10
F17 9.12E − 01 1.54E − 01 1.71E − 01 5.46E − 06 3.69E − 11 1.99E − 02 3.02E − 11 8.99E − 11
F18 3.34E − 11 2.92E − 09 2.58E − 01 1.85E − 08 1.02E − 05 7.62E − 03 4.20E − 10 2.52E − 01
F19 1.03E − 06 1.73E − 07 1.96E − 01 3.02E − 11 5.60E − 07 1.09E − 10 3.02E − 11 2.84E − 04
F20 1.04E − 04 7.09E − 08 1.53E − 05 1.17E − 09 3.34E − 11 3.52E − 07 3.69E − 11 5.49E − 11
F21 3.51E − 02 1.99E − 02 3.02E − 11 3.02E − 11 3.02E − 11 1.29E − 09 3.02E − 11 3.02E − 11
F22 1.07E − 07 3.37E − 05 1.10E − 08 8.48E − 09 1.01E − 08 5.07E − 10 8.15E − 11 9.92E − 11
F23 1.11E − 04 4.36E − 02 3.02E − 11 3.02E − 11 3.02E − 11 1.10E − 08 3.02E − 11 3.02E − 11
F24 8.68E − 03 3.78E − 02 3.02E − 11 3.02E − 11 3.02E − 11 4.18E − 09 3.02E − 11 3.02E − 11
F25 4.92E − 01 2.23E − 01 6.77E − 05 3.02E − 11 7.04E − 07 3.02E − 11 3.02E − 11 7.39E − 11
F26 4.73E − 01 3.26E − 01 1.33E − 02 1.36E − 07 5.07E − 10 3.57E − 06 5.49E − 11 3.02E − 11
Table 10  (continued)

Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F27 8.29E − 06 1.11E − 03 1.17E − 09 3.02E − 11 8.15E − 11 3.65E − 08 3.02E − 11 3.02E − 11
F28 7.28E − 01 3.04E − 01 5.26E − 04 3.02E − 11 5.07E − 10 3.02E − 11 3.02E − 11 5.46E − 09
F29 2.84E − 01 5.30E − 01 3.26E − 07 2.87E − 10 6.70E − 11 3.52E − 07 3.02E − 11 3.02E − 11


F30 1.58E − 01 6.84E − 01 6.12E − 10 3.69E − 11 4.08E − 11 3.34E − 11 3.02E − 11 7.38E − 10
Function SO GTO COA RIME GJO DBO NOA

F1 3.02E − 11 2.38E − 07 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11


F2 3.02E − 11 2.15E − 06 3.02E − 11 1.19E − 01 3.02E − 11 3.02E − 11 8.20E − 07
F3 3.02E − 11 8.29E − 06 3.02E − 11 1.31E − 08 4.50E − 11 3.02E − 11 3.64E − 02
F4 1.87E − 07 1.63E − 02 2.67E − 09 2.89E − 03 3.02E − 11 6.07E − 11 2.07E − 02
F5 1.11E − 03 1.09E − 10 3.02E − 11 1.24E − 03 3.02E − 11 3.69E − 11 3.02E − 11
F6 4.50E − 11 3.02E − 11 3.02E − 11 3.69E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F7 3.09E − 06 3.34E − 11 3.02E − 11 1.86E − 01 3.02E − 11 8.15E − 11 3.47E − 10
F8 5.57E − 03 1.96E − 10 3.02E − 11 8.84E − 07 4.08E − 11 3.02E − 11 7.39E − 11
F9 7.74E − 06 3.34E − 11 3.02E − 11 1.68E − 04 3.02E − 11 3.34E − 11 1.61E − 10
F10 2.50E − 03 6.72E − 10 5.49E − 11 2.77E − 01 2.15E − 10 1.78E − 10 3.59E − 05
F11 1.46E − 10 3.37E − 04 3.02E − 11 8.10E − 10 3.02E − 11 3.02E − 11 2.44E − 09
F12 1.05E − 01 6.52E − 01 7.38E − 10 1.21E − 10 3.02E − 11 1.70E − 08 3.64E − 02
F13 1.37E − 01 4.38E − 01 2.44E − 09 1.86E − 09 3.02E − 11 3.02E − 11 1.76E − 02
F14 2.24E − 02 4.44E − 07 9.76E − 10 1.78E − 04 5.07E − 10 2.57E − 07 4.18E − 09
F15 2.32E − 02 9.71E − 01 6.28E − 06 8.56E − 04 3.02E − 11 5.53E − 08 1.15E − 01
F16 2.50E − 03 1.47E − 07 5.57E − 10 8.88E − 06 7.77E − 09 3.82E − 10 4.08E − 05
F17 5.60E − 07 1.34E − 05 1.85E − 08 1.64E − 05 5.09E − 08 6.70E − 11 5.57E − 10
F18 1.60E − 03 2.83E − 08 1.32E − 04 9.52E − 04 7.20E − 05 1.49E − 06 2.28E − 05
F19 4.84E − 02 8.42E − 01 2.96E − 05 1.63E − 02 3.02E − 11 9.53E − 07 5.01E − 02
F20 2.38E − 07 8.89E − 10 4.62E − 10 6.53E − 07 2.87E − 10 4.62E − 10 2.02E − 08
Table 10  (continued)
Function SO GTO COA RIME GJO DBO NOA

F21 1.60E − 07 1.29E − 09 4.50E − 11 1.47E − 07 3.02E − 11 3.02E − 11 3.02E − 11


F22 2.44E − 09 2.32E − 06 3.50E − 09 8.89E − 10 7.38E − 10 3.16E − 10 2.37E − 10
F23 2.61E − 10 3.02E − 11 6.07E − 11 5.07E − 10 3.02E − 11 3.02E − 11 2.61E − 10
F24 3.82E − 09 3.02E − 11 5.49E − 11 2.19E − 08 3.02E − 11 3.02E − 11 6.07E − 11
F25 5.86E − 06 1.91E − 02 2.23E − 09 7.96E − 03 3.02E − 11 3.82E − 09 7.29E − 03
F26 5.46E − 09 2.24E − 02 1.69E − 09 1.68E − 04 3.69E − 11 2.87E − 10 1.39E − 06
F27 4.50E − 11 4.08E − 11 3.47E − 10 9.83E − 08 3.02E − 11 1.61E − 10 1.61E − 10
F28 7.39E − 11 1.17E − 03 1.96E − 10 3.56E − 04 3.02E − 11 5.49E − 11 8.56E − 04
F29 6.72E − 10 6.70E − 11 1.20E − 08 1.17E − 09 2.37E − 10 6.70E − 11 8.48E − 09
F30 2.19E − 08 8.42E − 01 9.92E − 11 1.46E − 10 3.02E − 11 8.48E − 09 6.10E − 01
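Nearly every cell in Tables 10–14 bottoms out at 3.02E − 11. That value is not arbitrary: it is the two-sided asymptotic Wilcoxon rank-sum p-value (with continuity correction, the MATLAB `ranksum` convention these papers typically use) for two samples of 30 independent runs with complete separation. A minimal standard-library sketch (my own illustration, assuming 30 runs per algorithm and no ties) reproduces it:

```python
import math

def wilcoxon_ranksum_p(x, y):
    """Two-sided asymptotic Wilcoxon rank-sum p-value with continuity
    correction (MATLAB ranksum convention); assumes no ties."""
    vals = list(x) + list(y)
    n, m = len(x), len(y)
    order = sorted(range(n + m), key=lambda i: vals[i])
    ranks = [0.0] * (n + m)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    w = sum(ranks[:n])                          # rank sum of sample x
    mu = n * (n + m + 1) / 2.0                  # E[W] under H0
    sd = math.sqrt(n * m * (n + m + 1) / 12.0)  # sqrt(Var[W]) under H0
    diff = w - mu
    z = (diff - math.copysign(0.5, diff)) / sd  # continuity correction
    return math.erfc(abs(z) / math.sqrt(2.0))   # two-sided normal tail

# 30 runs in which one algorithm always beats the other reproduce the
# 3.02E-11 floor that dominates Tables 10-14
a = [100.0 + i for i in range(30)]
b = [200.0 + i for i in range(30)]
print(wilcoxon_ranksum_p(a, b))
```

Any p-value above this floor therefore indicates at least some overlap between the two algorithms' 30-run result distributions.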
Table 11  P-value on CEC-2017 (Dim = 50)

Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 1.11E − 06 2.68E − 06 6.07E − 11 3.02E − 11 1.69E − 09 3.02E − 11 3.02E − 11 3.02E − 11
F2 1.21E − 12 1.21E − 12 1.21E − 12 3.02E − 11 1.61E − 10 3.02E − 11 3.02E − 11 3.02E − 11
F3 5.30E − 01 8.50E − 02 4.50E − 11 3.02E − 11 8.99E − 11 1.69E − 09 3.02E − 11 3.02E − 11


F4 3.40E − 01 3.27E − 02 1.70E − 08 3.02E − 11 1.01E − 08 3.02E − 11 3.02E − 11 3.02E − 11
F5 2.17E − 01 2.58E − 01 3.02E − 11 3.02E − 11 3.02E − 11 1.07E − 07 3.02E − 11 3.02E − 11
F6 7.04E − 07 2.15E − 06 3.02E − 11 1.73E − 07 3.02E − 11 8.10E − 10 3.02E − 11 3.02E − 11
F7 8.19E − 01 2.17E − 01 3.34E − 11 5.49E − 11 3.02E − 11 2.71E − 02 3.02E − 11 3.02E − 11
F8 3.92E − 02 1.19E − 01 3.02E − 11 3.02E − 11 4.50E − 11 9.79E − 05 3.02E − 11 3.02E − 11
F9 6.73E − 01 3.15E − 02 3.34E − 11 3.34E − 11 8.99E − 11 2.13E − 05 3.02E − 11 3.02E − 11
F10 5.97E − 09 4.57E − 09 3.02E − 11 3.02E − 11 1.73E − 07 2.05E − 03 3.02E − 11 4.71E − 04
F11 7.30E − 04 8.12E − 04 3.02E − 11 3.02E − 11 8.89E − 10 3.02E − 11 3.02E − 11 3.02E − 11
F12 5.20E − 01 6.84E − 01 5.09E − 06 3.02E − 11 6.70E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F13 9.06E − 08 1.86E − 03 2.87E − 10 3.02E − 11 4.98E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F14 5.57E − 10 8.99E − 11 1.55E − 09 3.02E − 11 8.89E − 10 3.32E − 06 3.02E − 11 6.72E − 10
F15 1.49E − 01 1.50E − 02 5.09E − 08 3.02E − 11 7.38E − 10 3.02E − 11 3.02E − 11 7.39E − 11
F16 5.19E − 02 1.78E − 04 1.73E − 06 4.08E − 11 4.62E − 10 2.50E − 03 3.02E − 11 9.76E − 10
F17 3.18E − 03 1.01E − 08 3.35E − 08 6.70E − 11 3.34E − 11 4.94E − 05 3.02E − 11 3.47E − 10
F18 8.15E − 11 6.70E − 11 9.79E − 05 1.21E − 10 1.37E − 03 1.09E − 05 4.08E − 11 1.37E − 01
F19 4.04E − 01 7.39E − 01 1.34E − 05 3.02E − 11 6.70E − 11 3.02E − 11 3.02E − 11 6.72E − 10
F20 1.02E − 01 3.96E − 08 4.03E − 03 2.67E − 09 6.53E − 07 8.12E − 04 1.78E − 10 3.96E − 08
F21 5.86E − 06 6.77E − 05 3.02E − 11 3.02E − 11 3.02E − 11 4.31E − 08 3.02E − 11 3.02E − 11
F22 3.26E − 07 2.77E − 05 1.96E − 10 1.31E − 08 3.81E − 07 3.03E − 03 3.02E − 11 1.47E − 07
F23 2.49E − 06 2.49E − 06 3.34E − 11 3.02E − 11 3.02E − 11 1.70E − 08 3.02E − 11 3.02E − 11
F24 1.32E − 04 2.25E − 04 3.02E − 11 3.02E − 11 3.02E − 11 1.17E − 05 3.02E − 11 3.02E − 11
F25 1.56E − 02 9.05E − 02 6.07E − 11 3.02E − 11 1.60E − 07 3.02E − 11 3.02E − 11 3.02E − 11
F26 7.73E − 02 8.65E − 01 3.40E − 01 8.10E − 10 1.86E − 09 1.77E − 03 3.02E − 11 3.02E − 11
Table 11  (continued)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F27 5.46E − 09 1.85E − 08 3.34E − 11 3.02E − 11 3.02E − 11 4.08E − 11 3.02E − 11 3.02E − 11


F28 4.92E − 01 1.02E − 01 3.02E − 11 3.02E − 11 1.46E − 10 3.02E − 11 3.02E − 11 3.02E − 11
F29 1.33E − 02 6.00E − 01 1.43E − 08 3.16E − 10 7.38E − 10 8.20E − 07 3.02E − 11 3.02E − 11
F30 3.02E − 11 3.82E − 10 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
Function SO GTO COA RIME GJO DBO NOA

F1 6.07E − 11 5.46E − 09 3.02E − 11 9.51E − 06 3.02E − 11 3.02E − 11 3.02E − 11


F2 3.02E − 11 1.78E − 10 3.02E − 11 8.53E − 01 3.02E − 11 3.02E − 11 4.20E − 10
F3 9.92E − 11 1.11E − 06 3.02E − 11 4.08E − 11 4.31E − 08 3.02E − 11 4.42E − 06
F4 8.89E − 10 3.16E − 05 3.02E − 11 2.25E − 04 3.02E − 11 3.02E − 11 7.12E − 09
F5 5.79E − 01 5.49E − 11 3.02E − 11 1.17E − 03 3.02E − 11 3.02E − 11 3.69E − 11
F6 2.87E − 10 3.02E − 11 3.02E − 11 1.33E − 10 3.02E − 11 3.02E − 11 3.02E − 11
F7 1.49E − 04 3.02E − 11 3.02E − 11 7.62E − 01 3.02E − 11 4.18E − 09 2.61E − 10

F8 8.31E − 03 3.16E − 10 3.02E − 11 5.46E − 06 3.02E − 11 4.98E − 11 4.08E − 11


F9 4.38E − 01 3.82E − 10 3.02E − 11 2.03E − 07 3.02E − 11 3.34E − 11 6.07E − 11
F10 1.17E − 04 3.96E − 08 3.02E − 11 2.38E − 03 1.46E − 10 2.37E − 10 4.80E − 07
F11 3.02E − 11 1.15E − 01 3.02E − 11 4.57E − 09 3.02E − 11 3.02E − 11 1.25E − 07
F12 2.03E − 07 5.94E − 02 3.69E − 11 3.02E − 11 3.02E − 11 3.02E − 11 1.96E − 10
F13 4.08E − 11 8.88E − 06 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 1.17E − 09
F14 7.22E − 06 7.62E − 03 3.65E − 08 3.65E − 08 1.78E − 10 1.31E − 08 7.69E − 08
F15 1.86E − 06 5.55E − 02 4.98E − 11 3.69E − 11 3.02E − 11 3.34E − 11 8.30E − 01
F16 9.03E − 04 7.04E − 07 7.69E − 08 7.22E − 06 1.07E − 09 6.70E − 11 3.26E − 07
F17 2.83E − 08 7.69E − 08 2.44E − 09 1.78E − 10 6.72E − 10 4.98E − 11 2.00E − 06
F18 2.50E − 03 8.66E − 05 9.53E − 07 4.64E − 05 8.35E − 08 2.20E − 07 5.27E − 05
F19 3.96E − 08 8.53E − 01 3.02E − 11 7.39E − 11 3.02E − 11 2.15E − 10 1.67E − 01
F20 1.30E − 03 1.64E − 05 2.37E − 10 1.17E − 04 7.69E − 08 2.15E − 10 2.43E − 05
Table 11  (continued)

Function SO GTO COA RIME GJO DBO NOA

F21 5.83E − 03 8.15E − 11 6.70E − 11 9.06E − 08 3.02E − 11 3.02E − 11 3.02E − 11
F22 7.22E − 06 4.80E − 07 2.67E − 09 4.64E − 03 1.09E − 10 5.57E − 10 3.01E − 07
F23 1.55E − 09 3.69E − 11 4.08E − 11 1.56E − 08 3.02E − 11 3.02E − 11 3.69E − 11


F24 3.82E − 09 5.49E − 11 3.02E − 11 5.97E − 05 3.02E − 11 3.02E − 11 5.49E − 11
F25 5.46E − 09 4.22E − 04 3.02E − 11 1.27E − 02 3.02E − 11 2.61E − 10 5.00E − 09
F26 6.01E − 08 3.65E − 08 5.49E − 11 1.91E − 02 4.50E − 11 4.08E − 11 8.89E − 10
F27 3.34E − 11 4.50E − 11 3.34E − 11 3.16E − 10 3.02E − 11 3.69E − 11 4.98E − 11
F28 4.98E − 11 6.72E − 10 3.02E − 11 1.60E − 03 3.02E − 11 3.02E − 11 9.26E − 09
F29 6.01E − 08 1.43E − 08 1.46E − 10 6.72E − 10 4.08E − 11 4.08E − 11 6.07E − 11
F30 3.02E − 11 1.70E − 08 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 2.23E − 09
Table 12  P-value on CEC-2017 (Dim = 100)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 5.57E − 03 1.96E − 01 3.34E − 11 3.02E − 11 6.95E − 01 3.02E − 11 3.02E − 11 3.02E − 11


F2 1.21E − 12 1.21E − 12 1.21E − 12 3.02E − 11 3.02E − 11 4.50E − 11 3.02E − 11 3.02E − 11
F3 8.31E − 03 3.79E − 01 3.02E − 11 3.02E − 11 3.59E − 05 4.20E − 10 3.02E − 11 3.02E − 11
F4 2.64E − 01 5.75E − 02 3.02E − 11 3.02E − 11 7.12E − 09 3.02E − 11 3.02E − 11 3.02E − 11
F5 3.59E − 05 2.77E − 05 3.02E − 11 3.02E − 11 7.39E − 11 1.84E − 02 3.02E − 11 3.02E − 11
F6 4.83E − 01 1.09E − 10 3.02E − 11 1.11E − 06 3.02E − 11 4.71E − 04 3.02E − 11 3.02E − 11
F7 5.57E − 03 6.67E − 03 3.02E − 11 4.50E − 11 3.02E − 11 2.96E − 05 3.02E − 11 3.02E − 11
F8 6.77E − 05 1.75E − 05 3.02E − 11 3.02E − 11 3.34E − 11 2.75E − 03 3.02E − 11 3.02E − 11
F9 1.12E − 01 2.62E − 03 3.02E − 11 3.02E − 11 5.87E − 04 3.96E − 08 3.02E − 11 3.02E − 11
F10 4.50E − 11 3.69E − 11 3.02E − 11 3.02E − 11 6.00E − 01 5.37E − 02 3.02E − 11 8.42E − 01
F11 8.65E − 01 1.86E − 01 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F12 9.06E − 08 3.82E − 09 8.99E − 11 3.02E − 11 3.50E − 09 3.02E − 11 3.02E − 11 3.02E − 11
F13 6.20E − 04 1.70E − 08 6.07E − 11 3.02E − 11 2.96E − 05 3.02E − 11 3.02E − 11 3.02E − 11

F14 1.33E − 10 1.69E − 09 2.37E − 10 3.02E − 11 4.20E − 10 1.07E − 09 3.02E − 11 1.49E − 06


F15 4.98E − 11 2.92E − 09 3.69E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F16 4.08E − 05 1.25E − 07 3.02E − 11 3.02E − 11 1.96E − 10 1.87E − 05 3.02E − 11 6.07E − 11
F17 8.50E − 02 1.68E − 04 5.00E − 09 3.02E − 11 2.44E − 09 1.41E − 04 3.02E − 11 1.33E − 10
F18 2.61E − 10 4.98E − 11 2.13E − 04 3.02E − 11 5.83E − 03 1.64E − 05 3.20E − 09 1.95E − 03
F19 3.02E − 11 1.46E − 10 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F20 5.97E − 09 3.69E − 11 1.96E − 10 3.02E − 11 1.31E − 08 2.51E − 02 3.34E − 11 4.57E − 09
F21 4.44E − 07 5.86E − 06 3.02E − 11 3.02E − 11 3.02E − 11 1.49E − 06 3.02E − 11 3.02E − 11
F22 4.50E − 11 4.50E − 11 3.02E − 11 3.02E − 11 2.68E − 04 8.24E − 02 3.02E − 11 1.15E − 01
F23 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F24 1.96E − 10 3.20E − 09 3.34E − 11 3.02E − 11 3.02E − 11 3.47E − 10 3.02E − 11 3.02E − 11
F25 9.59E − 01 9.47E − 01 4.08E − 11 3.02E − 11 5.86E − 06 3.34E − 11 3.02E − 11 3.02E − 11
F26 8.30E − 01 5.59E − 01 1.20E − 08 1.22E − 02 6.72E − 10 2.58E − 01 3.02E − 11 3.02E − 11
Table 12  (continued)

Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F27 1.33E − 10 2.57E − 07 3.02E − 11 3.02E − 11 8.99E − 11 4.08E − 11 3.02E − 11 3.02E − 11
F28 1.32E − 04 6.74E − 06 3.02E − 11 3.02E − 11 2.19E − 08 3.02E − 11 3.02E − 11 3.02E − 11
F29 6.74E − 06 5.46E − 06 3.02E − 11 3.02E − 11 4.08E − 11 1.61E − 10 3.02E − 11 3.02E − 11


F30 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
Function SO GTO COA RIME GJO DBO NOA

F1 6.79E − 02 6.38E − 03 3.02E − 11 3.34E − 11 3.02E − 11 5.49E − 11 3.92E − 02


F2 3.02E − 11 2.87E − 10 3.02E − 11 4.35E − 05 3.02E − 11 3.02E − 11 8.99E − 11
F3 2.57E − 07 3.34E − 11 3.02E − 11 3.02E − 11 3.01E − 07 4.50E − 11 5.79E − 01
F4 9.92E − 11 3.08E − 08 3.02E − 11 3.47E − 10 3.02E − 11 3.02E − 11 6.12E − 10
F5 7.96E − 01 8.15E − 11 3.02E − 11 5.60E − 07 3.02E − 11 3.02E − 11 3.02E − 11
F6 2.25E − 04 3.02E − 11 3.02E − 11 7.38E − 10 3.02E − 11 3.02E − 11 3.02E − 11
F7 7.66E − 05 3.34E − 11 3.02E − 11 2.00E − 05 5.49E − 11 9.92E − 11 3.02E − 11
F8 3.26E − 01 3.02E − 11 3.02E − 11 9.53E − 07 3.02E − 11 3.02E − 11 3.02E − 11
F9 8.30E − 01 1.33E − 02 4.98E − 11 1.60E − 07 3.02E − 11 3.02E − 11 5.49E − 11
F10 6.70E − 11 2.12E − 01 3.16E − 10 1.11E − 03 4.50E − 11 3.82E − 10 1.69E − 09
F11 3.02E − 11 4.38E − 01 3.02E − 11 1.76E − 01 3.02E − 11 3.02E − 11 5.60E − 07
F12 2.87E − 10 5.97E − 09 3.02E − 11 6.72E − 10 3.02E − 11 3.02E − 11 8.89E − 10
F13 3.82E − 10 1.73E − 06 3.02E − 11 5.57E − 10 3.02E − 11 3.02E − 11 2.87E − 10
F14 1.02E − 05 7.04E − 07 2.92E − 09 4.80E − 07 4.50E − 11 1.69E − 09 1.47E − 07
F15 3.02E − 11 3.47E − 10 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F16 6.74E − 06 1.19E − 06 6.70E − 11 2.23E − 09 3.02E − 11 5.49E − 11 1.46E − 10
F17 7.22E − 06 3.26E − 07 3.82E − 10 3.01E − 07 3.02E − 11 3.02E − 11 3.77E − 04
F18 3.52E − 07 6.53E − 07 4.03E − 03 1.43E − 05 5.07E − 10 6.53E − 08 1.29E − 09
F19 3.02E − 11 2.87E − 10 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F20 3.02E − 11 1.37E − 03 3.02E − 11 3.08E − 08 1.56E − 08 6.70E − 11 9.76E − 10
Table 12  (continued)
Function SO GTO COA RIME GJO DBO NOA

F21 4.74E − 06 3.69E − 11 3.02E − 11 1.01E − 08 3.02E − 11 3.02E − 11 3.02E − 11


F22 3.02E − 11 5.61E − 05 3.02E − 11 5.26E − 04 4.08E − 11 1.33E − 10 1.49E − 04
F23 3.02E − 11 3.02E − 11 3.02E − 11 8.99E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F24 3.02E − 11 3.02E − 11 3.02E − 11 1.44E − 03 3.02E − 11 3.02E − 11 3.34E − 11
F25 6.12E − 10 5.26E − 04 3.02E − 11 1.07E − 07 3.02E − 11 8.15E − 11 1.69E − 09
F26 8.24E − 02 2.83E − 08 6.07E − 11 1.60E − 03 2.61E − 10 2.44E − 09 1.19E − 06
F27 3.02E − 11 6.07E − 11 4.50E − 11 4.57E − 09 3.02E − 11 4.50E − 11 1.78E − 10
F28 3.02E − 11 3.01E − 04 3.02E − 11 2.38E − 07 3.02E − 11 4.08E − 11 2.39E − 08
F29 3.47E − 10 6.52E − 09 3.34E − 11 4.08E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F30 3.02E − 11 4.08E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11

Table 13  P-value on CEC-2022 (Dim = 10)


Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 3.01E − 11 6.34E − 05 3.02E − 11 3.02E − 11 6.07E − 11 3.02E − 11 3.02E − 11 6.70E − 11
F2 4.33E − 02 9.22E − 01 3.09E − 05 1.97E − 02 1.37E − 02 4.83E − 05 1.81E − 07 9.70E − 03
F3 2.53E − 04 3.15E − 02 4.08E − 11 1.01E − 08 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11


F4 3.37E − 05 5.12E − 04 6.57E − 02 6.70E − 11 1.33E − 10 1.91E − 02 5.07E − 10 9.90E − 11
F5 1.37E − 02 1.22E − 02 1.55E − 09 3.02E − 11 3.02E − 11 8.15E − 11 3.02E − 11 3.02E − 11
F6 3.02E − 11 3.02E − 11 3.82E − 09 1.39E − 06 4.20E − 01 1.17E − 05 4.64E − 03 5.59E − 01
F7 1.91E − 02 2.05E − 03 3.77E − 04 1.68E − 04 6.07E − 11 5.19E − 07 3.02E − 11 3.02E − 11
F8 2.71E − 01 9.63E − 02 2.32E − 02 1.54E − 01 1.09E − 10 6.72E − 10 3.69E − 11 2.87E − 10
F9 5.14E − 12 4.59E − 01 5.14E − 12 5.14E − 12 5.14E − 12 5.14E − 12 5.14E − 12 2.64E − 11
F10 5.75E − 02 1.99E − 02 9.71E − 01 7.62E − 03 2.84E − 04 3.64E − 02 9.03E − 04 1.22E − 02
F11 2.50E − 03 1.49E − 02 6.63E − 01 6.63E − 01 4.68E − 02 6.79E − 02 1.04E − 04 4.55E − 01
F12 4.60E − 10 5.01E − 05 8.23E − 02 1.46E − 10 5.95E − 09 3.25E − 07 3.01E − 11 3.81E − 10
Function SO GTO COA RIME GJO DBO NOA

F1 3.02E − 11 4.50E − 11 3.02E − 11 5.07E − 10 3.02E − 11 3.02E − 11 4.46E − 01


F2 6.41E − 01 1.00E + 00 4.55E − 01 1.66E − 01 6.24E − 09 5.56E − 05 9.82E − 01
F3 3.02E − 11 3.02E − 11 4.50E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.34E − 11
F4 1.56E − 02 2.03E − 07 3.34E − 11 1.20E − 08 8.99E − 11 7.38E − 10 2.23E − 09
F5 3.02E − 11 3.02E − 11 3.34E − 11 6.52E − 09 3.34E − 11 3.02E − 11 8.99E − 11
F6 2.46E − 01 4.18E − 09 3.64E − 02 3.48E − 01 1.61E − 10 5.87E − 04 1.78E − 10
F7 2.43E − 05 3.81E − 07 6.38E − 03 4.84E − 02 3.02E − 11 1.41E − 09 5.30E − 01
F8 3.83E − 06 2.62E − 03 3.50E − 09 1.33E − 01 6.07E − 11 3.81E − 07 3.26E − 01
F9 1.30E − 08 2.03E − 02 5.14E − 12 5.14E − 12 5.14E − 12 4.06E − 08 3.15E − 12
F10 5.37E − 02 3.40E − 01 1.44E − 02 7.96E − 01 1.30E − 03 8.50E − 02 8.68E − 03
F11 1.26E − 01 5.94E − 02 9.21E − 05 1.03E − 02 2.25E − 04 5.01E − 02 4.68E − 02
F12 9.89E − 11 2.31E − 06 4.41E − 06 7.07E − 08 7.68E − 08 1.85E − 09 7.69E − 04
Table 14  P-value on CEC-2022 (Dim = 20)
Function LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

F1 7.39E − 11 8.65E − 01 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11


F2 9.59E − 01 2.38E − 03 6.95E − 01 3.01E − 07 1.11E − 03 3.82E − 10 3.02E − 11 2.68E − 04
F3 3.02E − 11 9.07E − 03 8.99E − 11 2.43E − 05 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F4 5.53E − 08 2.77E − 01 9.92E − 11 3.02E − 11 3.02E − 11 6.67E − 03 3.02E − 11 9.92E − 11
F5 3.82E − 09 7.06E − 01 1.46E − 10 3.34E − 11 3.02E − 11 5.00E − 09 3.02E − 11 3.02E − 11
F6 8.65E − 01 3.08E − 08 5.57E − 10 3.02E − 11 3.55E − 01 2.23E − 09 3.02E − 11 5.20E − 01
F7 4.50E − 11 6.15E − 02 3.82E − 09 1.87E − 07 3.02E − 11 3.65E − 08 3.02E − 11 3.02E − 11
F8 1.01E − 08 2.46E − 01 4.64E − 03 1.25E − 07 7.38E − 10 9.53E − 07 3.02E − 11 3.69E − 11
F9 1.11E − 06 2.40E − 01 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F10 2.68E − 04 2.81E − 02 5.11E − 01 1.00E + 00 8.20E − 07 7.60E − 07 8.48E − 09 8.15E − 11
F11 2.61E − 02 4.64E − 03 7.01E − 02 5.53E − 08 1.37E − 03 1.09E − 10 3.02E − 11 4.46E − 01
F12 4.50E − 11 1.03E − 06 1.09E − 10 3.02E − 11 1.21E − 10 2.44E − 09 3.02E − 11 3.02E − 11

Function SO GTO COA RIME GJO DBO NOA

F1 3.02E − 11 1.06E − 03 3.02E − 11 1.08E − 02 3.02E − 11 3.02E − 11 5.59E − 01


F2 4.29E − 01 1.33E − 01 7.20E − 05 8.42E − 01 3.02E − 11 8.29E − 06 6.95E − 01
F3 3.02E − 11 3.02E − 11 3.02E − 11 5.49E − 11 3.02E − 11 3.02E − 11 3.02E − 11
F4 1.30E − 01 1.21E − 10 4.08E − 11 2.88E − 06 1.46E − 10 3.02E − 11 3.69E − 11
F5 1.41E − 09 5.49E − 11 3.02E − 11 2.77E − 05 3.69E − 11 4.50E − 11 1.09E − 10
F6 9.35E − 01 2.23E − 01 1.30E − 01 5.26E − 04 3.02E − 11 5.08E − 03 7.84E − 01
F7 1.31E − 08 9.92E − 11 6.72E − 10 4.74E − 06 3.02E − 11 1.46E − 10 4.80E − 07
F8 1.86E − 06 1.33E − 02 2.87E − 10 2.32E − 06 1.96E − 10 1.43E − 08 3.51E − 02
F9 4.08E − 11 1.77E − 03 3.02E − 11 3.02E − 11 3.02E − 11 7.39E − 11 8.15E − 11
F10 9.79E − 05 2.32E − 02 2.38E − 03 1.24E − 03 4.94E − 05 6.20E − 04 1.53E − 05
F11 9.47E − 03 3.95E − 01 4.22E − 04 1.56E − 02 3.02E − 11 3.26E − 07 4.46E − 04
F12 3.02E − 11 7.38E − 10 1.33E − 10 3.82E − 10 3.02E − 11 8.15E − 11 4.94E − 05
Table 15  Friedman average rank sum test results
Suites CEC-2017 CEC-2022

Dimensions 30 50 100 10 20
Algorithms Ave. rank Overall rank Ave. rank Overall rank Ave. rank Overall rank Ave. rank Overall rank Ave. rank Overall rank

LSHADE_cnEpSin 3.03 3 3.32 3 4.00 2.5 2.31 2 5.92 5


LSHADE_SPACMA 2.52 2 3.13 2 4.00 2.5 3.73 3 2.31 2
MadDE 5.84 4 7.16 8 8.65 10 3.92 4 4.85 3
DE 9.52 10 10.71 11 11.16 11 6.69 6.5 7.54 9
AVOA 10.35 12 9.26 10 7.52 9 10.85 13 9.77 11
GWO 8.87 9 8.13 9 7.23 7 10.31 11.5 8.92 10
WOA 15.10 16 14.97 16 14.71 16 13.92 16 14.46 16
CPSOGSA 12.58 14 12.29 13 11.52 12 12.85 15 12.08 15
SO 6.77 7 6.23 5 6.90 6 7.54 9 6.00 6
GTO 6.71 6 6.29 6 5.39 5 7.15 8 7.00 8
COA 10.26 11 11.23 12 11.65 13 9.69 10 10.38 12
RIME 6.35 5 5.68 4 5.26 4 6.69 6.5 5.38 4
GJO 12.71 15 12.65 15 12.48 14 12.46 14 11.38 13
DBO 12.55 13 12.45 14 12.55 15 10.31 11.5 11.62 14
NOA 7.45 8 7.00 7 7.35 8 6.62 5 7.00 8
SBOA 1.52 1 1.65 1 1.77 1 1.73 1 1
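The average ranks in Table 15 are obtained by ranking all 16 algorithms on each benchmark function (lower mean error gives a better rank, ties averaged) and then averaging each algorithm's rank over all functions. A minimal sketch of that computation (my own illustration, not the authors' code):

```python
def friedman_average_ranks(scores):
    """scores[f][a] = mean error of algorithm a on function f (lower is
    better). Returns each algorithm's average rank across functions,
    with tied values assigned the average of their rank positions."""
    n_funcs, n_algs = len(scores), len(scores[0])
    totals = [0.0] * n_algs
    for row in scores:
        order = sorted(range(n_algs), key=lambda a: row[a])
        ranks = [0.0] * n_algs
        i = 0
        while i < n_algs:
            j = i
            # extend j over a run of tied values
            while j + 1 < n_algs and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j + 2) / 2.0       # 1-based ranks i+1 .. j+1, averaged
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        for a in range(n_algs):
            totals[a] += ranks[a]
    return [t / n_funcs for t in totals]

# toy example: 3 algorithms on 2 functions
print(friedman_average_ranks([[1.0, 2.0, 3.0], [1.0, 3.0, 2.0]]))  # -> [1.0, 2.5, 2.5]
```

Applied to 30 (or 12) functions and 16 algorithms, this yields per-algorithm averages like SBOA's 1.52 on CEC-2017 with Dim = 30.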

Fig. 21  Three-bar truss design structure

Table 16  Experimental results of three-bar truss design


Algorithm Optimal values for variable Optimal value Ranking
A1 A2

LSHADE_cnEpSin 7.89E − 01 4.08E − 01 2.64E + 02 1


LSHADE_SPACMA 7.89E − 01 4.08E − 01 2.64E + 02 1
MadDE 7.89E − 01 4.08E − 01 2.64E + 02 1
DE 7.88E − 01 4.09E − 01 2.64E + 02 7
AVOA 7.85E − 01 4.19E − 01 2.64E + 02 9
GWO 7.89E − 01 4.08E − 01 2.64E + 02 12
WOA 8.56E − 01 2.44E − 01 2.64E + 02 14
CPSOGSA 8.61E − 01 2.77E − 01 2.71E + 02 16
SO 7.89E − 01 4.07E − 01 2.64E + 02 8
GTO 7.89E − 01 4.08E − 01 2.64E + 02 1
COA 7.90E − 01 4.05E − 01 2.64E + 02 11
RIME 7.34E − 01 5.90E − 01 2.65E + 02 15
GJO 7.90E − 01 4.04E − 01 2.64E + 02 13
DBO 7.90E − 01 4.05E − 01 2.64E + 02 10
NOA 7.89E − 01 4.08E − 01 2.64E + 02 6
SBOA 7.89E − 01 4.09E − 01 2.64E + 02 1
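The three-bar truss formulation itself is not reproduced in this excerpt, but the 2.64E + 02 optima in Table 16 are consistent with the standard literature formulation: minimize the truss volume (2√2·A1 + A2)·l, with l = 100 cm, load P = 2 kN/cm², and stress limit σ = 2 kN/cm². A quick check under that assumed formulation (my own sketch, not the authors' code):

```python
import math

def three_bar_truss(A1, A2, l=100.0, P=2.0, sigma=2.0):
    """Volume objective and stress constraints of the standard three-bar
    truss problem (assumed common-literature formulation; the constraint
    set is not shown in this excerpt). Feasible when every g <= 0."""
    volume = (2.0 * math.sqrt(2.0) * A1 + A2) * l
    denom = math.sqrt(2.0) * A1**2 + 2.0 * A1 * A2
    g = [
        (math.sqrt(2.0) * A1 + A2) / denom * P - sigma,   # bar 1 stress
        A2 / denom * P - sigma,                           # bar 2 stress
        1.0 / (math.sqrt(2.0) * A2 + A1) * P - sigma,     # bar 3 stress
    ]
    return volume, g

# the A1 = 0.789, A2 = 0.408 designs that dominate Table 16
v, g = three_bar_truss(0.789, 0.408)
print(round(v, 2), all(c <= 1e-6 for c in g))
```

The volume comes out at about 263.96, matching the 2.64E + 02 entries, with the first stress constraint essentially active.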


Fig. 22  Pressure vessel design structure

$$
\begin{aligned}
&\text{Consider } \vec{x} = \left[x_1\ x_2\ x_3\ x_4\right] = \left[h\ l\ t\ b\right],\\
&\text{Minimize } f(\vec{x}) = 1.10471x_1^2x_2 + 0.04811x_3x_4\left(14.0 + x_2\right),\\
&\text{Subject to } g_1(\vec{x}) = \tau(\vec{x}) - \tau_{\max} \le 0,\\
&\qquad g_2(\vec{x}) = \sigma(\vec{x}) - \sigma_{\max} \le 0,\\
&\qquad g_3(\vec{x}) = \delta(\vec{x}) - \delta_{\max} \le 0,\\
&\qquad g_4(\vec{x}) = x_1 - x_4 \le 0,\\
&\qquad g_5(\vec{x}) = P - P_c(\vec{x}) \le 0,\\
&\qquad g_6(\vec{x}) = 0.125 - x_1 \le 0,\\
&\qquad g_7(\vec{x}) = 1.10471x_1^2 + 0.04811x_3x_4\left(14.0 + x_2\right) - 5.0 \le 0,\\
&\text{Parameter range } 0.1 \le x_1, x_4 \le 2,\quad 0.1 \le x_2, x_3 \le 10,
\end{aligned}
\tag{23}
$$

where

$$
\begin{aligned}
&\tau(\vec{x}) = \sqrt{(\tau')^2 + 2\tau'\tau''\frac{x_2}{2R} + (\tau'')^2},\qquad
\tau' = \frac{P}{\sqrt{2}x_1x_2},\qquad \tau'' = \frac{MR}{J},\\
&M = P\left(L + \frac{x_2}{2}\right),\qquad
R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2},\qquad
J = 2\left\{\sqrt{2}x_1x_2\left[\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\},\\
&\sigma(\vec{x}) = \frac{6PL}{x_4x_3^2},\qquad
\delta(\vec{x}) = \frac{6PL^3}{Ex_3^2x_4},\qquad
P_c(\vec{x}) = \frac{4.013E\sqrt{x_3^2x_4^6/36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right),\\
&P = 6000\ \text{lb},\quad L = 14\ \text{in},\quad \delta_{\max} = 0.25\ \text{in},\\
&E = 30 \times 10^6\ \text{psi},\quad G = 12 \times 10^6\ \text{psi},\quad
\tau_{\max} = 13600\ \text{psi},\quad \sigma_{\max} = 30000\ \text{psi}.
\end{aligned}
$$

The optimization results for the welded beam design problem are presented
in Table 19. According to the test results, SBOA achieves the lowest manufacturing cost after
optimization.
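Eq. (23) can also be verified numerically. In the sketch below (my own check, not the authors' code), the R, J, δ, and P_c expressions follow the common literature formulation wherever the text above was ambiguous; evaluating the cost at the best design in Table 19 (h = 0.199, l = 3.34, t = 9.19, b = 0.199) reproduces the reported 1.67:

```python
import math

# load and material constants of Eq. (23)
P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam(x):
    """Cost and constraint values for Eq. (23); x = [h, l, t, b].
    R, J, delta, and P_c follow the common literature formulation
    (an assumption where this excerpt is ambiguous)."""
    h, l, t, b = x
    cost = 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)
    tau_p = P / (math.sqrt(2.0) * h * l)            # tau' (primary shear)
    M = P * (L + l / 2.0)                           # bending moment
    k = l**2 / 4.0 + ((h + t) / 2.0) ** 2
    R = math.sqrt(k)
    J = 2.0 * math.sqrt(2.0) * h * l * k            # polar moment of inertia
    tau_pp = M * R / J                              # tau'' (torsional shear)
    tau = math.sqrt(tau_p**2 + tau_p * tau_pp * l / R + tau_pp**2)
    sigma = 6.0 * P * L / (b * t**2)                # bending stress
    delta = 6.0 * P * L**3 / (E * t**2 * b)         # end deflection
    p_c = (4.013 * E * math.sqrt(t**2 * b**6 / 36.0) / L**2
           * (1.0 - t / (2.0 * L) * math.sqrt(E / (4.0 * G))))
    g = [tau - TAU_MAX, sigma - SIGMA_MAX, delta - DELTA_MAX,
         h - b, P - p_c, 0.125 - h,
         1.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0]
    return cost, g

# the best design reported in Table 19
cost, g = welded_beam([0.199, 3.34, 9.19, 0.199])
print(round(cost, 2))  # -> 1.67
```

At this design the shear and bending stresses sit essentially at their 13600 psi and 30000 psi limits, which is why the best algorithms all converge to nearly identical variable values.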


Table 17  Experimental results of pressure vessel design


Algorithm Optimal values for Variable Optimal value Ranking
Ts Th R L

LSHADE_cnEpSin 7.78E − 01 3.85E − 01 4.03E + 01 2.00E + 02 5.89E + 03 1


LSHADE_SPACMA 7.78E − 01 3.85E − 01 4.03E + 01 2.00E + 02 5.89E + 03 5
MadDE 7.78E − 01 3.85E − 01 4.03E + 01 2.00E + 02 5.89E + 03 1
DE 8.21E − 01 4.09E − 01 4.22E + 01 1.76E + 02 6.04E + 03 7
AVOA 1.26E + 00 6.20E − 01 6.50E + 01 1.08E + 01 7.30E + 03 14
GWO 8.71E − 01 4.31E − 01 4.51E + 01 1.43E + 02 6.07E + 03 8
WOA 1.31E + 00 6.38E − 01 6.19E + 01 2.51E + 01 7.87E + 03 16
CPSOGSA 1.28E + 00 6.32E − 01 6.52E + 01 1.01E + 01 7.47E + 03 15
SO 1.06E + 00 5.25E − 01 5.51E + 01 6.26E + 01 6.57E + 03 11
GTO 1.07E + 00 5.29E − 01 5.54E + 01 6.08E + 01 6.63E + 03 12
COA 9.41E − 01 4.69E − 01 4.87E + 01 1.09E + 02 6.25E + 03 10
RIME 1.14E + 00 5.61E − 01 5.86E + 01 4.17E + 01 6.85E + 03 13
GJO 8.79E − 01 4.40E − 01 4.54E + 01 1.40E + 02 6.12E + 03 9
DBO 7.78E − 01 3.85E − 01 4.03E + 01 2.00E + 02 5.89E + 03 6
NOA 7.78E − 01 3.85E − 01 4.03E + 01 2.00E + 02 5.89E + 03 1
SBOA 7.85E − 01 3.88E − 01 4.07E + 01 1.95E + 02 5.89E + 03 1

Fig. 23  Tension/compression spring design structure

5.1.5 Weight minimization of speed reducer design (WMSRD)

This problem originates from the gearbox of a small aircraft engine and aims to minimize the
gearbox weight subject to certain constraints. The weight minimization design of
the gearbox involves seven variables: the gear face width (x1), the tooth module (x2),
the number of teeth on the pinion (x3), the length of the first shaft between bearings (x4), the length
of the second shaft between bearings (x5), the diameter of the first shaft (x6), and the diameter of
the second shaft (x7). The mathematical model for this problem is represented by Eq. (24).


Table 18  Experimental results of tension/compression spring design


Algorithm Optimal values for Variable Optimal value Ranking
d D N

LSHADE_cnEpSin 5.16E − 02 3.56E − 01 1.13E + 01 1.27E − 02 3


LSHADE_SPACMA 5.17E − 02 3.57E − 01 1.13E + 01 1.27E − 02 2
MadDE 5.18E − 02 3.58E − 01 1.12E + 01 1.27E − 02 4
DE 5.82E − 02 5.32E − 01 5.49E + 00 1.35E − 02 15
AVOA 5.64E − 02 4.82E − 01 6.50E + 00 1.31E − 02 13
GWO 5.00E − 02 3.17E − 01 1.40E + 01 1.27E − 02 7
WOA 5.41E − 02 4.17E − 01 8.45E + 00 1.28E − 02 10
CPSOGSA 5.71E − 02 5.03E − 01 6.02E + 00 1.32E − 02 14
SO 5.46E − 02 4.32E − 01 7.94E + 00 1.28E − 02 11
GTO 5.08E − 02 3.35E − 01 1.27E + 01 1.27E − 02 6
COA 5.60E − 02 4.70E − 01 6.83E + 00 1.30E − 02 12
RIME 6.93E − 02 9.40E − 01 2.00E + 00 1.81E − 02 16
GJO 5.00E − 02 3.17E − 01 1.41E + 01 1.27E − 02 8
DBO 5.10E − 02 3.41E − 01 1.23E + 01 1.27E − 02 5
NOA 5.38E − 02 4.09E − 01 8.77E + 00 1.27E − 02 9
SBOA 5.17E − 02 3.57E − 01 1.13E + 01 1.27E − 02 1

Fig. 24  Welding beam design structure


Table 19  Experimental results of welding beam design


Algorithm Optimal values for Variable Optimal value Ranking
h l t b

LSHADE_cnEpSin 1.99E − 01 3.34E + 00 9.19E + 00 1.99E − 01 1.67E + 00 4


LSHADE_SPACMA 1.99E − 01 3.34E + 00 9.19E + 00 1.99E − 01 1.67E + 00 2
MadDE 1.99E − 01 3.34E + 00 9.19E + 00 1.99E − 01 1.67E + 00 3
DE 2.39E − 01 3.55E + 00 7.73E + 00 2.86E − 01 2.09E + 00 14
AVOA 1.25E − 01 5.57E + 00 9.19E + 00 1.99E − 01 1.82E + 00 13
GWO 1.98E − 01 3.36E + 00 9.19E + 00 1.99E − 01 1.67E + 00 8
WOA 3.63E − 01 2.11E + 00 6.92E + 00 4.15E − 01 2.53E + 00 16
CPSOGSA 1.25E − 01 5.55E + 00 9.20E + 00 1.99E − 01 1.82E + 00 12
SO 1.93E − 01 3.45E + 00 9.19E + 00 1.99E − 01 1.68E + 00 9
GTO 1.99E − 01 3.34E + 00 9.19E + 00 1.99E − 01 1.67E + 00 6
COA 1.98E − 01 3.36E + 00 9.19E + 00 1.99E − 01 1.67E + 00 7
RIME 3.76E − 01 2.14E + 00 6.48E + 00 4.00E − 01 2.35E + 00 15
GJO 1.98E − 01 3.36E + 00 9.19E + 00 2.00E − 01 1.68E + 00 10
DBO 1.95E − 01 3.20E + 00 1.00E + 01 1.95E − 01 1.75E + 00 11
NOA 1.99E − 01 3.34E + 00 9.19E + 00 1.99E − 01 1.67E + 00 5
SBOA 1.99E − 01 3.34E + 00 9.19E + 00 1.99E − 01 1.67E + 00 1

Minimize f(x⃗) = 0.7854·x1x2²(3.3333·x3² + 14.9334·x3 − 43.0934)
  + 0.7854(x5x7² + x4x6²) − 1.508·x1(x6² + x7²) + 7.4777(x6³ + x7³),
Subject to g1(x) = −x1x2²x3 + 27 ≤ 0,
g2(x) = −x1x2²x3² + 397.5 ≤ 0,
g3(x) = −x2x6⁴x3x4⁻³ + 1.93 ≤ 0,
g4(x) = −x2x7⁴x3x5⁻³ + 1.93 ≤ 0,
g5(x) = 10x6⁻³·√( 16.91 × 10⁶ + (745·x4x2⁻¹x3⁻¹)² ) − 1100 ≤ 0,
g6(x) = 10x7⁻³·√( 157.5 × 10⁶ + (745·x5x2⁻¹x3⁻¹)² ) − 850 ≤ 0,       (24)
g7(x) = x2x3 − 40 ≤ 0,
g8(x) = −x1x2⁻¹ + 5 ≤ 0,
g9(x) = x1x2⁻¹ − 12 ≤ 0,
g10(x) = 1.5x6 − x4 + 1.9 ≤ 0,
g11(x) = 1.1x7 − x5 + 1.9 ≤ 0,
Parameters range 2.6 ≤ x1 ≤ 3.6, 0.7 ≤ x2 ≤ 0.8, 17 ≤ x3 ≤ 28,
7.3 ≤ x4, x5 ≤ 8.3, 2.9 ≤ x6 ≤ 3.9, 5 ≤ x7 ≤ 5.5.
As shown in Table 20, SBOA achieves the best optimization result, with an optimal
value of 2.99E + 03.
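The objective of Eq. (24) is straightforward to evaluate directly. A minimal sketch (the 7.4777 coefficient for the x6³ + x7³ term is the value commonly used for this problem):

```python
def reducer_weight(x):
    """Objective of Eq. (24); x = (x1, ..., x7)."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3))

# Rounded values from Table 20; the weight is ~2.99E+03 as reported.
x_best = (3.5, 0.7, 17, 7.3, 7.72, 3.35, 5.29)
f = reducer_weight(x_best)
```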

Table 20  Experimental results of reducer weight minimization design

Algorithm Optimal values for Variable Optimal value Ranking


X1 X2 X3 X4 X5 X6 X7

LSHADE_cnEpSin 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 2


LSHADE_SPACMA 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 3
MadDE 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 4
DE 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 6
AVOA 3.50E + 00 7.00E − 01 1.70E + 01 7.81E + 00 7.76E + 00 3.35E + 00 5.29E + 00 3.00E + 03 12
GWO 3.52E + 00 7.00E − 01 1.70E + 01 7.34E + 00 7.90E + 00 3.35E + 00 5.29E + 00 3.01E + 03 13
WOA 3.50E + 00 7.00E − 01 1.70E + 01 8.23E + 00 8.23E + 00 3.87E + 00 5.45E + 00 3.28E + 03 16
CPSOGSA 3.50E + 00 7.00E − 01 1.70E + 01 7.35E + 00 7.73E + 00 3.35E + 00 5.29E + 00 3.00E + 03 8
SO 3.50E + 00 7.00E − 01 1.74E + 01 7.30E + 00 7.72E + 00 3.36E + 00 5.29E + 00 3.06E + 03 15
GTO 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 3.00E + 03 11
COA 3.50E + 00 7.00E − 01 1.70E + 01 7.34E + 00 7.72E + 00 3.35E + 00 5.29E + 00 3.00E + 03 10
RIME 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 3.00E + 03 9
GJO 3.55E + 00 7.00E − 01 1.70E + 01 7.80E + 00 8.02E + 00 3.39E + 00 5.30E + 00 3.04E + 03 14
DBO 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 7
NOA 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 5
SBOA 3.50E + 00 7.00E − 01 1.70E + 01 7.30E + 00 7.72E + 00 3.35E + 00 5.29E + 00 2.99E + 03 1

5.1.6 Rolling element bearing design (REBD)

The design of rolling bearings presents complex nonlinear challenges. The bearing’s capac-
ity to support loads is constrained by ten parameters, encompassing five design variables:
pitch circle diameter (Dm ), ball diameter (Db ), curvature coefficients of the outer and inner
races ( fo and fi ), and the total number of balls (Z). The remaining five design parameters,
including e, 𝜖 , 𝜁 , KDmax , and KDmin, are used solely in the constraint conditions. The struc-
ture of the optimization problem for rolling bearings is illustrated in Fig. 25. The math-
ematical model for this problem can be represented by Eq. (25).
Consider x⃗ = [x1 x2 x3 x4 x5 x6 x7 x8 x9 x10] = [Dm Db fo fi Z e 𝜖 𝜁 KDmax KDmin],
Minimize f(x⃗) = fc·Z^(2/3)·Db^1.8, if Db ≤ 25.4 mm;
  f(x⃗) = 3.647·fc·Z^(2/3)·Db^1.4, otherwise,
Subject to g1(x) = Z − 𝜙0/(2·sin⁻¹(Db/Dm)) − 1 ≤ 0,
g2(x) = KDmin(D − d) − 2Db ≤ 0,
g3(x) = 2Db − KDmax(D − d) ≤ 0,
g4(x) = 𝜁·Bw − Db ≤ 0,
g5(x) = 0.5(D + d) − Dm ≤ 0,
g6(x) = Dm − (0.5 + e)(D + d) ≤ 0,
g7(x) = 𝜖·Db − 0.5(D − Dm − Db) ≤ 0,
g8(x) = 0.515 − fi ≤ 0,
g9(x) = 0.515 − fo ≤ 0,
Where, fc = 37.91·[ 1 + { 1.04·((1 − 𝛾)/(1 + 𝛾))^1.72·( fi(2fo − 1)/(fo(2fi − 1)) )^0.41 }^(10/3) ]^(−0.3),
𝛾 = Db·cos(𝛼)/Dm, fi = ri/Db, fo = ro/Db,
𝜙0 = 2𝜋 − 2·cos⁻¹[ ( {(D − d)/2 − 3(T/4)}² + {D/2 − T/4 − Db}² − {d/2 + T/4}² )
  / ( 2{(D − d)/2 − 3(T/4)}·{D/2 − T/4 − Db} ) ],
T = D − d − 2Db, D = 160, d = 90, Bw = 30,
Parameters range 0.5(D + d) ≤ Dm ≤ 0.6(D + d), 0.15(D − d) ≤ Db ≤ 0.45(D − d), 4 ≤ Z ≤ 50,
0.515 ≤ fi ≤ 0.6, 0.515 ≤ fo ≤ 0.6, 0.4 ≤ KDmin ≤ 0.5, 0.6 ≤ KDmax ≤ 0.7,
0.3 ≤ 𝜖 ≤ 0.4, 0.02 ≤ e ≤ 0.1, 0.6 ≤ 𝜁 ≤ 0.85.
(25)

Table 21 displays the optimization results for the rolling element bearing design problem
obtained with the different comparative algorithms. It is evident that SBOA, LSHADE_cnEpSin,
LSHADE_SPACMA, SO, and DBO simultaneously achieve the best objective value of
1.70E + 04 while arriving at different solutions.
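The load-capacity objective of Eq. (25) can be sketched directly; a contact angle 𝛼 = 0 is assumed here so that 𝛾 = Db/Dm, and the example inputs are the rounded SBOA-class solution from Table 21.

```python
def dynamic_load_capacity(Dm, Db, Z, fi, fo):
    """Basic dynamic load rating of Eq. (25), assuming contact angle alpha = 0."""
    gamma = Db / Dm                    # gamma = Db*cos(alpha)/Dm with alpha = 0
    fc = 37.91 * (1 + (1.04 * ((1 - gamma) / (1 + gamma))**1.72
                       * (fi * (2 * fo - 1)
                          / (fo * (2 * fi - 1)))**0.41)**(10 / 3))**(-0.3)
    if Db <= 25.4:                     # ball diameter in mm
        return fc * Z**(2 / 3) * Db**1.8
    return 3.647 * fc * Z**(2 / 3) * Db**1.4

# Rounded values in the spirit of Table 21 (fi = fo = 0.6):
Cd = dynamic_load_capacity(Dm=131.0, Db=18.0, Z=5, fi=0.6, fo=0.6)
```

With these rounded inputs the rating is on the order of the tabulated optimum of 1.70E + 04.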


Fig. 25  Rolling bearing design structure

5.1.7 Gear train design (GTD)

The gear train design problem is a practical issue in the field of mechanical engineering.
The objective is to minimize the ratio of output to input angular velocity of the gear train
by designing relevant gear parameters. Figure 26 illustrates the structure of the optimiza-
tion problem, and Eq. (26) describes the mathematical model for the optimization problem.
Consider x⃗ = [x1 x2 x3 x4] = [nA nB nC nD],
Minimize f(x⃗) = ( 1/6.931 − x1x2/(x3x4) )²,                         (26)
Parameter range 12 ≤ x1, x2, x3, x4 ≤ 60.

From Table 22, it is evident that the parameters optimized by SBOA, AVOA, WOA, GTO
and NOA result in the minimum cost for gear train design, achieving a cost of 0.00E + 00.
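Eq. (26) is cheap to evaluate exactly. A minimal sketch, checked against the classic best-known integer solution (nA, nB, nC, nD) = (16, 19, 43, 49), whose error is on the order of 10⁻¹²:

```python
def gear_ratio_error(nA, nB, nC, nD):
    """Squared deviation of the transmission ratio from 1/6.931 (Eq. 26)."""
    return (1 / 6.931 - (nA * nB) / (nC * nD))**2

err = gear_ratio_error(16, 19, 43, 49)
```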

5.1.8 Hydrostatic thrust bearing design (HSTBD)

The main objective of this design problem is to optimize bearing power loss using four
design variables. These design variables are oil viscosity (𝜇), bearing radius (R), flow rate
(R0), and groove radius (Q). This problem includes seven nonlinear constraints related to
inlet oil pressure, load capacity, oil film thickness, and inlet oil pressure. The mathematical
model for this problem is represented by Eq. (27).

Table 21  Experimental results of rolling element bearing design problems
Algorithm Optimal values for Parameters Optimal value Ranking
x1 x2 x3 x4 x5 x6 x7 x8 x9 x10

LSHADE_ 1.31E + 02 1.80E + 01 4.58E + 00 6.00E − 01 6.00E − 01 4.05E − 01 6.24E − 01 3.00E − 01 9.10E − 02 6.00E − 01 1.70E + 04 1
cnEpSin
LSHADE_ 1.31E + 02 1.80E + 01 4.85E + 00 6.00E − 01 6.00E − 01 4.43E − 01 7.00E − 01 3.00E − 01 1.00E − 01 6.00E − 01 1.70E + 04 1
SPACMA
MadDE 1.31E + 02 1.80E + 01 5.35E + 00 6.00E − 01 6.00E − 01 4.61E − 01 6.53E − 01 3.00E − 01 6.26E − 02 6.00E − 01 1.70E + 04 7
DE 1.31E + 02 1.80E + 01 5.28E + 00 6.00E − 01 6.00E − 01 4.97E − 01 6.88E − 01 3.02E − 01 3.97E − 02 6.00E − 01 1.70E + 04 10
AVOA 1.29E + 02 1.80E + 01 4.51E + 00 6.00E − 01 6.00E − 01 4.86E − 01 6.94E − 01 3.65E − 01 8.23E − 02 6.00E − 01 1.70E + 04 12
GWO 1.30E + 02 1.80E + 01 5.11E + 00 6.00E − 01 6.00E − 01 4.65E − 01 6.83E − 01 3.17E − 01 2.12E − 02 6.00E − 01 1.70E + 04 11

WOA 1.26E + 02 1.80E + 01 5.34E + 00 6.00E − 01 6.00E − 01 4.11E − 01 6.09E − 01 3.00E − 01 2.33E − 02 6.00E − 01 1.70E + 04 15
CPSOGSA 1.27E + 02 1.80E + 01 5.00E + 00 6.00E − 01 6.00E − 01 4.08E − 01 6.09E − 01 3.03E − 01 3.17E − 02 6.00E − 01 1.70E + 04 14
SO 1.31E + 02 1.80E + 01 4.51E + 00 6.00E − 01 6.00E − 01 5.00E − 01 6.00E − 01 3.00E − 01 1.00E − 01 6.00E − 01 1.70E + 04 1
GTO 1.25E + 02 1.80E + 01 4.51E + 00 6.00E − 01 6.00E − 01 4.00E − 01 6.00E − 01 4.00E − 01 7.24E − 02 6.00E − 01 1.71E + 04 16
COA 1.31E + 02 1.80E + 01 5.29E + 00 6.00E − 01 6.00E − 01 4.11E − 01 6.00E − 01 3.00E − 01 9.96E − 02 6.00E − 01 1.70E + 04 8
RIME 1.31E + 02 1.80E + 01 4.51E + 00 6.00E − 01 6.00E − 01 4.95E − 01 6.49E − 01 3.00E − 01 8.25E − 02 6.00E − 01 1.70E + 04 9
GJO 1.29E + 02 1.80E + 01 5.34E + 00 6.00E − 01 6.00E − 01 4.82E − 01 6.04E − 01 3.56E − 01 7.31E − 02 6.00E − 01 1.70E + 04 13
DBO 1.31E + 02 1.80E + 01 4.51E + 00 6.00E − 01 6.00E − 01 5.00E − 01 7.00E − 01 3.00E − 01 1.00E − 01 6.00E − 01 1.70E + 04 1
NOA 1.31E + 02 1.80E + 01 4.64E + 00 6.00E − 01 6.00E − 01 4.21E − 01 6.83E − 01 3.00E − 01 9.88E − 02 6.00E − 01 1.70E + 04 6
SBOA 1.31E + 02 1.80E + 01 5.26E + 00 6.00E − 01 6.00E − 01 4.97E − 01 6.06E − 01 3.00E − 01 1.00E − 01 6.00E − 01 1.70E + 04 1

Fig. 26  Gear train design structure

Table 22  Experimental results of gear train design


Algorithm Optimal values for Variable Optimal value Ranking
x1 x2 x3 x4

LSHADE_cnEpSin 1.37E + 01 3.23E + 01 5.30E + 01 5.77E + 01 2.48E − 14 12


LSHADE_SPACMA 3.54E + 01 1.21E + 01 5.68E + 01 5.22E + 01 3.08E − 33 6
MadDE 1.69E + 01 2.22E + 01 5.00E + 01 5.19E + 01 3.15E − 18 9
DE 2.63E + 01 1.71E + 01 5.76E + 01 5.44E + 01 7.42E − 13 15
AVOA 2.12E + 01 1.37E + 01 5.15E + 01 3.90E + 01 0.00E + 00 1
GWO 1.21E + 01 2.42E + 01 5.31E + 01 3.83E + 01 1.54E − 13 14
WOA 2.69E + 01 1.40E + 01 5.82E + 01 4.47E + 01 0.00E + 00 1
CPSOGSA 1.20E + 01 1.29E + 01 3.77E + 01 2.85E + 01 5.70E − 30 7
SO 2.27E + 01 2.16E + 01 5.66E + 01 5.99E + 01 6.57E − 15 11
GTO 2.38E + 01 1.20E + 01 4.50E + 01 4.39E + 01 0.00E + 00 1
COA 1.84E + 01 1.83E + 01 3.92E + 01 5.95E + 01 7.39E − 14 13
RIME 2.52E + 01 1.67E + 01 5.92E + 01 4.90E + 01 5.00E − 15 10
GJO 1.75E + 01 1.71E + 01 6.00E + 01 3.45E + 01 5.24E − 12 16
DBO 1.45E + 01 3.13E + 01 5.25E + 01 6.00E + 01 3.67E − 26 8
NOA 1.21E + 01 3.89E + 01 5.62E + 01 5.77E + 01 0.00E + 00 1
SBOA 3.41E + 01 1.22E + 01 5.99E + 01 4.80E + 01 0.00E + 00 1


Table 23  Experimental results of Hydrostatic thrust bearing design


Algorithm Optimal values for Variable Optimal value Ranking

R R0 𝜇 Q

LSHADE_cnEpSin 6.00E + 00 5.43E + 00 6.08E − 06 2.92E + 00 1.72E + 03 3


LSHADE_SPACMA 5.96E + 00 5.39E + 00 5.36E − 06 2.26E + 00 1.78E + 03 4
MadDE 5.96E + 00 5.39E + 00 6.00E − 06 2.80E + 00 1.69E + 03 2
DE 7.70E + 00 7.04E + 00 6.33E − 06 5.92E + 00 3.03E + 03 15
AVOA 7.52E + 00 7.08E + 00 8.90E − 06 1.44E + 01 2.73E + 03 13
GWO 6.06E + 00 5.50E + 00 8.15E − 06 6.68E + 00 2.08E + 03 7
WOA 8.23E + 00 7.83E + 00 6.91E − 06 6.52E + 00 2.66E + 03 11
CPSOGSA 4.27E + 00 7.32E + 00 6.53E − 06 9.58E + 00 2.64E + 20 16
SO 7.73E + 00 7.27E + 00 6.62E − 06 5.47E + 00 2.50E + 03 10
GTO 5.96E + 00 5.39E + 00 7.71E − 06 5.28E + 00 1.93E + 03 6
COA 6.82E + 00 6.32E + 00 9.01E − 06 1.35E + 01 2.68E + 03 12
RIME 7.18E + 00 6.69E + 00 7.04E − 06 5.59E + 00 2.30E + 03 9
GJO 7.12E + 00 6.65E + 00 6.90E − 06 5.21E + 00 2.24E + 03 8
DBO 8.75E + 00 8.38E + 00 8.21E − 06 1.23E + 01 2.97E + 03 14
NOA 5.96E + 00 5.39E + 00 7.30E − 06 4.46E + 00 1.85E + 03 5
SBOA 6.25E + 00 5.71E + 00 5.77E − 06 2.80E + 00 1.68E + 03 1

Minimize f(x) = QP0/0.7 + Ef,
Subject to g1(x) = 1000 − P0 ≤ 0,
g2(x) = W − 101000 ≤ 0,
g3(x) = 5000 − W/(𝜋(R² − R0²)) ≤ 0,
g4(x) = 50 − P0 ≤ 0,
g5(x) = 0.001 − (0.0307/(386.4·P0))·(Q/(2𝜋Rh)) ≤ 0,
g6(x) = R0 − R ≤ 0,
g7(x) = h − 0.001 ≤ 0,                                               (27)
Where, W = 𝜋P0(R² − R0²)/(2·ln(R/R0)), P0 = (6𝜇Q/(𝜋h³))·ln(R/R0),
Ef = 9336·Q × 0.0307 × 0.5·ΔT, ΔT = 2(10^P − 559.7),
P = ( log10( log10(8.122 × 10⁶𝜇 + 0.8) ) + 3.55 )/10.04,
h = (2𝜋 × 750/60)²·(2𝜋𝜇/Ef)·(R⁴/4 − R0⁴/4),
With bounds, 1 ≤ R ≤ 16, 1 ≤ R0 ≤ 16, 1 × 10⁻⁶ ≤ 𝜇 ≤ 16 × 10⁻⁶, 1 ≤ Q ≤ 16.


From Table 23, it is evident that the parameters optimized by SBOA result in the mini-
mum cost for the hydrostatic thrust bearing design.

5.1.9 Single cone pulley design (SCPD)

The primary objective of the step-cone pulley design is to minimize the weight of the
four-stage pulley by optimizing five variables. The first four variables represent the
diameter of each pulley step, and the fifth represents the pulley width. The mathematical
model for this problem is described by Eq. (28).
Minimize f(x⃗) = 𝜌𝜔[ d1²(1 + (N1/N)²) + d2²(1 + (N2/N)²)
  + d3²(1 + (N3/N)²) + d4²(1 + (N4/N)²) ],
Subject to h1(x) = C1 − C2 = 0,
h2(x) = C1 − C3 = 0,
h3(x) = C1 − C4 = 0,
g_{i=1,2,3,4}(x) = −Ri ≤ 2,                                          (28)
g_{i=5,6,7,8}(x) = (0.75 × 745.6998) − Pi ≤ 0,
Where, Ci = (𝜋di/2)(1 + Ni/N) + (Ni/N − 1)²·di²/(4a) + 2a, i = (1, 2, 3, 4),
Ri = exp( 𝜇[ 𝜋 − 2·sin⁻¹( (Ni/N − 1)·di/(2a) ) ] ), i = (1, 2, 3, 4),
Pi = st𝜔(1 − Ri)·𝜋diNi/60, i = (1, 2, 3, 4),
t = 8 mm, s = 1.75 MPa, 𝜇 = 0.35, 𝜌 = 7200 kg/m³, a = 3 mm.

From Table 24, it is evident that the parameters optimized by SBOA result in the lowest
cost for the step-cone pulley design, amounting to 8.18E + 00.

5.1.10 Gas transmission compressor design (GTCD)

The mathematical model for the gas transmission compressor design problem is repre-
sented by Eq. (29).
Minimize f(x) = 8.61 × 10⁵·x1^(1/2)·x2·x3^(−2/3)·x4^(−1/2) + 3.69 × 10⁴·x3
  + 7.72 × 10⁸·x1⁻¹·x2^0.219 − 765.43 × 10⁶·x1⁻¹,
Subject to g1(x) = x4·x2⁻² + x2⁻² − 1 ≤ 0,                           (29)
With bounds 20 ≤ x1 ≤ 50, 1 ≤ x2 ≤ 10, 20 ≤ x3 ≤ 50, 0.1 ≤ x4 ≤ 60.

Table 24  Experimental results on the design of single cone pulley design


Algorithm Optimal values for Variables Optimal value Ranking
d1 d2 d3 d4 w

LSHADE_ 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 3


cnEpSin
LSHADE_ 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 2
SPACMA
MadDE 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 5
DE 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.19E + 00 10
AVOA 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 9
GWO 1.71E + 01 2.83E + 01 5.09E + 01 8.45E + 01 9.00E + 01 8.20E + 00 12
WOA 1.74E + 01 3.32E + 01 5.26E + 01 8.76E + 01 8.78E + 01 8.87E + 00 16
CPSOGSA 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 4
SO 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 8
GTO 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 6
COA 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.19E + 00 11
RIME 1.71E + 01 2.84E + 01 5.11E + 01 8.49E + 01 8.95E + 01 8.23E + 00 14
GJO 1.72E + 01 2.84E + 01 5.10E + 01 8.45E + 01 9.00E + 01 8.23E + 00 13
DBO 1.81E + 01 3.01E + 01 5.41E + 01 9.00E + 01 8.45E + 01 8.71E + 00 15
NOA 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 7
SBOA 1.70E + 01 2.83E + 01 5.08E + 01 8.45E + 01 9.00E + 01 8.18E + 00 1

Table 25  Experimental results of gas transmission compressor design


Algorithm Optimal values for Variable Optimal value Ranking
x1 x2 x3 x4

LSHADE_cnEpSin 5.00E + 01 1.18E + 00 2.46E + 01 3.88E − 01 2.96E + 06 1


LSHADE_SPACMA 5.00E + 01 1.18E + 00 2.46E + 01 3.88E − 01 2.96E + 06 4
MadDE 5.00E + 01 1.18E + 00 2.46E + 01 3.88E − 01 2.96E + 06 5
DE 5.00E + 01 1.19E + 00 2.42E + 01 4.05E − 01 2.97E + 06 11
AVOA 4.19E + 01 1.16E + 00 2.52E + 01 3.36E − 01 2.97E + 06 14
GWO 5.00E + 01 1.18E + 00 2.44E + 01 3.98E − 01 2.97E + 06 9
WOA 4.99E + 01 1.19E + 00 2.43E + 01 4.22E − 01 2.97E + 06 13
CPSOGSA 3.57E + 01 1.14E + 00 2.07E + 01 2.99E − 01 3.00E + 06 15
SO 5.00E + 01 1.18E + 00 2.46E + 01 3.88E − 01 2.96E + 06 7
GTO 5.00E + 01 1.18E + 00 2.46E + 01 3.88E − 01 2.96E + 06 6
COA 4.98E + 01 1.18E + 00 2.46E + 01 3.85E − 01 2.96E + 06 8
RIME 5.00E + 01 1.18E + 00 2.47E + 01 3.83E − 01 2.97E + 06 10
GJO 4.93E + 01 1.19E + 00 2.43E + 01 4.09E − 01 2.97E + 06 12
DBO 3.52E + 01 1.14E + 00 2.00E + 01 3.09E − 01 3.01E + 06 16
NOA 5.00E + 01 1.18E + 00 2.46E + 01 3.88E − 01 2.96E + 06 1
SBOA 5.00E + 01 1.18E + 00 2.45E + 01 3.89E − 01 2.96E + 06 1


Table 25 shows that SBOA, LSHADE_cnEpSin, and NOA all achieve the best result of
2.96E + 06 while producing different solutions.

5.1.11 Planetary gear train design (PGTD)

The primary objective of this problem is to minimize the error in the gear ratios by
optimizing the tooth numbers of the automatic planetary transmission system. The problem
involves nine design variables: the six tooth numbers N1–N6, the number of planet gears p,
and the modules m1 and m3. The mathematical model is represented by Eq. (30):
Minimize f(x) = max|ik − i0k|, k = {1, 2, …, R},                     (30)
Where, i1 = N6/N4, i01 = 3.11,
i2 = N6(N1N3 + N2N4)/(N1N3(N6 − N4)), i02 = 1.84,
iR = −N2N6/(N1N3), i0R = −3.11,
x = {p, N6, N5, N4, N3, N2, N1, m3, m1},
Subject to g1(x) = m3(N6 + 2.5) − Dmax ≤ 0,
g2(x) = m1(N1 + N2) + m1(N2 + 2) − Dmax ≤ 0,
g3(x) = m3(N4 + N5) + m3(N5 + 2) − Dmax ≤ 0,
g4(x) = |m1(N1 + N2) − m3(N6 − N3)| − m1 − m3 ≤ 0,
g5(x) = −(N1 + N2)·sin(𝜋/p) + N2 + 2 + 𝛿22 ≤ 0,
g6(x) = −(N6 − N3)·sin(𝜋/p) + N3 + 2 + 𝛿33 ≤ 0,
g7(x) = −(N4 + N5)·sin(𝜋/p) + N5 + 2 + 𝛿55 ≤ 0,
g8(x) = (N3 + N5 + 2 + 𝛿35)² − (N6 − N3)² − (N4 + N5)²
  + 2(N6 − N3)(N4 + N5)·cos(2𝜋/p − 𝛽) ≤ 0,
g9(x) = N4 − N6 + 2N5 + 2𝛿56 + 4 ≤ 0,
g10(x) = 2N3 − N6 + N4 + 2𝛿34 + 4 ≤ 0,
h1(x) = (N6 − N4)/p = integer,
Where, 𝛿22 = 𝛿33 = 𝛿55 = 𝛿35 = 𝛿56 = 𝛿34 = 0.5,
𝛽 = cos⁻¹[ ( (N4 + N5)² + (N6 − N3)² − (N3 + N5)² ) / ( 2(N6 − N3)(N4 + N5) ) ],
Dmax = 220,
Parameters range p = (3, 4, 5),
m1 = (1.75, 2.0, 2.25, 2.5, 2.75, 3.0),
m3 = (1.75, 2.0, 2.25, 2.5, 2.75, 3.0),
17 ≤ N1 ≤ 96, 14 ≤ N2 ≤ 54, 14 ≤ N3 ≤ 51,
17 ≤ N4 ≤ 46, 14 ≤ N5 ≤ 51, 48 ≤ N6 ≤ 124,
and Ni = integer.

Table 26 displays the optimization results for the planetary gear train design. The results
indicate that SBOA achieves the minimum error, with an optimal value of 5.23E − 01.
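The objective of Eq. (30) only needs the five tooth counts that enter the three ratios. A minimal sketch of the error computation (the constraint set is omitted, and the tooth counts in the example are arbitrary values used purely to illustrate the calculation):

```python
def ratio_error(N1, N2, N3, N4, N6):
    """f(x) of Eq. (30): worst deviation of the three transmission
    ratios from their targets 3.11, 1.84 and -3.11."""
    i1 = N6 / N4
    i2 = N6 * (N1 * N3 + N2 * N4) / (N1 * N3 * (N6 - N4))
    iR = -N2 * N6 / (N1 * N3)
    return max(abs(i1 - 3.11), abs(i2 - 1.84), abs(iR + 3.11))

f = ratio_error(N1=40, N2=21, N3=14, N4=19, N6=59)
```

For this example the reverse ratio iR = −2.2125 dominates, so f = 0.8975.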

Table 26  Experimental results of planetary gear train design
Algorithm Optimal values for Variable Optimal value Ranking
X1 X2 X3 X4 X5 X6 X7 X8 X9

LSHADE_cnEpSin 4.22E + 01 3.05E + 01 2.72E + 01 2.73E + 01 2.44E + 01 9.80E + 01 1.05E + 00 1.53E + 00 2.10E + 00 5.26E − 01 7
LSHADE_SPACMA 5.24E + 01 2.13E + 01 1.67E + 01 3.02E + 01 2.55E + 01 1.09E + 02 5.10E − 01 2.80E + 00 5.80E − 01 5.23E − 01 2
MadDE 4.97E + 01 3.34E + 01 2.51E + 01 2.74E + 01 2.17E + 01 9.75E + 01 1.46E + 00 1.08E + 00 1.78E + 00 5.24E − 01 3
DE 6.43E + 01 2.84E + 01 1.87E + 01 3.07E + 01 2.69E + 01 1.12E + 02 2.04E + 00 5.74E − 01 1.36E + 00 5.31E − 01 14
AVOA 3.55E + 01 1.87E + 01 1.35E + 01 1.86E + 01 1.65E + 01 6.88E + 01 5.10E − 01 5.10E − 01 5.10E − 01 5.28E − 01 12
GWO 4.66E + 01 2.58E + 01 1.67E + 01 2.22E + 01 1.68E + 01 7.97E + 01 1.35E + 00 6.55E − 01 1.92E + 00 5.27E − 01 9
WOA 4.06E + 01 3.37E + 01 3.48E + 01 2.96E + 01 2.68E + 01 1.09E + 02 1.44E + 00 1.35E + 00 1.01E + 00 5.27E − 01 10

CPSOGSA 3.07E + 01 2.03E + 01 1.95E + 01 2.13E + 01 1.61E + 01 7.57E + 01 8.08E − 01 1.71E + 00 5.30E − 01 5.29E − 01 13
SO 2.88E + 01 1.85E + 01 2.61E + 01 2.97E + 01 1.49E + 01 1.09E + 02 5.46E − 01 6.49E + 00 1.36E + 00 5.28E − 01 11
GTO 3.66E + 01 2.44E + 01 2.04E + 01 2.19E + 01 2.03E + 01 8.01E + 01 6.97E − 01 8.41E − 01 1.06E + 00 5.26E − 01 8
COA 2.15E + 01 1.41E + 01 1.52E + 01 1.66E + 01 1.41E + 01 6.20E + 01 5.33E − 01 2.63E + 00 1.01E + 00 5.37E − 01 15
RIME 4.82E + 01 3.31E + 01 2.57E + 01 2.71E + 01 1.94E + 01 9.82E + 01 2.17E + 00 1.45E + 00 2.21E + 00 5.26E − 01 6
GJO 4.17E + 01 1.35E + 01 1.35E + 01 3.01E + 01 2.72E + 01 1.09E + 02 5.56E − 01 5.73E + 00 5.81E − 01 5.25E − 01 4
DBO 2.16E + 01 1.35E + 01 1.35E + 01 1.65E + 01 1.36E + 01 5.18E + 01 1.40E + 00 5.10E − 01 5.10E − 01 7.94E − 01 16
NOA 3.54E + 01 1.95E + 01 2.40E + 01 2.95E + 01 1.92E + 01 1.09E + 02 3.49E + 00 4.73E + 00 7.96E − 01 5.25E − 01 4
SBOA 2.37E + 01 1.80E + 01 2.04E + 01 1.87E + 01 1.62E + 01 6.92E + 01 5.56E − 01 3.20E + 00 2.02E + 00 5.23E − 01 1

5.1.12 Four‑stage gear box design (F‑sGBD)

The Four-stage Gear Box problem is relatively complex compared to other engineering
problems, involving 22 variables for optimization. These variables include the positions
of gears, positions of small gears, blank thickness, and the number of teeth, among others.
The problem comprises 86 nonlinear design constraints related to pitch, kinematics, con-
tact ratio, gear strength, gear assembly, and gear dimensions. The objective is to minimize
the weight of the gearbox. The mathematical model is represented by Eq. (31).
Consider x⃗ = [x1 x2, …, x21 x22] = [Npi Ngi bi xp1 xgi yp1 ygi], where i = (1, 2, 3, 4),
Minimize f(x̄) = (𝜋/1000)·Σᵢ₌₁⁴ [ bi·ci²(Npi² + Ngi²)/(Npi + Ngi)² ],
Subject to
g1(x̄) = [ 366000/(𝜋𝜔1) + 2c1Np1/(Np1 + Ng1) ]·[ (Np1 + Ng1)²/(4b1c1²Np1) ]
  − 𝜎N·JR/(0.0167·W·Ko·Km) ≤ 0,
g2(x̄) = [ 366000·Ng1/(𝜋𝜔1Np1) + 2c2Np2/(Np2 + Ng2) ]·[ (Np2 + Ng2)²/(4b2c2²Np2) ]
  − 𝜎N·JR/(0.0167·W·Ko·Km) ≤ 0,
g3(x̄) = [ 366000·Ng1Ng2/(𝜋𝜔1Np1Np2) + 2c3Np3/(Np3 + Ng3) ]·[ (Np3 + Ng3)²/(4b3c3²Np3) ]
  − 𝜎N·JR/(0.0167·W·Ko·Km) ≤ 0,
g4(x̄) = [ 366000·Ng1Ng2Ng3/(𝜋𝜔1Np1Np2Np3) + 2c4Np4/(Np4 + Ng4) ]·[ (Np4 + Ng4)²/(4b4c4²Np4) ]
  − 𝜎N·JR/(0.0167·W·Ko·Km) ≤ 0,
g5(x̄) = [ 366000/(𝜋𝜔1) + 2c1Np1/(Np1 + Ng1) ]·[ (Np1 + Ng1)³/(4b1c1²Ng1Np1²) ]
  − (𝜎H/Cp)²·sin(𝜙)cos(𝜙)/(0.0334·W·Ko·Km) ≤ 0,
g6(x̄) = [ 366000·Ng1/(𝜋𝜔1Np1) + 2c2Np2/(Np2 + Ng2) ]·[ (Np2 + Ng2)³/(4b2c2²Ng2Np2²) ]
  − (𝜎H/Cp)²·sin(𝜙)cos(𝜙)/(0.0334·W·Ko·Km) ≤ 0,
g7(x̄) = [ 366000·Ng1Ng2/(𝜋𝜔1Np1Np2) + 2c3Np3/(Np3 + Ng3) ]·[ (Np3 + Ng3)³/(4b3c3²Ng3Np3²) ]
  − (𝜎H/Cp)²·sin(𝜙)cos(𝜙)/(0.0334·W·Ko·Km) ≤ 0,
g8(x̄) = [ 366000·Ng1Ng2Ng3/(𝜋𝜔1Np1Np2Np3) + 2c4Np4/(Np4 + Ng4) ]·[ (Np4 + Ng4)³/(4b4c4²Ng4Np4²) ]
  − (𝜎H/Cp)²·sin(𝜙)cos(𝜙)/(0.0334·W·Ko·Km) ≤ 0,
g9−12(x̄) = −Npi·√( sin²(𝜙)/4 + 1/Npi + (1/Npi)² ) − Ngi·√( sin²(𝜙)/4 + 1/Ngi + (1/Ngi)² )
  + sin(𝜙)(Npi + Ngi)/2 + CRmin·𝜋·cos(𝜙) ≤ 0,
g13−16(x̄) = dmin − 2ciNpi/(Npi + Ngi) ≤ 0,
g17−20(x̄) = dmin − 2ciNgi/(Npi + Ngi) ≤ 0,
g21(x̄) = xp1 + (Np1 + 2)c1/(Np1 + Ng1) − Lmax ≤ 0,
g22−24(x̄) = −Lmax + (Npi + 2)ci/(Npi + Ngi) + xg(i−1) ≤ 0, i = 2, 3, 4,
g25(x̄) = −xp1 + (Np1 + 2)c1/(Np1 + Ng1) ≤ 0,
g26−28(x̄) = (Npi + 2)ci/(Npi + Ngi) − xg(i−1) ≤ 0, i = 2, 3, 4,
g29(x̄) = yp1 + (Np1 + 2)c1/(Np1 + Ng1) − Lmax ≤ 0,
(31)

Table 27  Experimental results of four-stage gear box design
Algorithm LSHADE_cnEpSin LSHADE_SPACMA MadDE DE AVOA GWO WOA CPSOGSA

Optimal for variable


X1 3.86E + 01 1.30E + 01 1.46E + 01 1.35E + 01 1.38E + 01 1.20E + 01 7.21E + 00 1.47E + 01
X2 6.32E + 01 3.58E + 01 3.53E + 01 6.69E + 01 5.89E + 01 3.70E + 01 5.05E + 01 3.94E + 01
X3 1.17E + 01 1.66E + 01 2.36E + 01 1.71E + 01 1.60E + 01 1.86E + 01 8.85E + 00 9.56E + 00
X4 7.19E + 01 3.10E + 01 4.83E + 01 4.97E + 01 5.68E + 01 2.25E + 01 2.42E + 01 2.06E + 01
X5 5.91E + 01 9.56E + 00 1.43E + 01 5.21E + 01 1.17E + 01 1.32E + 01 3.10E + 01 2.04E + 01
X6 4.95E + 01 3.85E + 01 4.09E + 01 4.29E + 01 1.47E + 01 1.98E + 01 2.75E + 01 1.20E + 01
X7 2.77E + 01 3.66E + 01 1.58E + 01 1.96E + 01 1.78E + 01 1.55E + 01 6.63E + 00 6.51E + 00
X8 7.44E + 01 3.94E + 01 2.27E + 01 3.35E + 01 1.87E + 01 5.69E + 01 7.61E + 00 4.17E + 01
X9 6.31E − 01 1.04E + 00 1.01E + 00 1.34E + 00 8.97E − 01 6.01E − 01 5.98E − 01 6.59E − 01
X10 7.25E − 01 1.68E + 00 1.22E + 00 5.57E − 01 1.21E + 00 5.92E − 01 5.56E − 01 6.37E − 01
X11 1.04E + 00 1.84E + 00 1.49E + 00 1.40E + 00 5.22E − 01 7.40E − 01 5.32E − 01 5.10E − 01
X12 8.68E − 01 1.23E + 00 2.11E + 00 1.32E + 00 5.10E − 01 5.57E − 01 5.38E − 01 7.92E − 01

X13 6.75E + 00 1.66E + 00 1.51E + 00 7.70E + 00 1.24E + 00 8.14E − 01 3.19E + 00 5.99E − 01


X14 1.67E + 00 5.54E + 00 6.41E + 00 4.15E + 00 5.28E + 00 3.08E + 00 3.50E + 00 1.52E + 00
X15 4.19E + 00 5.67E + 00 5.56E + 00 6.05E + 00 3.55E + 00 6.01E − 01 5.16E − 01 1.83E + 00
X16 3.41E + 00 6.08E + 00 6.09E + 00 5.62E + 00 3.16E + 00 2.94E + 00 5.23E − 01 1.63E + 00
X17 4.98E + 00 4.63E + 00 5.63E + 00 4.96E + 00 5.10E − 01 5.60E + 00 2.37E + 00 3.30E + 00
X18 1.22E + 00 6.03E + 00 6.26E + 00 1.92E + 00 2.57E + 00 3.08E + 00 5.22E − 01 3.77E + 00
X19 2.69E + 00 6.37E + 00 5.90E + 00 4.14E + 00 5.67E + 00 5.88E + 00 4.67E + 00 1.32E + 00
X20 4.72E + 00 6.20E + 00 6.19E + 00 5.54E + 00 5.70E + 00 4.81E + 00 5.65E − 01 1.61E + 00
X21 5.98E + 00 4.36E + 00 5.80E + 00 6.08E + 00 2.28E + 00 1.72E + 00 3.49E + 00 1.53E + 00
X22 5.23E + 00 3.66E + 00 5.13E + 00 1.73E + 00 7.21E − 01 3.65E + 00 5.27E − 01 1.70E + 00
Optimal value 3.02E + 15 3.01E + 15 7.06E + 01 3.33E + 16 1.94E + 16 3.14E + 16 8.07E + 17 6.95E + 17
Ranking 6 5 2 9 7 8 16 13
Table 27  (continued)

Algorithm SO GTO COA RIME GJO DBO NOA SBOA

Optimal for variable
X1 1.19E + 01 6.51E + 00 1.33E + 01 1.73E + 01 1.16E + 01 1.20E + 01 2.05E + 01 1.82E + 01

X2 1.91E + 01 1.07E + 01 2.50E + 01 4.58E + 01 5.70E + 01 1.26E + 01 2.76E + 01 3.84E + 01


X3 9.61E + 00 4.95E + 01 6.52E + 00 1.36E + 01 1.27E + 01 6.51E + 00 8.81E + 00 2.42E + 01
X4 2.52E + 01 6.47E + 01 2.71E + 01 2.21E + 01 2.08E + 01 2.62E + 01 3.74E + 01 5.54E + 01
X5 1.38E + 01 6.53E + 00 9.78E + 00 2.15E + 01 1.27E + 01 6.51E + 00 1.13E + 01 1.34E + 01
X6 4.23E + 01 4.72E + 01 2.75E + 01 6.74E + 01 2.26E + 01 3.71E + 01 2.29E + 01 2.24E + 01
X7 2.35E + 01 6.93E + 00 6.51E + 00 3.06E + 01 9.30E + 00 1.40E + 01 1.76E + 01 1.46E + 01
X8 3.76E + 01 9.66E + 00 6.67E + 00 4.72E + 01 1.27E + 01 1.31E + 01 3.14E + 01 3.62E + 01
X9 1.16E + 00 3.56E + 00 6.84E − 01 5.73E − 01 6.23E − 01 5.10E − 01 5.11E − 01 7.04E − 01
X10 5.10E − 01 5.22E − 01 8.19E − 01 3.26E + 00 8.15E − 01 5.10E − 01 3.14E + 00 6.82E − 01
X11 8.55E − 01 9.71E − 01 8.16E − 01 1.35E + 00 1.05E + 00 5.10E − 01 1.83E + 00 1.93E + 00
X12 8.19E − 01 3.52E + 00 1.52E + 00 5.51E − 01 2.77E + 00 5.10E − 01 5.38E − 01 5.52E − 01
X13 3.04E + 00 7.69E + 00 8.72E − 01 4.81E + 00 5.56E − 01 2.09E + 00 8.57E − 01 5.64E + 00
X14 5.10E − 01 6.38E + 00 1.10E + 00 6.57E + 00 5.89E + 00 1.02E + 00 1.95E + 00 2.55E + 00
X15 5.10E − 01 7.28E + 00 1.61E + 00 3.37E + 00 9.21E − 01 9.33E − 01 5.34E + 00 2.68E + 00
X16 4.13E + 00 7.33E + 00 8.60E − 01 6.46E + 00 9.17E − 01 4.62E + 00 3.53E + 00 5.86E + 00
X17 7.05E + 00 4.75E + 00 1.58E + 00 3.94E + 00 2.12E + 00 7.96E − 01 4.36E + 00 5.96E + 00
X18 3.00E + 00 7.45E + 00 3.03E + 00 1.43E + 00 3.88E + 00 1.56E + 00 1.29E + 00 1.81E + 00
X19 3.94E + 00 5.00E + 00 8.61E − 01 3.96E + 00 3.80E + 00 1.10E + 00 3.32E + 00 2.64E + 00
X20 2.87E + 00 1.78E + 00 1.10E + 00 5.41E + 00 2.41E + 00 3.62E + 00 4.74E + 00 4.04E + 00
X21 5.95E + 00 3.62E + 00 5.68E − 01 6.48E + 00 1.05E + 00 4.26E + 00 3.13E + 00 5.44E + 00
X22 2.72E + 00 7.49E + 00 3.65E + 00 5.83E + 00 6.86E − 01 5.10E − 01 2.78E + 00 6.24E + 00
Optimal value 2.49E + 17 3.92E + 17 7.93E + 17 1.70E + 15 7.75E + 17 6.49E + 17 1.91E + 15 5.01E + 01
Ranking 10 11 15 3 14 12 4 1

Fig. 27  Comparison chart of the ranking of engineering problems

Fig. 28  Mountain environment simulation


g30−32(x̄) = −Lmax + (Npi + 2)ci/(Npi + Ngi) + yg(i−1) ≤ 0, i = 2, 3, 4,
g33(x̄) = −yp1 + (Np1 + 2)c1/(Np1 + Ng1) ≤ 0,
g34−36(x̄) = (Npi + 2)ci/(Npi + Ngi) − yg(i−1) ≤ 0, i = 2, 3, 4,
g37−40(x̄) = −Lmax + (Ngi + 2)ci/(Npi + Ngi) + xgi ≤ 0,
g41−44(x̄) = −xgi + (Ngi + 2)ci/(Npi + Ngi) ≤ 0,
g45−48(x̄) = ygi + (Ngi + 2)ci/(Npi + Ngi) − Lmax ≤ 0,
g49−52(x̄) = −ygi + (Ngi + 2)ci/(Npi + Ngi) ≤ 0,
g53−56(x̄) = (bi − 8.255)(bi − 5.715)(bi − 12.70)(−Npi + 0.945ci − Ngi)(−1) ≤ 0,
g57−60(x̄) = (bi − 8.255)(bi − 3.175)(bi − 12.70)(−Npi + 0.646ci − Ngi) ≤ 0,
g61−64(x̄) = (bi − 5.715)(bi − 3.175)(bi − 12.70)(−Npi + 0.504ci − Ngi) ≤ 0,
g65−68(x̄) = (bi − 5.715)(bi − 3.175)(bi − 8.255)(0.0ci − Ngi − Npi) ≤ 0,
g69−72(x̄) = (bi − 8.255)(bi − 5.715)(bi − 12.70)(Ngi + Npi − 1.812ci)(−1) ≤ 0,
g73−76(x̄) = (bi − 8.255)(bi − 3.175)(bi − 12.70)(−0.945ci + Npi + Ngi) ≤ 0,
g77−80(x̄) = (bi − 5.715)(bi − 3.175)(bi − 12.70)(−0.646ci + Npi + Ngi)(−1) ≤ 0,
g81−84(x̄) = (bi − 5.715)(bi − 3.175)(bi − 8.255)(Npi + Ngi − 0.504ci) ≤ 0,
g85(x̄) = 𝜔min − 𝜔1·(Np1Np2Np3Np4)/(Ng1Ng2Ng3Ng4) ≤ 0,
g86(x̄) = 𝜔1·(Np1Np2Np3Np4)/(Ng1Ng2Ng3Ng4) − 𝜔max ≤ 0,
Where, ci = √( (ygi − ypi)² + (xgi − xpi)² ), Ko = 1.5, dmin = 25, JR = 0.2, 𝜙 = 120°,
W = 55.9, Km = 1.6, CRmin = 1.4,
Lmax = 127, Cp = 464, 𝜎H = 3290, 𝜔max = 255, 𝜔1 = 5000, 𝜎N = 2090, 𝜔min = 245,
Parameters range bi ∈ {3.175, 5.715, 8.255, 12.7},
xp1, yp1, xgi, ygi ∈ {12.7, 25.4, 38.1, 50.8, 63.5, 76.2, 88.9, 101.6, 114.3},
7 ≤ Npi, Ngi ≤ 76, Npi, Ngi ∈ integer.
(31)
From Table 27, it is evident that the optimization performance of SBOA is signifi-
cantly superior to that of other algorithms. Moreover, it achieves an optimal result of
5.01E + 01.
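The objective of Eq. (31) is a weight proxy summed over the four stages. A minimal sketch of just this objective (for brevity each stage is given its own pinion coordinates, whereas in the full problem the pinions of stages 2–4 sit at the preceding gear centres; the example values are illustrative, not a feasible design):

```python
import math

def gearbox_weight(Np, Ng, b, xp, yp, xg, yg):
    """Objective of Eq. (31): gear-blank weight proxy over the four stages."""
    total = 0.0
    for i in range(4):
        # Centre distance c_i between pinion i and gear i.
        c = math.hypot(yg[i] - yp[i], xg[i] - xp[i])
        total += b[i] * c**2 * (Np[i]**2 + Ng[i]**2) / (Np[i] + Ng[i])**2
    return math.pi / 1000 * total

# Illustrative values only (not a feasible design):
f = gearbox_weight(Np=[20] * 4, Ng=[40] * 4, b=[3.175] * 4,
                   xp=[0] * 4, yp=[0] * 4, xg=[30] * 4, yg=[40] * 4)
```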
In Sects. 5.1.1–5.1.12, we conducted a comparative validation of the secretary bird
optimization algorithm (SBOA) against fifteen other advanced algorithms across twelve real
engineering problems. To highlight the comparative performance, we used radar charts
to illustrate the ranking of each algorithm in different engineering problems, as shown
in Fig. 27. Smaller areas in the algorithm regions indicate better performance across the
twelve engineering problems. From the graph, it is evident that SBOA achieved the opti-
mal solution in each engineering problem, clearly indicating not only its outstanding per-
formance but also its high level of stability in solving real-world problems. The experi-
ments in this section provide substantial evidence of the broad applicability and scalability
of the SBOA method, establishing a solid foundation for its use in practical engineering
applications.

5.2 3D Trajectory planning for UAVs

Unmanned aerial vehicles (UAVs) play a vital role in various civil and military applica-
tions, and their importance and convenience are widely recognized. As a core task of the
autonomous control system for UAVs, path planning and design aim to solve a complex
constrained optimization problem: finding a reliable and safe path from a starting point
to a goal point under certain constraints. In recent years, with the widespread applica-
tion of UAVs, research on the path planning problem has garnered considerable attention.
Therefore, we employ SBOA to address the UAV path planning problem and verify the

13
Secretary bird optimization algorithm: a new metaheuristic… Page 93 of 102 123

Table 28  Experimental results of 3D trajectory planning for UAV


Algorithms Best Worst Ave Std Rank

LSHADE_cnEpSin 228.5641 412.7612 378.3891 53.05882 10


LSHADE_SPACMA 228.5641 439.6877 354.9096 77.92180 5
MadDE 228.5647 415.0733 328.5696 88.13463 3
DE 237.4253 391.9168 290.2272 37.71641 2
AVOA 228.6368 495.6185 375.3102 79.23921 9
GWO 377.9066 440.4302 408.5835 70.66736 12
WOA 245.2525 641.0803 417.3053 105.73628 13
CPSOGSA 400.1338 1123.6400 640.1461 150.29701 16
SO 228.5648 441.5901 335.5449 82.36531 4
GTO 228.5655 640.8621 425.2463 115.12432 14
COA 229.2279 473.5245 371.5825 77.23105 8
RIME 228.8587 568.3395 359.7655 97.28082 6
GJO 413.7198 625.2692 431.2313 45.30166 15
DBO 228.5668 439.6956 367.9958 83.54673 7
NOA 228.5641 566.6555 390.3788 74.7564 11
SBOA 228.5641 377.3408 283.8366 16.3099 1

algorithm’s effectiveness. A specific mathematical model for this problem is outlined as
follows.

5.2.1 SBOA applied to UAV 3D path planning modeling

In mountainous environments, the flight trajectory of unmanned aerial vehicles (UAVs) is


primarily influenced by factors such as high mountain peaks, adverse weather conditions,
and restricted airspace areas. Typically, UAVs navigate around these regions for safety rea-
sons during their flights. This paper focuses on the trajectory planning problem for UAVs
in mountainous terrain, taking into consideration factors like high mountain peaks, weather
threats, and no-fly zones, and establishes a trajectory planning model. Figure 28 illustrates
the simulation environment model. The mathematical model for the ground and obstacle
representation can be expressed by Eq. (32).
z = sin(y + 1) + sin(x) + cos(x2 + y2 ) + 2 × cos(y) + sin(x2 + y2 ) (32)
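To make the simulation surface concrete, the sketch below samples Eq. (32) on a small grid; this is our own illustrative code, not the authors' environment model, and the grid spacing is an arbitrary assumption.

```python
import math

def terrain_height(x, y):
    """Ground and obstacle model of Eq. (32)."""
    return (math.sin(y + 1) + math.sin(x) + math.cos(x ** 2 + y ** 2)
            + 2 * math.cos(y) + math.sin(x ** 2 + y ** 2))

# Sample the surface on a coarse grid, e.g. to build a height map that a
# planner can query when checking waypoints against the terrain.
grid = [[terrain_height(ix * 0.5, iy * 0.5) for ix in range(5)]
        for iy in range(5)]
```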
During the flight of unmanned aerial vehicles (UAVs), certain trajectory constraints need
to be satisfied. These constraints primarily include trajectory length, maximum turning
angle, and flight altitude, among others.
(1) Trajectory Length: In general, the flight of a UAV aims to minimize time and reduce
costs while ensuring safety. Therefore, the path length in path planning is crucial. The
mathematical model is described by Eq. (33).

L = ∑_{i=1}^{n−1} ‖(x_{i+1}, y_{i+1}, z_{i+1}) − (x_i, y_i, z_i)‖₂ (33)

where (x_i, y_i, z_i), i = 1, 2, ⋯, n, represents the i-th waypoint along the planned path of
the unmanned aerial vehicle.


Fig. 29  Optimal fitness search iteration curve

Fig. 30  Flight tracks optimized by algorithms

(2) Flight Altitude: The flight altitude of the unmanned aerial vehicle significantly affects
the control system and safety. The mathematical model for this constraint is shown in
Eq. (34).

H = √( ∑_{i=1}^{n} ( z_i^g − (1/n) ∑_{k=1}^{n} z_k^g )² ) (34)

where z_i^g denotes the flight height of the UAV above the ground at the i-th waypoint.
(3) Maximum Turning Angle: The turning angle of the unmanned aerial vehicle must be
within a specified maximum turning angle. The constraint for the maximum turning angle
can be expressed as:

cos θ_i = (𝜑_{i+1} · 𝜑_i) / (|𝜑_{i+1}| × |𝜑_i|), 𝜑_i = (x_{i+1} − x_i, y_{i+1} − y_i, z_{i+1} − z_i) (35)

where 𝜑_i is the vector of the i-th path segment, θ_i is the turning angle when moving from
𝜑_i to 𝜑_{i+1}, and θ_max represents the maximum turning angle, so that θ_i ≤ θ_max; the
smoothing cost S accumulates these turning angles along the path.
To evaluate the planned trajectory, it is common to consider multiple factors, including
the maneuverability of the UAV, trajectory length, altitude above ground, and the magni-
tude of threats from various sources. Based on a comprehensive analysis of the impact of
these factors, the trajectory cost function is calculated using the formula shown in Eq. (36)
to assess the trajectory.

C = ω1 × L + ω2 × H + ω3 × S (36)

where L and H are the length and altitude-above-ground costs of the trajectory, respectively;
S is the smoothing cost of the planned path; ω1, ω2 and ω3 are weight coefficients satisfying
ω1 + ω2 + ω3 = 1. By adjusting these weight coefficients, the influence of each factor on the
trajectory can be modulated.
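The cost terms of Eqs. (33)–(36) can be sketched in a few lines of Python. The waypoint format, the accumulated-angle form of S, and the example weights below are our assumptions for illustration, not values fixed by the paper.

```python
import math

def segment(p, q):
    """Vector from waypoint p to waypoint q."""
    return tuple(qi - pi for pi, qi in zip(p, q))

def length_cost(path):
    """L in Eq. (33): sum of Euclidean segment lengths."""
    return sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))

def altitude_cost(heights):
    """H in Eq. (34): spread of the flight heights about their mean."""
    mean = sum(heights) / len(heights)
    return math.sqrt(sum((h - mean) ** 2 for h in heights))

def smooth_cost(path):
    """S in Eq. (35): accumulated turning angles between segments."""
    total = 0.0
    for i in range(len(path) - 2):
        a = segment(path[i], path[i + 1])
        b = segment(path[i + 1], path[i + 2])
        dot = sum(x * y for x, y in zip(a, b))
        cos_theta = dot / (math.hypot(*a) * math.hypot(*b))
        total += math.acos(max(-1.0, min(1.0, cos_theta)))
    return total

def trajectory_cost(path, heights, w=(0.5, 0.3, 0.2)):
    """C in Eq. (36) with weights summing to 1 (example weights)."""
    return (w[0] * length_cost(path) + w[1] * altitude_cost(heights)
            + w[2] * smooth_cost(path))
```

For a feasible path one would additionally reject any segment pair whose turning angle exceeds the maximum allowed value.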

5.2.2 Example of 3D path planning for UAV

To validate the effectiveness of the proposed secretary bird optimization algorithm
(SBOA) in three-dimensional trajectory planning for unmanned aerial vehicles (UAVs),
this study conducted a verification in a simulation environment. The UAV’s maximum
flight altitude was set to 50 m, with a flight speed of 20 m/s and a maximum turning angle
of 60°. The spatial region for UAV flight in the environment was defined as a three-
dimensional space with dimensions of 200 m in length, 200 m in width, and 100 m in
height. The UAV’s starting point was set at coordinates (0, 0, 20), and the destination
point was set at (200, 200, 30). While utilizing the SBOA proposed in this study for
environmental trajectory planning, a comparative analysis was performed with 15 other
algorithms. The population size for all methods was set to 50, with a maximum iteration
count of 500. To eliminate randomness in computational results, each algorithm was
independently run 30 times under the same environmental configuration.
The statistical results are presented in Table 28, where “Best” represents the optimal
path length, “Ave” indicates the average path length, “Worst” represents the worst path
length, “Std” denotes the standard deviation, and “Rank” signifies the ranking based on the
average path length. Figure 29 illustrates the search iteration curves for obtaining the opti-
mal fitness value, while Fig. 30 provides 3D and 2D schematic diagrams of the optimal
trajectories obtained by the 16 different methods.
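The Best/Worst/Ave/Std/Rank columns of Table 28 are ordinary run statistics; assuming the 30 final fitness values of each algorithm are stored in a list, they can be reproduced as in this sketch (the algorithm names below are placeholders):

```python
import statistics

def summarize_runs(fitness_values):
    """Best/Worst/Ave/Std over independent runs of a minimization method,
    matching the columns of Table 28."""
    return {
        "Best": min(fitness_values),
        "Worst": max(fitness_values),
        "Ave": statistics.fmean(fitness_values),
        "Std": statistics.stdev(fitness_values),  # sample standard deviation
    }

def rank_by_average(results):
    """Rank algorithms by average fitness; a smaller average means a better rank."""
    ordered = sorted(results, key=lambda name: results[name]["Ave"])
    return {name: rank for rank, name in enumerate(ordered, start=1)}
```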
From Table 28, it can be observed that, in terms of both the optimal fitness value and
the average fitness value, the SBOA method proposed in this paper yields the smallest fit-
ness value, while CPSOGSA produces the largest. Additionally, the average fitness value
of SBOA is even smaller than the optimal fitness values of the other 15 methods, indicat-
ing higher computational precision for SBOA. In terms of standard deviation, SBOA has
the smallest value, suggesting stronger computational stability compared to the other 15
methods.
The search iteration curves in Fig. 29 also support the conclusions drawn from Table 28.
Specifically, compared to methods such as GTO, CPSOGSA, LSHADE_cnEpSin, and oth-
ers, SBOA exhibits a faster convergence rate and higher convergence accuracy. As evi-
dent in Fig. 30, the trajectories planned by the 16 methods effectively avoid threat areas,


demonstrating the feasibility of the generated trajectories. In Fig. 30b, the trajectory
obtained by SBOA is the shortest and relatively smoother, while DBO’s trajectory is the
worst, requiring ascent to a certain height and navigating a certain distance to avoid threat
areas. The results indicate that SBOA can effectively enhance the efficiency of trajectory
planning, demonstrating a certain advantage.

6 Summary and outlook

This article introduces a new bio-inspired optimization algorithm named the secretary bird
optimization algorithm (SBOA), which aims to simulate the behavior of secretary birds in
nature. The hunting strategy of secretary birds in capturing snakes and their mechanisms
for evading predators have been incorporated as fundamental inspirations in the design of
SBOA. The implementation of SBOA is divided into two phases: a simulated exploration
phase based on the hunting strategy and a simulated exploitation phase based on the eva-
sion strategy, both of which are mathematically modeled. To validate the effectiveness of
SBOA, we conducted a comparative analysis of exploration versus exploitation and con-
vergence behavior. The analysis results demonstrate that SBOA exhibits outstanding per-
formance in balancing exploration and exploitation, convergence speed, and convergence
accuracy.
To evaluate the performance of SBOA, experiments were conducted using the CEC-
2017 and CEC-2022 benchmark functions. The results demonstrate that, both in dif-
ferent dimensions of CEC-2017 and on CEC-2022 test functions, SBOA consistently
ranks first in terms of average values and function average rankings. Notably, even in
high-dimensional problems of CEC-2017, SBOA maintains its strong problem-solving
capabilities, obtaining the best results in 20 out of 30 test functions. Furthermore, to
assess the algorithm’s ability to solve real-world problems, it was applied to twelve clas-
sical engineering practical problems and a three-dimensional path planning problem
for Unmanned Aerial Vehicles (UAVs). In comparison to other benchmark algorithms,
SBOA consistently obtained the best results. This indicates that SBOA maintains strong
optimization capabilities when dealing with constrained engineering problems and
practical optimization problems and outperforms other algorithms in these scenarios,
providing optimal optimization results. In summary, the proposed SBOA demonstrates
excellent performance in both unconstrained and constrained problems, showcasing its
robustness and wide applicability.
In future research, there are many aspects that require continuous innovation. We
intend to optimize SBOA from the following perspectives:

1. Algorithm Fusion and Collaborative Optimization: Integrating SBOA with other
metaheuristic algorithms to collaboratively optimize them, leveraging the strengths of
each algorithm, and further enhancing the accuracy and robustness of problem-solving.
2. Adaptive and Self-Learning Algorithms: Enhancing SBOA’s adaptability and self-
learning capabilities by incorporating techniques such as machine learning and deep
learning. This allows SBOA to adapt to changes in different problem scenarios and
environments. It enables SBOA to learn from data and adjust its parameters and strate-
gies autonomously, thus improving its solving effectiveness and adaptability.
3. Multi-Objective and Constraint Optimization: As real-world problems become increas-
ingly complex, multi-objective and constraint optimization problems are gaining impor-


tance. In future research, we will place greater emphasis on enhancing SBOA’s capabil-
ity to address multi-objective problems, providing more comprehensive solutions for
optimization challenges.
4. Expanding Application Domains: The initial design of metaheuristic algorithms was
intended to address real-life problems. Therefore, we will further expand the application
domains of SBOA to make it applicable in a wider range of fields. This includes docu-
ment classification and data mining, circuit fault diagnosis, wireless sensor networks,
and 3D reconstruction, among others.

Supplementary Information The online version contains supplementary material available at https://​doi.​
org/​10.​1007/​s10462-​024-​10729-y.

Acknowledgements This work was supported by the Technology Plan of Guizhou Province (Contract No:
Qian Kehe Support [2023] General 117), (Contract No: Qiankehe Support [2023] General 124) and (Con-
tract No: Qiankehe Support [2023] General 302),Guizhou Provincial Science and Technology Projects
QKHZC[2023]118, and the National Natural Science Foundation of China (72061006).

Author contributions YF: Conceptualization, Methodology, Writing—Original Draft, Data Curation, Writ-
ing—Review and Editing, Software. DL: Supervision, Writing-Reviewing and Editing. JC: Writing-Origi-
nal Draft, Formal analysis. LH: Writing-Reviewing and Editing, Drawing, Funding Acquisition.

Funding Funding was provided by Natural Science Foundation of Guizhou Province (Contract No.: Qian
Kehe Support [2023] General 117) (Contract No: Qiankehe Support [2023] General 124) and (Contract No:
Qiankehe Support [2023] General 302), Guizhou Provincial Science and Technology Projects QKHZC[202
3]118, and the National Natural Science Foundation of China (Grant No. 72061006).

Data availability All data generated or analyzed in this study are included in this paper.

Declarations
Competing interests The authors declare no competing interests.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License,
which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long
as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Com-
mons licence, and indicate if changes were made. The images or other third party material in this article
are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the
material. If material is not included in the article’s Creative Commons licence and your intended use is not
permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly
from the copyright holder. To view a copy of this licence, visit http://​creat​iveco​mmons.​org/​licen​ses/​by/4.​0/.

References
Abdel-Basset M, El-Shahat D, Jameel M, Abouhawwash M (2023a) Exponential distribution optimizer
(EDO): a novel math-inspired algorithm for global optimization and engineering problems. Artif
Intell Rev 56:9329–9400. https://​doi.​org/​10.​1007/​s10462-​023-​10403-9
Abdel-Basset M, Mohamed R, Abouhawwash M (2024) Crested porcupine optimizer: a new nature-inspired
metaheuristic. Knowl-Based Syst 284:111257. https://​doi.​org/​10.​1016/j.​knosys.​2023.​111257
Abdel-Basset M, Mohamed R, Jameel M, Abouhawwash M (2023b) Nutcracker optimizer: a novel nature-
inspired metaheuristic algorithm for global optimization and engineering design problems. Knowl-
Based Syst 262:110248. https://​doi.​org/​10.​1016/j.​knosys.​2022.​110248

13
123 Page 98 of 102 Y. Fu et al.

Abdel-Basset M, Mohamed R, Jameel M, Abouhawwash M (2023c) Spider wasp optimizer: a novel


meta-heuristic optimization algorithm. Artif Intell Rev 56:11675–11738. https://​doi.​org/​10.​1007/​
s10462-​023-​10446-y
Abdollahzadeh B, Gharehchopogh FS, Mirjalili S (2021a) African vultures optimization algorithm: a new
nature-inspired metaheuristic algorithm for global optimization problems. Comput Ind Eng. https://​
doi.​org/​10.​1016/j.​cie.​2021.​107408
Abdollahzadeh B, Gharehchopogh FS, Mirjalili S (2021b) Artificial gorilla troops optimizer: a new nature-
inspired metaheuristic algorithm for global optimization problems. Int J Intell Syst 36:5887–5958.
https://​doi.​org/​10.​1002/​int.​22535
Abualigah L, Yousri D, Abd Elaziz M, Ewees AA, Al-qaness MAA, Gandomi AH (2021) Aquila optimizer:
a novel meta-heuristic optimization algorithm. Comput Ind Eng. https://​doi.​org/​10.​1016/j.​cie.​2021.​
107250
Agrawal P, Abutarboush HF, Ganesh T, Mohamed AW (2021) Metaheuristic algorithms on feature selec-
tion: a survey of one decade of research (2009–2019). IEEE ACCESS 9:26766–26791. https://​doi.​
org/​10.​1109/​ACCESS.​2021.​30564​07
Ahmadi B, Giraldo JS, Hoogsteen G (2023) Dynamic Hunting Leadership optimization: algorithm and
applications. J Comput Sci 69:102010. https://​doi.​org/​10.​1016/j.​jocs.​2023.​102010
Angeline PJ (1994) Genetic programming: on the programming of computers by means of natural selection.
In: Koza JR (ed) A bradford book. MIT Press, Cambridge
Asselmeyer T, Ebeling W, Rosé H (1997) Evolutionary strategies of optimization. Phys Rev E 56:1171
Attiya I, Abd Elaziz M, Abualigah L, Nguyen TN, Abd El-Latif AA (2022) An improved hybrid swarm
intelligence for scheduling IoT application tasks in the cloud. IEEE Trans Ind Inf 18:6264–6272.
https://​doi.​org/​10.​1109/​TII.​2022.​31482​88
Awad NH, Ali MZ, Suganthan PN (2017) Ensemble sinusoidal differential covariance matrix adaptation
with Euclidean neighborhood for solving CEC2017 benchmark problems. In: 2017 IEEE congress on
evolutionary computation (CEC), pp 372–379
Bai J, Li Y, Zheng M, Khatir S, Benaissa B, Abualigah L, Abdel Wahab M (2023) A Sinh Cosh optimizer.
Knowl-Based Syst 282:111081. https://​doi.​org/​10.​1016/j.​knosys.​2023.​111081
Biswas S, Saha D, De S, Cobb AD, Das S, Jalaian BA (2021) Improving differential evolution through
bayesian hyperparameter optimization. In: 2021 IEEE congress on evolutionary computation (CEC),
pp 832–840
Braik M, Hammouri A, Atwan J, Al-Betar MA, Awadallah MA (2022) White shark optimizer: a novel bio-
inspired meta-heuristic algorithm for global optimization problems. Knowl-Based Syst 243:108457.
https://​doi.​org/​10.​1016/j.​knosys.​2022.​108457
Braik MS (2021) Chameleon swarm algorithm: a bio-inspired optimizer for solving engineering design
problems. Exp Syst Appl. https://​doi.​org/​10.​1016/j.​eswa.​2021.​114685
Chakraborty P, Nama S, Saha AK (2023) A hybrid slime mould algorithm for global optimization. Mul-
timed Tool Appl 82:22441–22467. https://​doi.​org/​10.​1007/​s11042-​022-​14077-3
Chakraborty S, Nama S, Saha AK (2022a) An improved symbiotic organisms search algorithm for higher
dimensional optimization problems. Knowl-Based Syst 236:107779. https://​doi.​org/​10.​1016/j.​knosys.​
2021.​107779
Chakraborty S, Nama S, Saha AK, Mirjalili S (2022b) A modified moth-flame optimization algorithm for
image segmentation. In: Mirjalili S (ed) Handbook of moth-flame optimization algorithm: variants,
hybrids, improvements, and applications. CRC Press, Boca Raton, pp 111–128
Chen B, Chen H, Li M (2021) Improvement and optimization of feature selection algorithm in swarm intel-
ligence algorithm based on complexity. Complexity. https://​doi.​org/​10.​1155/​2021/​99851​85
Cheng MY, Sholeh MN (2023) Optical microscope algorithm: a new metaheuristic inspired by microscope
magnification for solving engineering optimization problems. Knowl-Based Syst 279:110939. https://​
doi.​org/​10.​1016/j.​knosys.​2023.​110939
Chopra N, Mohsin Ansari M (2022) Golden jackal optimization: a novel nature-inspired optimizer for engi-
neering applications. Expert Syst Appl 198:116924. https://​doi.​org/​10.​1016/j.​eswa.​2022.​116924
Choura A, Hellara H, Baklouti M, Kanoun O, IEEE (2021) Comparative study of different salp swarm algo-
rithm improvements for feature selection applications. In: 14th international workshop on impedance
spectroscopy (IWIS). Chemnitz, Germany, pp 146–149
Dao PB (2022) On Wilcoxon rank sum test for condition monitoring and fault detection of wind turbines.
Appl Energy 318:119209
Das B, Mukherjee V, Das D (2020) Student psychology based optimization algorithm: a new population
based optimization algorithm for solving optimization problems. Adv Eng Softw 146:102804. https://​
doi.​org/​10.​1016/j.​adven​gsoft.​2020.​102804

13
Secretary bird optimization algorithm: a new metaheuristic… Page 99 of 102 123

De Swardt DH (2011) Late-summer breeding record for Secretarybirds Sagittarius serpentarius in the free
state. Gabar 22:31–33
Dehghani M, Hubalovsky S, Trojovsky P (2021) Northern goshawk optimization: a new swarm-based algo-
rithm for solving optimization problems. IEEE Access 9:162059–162080. https://​doi.​org/​10.​1109/​
access.​2021.​31332​86
Deng L, Liu S (2023) Snow ablation optimizer: a novel metaheuristic technique for numerical optimization
and engineering design. Expert Syst Appl 225:120069
Dhiman G, Garg M, Nagar A, Kumar V, Dehghani M (2021) A novel algorithm for global optimization:
rat swarm optimizer. J Ambient Intell Humaniz Comput 12:8457–8482. https://​doi.​org/​10.​1007/​
s12652-​020-​02580-0
Dorigo M, Birattari M, Stützle T (2006) Ant colony optimization. Comput Intell Mag 1:28–39. https://​doi.​
org/​10.​1109/​MCI.​2006.​329691
Erol OK, Eksin I (2006) A new optimization method: big bang–big crunch. Adv Eng Softw 37:106–111
Eskandar H, Sadollah A, Bahreininejad A, Hamdi M (2012) Water cycle algorithm—a novel metaheuris-
tic optimization method for solving constrained engineering optimization problems. Comput Struct
110:151–166. https://​doi.​org/​10.​1016/j.​comps​truc.​2012.​07.​010
Faramarzi A, Heidarinejad M, Mirjalili S, Gandomi AH (2020a) Marine predators algorithm: a nature-
inspired metaheuristic. Exp Syst Appl. https://​doi.​org/​10.​1016/j.​eswa.​2020.​113377
Faramarzi A, Heidarinejad M, Stephens B, Mirjalili S (2020b) Equilibrium optimizer: a novel optimization
algorithm. Knowl-Based Syst. https://​doi.​org/​10.​1016/j.​knosys.​2019.​105190
Fatahi A, Nadimi-Shahraki MH, Zamani H (2023) An improved binary quantum-based avian navigation
optimizer algorithm to select effective feature subset from medical data: a COVID-19 case study. J
Bionic Eng. https://​doi.​org/​10.​1007/​s42235-​023-​00433-y
Feduccia A, Voorhies MR (1989) Miocene hawk converges on secretarybird. Ibis 131:349–354
Goodarzimehr V, Shojaee S, Hamzehei-Javaran S, Talatahari S (2022) Special relativity search: a novel
metaheuristic method based on special relativity physics. Knowl-Based Syst 257:109484. https://​doi.​
org/​10.​1016/j.​knosys.​2022.​109484
Guan Z, Ren C, Niu J, Wang P, Shang Y (2023) Great wall construction algorithm: a novel meta-heuristic
algorithm for engineer problems. Expert Syst Appl 233:120905. https://​doi.​org/​10.​1016/j.​eswa.​2023.​
120905
Hashim FA, Houssein EH, Hussain K, Mabrouk MS, Al-Atabany W (2022) Honey badger algorithm: new
metaheuristic algorithm for solving optimization problems. Math Comput Simul 192:84–110. https://​
doi.​org/​10.​1016/j.​matcom.​2021.​08.​013
Hashim FA, Hussien AG (2022) Snake optimizer: a novel meta-heuristic optimization algorithm. Knowl-
Based Syst. https://​doi.​org/​10.​1016/j.​knosys.​2022.​108320
Hofmeyr SD, Symes CT, Underhill LG (2014) Secretarybird Sagittarius serpentarius population trends
and ecology: insights from South African citizen science data. PLoS ONE 9:e96772
Holland JH (1992) Genetic algorithms. Sci Am 267:66–73
Hu G, Guo Y, Wei G, Abualigah L (2023) Genghis Khan shark optimizer: a novel nature-inspired algo-
rithm for engineering optimization. Adv Eng Inform 58:102210. https://​doi.​org/​10.​1016/j.​aei.​
2023.​102210
Jia H, Rao H, Wen C, Mirjalili S (2023) Crayfish optimization algorithm. Artif Intell Rev 56:1919–1979.
https://​doi.​org/​10.​1007/​s10462-​023-​10567-4
Kaur S, Awasthi LK, Sangal AL, Dhiman G (2020) Tunicate swarm algorithm: a new bio-inspired based
metaheuristic paradigm for global optimization. Eng Appl Artif Intell 90:103541. https://​doi.​org/​
10.​1016/j.​engap​pai.​2020.​103541
Kaveh A, Khayatazad M (2012) A new meta-heuristic method: ray optimization. Comput Struct
112:283–294. https://​doi.​org/​10.​1016/j.​comps​truc.​2012.​09.​003
Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN’95—international
conference on neural networks, pp 1942–1948
Khishe M, Mosavi MR (2020) Chimp optimization algorithm. Exp Syst Appl. https://​doi.​org/​10.​1016/j.​
eswa.​2020.​113338
Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220:671–
680. https://​doi.​org/​10.​1126/​scien​ce.​220.​4598.​671
Kumar A, Wu G, Ali MZ, Mallipeddi R, Suganthan PN, Das S (2020a) Guidelines for real-world single-
objective constrained optimisation competition. Tech Rep 2020:1–7
Kumar A, Wu G, Ali MZ, Mallipeddi R, Suganthan PN, Das S (2020b) A test-suite of non-convex con-
strained optimization problems from the real-world and some baseline results. Swarm Evol Com-
put 56:100693

13
123 Page 100 of 102 Y. Fu et al.

Kumar M, Kulkarni AJ, Satapathy SC (2018) Socio evolution & learning optimization algorithm: a
socio-inspired optimization methodology. Future Gener Comput Syst Int J Sci 81:252–272. https://​
doi.​org/​10.​1016/j.​future.​2017.​10.​052
Li S, Chen H, Wang M, Heidari AA, Mirjalili S (2020) Slime mould algorithm: a new method for sto-
chastic optimization. Future Gener Comput Syst Int J Sci 111:300–323. https://​doi.​org/​10.​1016/j.​
future.​2020.​03.​055
Lian J, Hui G (2024) Human evolutionary optimization algorithm. Exp Syst Appl 241:122638. https://​
doi.​org/​10.​1016/j.​eswa.​2023.​122638
Liu C, IEEE (2014) The development trend of evaluating face-recognition technology. In: International
conference on mechatronics and control (ICMC), Jinzhou, pp 1540–1544
Liu SH, Mernik M, Hrncic D, Crepinsek M (2013) A parameter control method of evolutionary algo-
rithms using exploration and exploitation measures with a practical application for fitting Sovova’s
mass transfer model. Appl Soft Comput 13:3792–3805. https://​doi.​org/​10.​1016/j.​asoc.​2013.​05.​010
Luo W, Lin X, Li C, Yang S, Shi Y (2022) Benchmark functions for CEC 2022 competition on seeking
multiple optima in dynamic environments. Preprint at https://​arxiv.​org/​abs/​2201.​00523
Mahdavi-Meymand A, Zounemat-Kermani M (2022) Homonuclear molecules optimization (HMO)
meta-heuristic algorithm. Knowl-Based Syst 258:110032. https://​doi.​org/​10.​1016/j.​knosys.​2022.​
110032
Manjarres D, Landa-Torres I, Gil-Lopez S, Del Ser J, Bilbao MN, Salcedo-Sanz S, Geem ZW (2013) A
survey on applications of the harmony search algorithm. Eng Appl Artif Intell 26:1818–1831
Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM (2017) Salp swarm algorithm: a
bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191. https://​doi.​
org/​10.​1016/j.​adven​gsoft.​2017.​07.​002
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67. https://​doi.​
org/​10.​1016/j.​adven​gsoft.​2016.​01.​008
Mirjalili S, Mirjalili SM, Hatamlou A (2016) Multi-verse optimizer: a nature-inspired algorithm for
global optimization. Neural Comput Appl 27:495–513. https://​doi.​org/​10.​1007/​s00521-​015-​1870-7
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61. https://​doi.​org/​
10.​1016/j.​adven​gsoft.​2013.​12.​007
Mohamed AW, Hadi AA, Fattouh AM, Jambi KM (2017) LSHADE with semi-parameter adaptation
hybrid with CMA-ES for solving CEC 2017 benchmark problems. In: 2017 IEEE congress on
evolutionary computation (CEC), pp 145–152.
Mohammed H, Rashid T (2023) FOX: a FOX-inspired optimization algorithm. Appl Intell 53:1030–1050
Moosavian N, Roodsari BK (2014) Soccer league competition algorithm: a novel meta-heuristic algorithm
for optimal design of water distribution networks. Swarm Evol Comput 17:14–24. https://​doi.​org/​10.​
1016/j.​swevo.​2014.​02.​002
Morales-Castaneda B, Zaldivar D, Cuevas E, Fausto F, Rodriguez A (2020) A better balance in metaheuris-
tic algorithms: does it exist? Swarm Evol Comput. https://​doi.​org/​10.​1016/j.​swevo.​2020.​100671
Nama S (2021) A modification of I-SOS: performance analysis to large scale functions. Appl Intell
51:7881–7902. https://​doi.​org/​10.​1007/​s10489-​020-​01974-z
Nama S (2022) A novel improved SMA with quasi reflection operator: performance analysis, application
to the image segmentation problem of Covid-19 chest X-ray images. Appl Soft Comput 118:108483.
https://​doi.​org/​10.​1016/j.​asoc.​2022.​108483
Nama S, Chakraborty S, Saha AK, Mirjalili S (2022a) Hybrid moth-flame optimization algorithm with
slime mold algorithm for global optimization. In: Mirjalili S (ed) Handbook of moth-flame optimi-
zation algorithm: variants, hybrids, improvements, and applications. CRC Press, Boca Raton, pp
155–176
Nama S, Saha AK (2020) A new parameter setting-based modified differential evolution for function opti-
mization. Int J Model Simul Sci Comput 11:2050029
Nama S, Saha AK (2022) A bio-inspired multi-population-based adaptive backtracking search algorithm.
Cogn Comput 14:900–925. https://​doi.​org/​10.​1007/​s12559-​021-​09984-w
Nama S, Saha AK, Chakraborty S, Gandomi AH, Abualigah L (2023) Boosting particle swarm optimization
by backtracking search algorithm for optimization problems. Swarm Evol Comput 79:101304. https://​
doi.​org/​10.​1016/j.​swevo.​2023.​101304
Nama S, Saha AK, Sharma S (2020) A hybrid TLBO algorithm by quadratic approximation for func-
tion optimization and its application. In: Balas VE, Kumar R, Srivastava R (eds) Recent trends and
advances in artificial intelligence and internet of things. Springer, Cham, pp 291–341
Nama S, Sharma S, Saha AK, Gandomi AH (2022b) A quantum mutation-based backtracking search algo-
rithm. Artif Intell Rev 55:3019–3073. https://​doi.​org/​10.​1007/​s10462-​021-​10078-0

13
Secretary bird optimization algorithm: a new metaheuristic… Page 101 of 102 123

Portugal SJ, Murn CP, Sparkes EL, Daley MA (2016) The fast and forceful kicking strike of the secretary
bird. Curr Biol 26:R58–R59
Rao RV, Savsani VJ, Vakharia DP (2011) Teaching-learning-based optimization: a novel method for con-
strained mechanical design optimization problems. Comput Aided Des 43:303–315. https://​doi.​org/​
10.​1016/j.​cad.​2010.​12.​015
Rashedi E, Nezamabadi-Pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci
179:2232–2248. https://​doi.​org/​10.​1016/j.​ins.​2009.​03.​004
Rather SA, Bala PS (2021) Constriction coefficient based particle swarm optimization and gravitational
search algorithm for multilevel image thresholding. Expert Syst 38:e12717
Reynolds RG (1994) An introduction to cultural algorithms. Proceedings of the 3rd annual conference on
evolutionary programming. World Scientific Publishing, Singapore, pp 131–139
Saha A, Nama S, Ghosh S (2021) Application of HSOS algorithm on pseudo-dynamic bearing capacity of shallow strip footing along with numerical analysis. Int J Geotech Eng 15:1298–1311. https://doi.org/10.1080/19386362.2019.1598015
Sahoo SK, Saha AK, Nama S, Masdari M (2023) An improved moth flame optimization algorithm based on modified dynamic opposite learning strategy. Artif Intell Rev 56:2811–2869. https://doi.org/10.1007/s10462-022-10218-0
Sharma S, Chakraborty S, Saha AK, Nama S, Sahoo SK (2022) mLBOA: a modified butterfly optimization algorithm with Lagrange interpolation for global optimization. J Bionic Eng 19:1161–1176. https://doi.org/10.1007/s42235-022-00175-3
Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Global Optim 11:341–359
Su H, Zhao D, Heidari AA, Liu L, Zhang X, Mafarja M, Chen H (2023) RIME: a physics-based optimization. Neurocomputing 532:183–214
Taheri A, RahimiZadeh K, Beheshti A, Baumbach J, Rao RV, Mirjalili S, Gandomi AH (2024) Partial reinforcement optimizer: an evolutionary optimization algorithm. Expert Syst Appl 238:122070. https://doi.org/10.1016/j.eswa.2023.122070
Tallini LG, Pelusi D, Mascella R, Pezza L, Elmougy S, Bose B (2016) Efficient non-recursive design of second-order spectral-null codes. IEEE Trans Inf Theory 62:3084–3102. https://doi.org/10.1109/TIT.2016.2555322
Trojovska E, Dehghani M, Trojovsky P (2022) Fennec fox optimization: a new nature-inspired optimization algorithm. IEEE Access 10:84417–84443. https://doi.org/10.1109/ACCESS.2022.3197745
Trojovský P, Dehghani M (2022) Walrus optimization algorithm: a new bio-inspired metaheuristic algorithm
Trojovský P, Dehghani M (2023) Subtraction-average-based optimizer: a new swarm-inspired metaheuristic algorithm for solving optimization problems. Biomimetics (Basel). https://doi.org/10.3390/biomimetics8020149
Wang L, Cao Q, Zhang Z, Mirjalili S, Zhao W (2022) Artificial rabbits optimization: a new bio-inspired meta-heuristic algorithm for solving engineering optimization problems. Eng Appl Artif Intell 114:105082. https://doi.org/10.1016/j.engappai.2022.105082
Wei ZL, Huang CQ, Wang XF, Han T, Li YT (2019) Nuclear reaction optimization: a novel and powerful physics-based algorithm for global optimization. IEEE Access 7:66084–66109. https://doi.org/10.1109/access.2019.2918406
Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1:67–82. https://doi.org/10.1109/4235.585893
Wu X, Zhang S, Xiao W, Yin Y (2019) The exploration/exploitation tradeoff in whale optimization algorithm. IEEE Access 7:125919–125928. https://doi.org/10.1109/ACCESS.2019.2938857
Xue J, Shen B (2022) Dung beetle optimizer: a new meta-heuristic algorithm for global optimization. J Supercomput. https://doi.org/10.1007/s11227-022-04959-6
Yapici H, Cetinkaya N (2019) A new meta-heuristic optimizer: pathfinder algorithm. Appl Soft Comput 78:545–568
Zamani H, Nadimi-Shahraki MH (2024) An evolutionary crow search algorithm equipped with interactive memory mechanism to optimize artificial neural network for disease diagnosis. Biomed Signal Process Control 90:105879. https://doi.org/10.1016/j.bspc.2023.105879
Zamani H, Nadimi-Shahraki MH, Gandomi AH (2019) CCSA: conscious neighborhood-based crow search algorithm for solving global optimization problems. Appl Soft Comput 85:28. https://doi.org/10.1016/j.asoc.2019.105583
Zamani H, Nadimi-Shahraki MH, Gandomi AH (2021) QANA: quantum-based avian navigation optimizer algorithm. Eng Appl Artif Intell 104:104314. https://doi.org/10.1016/j.engappai.2021.104314
123 Page 102 of 102 Y. Fu et al.

Zamani H, Nadimi-Shahraki MH, Gandomi AH (2022) Starling murmuration optimizer: a novel bio-inspired algorithm for global and engineering optimization. Comput Methods Appl Mech Eng 392:114616. https://doi.org/10.1016/j.cma.2022.114616
Zervoudakis K, Tsafarakis S (2022) A global optimizer inspired from the survival strategies of flying foxes. Eng Comput 2022:1–34
Zhao S, Zhang T, Ma S, Wang M (2023) Sea-horse optimizer: a novel nature-inspired meta-heuristic for global optimization problems. Appl Intell 53:11833–11860. https://doi.org/10.1007/s10489-022-03994-3
Zhou A, Qu BY, Li H, Zhao SZ, Suganthan PN, Zhang Q (2011) Multiobjective evolutionary algorithms: a survey of the state of the art. Swarm Evol Comput 1:32–49. https://doi.org/10.1016/j.swevo.2011.03.001
Zolf K (2023) Gold rush optimizer: a new population-based metaheuristic algorithm. Op Res Decis. https://doi.org/10.37190/ord230108

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and
institutional affiliations.

