PEOA: Philippine Eagle Optimization Algorithm

Abstract
We propose the Philippine Eagle Optimization Algorithm (PEOA), which is a meta-heuristic and
population-based search algorithm inspired by the territorial hunting behavior of the Philippine Eagle.
From an initial random population of eagles in a given search space, the best eagle is selected and
undergoes a local food search using the interior point method as its means of exploitation. The
population is then divided into three subpopulations, and each subpopulation is assigned an operator
which aids in the exploration. Once the respective operators are applied, the new eagles with improved
function values replace the older ones. The best eagle of the population is then updated and conducts a
local food search again. These steps are done iteratively, and the food searched by the final best eagle
is the optimal solution of the search space. PEOA is tested on 20 optimization test functions with
different modality, separability, and dimension properties. The performance of PEOA is compared to
11 other optimization algorithms. To further validate the effectiveness of PEOA, it is also applied to
image reconstruction in electrical impedance tomography and parameter identification in a neutral
delay differential equation model. Numerical results show that PEOA can obtain accurate solutions to
various functions and problems. PEOA proves to be the most computationally inexpensive algorithm
relative to the others examined, while also helping promote the critically endangered Philippine
Eagle.
1 Introduction
Mathematical optimization is the study of finding solutions using mathematical tools to achieve objectives optimally [1].
Finding solutions to optimization problems is usually very challenging, so various algorithms have been created to
tackle different kinds of problems.
In particular, metaheuristic search algorithms have been used because of their trial-and-error approach in finding
solutions, which have many advantages over traditional and purely deterministic methods [1, 2]. These advantages
can be seen when dealing with functions that have some discontinuity, design optimization problems that have highly
nonlinear functions or constraints, or stochastic problems where uncertainty and noise exist [1, 2, 3]. In these cases,
techniques using a trade-off between randomization and local search, such as metaheuristic algorithms, are preferred
[4].
A state-of-the-art metaheuristic algorithm is the Genetic Algorithm (GA) [5], which is based on Darwinian evolution
and natural selection of biological systems. The problem-solving strategy of GA is to use genetic operators, namely
crossover (recombination), mutation, and selection.
One further development to GA is the Differential Evolution (DE) [6], which is a vector-based, derivative-free
evolutionary algorithm. Unlike GA, DE treats solutions as real-number strings, and operations are carried out over each
component of the solution vectors.
More improved variants of these algorithms have also been developed recently, such as those that use adaptive parameter
control, an external archive, and combinations of multiple operators and methods.
Table 1. Summary of some nature-inspired optimization algorithms, including their inspiration source from nature,
algorithmic key features, and the year when they were proposed.
For example, the Improved Multi-Operator Differential Evolution (IMODE) [15] has been proposed, which uses
multiple DE operators, with more emphasis placed on the best-performing operator. IMODE also uses adaptation
mechanisms to determine parameter values and randomly chooses between binomial and exponential crossover. IMODE
has proven successful as an optimization algorithm, especially since it ranked first in the CEC 2020 Competition on
Single Objective Bound Constrained Numerical Optimization.
Many other metaheuristic algorithms have been developed, not only because of their capability of solving optimization
problems, but also due to their wide range of applications [16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31,
32].
Two essential components of metaheuristic algorithms are exploitation and exploration. Exploitation is the focusing of
the search in a local region, whereas exploration expands the search on a global scale [1, 2]. A proper balance between
these two components is crucial for the overall efficiency of metaheuristic algorithms.
Metaheuristic algorithms are mostly nature-inspired, deriving from the beauty and order that natural elements possess
[4]. For instance, animals and plants naturally develop strategies to ensure their survival through time. The abundance
and success of these strategies have led to the creation of many nature-inspired metaheuristics [33]. Specifically, flying
movements, foraging behavior, and hunting techniques of animals are some of the inspirations of nature-inspired
metaheuristics [34].
Another aspect of nature that has also been a basis for many algorithms is swarm intelligence, which concerns the
behavior of self-organizing systems, the members of which evolve and interact to achieve optimality [35]. Thus, many
algorithms are also swarm-intelligence-based, such as Particle Swarm Optimization (PSO) [36].
The main inspiration of PSO is the flocking behavior of birds. In PSO, each particle in a given swarm represents a
candidate solution to the optimization problem. Each particle is then updated based on its own local best position and
the position of the global best particle.
More recent SI-based algorithms have further been developed, including the Artificial Bee Colony, inspired by the
searching of bees for nectar flowers to produce honey for their colony [7, 37].
Further examples are the Firefly Algorithm, which is based on the flashing patterns and behavior of tropic fireflies [8],
and the Cuckoo Search Algorithm, which is inspired by the brood parasitism of cuckoo species [9, 38]. Additionally, we
have the Bat Algorithm, which is derived from the echolocation behavior of microbats [10], and the Flower Pollination
Algorithm, which is based on the flower pollination process of flowering plants [11].
Even more nature-inspired algorithms have been created over recent years, such as Moth Flame Optimization Algorithm
[12], Whale Optimization Algorithm [13], and Butterfly Optimization Algorithm [14].
Table 1 presents a summary of the nature-inspired algorithms mentioned above, along with their inspiration sources,
key features, and year.
With the increasing number of nature-inspired algorithms, various benchmarking tests have been developed to examine
their performance [39]. These include testing the algorithms on different types of benchmark functions [40, 41] and
checking the number of objective function evaluations they use [42].
The No-Free-Lunch Theorem for Optimization states that if algorithm A performs better than algorithm B for some
optimization functions, then B will outperform A for other functions [1, 33]. In other words, there is no metaheuristic
best suited for all existing optimization problems.
Given this, the research area on metaheuristic algorithms is still quite active and steadily progressing. New metaheuristics
and nature-inspired algorithms are constantly being studied to determine what specific types of optimization problems
these algorithms can solve best [33, 34].
In this study, we develop an optimization algorithm based on the hunting behavior of the Philippine Eagle (Pithecophaga
jefferyi), the national bird of the Philippines.
Tagged as the “Haribon” or bird king, the Philippine Eagle is among the rarest and most powerful birds in the world; the species is endemic to only four islands of the Philippine archipelago, namely Luzon, Samar, Leyte, and Mindanao [43].
[43]. It is commonly known as the Monkey-Eating Eagle, but it can also prey on other vertebrates apart from monkeys,
including mammals, reptiles, and other birds [44].
Unfortunately, it is now classified as critically endangered as it is continually being threatened by hunting and loss of
habitat [45].
According to [46], the hunting behavior of the Philippine Eagle follows a three-part sequence: it first perches and calls as a preparatory stage, then captures prey by dropping from its perch, and finally circles back up to return to its starting point. It can thus be observed that Philippine Eagles are highly territorial during hunting, besides
also being known to be loyal to their nest sites [43].
Furthermore, the Philippine Eagle can hunt both singly and in pairs [46], but they generally make a more successful
hunt when done in pairs. A particular strategy is for one eagle to distract the prey while the other captures this prey
from behind. It is additionally noted that the bulk of the Philippine Eagles’ time is spent at perch, because it is from perch that they watch their surroundings and look out for prey.
Meanwhile, the Philippine Eagle’s flight behavior generally follows differing patterns, where it either glides in a straight line from a higher to a lower elevation or makes a sequence of short glides and large sweeping circles [46, 47].
We propose the Philippine Eagle Optimization Algorithm (PEOA), a novel, metaheuristic, nature-inspired, and SI-based
optimization algorithm inspired by the distinctive characteristics of the Philippine Eagle.
PEOA has three different global operators: the Movement Operator, the Mutation I Operator, and the Mutation II
Operator. The features of each operator are the following:
• The Movement Operator considers eagle proximity, wherein eagles close to each other swarm around the same
local solutions. One of these local solutions is possibly the global solution.
• The Mutation I Operator uses the concept of Lévy flights, which helps in the search within unknown, large-scale
spaces.
• The Mutation II Operator determines the overall picture of the search performance by considering the current
mean location of all the eagles.
These three operators are added to contribute to the exploration mechanism of PEOA. They make PEOA more
competitive not only against classical algorithms but also with other modern algorithms.
PEOA conducts an intensive local search in each iteration. In particular, food search is done regularly in a specific
territory of the best eagle, that is, the eagle with the least function value in a minimization problem. The interior point
method, a deterministic algorithm, is used here. This helps the exploitation capacity of PEOA.
PEOA uses an adaptive reduction of population size, that is, the population size of eagles linearly reduces depending
on the current number of function evaluations. This complements both the exploration and exploitation techniques of
PEOA. With more eagles at the beginning of the process, the three operators guide the eagles in exploring the better
locations of the space. Then, the worst eagles are regularly removed as a survival-of-the-fittest kind of mechanism.
Thus, in the latter stages of the process, the best eagles can use more function evaluations in their local food searches.
PEOA is evaluated on a varied set of 20 benchmark functions with different modality, separability, and dimension
properties. The results are compared to a set of 11 metaheuristics, nature-inspired, or swarm-intelligence-based
algorithms, which contain both classical and modern algorithms.
Given the No-Free-Lunch Theorem, we also explore the specific real-world optimization problems where PEOA can be
best and suitably applied. For this paper, the algorithm is used in two applications: solving the inverse conductivity
problem of electrical impedance tomography and estimating the parameters of a pendulum-mass-spring-damper system
that involves neutral delay differential equations.
Finally, in creating PEOA and proving that it has excellent results, we aspire to give the critically endangered Philippine
Eagle much more exposure and possibly help initiate further conservation efforts for the national bird.
The national bird of the Philippines, the Philippine Eagle, has particular hunting, flying, and foraging behaviors, which inspired the proposed Philippine Eagle Optimization Algorithm (PEOA). The main characteristics of the Philippine Eagle that we incorporate into PEOA are the following:
• It is a highly territorial bird when hunting and is loyal to its nest site.
• Its pair hunt strategy is more successful than hunting alone.
• It has differing flight patterns, varying between straight glides and large circles.
• It watches its surroundings and looks out for prey at perch.
The pair-hunt strategy, differing flight patterns, and perching behavior of the Philippine Eagle are the sources of
inspiration for the three global operators of PEOA.
On the other hand, its territorial hunting behavior is modeled using the intensive local search of the algorithm, such that
the best eagle searches for food only within its local territory.
The adaptive reduction of the population size within PEOA is likewise due to the territorial behavior of the Philippine
Eagle, in the sense that eagles fight for their survival in the given region for every passing generation. Thus, the defeated
eagles would just fly out of the domain and live elsewhere, reducing the population of eagles that stay in the region.
We clarify that PEOA was conceptualized out of inspiration from the Philippine Eagle, but we do not intend to attribute
the whole process of the algorithm solely to this inspiration. Several nature-inspired algorithms in the literature only
derive from selected characteristics of their source of inspiration [33, 34].
Furthermore, besides finding direct relationships between the Philippine Eagle and our proposed algorithm, we also
seek to strengthen the algorithmic design of PEOA so it could perform efficiently on different kinds of optimization
problems. This way, PEOA could be comparable with recent algorithms and can be tested on specific applications.
The remainder of this paper is organized as follows. Section 2 provides a detailed description of the proposed PEOA
and its components, including the pseudocode and a flowchart. Section 3 discusses the experimental results and
performance comparison of PEOA with other algorithms in solving optimization test functions. Section 4 presents the
results of PEOA upon application to a real-world optimization problem. Finally, Section 5 gives the conclusion and
recommendations for future research.
2 The Philippine Eagle Optimization Algorithm

Given a bound-constrained minimization problem, i.e., an objective function f to be minimized, a search space having
Xmin and Xmax as its lower and upper bounds, respectively, and a corresponding dimension D, PEOA starts with an
initial population of eagles X. Each row of X, given by Xi , represents the ith eagle and is generated as follows:
Xi = Xmin + [Xmax − Xmin] · lhs, (1)

for i = 1, 2, . . . , S0, where S0 is the initial population size of eagles. Here, Xmin and Xmax are 1 × D vectors and “·” is used as a symbol for scalar multiplication; we use this notation for scalar products throughout the paper. Moreover, lhs is a number obtained from a matrix containing a Latin hypercube sample with S0 rows and D columns. We use this sampling technique so that the initial eagles are randomly generated while being more or less uniformly distributed over each dimension [48].
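For concreteness, this initialization step can be sketched in Python using SciPy's Latin hypercube sampler. This is a minimal illustration of Equation (1), not the authors' implementation; the objective function f is assumed to be supplied by the user.

    import numpy as np
    from scipy.stats import qmc

    def initialize_population(f, Xmin, Xmax, S0, seed=0):
        """Generate S0 eagles via Latin hypercube sampling, per Equation (1)."""
        D = len(Xmin)
        lhs = qmc.LatinHypercube(d=D, seed=seed).random(n=S0)  # S0 x D sample in [0, 1]
        X = Xmin + (Xmax - Xmin) * lhs                         # scale to the search space
        fitness = np.apply_along_axis(f, 1, X)                 # evaluate each eagle
        order = np.argsort(fitness)                            # sort: best (least) first
        return X[order], fitness[order]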
The function values of the eagles are then obtained and sorted. Because we are considering a minimization problem, the eagle with the least function value is selected as the best eagle of the initial population. Denote this best eagle by X⋆.

The best eagle obtained in the previous phase then conducts a local food search within its territory. We denote the best food that it will search as Y⋆. The territory has lower bound Ymin and upper bound Ymax, which depend on a scalar radius Ysize. The radius and bounds of the territory are obtained as

    Ysize = max[ρ · min(Xmax − Xmin), 1],    (2)

    Ymin = X⋆ − Ysize · 1,    Ymax = X⋆ + Ysize · 1,    (3)

where 1 denotes the 1 × D vector of ones. The interior point method is then used to find the best food Y⋆ within these territory bounds.
The basis for using this method is the technique proposed in the United Multi-Operator Evolutionary Algorithms-II
(UMOEAs-II), which has claimed that the interior point method can increase exploitation ability [49].
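A minimal sketch of this Local Phase follows, assuming a Python setting. Since SciPy does not ship a general interior point solver for nonlinear programs, the bounded trust-constr minimizer is used here as a stand-in for the interior point method named above.

    import numpy as np
    from scipy.optimize import minimize, Bounds

    def local_food_search(f, X_best, Xmin, Xmax, rho=0.04, Y_init=None):
        """Local search of the best eagle inside its territory, per Equations (2)-(3)."""
        Y_size = max(rho * np.min(Xmax - Xmin), 1.0)           # Equation (2)
        Y_min = X_best - Y_size                                # Equation (3), lower bound
        Y_max = X_best + Y_size                                # Equation (3), upper bound
        x0 = X_best if Y_init is None else Y_init              # reuse Y* when X* repeats
        # Stand-in for the interior point method used by PEOA (e.g., fmincon in MATLAB):
        res = minimize(f, x0, method="trust-constr", bounds=Bounds(Y_min, Y_max))
        return res.x, res.fun                                  # best food Y* and its value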
Once the best eagle obtains its best food, the Global Phase is conducted, generating a new population of eagles. This
new population will again be sorted using their function values, and its new best eagle will likewise be selected to
conduct another local food search.
In other words, each generation of eagles has a best eagle that searches locally for food. Therefore, PEOA heavily
capitalizes on exploitation to intensify the speed of the optimization process. On the other hand, for the inspiration
source, the territorial behavior of the Philippine Eagle can also be pictured through this local exploitation technique.
We further note that whenever two consecutive generations select the same X⋆, the initial point taken for the interior point method of the latter generation is the Y⋆ of the former generation.
After the Local Phase, the eagle population is divided into three subpopulations, the members of which are dependent
on a probability vector, denoted by P . The specific details on how the vector P is obtained can be found in Subsection
2.5. Each subpopulation is then assigned an operator, which makes the eagles either move from their original positions
or be replaced by new eagles using mutation. After application of the respective operators, the newly created eagles are
referred to as the eagle offspring, denoted by Xnew . Similar to X, Xnew has S0 rows and D columns.
Note that a selection process is carried out here, such that the eagle offspring with improved function values are the
only ones that will proceed to the next generation of eagles.
Furthermore, a parameter, called the scaling factor and denoted by F , is used in each operator. This parameter follows a
success-history-based parameter adaptation and will be explained in detail in Subsection 2.5.
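For orientation before Subsection 2.5, a sketch in the style of the success-history scheme of L-SHADE [54] is given below. The Cauchy sampling and the weighted Lehmer mean update are standard in that scheme and are assumptions here, not necessarily the exact PEOA rules.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_F(memory_F):
        """Draw a scaling factor from a Cauchy distribution centered at a random
        memory entry, truncated to (0, 1], as in L-SHADE-style adaptation."""
        mu = rng.choice(memory_F)
        while True:
            F = mu + 0.1 * np.tan(np.pi * (rng.random() - 0.5))  # Cauchy(mu, 0.1)
            if F > 0:
                return min(F, 1.0)

    def update_memory(memory_F, k, successful_F, improvements):
        """Overwrite one memory slot with the weighted Lehmer mean of the F values
        that produced improved offspring in this generation."""
        if successful_F:
            w = np.asarray(improvements) / np.sum(improvements)
            F = np.asarray(successful_F)
            memory_F[k] = np.sum(w * F**2) / np.sum(w * F)       # weighted Lehmer mean
            k = (k + 1) % len(memory_F)
        return memory_F, k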
We now thoroughly discuss the three operators, namely 1) the Movement Operator, 2) the Mutation I Operator, and 3)
the Mutation II Operator.
Let S denote the size of the whole eagle population of the current generation, and S1 , S2 , S3 denote the sizes of the
subpopulations assigned to the three operators, respectively. Therefore, we have S = S1 + S2 + S3 . Note that all
considered eagles in each operator are of size D.
where Xr1 is a randomly selected eagle from the current population that is different from X⋆.
Also, Xarc is another randomly chosen eagle, different from both X⋆ and Xr1, taken from the union of the current population and an external archive of eagles.
Finally, Xnear is the eagle from the current population having the least Euclidean distance d to Xi .
The first part of the Movement Operator is based on an operator used in the Adaptive Differential Evolution Algorithm
(JADE) [50], referred to as “DE/current-to-pbest/1 with archive.” It is mentioned here that this operator has a good
searching ability and can also prevent the algorithm from getting trapped in a local minimum due to a bias towards
promising directions.
The external archive contains the eagles that were not successfully chosen to proceed to the next generations. This
archive, also based on JADE, can add more diversity to the eagle population. We note that the archive has a finite
size, obtained by multiplying a predefined archive rate A with the initial eagle population size S0 . Randomly selected
archive elements are removed if the archive exceeds its predefined size.
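A minimal sketch of this archive bookkeeping, with illustrative names, could look as follows.

    import numpy as np

    rng = np.random.default_rng(0)

    def update_archive(archive, failed_parents, A, S0):
        """Append parents replaced by better offspring; if the archive exceeds
        its capacity A * S0, remove randomly selected elements (as in JADE)."""
        archive.extend(failed_parents)
        capacity = int(A * S0)
        while len(archive) > capacity:
            archive.pop(rng.integers(len(archive)))            # drop a random element
        return archive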
A novel feature of the Movement Operator is the addition of a term that considers neighboring eagle proximity. This
was included to model the pair hunt strategy of the Philippine Eagle, as the movement of an eagle is dependent on the
position of the eagle closest to it.
On the other hand, this term also enhances the efficiency of PEOA because it can make the subpopulation further divide
into subgroups, each swarming around different local solutions. One of these local solutions could be the global best
solution, so this feature is useful particularly when solving multimodal problems.
where u and v are values drawn from normal distributions. Also, the parameter β is a default constant set to 1.5, and
Γ(x) is the Gamma function.
The first part of the Mutation I Operator is based on an operator used in UMOEAs-II [49], called the “DE weighted-
rand-to-φbest.” However, a modification was made, which is the addition of a Lévy flight term. This was done to model
the differing flight patterns of the Philippine Eagle mathematically.
Lévy flights are random walks whose step sizes are drawn from a Lévy distribution [4]. They are commonly used to
demonstrate the irregular flight behavior of many animals and insects, which exhibit a Lévy-flight-style, intermittent
flight pattern [51]. For a more detailed discussion on Lévy flights, we refer the reader to [4] and [52].
where Xmean is the average of all eagles in the current population and X̂ is a newly generated random eagle inside the
search space.
The Mutation II Operator is similar to one of the operators used in the Harris Hawks Optimization Algorithm (HHO)
[53]. This operator not only strengthens the exploration capacity of the algorithm but also models the perching
characteristic of the Philippine Eagle. In particular, the addition of Xmean depicts how an eagle gets a general picture of
the search space, then consequently flies in consideration of the positions of other eagles.
Once the operators have been applied to their corresponding subpopulations, the eagle offspring with improved function values replace their corresponding parent eagles, thus generating a new eagle population.
In the case when some eagles have moved or mutated to locations outside the search space, a resetting scheme based on JADE [50] is applied. The scheme truncates any component of an eagle lying outside the search space back within the space limits. The function values of these new eagles are sorted once again, and the best eagle of the new population goes
back to the Local Phase.
Hence, the Local and Global Phases are carried out iteratively for multiple generations until the given stopping criterion
is satisfied. The best food searched by the best eagle at the final generation is the optimal solution of PEOA.
The basic steps of the Philippine Eagle Optimization Algorithm are summarized in the pseudocode shown in Algorithm
1. In addition, a flowchart for PEOA is also provided in Figure 1.
To further improve the performance of PEOA, the algorithm uses adaptation schemes to control certain parameters.
These parameters are the eagle population size S, the probability vector P , and the scaling factor F .
We note that these adaptation schemes were derived from selected papers on evolutionary algorithms and differential
evolution. Our main reference paper is IMODE [15], but several parts of it were derived from other papers, such as
UMOEAs-II [49], JADE [50], and the Success-History Based Adaptive Differential Evolution with Linear Population Size Reduction Algorithm (L-SHADE) [54]. These papers were chosen because of their proven success as optimization algorithms.

Figure 1. Flowchart summarizing the steps of the Philippine Eagle Optimization Algorithm.
We discuss how S, P , and F are determined based on the papers mentioned. For a more in-depth analysis of the
behavior of these parameters, we refer the reader to [15], [49], [50], and [54].
where fold,z and fnew,z are the function values of the current eagle and its corresponding eagle offspring, respectively.
Then, the probability value Pi corresponding to operator i is updated as

    Pi = max{ 0.1, min{ 0.9, Ri / (R1 + R2 + R3) } }.    (10)
Derived from UMOEAs-II [49], this mechanism highlights the best-performing operator per generation, giving it more
control of the optimization process. Meanwhile, the underperforming operators are given a chance to improve in the
next generations.
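A sketch of this update is given below. The improvement measure Ri is assumed here to aggregate the function value decreases achieved by operator i, standing in for Equation (9).

    import numpy as np

    def update_probabilities(f_old, f_new, assignment):
        """Update the operator probability vector P per Equation (10).
        `assignment[z]` is the operator (0, 1, or 2) applied to eagle z; R_i is
        taken as the summed improvement of operator i (an assumption standing
        in for Equation (9))."""
        R = np.zeros(3)
        for i in range(3):
            mask = assignment == i
            R[i] = np.sum(np.maximum(f_old[mask] - f_new[mask], 0.0))
        R = np.maximum(R, 1e-12)                    # avoid division by zero
        P = np.clip(R / R.sum(), 0.1, 0.9)          # Equation (10)
        return P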
3 Experiments on Optimization Test Functions

We apply PEOA on 20 optimization test functions having varied combinations of properties among modality, separability,
and dimension. We first explain what these properties mean and how they contribute to the difficulty of an optimization
problem.
A function with only one local optimum is called unimodal, whereas it is called multimodal if it has two or more local
optima [41]. One aspect of a well-designed exploration process in an algorithm is the capacity to escape from any local
yet nonglobal optimum. Unimodality, in contrast, examines the exploitation capability of an algorithm [55].
Separable and nonseparable functions formulate another classification of functions. A function of n variables is called
separable if it can be written as a sum of n functions of just one variable, that is, its variables are independent of each
other [56]. On the other hand, a function is called nonseparable if its variables show interrelation among themselves
and are thus not independent. It is relatively easier to solve separable functions because they can be decomposed into
independent subfunctions, each one of which can be optimized independently [41].
Finally, the dimension, that is, the number of variables a function has, also dictates the difficulty of an optimization
problem. As the dimension increases, the search space enlarges exponentially, thus making it more challenging for an
algorithm to find the optimal solution [57].
Therefore, we divide our experimentation into four different types of functions, namely five unimodal and separable
functions, five multimodal and separable functions, five unimodal and nonseparable functions, and five multimodal and
nonseparable functions, obtained from [41] and [58].
All functions below are tested in dimensions D = 2, 5, 10, and 20; the last two columns give the optimal function value and the optimal solution.

Unimodal & Separable
    Schwefel 2.20    f1(x) = Σᵢ |xᵢ|                                                        [−100, 100]        0     (0, …, 0)
    Powell Sum       f2(x) = Σᵢ |xᵢ|^(i+1)                                                  [−1, 1]            0     (0, …, 0)
    Schwefel 2.21    f3(x) = max_{i=1,…,D} |xᵢ|                                             [−100, 100]        0     (0, …, 0)
    Sphere           f4(x) = Σᵢ xᵢ²                                                         [−5.12, 5.12]      0     (0, …, 0)
    Sum Squares      f5(x) = Σᵢ i·xᵢ²                                                       [−10, 10]          0     (0, …, 0)

Multimodal & Separable
    Alpine 1         f6(x) = Σᵢ |xᵢ sin(xᵢ) + 0.1xᵢ|                                        [0, 10]            0     (0, …, 0)
    Wavy             f7(x) = 1 − (1/D) Σᵢ cos(10xᵢ) exp(−xᵢ²/2)                             [−π, π]            0     (0, …, 0)
    Qing             f8(x) = Σᵢ (xᵢ² − i)²                                                  [−500, 500]        0     (±√1, …, ±√D)
    Rastrigin        f9(x) = 10D + Σᵢ [xᵢ² − 10 cos(2πxᵢ)]                                  [−5.12, 5.12]      0     (0, …, 0)
    Xin-She Yang 1   f10(x) = Σᵢ randᵢ |xᵢ|^i                                               [−5, 5]            0     (0, …, 0)

Unimodal & Nonseparable
    Brown            f11(x) = Σ_{i=1}^{D−1} [(xᵢ²)^(xᵢ₊₁²+1) + (xᵢ₊₁²)^(xᵢ²+1)]             [−1, 4]            0     (0, …, 0)
    Rosenbrock       f12(x) = Σ_{i=1}^{D−1} [100(xᵢ₊₁ − xᵢ²)² + (1 − xᵢ)²]                  [−5, 10]           0     (1, …, 1)
    Schwefel 2.22    f13(x) = Σᵢ |xᵢ| + Πᵢ |xᵢ|                                             [−100, 100]        0     (0, …, 0)
    Xin-She Yang 3   f14(x) = exp(−Σᵢ (xᵢ/15)¹⁰) − 2 exp(−Σᵢ xᵢ²) Πᵢ cos²(xᵢ)               [−2π, 2π]          −1    (0, …, 0)
    Zakharov         f15(x) = Σᵢ xᵢ² + (Σᵢ 0.5i·xᵢ)² + (Σᵢ 0.5i·xᵢ)⁴                        [−5, 10]           0     (0, …, 0)

Multimodal & Nonseparable
    Ackley           f16(x) = −20 exp(−0.2 √((1/D) Σᵢ xᵢ²)) − exp((1/D) Σᵢ cos(2πxᵢ)) + 20 + e    [−32.768, 32.768]  0    (0, …, 0)
    Periodic         f17(x) = 1 + Σᵢ sin²(xᵢ) − 0.1 exp(−Σᵢ xᵢ²)                            [−10, 10]          0.9   (0, …, 0)
    Griewank         f18(x) = 1 + Σᵢ xᵢ²/4000 − Πᵢ cos(xᵢ/√i)                               [−100, 100]        0     (0, …, 0)
    Salomon          f19(x) = 1 − cos(2π √(Σᵢ xᵢ²)) + 0.1 √(Σᵢ xᵢ²)                         [−100, 100]        0     (0, …, 0)
    Xin-She Yang 4   f20(x) = [Σᵢ sin²(xᵢ) − exp(−Σᵢ xᵢ²)] exp(−Σᵢ sin²(√|xᵢ|))             [−10, 10]          −1    (0, …, 0)

Table 2. Optimization test functions having varied combinations of types and dimensions, applied to the Philippine Eagle Optimization Algorithm and the 11 other examined algorithms. All sums and products run over i = 1, …, D unless indicated otherwise, and randᵢ denotes a uniform random number in [0, 1].
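For reference, a few of the Table 2 functions written out in Python, one per category:

    import numpy as np

    def sphere(x):                    # f4, unimodal and separable
        return np.sum(x**2)

    def rastrigin(x):                 # f9, multimodal and separable
        return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    def rosenbrock(x):                # f12, unimodal and nonseparable
        return np.sum(100 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

    def ackley(x):                    # f16, multimodal and nonseparable
        D = x.size
        return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / D))
                - np.exp(np.sum(np.cos(2 * np.pi * x)) / D) + 20 + np.e)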
For each of these 20 functions, we use dimensions of 2, 5, 10, and 20, thus giving 80 experiments in total. Therefore,
we have chosen an extensive test suite that accommodates a wide variety of function properties.
Table 2 presents the functions used in our experiments, along with their corresponding search range, true optimal
function value, and true optimal solution.
In solving the test functions, we compare the performance of PEOA to a set of metaheuristic algorithms, swarm
intelligence algorithms, and nature-inspired heuristics.
Specifically, the 11 selected algorithms for comparison are Genetic Algorithm [5], Particle Swarm Optimization
[36], Flower Pollination Algorithm [11], [59], Bat Algorithm [10], [60], Cuckoo Search Algorithm [9], [61], Firefly
Algorithm [8], [62], Whale Optimization Algorithm [13], [63], Moth Flame Optimization Algorithm [12], [64], Butterfly
Optimization Algorithm [14], [65], Artificial Bee Colony [7], [37], [66], and Improved Multi-Operator Differential
Evolution [15].
parameter ρ 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09 0.1
Ackley, D = 5 1.7782E-09 9.7759E-10 4.1470E-10 1.0584E-09 3.2541E-10 2.8472E-10 4.6380E-10 3.8248E-10 9.9210E-10 4.3085E-10
Wavy, D = 5 2.0982E-14 2.7235E-14 9.6250E-15 2.1256E-14 9.3207E-15 9.5119E-15 2.1869E-14 1.2354E-14 2.8362E-14 1.9498E-14
Salomon, D = 5 4.9488E-11 1.8624E-11 1.5483E-11 1.3563E-11 4.9937E-03 4.9937E-03 1.4981E-02 9.9873E-03 1.1997E-11 1.2657E-11
Xin-She Yang 4, D = 5 1.8607E-09 1.7819E-09 1.8065E-09 2.1454E-09 2.7005E-09 1.8437E-09 1.1687E-09 3.2604E-09 1.0773E-09 1.6042E-09
Alpine 1, D = 5 0 0 0 0 0 0 0 0 0 0
Periodic, D = 5 3.2145E-14 1.2688E-14 2.1868E-14 4.7343E-15 1.4275E-14 2.6242E-14 4.4572E-14 2.4870E-14 2.2104E-14 1.0323E-14
Qing, D = 5 2.3845E-14 1.5176E-14 1.9453E-14 1.9354E-14 1.4227E-14 1.6687E-14 1.9944E-14 2.5450E-14 3.3501E-10 1.9552E-10
Xin-She Yang 1, D = 5 1.6708E-07 2.2784E-07 3.4547E-07 1.5467E-07 1.0378E-07 1.5530E-07 8.7675E-07 2.6181E-07 2.4350E-07 1.9822E-06
Griewank, D = 5 2.6104E-13 3.2783E-13 2.5521E-13 4.0403E-13 5.3205E-13 4.0645E-13 4.0323E-13 1.6342E-13 4.8389E-13 1.8385E-13
Sum Squares, D = 5 6.1803E-15 5.0441E-15 9.9334E-15 7.5859E-15 8.1303E-15 5.4330E-15 1.1270E-14 6.7452E-13 1.9735E-15 4.0433E-14
Schwefel 2.20, D = 5 3.1603E-09 1.5984E-09 2.0958E-09 1.9398E-09 2.0064E-09 1.9515E-09 1.9019E-09 1.5489E-09 1.6727E-09 2.0413E-09
Powell Sum, D = 5 3.5203E-09 3.4654E-09 3.4013E-09 3.3918E-09 4.5497E-09 4.1739E-09 2.7353E-09 2.9486E-09 3.2139E-09 3.1862E-09
Zakharov, D = 5 4.0617E-14 1.3912E-14 3.2947E-14 1.4049E-14 1.5183E-14 2.1480E-14 1.6598E-14 9.3915E-15 1.0069E-14 8.0972E-14
Xin-She Yang 3, D = 5 2.3057E-13 2.4066E-14 1.3899E-14 1.2321E-14 2.6746E-14 5.4959E-15 3.7886E-14 6.7816E-13 8.0520E-15 8.8837E-15
Schwefel 2.22, D = 5 1.5485E-09 2.4281E-09 3.6965E-09 2.0066E-09 1.2923E-09 2.1754E-09 2.2947E-09 2.0411E-09 1.8680E-09 2.3485E-09
Schwefel 2.21, D = 5 4.6529E-10 5.2274E-10 4.9110E-10 2.7100E-10 1.2535E-09 1.3283E-10 5.4769E-10 1.3827E-09 5.9170E-10 3.7210E-10
Brown, D = 5 8.1158E-13 2.5861E-13 3.6792E-11 1.1512E-13 1.7539E-13 1.3452E-13 2.4183E-13 1.0823E-13 4.4943E-13 2.9509E-13
Rastrigin, D = 5 0 0 0 0 0 0 0 0 0 0
Sphere, D = 5 5.6480E-12 8.7120E-14 1.5976E-12 4.2641E-13 6.2517E-13 3.4748E-12 1.2214E-13 1.7405E-12 3.7974E-14 7.0133E-13
Rosenbrock, D = 5 5.3620E-11 5.8565E-11 5.4455E-11 1.2323E-10 5.6634E-11 5.6250E-11 5.3570E-11 5.4670E-11 5.4045E-11 4.4657E-11
average function value 8.9761E-09 1.1934E-08 1.7874E-08 8.2810E-09 2.4969E-04 2.4969E-04 7.4909E-04 4.9938E-04 1.2666E-08 9.9623E-08
Table 3. Optimal function values obtained by the Philippine Eagle Optimization Algorithm using different values of the parameter ρ, which determines the cluster size per local search. Experiments were done on each test function with dimension 5, and results were averaged over 20 independent runs per function. The value ρ = 0.04 gave the best result.
Our experimental setup is based on the experimental settings recommended by the CEC 2020 Special Session and
Competition on Single Objective Bound Constrained Numerical Optimization [67]. These settings ensure the efficiency
and fairness of the comparison of competing algorithms.
The features of our experimental setup are the following:
The values of the parameters of PEOA are chosen as follows: the initial eagle population size (S0) is 20 · D², the local food size (Sloc) is 10 · D², the minimum eagle population size (Smin) is 5, the archive rate (A) is 2.6, and the memory size (H) for the scaling factor is 20 · D.
We recall that the value ρ = 0.04 is used in Equation (2). This parameter controls the cluster size of each local food search. An experiment was done to determine the value of this constant for which PEOA gives the best results. Different values of ρ were considered; for each value, PEOA was tested on the 20 test functions given in Table 2. For this simulation, we set the dimension to 5 and ran the algorithm 20 times.
The results obtained by PEOA for this experiment are summarized in Table 3. Observe that the value of ρ that gave the
best average result (highlighted in green) is 0.04.
We also show a brief analysis of the population size S. PEOA was implemented once for the Xin-She Yang 1 function
with dimension 2. After 21 generations, PEOA attained an optimal function value of 6.7459E-09.
The population sizes obtained from this experiment are

    S = [80 79 79 78 78 77 77 76 76 76 75 75 74 74 73 73 73 72 72 71 71].

We thus see that the population sizes decrease linearly. Recall from Equation (8) that the slope of this decrease is (Smin − S0)/Nmax, where Smin is the minimum population size, S0 is the initial population size, and Nmax is the maximum number of function evaluations.
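A sketch of this schedule, assuming simple rounding, follows.

    def population_size(nfe, S0, Smin, Nmax):
        """Linear population size reduction: interpolate from S0 down to Smin
        as the number of function evaluations nfe approaches Nmax (Equation (8))."""
        return int(round(S0 + (Smin - S0) * nfe / Nmax))

    # Example: with S0 = 80 and Smin = 5 (the D = 2 setting above), early
    # generations keep roughly 80 eagles, matching the sequence 80, 79, 79, ...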
For more detailed analyses of the other parameter schemes used by PEOA, such as the scaling factors Fi in Equations
(4), (5), and (7), we refer the reader to JADE [50], UMOEAs-II [49], and IMODE [15].
For brevity, we only present here the numerical results for functions with dimension D = 20. Results for functions with
dimensions D = 2, 5, 10 can be found in the Appendix.
Tables 4, 6, 5, and 7 provide the average, best, and worst function value errors as well as the standard deviations
obtained for functions with dimension D = 20 using the different examined algorithms. The cells having a value of 0
are highlighted in green for emphasis.
Table 4. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to the 11 other examined algorithms for 5 different unimodal
and separable functions of dimension 20. PEOA and BOA performed the best in this table since all their obtained results here are less than 10⁻⁸.
Table 5. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to the 11 other examined algorithms for 5 different multimodal
and separable functions of dimension 20. PEOA performed the best among all the algorithms in this table. While PEOA
did not achieve perfect results for the Xin-She Yang 1 function, PEOA still gave relatively low errors for this function.
WOA gave better results than PEOA for the Xin-She Yang 1 function but PEOA was able to produce significantly better
results than WOA for the Qing function.
Table 6. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to the 11 other examined algorithms for 5 different unimodal
and nonseparable functions of dimension 20. PEOA showed excellent performance for the functions here since it gave results that are all less than 10⁻⁸. PEOA is the only algorithm in this case that was able to give perfect results.
On the other hand, Figures 2, 3, 4, and 5 present the boxplots for functions with dimension D = 20. The boxplots show
the function value error |f_true − f(x*)|, where f_true is the true function value and x* is the optimal solution obtained by the corresponding algorithm labelled on the bottom axis. For better illustration, all values less than or equal to 10⁻⁸ are treated as 10⁻⁸ in the boxplots. Also, a logarithmic scale is used to accommodate a wide range of values.
Figure 6 shows the average number of function evaluations taken by the different examined algorithms for each
dimension D = 2, 5, 10, and 20 when the stopping criterion is satisfied. The averages are computed over the 30
independent runs of each test function and the 20 test functions per dimension.
From these results, we see that PEOA obtained the greatest number of solutions with errors less than 10⁻⁸ among all 12 examined algorithms in Tables 4, 6, 5, and 7. Also, most of the optimal solutions that PEOA found for the different functions of dimension 20 are close to the true optimal solutions. While PEOA did not attain values less than the tolerance for the Xin-She Yang 1, Salomon, and Xin-She Yang 4 functions, its obtained values for these functions are still relatively small.
Moreover, the boxplots in Figures 2, 3, 4, and 5 further validate the superior performance of PEOA among the examined
algorithms. The boxplots corresponding to PEOA are generally thin and placed at 10−8 for almost all functions,
indicating that the errors obtained by PEOA are consistently small. In particular, PEOA shows highly competitive
results for the Schwefel 2.21, Periodic, Rosenbrock, Xin-She Yang 3, and Xin-She Yang 4 functions.
Table 7. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to the 11 other examined algorithms for 5 different multimodal
and nonseparable functions of dimension 20. PEOA performed the best in this case. While PEOA was not able to give
perfect results for the Xin-She Yang 4 function, it still gave relatively low function errors and significantly outperformed all the other algorithms here.
At the same time, we see in Figure 6 that, for all the dimensions tested, PEOA used the least average number of function evaluations until the error tolerance of 10⁻⁸ was reached. PEOA thus fared well in comparison with the other examined algorithms in terms of speed and computational cost. This computationally inexpensive feature of PEOA can be attributed to its heavy exploitation technique, reflected in its regular and intensive local food search.
4 Applications to Real-World Optimization Problems

In the previous section, we showed through various benchmark tests that PEOA is an efficient global optimization algorithm. We now present two applications that PEOA solves effectively.
Electrical Impedance Tomography (EIT) is a non-invasive imaging technique that reconstructs the conductivity
distribution of an object using electric currents. EIT has gained great research interest due to its affordability, portability, and radiation-free nature [69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79]. In particular, the main application of EIT is (continuous) lung monitoring in medical imaging [71, 69].
In this work, PEOA is applied to solve the inverse conductivity problem of EIT using the Complete Electrode Model
(CEM), which is the most accurate and commonly used model for EIT.
EIT as a mathematical problem is divided into two parts: the forward and the inverse problem. The forward problem is where the data acquisition is made; that is, it computes the voltages at the electrodes given a current pattern and the conductivity distribution inside the object. Let Ω ⊂ ℝ^d, d = 2, 3, be bounded with a smooth boundary ∂Ω.
Let a set of patches eℓ ⊂ ∂Ω, ℓ = 1, 2, …, L, where L ∈ ℕ, be the mathematical model of disjoint contact electrodes. Denote by Iℓ ∈ ℝ the current injected on the ℓth electrode and suppose that the current pattern I = (Iℓ)ℓ satisfies the conservation of charge, i.e., Σ_{ℓ=1}^{L} Iℓ = 0.

The effective contact impedance is denoted by Z ∈ ℝ^L, where Z = (zℓ)ℓ, ℓ = 1, …, L, and zℓ > zmin for some positive constant zmin. Moreover, the conductivity distribution σ ∈ L∞(Ω) is assumed to satisfy 0 < σmin ≤ σ(x) ≤ σmax < +∞ for some constants σmin, σmax. Let u ∈ H¹(Ω) be the potential inside the domain, and let the measured voltages at the electrodes be U = (Uℓ)ℓ, which satisfies the arbitrary choice of ground, that is, Σ_{ℓ=1}^{L} Uℓ = 0.
[Figure 2 shows five boxplot panels, including Schwefel 2.21 (D = 20), Sphere (D = 20), and Sum Squares (D = 20), with the function value error on the vertical axis (logarithmic scale) and the algorithms PEOA, GA, PSO, FPA, BA, CS, FA, WOA, MFO, BOA, ABC, and IMODE on the horizontal axis.]
Figure 2. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the Philippine Eagle Optimization Algorithm and the 11 other examined algorithms for 5 unimodal and separable functions with 20 dimensions. PEOA consistently obtained thin boxplots that are placed at 10⁻⁸ for all the functions. This shows that PEOA can provide accurate and consistent results for the functions here.
[Figure 3 shows five boxplot panels: Alpine 1, Wavy, Qing, Rastrigin, and Xin-She Yang 1 (all D = 20), with the function value error on the vertical axis (logarithmic scale) and the examined algorithms on the horizontal axis.]
Figure 3. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the Philippine
Eagle Optimization Algorithm and the 11 other examined algorithms for 5 multimodal and separable functions with 20
dimensions. PEOA obtained thin and low boxplots for the first 4 functions here. For the 5th function, PEOA still has
the second best result.
[Figure 4 shows five boxplot panels: Brown, Rosenbrock, Schwefel 2.22, Xin-She Yang 3, and Zakharov (all D = 20), with the function value error on the vertical axis (logarithmic scale) and the examined algorithms on the horizontal axis.]
Figure 4. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the Philippine
Eagle Optimization Algorithm and the 11 other examined algorithms for 5 unimodal and nonseparable functions with
20 dimensions. Again, PEOA is consistent in obtaining thin and low boxplots for all the functions. Furthermore, PEOA
is the only algorithm in this case that has excellent boxplots for all 5 functions here.
[Figure 5 shows five boxplot panels: Ackley, Periodic, Griewank, Salomon, and Xin-She Yang 4 (all D = 20), with the function value error on the vertical axis (logarithmic scale) and the examined algorithms on the horizontal axis.]
Figure 5. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the
Philippine Eagle Optimization Algorithm and the 11 other examined algorithms for 5 multimodal and nonseparable
functions with 20 dimensions. Like the previous figures, PEOA has excellent boxplots for all the functions. Again,
PEOA is the only algorithm in this case that has thin and low boxplots for all 5 considered functions here. In particular,
see Periodic, Salomon, and Xin-She Yang 4 functions.
[Figure 6 shows a bar chart of the average number of function evaluations for PEOA, GA, PSO, FPA, BA, CS, FA, WOA, MFO, BOA, ABC, and IMODE.]
Figure 6. Average number of function evaluations of the Philippine Eagle Optimization Algorithm and the 11 other examined algorithms upon reaching a function value error of 10⁻⁸, averaged over 30 independent runs of each test function and 20 test functions per dimension.
The CEM forward problem for EIT is: given the current pattern I and conductivity distribution σ, find the potentials (u, U) such that

    ∇ · (σ∇u) = 0    in Ω,    (12)
    u + zℓ σ ∂u/∂n = Uℓ    on eℓ, ℓ = 1, 2, …, L,    (13)
    σ ∂u/∂n = 0    on ∂Ω \ Γe,    (14)
    ∫_{eℓ} σ ∂u/∂n ds = Iℓ,    ℓ = 1, 2, …, L,    (15)

where n is the outward unit normal to ∂Ω and Γe = ∪ℓ eℓ is the part of the boundary covered by the electrodes.
For more background on these equations, see [80] and [81]. The existence and uniqueness of the solution of the forward problem are proven in [81]. A discussion of the numerical solution and sensitivity analysis of the forward problem can be found in [82].
Meanwhile, the inverse problem reconstructs the conductivity distribution given the voltage measurements on the electrodes. First, we assume that σ is piecewise constant, i.e., σ(x) = Σ_{i=0}^{N} σi χi(x), x ∈ Ω, where σ0 is the background conductivity, χ0(x) is the characteristic function of the background domain Ω0 = Ω \ ∪_{i=1}^{N} Ωi, N corresponds to the number of (possible) inclusions Ωi (i = 1, …, N) in Ω, and χi(x) = 1 if x ∈ Ωi and 0 otherwise.
Our goal is to retrieve the N inclusions of different conductivities in Ω. More precisely, we want to iteratively estimate vectors P ∈ ℝ^m and S ∈ ℝ^N, where P contains the geometric attributes (e.g., center, side length) of the inclusions Ωi, i = 1, …, N, and S contains the respective conductivity σi of each inclusion, i = 1, 2, …, N, such that the error between the observed voltages and those predicted by the CEM forward problem is minimized. The inverse conductivity problem of EIT can thus be formulated as an optimization problem with the following objective function:

    C(P, S) = ‖U(P, S) − Uobs‖2²,

where the voltages U(P, S) are determined by solving the CEM forward problem (12)–(15) at a fixed conductivity σ, Uobs is the observed voltage at the electrodes, and ‖·‖2 is the Euclidean norm.
Because of the importance of EIT in various fields, numerous approaches in solving the inverse problem can be found
in the literature [83, 84, 85, 86, 87, 88]. Several meta-heuristic algorithms were applied to the EIT inverse conductivity
problem and produced promising results [80, 89, 90, 91]. We show how PEOA can also effectively solve the EIT
inverse problem.
In this paper, we consider a disk domain with one elliptical inclusion. In particular, we aim to find the values of the unknowns, that is, the conductivity σe of the inclusion, the center (h, k) of the ellipse, the lengths a and b of the major and minor axes, respectively, and the angle of rotation θ. The conductivity σ0 of the background medium is known and equal to 1.0. We work with synthetic data generated by setting the conductivity of the elliptical inclusion to 6.7. The number of electrodes is L = 32, and the contact impedance is set to be constant across all electrodes with zℓ = 0.03. The first current pattern applied to the electrodes has the form I¹ = {I¹_ℓ}_{ℓ=0}^{L−1} with I¹_ℓ = sin(2πℓ/L), and fifteen more current patterns are obtained by ‘rotating’ the values of the first one, giving a total of sixteen current patterns. A 1% random (additive) noise is added to the voltage data as Udata = (1 + 0.01 · rand(L)) · U to model the error obtained from EIT experiments. In our simulations, one noise seed comprises sixteen different noise vectors added to the corresponding sixteen current-voltage measurements. The algorithm is applied for 20 independent runs with the same noise seed for all runs, with a maximum number of function evaluations (6000) as the stopping criterion.
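A sketch of this setup is given below; solve_cem_forward is a hypothetical stand-in for a CEM forward solver (e.g., a finite element code), and only the noise model and the cost C(P, S) follow the description above.

    import numpy as np

    rng = np.random.default_rng(0)

    def make_noisy_data(U_true, L=32):
        """Add 1% random additive noise per electrode: U_data = (1 + 0.01 rand(L)) U."""
        return (1.0 + 0.01 * rng.random(L)) * U_true

    def eit_cost(params, U_obs_list, currents, solve_cem_forward):
        """Objective C(P, S) = ||U(P, S) - U_obs||_2^2 summed over the sixteen
        current patterns. `solve_cem_forward` is a hypothetical CEM solver
        returning electrode voltages for a given current pattern and the
        candidate inclusion parameters (sigma_e, h, k, a, b, theta)."""
        sigma_e, h, k, a, b, theta = params
        cost = 0.0
        for I, U_obs in zip(currents, U_obs_list):
            U = solve_cem_forward(I, sigma_e, (h, k), (a, b), theta)
            cost += np.sum((U - U_obs) ** 2)
        return cost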
Table 8. Final solutions and their corresponding relative errors (|true value − ave. value| / |true value|) generated by PEOA for the EIT inverse conductivity problem in a disk domain with one elliptical inclusion. Note that 7π/8 ≈ 2.74889357.
Parameter σe h k a b θ
bounds [5, 9] [−1, 1] [−1, 1] [0, 2] [0, 2] [0, π]
true value 6.7 −0.4 0.5 0.7 0.4 7π/8
ave. value 6.963 -0.399 0.484 0.689 0.449 2.583
rel. error 3.9E-02 1.4E-03 3.0E-02 1.4E-02 1.2E-01 6.0E-02
Figure 7. Left: true conductivity distribution. Right: reconstructed conductivity distribution (mean of the 20 approximate
solutions).
The results obtained by PEOA in solving the inverse conductivity problem of EIT are shown in Table 8 and Figure 7. We observe that the algorithm approximated the conductivity value, the center, and the shape of the elliptical inclusion quite well, but was less accurate in approximating one of the axis lengths and the angle of rotation.
Given a mathematical model of a system, some of its parameter values might be unknown. While some parameters can be found in the literature, the others need to be estimated. Parameter identification is a minimization problem that solves for the parameters of the model that best fit the available data. Depending on the problem, various techniques for estimating model parameters can be found in the literature [92, 93, 94, 95, 96, 97, 98].
As an application of PEOA, we present an approach to identifying the parameters of a pendulum system model. This model involves a neutral delay differential equation (NDDE), which is a differential equation with delays in both the state and its derivative. NDDEs have been used to model various applications in science and engineering [99, 100, 101, 102, 103, 104, 105].
In this work, we consider a Pendulum-Mass-Spring-Damper (PMSD) system consisting of a mass M mounted on a
linear spring. Attached to the spring via a hinged rod of length l is a pendulum of mass m [106]. The angular deflection
of the pendulum from the downward position is assumed to be negligible. The parameter C is the damping coefficient.
Furthermore, it is assumed that external force does not act on the system. This mechanical system can be modeled
using the following delay differential equation of neutral type
M ẍ(t) + C ẋ(t) + Kx(t) + mẍ(t − τ ) = 0. (16)
Here, K and C denote the stiffness and damping coefficients, respectively. The position, velocity, and acceleration of the system at a given time t are represented by the quantities x(t), ẋ(t), and ẍ(t), respectively. By dividing both sides of
(16) by M , we obtain the following modified equivalent equation
ÿ + 2ζ ẏ + y + pÿ(t − τ ) = 0. (17)
For this model, the history function is given by φ(t) = cos(t/2) [106].
The parameters of (17) are estimated from a set of simulated noisy data, which are generated in two steps. First, the following parameter values from [106] are used to solve (17): τ = 1, ζ = 0.05, and p = 0. Second, the noisy data
y*(tᵢ), i = 1, 2, …, n, are generated by assuming a normal distribution, with standard deviation equal to 10% of the standard deviation of the computed solution of the model [98]. For this study, we set n = 50. We find the minimum of the least-squares error formulation given by

    min_{θ ∈ ℝ³}  [ Σ_{i=1}^{50} (y*ᵢ − y_θ(tᵢ))² ] / [ Σ_{i=1}^{50} (y*ᵢ)² ],

where θ is the parameter vector containing the triple (τ, ζ, p), and y_θ(tᵢ) denotes the model solution at time tᵢ given θ.
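A sketch of this objective follows; solve_pmsd is a hypothetical NDDE integrator for Equation (17) with history φ(t) = cos(t/2), since no standard Python routine solves neutral delay differential equations.

    import numpy as np

    def pmsd_cost(theta, t, y_star, solve_pmsd):
        """Relative least-squares error between the noisy data y* and the model
        solution of Equation (17). `theta` = (tau, zeta, p); `solve_pmsd` is a
        hypothetical NDDE solver for Equation (17) with history phi(t) = cos(t/2)."""
        y_model = solve_pmsd(theta, t)                   # y_theta(t_i), i = 1..n
        return np.sum((y_star - y_model) ** 2) / np.sum(y_star ** 2)

    # PEOA would then minimize pmsd_cost over theta in R^3 within given bounds.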
Figure 8. Plots of the solution curves (cyan) to the pendulum model in (17) using the estimated parameters obtained by
PEOA.
Because PEOA is probabilistic, we run the algorithm 20 times independently. This way, we can gauge the accuracy and consistency of the solutions obtained. The results are presented in Figure 8 and Table 9. We can see that all 20 obtained estimates are close to the true solution. The plots of y(t) using the estimated parameters fit the simulated data well. Furthermore, the relative errors of the calculated parameters are all less than 2%.
5 Conclusion
This work proposes a novel, metaheuristic, nature-inspired optimization algorithm called the Philippine Eagle Optimization Algorithm. The algorithm is inspired by the hunting behavior of the Philippine Eagle and uses three different global operators for its exploration strategy. It also performs an intensive local search in every iteration, contributing to its strong exploitation ability.
Twenty optimization test functions of varying properties on modality, separability, and dimension were solved using
PEOA, and the results were compared to those obtained by 11 other optimization algorithms. PEOA was also applied to
two real-world optimization problems: the inverse conductivity problem in Electrical Impedance Tomography (EIT)
and parameter estimation in a Pendulum-Mass-Spring-Damper system (PMSD) involving neutral delay differential
equations.
Results show that PEOA effectively solves the different benchmark tests implemented in this work. The algorithm
outperforms the other examined algorithms in terms of accuracy and precision in finding the optimal solution of the
tested functions. PEOA also uses the least number of function evaluations compared to the other algorithms, indicating
that it employs a computationally inexpensive optimization process. Such a feature of PEOA is due to its heavy
exploitation technique. Furthermore, PEOA can provide good results for the six unknowns in the EIT problem and
gives proper estimates for the parameters involved in the PMSD model.
We emphasize that PEOA gave better results than IMODE in solving the test functions chosen in this paper. This
is a significant highlight because IMODE ranked first in the CEC 2020 Competition on Single Objective Bound
Constrained Numerical Optimization [67]. Since certain aspects of PEOA were derived from IMODE and its several
source algorithms, PEOA can be considered a further improved version of these algorithms.
Therefore, PEOA is a competitive algorithm that can be applied to a variety of functions and problems while keeping
the number of function evaluations at a minimum. It shows promising features in comparison to the other optimization
algorithms selected. It also highlights the distinctive characteristics of the national bird of the Philippines, the Philippine
Eagle, which could hopefully initiate conservation efforts for the critically endangered bird.
Future research will consider further modifications of PEOA that can improve its performance, experimentation with PEOA on a broader set of optimization functions, finding more real-world applications where PEOA can be used, and creating versions of PEOA that can handle constrained or multi-objective optimization problems.
References
[1] Xin-She Yang. Nature-Inspired Metaheuristic Algorithms. Luniver Press, 2008.
[2] Xin-She Yang. Engineering Optimization: An Introduction with Metaheuristic Applications. John Wiley & Sons, 2010.
[3] C.A. Floudas and P.M. Pardalos. Encyclopedia of Optimization. Springer US, 2008.
[4] Xin-She Yang. Optimization Techniques and Applications with Examples. John Wiley & Sons, Inc., Sep 2018.
[5] John H. Holland. Genetic algorithms. Scientific American, 267(1):66–72, Jul 1992.
[6] Rainer Storn and Kenneth Price. Differential evolution - a simple and efficient heuristic for global optimization
over continuous spaces. Journal of Global Optimization, 11(4):341–359, 1997.
[7] Dervis Karaboga and Bahriye Basturk. Artificial Bee Colony (ABC) Optimization Algorithm for Solving
Constrained Optimization Problems, pages 789–798. Springer Berlin Heidelberg, 2007.
[8] Xin-She Yang. Firefly Algorithms for Multimodal Optimization, pages 169–178. Springer Berlin Heidelberg,
2009.
[9] Xin-She Yang and Suash Deb. Cuckoo search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC 2009), pages 210–214. IEEE, Dec 2009.
[10] Xin-She Yang. A New Metaheuristic Bat-Inspired Algorithm, pages 65–74. Springer Berlin Heidelberg, 2010.
[11] Xin-She Yang. Flower Pollination Algorithm for Global Optimization, pages 240–249. Springer Berlin
Heidelberg, 2012.
[12] Seyedali Mirjalili. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowledge-
Based Systems, 89:228–249, Nov 2015.
[13] Seyedali Mirjalili and Andrew Lewis. The whale optimization algorithm. Advances in Engineering Software,
95:51–67, May 2016.
[14] Sankalap Arora and Satvir Singh. Butterfly optimization algorithm: a novel approach for global optimization.
Soft Computing, 23(3):715–734, Mar 2018.
Table 10. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to those obtained by the 11 other examined algorithms for the
20 different functions of varied types and having dimension 2.
Table 11. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to those obtained by the 11 other examined algorithms for the
20 different functions of varied types and having dimension 5.
Table 12. Average, best, and worst function value errors and standard deviations over 30 independent runs obtained by
the Philippine Eagle Optimization Algorithm compared to those obtained by the 11 other examined algorithms for the
20 different functions of varied types and having dimension 10.
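The statistics reported in Tables 10–12 can be reproduced from per-run error records with a few lines; a minimal sketch, assuming `errors` is an array of 30 function value errors for one algorithm-function pair:

```python
import numpy as np

def summarize(errors):
    """Average, best, and worst function value errors and the standard
    deviation over independent runs, matching the columns of Tables 10-12."""
    e = np.asarray(errors)
    return {"avg": e.mean(), "best": e.min(), "worst": e.max(), "std": e.std()}

# Example with dummy data standing in for 30 per-run errors:
print(summarize(np.abs(np.random.default_rng(1).normal(0.0, 1e-8, size=30))))
```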
Figure 9. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the
Philippine Eagle Optimization Algorithm and the 11 other examined algorithms for the 20 different functions of varied
types and having dimension 2.
[Figure: 20 boxplot panels, one per test function (Brown; Powell Sum; Schwefel 2.20, 2.21, and 2.22; Zakharov;
Sum Squares; Sphere; Rosenbrock; Xin-She Yang 1, 3, and 4; Qing; Ackley; Alpine 1; Griewank; Wavy; Periodic;
Salomon; Rastrigin), each plotting the function value error on a logarithmic vertical axis for PEOA, GA, PSO, FPA,
BA, CS, FA, WOA, MFO, BOA, ABC, and IMODE.]
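Boxplots in the style of Figures 9–11 can be regenerated from per-run error records with matplotlib; the data below are synthetic placeholders rather than the paper's results, and the panel title is only an example.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic per-algorithm error samples for a single test function (30 runs
# each); real use would substitute the recorded function value errors.
rng = np.random.default_rng(0)
errors = {alg: rng.lognormal(mean=-18.0 + 2.0 * i, sigma=1.0, size=30)
          for i, alg in enumerate(["PEOA", "GA", "PSO", "FPA", "BA", "CS"])}

fig, ax = plt.subplots()
ax.boxplot(list(errors.values()))
ax.set_xticks(range(1, len(errors) + 1), labels=list(errors.keys()))
ax.set_yscale("log")                  # errors span several orders of magnitude
ax.set_ylabel("Function Value Error")
ax.set_title("Sphere, D = 2")
plt.show()
```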
Figure 10. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the
Philippine Eagle Optimization Algorithm and the 11 other examined algorithms for the 20 different functions of varied
types and having dimension 5.
[Figure: the same 20 boxplot panels as in Figure 9, here with D = 5.]
Figure 11. Boxplots over 30 independent runs (in logarithmic scale) of the function value errors obtained by the
Philippine Eagle Optimization Algorithm and the 11 other examined algorithms for the 20 different functions of varied
types and having dimension 10.
[Figure: the same 20 boxplot panels as in Figure 9, here with D = 10.]
[15] Karam M. Sallam, Saber M. Elsayed, Ripon K. Chakrabortty, and Michael J. Ryan. Improved multi-operator
differential evolution algorithm for solving unconstrained problems. In 2020 IEEE Congress on Evolutionary
Computation (CEC). Institute of Electrical and Electronics Engineers (IEEE), Jul 2020.
[16] C. Munien, S. Mahabeer, E. Dzitiro, S. Singh, S. Zungu, and A.E. Ezugwu. Metaheuristic approaches for
one-dimensional bin packing problem: A comparative performance study. IEEE Access, 2020.
[17] P. Agrawal, H.F. Abutarboush, T. Ganesh, and A.W. Mohamed. Metaheuristic algorithms on feature selection: A
survey of one decade of research (2009-2019). IEEE Access, 9:26766–26791, 2021.
[18] M.M. Ahsan, K.D. Gupta, A.K. Nag, S. Poudyal, A.Z. Kouzani, and M.A.P. Mahmud. Applications and
evaluations of bio-inspired approaches in cloud security: A review. IEEE Access, 8:180799–180814, 2020.
[19] Ghanshyam G Tejani, Vimal J Savsani, Vivek K Patel, and Seyedali Mirjalili. An improved heat transfer
search algorithm for unconstrained optimization problems. Journal of Computational Design and Engineering,
6(1):13–32, 2019.
[20] Angelie R Ferrolino, Jose Ernie C Lope, and Renier G Mendoza. Optimal location of sensors for early detection
of tsunami waves. In International Conference on Computational Science, pages 562–575. Springer, 2020.
[21] Angelie Ferrolino, Renier Mendoza, Ikha Magdalena, and Jose Ernie Lope. Application of particle swarm
optimization in optimal placement of tsunami sensors. PeerJ Computer Science, 6:e333, 2020.
[22] T Srihari, Madhu Boppa, S Anil Kumar, and Harish Pulluri. The application of genetic algorithm with multi-
parent crossover to optimal power flow problem. In Innovations in Electrical and Electronics Engineering, pages
417–427. Springer, 2020.
[23] Meisam Mahdavi, Hassan Haes Alhelou, Amir Bagheri, Sasa Z Djokic, and Ricardo Alan Verdú Ramos. A
comprehensive review of metaheuristic methods for the reconfiguration of electric power distribution systems
and comparison with a novel approach based on efficient genetic algorithm. IEEE Access, 2021.
[24] Elizabeth Jordan, Delia E. Shin, Surbhi Leekha, and Shapour Azarm. Optimization in the context of covid-19
prediction and control: A literature review. IEEE Access, 9:130072–130093, 2021.
[25] Ghanshyam G Tejani, Vimal J Savsani, and Vivek K Patel. Modified sub-population teaching-learning-based
optimization for design of truss structures with natural frequency constraints. Mechanics Based Design of
Structures and Machines, 44(4):495–513, 2016.
[26] Ghanshyam G Tejani, Nantiwat Pholdee, Sujin Bureerat, Doddy Prayogo, and Amir H Gandomi. Structural opti-
mization using multi-objective modified adaptive symbiotic organisms search. Expert Systems with Applications,
125:425–441, 2019.
[27] Ghanshyam G Tejani, Vimal J Savsani, Sujin Bureerat, Vivek K Patel, and Poonam Savsani. Topology optimiza-
tion of truss subjected to static and dynamic constraints by integrating simulated annealing into passing vehicle
search algorithms. Engineering with Computers, 35(2):499–517, 2019.
[28] Tawatchai Kunakote, Numchoak Sabangban, Sumit Kumar, Ghanshyam G Tejani, Natee Panagant, Nantiwat
Pholdee, Sujin Bureerat, and Ali R Yildiz. Comparative performance of twelve metaheuristics for wind farm
layout optimisation. Archives of Computational Methods in Engineering, pages 1–14, 2021.
[29] Sumit Kumar, Ghanshyam G Tejani, Nantiwat Pholdee, and Sujin Bureerat. Multiobjecitve structural optimization
using improved heat transfer search. Knowledge-Based Systems, 219:106811, 2021.
[30] Sumit Kumar, Ghanshyam G Tejani, Nantiwat Pholdee, and Sujin Bureerat. Improved metaheuristics through
migration-based search and an acceptance probability for truss optimization. Asian Journal of Civil Engineering,
21(7):1217–1237, 2020.
[31] Vimal J Savsani, Ghanshyam G Tejani, and Vivek K Patel. Truss topology optimization with static and dynamic
constraints using modified subpopulation teaching–learning-based optimization. Engineering optimization,
48(11):1990–2006, 2016.
[32] Sumit Kumar, Pradeep Jangir, Ghanshyam G Tejani, Manoharan Premkumar, and Hassan Haes Alhelou. Mopgo:
A new physics-based multi-objective plasma generation optimizer for solving structural optimization problems.
IEEE Access, 9:84982–85016, 2021.
[33] Hui Li, Xiao Liu, Zhiguo Huang, Chenbo Zeng, Peng Zou, Zhaoyi Chu, and Junkai Yi. Newly emerging
nature-inspired optimization - algorithm review, unified framework, evaluation, and behavioural parameter
optimization, 2020.
[34] Daniel Molina, Javier Poyatos, Javier Del Ser, Salvador Garcia, Amir Hussain, and Francisco Herrera. Compre-
hensive taxonomies of nature- and bio-inspired optimization: Inspiration versus algorithmic behavior, critical
analysis recommendations, Jul 2020.
[35] Iztok Fister Jr., Xin-She Yang, Iztok Fister, Janez Brest, and Dušan Fister. A brief review of nature-inspired
algorithms for optimization, 2013.
[36] J. Kennedy and R. Eberhart. Particle swarm optimization. In Proceedings of ICNN’95 - International Conference
on Neural Networks, volume 4, pages 1942–1948 vol.4, 1995.
[37] Bahriye Akay and Dervis Karaboga. A modified artificial bee colony algorithm for real-parameter optimization.
Information Sciences, 192:120–142, Jun 2012.
[38] Cheng-Xu Zhang, Kai-Qing Zhou, Shao-Qiang Ye, and Azlan Mohd Zain. An improved cuckoo search algorithm
utilizing nonlinear inertia weight and differential evolution for function optimization problem. IEEE Access,
2021.
[39] I. Fister, J. Brest, A. Iglesias, A. Galvez, and S. Deb. On selection of a benchmark by determining the algorithms’
qualities. IEEE Access, 9:51166–51178, 2021.
[40] Z.-M. Gao, J. Zhao, Y.-R. Hu, and H.-F. Chen. The challenge for the nature-inspired global optimization
algorithms: Non-symmetric benchmark functions. IEEE Access, 9:106317–106339, 2021.
[41] Momin Jamil and Xin She Yang. A literature survey of benchmark functions for global optimisation problems.
International Journal of Mathematical Modelling and Numerical Optimisation, 4(2):150–194, 2013.
[42] A. Kazikova, M. Pluhacek, and R. Senkerik. How does the number of objective function evaluations impact our
understanding of metaheuristics behavior? IEEE Access, 9:44032–44048, 2021.
[43] Jayson Ibanez, Anna Mae Sumaya, Giovanne Tampos, and Dennis Salvador. Preventing Philippine Eagle hunting:
What are we missing? Journal of Threatened Taxa, 8(13):9505, Nov 2016.
[44] Adrian U. Luczon, Ian Kendrich C. Fontanilla, Perry S. Ong, Zubaida U. Basiao, Anna Mae T. Sumaya, and
Jonas P. Quilang. Genetic diversity of the critically endangered Philippine Eagle Pithecophaga jefferyi (Aves:
Accipitridae) and notes on its conservation. Journal of Threatened Taxa, 6(10):6335–6344, Sep 2014.
[45] BirdLife International. Pithecophaga jefferyi. IUCN Red List of Threatened Species, Oct 2016.
[46] Robert S Kennedy. Notes on the biology and population status of the monkey-eating eagle of the Philippines.
Wilson Bulletin, 89(1):1–20, 1977.
[47] Camille B. Concepcion, Margaret Sulapas, and J. Ibañez. Notes on food habits and breeding and nestling
behavior of Philippine Eagles in Mount Apo Natural Park, Mindanao, Philippines. 2006.
[48] Marco Cavazzuti. Optimization Methods. Springer Berlin Heidelberg, 2013.
[49] Saber Elsayed, Noha Hamza, and Ruhul Sarker. Testing united multi-operator evolutionary algorithms-II on
single objective optimization problems. In 2016 IEEE Congress on Evolutionary Computation (CEC), pages
2966–2973, 2016.
[50] Jingqiao Zhang and Arthur C. Sanderson. JADE: Adaptive differential evolution with optional external archive.
IEEE Transactions on Evolutionary Computation, 13(5):945–958, 2009.
[51] Andy M. Reynolds and Mark A. Frye. Free-flight odor tracking in Drosophila is consistent with an optimal
intermittent scale-free search. PLOS ONE, 2(4):1–9, Apr 2007.
[52] M. Gutowski. Lévy flights as an underlying mechanism for global optimization algorithms. arXiv: Mathematical
Physics, Jul 2001.
[53] Ali Asghar Heidari, Seyedali Mirjalili, Hossam Faris, Ibrahim Aljarah, Majdi Mafarja, and Huiling Chen. Harris
hawks optimization: Algorithm and applications. Future Generation Computer Systems, 97:849–872, Aug 2019.
[54] Ryoji Tanabe and Alex S. Fukunaga. Improving the search performance of SHADE using linear population size
reduction. In 2014 IEEE Congress on Evolutionary Computation (CEC), pages 1658–1665, 2014.
[55] Nermin Covic and Bakir Lacevic. Wingsuit flying search: a novel global optimization algorithm. IEEE Access,
8:53883–53900, 2020.
[56] D. Ortiz-Boyer, C. Hervas-Martinez, and N. Garcia-Pedrajas. CIXL2: A crossover operator for evolutionary
algorithms based on population features. Journal of Artificial Intelligence Research, 24:1–48, Jul 2005.
[57] Xin Yao, Yong Liu, Ko-Hsin Liang, and Guangming Lin. Fast Evolutionary Algorithms, pages 45–94. Springer
Berlin Heidelberg, 2003.
[58] S. Surjanovic and D. Bingham. Virtual library of simulation experiments: Test functions and datasets. Retrieved
September 25, 2021, from http://www.sfu.ca/~ssurjano, 2013.
[59] Xin-She Yang. Flower pollination algorithm. Retrieved September 25, 2021, from https://www.mathworks.
com/matlabcentral/fileexchange/45112-flower-pollination-algorithm, MATLAB Central File
Exchange, 2021.
[60] Xin-She Yang. Bat algorithm (demo). Retrieved September 25, 2021, from https://www.mathworks.com/
matlabcentral/fileexchange/37582-bat-algorithm-demo, MATLAB Central File Exchange, 2021.
[61] Xin-She Yang. Cuckoo search (cs) algorithm. Retrieved September 25, 2021, from https://www.mathworks.
com/matlabcentral/fileexchange/29809-cuckoo-search-cs-algorithm, MATLAB Central File Ex-
change, 2021.
[62] Xin-She Yang. Firefly algorithm. Retrieved September 25, 2021, from https://www.mathworks.com/
matlabcentral/fileexchange/29693-firefly-algorithm, MATLAB Central File Exchange, 2021.
[63] Seyedali Mirjalili. The whale optimization algorithm. Retrieved September 25, 2021, from https://www.
mathworks.com/matlabcentral/fileexchange/55667-the-whale-optimization-algorithm, MAT-
LAB Central File Exchange, 2021.
[64] Seyedali Mirjalili. Moth-flame optimization (mfo) algorithm. Retrieved Septem-
ber 25, 2021, from https://www.mathworks.com/matlabcentral/fileexchange/
52269-moth-flame-optimization-mfo-algorithm, MATLAB Central File Exchange, 2021.
[65] Sankalap Arora. Butterfly optimization algorithm (boa) source codes demo v1.0. Retrieved Septem-
ber 25, 2021, from https://www.mathworks.com/matlabcentral/mlc-downloads/downloads/
b4a529ac-c709-4752-8ae1-1d172b8968fc/67a434dc-8224-4f4e-a835-bc92c4630a73/previews/
BOA.m/index.html, 2021.
[66] SKS Labs. Artificial bee colony optimization. Retrieved September 25, 2021, from https://www.mathworks.
com/matlabcentral/fileexchange/74122-artificial-bee-colony-optimization, MATLAB Cen-
tral File Exchange, 2021.
[67] C. T. Yue, K. V. Price, P. N. Suganthan, J. J. Liang, M. Z. Ali, B. Y. Qu, N. H. Awad, and Partha P Biswas.
Problem definitions and evaluation criteria for the CEC 2020 special session and competition on single objective
bound constrained numerical optimization. Technical report, Computational Intelligence Laboratory, Zhengzhou
University, Zhengzhou, China, and Nanyang Technological University, Singapore, Nov 2019.
[68] Erika Antonette Enriquez. Philippine-eagle-optimization-algorithm. https://github.com/
ErikaAntonette/Philippine-Eagle-Optimization-Algorithm.git, 2021.
[69] J. Newell, D. Isaacson, and J. Mueller. Electrical impedance tomography. IEEE Trans. Med. Imaging, 21:553–554,
2002.
[70] E. Teschner, M. Imhoff, and S. Leonhardt. Electrical Impedance Tomography: The realisation of regional
ventilation monitoring, 2nd edition. Dräger Medical GmbH, 2015.
[71] D. Holder. Clinical and physiological applications of electrical impedance tomography. UCL Press, 1993.
[72] N. D. Harris, A. J. Suggett, D. C. Barber, and B. H. Brown. Applications of applied potential tomography (APT)
in respiratory medicine. Clinical Physics and Physiological Measurement, 8(4A):155–165, 1987.
[73] I. Frerichs, J. Hinz, P. Herrmann, G. Weisser, G. Hahn, T. Dudykevych, M. Quintel, and G. Hellige. Detection of
local lung air content by electrical impedance tomography compared with electron beam CT. Journal of applied
physiology (1985), 93:660–666, 2002.
[74] D. Tingay, A. Waldmann, I. Frerichs, S. Ranganathan, and A. Adler. Electrical impedance tomography can
identify ventilation and perfusion defects: A neonatal case. American Journal of Respiratory and Critical Care
Medicine, 199:384–386, 2019.
[75] A. Adler, Y. Berthiaume, R. Guardo, and R. Amyot. Imaging of pulmonary edema with electrical impedance
tomography. In Proceedings of 17th International Conference of the Engineering in Medicine and Biology
Society, volume 1, pages 557–558, 1995.
[76] V. Cherepenin, A. Karpov, A. Korjenevsky, V. Kornienko, A. Mazaletskaya, D. Mazourov, and D. Meister. A 3D
electrical impedance tomography (EIT) system for breast cancer detection. Physiological measurement, 22:9–18,
2001.
[77] R. J. Halter, A. Hartov, and K. D. Paulsen. A broadband high-frequency electrical impedance tomography system
for breast imaging. IEEE Transactions on Biomedical Engineering, 55(2):650–659, 2008.
[100] Y.N. Kyrychko and S.J. Hogan. On the use of delay equations in engineering applications. Journal of Vibration
and Control, 16(7–8):943–960, 2010.
[101] M. Fliess, H. Mounier, P. Rouchon, and J. Rudolph. Controllability and motion planning for linear delay systems
with an application to a flexible rod. In Proceedings of 1995 34th IEEE Conference on Decision and Control.
IEEE.
[102] Dalma J. Nagy, László Bencsik, and Tamás Insperger. Experimental estimation of tactile reaction delay during
stick balancing using cepstral analysis. Mechanical Systems and Signal Processing, 138:106554, 2020.
[103] Alexander Domoshnitsky, Shai Levi, Ron Hay Kappel, Elena Litsyn, and Roman Yavich. Stability of neutral
delay differential equations with applications in a model of human balancing. Mathematical Modelling of Natural
Phenomena, 16:21, 2021.
[104] Cristeta Jamilla, Renier Mendoza, and Victoria May Mendoza. Explicit solution of a Lotka-Sharpe-
McKendrick system involving neutral delay differential equations using the r-Lambert W function. Mathematical
Biosciences and Engineering, 17(5):5686–5708, 2020.
[105] C. T. H. Baker, G. A. Bocharov, C. A. H. Paul, and F. A. Rihan. Modelling and analysis of time-lags in some
basic patterns of cell proliferation. Journal of Mathematical Biology, 37(4):341–371, 1998.
[106] Y.N Kyrychko, K.B Blyuss, A Gonzalez-Buelga, S.J Hogan, and D.J Wagg. Real-time dynamic substructuring
in a coupled oscillator–pendulum system. Proceedings of the Royal Society A: Mathematical, Physical and
Engineering Sciences, 462(2068):1271–1294, 2006.