Expert Systems With Applications: Wen Long, Ming Xu, Jianjun Jiao, Tiebin Wu, Mingzhu Tang, Shaohong Cai
A R T I C L E  I N F O

Keywords:
Butterfly optimization algorithm
High-dimensional optimization
Refraction-learning
Feature selection
Particle swarm optimization
Global optimization

A B S T R A C T

Throughout the last decade, high-dimensional function optimization problems have received substantial research interest in the field of intelligence computation. The butterfly optimization algorithm is a new meta-heuristic technique that has been proven to be more competitive than other optimization methods on low-dimensional problems. The basic butterfly optimization algorithm and its variants, however, are not used to solve high-dimensional problems. Therefore, this paper develops an improved version of the butterfly optimization algorithm to deal with high-dimensional optimization problems. Inspired by particle swarm optimization, a modified position update equation is designed by introducing a velocity item and a memory item in the local search phase to guide the search for candidate individuals. Moreover, a novel refraction-based learning strategy is introduced into the butterfly optimization algorithm to effectively enhance diversity and exploration. The effectiveness of the proposed algorithm is tested on 40 benchmark problems with 100, 1000, and 10,000 dimensions. The statistical results show that our algorithm performs better than the basic butterfly optimization algorithm, its variants, and other population-based approaches on high-dimensional optimization problems. Finally, high-dimensional feature selection and wind-turbine fault identification problems are solved, and the comparisons show that the proposed algorithm outperforms most methods in terms of classification accuracy and size of the optimal feature subset.
* Corresponding author.
E-mail addresses: [email protected] (W. Long), [email protected] (M. Xu), [email protected] (J. Jiao), [email protected]
(T. Wu), [email protected] (M. Tang), [email protected] (S. Cai).
https://doi.org/10.1016/j.eswa.2022.117217
Received 2 December 2020; Received in revised form 21 March 2022; Accepted 9 April 2022
Available online 13 April 2022
0957-4174/© 2022 Elsevier Ltd. All rights reserved.
W. Long et al. Expert Systems With Applications 201 (2022) 117217
Table 1
The descriptions of benchmark test functions.

Function formulation    Search range    f_min
f1(x) = \sum_{i=1}^{D} x_i^2    [-100, 100]    0
f2(x) = \sum_{i=1}^{D} i x_i^2    [-10, 10]    0
f3(x) = \sum_{i=1}^{D} (\sum_{j=1}^{i} x_j)^2    [-100, 100]    0
f13(x) = \sum_{i=1}^{D} \lfloor x_i^2 \rfloor    [-100, 100]    0
f14(x) = \sum_{i=1}^{D-1} [(x_i^2)^{x_{i+1}^2 + 1} + (x_{i+1}^2)^{x_i^2 + 1}]    [-1, 1]    0
f15(x) = \sum_{i=1}^{D} x_i^4    [-100, 100]    0
f16(x) = (\sum_{i=1}^{D} x_i)^2    [-100, 100]    0
f17(x) = \sum_{i=1}^{D} x_i^{10}    [-10, 10]    0
f18(x) = \sum_{i=1}^{D} [x_i^2 - 10\cos(2\pi x_i) + 10]    [-5.12, 5.12]    0
f19(x) = -20\exp(-0.2\sqrt{(1/D)\sum_{i=1}^{D} x_i^2}) - \exp((1/D)\sum_{i=1}^{D} \cos(2\pi x_i)) + 20 + e    [-32, 32]    0
f20(x) = (1/4000)\sum_{i=1}^{D} x_i^2 - \prod_{i=1}^{D} \cos(x_i/\sqrt{i}) + 1    [-600, 600]    0
f21(x) = \sum_{i=1}^{D} |x_i \sin(x_i) + 0.1 x_i|    [-10, 10]    0
f22(x) = \sin^2(\pi x_1) + \sum_{i=1}^{D-1} [x_i^2 (1 + 10\sin^2(\pi x_1)) + (x_i - 1)^2 \sin^2(2\pi x_i)]    [-10, 10]    0
f23(x) = 1 - \cos(2\pi\sqrt{\sum_{i=1}^{D} x_i^2}) + 0.1\sqrt{\sum_{i=1}^{D} x_i^2}    [-100, 100]    0
f24(x) = 0.1D - (0.1\sum_{i=1}^{D} \cos(5\pi x_i) - \sum_{i=1}^{D} x_i^2)    [-1, 1]    0
f25(x) = \sum_{i=2}^{D} [0.5 + (\sin^2(\sqrt{100 x_{i-1}^2 + x_i^2}) - 0.5) / (1 + 0.001(x_{i-1}^2 - 2 x_{i-1} x_i + x_i^2))^2]    [-100, 100]    0
f26(x) = 0.1 (\sin^2(3\pi x_1) + \sum_{i=1}^{D-1} (x_i - 1)^2 (1 + \sin^2(3\pi x_{i+1})) + (x_D - 1)^2 (1 + \sin^2(2\pi x_D)))    [-5, 5]    0
f27(x) = \sum_{i=1}^{D} (0.2 x_i^2 + 0.1 x_i^2 \sin(2 x_i))    [-10, 10]    0
f28(x) = (-1)^{D+1} \prod_{i=1}^{D} \cos(x_i) \exp[-\sum_{i=1}^{D} (x_i - \pi)^2]    [-100, 100]    0
f29(x) = \sum_{i=1}^{D-1} [x_i^2 + 2 x_{i+1}^2 - 0.3\cos(3\pi x_i) - 0.4\cos(4\pi x_{i+1}) + 0.7]    [-15, 15]    0
f30(x) = \sum_{i=1}^{D-1} (x_i^2 + 2 x_{i+1}^2)^{0.25} (\sin^2(50 (x_i^2 + x_{i+1}^2)^{0.1}) + 1)    [-10, 10]    0
f31(x) = \sum_{i=1}^{D} x_i^6 (2 + \sin(1/x_i))    [-1, 1]    0
f32(x) = (\pi/D) \{10\sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2 [1 + 10\sin^2(\pi y_{i+1})] + (y_D - 1)^2\} + \sum_{i=1}^{D} u(x_i, 10, 100, 4),
  where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k(x_i - a)^m if x_i > a; 0 if -a < x_i < a; k(-x_i - a)^m if x_i < -a    [-50, 50]    0
f33(x) = 0.1 \{\sin^2(3\pi x_1) + \sum_{i=1}^{D} (x_i - 1)^2 [1 + \sin^2(3\pi x_i + 1)] + (x_D - 1)^2 [1 + \sin^2(2\pi x_D)]\} + \sum_{i=1}^{D} u(x_i, 5, 100, 4)    [-50, 50]    0
f34(x) = \sum_{i=1}^{D} x_i^2 + ((1/2)\sum_{i=1}^{D} i x_i)^2 + ((1/2)\sum_{i=1}^{D} i x_i)^4    [-10, 10]    0
f35(x) = \sum_{i=1}^{D} (0.5 + (\sin^2(\sqrt{x_i^2 + x_{i+1}^2}) - 0.5) / [1 + 0.001(x_i^2 + x_{i+1}^2)]^2)    [-100, 100]    0
f36(x) = (\sum_{i=1}^{D} |x_i|) \exp(-\sum_{i=1}^{D} \sin(x_i^2))    [-2\pi, 2\pi]    0
f37(x) = \sum_{i=1}^{D} i x_i^2 + \sum_{i=1}^{D} 20 i \sin^2(x_{i-1}\sin x_i + \sin x_{i+1}) + \sum_{i=1}^{D} i \log_{10}(1 + i (x_{i-1}^2 - 2 x_i + 3 x_{i+1} - \cos x_i + 1)^2)    [-10, 10]    0
f38(x) = \sum_{i=1}^{D} (x_i^2 + 0.4 \sum_{j \ne i} x_i x_j)    [-100, 100]    0
f39(x) = \sum_{i=1}^{D} ((x_i - 1)^2 + (x_1 - x_i^2)^2)    [0, 10]    0
f40(x) = \sum_{i=1}^{D} (D - \sum_{j=1}^{D} \cos x_j + i (1 - \cos x_i - \sin x_i))^2    [0, \pi]    0
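For concreteness, a few of the benchmarks in Table 1 can be written directly in code. The paper's experiments use MATLAB; the Python sketch below is ours and only illustrates the formulas for f1 (sphere), f18 (Rastrigin), f19 (Ackley), and f20 (Griewank), each of which attains its minimum value 0 at the origin.

```python
import math

def sphere(x):                      # f1
    return sum(xi ** 2 for xi in x)

def rastrigin(x):                   # f18
    return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def ackley(x):                      # f19
    d = len(x)
    s1 = sum(xi ** 2 for xi in x) / d
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / d
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def griewank(x):                    # f20
    s = sum(xi ** 2 for xi in x) / 4000
    p = 1.0
    for i, xi in enumerate(x, start=1):   # product term uses 1-based index
        p *= math.cos(xi / math.sqrt(i))
    return s - p + 1

origin = [0.0] * 100
print(sphere(origin), rastrigin(origin), griewank(origin))
```

Such closed-form definitions are what allow the scalability tests later in the paper to evaluate the same function at 100, 1000, or 10,000 dimensions without modification.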
the adaptive disruption and Cauchy mutation. Jia et al. (2019) developed a two-tier distributed collaborative evolution framework by introducing an effective strategy. De Falco et al. (2019) investigated a novel surrogate-assisted cooperative co-evolution algorithm.

This paper concentrates on the butterfly optimization algorithm (BOA), which was presented by Arora and Singh (2019). BOA simulates the food searching and mating behavior of biological butterflies in nature. Preliminary studies suggested that BOA exhibited strong competitiveness on function optimization problems when compared to other population-based meta-heuristics such as ABC, DE, and PSO (Arora & Singh, 2019). Therefore, BOA has been widely applied to real-world applications such as feature selection (Arora &
Table 2
Comparison results of four methods for 40 benchmark problems with 100 dimensions.
Function BOA OLBOA MBOA VBOA
Mean St.dev Mean St.dev Mean St.dev Mean St.dev
Anand, 2019), optimal design of combined cooling, heating and power systems (Yuan et al., 2020), data classification (Jalali et al., 2019), maximum power point tracking (Shams et al., 2021), wavelet neural network training (Tan et al., 2020), parameter identification of photovoltaic models (Long et al., 2021b), optimal multilevel image thresholding (Sowjanya & Injeti, 2021), and so on. Similar to other population-based meta-heuristic optimization algorithms, BOA also has some drawbacks, i.e., low solution precision, slow convergence speed, and premature convergence to local optima when solving complex nonlinear optimization problems. To improve the overall performance of the basic BOA, some BOA variants have been developed (Li et al., 2019; Sharma & Saha, 2020; Guo et al., 2019; Long et al., 2021a; Mortazavi & Moloodpoor, 2021; Long et al., 2021b). Although BOA shows strong competitiveness compared with other population-based meta-heuristic methods, it has only been used for optimization problems with low or moderate dimensionality. The basic BOA and its variants have not been used to solve optimization problems with high dimensionality. Additionally, according to the position search equation of BOA, the personal historical best position of each individual is not exploited much. Motivated by these considerations, this paper proposes a novel BOA variant, called velocity-based BOA (VBOA), to improve the performance of the basic BOA on high-dimensional optimization problems. More specifically, the main contributions of this paper are listed as follows:

1) An efficient and robust BOA variant is developed to solve high-dimensional optimization and feature selection problems.
2) A modified position update equation based on a velocity item and a memory item is designed to generate potential candidate individuals.
3) A novel refraction-theory learning operator is proposed to enhance the diversity of VBOA.
4) The effectiveness of VBOA is verified using 40 benchmark functions with high dimensionality, 10 high-dimensional benchmark feature selection problems, and one fault identification problem.

2. Butterfly optimization algorithm

BOA is a population-based optimization algorithm recently developed by Arora and Singh (2019). It simulates the foraging and mating behavior of butterflies. In BOA, the fragrance is formulated as:

f = c I^a   (1)

where f represents the perceived magnitude of the fragrance, c denotes the sensory modality, I is the stimulus intensity, and a is the power exponent.

In the initialization phase, the initial parameters and an initial population of butterflies are generated. In the iteration phase, two key phases are introduced. In the global search phase, the mathematical expression is formulated as:

x_i(t + 1) = x_i(t) + (r^2 \times g^* - x_i(t)) \times f_i   (2)

where x_i represents the position of the ith butterfly, t denotes the
Fig. 4. Convergence graphs of four algorithms for twelve representative problems with 100 dimensions.
Table 3
Comparison results of four methods for 40 benchmark problems with 1000 dimensions.
Function BOA OLBOA MBOA VBOA
Mean St.dev Mean St.dev Mean St.dev Mean St.dev
From the literature, the basic BOA and its variants still have some shortcomings, such as low solution accuracy, slow convergence speed, and proneness to local optima when solving complex function problems. This is also the primary motivation for proposing the new BOA variant in this paper. It is necessary to point out that VBOA does not change the framework of the basic BOA; it improves the algorithm by using two new strategies. These two new strategies are described as follows.
Table 4
Comparison results of four methods for 40 benchmark problems with 10,000 dimensions.
Function BOA OLBOA MBOA VBOA
Mean St.dev Mean St.dev Mean St.dev Mean St.dev
phase and the two randomly selected solutions (x_j and x_k) in the local search phase are introduced to achieve a good tradeoff between convergence and diversity, respectively. In other words, on one hand, an individual in the global search phase updates its position by learning from the global best individual, to accelerate convergence and improve the exploitation ability. On the other hand, an individual in the local search phase renews its position by learning simultaneously from the current and two randomly selected individuals, to improve the diversity of the population. However, the parameter r in Eq. (4) is a random number, so its ability to achieve a good tradeoff between convergence and diversity is limited. Furthermore, it is observed that executing the global and local search stages via a switch probability sometimes causes BOA to lose direction and/or move away from the global best butterfly. Thus, one active research interest for BOA variants is to investigate the position update equation.

PSO is a powerful and robust meta-heuristic algorithm proposed by Kennedy and Eberhart (1995). It simulates the flocking and swarming behavior of birds. In PSO, each particle has its own velocity (v_i) and position (x_i). In the evolutionary search process, each particle memorizes its own personal best position to further guide the search for the optimal particle. In each iteration, the velocity and position are updated by (Shi & Eberhart, 1998):

v_i(t + 1) = w \cdot v_i(t) + b_1 r_1 (p_{best}(t) - x_i(t)) + b_2 r_2 (g_{best}(t) - x_i(t))   (6)

x_i(t + 1) = x_i(t) + v_i(t + 1)   (7)

where v_i is the velocity of the ith particle, x_i represents the ith particle's position, w denotes the inertia weight, p_best represents the personal historical best position, g_best is the global best position, b_1 and b_2 are coefficient factors, and r_1 and r_2 are random numbers, respectively.

From Eq. (3), in the local search phase, the new candidate butterfly is produced by moving the current butterfly towards another two butterfly individuals selected randomly from the population. This equation can effectively maintain the diversity of the population but is poor at convergence. At present, no research has concentrated on the local search phase of BOA to improve its performance. In fact, the personal historical best butterfly information is not exploited much. Inspired by PSO, this paper presents a modified position update equation by adding a velocity item and an individual memory item in the local search phase as follows:

v_i(t + 1) = w(t) \times v_i(t) + b_3 \times r_3 \times (p_{best}(t) - x_i(t)) + (r^2 \times x_j(t) - x_k(t)) \times f_i   (8)

x_i(t + 1) = x_i(t) + v_i(t + 1)   (9)

where v_i is the velocity of the ith butterfly, p_best represents the personal historical best position of the butterfly, b_3 is a coefficient factor, r_3 is a random number, and w is the inertia weight. Empirical studies demonstrate that a larger w exhibits better performance on global search, while a smaller w leads to local search. Therefore, the value of w is calculated as:

w(t) = w_{max} - (w_{max} - w_{min}) \times t / MaxIter   (10)

where MaxIter represents the maximum number of iterations, and w_max and w_min are the maximum and minimum values of w, respectively.

The first term on the right side of Eq. (8) denotes the previous velocity of the butterfly, which provides the necessary impetus for butterflies to search the entire solution space. The second term on the right side of Eq. (8) is the memory-guided item, which guides the butterflies to move towards their own historical best positions. The third term of Eq. (8) is the diversity-guided item, which guides the butterflies to move towards another two randomly selected butterflies from the population. Compared with Eq. (3), Eqs. (8) and (9) can generate more promising candidate butterflies, which balance diversity and convergence.

Fig. 5. Convergence graphs of four methods for twelve representative problems with 1000 dimensions.
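The modified update of Eqs. (8)-(10) can be sketched as below. This is our own illustrative Python, not the authors' MATLAB code; the defaults follow the Section 4.2 settings (b_3 = 2, w_max = 1, w_min = 0), and the helper names are invented.

```python
import random

def inertia(t, max_iter, w_max=1.0, w_min=0.0):
    # Eq. (10): w decreases linearly from w_max to w_min over the run.
    return w_max - (w_max - w_min) * t / max_iter

def local_search_step(x_i, v_i, p_best, x_j, x_k, f_i, t, max_iter, b3=2.0):
    w = inertia(t, max_iter)
    r, r3 = random.random(), random.random()
    # Eq. (8): previous-velocity item + memory-guided item + diversity item.
    v_new = [w * v + b3 * r3 * (p - x) + (r ** 2 * xj - xk) * f_i
             for v, x, p, xj, xk in zip(v_i, x_i, p_best, x_j, x_k)]
    # Eq. (9): move the butterfly by its new velocity.
    x_new = [x + v for x, v in zip(x_i, v_new)]
    return x_new, v_new

print(inertia(0, 500), inertia(250, 500), inertia(500, 500))
```

Early in the run (w near 1) the previous velocity dominates and the search is more global; late in the run (w near 0) the memory and diversity items dominate, which matches the paper's description of the schedule.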
3.2. Refraction-based learning strategy

In the basic BOA, according to the position updating of the other butterflies, new candidate butterflies are produced by moving the current butterfly towards the global best butterfly (g*). In the later stage of the evolutionary search, the other butterflies in the population are attracted to the global best butterfly. Therefore, premature convergence may appear without sufficient exploration of the solution space. Accordingly, one of the most promising goals in BOA improvement is how to escape from local optima. To reduce the probability of trapping in local optima, the commonly used strategies are the mutation operator, opposition-based learning, Lévy flight, etc. This paper introduces a novel learning mechanism, called the refraction-based learning (RBL) strategy, to improve the exploration capability of BOA. The detailed explanation is as follows.

The refraction of light (as shown in Fig. 1) is a common physical phenomenon in which light passes from one medium into another and its propagation direction changes (Shao et al., 2017). In Fig. 1, light from medium 1, at point P, enters medium 2 via junction point O, refraction occurs, and the light finally reaches point Q. θ_1 is the angle of incidence and θ_2 is the angle of refraction. Snell's law (Born & Wolf, 1959) states that the ratio of the sines of θ_1 and θ_2 is equal to the reciprocal of the ratio of the indices of refraction:
Fig. 6. Convergence graphs of four methods for twelve representative problems with 10,000 dimensions.
η = sin θ_1 / sin θ_2 = [((a + b)/2 − x*)/h] / [(x*′ − (a + b)/2)/h′]   (14)

so the opposite solution x*′ based on refraction learning is computed as:

x*′ = (a + b)/2 + (a + b)/(2kη) − x*/(kη)   (16)

In Eq. (16), when k = 1 and η = 1, it reduces to:

x*′ = a + b − x*   (17)

Generally, Eq. (16) may be extended to the n-dimensional space:

x_j*′ = (a_j + b_j)/2 + (a_j + b_j)/(2kη) − x_j*/(kη)   (18)

where x_j* and x_j*′ are the jth dimensions of x* and x*′, and a_j and b_j are the lower and upper bounds of the jth dimension, respectively.

The detailed implementation is as follows. Suppose r ∈ [0, 1] is a random number; if r < q (q ∈ [0, 1] is a constant number), the RBL strategy is implemented; otherwise, the RBL strategy is not performed. The purpose of this scheme is to avoid performing the RBL strategy in every iteration, and thereby reduce the computational complexity of the algorithm. Fig. 3 plots the flow chart of the proposed algorithm.

4. Simulation experiments and comparisons

To comprehensively study the overall performance of VBOA, a series of experiments is used to deal with various benchmark test cases. All the algorithms are implemented in MATLAB R2014a; the operating system used is Microsoft Windows 8 64-bit Home, and the simulations are supported by an Intel(R) Core(TM) i5-5200U CPU @ 2.20 GHz with 4 GB RAM.

Fig. 7. Mean ranks of four algorithms on 40 benchmark problems with 100, 1000 and 10,000 dimensions.
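The RBL opposite point of Eq. (18), together with the probability-q trigger just described, can be sketched as follows. This Python sketch is ours; the function names are illustrative and the defaults follow the Section 4.2 settings (k = 2, η = 2).

```python
import random

def rbl_opposite(x, lower, upper, k=2.0, eta=2.0):
    # Eq. (18), applied dimension-wise:
    # x'_j = (a_j + b_j)/2 + (a_j + b_j)/(2*k*eta) - x_j/(k*eta)
    return [(a + b) / 2 + (a + b) / (2 * k * eta) - xj / (k * eta)
            for xj, a, b in zip(x, lower, upper)]

def maybe_refract(x, lower, upper, q=0.5, k=2.0, eta=2.0):
    # Apply RBL only with probability q, so the extra cost is not paid
    # in every iteration.
    if random.random() < q:
        return rbl_opposite(x, lower, upper, k, eta)
    return x

# With k = eta = 1 the formula reduces to the classical opposition-based
# point a + b - x of Eq. (17).
print(rbl_opposite([30.0], [-100.0], [100.0], k=1.0, eta=1.0))  # [-30.0]
```

Larger k·η pulls the opposite point toward the midpoint of the search range, which is what gives RBL a tunable degree of perturbation compared with plain opposition-based learning.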
Forty widely used benchmark functions are chosen from (Jamil &
Yang, 2013; Hussain et al., 2017) to investigate the effectiveness of
Table 5
Comparison of CPU execute time results (seconds) on 40 benchmark problems with 100, 1000, and 10,000 dimensions.
Function D = 100 D = 1000 D = 10000
BOA OLBOA MBOA VBOA BOA OLBOA MBOA VBOA BOA OLBOA MBOA VBOA
f1 0.7037 0.6554 2.7935 0.8060 1.5380 1.6375 5.2672 1.8733 19.008 21.300 29.238 20.188
f2 0.6952 0.7403 2.7810 0.8237 1.9063 1.8014 5.8488 1.9496 20.318 23.018 30.665 22.339
f3 9.0660 9.0103 25.423 6.5206 134.25 136.97 402.24 92.132 1961.3 2012.6 2565.9 1853.9
f4 0.7790 0.8173 3.2573 0.8872 1.6882 1.7951 5.9183 2.0280 25.282 22.168 42.516 25.384
f5 0.7195 0.7845 2.7989 0.8617 2.0883 2.3013 6.4729 2.1975 27.782 30.872 55.331 29.448
f6 1.5590 1.6470 4.7660 1.4038 9.1108 9.1723 22.517 6.1512 67.410 64.762 135.22 47.611
f7 1.5016 1.6068 4.7552 1.6030 8.9134 9.3600 22.343 6.9404 59.616 61.956 124.20 50.288
f8 1.6686 1.6680 4.4763 1.2588 7.5081 7.7180 21.593 4.9612 56.360 54.337 109.53 41.655
f9 1.5522 1.6411 4.6252 1.4810 9.1264 9.5546 23.192 7.3576 70.242 66.712 124.58 55.831
f10 1.5214 1.7390 4.7522 1.2818 9.2705 9.4018 23.596 6.2242 67.412 65.512 137.65 47.760
f11 1.4946 1.6200 4.8841 1.2212 9.1141 9.4368 22.588 5.1809 63.576 62.421 134.67 42.261
f12 1.1852 1.4238 1.4992 1.0855 2.2738 2.6050 2.7377 2.2682 25.400 24.726 34.614 27.250
f13 0.8114 0.8258 1.0376 0.9006 2.2279 2.2879 2.6416 2.2025 28.186 28.226 39.188 29.745
f14 2.5758 2.7898 1.4880 1.2188 16.844 16.286 5.8230 3.8732 115.39 89.940 124.71 38.666
f15 1.5167 1.7473 1.8901 1.3000 9.0060 9.4372 10.889 6.1775 67.587 63.775 75.335 47.256
f16 0.6295 0.7158 0.8380 0.8151 1.5877 1.6844 1.9030 1.8728 22.366 21.022 30.486 23.245
f17 1.5216 1.7318 1.7893 1.3085 9.1019 9.4797 9.9321 5.8131 66.664 64.700 76.124 46.412
f18 1.1524 1.4266 3.5159 1.0362 3.0533 2.8386 7.5296 2.3964 27.290 30.130 54.813 29.788
f19 1.0250 1.2803 3.5599 1.1027 2.4496 2.5624 7.3107 2.5253 30.084 31.627 53.835 30.982
f20 1.2322 1.2854 3.1743 1.0483 2.6382 2.8008 7.5238 2.6602 31.973 31.795 54.435 32.040
f21 1.0372 1.2347 3.1569 0.9850 2.3901 2.4082 6.6716 2.4110 28.242 29.768 45.887 29.848
f22 0.9869 1.2132 3.3691 1.0591 2.4618 2.7972 7.9512 2.6036 32.231 32.390 54.898 32.207
f23 0.9170 1.1904 3.2252 1.0220 1.7694 1.9746 6.3933 2.2264 19.774 20.080 34.099 20.603
f24 0.8524 0.8876 3.0515 0.9155 2.1714 2.3371 7.4302 2.2409 27.788 28.683 48.198 30.323
f25 1.7712 1.7444 3.1339 1.3754 4.3758 4.8407 6.1250 3.2938 43.072 39.470 55.078 36.628
f26 0.9106 1.0148 3.1231 0.9738 2.4627 2.7735 7.2352 2.7038 32.836 29.920 50.964 31.561
f27 1.1138 1.3224 3.3533 1.1104 2.5048 2.6674 7.2206 2.5800 31.451 30.150 49.787 30.473
f28 1.2222 1.4668 4.1626 1.2195 3.5642 3.9902 10.517 3.2182 38.691 36.357 68.179 34.504
f29 1.0966 1.3934 3.7319 1.1590 3.1218 3.4753 9.4217 3.3503 38.543 34.566 63.073 35.554
f30 2.6720 2.8880 7.4206 1.7886 18.596 18.388 42.611 9.7193 123.56 112.32 232.17 66.808
f31 1.7015 1.8401 7.6064 1.7903 10.778 11.055 47.069 10.846 75.367 71.202 158.61 71.200
f32 3.1428 3.2470 3.5468 2.3599 20.267 19.641 23.297 13.698 126.76 118.78 141.60 89.543
f33 2.9542 3.0356 3.4211 2.2892 18.551 19.640 21.374 13.428 113.96 117.04 141.22 86.993
f34 0.9252 0.9350 1.2183 1.0248 2.1709 2.3344 2.6765 2.2566 25.608 27.282 35.855 28.432
f35 0.8423 0.8700 1.2973 1.0422 1.7228 1.8032 2.2248 2.1158 17.845 19.501 29.766 20.061
f36 0.8350 0.8721 0.9789 0.8616 2.8885 2.6290 2.9400 2.4078 32.632 33.352 37.919 30.774
f37 2.0409 1.9244 2.0440 1.5517 5.7502 5.6135 5.9412 5.4480 57.998 54.666 70.146 51.174
f38 0.7877 0.7935 1.0855 0.8856 1.8148 2.0308 2.2275 2.0979 21.103 22.379 33.532 20.961
f39 0.6917 0.7805 1.0480 0.8216 1.5529 1.7632 2.0900 1.9024 17.592 18.031 27.122 17.378
f40 0.9866 0.9951 1.4408 1.0357 2.6572 3.2554 3.5524 2.9192 30.038 32.951 42.208 29.856
Table 6
Comparison results of eight methods for 40 benchmark problems with 100 dimensions.
Function Index MSCA CCAA MPA HGS GWO WOA HHO VBOA
f38 Mean 2.29E-52 0.00E+00 0.00E+00 4.90E-166 4.53E-12 4.93E-69 3.10E-100 0.00E+00
    St.dev 5.09E-52 0.00E+00 0.00E+00 0.00E+00 2.53E-12 9.46E-69 5.32E-100 0.00E+00
f39 Mean 9.93E+01 0.00E+00 4.18E+00 1.84E+01 4.53E+01 4.56E+01 1.12E-02 5.25E-02
    St.dev 0.00E+00 0.00E+00 3.18E-01 4.12E+01 2.61E+00 3.01E+01 9.86E-03 3.77E-02
f40 Mean 0.00E+00 5.58E-16 1.04E-11 0.00E+00 3.51E-05 0.00E+00 0.00E+00 0.00E+00
    St.dev 0.00E+00 1.25E-15 1.29E-11 0.00E+00 6.83E-05 0.00E+00 0.00E+00 0.00E+00
+/≈/- 23/16/1 6/30/4 4/33/3 24/12/4 35/2/3 28/8/4 22/13/5 –
Fig. 8. Mean Friedman test ranks of eight algorithms on 40 functions with 100 dimensions.

VBOA. The formulation and descriptions of the functions are outlined in Table 1. These test functions have different characteristics, i.e., unimodal, multi-modal, non-separable, and composition characteristics. Functions with different characteristics are used to test different capabilities of an algorithm. For instance, the unimodal functions are very useful for testing the local search capability of algorithms since they have only one global optimum. The multimodal functions have many local optimal solutions and are very suitable for evaluating the global search capability of algorithms.

4.2. Compared with butterfly optimization algorithm and its variants

To verify the effectiveness of VBOA, the forty classical benchmark problems in Table 1 are selected. The dimension of the functions is set to 100. In addition, the basic BOA (Arora & Singh, 2019), the opposition-based learning BOA (OLBOA), and the mutualism BOA (MBOA) (Sharma & Saha, 2020) are also selected to solve these test functions, and the simulation results are compared with VBOA. For fair comparison, the parameters of the four optimizers are set as follows: the population size is 30, the maximum number of iterations is 500, the switch probability p = 0.7, the power exponent a = 0.1, and the sensory modality c = 0.01. In VBOA, w_max = 1, w_min = 0, the coefficient factor b_3 = 2, the refraction index η = 2, and k = 2. The four algorithms are run under the same conditions. To reduce random error, each algorithm is independently executed for 30 runs. The experimental results of the four algorithms for the average value (Mean) and standard deviation (St.dev) are provided in Table 2. It should be noted that the best value for each function is highlighted in bold. Additionally, the Wilcoxon sign test is utilized to investigate the statistical performance between VBOA and the three compared algorithms. The symbols "+", "≈", and "−" indicate that VBOA is superior to, similar to, and inferior to the corresponding method, respectively.

From Table 2, VBOA obtains the theoretical optimal values (0) on all of the functions except for seven benchmark functions (i.e., f7, f12, f19, f32, f33, f36, and f39). With respect to the basic BOA, VBOA provides better and similar results on 34 and five functions (i.e., f5, f12, f13, f18, and f28), respectively. For f32, the better value is obtained by the basic BOA. The performance of VBOA is superior to OLBOA on 34 benchmark test functions. However, similar results are found by VBOA and OLBOA on six functions (i.e., f5, f12, f13, f18, f28, and f40). Compared with MBOA, VBOA finds better results on 30 test functions. For f5, f12, f13, f17, f18, f20, f24, f28, f29, and f33, VBOA and MBOA achieve similar results. To further show the effectiveness of VBOA, the convergence graphs of the four methods for twelve representative problems (i.e., f1, f3, f6, f8, f14, f16, f19, f22, f26, f30, f34, and f37) are shown in Fig. 4. As can be seen from Fig. 4, the proposed VBOA has faster convergence and higher accuracy than the three compared methods on the twelve representative problems with 100 dimensions.

4.3. Scalability test

The impact of dimension on the optimization results and algorithm feasibility is analyzed by the scalability test experiment. To further test the scalability of the proposed method, VBOA is applied to solve higher-dimensional functions (i.e., 1000 and 10,000 dimensions). In this experiment, the same parameter settings as in the experiments above are used. The Mean and standard deviation (St.dev) values obtained by VBOA and the other three algorithms, and the results of the Wilcoxon sign rank test, are shown in Tables 3 and 4, respectively. The best value for each function is highlighted in bold.

From Tables 3-4, the basic BOA, OLBOA, and VBOA show very good scalability with respect to the space dimension on all of the problems with 1000 and 10,000 dimensions; in other words, as the problem dimension increases, the performance of the three algorithms does not seriously deteriorate. It should be pointed out that an optimization problem with 1000 or 10,000 dimensions is very challenging for BOA because it does not employ any specific mechanism customized for dealing with optimization problems of high dimensionality. Additionally, MBOA also displays quite strong scalability with respect to the space dimension for all the functions except f8, f10, and f39; for these three functions, the performance of MBOA seriously deteriorates. Compared with BOA and OLBOA, VBOA obtains better and similar values for 32 and eight problems (i.e., f5, f12, f13, f18, f28, f33, f36, and f40) with 1000 and 10,000 dimensions, respectively. With respect to MBOA, VBOA finds better and similar values for 27 and twelve benchmark problems, respectively. However, for f32, the better value is obtained by MBOA.

In addition, Figs. 5-6 provide the convergence graphs of the four methods for twelve representative problems with 1000 and 10,000 dimensions. As can be seen from Figs. 5-6, VBOA reveals obvious advantages over the other three approaches in convergence speed and solution accuracy.

According to the Friedman test of the average (Mean) results in Tables 2-4, Fig. 7 plots the average ranks of the four approaches on the 40 benchmark test problems with 100, 1000, and 10,000 dimensions. For each function, the best-performing algorithm has a rank of 1 while the worst has a rank of 4. As seen from Fig. 7, for each case, VBOA ranks first, followed by MBOA, OLBOA, and BOA, respectively.
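The ranking protocol behind the Friedman-test figures (best method gets rank 1 per function, ties share the average rank, ranks are then averaged over all functions) can be sketched as follows. The code and the demo numbers are ours, invented for illustration, not values from Tables 2-4.

```python
def ranks(row):
    # Competition-style ranks (1 = best, i.e., lowest mean error);
    # tied values share the average of the positions they occupy.
    order = sorted(range(len(row)), key=lambda i: row[i])
    out = [0.0] * len(row)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and row[order[j + 1]] == row[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of 1-based positions i..j
        for p in range(i, j + 1):
            out[order[p]] = avg
        i = j + 1
    return out

def mean_ranks(results):
    # results[f][m]: mean value of method m on benchmark function f.
    per_fun = [ranks(row) for row in results]
    n = len(results)
    return [sum(r[m] for r in per_fun) / n for m in range(len(results[0]))]

demo = [[1e-3, 0.0, 1e-6, 0.0],      # two methods tie for best on function 1
        [2.0, 1.0, 0.5, 0.1]]
print(mean_ranks(demo))
```

The method with the lowest mean rank across all functions is the overall winner, which is how Figs. 7-8 summarize the per-function tables.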
Fig. 9. Convergence graphs of three algorithms on eight representative functions with 100 dimensions.
similar results on 28 and eight benchmark test functions (i.e., f5, f13, f18, f20, f24, f28, f29, and f40), respectively. However, compared with VBOA, better results are obtained by CCAA, HGS, and WOA on f12, f32, f33, and f36, respectively. The results of VBOA are equal to MPA on 33 benchmark functions. VBOA is superior to MPA on four functions (i.e., f7, f36, f39, and f40). For f12, f32, and f33, the better results are obtained by MPA. Compared to the GWO algorithm, VBOA obtains better and similar results on 35 and two functions (i.e., f5 and f13), respectively.
Table 8
Comparison results of VBOA using different k and η values for 40 benchmark functions with 100 dimensions.
Function k = 0.1, η = 0.1 k = 0.5, η = 0.5 k = 1.0, η = 1.0 k = 1.5, η = 1.5 k = 2.0, η = 2.0
Mean St.dev Mean St.dev Mean St.dev Mean St.dev Mean St.dev
f1 2.51E-15 2.37E-15 1.06E-108 2.25E-108 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f2 2.02E-09 3.45E-09 1.68E-82 3.63E-82 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f3 6.40E-121 1.40E-120 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f4 1.71E+00 2.41E+00 2.76E-07 4.21E-07 1.17E-56 1.71E-56 4.00E-216 0.00E+00 0.00E+00 0.00E+00
f5 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f6 1.59E-248 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f7 1.88E+00 2.39E+00 5.31E-02 5.71E-02 9.23E-05 6.04E-05 4.93E-05 6.25E-05 2.91E-05 2.17E-05
f8 5.02E-04 4.80E-04 3.00E-08 6.15E-08 1.62E-98 2.34E-98 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f9 2.53E-22 4.76E-22 4.16E-77 1.31E-78 1.00E-289 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f10 3.40E+05 5.45E+05 9.83E-07 1.48E-06 6.03E-26 1.32E-25 6.10E-202 0.00E+00 0.00E+00 0.00E+00
f11 1.75E-14 2.40E-14 4.70E-114 1.10E-113 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f12 9.90E+01 8.42E-03 9.90E+01 1.74E-02 9.90E+01 1.28E-02 9.90E+01 1.74E-02 9.90E+01 2.40E-03
f13 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f14 3.50E-02 5.26E-02 5.59E-10 6.70E-10 1.56E-114 2.07E-114 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f15 4.50E-99 1.01E-98 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f16 6.75E-29 1.83E-28 3.66E-121 4.63E-121 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f17 6.40E-96 1.43E-95 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f18 2.20E+01 4.32E+01 4.10E-11 9.17E-11 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f19 1.83E+01 3.38E+00 1.96E-05 3.70E-05 8.88E-16 0.00E+00 8.88E-16 0.00E+00 8.88E-16 0.00E+00
f20 3.38E-01 6.20E-01 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f21 6.95E-02 7.38E-02 4.27E-09 4.63E-09 1.04E-68 1.74E-68 1.71E-239 0.00E+00 0.00E+00 0.00E+00
f22 4.87E-01 1.09E+00 2.19E-47 3.95E-47 2.90E-277 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f23 1.38E+01 7.75E-01 8.54E-01 3.72E-01 1.42E-01 5.10E-02 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f24 3.99E+00 7.37E+00 1.43E-08 2.95E-08 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f25 2.37E+01 2.18E+00 2.27E+01 2.53E+00 5.69E+00 1.66E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f26 5.87E-02 4.13E-02 3.11E-14 3.08E-14 6.78E-131 7.22E-131 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f27 6.73E-02 7.01E-02 7.54E-17 1.04E-16 2.70E-153 5.70E-153 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f28 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f29 8.44E-05 1.07E-04 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f30 5.55E+00 7.40E+00 1.13E-01 2.48E-01 7.55E-34 1.37E-33 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f31 3.74E-04 8.03E-04 1.83E-12 3.31E-12 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f32 1.27E+00 5.38E-02 1.29E+00 3.36E-02 1.26E+00 8.24E-02 1.21E+00 1.38E-01 1.15E+00 4.29E-02
f33 1.00E+01 2.45E-04 1.00E+01 5.70E-04 1.00E+01 4.72E-04 1.00E+01 1.92E-03 1.00E+01 1.65E-03
f34 6.25E-40 1.89E-39 7.77E-148 4.65E-148 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f35 5.00E-01 3.70E-05 4.95E-01 7.24E-03 4.45E-02 6.52E-03 2.37E-02 2.18E-02 0.00E+00 0.00E+00
f36 7.42E-41 5.06E-41 3.16E-41 1.44E-41 1.10E-41 5.34E-42 8.13E-42 6.81E-42 4.96E-42 8.54E-42
f37 8.38E-15 7.29E-15 2.70E-94 3.83E-94 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f38 3.79E-135 5.25E-135 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
f39 8.10E+01 2.75E+01 1.59E-01 1.36E-01 1.24E-01 8.43E-02 7.53E-02 4.86E-02 5.25E-02 3.77E-02
f40 1.49E+01 2.11E+01 4.56E-02 1.02E-01 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00 0.00E+00
However, for f12, f32, and f33, the better values are obtained by GWO. With respect to HHO, VBOA provides better and similar results on 22 and 13 functions (i.e., f5, f13, f17-f20, f24, f25, f28, f29, f31, f35, and f40), respectively. Meanwhile, the better results are obtained by HHO on functions f12, f32, f33, f36, and f39. In addition, Fig. 8 displays the average Friedman test ranks of the eight algorithms on the 40 functions with 100 dimensions. As shown in Fig. 8, CCAA ranks first, followed by the VBOA, MPA, HHO, HGS, MSCA, WOA, and GWO algorithms.

4.6. Effectiveness of the velocity component

As mentioned previously, the velocity-based position update equation is a critical strategy for balancing convergence and diversity in VBOA. The purpose of this subsection is to study the effectiveness of the velocity component. Thus, two additional experiments are conducted on the 40 benchmark problems in Table 1, with the dimensions of the selected functions set to 100. In the first experiment, the velocity strategy is used to improve the position update equation of the conventional BOA in the global search phase (VBOA-1); the strategy in the local search phase is unchanged. In the second experiment, the velocity operator is used to modify the position update equation of the classical BOA in the local search phase (abbreviated as VBOA-2); similar to (Arora & Singh, 2019), the strategy (i.e., Eq. (2)) in the global search phase is used. The population size and the maximum iteration number of the two methods are set to 30 and 500, respectively. Table 7 outlines the mean and standard deviation values of the two methods and the basic BOA over 30 trials.

From Table 7, compared with the conventional BOA, VBOA-1 shows better and similar results on three functions (f36, f39, and f40) and three functions (f5, f13, and f28), respectively. However, the better results for the remaining 34 functions are obtained by BOA. This phenomenon fully shows that the velocity operator used in the global search phase cannot effectively balance convergence and diversity. Moreover, with respect to the basic BOA, VBOA-2 finds better and similar values for 31 and five problems (i.e., f5, f12, f13, f18, and f28), respectively. However, the better values for four functions (f10, f25, f32, and f35) are obtained by the basic BOA. This is easy to understand, since the velocity strategy used in the local search phase can achieve a good tradeoff between convergence and diversity. According to the above comparison experiments, using the velocity operator to modify the position update equation in the local search phase is an effective and efficient strategy in VBOA.

To further display the effectiveness of the velocity strategy in VBOA, Fig. 9 provides the convergence graphs of the conventional BOA, VBOA-1, and VBOA-2 on eight representative functions with 100 dimensions. As seen from Fig. 9, the velocity operator used in the local search phase (i.e., VBOA-2) obtains good performance in convergence speed and solution quality.

4.7. The impact of k and η

As mentioned in subsection 3.2, the refraction-based learning strategy (i.e., Eq. (18)) is effective at enhancing the exploration capability
W. Long et al. Expert Systems With Applications 201 (2022) 117217
Fig. 10. Convergence graphs of different k and η values on four representative functions.
during the search process of BOA. However, in Eq. (18), determining the values of k and η is important. In this subsection, several experiments are conducted to investigate the impact of k and η. In each experiment, the values of k and η are varied while keeping the other parameters fixed. The settings k = 0.1, η = 0.1; k = 0.5, η = 0.5; k = 1.0, η = 1.0; and k = 1.5, η = 1.5 are examined in Table 8 on the 40 benchmark functions with 100 dimensions. It should be pointed out that k = 2.0 and η = 2.0 are used in the previous experiments; for comparison, the results for k = 2.0, η = 2.0 are also listed in Table 8. In addition, Fig. 10 shows the convergence graphs of VBOA with different k and η values on four representative functions.

As can be seen from Table 8 and Fig. 10, the comprehensive performance of VBOA with k = 2.0 and η = 2.0 is better than that of VBOA with other k and η values. In addition, compared with VBOA under k = 2.0 and η = 2.0, VBOA with k = 1.5 and η = 1.5 obtains similar results on

Table 9
The detailed characteristics of the ten selected high-dimensional datasets.
Number Dataset Number of features Number of samples
1 BreastEW 30 569
2 Clean1 166 476
3 Clean2 166 6598
4 IonosphereEW 34 351
5 KrvskpEW 36 3196
6 PenglungEW 325 73
7 Semeion 265 1593
8 SonarEW 60 208
9 SpectEW 22 267
10 WaveformEW 40 5000
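The two VBOA ingredients examined above, the velocity and memory items added to the local search update (subsection 4.6) and the refraction-based learning controlled by k and η (subsection 4.7), can be sketched as follows. Since Eqs. (2) and (18) are not reproduced in this excerpt, both update rules below (a PSO-style velocity term and a common refraction-learning formulation) are illustrative assumptions, not the paper's exact equations; the function names are likewise hypothetical.

```python
import random

def velocity_local_update(x, v, p_best, g_best, w=0.7, c1=1.5, c2=1.5):
    # PSO-inspired local-search move: inertia-weighted velocity item plus
    # memory items pulling towards the personal best and the global best.
    # (Assumed form; VBOA's actual local-phase equation is not shown here.)
    new_v = [w * vi
             + c1 * random.random() * (pb - xi)
             + c2 * random.random() * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, p_best, g_best)]
    new_x = [xi + nvi for xi, nvi in zip(x, new_v)]
    return new_x, new_v

def refraction_learning(x, lb, ub, k=2.0, eta=2.0):
    # Refraction-based opposite solution controlled by the scaling factor k
    # and the refractive index eta (a common formulation; with k = eta = 1
    # it reduces to plain opposition-based learning, x* = lb + ub - x).
    return [(l + u) / 2 + (l + u) / (2 * k * eta) - xi / (k * eta)
            for xi, l, u in zip(x, lb, ub)]
```

Under this formulation, with k = η = 2.0 (the setting the experiments above find best) a point x in [-1, 1] is mapped to -x/4, i.e. an opposite point pulled towards the centre of the interval, which injects diversity without leaving the bounds.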
Table 10
Classification accuracy comparison between VBOA and other nine algorithms on ten high-dimensional datasets.
Dataset PSO ALO MPA FPA OBBOA GA CSA AGWO CSHO VBOA
BreastEW 0.9378 0.9392 0.9882 0.9382 0.9705 0.9488 0.9460 0.9340 0.9502 0.9638
Clean1 0.8549 0.8465 0.9509 0.8697 0.9018 0.8697 0.8798 0.8555 0.8739 0.8849
Clean2 0.9465 0.9496 0.9823 0.9493 0.9702 0.9423 0.9514 0.9478 0.9465 0.9637
IonosphereEW 0.8803 0.8595 0.9762 0.8898 0.9095 0.8938 0.8943 0.8932 0.8943 0.9081
KrvskpEW 0.9200 0.9006 0.9864 0.9106 0.9489 0.9215 0.9079 0.9163 0.9360 0.9646
PenglungEW 0.8140 0.8072 1.0000 0.8355 0.9286 0.6721 0.8054 0.8541 0.8590 0.8879
Semeion 0.9676 0.9714 0.9895 0.9716 0.9843 0.9757 0.9676 0.9709 0.9731 0.9731
SonarEW 0.8667 0.8487 0.9918 0.8827 0.8374 0.8750 0.8538 0.8827 0.8808 0.8914
SpectEW 0.7841 0.7881 0.9308 0.8119 0.8490 0.8097 0.7925 0.8134 0.8179 0.8652
WaveformEW 0.7192 0.7066 0.7867 0.7187 0.7463 0.6921 0.7139 0.7174 0.7218 0.8078
Table 11
Average number of the optimal feature subsets of ten methods for ten high-dimensional datasets.
Dataset PSO ALO MPA FPA OBBOA GA CSA AGWO CSHO VBOA
BreastEW 18.33 24.27 6.0 21.6 15.7 12.2 14.4 19.2 12.2 11.6
Clean1 104.93 132 42.3 125.8 80 98.9 85.2 93 79.6 71.2
Clean2 109.4 95 76.0 109 82 94.1 84.6 103 101.6 79.6
IonosphereEW 19.2 20.13 5.3 19 11 13.5 15.6 22.6 13 12.2
KrvskpEW 25.6 35.8 22.3 28.8 18 18 18.4 27.4 18.2 16
PenglungEW 183.33 172.07 10.3 196.4 142.3 153 157.8 158.6 151.8 144.3
Semeion 171.6 187.8 89.7 164.8 133 149.4 141.6 187.8 139.4 133
SonarEW 37.6 48 17.3 42 32 30.3 30.2 47 28 23
SpectEW 12.07 13.87 8.0 12.2 12.7 7.0 11.8 17.8 7.6 8.0
WaveformEW 35.8 39.6 23.3 30.2 21.3 30.4 22.4 34.6 29.6 19.4
Fig. 11. Mean Friedman test ranks of ten algorithms on ten high-dimensional datasets.
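The mean Friedman ranks summarized in Fig. 11 are obtained by ranking the algorithms on each dataset (rank 1 = best) and averaging each algorithm's rank over all datasets, with ties receiving the average of the tied ranks. A minimal sketch of that computation, using made-up accuracy values rather than the paper's data:

```python
def mean_friedman_ranks(scores, higher_is_better=True):
    """scores: dict {algorithm: [score on dataset 1, dataset 2, ...]}.
    Returns {algorithm: mean rank across datasets}; tied scores share
    the average of the ranks they span."""
    algos = list(scores)
    n_datasets = len(next(iter(scores.values())))
    totals = {a: 0.0 for a in algos}
    for d in range(n_datasets):
        col = [(scores[a][d], a) for a in algos]
        col.sort(key=lambda t: t[0], reverse=higher_is_better)
        i = 0
        while i < len(col):
            j = i
            while j < len(col) and col[j][0] == col[i][0]:
                j += 1  # find the run of tied scores
            avg_rank = (i + 1 + j) / 2  # average of ranks i+1 .. j
            for _, a in col[i:j]:
                totals[a] += avg_rank
            i = j
    return {a: totals[a] / n_datasets for a in algos}

# Toy example (illustrative values only): lower mean rank = better overall.
acc = {"VBOA": [0.96, 0.81], "MPA": [0.99, 0.79], "PSO": [0.94, 0.72]}
print(mean_friedman_ranks(acc))  # {'VBOA': 1.5, 'MPA': 1.5, 'PSO': 3.0}
```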
Table 13
The average classification accuracy of nine algorithms for three fault types of wind turbine pitch system.
Dataset PSO GWO WOA SCA SOA HHO MPA BOA VBOA
Dataset-1 0.9981 0.9991 0.9989 0.9968 1.0000 1.0000 1.0000 0.9979 1.0000
Dataset-2 0.9991 0.9988 0.9981 0.9995 0.9988 0.9993 1.0000 0.9968 1.0000
Dataset-3 0.9965 0.9965 0.9965 0.9980 0.9974 0.9970 1.0000 0.9944 1.0000
Total ranking 7 6 8 5 4 3 1 9 1
Table 14
The average number of optimal features obtained by the nine algorithms for three fault types of the wind turbine pitch system.
Dataset PSO GWO WOA SCA SOA HHO MPA BOA VBOA
Table 15
The average CPU runtime (in seconds) of nine algorithms for three fault types of wind turbine pitch system.
Dataset PSO GWO WOA SCA SOA HHO MPA BOA VBOA
Dataset-1 20.254 17.392 9.6216 16.300 8.9943 32.654 21.567 22.664 18.562
Dataset-2 15.135 14.607 7.3502 12.253 14.859 12.129 12.719 14.691 8.9595
Dataset-3 21.706 18.442 10.676 19.349 21.982 21.409 13.347 22.414 13.402
Total ranking 8 4 1 3 6 7 4 9 2
search space of FS is represented by binary values. It should be pointed out that VBOA is a continuous-space optimization technique, so it needs to be converted from a continuous version to a binary one when solving feature selection problems. Using a transfer function is one of the easiest methods; its advantage is that the framework of the BOA algorithm does not need to be modified. In this paper, the following S-shaped transfer function is used:

T(x) = 1 / (1 + e^(-τx))  (19)

where τ is a constant.

Ten high-dimensional benchmark datasets from the University of California Irvine (UCI) repository (Mafarja & Mirjalili, 2017) are used in this paper. These datasets are widely discussed in the literature for studying the performance of optimization techniques. Table 9 outlines the characteristics of the ten selected high-dimensional datasets.

For all the datasets, the wrapper method on feature subset selection is used with a k-Nearest Neighbors (KNN) classifier (k = 5). The number of runs is 20, the population size is 10, the number of iterations is 100, and the problem dimension is the number of features of each dataset. The performance of VBOA is compared with PSO (Shi & Eberhart, 1998), the ant lion optimizer (ALO) (Mirjalili, 2015), MPA (Abd Elminaam et al., 2021), the flower pollination algorithm (FPA) (Yang et al., 2014), opposition-based BOA (OBBOA) (Assiri, 2021), the genetic algorithm (GA) (Holland & Goldberg, 1989), the crow search algorithm (CSA) (Askarzadeh, 2016), augmented GWO (AGWO) (Qais et al., 2018), and the chaotic selfish herd optimizer (CSHO) (Mafarja & Mirjalili, 2017). Table 10 shows the classification accuracy of the ten methods on the ten high-dimensional datasets. The average number of the optimal feature subsets for the ten algorithms is outlined in Table 11.

From Table 10, VBOA performs better than the PSO, ALO, FPA, CSA, AGWO, and CSHO algorithms on all the datasets in classification accuracy. The result of MPA is much superior to VBOA on nine datasets, while VBOA obtains better results on the WaveformEW dataset. Compared with GA, VBOA finds better results on all the datasets except for the Semeion dataset. Compared to the OBBOA algorithm, VBOA obtains better results on the KrvskpEW, SonarEW, SpectEW, and WaveformEW datasets and worse results on the other six datasets.

As shown in Table 11, with respect to PSO, ALO, SBO, FPA, CSA, and AGWO, VBOA gets a better average number of the optimal feature subsets on all the datasets. The result of MPA is better on seven datasets; however, the better results on two datasets (i.e., KrvskpEW and WaveformEW) are obtained by VBOA, and for the SpectEW dataset the two algorithms obtain similar results. Compared with OBBOA, VBOA provides better and similar results on seven and one datasets, respectively. VBOA is superior to GA and CSHO on all the datasets except for the SpectEW dataset.

In addition, the mean Friedman ranking test results of the ten algorithms on the ten high-dimensional datasets for classification accuracy and the number of optimal feature subsets are plotted in Fig. 11.

4.9. Fault identification of wind turbine

To further investigate its performance on real applications, VBOA is applied to identify faults of wind turbines. As an important part of the wind turbine, accurate fault identification of the pitch system is of great significance for improving the safety and economy of wind turbine operation. Based on actual operation data of the wind turbine pitch system, machine learning fault identification models such as neural networks, random forests, and support vector machines have been proposed by researchers in recent years. Furthermore, there are many types of data collected by the supervisory control and data acquisition (SCADA) system of wind turbines, including fault-related feature data, irrelevant feature data, and related but redundant feature data. The high dimensionality of the extracted mixed feature set and the serious redundancy between features reduce the fault identification accuracy of machine learning models and prolong the diagnosis time. Therefore, this subsection proposes a novel intelligent fault identification method for the wind turbine pitch system based on VBOA and K-Nearest Neighbor (KNN).

In the proposed method, VBOA is used to select the optimal feature subset and is combined with a KNN classifier for identifying faults of the wind turbine pitch system. In the experiment, the data are real pitch system fault data of wind turbines from a wind farm in East China, covering the pitch system emergency stop fault, the pitch system supercapacitor voltage low fault, and the pitch paddle 3 supercapacitor voltage low fault. Table 12 gives the number of fault features and fault samples of the three fault types.

In this experiment, VBOA and the wrapper method on feature subset selection are utilized with a KNN classifier (K = 3). Each dataset is randomly divided into two parts, i.e., 80% for the training set and 20% for testing. The population size of the nine algorithms is 10 and the maximum iteration number is 100. The performance of VBOA is compared with PSO, GWO, WOA, the sine cosine algorithm (SCA) (Mirjalili, 2016), the seagull optimization algorithm (SOA) (Dhiman & Kumar, 2019), HHO, MPA, and BOA. The classification accuracy, the number of optimal features, the CPU runtime, and the Friedman rank test results of the nine algorithms are shown in Tables 13-15.

As seen in Table 13, the fault identification success rate of VBOA and MPA reaches 100%. Compared with PSO, GWO, WOA, SCA, SOA, and BOA, VBOA provides better classification accuracy on the three datasets. VBOA, SOA, and HHO get similar identification success rates on dataset-1; however, the better results are obtained by VBOA on dataset-2 and dataset-3. Based on the Friedman rank test results, VBOA and MPA obtain the first rank, followed by HHO, SOA, SCA, GWO, PSO, WOA, and BOA. Regarding the rank results of the optimal feature number in Table 14, WOA ranks first, followed by HHO, VBOA, MPA, SCA, SOA, GWO, PSO, and BOA. It can be seen from Table 15 that WOA achieves the least CPU runtime for all three datasets; VBOA also shows competitive performance in terms of CPU runtime. According to the above comparison results, it can be concluded that VBOA is a promising candidate technique for fault identification of wind turbines.

4.10. Analysis and discussion

It can be observed in Table 2 that VBOA is superior to the basic BOA on 34 benchmark functions and BOA surpasses VBOA on only function f32. This may be due to the fact that, in VBOA, the modified position update equation in the local search phase is capable of enhancing the exploitation ability of the basic BOA, and the refraction-based learning (RBL) strategy employs the good characteristic of opposition-based learning (OL), generating a more efficient and promising candidate solution to improve the exploration ability of the algorithm.
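Before continuing the discussion, the continuous-to-binary conversion of Eq. (19) used in the feature selection experiments above can be sketched in code. Eq. (19) maps each component of a continuous position to a selection probability; comparing that probability against a uniform random draw is the usual convention for S-shaped transfer functions (the text above does not spell this step out), and τ = 1 is an assumed value here:

```python
import math
import random

def s_transfer(x, tau=1.0):
    # S-shaped transfer function of Eq. (19): T(x) = 1 / (1 + exp(-tau * x)).
    return 1.0 / (1.0 + math.exp(-tau * x))

def binarize(position, tau=1.0, rng=random.random):
    # Turn a continuous position into a 0/1 feature mask: feature j is
    # selected when a uniform random draw falls below T(x_j).
    return [1 if rng() < s_transfer(xj, tau) else 0 for xj in position]

# T(0) = 0.5, and large |x| pushes the selection probability towards 1 or 0.
print(s_transfer(0.0))                            # 0.5
print(binarize([10.0, -10.0], rng=lambda: 0.5))   # [1, 0]
```

The resulting mask can then be scored by a wrapper fitness (e.g., KNN classification accuracy on the selected columns), which is the evaluation loop the experiments above describe.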
Combining the modified position update equation and the RBL strategy can remedy the shortcomings of the basic BOA. Therefore, the proposed VBOA can achieve a good balance between exploration and exploitation and has faster convergence than the basic BOA. At the same time, it can also be seen in Table 2 that OLBOA is worse than VBOA on all the benchmark functions except for six functions. This is probably due to the fact that, in OLBOA, the OL strategy is introduced to discover useful information and generate promising candidate solutions for enhancing diversity; however, OL is only a special case of the RBL strategy, so the diversity-enhancing ability of OL is weaker than that of RBL. From Table 2, the performance of MBOA is worse than that of VBOA on 30 benchmark test functions. This may be owing to the fact that, in MBOA, the mutualism phase of symbiosis organisms search (SOS) is embedded in the basic BOA to effectively exploit the search space; however, MBOA only obtains good results on most unimodal and a few multimodal problems. This means that MBOA's ability to search high-dimensional solution spaces is insufficient. In addition, it can be observed from Tables 3-4 that the optimization capabilities of BOA, OLBOA, MBOA, and VBOA on all of the functions with 1000 and 10,000 dimensions are similar to those with 100 dimensions. As seen in Table 5, the average CPU execution times of VBOA and BOA are very close to each other for all of the functions with 100, 1000, and 10,000 dimensions. In other words, the computational complexity of VBOA does not increase compared with that of the basic BOA. It can be seen from Table 6 that VBOA obtains competitive performance compared with other meta-heuristic algorithms. Based on the above analysis, it can be concluded that VBOA is a promising candidate technique for solving high-dimensional optimization problems.

While VBOA obtains good performance on high-dimensional function optimization, it still has some limitations in solving complex problems. From Tables 2-4, VBOA does not obtain the theoretical optimal value (0) on seven benchmark test functions (i.e., f7, f12, f19, f32, f33, f36, and f39). The reasons are analyzed as follows. Function f7 is a unimodal problem with noise; the Gaussian noise ensures that VBOA never gets the same value at the same point. Function f12 is a unimodal problem with a long narrow valley; VBOA fails on this function because it cannot keep up with the direction changes in the narrow curved valley. Function f19 is a multimodal problem with a flat surface; this flatness does not provide any information to guide the search process towards the global optimum. Function f36 is a non-separable, scalable, and highly multimodal problem; VBOA fails on this function because the interaction of variables limits the extent to which a variable can be optimized close to its optimal value. Functions f32, f33, and f39 are extremely complex multimodal problems; VBOA cannot explore the entire search space effectively when solving them. Furthermore, the dimensionality of the solution space is also a crucial aspect of these problems.

With respect to high-dimensional benchmark feature selection, the solution efficiency of VBOA on five datasets, namely Clean1, PenglungEW, SonarEW, SpectEW, and WaveformEW, is not sufficient. However, VBOA achieves very competitive performance on the three datasets of wind turbine fault identification. The possible reasons are as follows. Essentially, VBOA is a continuous-space optimization technique that must be converted from a continuous version to a binary one when solving feature selection problems. Therefore, choosing an appropriate conversion technique is very important: the performance depends not only on the proposed VBOA, but also on the conversion technique. In other words, the proposed VBOA is unlikely to work well on all datasets.

5. Conclusions

In this paper, a modified butterfly optimization algorithm is developed for dealing with high-dimensional optimization and feature selection problems. On the one hand, using the velocity and individual memory items to modify the position update equation in the local search phase effectively achieves a good tradeoff between convergence and diversity. On the other hand, a novel refraction-based learning (RBL) mechanism is designed to prevent the population from falling into local optima. To study its effectiveness and feasibility, the results of VBOA on 40 high-dimensional benchmark function optimization problems (dimensions ranging from 100 to 10,000), 10 high-dimensional benchmark feature selection problems, and one wind turbine fault identification problem are compared against the basic BOA, OLBOA, MBOA, MSCA, CCAA, HGS, GWO, WOA, HHO, ALO, MPA, FPA, OBBOA, GA, CSA, AGWO, CSHO, PSO, SCA, and SOA. The statistical results demonstrate that VBOA is much superior to the other methods on high-dimensional function optimization and feature selection problems in terms of solution precision, convergence speed, classification accuracy, and the number of optimal feature subsets. In the future, the proposed method will be applied to constrained single- and multi-objective problems as well as more real-world applications.

CRediT authorship contribution statement

Wen Long: Writing – review & editing, Conceptualization, Methodology. Ming Xu: Software, Visualization, Methodology, Writing – review & editing. Jianjun Jiao: Writing – review & editing, Visualization. Tiebin Wu: Writing – review & editing, Software, Visualization. Mingzhu Tang: Investigation, Formal analysis, Visualization, Writing – review & editing. Shaohong Cai: Conceptualization, Methodology, Supervision, Investigation.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgement

This work was supported by the National Natural Science Foundation of China (61463009, 62173050), the Natural Science Foundation of Guizhou Province, China ([2020]1Y012), the Innovation Group Projects of the Education Department of Guizhou Province, China (KY[2021]015), the Guizhou Key Laboratory of Big Data Statistics Analysis (BDSA20200101, BDSA20190106), the Natural Science Foundation of Hunan Province, China (2020JJ4382), and the Key Projects of the Education Department of Hunan Province, China (19A254).

References

Abd Elminaam, D. S., Nabil, A., Ibraheem, S. A., & Houssein, E. H. (2021). An efficient marine predators algorithm for feature selection. IEEE Access, 9, 60136–60153. https://doi.org/10.1109/ACCESS.2021.3073261
Arora, S., & Anand, P. (2019). Binary butterfly optimization approaches for feature selection. Expert Systems with Applications, 116, 147–160. https://doi.org/10.1016/j.eswa.2018.08.051
Arora, S., & Singh, S. (2019). Butterfly optimization algorithm: A novel approach for global optimization. Soft Computing, 23, 715–734. https://doi.org/10.1007/s00500-018-3102-4
Askarzadeh, A. (2016). A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Computers & Structures, 169, 1–12. https://doi.org/10.1016/j.compstruc.2016.03.001
Aslimani, N., & Ellaia, R. (2018). A new hybrid algorithm combining a new chaos optimization approach with gradient descent for high dimensional optimization problems. Computational and Applied Mathematics, 37, 2460–2488. https://doi.org/10.1007/s40314-017-0454-9
Assiri, A. S. (2021). On the performance improvement of butterfly optimization approaches for global optimization and feature selection. PLoS ONE, 16, Article e0242612. https://doi.org/10.1371/journal.pone.0242612
Born, M., & Wolf, E. (1959). Principles of optics. New York, NY, USA: Pergamon Press.
Cheng, R., & Jin, Y. (2015). A competitive swarm optimizer for large scale optimization. IEEE Transactions on Cybernetics, 45, 191–204. https://doi.org/10.1109/TCYB.2014.2322602
De Falco, I., Cioppa, A. D., & Trunfio, G. A. (2019). Investigating surrogate-assisted cooperative coevolution for large-scale global optimization. Information Sciences, 482, 1–26. https://doi.org/10.1016/j.ins.2019.01.009
Dhiman, G., & Kumar, V. (2019). Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowledge-Based Systems, 165, 169–196. https://doi.org/10.1016/j.knosys.2018.11.024
Faramarzi, A., Heidarinejad, M., Mirjalili, S., & Gandomi, A. H. (2020). Marine predators algorithm: A nature-inspired metaheuristic. Expert Systems with Applications, 152, Article 113377. https://doi.org/10.1016/j.eswa.2020.113377
Gungor, I., Emiroglu, B. G., Cinar, A. C., & Kiran, M. S. (2020). Integration search strategies in tree seed algorithm for high dimensional function optimization. International Journal of Machine Learning & Cybernetics, 11, 249–267. https://doi.org/10.1007/s13042-019-00970-1
Guo, Y., Liu, X., & Chen, L. (2019). Improved butterfly optimization algorithm based on guiding weight and population restart. Journal of Experimental & Theoretical Artificial Intelligence, 33, 127–145. https://doi.org/10.1080/0952813X.2020.1725651
Gupta, S., Deep, K., Mirjalili, S., & Kim, J. H. (2020). A modified sine cosine algorithm with novel transition parameter and mutation operator for global optimization. Expert Systems with Applications, 154, Article 113395. https://doi.org/10.1016/j.eswa.2020.113395
Heidari, A. A., Mirjalili, S., Faris, H., Aljarah, I., Mafarja, M., & Chen, H. (2019). Harris hawks optimization: Algorithm and applications. Future Generation Computer Systems, 97, 849–872. https://doi.org/10.1016/j.future.2019.02.028
Holland, J., & Goldberg, D. (1989). Genetic algorithms in search, optimization and machine learning. USA: Addison-Wesley.
Hussain, K., Salleh, M. N. M., Cheng, S., & Naseem, R. (2017). Common benchmark functions for meta-heuristic evaluation: A review. JOIV: International Journal on Informatics Visualization, 1, 218–223. https://doi.org/10.30630/joiv.1.4-2.65
Imanian, N., Shiri, M. E., & Moradi, P. (2014). Velocity based artificial bee colony algorithm for high dimensional continuous optimization problems. Engineering Applications of Artificial Intelligence, 36, 148–163. https://doi.org/10.1016/j.engappai.2014.07.012
Jalali, S. M. J., Ahmadian, S., Kebria, P. M., Khosravi, A., Lim, C. P., & Nahavandi, S. (2019). Evolving artificial neural networks using butterfly optimization algorithm for data classification. Proceedings of International Conference on Neural Information Processing, 2019, 596–607.
Jamil, M., & Yang, X. S. (2013). A literature survey of benchmark functions for global optimization problems. International Journal of Mathematical Modelling and Numerical Optimization, 4, 150–194. https://doi.org/10.1504/IJMMNO.2013.055204
Jayanthi, M., Sampath, K. S., Mohana, R., Ananth, V., Vengattaraman, T., & Dhavachelvan, P. (2015). Gene suppressor: An added phase toward solving large scale optimization problems in genetic algorithm. Applied Soft Computing, 35, 214–226. https://doi.org/10.1016/j.asoc.2015.06.017
Jia, Y., Chen, W., Gu, T., Zhang, H., Yuan, H., Kwong, S., & Zhang, J. (2019). Distributed cooperative co-evolution with adaptive computing resource allocation for large scale optimization. IEEE Transactions on Evolutionary Computation, 23, 188–202. https://doi.org/10.1109/TEVC.2018.2817889
Kar, D., Ghosh, M., Guha, R., Sarkar, R., Garcia-Hernandez, L., & Abraham, A. (2020). Fuzzy mutation embedded hybrids of gravitational search and particle swarm optimization methods for engineering design problems. Engineering Applications of Artificial Intelligence, 95, Article 103847. https://doi.org/10.1016/j.engappai.2020.103847
Kennedy, J., & Eberhart, R. C. (1995). Particle swarm optimization. Proceedings of the IEEE International Conference on Neural Networks, 4, 1942–1948.
Li, G., Shuang, F., Zhao, P., & Le, C. (2019). An improved butterfly optimization algorithm for engineering design problems using the cross-entropy method. Symmetry-Basel, 11, 1–20. https://doi.org/10.3390/sym11081049
Li, Z., Wang, W., Yan, Y., & Li, Z. (2015). PS-ABC: A hybrid algorithm based on particle swarm and artificial bee colony for high-dimensional optimization problems. Expert Systems with Applications, 42, 8881–8895. https://doi.org/10.1016/j.
Mirjalili, S. (2015). The ant lion optimizer. Advances in Engineering Software, 83, 80–93. https://doi.org/10.1016/j.advengsoft.2015.01.010
Mirjalili, S. (2016). SCA: A sine cosine algorithm for solving optimization problems. Knowledge-Based Systems, 96, 120–133. https://doi.org/10.1016/j.knosys.2015.12.022
Mirjalili, S., & Lewis, A. (2016). The whale optimization algorithm. Advances in Engineering Software, 95, 51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008
Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf optimizer. Advances in Engineering Software, 69, 46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
Mortazavi, A., & Moloodpoor, M. (2021). Enhanced butterfly optimization algorithm with a new fuzzy regulator strategy and virtual butterfly concept. Knowledge-Based Systems, 228, Article 107291. https://doi.org/10.1016/j.knosys.2021.107291
Murugeswari, R., Radhakrishnan, S., & Devaraj, D. (2016). A multi-objective evolutionary algorithm based QoS routing in wireless mesh networks. Applied Soft Computing, 40, 517–525. https://doi.org/10.1016/j.asoc.2015.12.007
Ning, Y., Peng, Z., Dai, Y., Bi, D., & Wang, J. (2019). Enhanced particle swarm optimization with multi-swarm and multi-velocity for optimizing high-dimensional problems. Applied Intelligence, 49, 335–351. https://doi.org/10.1007/s10489-018-1258-3
Omidvar, M. N., Li, X. D., Mei, Y., & Yao, X. (2014). Cooperative co-evolution with differential grouping for large scale optimization. IEEE Transactions on Evolutionary Computation, 18, 378–393. https://doi.org/10.1109/TEVC.2013.2281543
Qais, M. H., Hasanien, H. M., & Alghuwainem, S. (2018). Augmented grey wolf optimizer for grid-connected PMSG-based wind energy conversion systems. Applied Soft Computing, 69, 504–515. https://doi.org/10.1016/j.asoc.2018.05.006
Qiao, W., & Yang, Z. (2019). Modified dolphin swarm algorithm based on chaotic maps for solving high-dimensional function optimization problems. IEEE Access, 7, 110472–110486. https://doi.org/10.1109/ACCESS.2019.2931910
Rao, R. V., Savsani, V. J., & Vakharia, D. P. (2012). Teaching-learning-based optimization: An optimization method for continuous nonlinear large scale problems. Information Sciences, 183, 1–15. https://doi.org/10.1016/j.ins.2011.08.006
Seck-Tuoh-Mora, J. C., Hernandez-Romero, N., Lagos-Eulogio, P., Medina-Marin, J., & Zuniga-Pena, N. S. (2021). A continuous-state cellular automata algorithm for global optimization. Expert Systems with Applications, 177, Article 114930. https://doi.org/10.1016/j.eswa.2021.114930
Shams, I., Mekhilef, S., & Soon, T. K. (2021). Maximum power point tracking using modified butterfly optimization algorithm for partial shading, uniform shading, and fast varying load conditions. IEEE Transactions on Power Electronics, 36, 5569–5581. https://doi.org/10.1109/TPEL.2020.3029607
Shao, P., Wu, Z., Zhou, X., & Tran, D. C. (2017). FIR digital filter design using improved particle swarm optimization based on refraction principle. Soft Computing, 21, 2631–2642. https://doi.org/10.1007/s00500-015-1963-3
Sharma, S., & Saha, A. K. (2020). m-MBOA: A novel butterfly optimization algorithm enhanced with mutualism scheme. Soft Computing, 24, 4809–4827. https://doi.org/10.1007/s00500-019-04234-6
Shi, Y. H., & Eberhart, R. (1998). A modified particle swarm optimizer. In Proceedings of the IEEE International Conference on Evolutionary Computation (pp. 69–73).
Sowjanya, K., & Injeti, S. K. (2021). Investigation of butterfly optimization and gases Brownian motion optimization algorithm for optimal multilevel image thresholding. Expert Systems with Applications, 182, Article 115286. https://doi.org/10.1016/j.eswa.2021.115286
Sun, L., Chen, S., Xu, J., & Tian, Y. (2019a). Improved monarch butterfly optimization algorithm based on opposition-based learning and random local perturbation. Complexity, 2019, Article 4182148. https://doi.org/10.1155/2019/4182148
Sun, Y., Yang, T., & Liu, Z. (2019b). A whale optimization algorithm based on quadratic interpolation for high-dimensional global optimization problems. Applied Soft
eswa.2015.07.043 Computing, 85, Article 105744. https://doi.org/10.1016/j.asoc.2019.105744
Liu, H., Wang, Y., Tu, L., Ding, G., & Hu, Y. (2019). A modified particle swarm Tan, L.S., Zainuddin, Z., & Ong, P. (2020). Wavelet neural networks based solutions for
optimization for large-scale numerical optimizations and engineering design elliptic partial differential equations with improved butterfly optimization algorithm
problems, Journal of Intelligent Manufacturing, 30, 2407-2433. https://doi.org/ training, Applied Soft Computing, 95, Article 106518. https://doi.org/ 10.1016/j.
10.1007/s10845- 018-1403-1. asoc. 2020.106518.
Long, W., Jiao, J., Liang, X., & Tang, M. (2018a). An exploration-enhanced grey wolf Tian, J., Sun, C., Tan, Y., & Zeng, J. (2020). Granularity-based surrogate-assisted particle
optimizer to solve high-dimensional numerical optimization. Engineering Applications swarm optimization for high-dimensional expensive optimization. Knowledge-Based
of Artificial Intelligence, 68, 63–80. https://doi.org/10.1016/j.engappai.2017.10.024 Systems, 187, Article 104815. https://doi.org/10.1016/j.knosys.2019.06.023
Long, W., Jiao, J., Liang, X., & Tang, M. (2018b). Inspired grey wolf optimizer for solving Tuo, S., Zhang, J., Yong, L., Yuan, X., Liu, B., Xu, X., & Deng, F. (2015). A harmony
large-scale function optimization problems. Applied Mathematical Modelling, 60, search algorithm for high-dimensional multimodal optimization problems. Digital
112–126. https://doi.org/10.1016/j.apm.2018.03.005 Signal Processing, 46, 151–163. https://doi.org/10.1016/j.dsp.2015.08.008
Long, W., Jiao, J., Liang, X., Wu, T., Xu, M., & Cai, S. (2021a). Pinhole-imaging-based Trisolini, M., Lewis, H. G., & Colombo, C. (2018). Spacecraft design optimization for
learning butterfly optimization algorithm for global optimization and feature demise and survivability. Aerospace Science & Technology, 77, 638–657. https://doi.
selection. Applied Soft Computing, 103, Article 107146. https://doi.org/10.1016/j. org/10.1016/j.ast.2018.04.006
asoc. 2021.107146 Wang, C., & Gao, J.-H. (2014). A differential evolution algorithm with cooperative
Long, W., Wu, T., Jiao, J., Tang, M., & Xu, M. (2020). Refraction-learning-based whale coevolutionary selection operation for high- dimensional optimization. Optimization
optimization algorithm for high-dimensional problems and parameter estimation of Letters, 8, 477–492. https://doi.org/10.1007/s11590-012-0592-3
PV model. Engineering Applications of Artificial Intelligence, 89, Article 103457. Wang, L., Xiong, Y., Li, S., & Zeng, Y. (2019). New fruit fly optimization algorithm with
https://doi.org/10.1016/j.engappai.2019.103457 joint search strategies for function optimization problems. Knowledge-Based Systems,
Long, W., Wu, T., Liang, X., & Xu, S. (2019). Solving high-dimensional global 176, 77–96. https://doi.org/10.1016/j.knosys.2019.03.028
optimization problems using an improved sine cosine algorithm. Expert Systems with Yang, X. S., Karamanoglu, M., & He, X. (2014). Flower pollination algorithm: A novel
Applications, 123, 108–126. https://doi.org/10.1016/j.eswa.2018.11.032 approach for multiobjective optimization. Engineering Optimization, 46, 1222–1237.
Long, W., Wu, T., Xu, M., Tang, M., & Cai, S. (2021b). Parameters identification of https://doi.org/10.1080/0305215X.2013.832237
photovoltaic models by using an enhanced adaptive butterfly optimization Yang, Y., Chen, H., Heidari, A.A. & Gandomi, A.H. (2021). Hunger games search: Visions,
algorithm. Energy, 229, Article 120750. https://doi.org/10.1016/j. conception, implementation, deep analysis, perspectives, and towards performance
energy.2021.120750 shifts, Expert Systems with Applications, 177, Article 114864. https://doi.org/
Mafarja, M. M., & Mirjalili, S. (2017). Hybrid whale optimization algorithm with 10.1016/j.eswa.2021.114864.
simulated annealing for feature selection. Neurocomputing, 260, 302–312. https://
doi.org/10.1016/j.neucom.2017.04.053
W. Long et al. Expert Systems With Applications 201 (2022) 117217
Yuan, Z., Wang, W., Wang, H., & Hossein, K. (2020). Improved butterfly optimization algorithm for CCHP driven by PEMFC. Applied Thermal Engineering, 173, Article 114766. https://doi.org/10.1016/j.applthermaleng.2019.114766
Yurtkuran, A., & Emel, E. (2015). An adaptive artificial bee colony algorithm for global optimization. Applied Mathematics and Computation, 271, 1004–1023. https://doi.org/10.1016/j.amc.2015.09.064
Zarshenas, A., & Suzuki, K. (2016). Binary coordinate ascent: An efficient optimization technique for feature subset selection for machine learning. Knowledge-Based Systems, 110, 191–201. https://doi.org/10.1016/j.knosys.2016.07.026