
Received 19 June 2023, accepted 11 July 2023, date of publication 17 July 2023, date of current version 21 July 2023.

Digital Object Identifier 10.1109/ACCESS.2023.3296255

Boosted Barnacles Algorithm Optimizer:


Comprehensive Analysis for Social
IoT Applications
MOHAMMED A. A. AL-QANESS 1, AHMED A. EWEES 2 (Senior Member, IEEE),
MOHAMED ABD ELAZIZ 3,4,5,6, ABDELGHANI DAHOU 7, MOHAMMED AZMI AL-BETAR 5,8,
AHMAD O. ASEERI 9 (Member, IEEE), DALIA YOUSRI 10, AND REHAB ALI IBRAHIM 3
1 College of Physics and Electronic Information Engineering, Zhejiang Normal University, Jinhua 321004, China
2 Department of Computer, Damietta University, Damietta 34517, Egypt
3 Department of Mathematics, Faculty of Science, Zagazig University, Zagazig 44519, Egypt
4 Faculty of Computer Science and Engineering, Galala University, Suez 435611, Egypt
5 Artificial Intelligence Research Center (AIRC), College of Engineering and Information Technology, Ajman University, Ajman, United Arab Emirates
6 Department of Electrical and Computer Engineering, Lebanese American University, Byblos 13-5053, Lebanon
7 Department of Mathematics and Computer Science, University of Ahmed Draia, Adrar 01000, Algeria
8 Department of Information Technology, Al-Huson University College, Al-Balqa’ Applied University, Irbid 19117, Jordan
9 Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia
10 Department of Electrical Engineering, Faculty of Engineering, Fayoum University, Fayoum 63514, Egypt

Corresponding authors: Mohamed Abd Elaziz ([email protected]) and Ahmad O. Aseeri ([email protected])
This work was supported by the Deputyship for Research and Innovation, Ministry of Education, Saudi Arabia, under Project
IF-PSAU-2022/01/19574.

ABSTRACT The Social Internet of Things (SIoT) has revolutionized user experience through various
applications and networking services like Social Health Monitoring, Social Assistance, Emergency Alert
Systems, and Collaborative Learning Platforms. However, transferring different types of data between the
interconnected objects in the SIoT environment, including sensor data, user-generated data, and social
interaction data, poses challenges due to their high dimensionality. This paper presents an alternative SIoT
method that improves resource efficiency, system performance, and decision-making using the Barnacles
Mating Optimizer (BMO). The BMO incorporates Triangular mutation and dynamic Opposition-based
learning to enhance search space exploration and prevent getting stuck in local optima. Two experiments were
conducted using UCI datasets from different applications and SIoT-related datasets. The results demonstrate
that the developed method, DBMT, outperforms other algorithms in predicting social-related datasets in the
IoT environment.

INDEX TERMS Social IoT, Barnacles Mating Optimizer, triangular mutation, opposition-based learning.

I. INTRODUCTION
The Internet of Things (IoT) will significantly impact our lives in the future [1]. There are different technologies and applications related to the IoT, such as healthcare [2], smart homes [3], elderly care [4], human motion tracking [5], the internet of medical things [6], and the industrial internet of things [7], [8]. In addition, the SIoT is an emerging area that combines social networks and IoT. The SIoT tends to enhance the services for end-users by incorporating information transferred in a SIoT between different objects in a social and autonomous relationship.

The data transferred across IoT devices, social networks, and sensors differ in many aspects, such as operating environment, communication protocol, and data type (text, image, video, time-series data), which raise many challenges in terms of transferring, processing, and analyzing the data [9]. The SIoT ecosystem should use robust algorithms to effectively manage data collected from different objects and enhance the user experience.

(The associate editor coordinating the review of this manuscript and approving it for publication was Mauro Gaggero.)

For instance, SIoT can provide and boost the scalability, safety, reliability, and effectiveness of many services such as healthcare, smart city, event and crisis monitoring, and transportation [10]. However, processing vast amounts of data with high dimensionality created by SIoT applications is a major difficulty. In high-dimensional data, redundant and irrelevant features increase computing complexity and degrade the accuracy and efficiency of classification systems [11].

In recent years, metaheuristic optimization algorithms have been applied to different IoT applications. For example, Janakiraman [12] developed an effective cluster head selection method for IoT appliances connected to wireless sensor networks, proposing a hybrid optimization approach that combines the Ant Colony Optimizer (ACO) and the Artificial Bee Colony (ABC) algorithm; the combination overcomes the limitations of each algorithm on its own. Srinivas and Manivannan [13] applied the rider optimization algorithm with a Deep Belief Network (DBN) to build an efficient method to detect and prevent HELLO flooding attacks in medical IoT-WSN systems. A new variant of the rider optimizer was suggested to boost the DBN's performance and select the optimal shortest route path of the network. They used several indicators to set the optimal path, such as transmission delay, node trust, packet loss ratio, and distance between nodes. Vimal et al. [14] developed a multi-objective ACO and greedy-based optimizer with deep learning models for energy management in IoT-based cognitive radio networks. The Q-learning method was employed to boost data clustering, and the multi-objective ACO algorithms were used for jamming prediction to enhance the network's energy. Ramteke et al. [15] applied a new particle swarm optimization (PSO) variant using genetic mutation to select the control nodes for IoT-enabled software-defined heterogeneous wireless sensor networks. The suggested method was applied to heterogeneous networks with various computational power and single or multiple sinks. The evaluation experiments were conducted on a performance matrix scale that included stability period, fitness value, and average residual energy. Under various network configurations, the simulation outcomes of the proposed approach outperformed other techniques.

Maddikunta et al. [16] presented a hybrid method based on moth flame optimization (MFO) and the whale optimization algorithm (WOA) to solve energy utilization in IoT networks. The hybrid model selected the optimal cluster head (CH) to improve other energy factors, and this combined optimization algorithm showed acceptable performance compared to other basic algorithms. In [17], the authors suggested an adaptive PSO Convolutional Neural Network (APSO-CNN) for IoT intrusion detection.

Goyal et al. [18] presented a modified PSO algorithm to boost the physiological calculation accuracy of sensor-data fusion in IoT systems. It can be utilized to automatically diagnose brain fatalities and natural epilepsy from EEG signals from medical health centers. The main application of the PSO in neurological disability diagnosis is to help optimize the propagation of EEG and neural networks. In [19], an enhanced version of the artificial ecosystem-based optimization (AEO) was proposed. The proposed optimization method, AEOSSA, depends on employing the operators of the salp swarm algorithm (SSA) to augment the exploitation capability of the AEO algorithm. The AEOSSA was used to solve scheduling problems in IoT fog computing environments and outperformed several comparative methods, including the original AEO and SSA algorithms. Bahmanyar et al. [20] applied a modified arithmetic optimization algorithm (AOA), called multi-objective AOA, to solve the energy scheduling of home appliances in IoT applications. It was employed with the Raspberry Pi minicomputer with Node-RED and NodeMCU models. The main task of the multi-objective AOA is to obtain the best schedule patterns for reducing daily electricity consumption and the peak-to-average ratio. It was compared to several competitor optimization algorithms, such as PSO, antlion optimization (ANO), and the grey wolf optimizer (GWO). Padmaa et al. [21] presented a new method called "energy-efficient clustering-based secure data transmission protocol (EEC-SDTP)" with oppositional chaos game optimization for IoT edge systems. The proposed approach was employed to find the optimal set of cluster heads (CHs) and the optimal path for data transmission. In [22], the whale optimization algorithm (WOA) was employed for IoT service placement in fog computing environments. The energy consumption and throughput were used as objective functions to obtain the optimal IoT service placement plan that meets the quality of service (QoS) requirements for desired IoT services.

Following the successful applications of MH optimization techniques in different IoT fields [23], [24], [25], this paper presents a new approach for social IoT applications based on improving the performance of the Barnacles Mating Optimizer (BMO) [26]. This was conducted using the advantages of Triangular mutation (TM) and dynamic Opposition-based learning (OBL) to improve the process of balancing convergence towards the optimal subset of features and to avoid attraction to local points. The main reason for using BMO is that it has been proven efficient as an optimizer in different applications. For example, Li et al. [27] presented a BMO variant called Converged BMO for fuel cell/battery vehicle degradation. Rizk-Allah and El-Fergany [28] used a modified version of the BMO, called the neighborhood strategy-based Laplacian BMO, to address the solar cell diode problem. Jia and Sun [29] applied an improved BMO, called IBMO, to optimize the support vector machine classifier. They used three search techniques, refraction learning, Gaussian mutation, and a logistic model, to boost the search capability of the BMO. Sulaiman and Mustaffa [30] applied the original BMO algorithm to address the problem of optimal chiller loading.


In [31], a hybrid of BMO, the cuckoo search algorithm, and PSO was developed to optimize the power system stabilizer's parameters. In [32], the BMO was developed using a logistic approach and a chaotic map to solve multilevel thresholding for color images. In addition, by comparing BMO with other MH techniques, it has been noticed that the BMO keeps the population's genetic diversity intact. This is conducted by introducing a mutation operator; as a result, premature convergence is prevented and various areas of the search domain can be explored. The BMO balances exploration and exploitation effectively compared to other MH techniques. The mating and reproduction processes allow for exploration by generating new offspring with diverse characteristics, while selection according to fitness promotes exploitation of solutions with higher fitness values. According to these characteristics, BMO can be applied to different real-world problems. This flexibility enables it to handle high-dimensional SIoT data without specific domain knowledge or constraints. BMO's population-based nature and genetic operations make it adaptable to many problems, including those with high-dimensional data.

Despite these advantages, BMO still needs further improvement, especially in increasing its ability to explore the search domain and in avoiding attraction towards local points. This motivated us to use the strengths of Triangular mutation and Opposition-based learning (OBL). Each of these mechanisms has been employed to augment the search process of different MH algorithms. In [33], OBL was utilized to augment the performance of the Aquila optimizer (AO); the boosted AO method was then applied with the adaptive neuro-fuzzy inference system to forecast crude oil production. Similarly, OBL enhanced the slime mould algorithm (SMA) in [34]. Additionally, in [35], the authors applied OBL to boost the search capability of the arithmetic optimization algorithm (AOA), which was utilized for text clustering. In addition, plenty of researchers have proven the OBL strategy's efficiency. For instance, the use of OBL with Harris Hawks optimization (HHO) [36] and an improved sine cosine algorithm (SCA) [37] enhanced the searching ability by escaping from local optima. Moreover, the authors of [38] proposed a hybridized grey wolf optimizer (GWO) that implemented OBL to avoid local optima and modified the parameter "C" strategy to balance between exploration and exploitation. OBL was also intended to improve the exploration of the salp swarm algorithm (SSA) [39], where the authors applied OBL in each iteration, which, as a result, increased the search complexity. In addition, OBL was used with the improved grasshopper optimization algorithm [40] to solve benchmark functions and engineering optimization problems.

Moreover, triangular mutation techniques are often used to boost the search capability of MH techniques. Zhang et al. [41] applied a logarithmic spiral strategy and a triangular mutation mechanism to improve teaching-learning-based optimization (TLBO) for solving global optimization problems. In [42], the triangular Gaussian mutation was used to enhance the differential evolution (DE) algorithm [43], [44].

This article presents a significant contribution: the DBMT (Dynamic Barnacles Mating Optimizer) algorithm, an upgraded version of BMO (Barnacles Mating Optimizer). DBMT incorporates Triangular mutation and dynamic Opposition-based learning (OBL) techniques. The DBMT algorithm optimizes convergence toward optimal solutions by selecting an appropriate initial population. Additionally, the Triangular mutation method improves the diversity of solutions and explores the search space more effectively, leading to better feature discovery. The DBMT brings potential enhancements to SIoT applications and services, including superior accuracy and reliability in predicting social behaviors and patterns within the SIoT environment. The main contributions of this paper can be formulated as follows:
• Propose an alternative FS method for IoT datasets based on a modification of BMO to address the high dimensionality of SIoT data.
• Use Triangular mutation and dynamic Opposition-based learning (OBL) to increase BMO's performance. These modifications aim to improve the algorithm's exploration ability and its ability to find optimal solutions.
• Use a collection of different social IoT datasets to assess the efficiency of the produced DBMT, in addition to comparing the performance of the developed DBMT against state-of-the-art FS methods.

II. PRELIMINARIES
This section introduces the Barnacles Mating Optimizer (BMO), Triangular mutation, and dynamic Opposition-based learning.

A. BARNACLES MATING OPTIMIZER (BMO)
The BMO algorithm was proposed by Sulaiman and his team [26]. The algorithm is a population-based bio-inspired optimizer, where the barnacle's mating behavior was behind this algorithm's invention. The algorithm has three main steps:
1) Initialization
2) Selection Process
3) Reproduction
These steps are discussed in the upcoming sub-sections. In addition, Fig. 1 demonstrates the flowchart of the BMO algorithm.

1) INITIALIZATION
BMO is a population-based algorithm that starts by initializing the population solutions. In this step, the population is considered to represent the barnacles as candidates.

FIGURE 1. BMO flowchart.

Consequently, if we assume that the population consists of n barnacles and N control variables, the resultant population will be illustrated as follows [26]:

X = [ x_1^1 ... x_1^N ; ... ; x_n^1 ... x_n^N ]   (1)

where each control variable (x) is bounded between the upper and lower bounds of the optimization problem, in which ub refers to the upper bound and lb to the lower bound. The ub and lb are represented using vectors as follows [26]:

ub = [ub_1, ..., ub_i]   (2)
lb = [lb_1, ..., lb_i]   (3)

After establishing the population, the candidates will be evaluated using the problem's objective function and sorted accordingly, where the best barnacle will be placed at the top of the population (i.e., it will be denoted as T).

2) SELECTION PROCESS
In order to perform mating and reproduction, the parents must be chosen first. In the case of BMO, the selection is modeled mathematically as in Equations 4 and 5 [26], where Xd and Xm are the selected barnacle's Dad and Mum, respectively. Also, the selection is random (randperm) and within the number of barnacles (n).

Xd = randperm(n)   (4)
Xm = randperm(n)   (5)

The parent selection procedure is done based on the following assumptions:
1) The parent selection is executed in a random manner (as represented in Eqs. 4 and 5), but restricted to be within the length of the barnacle penis (pl).
2) Each barnacle may contribute its sperm and receive others' sperm. Although in real life the female barnacle can be fertilized by more than one male [45], in BMO each barnacle is fertilized by only one barnacle at a time.
According to the above selection assumptions, the mating categories can be:
1) Self-mating: the random selection ends with self-selection (Xd == Xm). Referring to [46], self-mating is very rare; consequently, it is not considered in BMO.
2) Sperm-cast mating: the random selection is more than the pl.
3) Normal mating: the random selection is within the pl.
The exploration and the exploitation in BMO are determined by the pl, such that sperm-cast mating mimics the exploration process, while normal mating represents the exploitation phase of the algorithm.
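For illustration, the listing below is a minimal Python sketch of the initialization and random parent selection described by Eqs. (1)-(5); the dimensionality, bounds, and population size used here are placeholder values for the example and are not settings taken from the paper.

import numpy as np

def initialize_population(n, N, lb, ub, rng):
    # Eqs. (1)-(3): n barnacles, each with N control variables bounded by lb and ub.
    return lb + rng.random((n, N)) * (ub - lb)

def select_parents(n, rng):
    # Eqs. (4)-(5): random permutations give the Dad and Mum indices.
    dad = rng.permutation(n)
    mum = rng.permutation(n)
    return dad, mum

rng = np.random.default_rng(0)
lb, ub = np.zeros(10), np.ones(10)      # illustrative 10-dimensional search space
X = initialize_population(25, 10, lb, ub, rng)
Xd, Xm = select_parents(25, rng)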


3) REPRODUCTION
The off-spring generation in BMO is based on characteristics inheritance, which is actually based on the Hardy-Weinberg principle [26]. In simple words, the reproduction is performed using either Eq. 6 or Eq. 7, which refer to normal mating and sperm-cast mating (exploitation and exploration), respectively.

x_i^new = p × x_d^N + q × x_m^N,   for k ≤ pl   (6)
x_i^new = rand() × x_m^N,          for k > pl   (7)

where the k value is calculated as the absolute difference between the parents (k = |Xd − Xm|). k is used to check whether the selection is within the pl or not; consequently, the mating category will be decided.

Starting with normal mating (Eq. 6), the off-spring barnacle will inherit p percent of the characteristics of the Dad and q percent of the characteristics of the Mum, where p is a random percentage between 0 and 1, and q is (1 − p).

Next is sperm-cast mating, where only the barnacle's Mum will be considered since it receives the sperm. As per the previous characteristics-inheritance idea, the off-spring barnacle will inherit the Mum's characteristics with a certain percentage. Therefore, the Mum's variables will be multiplied by a random number between 0 and 1.

Finally, the previously mentioned steps can be summarized in the pseudo-code of Algorithm 1, which starts with population initialization, followed by fitness function calculation and population sorting. Accordingly, the top barnacle will be the best candidate, named T. Next, according to the value of pl, the selection and mating will be processed. Moreover, a variable-boundaries check will be performed. Those steps are iteratively repeated until a pre-defined maximum number of iterations, and finally, T will be retrieved.

Algorithm 1 The BMO's Pseudo-Code
1: Generate the population of barnacles Xi
2: Compute the fitness of each barnacle
3: Sort the population
4: T ← best barnacle (at the top of the population)
5: while L < Maximum iterations do
6:   Set the value of pl
7:   Select the parents using Eqs. 4 and 5:
8:     Xd = randperm(n)
9:     Xm = randperm(n)
10:  Selection conditions and off-spring generation:
11:    k ← |Xd − Xm|
12:  if selection of Dad and Mum ≤ pl then
13:    p ← rand()
14:    q ← (1 − p)
15:    for each variable do
16:      Initialize off-spring using Eq. 6:
17:        x_i^new = p x_d^N + q x_m^N, for k ≤ pl
18:  else if selection of Dad and Mum > pl then
19:    for each variable do
20:      Initialize off-spring using Eq. 7:
21:        x_i^new = rand() x_m^N, for k > pl
22:  Bring the current barnacle back if it goes out of the boundaries
23:  Compute each barnacle's fitness function
24:  Sort the population
25:  Update T if there is any better solution
26:  L = L + 1
27: Return T
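The reproduction rules of Eqs. (6)-(7) and the loop of Algorithm 1 can be sketched in Python as below. This is a minimal illustration on a generic minimization problem; the sphere objective, the pl value, and the elitist survivor selection used to merge parents and off-spring are assumptions made for this example rather than details specified in the paper.

import numpy as np

def bmo(f, lb, ub, n=25, iters=50, pl=7, seed=0):
    rng = np.random.default_rng(seed)
    dim = lb.size
    X = lb + rng.random((n, dim)) * (ub - lb)            # initial barnacles
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        order = np.argsort(fit); X, fit = X[order], fit[order]
        dad, mum = rng.permutation(n), rng.permutation(n)  # Eqs. (4)-(5)
        k = np.abs(dad - mum)
        off = np.empty_like(X)
        for i in range(n):
            if k[i] <= pl:                               # normal mating, Eq. (6)
                p = rng.random()
                off[i] = p * X[dad[i]] + (1 - p) * X[mum[i]]
            else:                                        # sperm-cast mating, Eq. (7)
                off[i] = rng.random(dim) * X[mum[i]]
        off = np.clip(off, lb, ub)                       # boundary check (Algorithm 1, line 22)
        off_fit = np.apply_along_axis(f, 1, off)
        # survivor selection: keep the n best barnacles from parents and off-spring (assumed elitist merge)
        X = np.vstack([X, off]); fit = np.concatenate([fit, off_fit])
        best = np.argsort(fit)[:n]; X, fit = X[best], fit[best]
    return X[0], fit[0]                                  # T and its fitness

sphere = lambda x: float(np.sum(x**2))
best_x, best_f = bmo(sphere, lb=np.full(10, -5.0), ub=np.full(10, 5.0))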
B. TRIANGULAR MUTATION
In this section, we present the basics of Triangular Mutation [47]. As the strategy name indicates, the developed mutation relies on the convex combination of three vectors [47]: the worst, better, and best vectors. In addition, three mutation factors with values between 0 and 1 are used. Therefore, the resultant formula will be as follows:

V_i(t + 1) = X_c(t) + F_1 · (X_bs(t) − X_be(t)) + F_2 · (X_bs(t) − X_wt(t)) + F_3 · (X_be(t) − X_wt(t))   (8)

where X_wt(t), X_be(t), and X_bs(t) are the worst, better, and best vectors, respectively. Moreover, F_1, F_2, and F_3 are the mutation factors, and X_c(t) is the convex combination vector calculated using Eq. 9, in which w_1, w_2, and w_3 are real weights (w_i ≥ 0) whose sum equals 1 (Σ_{i=1}^{3} w_i = 1). Moreover, w_i is calculated as w_i = p_i / Σ_{i=1}^{3} p_i.

X_C(t) = w_1 · X_bs + w_2 · X_be + w_3 · X_wt   (9)

The developed mutation can balance between the exploration and the exploitation phases. However, since the primary mutation shows a tendency towards the exploration phase, a slight modification is applied with the help of the DE algorithm, where the developed mutation's probability is two-thirds of the fundamental rule's probability. The improved mutation method was built with the DE optimization technique, as defined in the following equation:

V_i(t + 1) = X_C(t) + F_1 · (X_bs(t) − X_be(t)) + η_1,   if (u ≤ 2/3)
V_i(t + 1) = X_r1(t) + F · (X_r1(t) − X_r3(t)),          otherwise
η_1 = F_2 · (X_bs(t) − X_wt(t)) + F_3 · (X_be(t) − X_wt(t))   (10)
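A compact Python sketch of the triangular mutation of Eqs. (8)-(10) is given below. The weighting scheme and the uniform draw u follow the description above, while the choice of the second-best candidate as the "better" vector, the factor ranges, and the random indices in the DE-style branch are illustrative assumptions.

import numpy as np

def triangular_mutation(pop, fit, rng, F=0.5):
    # Pick best, better, and worst candidates by fitness (minimization assumed).
    order = np.argsort(fit)
    x_bs, x_be, x_wt = pop[order[0]], pop[order[1]], pop[order[-1]]
    p = rng.random(3); w = p / p.sum()                   # w_i = p_i / sum(p_i), used in Eq. (9)
    x_c = w[0] * x_bs + w[1] * x_be + w[2] * x_wt        # convex combination, Eq. (9)
    F1, F2, F3 = rng.random(3)                           # mutation factors in [0, 1]
    eta1 = F2 * (x_bs - x_wt) + F3 * (x_be - x_wt)
    if rng.random() <= 2 / 3:                            # triangular branch of Eq. (10)
        return x_c + F1 * (x_bs - x_be) + eta1
    r1, r3 = rng.choice(len(pop), size=2, replace=False)  # DE-style branch (assumed index choice)
    return pop[r1] + F * (pop[r1] - pop[r3])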


C. OPPOSITION-BASED LEARNING (OBL)
OBL is a search strategy developed by H. R. Tizhoosh [48]. The strategy aims to improve the algorithm's exploitation phase and, consequently, avoid stagnation of the candidate solutions [49], eventually converging to the global optimum. The population's candidates might be close to the optimal solutions during the search; as a result, the convergence will be quick. On the other hand, if the population's candidates are far from the best solution, the convergence will be slow. In this context, the OBL strategy is useful because it produces new solutions from the opposite search regions, which may lead to the global optimum.

Proceeding to the technical implementation of OBL, assume a real number x where x ∈ [lb, ub]; the opposite of x is denoted as X^o and can be calculated using Eq. 11. Subsequently, for N-dimensional real numbers, Eq. 11 is generalized to Eq. 12.

X^o = (ub + lb) − x   (11)
X_i^o = (ub_i + lb_i) − X_i   (12)

Similarly, the dynamic opposite point of X is X^Do and is calculated using Eq. 13, where r_8 and r_9 are random values and w is a weighting agent. Subsequently, if X is a Dim-dimensional vector ([X_1, X_2, ..., X_Dim]), then the dynamic opposite value X_i^Do is calculated using Eq. 14.

X^Do = X + w × r_8 (r_9 × X^o − X),   w > 0   (13)
X_i^Do = X_i + w × rand × (rand × X_i^o − X_i),   w > 0   (14)

With regard to DOL optimization, the first step is to generate the candidates (X = [X_1, X_2, ..., X_Dim]). Next, the dynamic opposite value X_i^Do of each candidate is calculated using Eq. 14. Consequently, the best candidates are selected from X^Do ∪ X according to their fitness values. Finally, the chosen candidates are used as the current population.
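The dynamic-opposite computation of Eqs. (13)-(14) and the greedy selection from X ∪ X^Do can be sketched as follows; the value of the weighting agent w, the clipping of opposites to the bounds, and the objective f are placeholder choices for this illustration only.

import numpy as np

def dynamic_opposition(X, lb, ub, f, w=3.0, rng=None):
    rng = rng or np.random.default_rng()
    Xo = (ub + lb) - X                                   # opposite points, Eq. (12)
    r1, r2 = rng.random(X.shape), rng.random(X.shape)
    XDo = X + w * r1 * (r2 * Xo - X)                     # dynamic opposite, Eq. (14)
    XDo = np.clip(XDo, lb, ub)                           # keep opposites inside the bounds
    union = np.vstack([X, XDo])
    fit = np.apply_along_axis(f, 1, union)
    keep = np.argsort(fit)[: len(X)]                     # best |X| candidates from X ∪ X^Do
    return union[keep]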
III. PROPOSED APPROACH FOR FS
This section discusses the steps of the DBMT feature selection approach, highlighting its methodology. The primary objective of DBMT is to mitigate the influence of irrelevant features in the collected dataset by eliminating them. To address this challenge, we present an alternative feature selection (FS) method that leverages the DOL and TM strategies to enhance the performance of the BMO algorithm. The integration of these strategies aims to improve the exploration capability of the algorithm while preserving diversity during the optimization process.

In the IoT environment, the developed FS method comprises three stages. Initially, data is collected using IoT devices and transmitted through various layers until it reaches the cloud center. In our framework, if the data is required for learning, it is sent to the cloud center; otherwise, it is forwarded to Fog nodes. When the objective is learning, our main focus is to train a predictor by selecting relevant features using the modified BMO algorithm. The DBMT, our novel FS method, commences by initializing the population agents and evaluating each agent's fitness value to identify the best-performing one.

The methodology involves several steps to update the agents (solutions) and improve their quality. Firstly, the BMO operators are applied to update the agents. The dynamic OBL (DOL) is also utilized to enhance the initial solutions and select the best ones from the combined set of solutions and their opposite values. This updating process occurs during the exploration phase, where the BMO and TM operators are employed. Conversely, the BMO and DOL operators are specifically applied to the best solution. This iterative process continues until a specified stopping condition is met, yielding the best solution. When the objective is to predict the type of collected data, we rely on the distributed learned model stored in Fog computing and the relevant features extracted from the binary version of the best solution.

A. FIRST STAGE
At this stage, we use 70% and 30% of the data for training and testing, respectively. The next step is to form the initial N individuals (X), which have NF features (i.e., dimensions); this is represented as:

X_ij = lb_ij + rand × (ub_ij − lb_ij),   i = [1, 2, ..., N]   (15)

In Eq. (15), ub_ij and lb_ij (j = [1, 2, ..., D]) stand for the limits of the search domain at the j-th dimension, whereas rand ∈ [0, 1] is a random number.

The next step is to obtain the Boolean version (BX_i) of each individual X_i:

BX_ij = 1 if X_ij > 0.5, and 0 otherwise   (16)

For each X_i, the fitness value is calculated according to the chosen training samples, and this process is defined as:

Fit_i = λ × γ_i + (1 − λ) × (|BX_i| / NF)   (17)

From the definition of the fitness value (Fit), it can be seen that two terms influence it. The first term is the classification error (γ) using KNN, which is learned using the reduced training set based on the features corresponding to ones in the current solution X_i. The second term is the ratio of selected features (i.e., |BX_i| / NF), where |BX_i| stands for the total number of features corresponding to ones in BX_i. λ denotes the weighting parameter used to balance the two terms of Eq. (17).

Thereafter, the DOL is applied to enhance the value of the individuals X; however, since DOL can increase the time complexity, we use the following formula to tackle this problem:

X_i^N = X_i^DoJ if Pr_Do > 0.5, and X_i otherwise   (18)

In Eq. (18), Pr_Do represents the probability value applied to balance between using DOL or not. X^DoJ for each solution X_ij is defined as:

X_ij^DoJ = X_ij + w × rand × (rand × X_ij^o − X_ij),   w > 0   (19)

If the fitness value of X_i^N is better than that of X_i, it is preserved in the population by replacing X_i; otherwise, X_i is kept.
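As an illustration of Eqs. (16)-(17), the following Python sketch converts a continuous agent into a binary feature mask and scores it with a KNN error term plus a feature-ratio term. The value of λ, the number of neighbours, and the use of scikit-learn are assumptions made for this example, not the paper's Matlab implementation.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fitness(agent, X_train, y_train, X_val, y_val, lam=0.99, k=5):
    mask = agent > 0.5                                   # Boolean version BX_i, Eq. (16)
    if not mask.any():                                   # guard: an empty feature subset is useless
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train[:, mask], y_train)
    error = 1.0 - knn.score(X_val[:, mask], y_val)       # classification error gamma_i
    ratio = mask.sum() / agent.size                      # |BX_i| / NF
    return lam * error + (1.0 - lam) * ratio             # Eq. (17)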


FIGURE 2. The main workflow of the DBMT.


B. UPDATING PROCESS
In this stage, the integration of DOL, TM, and BMO is implemented to boost the exploration and exploitation processes. To conduct this task, the BMO and TM operators compete during the exploration phase. This is performed using the following formula:

X_i^N = use Eq. (6) if Pr_TM > R_TM, and use Eq. (10) otherwise   (20)

where R_TM and Pr_TM denote the random number and the probability used to switch between the operators of BMO and TM. The values of Pr_TM and R_TM are computed using Eq. (21) and Eq. (22), respectively:

Pr_TM = Fit_i / Σ_{i=1}^{N} Fit_i   (21)
R_TM = (max(Pr_TM) − min(Pr_TM)) × rand × min(Pr_TM)   (22)

The BMO operators during the exploitation phase are employed to enhance the solutions X, as represented by Eq. (7). The following step is to apply the DOL operator to the best solution Xb, determine which one (i.e., Xb or its opposite value) is the best, and preserve it. Following this process, the limits of the search domain are dynamically updated during the search, and this is performed as:

L_j = min(X_ij)   (23)
U_j = max(X_ij)   (24)

Finally, the stop condition is checked; when it is satisfied, Xb is returned. Otherwise, the updating process is conducted again.

Algorithm 2 Steps of DBMT
1: Inputs: Total number of agents N, maximum number of iterations T, dimension Dim of the supposed problem, lower (lb) and upper (ub) boundaries
2: Outputs: Best solution Xb
3: Use Eq. (15) from the first phase to initialize N agents (x_i, i = 1, 2, ..., N)
4: t = 0
5: while (t ≤ T) do
6:   Compute the fitness value F_i for each agent x_i
7:   Find the best fitness value F_b and its agent X_b
8:   for (each barnacle X_i) do
9:     Update the value of pl
10:    Use Eqs. (4)-(5) for selecting parents
11:    if selection of Dad and Mum ≤ pl then
12:      if Pr_i ≤ 0.5 then
13:        Apply Eq. (6) to update X_i
14:      else
15:        Apply Eq. (10) to update X_i
16:    else if selection of Dad and Mum > pl then
17:      Use Eq. (7) to update X_i
18:  Apply the DOL to the updated agents X using Eq. (18)
19:  t = t + 1
20: Return X∗
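The switching rule of Eqs. (20)-(22) and the dynamic bound update of Eqs. (23)-(24) can be written compactly as below. This is a sketch of the described logic, with bmo_offspring and triangular_mutation standing in as placeholders for the operators of Eq. (6) and Eq. (10).

import numpy as np

def exploration_update(pop, fit, bmo_offspring, triangular_mutation, rng):
    pr_tm = fit / fit.sum()                               # Eq. (21)
    r_tm = (pr_tm.max() - pr_tm.min()) * rng.random() * pr_tm.min()   # Eq. (22)
    new_pop = np.empty_like(pop)
    for i in range(len(pop)):
        if pr_tm[i] > r_tm:                               # Eq. (20): use the BMO operator
            new_pop[i] = bmo_offspring(pop, i, rng)
        else:                                             # Eq. (20): use triangular mutation
            new_pop[i] = triangular_mutation(pop, fit, rng)
    return new_pop

def dynamic_bounds(pop):
    # Eqs. (23)-(24): shrink the search limits to the region covered by the agents.
    return pop.min(axis=0), pop.max(axis=0)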

C. ASSESSMENT PHASE
Within this stage, we use the testing set to compute the efficiency of the obtained best solution Xb. This process is performed by removing the irrelevant features corresponding to zeros in the binary version of Xb. Then, the output can be predicted by applying the well-known kNN classifier. After that, several evaluation indicators are implemented to assess the quality of the obtained outputs. The steps of DBMT are illustrated in Algorithm 2.

IV. EXPERIMENTAL EVALUATION
A. EVALUATION SETUP
In this part, we detail the experiments conducted to assess the efficacy of our developed approach for processing general IoT data. In addition, the developed method is compared to several well-known feature selection approaches, such as Manta ray foraging optimization (MRFO) [50], AEO and DE (AEODE) [51], MRFO and DE (MRFODE) [52], Henry gas solubility optimization (HGSO) [53], GWO [54], GA [55], PSO [56], and SSA [57]. These methods have established their efficiency as FS techniques in various applications. To ensure a fair comparison, all algorithms were run on an 8GB RAM Intel Core i7 processor with Matlab 2018b and Windows 10 64bit in this study. The maximum iteration number is 50, whereas the population size is set to 25 [58], [59], with 30 runs for each method. In addition, the parameters of each algorithm are determined according to the original implementation.

In this experiment, we use six datasets from the ASU database [60] to fairly evaluate the performance and effectiveness. More so, we use two genomic microarray datasets, called SMK-CAN and TOX-171. Additionally, we used four image benchmark datasets, namely AR10P, PIX10P, ORL10P, and PIE10P. These are two-class and multi-class datasets for IoT applications, with 2,400 to 19,993 features. The details of the employed datasets are presented in Table 1.

TABLE 1. The detailed description of the datasets.
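A sketch of the assessment phase is given below: the binary best solution masks the test features, kNN predicts the labels, and accuracy, sensitivity, and specificity are read off a confusion matrix as in Eqs. (25)-(27) further down. The scikit-learn calls and the binary-class assumption are choices made for this example, not part of the paper's Matlab implementation.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

def assess(best_agent, X_train, y_train, X_test, y_test, k=5):
    mask = best_agent > 0.5                               # features selected by the binary Xb
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_train[:, mask], y_train)
    y_pred = knn.predict(X_test[:, mask])
    tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()   # binary-class case
    accuracy = (tp + tn) / (tp + tn + fp + fn)            # Eq. (25)
    sensitivity = tp / (tp + fn)                          # Eq. (26)
    specificity = tn / (tn + fp)                          # Eq. (27)
    return accuracy, sensitivity, specificity, int(mask.sum())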


FIGURE 3. The results of the average values (Avg).

In addition, we used the following performance metrics to assess the quality of DBMT and the other methods:
• Accuracy, defined as:
Accuracy (Acc) = (TP + TN) / (TP + TN + FP + FN)   (25)
• Sensitivity, defined as:
Sensitivity (Sens) = TP / (TP + FN)   (26)
• Specificity, defined as:
Specificity (Spec) = TN / (TN + FP)   (27)
• Average of the best fitness value Fit_b over the runs, computed as:
Average of Fit_b = (1/N_r) Σ_{i=1}^{N_r} Fit_b^i   (28)
where N_r stands for the number of runs and Fit_b^i is the best fitness value at the i-th run.
In Eqs. (25)-(27), TP, TN, FP, and FN refer to the true positives, true negatives, false positives, and false negatives, respectively.

B. EXPERIMENT 1: FS USING BENCHMARK DATASETS
In this section, the DBMT is tested using eleven datasets. The evaluation outcome of the DBMT is compared to nine algorithms to test its performance; these algorithms are BMO, MRFO, AEO, MRFOM, HGSO, bGWO, SGA, BPSO, and SSA. We use six performance measures, namely the minimum (MIN), maximum (MAX), accuracy (Acc), standard deviation (St), and average (Avg) of the fitness function values, in addition to the number of selected features. The comparison outcomes are presented in Tables 2-5.

Table 2 presents the results of the fitness function's average values (Avg). From Table 2, the DBMT obtained the best Avg in 4 out of 11 datasets (i.e., GLA-BRA-180, GLI-85, PCMAC, and SMK-CAN-187). The original BMO recorded the second rank, achieving the best Avg in 3 datasets (i.e., CLL-SUB-111, PIXRAW10P, and WARPIE10P). The MRFO obtained the third rank, followed by HGSO, MRFOM, BPSO, AEO, bGWO, and SSA. The SGA obtained the worst outcomes. Figure 3 shows the Avg of all algorithms.

Furthermore, the best outcomes of the fitness value are also presented in Table 2. We can see that the DBMT recorded the best Min values in 4 out of 11 datasets (GLI-85, PCMAC, PIXRAW10P, and SMK-CAN-187); however, it came in the second rank after the BMO, which recorded the best Min value in 5 out of 11 datasets. The MRFO obtained the third rank, followed by MRFOM, HGSO, BPSO, and AEO. The SGA also recorded the worst performance. More so, Figure 4 plots the comparative outcomes of the average of the best fitness function values.
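The per-dataset statistics reported in these tables (Avg, Min, Max, and St of the fitness over repeated runs, as in Eq. (28)) can be aggregated as in the small sketch below; the run count of 30 matches the setup above, while run_dbmt is a placeholder for one complete run of the optimizer returning its best fitness.

import numpy as np

def repeat_runs(run_dbmt, n_runs=30):
    # Collect the best fitness Fit_b from each independent run, then summarize (Eq. (28)).
    best_fits = np.array([run_dbmt(seed=s) for s in range(n_runs)])
    return {"Avg": best_fits.mean(), "Min": best_fits.min(),
            "Max": best_fits.max(), "St": best_fits.std()}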


FIGURE 4. The average results of the best values.

In terms of the Max measure, as shown in Table 2, the DBMT showed good performance compared to the other algorithms by recording the best Max values in 3 out of 11 datasets (i.e., CLL-SUB-111, GLA-BRA-180, and SMK-CAN-187), and it provided competitive Max values in the other datasets. The BMO obtained the second rank by getting the best values in 3 datasets (i.e., PCMAC, PIXRAW10P, and WARPIE10P). Although both DBMT and BMO got the best outcomes in three datasets, the average value of the DBMT was better than that of BMO. The MRFO was ranked third, followed by HGSO, BPSO, and MRFOM. Figure 5 shows the Max results of the fitness function computed by Eq. 17 for all methods.

Additionally, the standard deviation results are also listed in Table 2. Our DBMT method showed good standard deviation results, whereas the BPSO showed small values, followed by bGWO, BMO, MRFO, and HGSO. The MRFOM showed the worst results. In general, the results of all algorithms were similar to some extent. Figure 6 shows the standard deviation results of all methods.

Furthermore, the number of features selected by each algorithm is listed in Table 3, where the best algorithm is the one that selects the fewest features while obtaining high accuracy. As presented in the table, the proposed DBMT selected the smallest number of features and got the first rank; it obtained the lowest number of features in 8 out of 11 datasets. The second-ranked algorithm was the MRFO, followed by HGSO, MRFOM, BMO, bGWO, and AEO.

Moreover, Table 5 records the classification accuracy of all algorithms. The proposed DBMT achieved the best results in 36% of the datasets in this measure. Thus, it can classify the features with higher accuracy than the compared algorithms, achieving the same accuracy results as the other algorithms in 27% of the datasets. The second rank was recorded by the BMO algorithm, followed by MRFO, HGSO, BPSO, AEO, MRFOM, and SSA. The SGA algorithm obtained the lowest accuracy. In addition, Table 4 shows acceptable standard deviation values for the DBMT method compared to the other algorithms.

C. EXPERIMENT 2: COMPARISON WITH OTHER OBL TECHNIQUES
In this section, we compare the results of DBMT based on DOL with its performance using different OBL techniques. In this study, we used four OBL techniques, namely the standard OBL, quasi-reflected OBL (QROBL), super OBL (SOBL), and quasi-OBL (QOBL). In this experiment, we used six datasets, namely D1-D6, as given in Table 1.

Table 6 depicts the average accuracy, sensitivity, specificity, and fitness value for each OBL technique. From these results, it can be noticed that DOL is better than the other techniques in terms of accuracy, where it achieves nearly the highest value among the six datasets. Based on the sensitivity results, DOL and QOBL (SOBL) achieve better results on three and two datasets, respectively, whereas OBL provides better results on only one dataset, D5, where its performance is the same as that of DOL and SOBL. By analyzing the specificity results, DOL has the highest value on five datasets, and OBL has the best value only on D3. In addition, the average fitness value reflects the ability of DOL to balance between the classification error and the ratio of selected features. The same observation can be seen in Figure 7. These results indicate the high ability of DOL to improve the performance of the developed DBMT to reach the optimal features.

D. EXPERIMENT 3: SIoT
For SIoT applications, the DBMT is evaluated using several SIoT datasets, as shown in Table 7 [61]. The datasets include applications such as sensor-based human activity recognition, sentiment analysis, healthcare, and event detection. In total, we acquired eight datasets; five datasets were obtained from the UCI repository,1 including:
• GPS trajectories: used to collect GPS coordinates, trajectory id, and duration of car and bus vehicles transporting in a city.

1 https://archive.ics.uci.edu/ml/datasets.php


FIGURE 5. Max value of the fitness function for all methods.

FIGURE 6. Average of values of the standard deviation measure.

• GAS sensor data: this dataset has 100 records of 8 MOX gas sensors, temperature, and humidity data in a house environment.
• MovementAAL ("Indoor User Movement Prediction from RSS"): uses a Wireless Sensor Network (WSN) to collect the movement data of a user in an office environment, based on the radio signal strength (RSS) from various WSN nodes, to predict object (i.e., human) movement patterns.
• Hepatitis: consists of 155 records of patients with hepatitis C to predict the patient's risk of living or dying.
• UCI-HAR: a human activity recognition dataset recorded using a smartphone, where 30 persons performed six different activities (walking, walking upstairs, walking downstairs, sitting, standing, and laying) while placing the smartphone on their waist.
• STS-Gold [62] and the SemEval2017 Task4 dataset [63]: English sentiment analysis data collected from Twitter. Each tweet was labeled positive, negative, or neutral in those datasets.
• The C6 dataset: an English crisis event detection dataset used for crisis prediction, such as earthquakes, tornadoes, hurricanes, wildfires, and floods [64].

In our experiment, we used 70% and 30% ratios to train and evaluate our model, for training and testing, respectively. Table 7 lists detailed descriptions of each used dataset. Concerning the textual data, we used BERT-base [65] as the pre-trained model to extract text feature representations and map them to a vector of size 192.
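For the textual datasets, a feature-extraction step of the kind described above could look like the sketch below, which pools BERT-base token embeddings and projects them to 192 dimensions with a fixed random matrix. The uncased checkpoint, the [CLS] pooling, the random projection, and the Hugging Face calls are illustrative assumptions, not the paper's exact pipeline.

import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
projection = np.random.default_rng(0).standard_normal((768, 192)) / np.sqrt(768)

def text_features(sentences):
    # Encode a batch of tweets and keep the [CLS] representation (768-d per text).
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        cls = bert(**batch).last_hidden_state[:, 0, :].numpy()
    return cls @ projection                               # map to a 192-d feature vector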
The results of the DBMT were compared to those of nine optimization methods, as in the previous experiment. We used five measures: the average of the fitness function values, the number of selected features, classification accuracy, specificity, and sensitivity. Comparison outcomes are presented in Tables 8-12.

Table 8 shows the comparison outcomes for the average fitness function. From Table 8, the DBMT recorded the best outcomes in 38% of the datasets (i.e., GAS sensors, UCI-HAR, and C6), and it showed similar results in the hepatitis dataset with both BPSO and BMO.


TABLE 2. Fitness value of DBMT and other methods.

The BPSO achieved the second rank, recording the best outcomes in 2 datasets (i.e., STS-Gold and GPS trajectories), followed by the BMO, which got the best average in both the SemEval2017 Task4 and MovementAAL datasets. The other compared algorithms were ranked as MRFO, HGSO, bGWO, and SSA. Finally, the SGA recorded the worst results.

Table 9 shows the number of selected features. As noticed, the DBMT selected by far the smallest number of features in 6 out of 8 datasets and got the first rank. The second rank was taken by the BPSO. MRFO and HGSO performed similarly to some extent, followed by BMO and bGWO. Figure 8 shows the average of all selected features.

More so, the classification accuracy results are listed in Table 10 for all algorithms. In this measure, the proposed DBMT achieved the best accuracy result in 50% of the NLP datasets and ranked first. It also achieved the same results as the other methods in both the hepatitis and MovementAAL datasets. Therefore, it could select fewer features while reaching higher accuracy than the other methods.


TABLE 3. Average number of selected features for DBMT and other methods.

TABLE 4. STD of fitness value for DBMT and other methods.

TABLE 5. Accuracy obtained using DBMT and other methods.

FIGURE 7. Average of accuracy, sensitivity, specificity, and fitness value for OBL techniques among D1-D6 datasets.


TABLE 6. Comparison results between DOL and other OBL techniques.

FIGURE 8. Selected features average of all used datasets.

TABLE 7. The details of each applied dataset.

The BMO and BPSO were ranked second and third, respectively. Additionally, the lowest accuracy values in these datasets were recorded by the bGWO and SSA methods.

Table 11 lists the results of the sensitivity measure. The table shows that DBMT and BMO achieved the best sensitivity results in four datasets. The BPSO and MRFO were ranked third and fourth, followed by HGSO and SGA. In addition, Table 12 shows the specificity results for all algorithms. In this measure, the DBMT showed the best outcomes in 2 out of 8 datasets (i.e., GPS trajectories and UCI-HAR) and tied for the best results with other algorithms in 5 datasets. The traditional BMO and BPSO came in the second and third ranks. The worst algorithm was the SSA.

These results show that the developed DBMT approach can address numerous feature selection tasks due to its behavior in exploring the search space and its high ability to obtain optimal values. It showed superior performance compared to the other optimization techniques. This results from the advantages of integrating BMO with the TM and DOL: this integration enhances the exploration ability in the early iterations, while avoiding attraction to local points during exploitation. Therefore, DBMT leads to improved prediction.


TABLE 8. Results of the average of the fitness functions.

TABLE 9. Selected attributes obtained using DBMT and other methods.

TABLE 10. Accuracy obtained using DBMT and other methods.

TABLE 11. Sensitivity obtained using DBMT and other methods.

TABLE 12. Specificity obtained using DBMT and other methods.


However, the developed DBMT still suffers from some limitations that influence its performance; for example, its computation time still needs to be improved. This can be achieved by reducing the number of iterations and the number of solutions. In addition, the process of selecting the initial population has an influence on the convergence of the DBMT towards the optimal subset of features.

V. CONCLUSION
In conclusion, this study introduced a modified version of the Barnacles Mating Optimizer (BMO) by incorporating two well-known artificial search mechanisms: Triangular mutation and dynamic Opposition-based learning (DOL). By leveraging the superior search capabilities of DOL and Triangular mutation, we addressed the limitations of the traditional BMO algorithm, specifically its convergence speed and solution quality. These techniques improved the search process by generating diverse solutions and enhancing their quality. The newly developed approach, DBMT, was tested on various benchmark functions and applications. Specifically, it was evaluated as a feature selection (FS) method on Social Internet of Things (SIoT) datasets, comparing it to the traditional BMO and other competitive optimization algorithms. The results confirmed that the integration of Triangular mutation and DOL enhanced the performance of BMO, with DBMT outperforming both the traditional BMO and the other optimization methods. The developed DBMT attains a higher average accuracy than the traditional BMO, which averaged 72%. In addition, the number of selected features is better than that of the other methods in most datasets.

From the results, the DBMT can be further applied to other IoT and intelligent-environment applications as an FS method, such as human activity recognition, intrusion detection systems, IoT task scheduling, intelligent home power scheduling for appliances, etc.

ACKNOWLEDGMENT
This work was supported by the Deputyship for Research and Innovation, Ministry of Education, Saudi Arabia, under Project IF-PSAU-2022/01/19574.

REFERENCES
[1] M. Javaid and I. H. Khan, "Internet of Things (IoT) enabled healthcare helps to take the challenges of COVID-19 pandemic," J. Oral Biol. Craniofacial Res., vol. 11, no. 2, pp. 209-214, Apr. 2021.
[2] Z. N. Aghdam, A. M. Rahmani, and M. Hosseinzadeh, "The role of the Internet of Things in healthcare: Future trends and challenges," Comput. Methods Programs Biomed., vol. 199, Feb. 2021, Art. no. 105903.
[3] W. Choi, J. Kim, S. Lee, and E. Park, "Smart home and Internet of Things: A bibliometric study," J. Cleaner Prod., vol. 301, Jun. 2021, Art. no. 126908.
[4] S. Y. Y. Tun, S. Madanian, and F. Mirza, "Internet of Things (IoT) applications for elderly care: A reflective review," Aging Clin. Experim. Res., vol. 33, no. 4, pp. 855-867, Apr. 2021.
[5] M. A. A. Al-qaness, A. Dahou, M. A. Elaziz, and A. M. Helmi, "Multi-ResAtt: Multilevel residual network with attention for human activity recognition using wearable sensors," IEEE Trans. Ind. Informat., vol. 19, no. 1, pp. 144-152, Jan. 2023.
[6] A. Dahou, A. O. Aseeri, A. Mabrouk, R. A. Ibrahim, M. A. Al-Betar, and M. A. Elaziz, "Optimal skin cancer detection model using transfer learning and dynamic-opposite hunger games search," Diagnostics, vol. 13, no. 9, p. 1579, Apr. 2023.
[7] C. Song, S. Liu, G. Han, P. Zeng, H. Yu, and Q. Zheng, "Edge-intelligence-based condition monitoring of beam pumping units under heavy noise in industrial Internet of Things for industry 4.0," IEEE Internet Things J., vol. 10, no. 4, pp. 3037-3046, Mar. 2022.
[8] S. A. Ajagbe, S. Misra, O. F. Afe, and K. I. Okesola, "Internet of Things (IoT) for secure and sustainable healthcare intelligence: Analysis and challenges," in Proc. Int. Conf. Appl. Inform. Cham, Switzerland: Springer, 2022, pp. 45-59.
[9] S. Shahab, P. Agarwal, T. Mufti, and A. J. Obaid, "SIoT (social Internet of Things): A review," in ICT Analysis and Applications, 2022, pp. 289-297.
[10] M. M. Rad, A. M. Rahmani, A. Sahafi, and N. N. Qader, "Social Internet of Things: Vision, challenges, and trends," Hum.-Centric Comput. Inf. Sci., vol. 10, no. 1, pp. 1-40, Dec. 2020.
[11] Y. Lin, X. Zhu, Z. Zheng, Z. Dou, and R. Zhou, "The individual identification method of wireless device based on dimensionality reduction and machine learning," J. Supercomput., vol. 75, no. 6, pp. 3010-3027, Jun. 2019.
[12] S. Janakiraman, "A hybrid ant colony and artificial bee colony optimization algorithm-based cluster head selection for IoT," Proc. Comput. Sci., vol. 143, pp. 360-366, Jan. 2018.
[13] T. A. Sai Srinivas and S. S. Manivannan, "Prevention of hello flood attack in IoT using combination of deep learning with improved rider optimization algorithm," Comput. Commun., vol. 163, pp. 162-175, Nov. 2020.
[14] S. Vimal, M. Khari, R. G. Crespo, L. Kalaivani, N. Dey, and M. Kaliappan, "Energy enhancement using multiobjective ant colony optimization with double Q learning algorithm for IoT based cognitive radio networks," Comput. Commun., vol. 154, pp. 481-490, Mar. 2020.
[15] R. Ramteke, S. Singh, and A. Malik, "Optimized routing technique for IoT enabled software-defined heterogeneous WSNs using genetic mutation based PSO," Comput. Standards Interfaces, vol. 79, Jan. 2022, Art. no. 103548.
[16] P. K. R. Maddikunta, T. R. Gadekallu, R. Kaluri, G. Srivastava, R. M. Parizi, and M. S. Khan, "Green communication in IoT networks using a hybrid optimization algorithm," Comput. Commun., vol. 159, pp. 97-107, Jun. 2020.
[17] X. Kan, Y. Fan, Z. Fang, L. Cao, N. N. Xiong, D. Yang, and X. Li, "A novel IoT network intrusion detection approach based on adaptive particle swarm optimization convolutional neural network," Inf. Sci., vol. 568, pp. 147-162, Aug. 2021.
[18] A. Goyal, H. S. Kanyal, S. Kaushik, and R. Khan, "IoT based cloud network for smart health care using optimization algorithm," Informat. Med. Unlocked, vol. 27, Jan. 2021, Art. no. 100792.
[19] M. A. Elaziz, L. Abualigah, and I. Attiya, "Advanced optimization technique for scheduling IoT tasks in cloud-fog computing environments," Future Gener. Comput. Syst., vol. 124, pp. 142-154, Nov. 2021.
[20] D. Bahmanyar, N. Razmjooy, and S. Mirjalili, "Multi-objective scheduling of IoT-enabled smart homes for energy management based on arithmetic optimization algorithm: A Node-RED and NodeMCU module-based technique," Knowl.-Based Syst., vol. 247, Jul. 2022, Art. no. 108762.
[21] M. Padmaa, T. Jayasankar, S. Venkatraman, A. K. Dutta, D. Gupta, S. Shamshirband, and J. J. P. C. Rodrigues, "Oppositional chaos game optimization based clustering with trust based data transmission protocol for intelligent IoT edge systems," J. Parallel Distrib. Comput., vol. 164, pp. 142-151, Jun. 2022.
[22] M. Ghobaei-Arani and A. Shahidinejad, "A cost-efficient IoT service placement approach using whale optimization algorithm in fog computing environment," Expert Syst. Appl., vol. 200, Aug. 2022, Art. no. 117012.
[23] D. Soni, D. Srivastava, A. Bhatt, A. Aggarwal, S. Kumar, and M. A. Shah, "An empirical client cloud environment to secure data communication with alert protocol," Math. Problems Eng., vol. 2022, pp. 1-14, Sep. 2022.
[24] A. Aggarwal, P. Dimri, and A. Agarwal, "Statistical performance evaluation of various metaheuristic scheduling techniques for cloud environment," J. Comput. Theor. Nanosci., vol. 17, no. 9, pp. 4593-4597, Jul. 2020.
[25] A. Aggarwal, P. Dimri, and A. Agarwal, "Survey on scheduling algorithms for multiple workflows in cloud computing environment," Int. J. Comput. Sci. Eng., vol. 7, no. 6, pp. 565-570, Jun. 2019.
[26] M. H. Sulaiman, Z. Mustaffa, M. M. Saari, and H. Daniyal, "Barnacles mating optimizer: A new bio-inspired algorithm for solving engineering optimization problems," Eng. Appl. Artif. Intell., vol. 87, Jan. 2020, Art. no. 103330.
[27] H. Li, H. Guo, and N. Yousefi, "A hybrid fuel cell/battery vehicle by considering economy considerations optimized by converged barnacles mating optimizer (CBMO) algorithm," Energy Rep., vol. 6, pp. 2441-2449, Feb. 2020.
[28] R. M. Rizk-Allah and A. A. El-Fergany, "Conscious neighborhood scheme-based Laplacian barnacles mating algorithm for parameters optimization of photovoltaic single- and double-diode models," Energy Convers. Manage., vol. 226, Mar. 2020, Art. no. 113522.


[29] H. Jia and K. Sun, "Improved barnacles mating optimizer algorithm for feature selection and support vector machine optimization," Pattern Anal. Appl., vol. 24, no. 3, pp. 1249-1274, Aug. 2021.
[30] M. H. Sulaiman and Z. Mustaffa, "Optimal chiller loading solution for energy conservation using barnacles mating optimizer algorithm," Results Control Optim., vol. 7, Apr. 2022, Art. no. 100109.
[31] R. Devarapalli, B. Bhattacharyya, and A. Kumari, "A novel approach of intensified barnacles mating optimization for the mitigation of power system oscillations," Concurrency Comput., Pract. Exper., vol. 33, no. 17, pp. 1-16, Sep. 2021.
[32] H. Li, G. Zheng, K. Sun, Z. Jiang, Y. Li, and H. Jia, "A logistic chaotic barnacles mating optimizer with Masi entropy for color image multilevel thresholding segmentation," IEEE Access, vol. 8, pp. 213130-213153, 2020.
[33] M. A. Al-Qaness, A. A. Ewees, H. Fan, A. M. AlRassas, and M. A. Elaziz, "Modified Aquila optimizer for forecasting oil production," Geo-Spatial Inf. Sci., vol. 25, no. 4, pp. 1-17, 2022.
[34] A. M. A. Rassas, M. A. A. Al-Qaness, A. A. Ewees, S. Ren, R. Sun, L. Pan, and M. A. Elaziz, "Advance artificial time series forecasting model for oil production using neuro fuzzy-based slime mould algorithm," J. Petroleum Explor. Prod. Technol., vol. 12, no. 2, pp. 383-395, Feb. 2022.
[35] L. Abualigah, K. H. Almotairi, M. A. A. Al-qaness, A. A. Ewees, D. Yousri, M. A. Elaziz, and M. H. Nadimi-Shahraki, "Efficient text document clustering approach using multi-search arithmetic optimization algorithm," Knowl.-Based Syst., vol. 248, Jul. 2022, Art. no. 108833.
[36] S. Gupta, K. Deep, A. A. Heidari, H. Moayedi, and M. Wang, "Opposition-based learning Harris hawks optimization with advanced transition rules: Principles and analysis," Expert Syst. Appl., vol. 158, Nov. 2020, Art. no. 113510.
[37] S. Gupta and K. Deep, "A hybrid self-adaptive sine cosine algorithm with opposition based learning," Expert Syst. Appl., vol. 119, pp. 210-230, Apr. 2019.
[38] W. Long, J. Jiao, X. Liang, S. Cai, and M. Xu, "A random opposition-based learning grey wolf optimizer," IEEE Access, vol. 7, pp. 113810-113825, 2019.
[39] D. Bairathi and D. Gopalani, "Opposition based Salp swarm algorithm for numerical optimization," in Proc. Int. Conf. Intell. Syst. Design Appl. Cham, Switzerland: Springer, 2018, pp. 821-831.
[40] A. A. Ewees, M. A. Elaziz, and E. H. Houssein, "Improved grasshopper
[51] A. A. Ewees, L. Abualigah, D. Yousri, A. T. Sahlol, M. A. A. Al-Qaness, S. Alshathri, and M. A. Elaziz, "Modified artificial ecosystem-based optimization for multilevel thresholding image segmentation," Mathematics, vol. 9, no. 19, p. 2363, Sep. 2021.
[52] M. A. Elaziz, K. M. Hosny, A. Salah, M. M. Darwish, S. Lu, and A. T. Sahlol, "New machine learning method for image-based diagnosis of COVID-19," PLoS ONE, vol. 15, no. 6, Jun. 2020, Art. no. e0235187.
[53] D. Mohammadi, M. Abd Elaziz, R. Moghdani, E. Demir, and S. Mirjalili, "Quantum Henry gas solubility optimization algorithm for global optimization," Eng. Comput., vol. 38, pp. 1-20, Mar. 2021.
[54] R. A. Ibrahim, M. A. Elaziz, and S. Lu, "Chaotic opposition-based grey-wolf optimization algorithm based on differential evolution and disruption operator for global optimization," Expert Syst. Appl., vol. 108, pp. 1-27, Oct. 2018.
[55] X. H. Shi, Y. C. Liang, H. P. Lee, C. Lu, and L. M. Wang, "An improved GA and a novel PSO-GA-based hybrid algorithm," Inf. Process. Lett., vol. 93, no. 5, pp. 255-261, Mar. 2005.
[56] R. A. Ibrahim, A. A. Ewees, D. Oliva, M. Abd Elaziz, and S. Lu, "Improved Salp swarm algorithm based on particle swarm optimization for feature selection," J. Ambient Intell. Humanized Comput., vol. 10, no. 8, pp. 3155-3169, Aug. 2019.
[57] M. Abd Elaziz, A. A. Ewees, and Z. Alameer, "Improving adaptive neuro-fuzzy inference system based on a modified salp swarm algorithm using genetic algorithm to forecast crude oil price," Natural Resour. Res., vol. 29, no. 4, pp. 2671-2686, Aug. 2020.
[58] A. P. Piotrowski, J. J. Napiorkowski, and A. E. Piotrowska, "Population size in particle swarm optimization," Swarm Evol. Comput., vol. 58, Nov. 2020, Art. no. 100718.
[59] H. Zhang, T. Liu, X. Ye, A. A. Heidari, G. Liang, H. Chen, and Z. Pan, "Differential evolution-assisted salp swarm algorithm with chaotic structure for real-world problems," Eng. Comput., vol. 39, pp. 1-35, Jan. 2022.
[60] Z. Zhao, F. Morstatter, S. Sharma, S. Alelyani, A. Anand, and H. Liu, "Advancing feature selection research," ASU Feature Selection Repository, vol. 10, pp. 1-28, Jan. 2010.
[61] S. K. Lakshmanaprabu, K. Shankar, A. Khanna, D. Gupta, J. J. P. C. Rodrigues, P. R. Pinheiro, and V. H. C. De Albuquerque, "Effective features to classify big data using social Internet of Things,"
IEEE Access, vol. 6, pp. 24196–24204, 2018.
optimization algorithm using opposition-based learning,’’ Expert Syst.
Appl., vol. 112, pp. 156–172, Dec. 2018. [62] R. Ahuja and S. Sharma, ‘‘Sentiment analysis on different domains using
[41] Z. Zhang, H. Huang, C. Huang, and B. Han, ‘‘An improved TLBO with machine learning algorithms,’’ in Advances in Data and Information
logarithmic spiral and triangular mutation for global optimization,’’ Neural Sciences. Cham, Switzerland: Springer, 2022, pp. 143–153.
Comput. Appl., vol. 31, no. 8, pp. 4435–4450, Aug. 2019. [63] S. Rosenthal, N. Farra, and P. Nakov, ‘‘SemEval-2017 task 4: Sentiment
[42] J. Guo, Y. Wu, W. Xie, and S. Jiang, ‘‘Triangular Gaussian mutation analysis in Twitter,’’ 2019, arXiv:1912.00741.
to differential evolution,’’ Soft Comput., vol. 24, no. 12, pp. 9307–9320, [64] J. Liu, T. Singhal, L. T. M. Blessing, K. L. Wood, and K. H. Lim,
Jun. 2020. ‘‘CrisisBERT: A robust transformer for crisis classification and contextual
[43] Z. Tan and K. Li, ‘‘Differential evolution with mixed mutation strategy crisis embedding,’’ in Proc. 32st ACM Conf. Hypertext Social Media,
based on deep reinforcement learning,’’ Appl. Soft Comput., vol. 111, Aug. 2021, pp. 133–141.
Nov. 2021, Art. no. 107678. [65] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, ‘‘BERT: Pre-training
[44] M. Neshat, M. M. Nezhad, S. Mirjalili, D. A. Garcia, E. Dahlquist, and of deep bidirectional transformers for language understanding,’’ 2018,
A. H. Gandomi, ‘‘Short-term solar radiation forecasting using hybrid deep arXiv:1810.04805.
residual learning and gated LSTM recurrent network with differential
covariance matrix adaptation evolution strategy,’’ Energy, vol. 278,
Sep. 2023, Art. no. 127701.
[45] M. Barazandeh, C. S. Davis, C. J. Neufeld, D. W. Coltman, and
A. R. Palmer, ‘‘Something Darwin didn’t know about barnacles: Sperm-
cast mating in a common stalked species,’’ Proc. Roy. Soc. B, Biol. Sci.,
vol. 280, no. 1754, Mar. 2013, Art. no. 20122919.
MOHAMMED A. A. AL-QANESS received the B.S., M.S., and Ph.D. degrees in information and communication engineering from the Wuhan University of Technology, Wuhan, China, in 2010, 2014, and 2017, respectively. He was an Assistant Professor with the School of Computer Science, Wuhan University, Wuhan. He was also a Postdoctoral Fellow with the State Key Laboratory for Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University. He is currently a Professor with the Department of Electronic Information Engineering, College of Physics and Electronic Information Engineering, Zhejiang Normal University. His current research interests include wireless sensing, human activity recognition (HAR), mobile computing, machine learning, signal and image processing, and natural language processing.
AHMED A. EWEES (Senior Member, IEEE) received the Ph.D. degree from Damietta University, Egypt, in 2012. He is currently an Associate Professor of computer science with Damietta University. He co-supervises master's and Ph.D. students and leads and supervises various graduation projects. He has published many scientific research papers in international journals and conferences. His research interests include machine learning, artificial intelligence, text mining, natural language processing, image processing, and metaheuristic optimization techniques.

MOHAMED ABD ELAZIZ received the B.S. and M.S. degrees in computer science and the Ph.D. degree in mathematics and computer science from Zagazig University, Egypt, in 2008, 2011, and 2014, respectively. From 2008 to 2011, he was an Assistant Lecturer with the Department of Computer Science. He is currently an Associate Professor with Zagazig University. He is the author of more than 430 articles. He is listed among the world's top 2% most influential scholars, a ranking that covers the 100,000 top scientists worldwide, and was among the most highly cited researchers according to Web of Science in 2022. His research interests include metaheuristic techniques, medical applications, digital twins, renewable energy, IoT security, cloud computing, machine learning, signal processing, image processing, and evolutionary algorithms.

ABDELGHANI DAHOU received the B.S. and M.S. degrees in computer science and intelligent systems from the University of Ahmed Draia, Adrar, Algeria, in 2012 and 2014, respectively, and the Ph.D. degree in computer science from the Wuhan University of Technology, Wuhan, Hubei, China, in 2019. He is currently a Lecturer with the Faculty of Science and Technology, University of Ahmed Draia. His research interests include deep learning, signal processing, data mining, neuroevolution, image processing, and natural language processing.

MOHAMMED AZMI AL-BETAR received the Ph.D. degree in artificial intelligence from Universiti Sains Malaysia, in 2010. He was a Postdoctoral Research Fellow with Universiti Sains Malaysia for three years, where he was also invited twice as a Visiting Researcher. He is currently the Head of the Evolutionary Computation Research Group (ECRG), Artificial Intelligence Research Center (AIRC), Ajman University, where he has been a full-time Faculty Member and the Coordinator of the Master of Science in Artificial Intelligence (M.Sc.-AI) Program, since 2020. He has more than 15 years of teaching experience in higher education institutions and has taught several courses in the computer science and artificial intelligence fields. Before joining Ajman University, he was the Deputy Dean of Academic Affairs, the Deputy Dean of Scientific Research for Quality Assurance and Development, and the Head of the Information Technology Department, Al-Balqa' Applied University, Jordan. He has published more than 180 scientific articles in high-quality, well-reputed journals and conferences. Accordingly, he was ranked in Stanford University's study of the world's top 2% of scientists in 2020, 2021, and 2022.

AHMAD O. ASEERI (Member, IEEE) received the bachelor's degree in computing from King Saud University, Saudi Arabia, the master's degree in computer science from the University of Wisconsin–Madison, USA, and the Ph.D. degree in computer science from Texas Tech University, USA. He received the Full Scholarship and the Government Scholarship for the master's and Ph.D. studies, respectively. He is currently an Assistant Professor with the Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Saudi Arabia. His main research interests include artificial intelligence (AI), with a focus on deep learning applied to neural network-based risk analysis in physical unclonable functions for resource-constrained IoT devices, natural language processing (NLP), and computer vision; data mining, with application to clustering techniques, including bisecting K-means clustering (BKM), limited-iteration bisecting K-means (LIBKM), and memory-aware clustering algorithms; and applied deep learning for medical applications.

DALIA YOUSRI received the B.Tech. (Hons.), M.Tech., and Ph.D. degrees from the Faculty of Engineering, Fayoum University, Egypt, in 2011, 2016, and 2020, respectively. She is currently a Lecturer with the Faculty of Engineering, Fayoum University. She has published refereed manuscripts on optimization algorithms, photovoltaic applications, fuel cells, chaotic systems, and fractional calculus. Her research interests include modifications of optimization algorithms, the modeling and implementation of solar PV systems, fuel cell technologies, and fractional calculus. She acts as a Reviewer for various reputed journals, such as IEEE ACCESS, IET, Energy Conversion and Management, Applied Soft Computing, International Journal of Electronics and Communications, Artificial Intelligence Review (Springer), and Neural Computing and Applications.

REHAB ALI IBRAHIM received the B.S. degree in mathematics and computer science and the M.Sc. degree in computer science from Zagazig University, Egypt, in 2009 and 2014, respectively, and the Ph.D. degree from the Huazhong University of Science and Technology, Wuhan, China, in 2020. From 2009 to 2014, she was an Assistant Lecturer with the Department of Computer Science, Faculty of Science, Zagazig University. She is the author of several articles. Her research interests include machine learning, image processing, data mining, and artificial intelligence.