Stock Exchange Trading Optimization Algorithm
https://doi.org/10.1007/s11227-021-03943-w
Hojjat Emami1
Abstract
In this paper, a human-inspired optimization algorithm called stock exchange trading
optimization (SETO) for solving numerical and engineering problems is introduced.
The inspiration source of this optimizer is the behavior of traders and stock price
changes in the stock market. Traders use various fundamental and technical analy-
sis methods to gain maximum profit. SETO mathematically models the technical
trading strategy of traders to perform optimization. It contains three main actuators
including rising, falling, and exchange. These operators navigate the search agents
toward the global optimum. The proposed algorithm is compared with seven popular meta-heuristic optimizers on forty single-objective unconstrained numerical functions and four engineering design problems. The statistical results obtained on the test problems show that SETO provides competitive and promising performance compared with its counterpart algorithms in solving optimization problems of different dimensions, especially 1000-dimensional problems. Out of 40 numerical
functions, the SETO algorithm has achieved the global optimum on 36 functions,
and out of 4 engineering problems, it has obtained the best results on 3 problems.
1 Introduction
* Hojjat Emami ([email protected])
1 University of Bonab, Bonab, Iran
returns. In contrast, traders attempt to make transactions that can help them profit
quickly from price fluctuations over a shorter time frame. The objective of traders is
to gain returns that outperform buy-and-hold investing. Traders often use different
technical analysis tools such as stochastic oscillators and moving averages to find
optimal share buy and sell points. They try to maximize profit by adopting optimal trading strategies and selecting the best shares. The proposed SETO algorithm searches for the most profitable share in the stock exchange with the help of simple trading strategies. The best share corresponds to the optimal solution of the given optimization problem. The SETO algorithm first creates a
population of candidate solutions. The algorithm improves the initial solutions using
three operators including rising, falling, and exchange. The individuals in the popu-
lation gradually converge to the optimal point.
Briefly speaking, the main contributions of this paper are as follows:
The SETO algorithm is simple and easy to implement. It can be applied to any optimization problem to which other optimizers can be applied. SETO is an efficient
choice to solve optimization problems in various disciplines such as physical sci-
ence, mathematics, agricultural science, economics, computer science, communi-
cation, mechanical applications, civil engineering applications, manufacturing, and
many other areas.
The remaining parts of this paper are structured as follows: Section 2 reviews the
literature. Section 3 describes the inspiration source, mathematical model, and the
working principle of the proposed SETO algorithm. Section 4 presents the experi-
mental results obtained by the SETO and counterpart algorithms in solving single-
objective numerical optimization problems. Section 5 evaluates the applicability
of SETO and comparison algorithms on real-world engineering design problems.
Finally, Sect. 6 concludes the paper and lists potential directions for future research.
2 Related work
According to the metaphor of the search procedures, the structure of the problem
under consideration, and the search strategy, optimization meta-heuristics can be
categorized into different classes. As shown in Fig. 1, two main groups of meta-
heuristics are metaphor-based and non-metaphor-based algorithms [8]. The former
category consists of algorithms that model the natural evolution, collective or swarm
intelligence of creatures, human actions in real life, chemistry or physical operations, etc. The latter category consists of algorithms that do not simulate any natural phenomenon or creatures' behavior when searching the solution space of optimization problems.
The metaphor-based algorithms can be categorized into three main paradigms:
biology-inspired, chemistry-/physics-inspired, and human-inspired algorithms.
Biology-inspired algorithms simulate the evolution of living organisms or the col-
lective intelligence of creatures such as ants, birds, and bees. Two classes of biol-
ogy-inspired algorithms are evolutionary and swarm intelligence algorithms. Evo-
lutionary algorithms are inspired by the laws of biological evolution in nature [15,
19]. The objective is to combine the best individuals to improve the survival and
reproduction ability of individuals throughout generations. Since the fittest individuals have a higher chance to survive and reproduce, the individuals in the next generations are likely to be better than the previous ones [18]. This idea forms the search
strategy of evolutionary algorithms, in which individuals will gradually reach the
global optimum. The most popular evolutionary algorithm is the genetic algorithm
(GA) [20] that follows Darwin’s theory of evolution. In GA, first, a population of
solutions is created randomly. The population evolves over iterations through selection, reproduction, combination, and mutation. A few other popular evolutionary algorithms are fast evolutionary programming (FEP) [21], differential evolution (DE) [22],
biogeography-based optimization (BBO) [23], forest optimization algorithm (FOA)
[24], black widow optimization (BWO) [25], farmland fertility algorithm (FFA)
[26], and seasons optimization algorithm (SOA) [18].
Swarm intelligence algorithms often model the interaction of living creatures in
a community, herds, flocks, colonies, and schools [6]. The core idea of swarm intel-
ligence algorithms is decentralization, in which the agents move toward the global
optimum through simulated social and collective intelligence, and local interac-
tion with their environment and with each other [8]. The algorithms in this cate-
gory memorize the best solutions found at each generation to produce the optimal
solutions for the next generations. The most popular algorithms in this category are
PSO [9], ACO [10], and artificial bee colony (ABC) [27]. Some recently developed
swarm intelligence algorithms are firefly algorithm (FA) [28], krill herd (KH) [29],
elephant herding optimization (EHO) [30], spider monkey optimization (SMO)
[31], grey wolf optimizer (GWO) [32], whale optimization algorithm (WOA) [19,
33], butterfly optimization algorithm (BOA) [34], squirrel search algorithm (SSA)
[35], grasshopper optimization algorithm (GOA) [36], seagull optimization algo-
rithm (SOA) [37], normative fish swarm algorithm (NFSA) [38], red deer algorithm
(RDA) [39], and Harris hawks optimization (HHO) [7]. For more details and a deeper discussion of swarm intelligence algorithms, refer to the survey in [6].
Chemistry- and physics-based algorithms simulate the chemistry and physical
rules in the universe such as chemical reactions, gravitational force, inertia force,
and magnetic force [25]. The search agents navigate and communicate through the
search space following the chemistry and physical rules. Simulated annealing (SA)
[40] is one of the founding algorithms in this category. SA models the annealing
process in metallurgy. Other widely used chemistry- and physics-based algorithms
are gravitational search algorithm (GSA) [41], big bang–big crunch (BB–BC) [42],
artificial chemical reaction optimization algorithm (ACROA) [43], galaxy-based
search algorithm (GbSA) [44], physarum-energy optimization algorithm (PEO)
[45], thermal exchange optimization (TEO) [46], equilibrium optimizer (EO) [47],
and magnetic optimization algorithm (MOA) [48]. For a survey and discussion of physics-inspired algorithms, refer to [49].
Human-based algorithms are developed based on metaphors from human life,
such as social relationships, political events, sports, music, and math. Since humans
are considered the smartest creatures in solving real-world problems, human-
inspired algorithms can also be more successful in solving optimization problems.
Some human-inspired algorithms are harmony search (HS) [50], imperialist com-
petitive algorithm (ICA) [51], teaching–learning-based optimization (TLBO) [52],
league championship algorithm (LCA) [53], class topper optimization (CTO) [54],
presidential election algorithm (PEA) [11], sine–cosine algorithm (SCA) [55], socio
evolution & learning optimization algorithm (SELO) [56], team game algorithm
(TGA) [57], ludo game-based swarm intelligence (LGSI) [58], heap-based optimizer
(HBO) [15], coronavirus optimization algorithm (CVOA) [59], political optimizer
(PO) [14], and Lévy flight distribution (LFD) [4].
Some algorithms are inspired by machine learning, reinforcement learning, and
learning classifier systems [60–62]. For example, ActivO is an ensemble machine
learning-based optimization algorithm [63]. ActivO combines strong and weak
learner strategies to perform a search for optimal solutions. The weak learner is
considered to explore the promising regions, and the strong learner is considered to
identify the exact location of the optimum within promising areas. Another exam-
ple is the molecule deep Q-networks (MolDQN) algorithm, which is developed by
combining domain knowledge of chemistry and reinforcement learning techniques
for molecule optimization. Researchers have proposed several methods for opti-
mizing trading strategies in the stock exchange [64–68]. For example, Thakkar and
Chaudhari [67] investigated the application of meta-heuristic algorithms for stock
portfolio optimization, and trend and stock price prediction along with implications
of PSO. In another work, Kumar and Haider [68] proposed an RNN–LSTM model and improved its performance using PSO and the flower pollination algorithm (FPA) for intraday stock market prediction. Note that this paper does not focus on optimization or prediction in the stock exchange. To the best of our knowledge, no research in the literature simulates stock trading strategies to develop numerical optimization meta-heuristics.
It should be noted that each of the meta-heuristic algorithms has been improved
over the years, and several enhanced versions of them are available. The extended
algorithms improve the basic operators or overcome deficiencies that exist in the conventional versions. For example, the chaotic election algorithm (CEA) [12]
embeds the chaos-based advertisement operator to the conventional PEA algorithm
[11] to improve its search capability and convergence speed. Some other algorithms that were recently proposed and used in different applications are the opposition-based learning firefly algorithm combined with dragonfly algorithm (OFADA) [69], random
memory and elite memory equipped artificial bee colony (ABCWOA) algorithm
[70], efficient binary symbiotic organisms search (EBSOS) [71, 72], efficient binary
chaotic symbiotic organisms search (EBCSOS) [73], and binary farmland fertility
algorithm (BFFA) [74].
After this short review, and from the experimental results reported in the literature, we can conclude that the performance obtained on most optimization problems is far from perfect. This clearly shows that much effort is still needed in the field. Each algorithm is suitable for solving certain types of problems. It seems
that one of the interesting tasks in this field is to determine the best algorithms for
each type of optimization problem. For deep analysis about meta-heuristic algo-
rithms, refer to surveys given in [2, 8, 75]. Table 1 summarizes some of the recently
proposed meta-heuristic algorithms.
3 The proposed SETO algorithm

This section discusses the inspiration source and describes the mathematical model
of the proposed stock exchange trading optimization (SETO) algorithm.
3.1 Inspiration
A stock exchange or bourse is an exchange where traders and investors sell and buy
all types of securities such as shares of stock, bonds, and other financial instruments
issued by listed companies [76]. The stock exchange often acts as a continuous auc-
tion market in which sellers and buyers perform transactions through electronic trad-
ing platforms and brokerages. People invest and trade with an efficient strategy in
mind to make the most profit. Share prices never go up in a straight line; they rise and fall on their way to higher prices. A rise occurs because more people want to buy a share than sell it. In the rising phase, the price of shares moves up. When the
shares rise for a long period, correction may start. A correction and all types of mar-
ket declines occur because investors or traders are more motivated to sell than buy.
Table 1 Some recently proposed meta-heuristics

Title | Inspiration source | Year | References
Genetic algorithm (GA) | Darwin's theory of evolution | 1992 | [20]
Fast evolutionary programming (FEP) | Natural evolution | 1999 | [21]
Differential evolution (DE) | Natural evolution | 2007 | [22]
Biogeography-based optimization (BBO) | Geographical distribution of biological organisms | 2008 | [23]
Forest optimization algorithm (FOA) | Growth of trees in forests | 2014 | [24]
Black widow optimization (BWO) | Unique mating behavior of black widow spiders | 2020 | [25]
Farmland fertility optimization (FFO) | Farmland fertility in nature | 2018 | [26]
Seasons optimization algorithm (SOA) | Trees' growth behavior | 2020 | [18]
Particle swarm optimization (PSO) | Motion of bird flocks and schooling fish | 1995 | [9]
Ant colony optimization (ACO) | Foraging behavior of natural ants | 2006 | [10]
Artificial bee colony (ABC) | Intelligent behavior of bees | 2007 | [27]
Firefly algorithm (FA) | Flashing behavior of fireflies | 2010 | [28]
Krill herd (KH) | Herding behavior of krill individuals | 2012 | [29]
Elephant herding optimization (EHO) | Herding behavior of elephant groups | 2016 | [30]
Spider monkey optimization (SMO) | Fission–fusion social structure of spider monkeys in foraging | 2014 | [31]
Grey wolf optimizer (GWO) | Leadership hierarchy and hunting mechanism of grey wolves | 2014 | [32]
Whale optimization algorithm (WOA) | Humpback whales | 2016 | [19]
Butterfly optimization algorithm (BOA) | Food foraging behavior of butterflies | 2018 | [34]
Squirrel search algorithm (SSA) | Dynamic behavior of flying squirrels | 2019 | [35]
Grasshopper optimization algorithm (GOA) | Foraging and swarming behavior of grasshoppers | 2017 | [36]
Seagull optimization algorithm (SOA) | Migration and attacking behaviors of a seagull in nature | 2019 | [37]
Normative fish swarm algorithm (NFSA) | Behavior of fish swarms in the real environment | 2019 | [38]
Red deer algorithm (RDA) | Unusual mating behavior of Scottish red deer | 2020 | [39]
Harris hawks optimization (HHO) | Cooperative behavior and chasing style of Harris' hawks | 2019 | [7]
Simulated annealing (SA) | Annealing procedure in metalworking | 1983 | [40]
Gravitational search algorithm (GSA) | Newton's law of gravity and the law of motion | 2009 | [41]
At this time, sellers will start lowering prices until buyers tend to buy the shares.
Traders can sell their shares at any time they see fit or add to their number of shares.
They use various indicators to obtain the selling and buying signals and maximize
their gains through the analysis of stocks’ momentum. Some of the most commonly
used technical indicators are simple moving average (SMA), moving average con-
vergence divergence (MACD), relative strength index (RSI), stochastic oscillator,
and Bollinger bands among others [76].
The RSI [77] is a well-known momentum oscillator used in technical analysis. It
measures the magnitude of recent price changes to investigate overbought or over-
sold conditions in the price of a share. It produces signals that tell traders to sell
when the share is overbought and to buy when it is oversold. The RSI is often meas-
ured on a 14-day timeframe, and it oscillates between 0 and 100. The indicator has a
lower line typically at 30 and an upper line at 70. A share is often considered over-
sold when the RSI is at or below 30 and overbought when it is around 70 [78]. RSI
between the 30 and 70 levels is considered neutral. An oversold signal suggests that short-term declines are reaching maturity, and a share may be in for a rally. In
contrast, an overbought signal could mean that short-term gains may be reaching a
point of maturity, and a share may be in for a price correction. As shown in Fig. 2,
RSI is often illustrated on a graph below the price chart.
In addition to the indicator signals, many investors use fundamental analysis, especially the price-to-earnings (P/E) ratio, to find out whether a share is correctly valued [79]. The P/E shows how cheap or expensive a share is. If all other things are equal (the lower the price, the higher the return), a lower P/E means a lower share price, which is attractive to investors. However, if all things are not equal, a lower P/E may not indicate a good share for investing, because a share with a high P/E may provide a better
return than a low P/E stock. Overall, in trading, it is better to compare the P/E of a share with that of its market peers to discover whether it is overvalued or undervalued.
Traders and shareholders try to maximize profits by looking for the shares with the highest earnings. The behavior of traders in the stock market is thus an adaptive optimization process.
3.2 Mathematical model
This section shows how the trading behavior of traders and the changes in share prices are mathematically modeled to design the stock exchange trading optimization (SETO) algorithm. Figure 3 shows the flowchart of the SETO algorithm. SETO is a population-based optimization algorithm, which starts its work with an initial population of candidate solutions.
3.2.1 Initial population

To solve any optimization problem, the first step in the SETO algorithm is to create an initial population of candidate solutions. Each solution in the population is
referred to as a share or stock. In this paper, the terms "share" and "stock" are utilized interchangeably in most cases. For an optimization problem F(x) with D variables {x_1, x_2, …, x_D}, the initial population is defined as

S = [S_1, S_2, …, S_N]^T    (2)
At any given time, each share has a number of sellers and buyers. To identify the
initial traders, we use a random initialization mechanism. To do this, first the normalized fitness (nf_i) of each share S_i is computed as follows:

nf_i = (f_i − min(M)) / Σ_{k=1}^{N} (f_k − min(M)),   M = {f_k | k = 1, 2, …, N}    (6)
where bi and si are the number of buyers and sellers of Si , respectively. The vari-
able r is a random number in the range [0, 1], which is generated by the uniform
distribution.
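Assuming a simple proportional split (the exact trader-allocation equations are not reproduced in this excerpt), the normalized fitness of Eq. (6) and a random buyer/seller initialization can be sketched as follows; `init_traders` and its rounding scheme are our assumptions:

```python
import random

def normalized_fitness(f):
    """Eq. (6): shift fitness values by the minimum and normalize to sum to 1."""
    m = min(f)
    total = sum(fk - m for fk in f)
    return [(fi - m) / total for fi in f]

def init_traders(f, T=100, rng=random):
    """Hypothetical buyer/seller split of T traders per share; the random
    factor r mirrors the uniform r in [0, 1] described in the text."""
    counts = []
    for nfi in normalized_fitness(f):
        r = rng.random()
        bi = round(r * nfi * T)        # buyers of share i (assumption)
        si = round((1 - r) * nfi * T)  # sellers of share i (assumption)
        counts.append((bi, si))
    return counts
```
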
3.2.2 Rising
The rising operator simulates the growth of shares’ prices in the market. In this
phase, shares can move to higher prices. Here the highest price that shares can reach
is considered as the optimal point. If the price of a share reaches its highest value,
then the traders who have that stock will make the most profit. To mathematically
model the rising phenomenon, we proposed the following equation:
Si (t + 1) = Si (t) + R × (Sg (t) − Si (t)) (9)
where S_i(t) denotes the position of the ith share at the current iteration t, R is a 1 × D vector of random numbers generated at every iteration, and S_g(t) is the best solution found until the current iteration. The parameter R adds some random deviation to the direction of movement in the hope of escaping local optima and exploring the solution space more broadly. Each element r_j ∈ R is defined as follows:
rj = U(0, pci × d1 ) (10)
where the function U generates a random number using uniform distribution in the
range [0, pci × d1 ] . The variable pci is the ratio of buyers to sellers of Si , and d1 is the
normalized distance between Si (t) and Sg (t) defined as
d_1 = sqrt( Σ_{j=1}^{D} (S_j^g(t) − S_{ij}(t))^2 ) / (ub − lb)    (11)
ub and lb are the upper and lower bound of the search space, respectively. The dis-
tance between shares is naturally related to the domain of the search space. Thus, the
distance is normalized using (ub − lb) in the denominator to avoid problem domain
dependency. Supply and demand are two important factors in share growth. The
higher the demand for a share, the more likely it is that the share will grow. For this
purpose, the pci is considered in Eq. (10) to determine the impact of demand on
share growth. Here, the demand for a share is indicated by the number of buyers. pci
is simply defined as follows:
pc_i = b_i / (s_i + 1)    (12)
where bi and si are the number of buyers and sellers of share Si , respectively. To
avoid the search boundary violation, the parameter pci is limited to a value in the
range [0, 2]. So, Eq. (12) is revised as follows:
pc_i = min( b_i / (s_i + 1), 2 )    (13)
In the rising phase, the demand for shares increases. To model this phenomenon, at
each iteration of the algorithm and during rising, we remove a seller from the selling
queue of Si and add it to the buying queue as a buyer.
b_i = b_i + 1;   s_i = s_i − 1;    (14)
In the implementation of SETO, it is assumed that any trader can buy or sell a share
at any time. Therefore, the buying and selling queue of each share Si are modeled as
variables bi and si.
In the rising phase, the algorithm spreads the solutions away from the current area of the search space in order to explore different regions.
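Putting Eqs. (9)-(14) together, one rising step can be sketched as follows. This is a minimal sketch: variable names are ours, and a single scalar bound pair (lb, ub) is assumed for all dimensions.

```python
import math
import random

def rising(S_i, S_g, b_i, s_i, lb, ub, rng=random):
    """One rising step of share S_i toward the global best S_g.

    Implements Eq. (11) (normalized distance d1), Eq. (13) (capped
    buyer/seller ratio pc), Eqs. (9)-(10) (the move itself), and
    Eq. (14) (one seller turns into a buyer)."""
    d1 = math.sqrt(sum((g - x) ** 2 for g, x in zip(S_g, S_i))) / (ub - lb)
    pc = min(b_i / (s_i + 1), 2)
    # each dimension moves toward the best with a uniform random step in [0, pc*d1]
    new_pos = [x + rng.uniform(0.0, pc * d1) * (g - x) for x, g in zip(S_i, S_g)]
    return new_pos, b_i + 1, s_i - 1
```

If S_i already coincides with S_g, d_1 is zero and the share stays put, which matches the intuition that the best share has nowhere higher to rise.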
3.2.3 Falling
The falling phase simulates shares’ prices decline. To mathematically model the
falling, we propose the following equation:
where Sil (t) is the local best position the share Si has ever found. The local search
experience increases the convergence of the algorithm. W is a 1 × D vector of uni-
form random numbers. Each element wj ∈ W is computed as follows:
13
Stock exchange trading optimization algorithm: a… 2139
In the case of falling prices, the share supply increases. To model this issue, at each
iteration of the algorithm and during falling, we remove a buyer from buying queue
of Si and add it to the selling queue as a seller.
si = si + 1;
bi = bi − 1; (19)
At each iteration, the number of buyers and sellers of each share is controlled so that
the total number of buyers and sellers does not exceed the total number of traders.
3.2.4 Exchange
In the exchange phase, traders replace their shares with the lowest profit with the
most profitable shares. To do this, traders sell the lowest yielding shares and line up
to buy the best shares. We implement this phenomenon by just picking one of the
sellers from the sell queue of the worst share and assign it to the buy queue of the
best share. The competition can be done among all shares to attract the traders; how-
ever, for simplicity, we assign the seller to the best share. To mathematically model
this process, first, the worst share S_w, i.e., the share with the lowest fitness, is identified:

S_worst = S_w  where  f(S_w) < f(S_j)  ∀ j = 1, 2, …, N,  w ≠ j    (20)
Then, one of the sellers is removed from the selling queue of the worst share Sworst
and added to the buying queue of the best share. The best share Sbest is determined
as follows:
S_best = S_b  where  f(S_b) > f(S_j)  ∀ j = 1, 2, …, N,  b ≠ j    (21)
The exchange operator improves the population because it allows the best and worst
shares eventually to grow. This reduces the number of sellers of the worst share and
increases the number of buyers of the best share. Therefore, the ratio of buyers to sellers increases, and in this case, the possibility of the shares rising also increases.
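A minimal sketch of the exchange operator (assuming fitness is maximized, per Eqs. (20)-(21); list-based queues are our simplification):

```python
def exchange(fitness, buyers, sellers):
    """Move one seller of the worst share to the buyer queue of the best share.

    fitness, buyers, and sellers are parallel lists indexed by share; the
    counts are modified in place and returned for convenience."""
    w = min(range(len(fitness)), key=fitness.__getitem__)  # worst share, Eq. (20)
    b = max(range(len(fitness)), key=fitness.__getitem__)  # best share, Eq. (21)
    if sellers[w] > 0:                                     # need a seller to move
        sellers[w] -= 1
        buyers[b] += 1
    return buyers, sellers
```
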
3.2.5 RSI calculation
We use the RSI indicator to identify when a share's rising or falling occurs. According to the RSI value, SETO performs rising or falling as follows:
rising                               if RSI ≤ 30
falling                              if RSI ≥ 70        (22)
p × rising + (1 − p) × falling       if 30 < RSI < 70
where function rand generates a random number in the range [0, 1] using uniform
distribution. For a share Si , the RSI is calculated as follows [78]:
RSI = 100 − 100 / (1 + RS)    (24)
A simple moving average (SMA) method [76] is used to compute relative strength
(RS) as follows:
RS = ( Σ_{i=1}^{K} P_i ) / ( Σ_{i=1}^{K} N_i )    (25)
where Pi and Ni are the upward and downward price changes, respectively. K indi-
cates the trading time frame of RSI. In the implementation of SETO, K is set to be
14 days (iterations). In the SETO algorithm, the price of shares is represented with
their fitness. Pi and Ni are computed as follows:
P_i = 1  if (f_i(t) − f_i(t−1)) > 0;  otherwise P_i = 0    (26)

N_i = 1  if (f_i(t−1) − f_i(t)) > 0;  otherwise N_i = 0    (27)
where fi (t) and fi (t − 1) are the fitness in the current and previous iterations, respec-
tively. Here, the fitness corresponds to the close price of the share. If the previous
fitness is the same as the last fitness, both P_i and N_i are set to zero. The RSI rises as the number of positive closes increases, and it falls as the number of losses increases.
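Eqs. (24)-(27) can be sketched as a single function over a share's fitness history (a maximization setting is assumed, so a fitness increase counts as an upward close; the function name and signature are ours):

```python
def rsi(history, K=14):
    """RSI of a share from its last K fitness changes (Eqs. 24-27).

    history holds the share's fitness at successive iterations; ties
    contribute to neither P nor N, as in the text."""
    h = history[-(K + 1):]                             # K changes need K+1 points
    P = sum(1 for a, b in zip(h, h[1:]) if b - a > 0)  # upward closes, Eq. (26)
    N = sum(1 for a, b in zip(h, h[1:]) if a - b > 0)  # downward closes, Eq. (27)
    rs = P / N if N else float('inf')                  # Eq. (25)
    return 100.0 - 100.0 / (1.0 + rs)                  # Eq. (24)
```

A monotonically improving share saturates at RSI = 100 and triggers the falling phase, while a steadily worsening one drops to RSI = 0 and triggers rising.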
3.2.6 Stop condition
Until termination conditions are met, the algorithm iterates the rising, falling, and
exchange phases on the population. Finally, the fittest share is returned as an opti-
mal solution for the problem. The following termination conditions are considered
to stop the algorithm:
t = 0;
while (t ≤ G) do
    for i = 1 to N do
        if (t ≥ K and Si(t).RSI ≤ 30) then
            [S] = Rising(Si(t), Sg);
        else if (t ≥ K and Si(t).RSI ≥ 70) then
            [S] = Falling(Si(t));
        else
            r = rand;
            if (r > 0.5) then
                [S] = Rising(Si(t), Sg);
            else
                [S] = Falling(Si(t));
            end
        end
        // Exchange phase
        [S] = Exchange(S);
        // RSI calculation
        if (t ≥ K) then
            Compute Pi by Eq. (26);
            Compute Ni by Eq. (27);
            Calculate RSI by Eq. (24);
            Si(t).RSI = RSI;
        end
        Update the best solution Sg;
    end
    t = t + 1;
end
Return the fittest share Sg and its fitness;
To show the functioning of the SETO, it is benchmarked using the peak function.
The purpose is to show how the shares move around the search space and gradually
converge to the global optimum. The peak function is defined as follows:
f(x, y) = x e^{−(x^2 + y^2)},   −2 ≤ x, y ≤ 2    (28)
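To make the walkthrough concrete, here is a deliberately simplified, self-contained sketch of the SETO loop maximizing the peak function of Eq. (28). The RSI bookkeeping and the falling equation are not reproduced in this excerpt, so falling is replaced here by a small random perturbation; all names and constants are our assumptions, not the paper's implementation.

```python
import math
import random

def peak(x, y):
    """Eq. (28): f(x, y) = x * exp(-(x^2 + y^2)) on -2 <= x, y <= 2."""
    return x * math.exp(-(x ** 2 + y ** 2))

def mini_seto(obj, lb=-2.0, ub=2.0, n=25, iters=200, seed=1):
    """Toy SETO-style maximizer: shares either rise toward the global best
    (cf. Eq. 9) or take a bounded local perturbation (stand-in for falling)."""
    rng = random.Random(seed)
    shares = [[rng.uniform(lb, ub), rng.uniform(lb, ub)] for _ in range(n)]
    best = max(shares, key=lambda s: obj(*s))[:]
    for _ in range(iters):
        for s in shares:
            if rng.random() < 0.5:
                # rising: pull the share toward the global best
                for j in range(2):
                    s[j] += rng.uniform(0.0, 1.0) * (best[j] - s[j])
            else:
                # stand-in for falling: small Gaussian move, clipped to bounds
                for j in range(2):
                    s[j] = min(ub, max(lb, s[j] + rng.gauss(0.0, 0.1)))
        cand = max(shares, key=lambda s: obj(*s))
        if obj(*cand) > obj(*best):
            best = cand[:]
    return best, obj(*best)
```

The global maximum of the peak function is 1/sqrt(2) * exp(-1/2) ≈ 0.4289 at (1/sqrt(2), 0), and the sketch above converges to its vicinity.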
respectively. Initially, the shares are scattered throughout the solution space, and none of them is at the global optimum. In the 5th iteration, one share is close to the global optimum, while the other shares are located at local optima. In the 10th iteration, the best share is closer to the global optimum, and in the 15th iteration, most of the shares are closer to the global optimum. Finally, at the 20th iteration, the majority of shares converge to the global optimum.
4 Experiments
4.1 Test problems
To investigate the precision, convergence speed, and search capability of the pro-
posed SETO and comparison algorithms, forty well-studied test problems are cho-
sen from the literature [4, 7, 18, 25, 80, 81]. This test set covers four classes of func-
tions as follows:
• Group I: F1−F10 are fixed-dimension problems. This test set investigates the local optimum avoidance capacity of algorithms in solving problems with a fixed number of variables [18].
• Group II: F11−F22 are single-objective unimodal functions. These test cases have a unique global best in their landscape. They are considered to measure the exploitation (intensification) ability of the algorithms [18, 80].
• Group III: F23−F32 are multimodal functions that contain multiple local optima in their landscape. The dimensionality and multiple local optima make multimodal functions more difficult and more complex to optimize. This group of functions is considered to reveal the local-optimum avoidance and exploration (diversification) capability of optimization algorithms [7].
• Group IV: F33−F40 are shifted, rotated, hybrid, and composite functions. This test set is drawn from the CEC 2018 competition [81] on single-objective real-parameter numerical optimization. These functions evaluate the reliability and accuracy of the algorithms and their ability to balance exploration and exploitation.
ID | Name | Range | D | Global optimum
F1 | Adjiman | [−1, 2] | 2 | −2.02181
F2 | Bartels Conn | [−500, 500] | 2 | 1
F3 | Brent | [−10, 10] | 2 | 0
F4 | Bukin 6 | [(−15, −5), (−5, −3)] | 2 | 180.3276
F5 | Easom | [−100, 100] | 2 | −1
F6 | Egg Crate | [−5, 5] | 2 | 0
F7 | Matyas | [−10, 10] | 2 | 0
F8 | Schaffer N. 4 | [−100, 100] | 2 | 0.292579
F9 | Three-Hump Camel | [−5, 5] | 2 | 0
F10 | Zettle | [−5, 10] | 2 | −0.00379
ID | Name | Range | D | Optimum
F33 | Shifted and Rotated Rastrigin's Function (CEC4) | [−100, 100] | 10 | 400
F34 | Shifted and Rotated Lunacek Bi-Rastrigin Function (CEC6) | [−100, 100] | 10 | 600
F35 | Shifted and Rotated Non-Continuous Rastrigin's Function (CEC7) | [−100, 100] | 10 | 700
F36 | Shifted and Rotated Schwefel's Function (CEC9) | [−100, 100] | 10 | 900
F37 | Hybrid Function 1 (N = 3) (CEC10) | [−100, 100] | 10 | 1000
F38 | Hybrid Function 6 (N = 4) (CEC15) | [−100, 100] | 10 | 1500
F39 | Composite Function 1 (N = 3) (CEC20) | [−100, 100] | 10 | 2000
F40 | Composite Function 6 (N = 5) (CEC25) | [−100, 100] | 10 | 2500
4.2 Comparison algorithms
4.3 Experimental setting
The experiments were performed using MATLAB 2016b on a laptop with 8 GB main memory and a 64-bit Intel(R) Core(TM) i7 2.2 GHz processor. For all the algorithms, the population size (N) was set to 25 and the maximum number of fitness function evaluations (FEs) was set to 10^3 × D, where D indicates the dimension of the problem; the maximum iteration number (G) is bounded by these settings. The configuration of control
parameters for comparison algorithms is summarized in Table 6. The parameters are
tuned as recommended in the corresponding literature. Most of the control parameters of SETO are already known and are configured using data drawn from the stock exchange and scientific resources on technical analysis. This makes SETO quite easy to implement and execute. In the current implementation of the SETO algorithm, the only parameter that needs to be adjusted is the
initial number of traders (T). As given in Table 6, the parameter T is set to 100. Dif-
ferent values of the variable T do not affect the performance of the algorithm. The
parameter T is used to calculate the ratio of buyers to sellers (pc) and the ratio of
sellers to buyers (nc). The values of pc and nc do not change significantly as the total
number of traders increases or decreases. These parameters are limited to a value in
the range [0, 2]. Regarding population size, it is obvious that with increasing popu-
lation size, the performance of optimization algorithms improves, but also the exe-
cution time of the algorithms increases. However, the population size is considered
the same for all algorithms. For a fair comparison, the basic standard versions of the algorithms are used in the tests. We used the source codes published by the authors and
customized them to be compatible with our experimental configuration. The quality
of solutions reported by the algorithms is measured by the mean (Mean) and the standard deviation (Std). In an ideal state, the Mean equals the global optimum of the problem, and the Std is 0. As the Std increases, the reliability of the algorithm
decreases. To obtain the statistical results, the algorithms were executed 30 times on
each test problem following the experimental instructions provided in [18, 84]. The
results at each run are recorded to calculate the mean and the standard deviation of
the best solutions found in 30 independent runs.
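The 30-run protocol can be sketched as follows; `random_search` is a toy stand-in optimizer (hypothetical), not any of the compared algorithms:

```python
import random
import statistics

def evaluate(optimizer, runs=30):
    """Run an optimizer independently `runs` times and report the Mean and
    Std of the best objective values, as in the experimental protocol."""
    best_values = [optimizer(seed=r) for r in range(runs)]
    return statistics.mean(best_values), statistics.stdev(best_values)

def random_search(seed, evals=1000):
    """Toy minimizer of f(x) = x^2 on [-5, 5] by pure random sampling."""
    rng = random.Random(seed)
    return min(rng.uniform(-5.0, 5.0) ** 2 for _ in range(evals))
```
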
Table 7 Statistical results on 30D fixed-dimension functions

| Function | GA (Mean ± Std) | PSO (Mean ± Std) | GSA (Mean ± Std) | SCA (Mean ± Std) |
|---|---|---|---|---|
| F1 | −2.01E−01 ± 4.30E−03 ⊙ | −2.02E−01 ± 9.11E−16 ⊙ | −2.02E−01 ± 4.45E−04 ⊙ | −2.02E−01 ± 3.35E−10 ⊙ |
| F2 | 1.00E+00 ± 4.17E−02 ⊙ | **1.00E+00 ± 0.00E+00** ⊙ | 1.00E+00 ± 1.19E−04 ⊙ | **1.00E+00 ± 0.00E+00** ⊙ |
| F3 | 2.25E−05 ± 1.68E−04 ⊖ | 1.38E−87 ± 5.17E−139 ⊙ | 9.11E−06 ± 1.69E−05 ⊖ | 1.38E−87 ± 3.11E−105 ⊙ |
| F4 | **1.80E+02 ± 0.00E+00** ⊙ | **1.80E+02 ± 0.00E+00** ⊙ | 1.80E+02 ± 4.38E−04 ⊙ | **1.80E+02 ± 0.00E+00** ⊙ |
| F5 | −1.00E+00 ± 0.00E+00 ⊕ | −1.00E+00 ± 0.00E+00 ⊕ | −1.00E+00 ± 0.00E+00 ⊕ | −1.00E+00 ± 3.51E−04 ⊕ |
| F6 | 8.03E−1 ± 9.30E−18 ⊖ | 1.50E−130 ± 7.93E−129 ⊖ | 4.92E−21 ± 3.16E−18 ⊖ | 6.29E−160 ± 5.19E−159 ⊖ |
| F7 | 2.06E−21 ± 5.32E−19 ⊖ | 1.53E−65 ± 2.78E−61 ⊖ | 2.39E−29 ± 1.22E−31 ⊖ | 2.41E−85 ± 2.50E−86 ⊖ |
| F8 | 2.17E−01 ± 1.62E−02 ⊖ | 2.93E−01 ± 4.01E−17 ⊙ | 2.01E−01 ± 1.50E−02 ⊖ | 2.93E−01 ± 3.79E−09 ⊙ |
| F9 | 2.70E−94 ± 3.17E−95 ⊖ | 3.50E−68 ± 4.58E−64 ⊖ | 1.44E−22 ± 9.60E−22 ⊖ | 2.13E−140 ± 3.60E−139 ⊖ |
| F10 | −3.79E−03 ± 1.91E−02 ⊙ | −3.79E−03 ± 6.47E−15 ⊙ | −3.79E−03 ± 5.02E−17 ⊙ | −3.79E−03 ± 3.97E−10 ⊙ |
| ⊖ | 4 | 3 | 4 | 3 |
| ⊕ | 1 | 1 | 1 | 1 |
| ⊙ | 5 | 6 | 5 | 6 |
Table 8 Statistical results of 30D unimodal functions

| Function | GA (Mean ± Std) | PSO (Mean ± Std) | GSA (Mean ± Std) | SCA (Mean ± Std) |
|---|---|---|---|---|
| F11 | 3.20E+01 ± 1.17E+00 ⊖ | 2.74E+01 ± 1.02E+01 ⊖ | 6.10E−18 ± 4.59E−19 ⊖ | 4.18E−07 ± 2.98E−06 ⊖ |
| F12 | 7.93E+01 ± 2.94E+01 ⊖ | 4.31E+01 ± 9.22E+00 ⊖ | 6.73E−01 ± 3.07E−02 ⊖ | 2.19E+00 ± 1.42E+00 ⊖ |
| F13 | 9.01E+03 ± 9.77E+02 ⊖ | 8.44E+02 ± 3.50E+02 ⊖ | 4.02E−03 ± 8.90E−02 ⊖ | 2.60E+00 ± 6.32E+01 ⊖ |
| F14 | 3.88E−09 ± 7.66E−10 ⊖ | 5.17E−11 ± 3.08E−12 ⊖ | 3.06E−18 ± 8.05E−19 ⊖ | 3.91E−10 ± 5.66E−10 ⊖ |
| F15 | 1.44E+04 ± 2.56E+04 ⊖ | 2.36E+03 ± 1.08E+04 ⊖ | 2.89E+01 ± 1.40E+01 ⊖ | 6.92E+01 ± 8.01E+01 ⊖ |
| F16 | 4.71E−05 ± 2.70E−05 ⊖ | 7.33E−07 ± 2.96E−06 ⊖ | 3.16E−09 ± 9.25E−09 ⊖ | 3.11E−05 ± 2.98E−05 ⊖ |
| F17 | 2.05E+01 ± 7.30E+00 ⊖ | 3.30E−02 ± 7.92E−01 ⊖ | 6.41E−03 ± 3.77E−02 ⊖ | 2.11E+01 ± 1.13E+01 ⊖ |
| F18 | 6.12E−01 ± 4.47E−01 ⊖ | 2.17E−02 ± 3.99E−02 ⊖ | 4.70E+01 ± 2.08E+01 ⊖ | 9.11E−06 ± 3.55E−07 ⊖ |
| F19 | 1.54E−03 ± 4.39E−02 ⊖ | 4.17E−15 ± 2.78E−14 ⊖ | 9.15E−87 ± 3.08E−88 ⊖ | 2.19E+03 ± 7.33E+03 ⊖ |
| F20 | 4.15E−20 ± 3.91E−20 ⊖ | 3.07E−13 ± 4.79E−14 ⊖ | 5.13E−16 ± 9.70E−17 ⊖ | 8.53E−04 ± 1.78E−05 ⊖ |
| F21 | 5.35E−01 ± 2.70E+02 ⊖ | 7.15E−01 ± 2.68E+01 ⊖ | 9.66E−17 ± 4.37E−18 ⊖ | 6.91E−04 ± 2.73E−04 ⊖ |
| F22 | 4.14E−55 ± 2.17E−48 ⊖ | 6.09E−102 ± 2.81E−107 ⊖ | 1.56E−41 ± 2.40E−42 ⊖ | 5.32E−190 ± 1.91E−184 ⊖ |
| ⊖ | 12 | 12 | 12 | 12 |
| ⊕ | 0 | 0 | 0 | 0 |
| ⊙ | 0 | 0 | 0 | 0 |

| Function | SELO (Mean ± Std) | HBO (Mean ± Std) | LFD (Mean ± Std) | SETO (Mean ± Std) |
|---|---|---|---|---|
| F11 | 5.22E−78 ± 9.40E−81 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 9.22E−01 ± 1.61E−01 ⊖ | 0.00E+00 ± 0.00E+00 |
| F12 | 9.73E−01 ± 5.02E−04 ⊖ | 6.67E−01 ± 3.17E−04 ⊖ | 9.99E−01 ± 4.87E−02 ⊖ | 6.66E−01 ± 4.61E−90 |
| F13 | 7.38E−02 ± 5.49E−02 ⊖ | 5.67E−07 ± 1.04E−07 ⊖ | 4.52E−06 ± 2.33E−08 ⊖ | 0.00E+00 ± 0.00E+00 |
| F14 | 4.70E−25 ± 3.76E−24 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 1.61E−05 ± 1.14E−04 ⊖ | 0.00E+00 ± 0.00E+00 |
| F15 | 2.90E+01 ± 1.90E+01 ⊖ | 7.76E+01 ± 4.02E+01 ⊖ | 2.94E−02 ± 6.32E−01 ⊖ | 2.86E+01 ± 0.00E+00 |
| F16 | 3.66E−20 ± 7.70E−20 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 1.46E−01 ± 2.04E−01 ⊖ | 0.00E+00 ± 0.00E+00 |
| F18 | 5.24E−63 ± 6.14E−62 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 6.40E−32 ± 5.28E−33 ⊖ | 0.00E+00 ± 0.00E+00 |
| F19 | 4.85E−101 ± 1.16E−102 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 1.11E−40 ± 8.76E−41 ⊖ | 0.00E+00 ± 0.00E+00 |
| F20 | 3.54E−55 ± 1.67E−57 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 7.25E−05 ± 4.75E−25 ⊖ | 0.00E+00 ± 0.00E+00 |
| F21 | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | 5.00E−04 ± 2.71E−04 ⊖ | 0.00E+00 ± 0.00E+00 |
| F22 | 1.26E−94 ± 7.19E−96 ⊖ | 4.34E−232 ± 0.00E+00 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 0.00E+00 ± 0.00E+00 |
| ⊖ | 11 | 5 | 11 | – |
| ⊕ | 0 | 0 | 0 | – |
| ⊙ | 1 | 7 | 1 | – |
Table 9 Statistical results of 30D multimodal functions

| Function | GA (Mean ± Std) | PSO (Mean ± Std) | GSA (Mean ± Std) | SCA (Mean ± Std) |
|---|---|---|---|---|
| F23 | 1.70E+01 ± 2.57E+00 ⊖ | 2.30E+00 ± 5.54E+01 ⊖ | 7.20E−10 ± 6.31E−10 ⊖ | 1.25E+01 ± 7.42E+00 ⊖ |
| F24 | 5.10E−03 ± 8.11E−02 ⊖ | 7.90E−02 ± 8.30E−01 ⊖ | 3.22E−10 ± 1.55E−11 ⊖ | 3.50E−02 ± 7.80E−02 ⊖ |
| F25 | 1.16E+01 ± 4.33E+00 ⊖ | 1.16E−02 ± 5.33E−03 ⊖ | 5.42E−04 ± 8.12E−04 ⊖ | 3.81E−01 ± 5.12E−01 ⊖ |
| F26 | 6.80E−01 ± 2.05E−03 ⊖ | 1.00E+00 ± 3.59E−09 ⊖ | 1.00E+00 ± 7.36E−15 ⊖ | 2.17E+00 ± 5.02E+00 ⊖ |
| F27 | 8.37E+00 ± 4.68E+00 ⊖ | 1.46E+01 ± 2.07E+01 ⊖ | 1.63E+01 ± 5.18E+00 ⊖ | 1.04E+01 ± 1.53E+01 ⊖ |
| F28 | 9.12E−01 ± 4.29E−02 ⊖ | 4.35E−01 ± 3.09E−02 ⊖ | 1.40E+00 ± 3.30E−01 ⊖ | 2.10E−01 ± 4.87E−02 ⊖ |
| F29 | 4.32E+00 ± 2.25E+00 ⊕ | 7.51E+00 ± 2.41E+00 ⊕ | 6.56E+03 ± 7.13E+02 ⊖ | 9.39E+01 ± 1.70E+01 ⊖ |
| F30 | 3.75E+03 ± 6.56E+01 ⊖ | 5.67E+04 ± 3.87E+02 ⊖ | 7.13E−04 ± 6.64E−03 ⊖ | 3.64E−04 ± 9.25E−05 ⊖ |
| F31 | 5.52E−11 ± 3.14E+11 ⊖ | 1.04E−09 ± 4.97E−12 ⊖ | 9.14E−12 ± 3.32E−15 ⊖ | 7.16E−11 ± 1.90E−12 ⊖ |
| F32 | 7.41E−09 ± 6.63E−08 ⊖ | 6.18E−11 ± 2.57E−10 ⊖ | 4.60E−31 ± 2.77E−30 ⊖ | 7.40E−11 ± 1.27E−10 ⊖ |
| ⊖ | 9 | 9 | 10 | 10 |
| ⊕ | 1 | 1 | 0 | 0 |
| ⊙ | 0 | 0 | 0 | 0 |
Table 10 Statistical results on 10D group IV test functions

| Function | GA (Mean ± Std) | PSO (Mean ± Std) | GSA (Mean ± Std) | SCA (Mean ± Std) |
|---|---|---|---|---|
| F33 | 4.08E+02 ± 5.18E+00 ⊖ | 4.05E+02 ± 7.95E+00 ⊖ | 4.05E+02 ± 4.71E+00 ⊖ | 4.29E+02 ± 1.51E+01 ⊖ |
| F34 | 6.41E+02 ± 1.25E+01 ⊖ | 6.31E+02 ± 3.40E+00 ⊖ | 6.19E+02 ± 6.03E+00 ⊖ | 6.18E+02 ± 5.52E+00 ⊖ |
| F35 | 7.27E+02 ± 3.59E+00 ⊖ | 7.42E+02 ± 6.63E+00 ⊖ | 7.39E+02 ± 2.92E+00 ⊖ | 7.63E+02 ± 4.92E+00 ⊖ |
| F36 | 9.03E+02 ± 4.70E+00 ⊖ | **9.00E+02 ± 0.00E+00** ⊙ | **9.00E+02 ± 0.00E+00** ⊙ | 9.56E+02 ± 3.97E+01 ⊖ |
| F37 | 1.68E+03 ± 9.80E+01 ⊖ | 1.63E+03 ± 1.89E+02 ⊖ | 2.53E+03 ± 2.18E+02 ⊖ | 2.24E+03 ± 1.07E+02 ⊖ |
| F38 | 1.76E+03 ± 1.28E+02 ⊖ | 1.80E+03 ± 1.99E+02 ⊖ | 9.82E+03 ± 1.32E+03 ⊖ | 2.13E+03 ± 2.16E+02 ⊖ |
| F39 | 2.10E+03 ± 1.76E+01 ⊖ | 2.16E+03 ± 4.91E+01 ⊖ | 2.19E+03 ± 4.17E+01 ⊖ | 2.07E+03 ± 1.94E+01 ⊖ |
| F40 | 2.63E+03 ± 6.23E+01 ⊖ | 2.65E+03 ± 9.24E+01 ⊖ | 2.67E+03 ± 4.30E+00 ⊖ | 2.62E+03 ± 9.71E+00 ⊖ |
| ⊖ | 8 | 7 | 7 | 8 |
| ⊕ | 0 | 0 | 0 | 0 |
| ⊙ | 0 | 1 | 1 | 0 |

| Function | SELO (Mean ± Std) | HBO (Mean ± Std) | LFD (Mean ± Std) | SETO (Mean ± Std) |
|---|---|---|---|---|
| F33 | 4.02E+02 ± 1.32E+00 ⊖ | **4.00E+02 ± 0.00E+00** ⊙ | **4.02E+02 ± 2.15E+00** ⊙ | 4.00E+02 ± 0.00E+00 |
| F34 | 6.02E+02 ± 1.43E+01 ⊖ | **6.00E+02 ± 0.00E+00** ⊙ | 6.05E+02 ± 6.22E+00 ⊖ | 6.00E+02 ± 0.00E+00 |
| F35 | 7.13E+02 ± 6.90E+00 ⊖ | 7.10E+02 ± 3.79E−17 ⊖ | 7.26E+02 ± 3.50E+01 ⊖ | 7.00E+02 ± 2.41E−15 |
| F36 | **9.00E+02 ± 0.00E+00** ⊙ | **9.00E+02 ± 0.00E+00** ⊙ | **9.00E+02 ± 0.00E+00** ⊙ | 9.00E+02 ± 0.00E+00 |
| F37 | 1.62E+03 ± 5.20E+01 ⊖ | **1.00E+03 ± 0.00E+00** ⊙ | 1.34E+03 ± 6.76E+01 ⊖ | 1.00E+03 ± 0.00E+00 |
| F38 | 1.68E+03 ± 4.30E+01 ⊖ | 1.60E+03 ± 1.15E+02 ⊖ | 1.84E+03 ± 4.30E+01 ⊖ | 1.58E+03 ± 5.60E+01 |
| F39 | 2.04E+03 ± 2.60E+01 ⊖ | 2.00E+03 ± 6.40E−01 ⊕ | 2.05E+03 ± 1.13E+02 ⊖ | 2.01E+03 ± 7.22E+01 |
| F40 | 2.63E+03 ± 3.90E+01 ⊖ | 2.56E+03 ± 8.30E+01 ⊕ | 2.55E+03 ± 6.70E+01 ⊕ | 2.58E+03 ± 7.64E+01 |
| ⊖ | 7 | 2 | 5 | – |
| ⊕ | 0 | 2 | 1 | – |
| ⊙ | 1 | 4 | 2 | – |
• In the case of fixed-dimension test cases, the SETO and HBO take 1st rank for
all test functions in terms of best mean results. However, in terms of std, the
first position belongs to SETO, which shows its stable convergence behavior in
solving fixed-dimension problems. Both SCA and PSO attain third rank among
others. GA, PSO, GSA, SCA, SELO, HBO, LFD, and SETO, respectively, gen-
erate 5, 7, 5, 7, 6, 9, 6, and 9 best mean results out of the total 10 functions.
From the results given in Table 7, it is evident that both SETO and HBO have
excellent exploitation ability; however, SETO is more stable than HBO. The high
exploitation power of SETO is due to two reasons. First, the algorithm updates
the position of shares in the search space if the next positions are better than
precedent positions. Second, shares move toward the best solution from different
directions at each generation that helps them jump out of local optima. Figure 5a
illustrates the results of the Friedman mean rank test [85] on fixed-dimension
functions. The Friedman mean rank value of SETO is minimum, which shows
that it obtains 1st rank compared with other algorithms.
• The results reported by SETO in solving unimodal functions are superior. It
generates the best mean results in all test functions. The second rank belongs to
HBO with 7 best mean results out of the total 12. This confirms that SETO has
superior exploitation power and convergence speed in solving unimodal func-
tions. GA, PSO, GSA, SCA, SELO, HBO, LFD, and SETO, respectively, gen-
erate 0, 0, 0, 0, 1, 7, 1, and 12 best mean results out of the total 12 functions.
Inspecting the std values shows that SETO attains the best standard deviations
among other algorithms, which confirms its stability in the searching process.
Figure 5b shows the results of the Friedman test on unimodal functions. As
shown in the plot, SETO obtains the best mean rank among others.
• As shown in Table 9, SETO is very powerful in solving multimodal functions. It
generates the best mean results for all test functions except F29. Inspecting the
results, we conclude that SETO significantly outperforms its counterparts due to
its high exploration power. The reason for this success lies in the position updat-
ing mechanism in the rising phase, in which the shares jump out of the local
optima and move toward the best solution from different directions. GA, PSO,
GSA, SCA, SELO, HBO, LFD, and SETO, respectively, generate 1, 1, 0, 0, 1,
3, 2, and 9 best mean results out of the total 10 functions. As shown in Fig. 5c,
the SETO attains 1st position and HBO 2nd rank among all algorithms on multi-
modal functions.
• The performance of SETO in solving group IV shifted and rotated, hybrid and
composite functions is superior, and it outperformed other algorithms on F33–
F38 functions. For F39 and F40, HBO and LFD generate the best mean results,
respectively. The mean results for F39 and F40, where SETO is not the top per-
former algorithm, are still very comparable and competitive to the best results
attained by HBO and LFD. As illustrated in Fig. 5d, SETO attains the best mean
rank among others in solving group IV functions. This confirms that SETO can
provide a proper balance between exploitation and exploration mechanisms in
solving complex and difficult problems. GA, PSO, GSA, SCA, SELO, HBO,
LFD, and SETO, respectively, generate 0, 1, 1, 0, 1, 5, 3, and 6 best mean results
out of the total 8 functions.
Fig. 5 Friedman mean rank of the comparison algorithms on a fixed-dimension, b unimodal, c multimodal, and d group IV test functions
The key factor to efficient search is the proper harmonization between exploration
(diversification) and exploitation (intensification). In the SETO algorithm, the ris-
ing operator is responsible for exploring the search space, and the falling operator is
responsible for exploiting the promising areas. The rising operator directs the search
agents (shares) in the solution space to explore unvisited areas and finds the prom-
ising areas, whilst the falling operator tries to carefully examine the inside of the
promising areas via accumulated local knowledge. The rising operator moves the
solutions far from the current search area so that exploratory moves reach every
region of the search space at least once; the falling operator, in contrast, uses local
experience to force the solutions to converge quickly without wasting too many
moves. The results confirm that SETO can provide a proper
balance between exploitation and exploration mechanisms in the search and optimi-
zation process.
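The division of labor described above can be caricatured in code. The update rules below are not SETO's published equations (which are not reproduced in this section); they are hypothetical stand-ins that only illustrate how a long-range exploratory move differs from a short-range exploitative one:

```python
import random

def rising_move(x, best, scale=1.0):
    """Exploration sketch: jump around the best solution from a random
    direction (hypothetical stand-in for the rising operator)."""
    return [b + random.uniform(-scale, scale) * (b - xi)
            for xi, b in zip(x, best)]

def falling_move(x, best, scale=0.1):
    """Exploitation sketch: take a small step toward the best solution
    (hypothetical stand-in for the falling operator)."""
    return [xi + scale * (b - xi) for xi, b in zip(x, best)]
```

With a large `scale`, rising moves scatter candidates widely around the incumbent; falling moves shrink the distance to it by a fixed fraction per step.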
Figure 6 presents the mean and overall ranks of comparison algorithms computed
by the nonparametric Friedman test [85] on all benchmark functions. The results
reveal that SETO obtains 1st overall rank and HBO obtains 2nd rank among all algo-
rithms. The third and fourth ranks belong to SELO and LFD, respectively; the
difference between them is minute. GA is ranked last.
This phenomenon suggests that the introduction of new algorithms or the improve-
ment of existing ones is needed to solve classic and modern optimization problems.
Table 11 presents the results of the multi-problem-based Wilcoxon signed-rank
test [85] at significance level α = 0.05 for the benchmark functions. This test is performed
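Both the Friedman and the multi-problem Wilcoxon signed-rank tests are available off the shelf in SciPy. The snippet below runs them on invented per-function results for three hypothetical optimizers (the data are for illustration only, not taken from the tables above):

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
# Hypothetical best-mean errors of three optimizers on 10 benchmark functions
algo_a = rng.uniform(0.0, 1.0, size=10)
algo_b = algo_a + rng.uniform(0.05, 0.15, size=10)   # consistently worse than algo_a
algo_c = algo_a + rng.normal(0.0, 0.01, size=10)     # statistically similar to algo_a

# Friedman test ranks all algorithms across all problems at once
f_stat, f_p = friedmanchisquare(algo_a, algo_b, algo_c)

# Multi-problem Wilcoxon signed-rank test for one pairwise comparison
w_stat, w_p = wilcoxon(algo_a, algo_b)

alpha = 0.05
print(f"Friedman p={f_p:.4g}, Wilcoxon p={w_p:.4g}, alpha={alpha}")
```

A p-value below α rejects the hypothesis that the compared algorithms perform equally; here the consistently worse `algo_b` triggers rejection in both tests.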
4.5 Scalability analysis
The convergence speed and optimization ability of algorithms typically decrease as
the dimension of problems increases. To investigate this issue, we performed a series
of tests on 1000-dimension benchmark functions to evaluate the scalability of algo-
rithms. The experiments are performed on scalable unimodal functions F11–F22
and multimodal functions F23–F32. The algorithms terminate when they reach the
global optimum point, or they have failed to find a better solution than the exist-
ing solution during the last 50,000 FEs. The results are listed in Tables 13 and 14.
From the results, it can be concluded that SETO attains the best mean results on
all 1000-dimension problems except F15, for which its mean result is still very
competitive with the best one. SELO, LFD, and HBO also perform well; however,
their gap with SETO is considerable. The results confirm the superior
scalability of SETO compared with its counterparts. Figure 7 illustrates the execu-
tion time consumed by algorithms to reach the global optimum. From the figure, we
observe that SETO takes less execution time than the other algorithms on most test
functions. SETO performs exploration and exploitation at the same time and converges
faster; therefore, it requires less search time than its counterparts. After reaching
the global optimum, the solutions do not change, and according to the termination
conditions mentioned in Sect. 3.2.6, the algorithm stops.
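The stopping rule used in these scalability tests can be sketched as follows. The inner random draw is only a placeholder for one SETO position update (whose equations are not reproduced here); the termination logic is what mirrors the protocol above:

```python
import random

def optimize(fitness, global_optimum, stall_budget=50_000, tol=1e-12):
    """Run until the global optimum is reached, or until `stall_budget`
    consecutive function evaluations (FEs) bring no improvement."""
    best = float("inf")
    stalled = 0
    while True:
        x = random.uniform(-10.0, 10.0)   # placeholder for one SETO move
        f = fitness(x)
        if f < best - tol:
            best, stalled = f, 0          # improvement resets the stall counter
        else:
            stalled += 1
        if abs(best - global_optimum) <= tol or stalled >= stall_budget:
            return best
```

On a toy objective such as `lambda x: x * x`, the run typically ends by stagnation after `stall_budget` non-improving evaluations, returning a small but nonzero best value.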
4.6 Convergence test
Fig. 6 The mean and overall ranks of optimization algorithms computed by Friedman test for all benchmark functions
4.7 Computational complexity
4.7.1 Time complexity
The overall time complexity of SETO within one iteration in the worst case can be
calculated as
Since the cost of computing objective function varies for each optimization problem,
Eq. (29) can be revised as follows:
O(ND) + O(ND) + 2O(ND + ND) + 2O(N) ≈ O(ND)   if D > C
O(NC) + O(NC) + 2O(NC + NC) + 2O(N) ≈ O(NC)   otherwise   (30)
The overall time complexity of SETO is O(GND) or O(GNC) when the algorithm
iterates for G iterations. The overall time complexity of GA, PSO, GSA, SCA,
SELO, HBO, and LFD is O(GND) in the worst case. The time complexity of SETO is
therefore asymptotically equivalent to that of its counterparts, showing that SETO is
computationally competitive with the other algorithms.
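Read literally, Eq. (30) gives a simple per-iteration cost model. The helper below is our own illustrative translation of it (not part of the paper) and confirms that the count grows linearly in the population size N and in max(D, C):

```python
def seto_iteration_cost(N, D, C=1):
    """Operation count for one SETO iteration per Eq. (30): two O(N*unit)
    operator passes, two O(N*unit + N*unit) update/evaluation passes, and
    two O(N) scans, where unit = D if the dimension dominates the
    per-solution objective cost C, and unit = C otherwise."""
    unit = max(D, C)
    return N * unit + N * unit + 2 * (N * unit + N * unit) + 2 * N
```

Doubling N exactly doubles the count, matching the O(GND) (or O(GNC)) total over G iterations.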
4.7.2 Space complexity
The proposed SETO needs O(N × D) space to store population at each generation,
where N denotes the population size, and D is the number of dimensions of prob-
lems. Besides, the algorithm uses O(N) space to store the fitness of shares. The over-
all space complexity of the SETO is O(ND).
5 Engineering problems
Table 13 Statistical results of 1000D unimodal functions

| Function | GA (Mean ± Std) | PSO (Mean ± Std) | GSA (Mean ± Std) | SCA (Mean ± Std) |
|---|---|---|---|---|
| F11 | 8.66E+15 ± 1.52E+06 ⊖ | 4.40E+02 ± 7.51E+00 ⊖ | 8.10E+05 ± 3.86E+00 ⊖ | 4.86E+03 ± 6.84E+02 ⊖ |
| F12 | 2.53E+09 ± 8.96E+05 ⊖ | 2.06E+05 ± 2.75E+02 ⊖ | 1.07E+07 ± 5.54E+03 ⊖ | 4.28E+08 ± 6.55E+06 ⊖ |
| F13 | 9.31E+05 ± 8.64E+02 ⊖ | 9.44E+03 ± 9.80E+01 ⊖ | 2.08E+04 ± 9.34E+02 ⊖ | 5.23E−02 ± 1.70E−03 ⊖ |
| F14 | 2.97E−02 ± 6.00E−04 ⊖ | 2.92E+00 ± 5.00E−03 ⊖ | 1.45E−17 ± 6.35E−18 ⊖ | 3.19E+05 ± 3.25E+03 ⊖ |
| F15 | 1.12E+10 ± 6.47E+02 ⊖ | 1.95E+04 ± 3.65E+02 ⊖ | 1.51E+07 ± 9.17E+05 ⊖ | 2.16E+09 ± 6.32E+07 ⊖ |
| F16 | 4.28E+04 ± 1.28E+02 ⊖ | 4.76E+02 ± 8.47E+01 ⊖ | 5.71E+03 ± 6.02E+02 ⊖ | 1.06E+03 ± 3.27E+02 ⊖ |
| F17 | 9.78E+01 ± 1.65E+00 ⊖ | 9.95E−01 ± 5.00E−03 ⊖ | 3.18E+01 ± 1.52E+01 ⊖ | 9.91E+01 ± 1.60E+01 ⊖ |
| F18 | inf ⊖ | 4.83E+02 ± 1.39E+01 ⊖ | 6.31E+01 ± 3.12E+00 ⊖ | inf ⊖ |
| F19 | 3.54E+03 ± 1.56E+01 ⊖ | 8.27E+01 ± 5.00E−02 ⊖ | 4.62E+06 ± 6.15E+03 ⊖ | 1.51E+11 ± 3.65E+06 ⊖ |
| F20 | 2.64E+06 ± 5.61E+02 ⊖ | 3.11E+02 ± 3.50E+01 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 3.08E+05 ± 2.49E+04 ⊖ |
| F21 | 1.23E+07 ± 6.12E+03 ⊖ | 1.56E+05 ± 1.20E−02 ⊖ | 4.23E+05 ± 3.31E+02 ⊖ | 1.30E+06 ± 2.14E+04 ⊖ |
| F22 | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ |
| ⊖ | 11 | 11 | 10 | 11 |
| ⊕ | 0 | 0 | 0 | 0 |
| ⊙ | 1 | 1 | 2 | 1 |

| Function | SELO (Mean ± Std) | HBO (Mean ± Std) | LFD (Mean ± Std) | SETO (Mean ± Std) |
|---|---|---|---|---|
| F11 | 5.12E−04 ± 9.45E−05 ⊖ | 1.45E+01 ± 5.02E+00 ⊖ | 1.66E−06 ± 8.55E−08 ⊖ | 0.00E+00 ± 0.00E+00 |
| F12 | 1.00E+00 ± 6.00E−03 ⊖ | 1.56E+05 ± 1.26E+03 ⊖ | 1.00E+00 ± 4.15E−28 ⊖ | 6.67E−01 ± 0.00E+00 |
| F13 | 2.58E−05 ± 7.00E−04 ⊖ | 8.61E+02 ± 7.64E+00 ⊖ | 2.53E−06 ± 1.19E−08 ⊖ | 0.00E+00 ± 0.00E+00 |
| F14 | 5.66E−06 ± 6.48E−08 ⊖ | 7.19E−09 ± 1.75E−10 ⊖ | 1.77E−03 ± 2.56E−05 ⊖ | 0.00E+00 ± 0.00E+00 |
| F15 | 9.97E+02 ± 6.03E+01 ⊕ | 1.76E+03 ± 1.42E+02 ⊖ | 9.89E+02 ± 3.26E−02 ⊕ | 9.99E+02 ± 0.00E+00 |
| F16 | 2.47E−04 ± 3.60E−04 ⊖ | 2.05E+01 ± 6.56E+00 ⊖ | 1.98E−01 ± 3.90E−04 ⊖ | 0.00E+00 ± 0.00E+00 |
| F17 | 7.97E+01 ± 7.55E−01 ⊖ | 9.87E+01 ± 8.05E+00 ⊖ | 6.94E−04 ± 6.31E−07 ⊖ | 0.00E+00 ± 0.00E+00 |
Table 14 Statistical results of 1000D multimodal functions

| Function | GA (Mean ± Std) | PSO (Mean ± Std) | GSA (Mean ± Std) | SCA (Mean ± Std) |
|---|---|---|---|---|
| F23 | 2.09E+01 ± 5.19E+00 ⊖ | 3.83E+00 ± 1.70E−01 ⊖ | 9.41E+00 ± 2.15E+00 ⊖ | 2.08E+01 ± 2.16E+00 ⊖ |
| F24 | 2.39E+03 ± 1.26E+01 ⊖ | 2.95E+02 ± 2.30E+01 ⊖ | 3.79E+02 ± 5.56E+01 ⊖ | 2.82E+02 ± 2.37E+00 ⊖ |
| F25 | 2.29E+04 ± 7.52E+01 ⊖ | 7.00E−01 ± 0.00E+00 ⊖ | 6.64E+01 ± 1.53E+01 ⊖ | 5.78E+01 ± 7.01E+00 ⊖ |
| F26 | 3.80E+02 ± 5.80E+01 ⊖ | 2.43E+02 ± 1.30E−01 ⊖ | 1.69E+02 ± 1.08E+01 ⊖ | 3.61E+02 ± 2.60E+01 ⊖ |
| F27 | 1.56E+04 ± 8.24E+02 ⊖ | 4.64E+03 ± 6.70E+01 ⊖ | 3.78E+01 ± 2.40E+01 ⊖ | 2.66E+03 ± 9.60E+01 ⊖ |
| F28 | 1.62E+02 ± 1.14E+01 ⊖ | 1.80E+00 ± 6.00E−10 ⊖ | 3.61E+01 ± 7.68E+00 ⊖ | 6.15E+01 ± 6.06E+00 ⊖ |
| F29 | 6.31E+07 ± 9.52E+03 ⊖ | 2.84E+03 ± 4.60E+01 ⊖ | 3.13E+06 ± 3.55E+04 ⊖ | 7.57E+06 ± 5.82E+04 ⊖ |
| F30 | inf ⊖ | 1.46E+00 ± 3.20E−03 ⊖ | 1.32E+102 ± 1.56E+36 ⊖ | inf ⊖ |
| F31 | 4.10E−182 ± 6.30E−185 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | 0.00E+00 ± 0.00E+00 ⊙ | **0.00E+00 ± 0.00E+00** ⊙ |
| F32 | **0.00E+00 ± 0.00E+00** ⊙ | 0.00E+00 ± 0.00E+00 ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ |
| ⊖ | 9 | 8 | 8 | 8 |
| ⊕ | 0 | 0 | 0 | 0 |
| ⊙ | 1 | 2 | 2 | 2 |

| Function | SELO (Mean ± Std) | HBO (Mean ± Std) | LFD (Mean ± Std) | SETO (Mean ± Std) |
|---|---|---|---|---|
| F23 | 3.11E−04 ± 3.16E−07 ⊖ | 2.45E+00 ± 1.08E+00 ⊖ | 3.24E−04 ± 2.61E−04 ⊖ | −8.88E−16 ± 0.00E+00 |
| F24 | 1.99E−38 ± 4.74E−39 ⊖ | 5.82E−03 ± 3.64E−05 ⊖ | 2.58E−03 ± 3.20E−02 ⊖ | 0.00E+00 ± 0.00E+00 |
| F25 | 2.22E−16 ± 2.38E−17 ⊖ | 4.39E−05 ± 7.00E−06 ⊖ | 1.34E−07 ± 6.80E−08 ⊖ | 0.00E+00 ± 0.00E+00 |
| F26 | 9.00E−01 ± 5.53E−24 ⊙ | 8.74E+00 ± 2.10E+00 ⊖ | 9.00E−01 ± 0.00E+00 ⊙ | 9.00E−01 ± 0.00E+00 |
| F27 | 6.36E−04 ± 3.16E−05 ⊖ | 1.09E+03 ± 6.52E+02 ⊖ | 8.24E−06 ± 1.20E−05 ⊖ | 0.00E+00 ± 0.00E+00 |
| F28 | 3.00E−01 ± 5.60E−03 ⊖ | 1.44E+01 ± 2.50E+00 ⊖ | 1.21E−03 ± 4.80E−04 ⊖ | 0.00E+00 ± 0.00E+00 |
| F29 | 3.09E+03 ± 9.90E+01 ⊖ | 6.32E+03 ± 5.53E+02 ⊖ | 3.59E+03 ± 9.75E+02 ⊖ | 3.59E+01 ± 1.13E+01 |
| F30 | 1.56E−53 ± 3.51E−55 ⊖ | inf ⊖ | inf ⊖ | 0.00E+00 ± 0.00E+00 |
| F31 | 1.75E−143 ± 3.07E−148 ⊖ | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | 0.00E+00 ± 0.00E+00 |
| F32 | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | **0.00E+00 ± 0.00E+00** ⊙ | 0.00E+00 ± 0.00E+00 |
| ⊖ | 8 | 8 | 7 | – |
| ⊕ | 0 | 0 | 0 | – |
| ⊙ | 2 | 2 | 3 | – |
Fig. 7 Comparison of the execution time of algorithms in 1000D unimodal and multimodal functions
Figure 9a shows the structure of the three-bar truss design problem. This problem is
one of the most studied test cases in the literature [7, 36]. The objective is to
design a three-bar truss of minimal weight. The problem has two parameters: the
cross-sectional area of bars 1 and 3 (A1) and the area of bar 2 (A2). Three
constraints should be considered: stress, deflection, and buckling. The
problem is mathematically defined as follows:
let X⃗ = [x1, x2] = [A1, A2]
minimize f(X⃗) = l × (2√2·x1 + x2),   0 ≤ x1, x2 ≤ 1
subject to
g1(X⃗) = ((√2·x1 + x2) / (√2·x1² + 2·x1·x2))·P − σ ≤ 0
g2(X⃗) = (x2 / (√2·x1² + 2·x1·x2))·P − σ ≤ 0        (31)
g3(X⃗) = (1 / (√2·x2 + x1))·P − σ ≤ 0
where l = 100 cm, P = 2 kN/cm², and σ = 2 kN/cm²
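Eq. (31) translates directly into code. The sketch below (helper names are ours) evaluates the objective and the three stress constraints; a design is feasible when every g ≤ 0:

```python
from math import sqrt

L_BAR, P, SIGMA = 100.0, 2.0, 2.0   # l = 100 cm, P = sigma = 2 kN/cm^2

def truss_weight(x1, x2):
    """Objective of Eq. (31): total weight of the three-bar truss."""
    return L_BAR * (2.0 * sqrt(2.0) * x1 + x2)

def truss_constraints(x1, x2):
    """Stress constraints g1..g3 of Eq. (31); feasible when all g <= 0."""
    denom = sqrt(2.0) * x1 ** 2 + 2.0 * x1 * x2
    g1 = (sqrt(2.0) * x1 + x2) / denom * P - SIGMA
    g2 = x2 / denom * P - SIGMA
    g3 = 1.0 / (sqrt(2.0) * x2 + x1) * P - SIGMA
    return (g1, g2, g3)
```

At the best-known design reported in the literature (x1 ≈ 0.78867, x2 ≈ 0.40825), the weight is about 263.9 and the first stress constraint is numerically active.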
Figure 9b shows the schematic view of the rolling element bearing design prob-
lem. It is a maximization problem, which contains ten geometric variables and nine
design constraints to control the assembly and geometric-based restrictions [15].
Fig. 8 Convergence graphs and solution distributions of comparison algorithms on F6, F11, F28, F35,
and F40 test functions
subject to
g1(z⃗) = φo / (2·sin⁻¹(Db/Dm)) − Z + 1 ≤ 0,   g2(z⃗) = 2Db − KDmin(D − d) > 0,
g3(z⃗) = KDmax(D − d) − 2Db ≥ 0,   g4(z⃗) = ξBw − Db ≤ 0,
g5(z⃗) = Dm − 0.5(D + d) ≥ 0,   g6(z⃗) = (0.5 + e)(D + d) − Dm ≥ 0,
g7(z⃗) = 0.5(D − Dm − Db) − εDb ≥ 0,   g8(z⃗) = fi ≥ 0.515,   g9(z⃗) = fo ≥ 0.515
where
fc = 37.91·[1 + {1.04·((1 − γ)/(1 + γ))^1.72 · (fi(2fo − 1)/(fo(2fi − 1)))^0.41}^(10/3)]^(−0.3) · (γ^0.3·(1 − γ)^1.39 / (1 + γ)^(1/3)) · (2fi/(2fi − 1))^0.41
x = {(D − d)/2 − 3(T/4)}² + {D/2 − T/4 − Db}² − {d/2 + T/4}²
y = 2{(D − d)/2 − 3(T/4)}·{D/2 − T/4 − Db}
φo = 2π − cos⁻¹(x/y)
γ = Db/Dm,   fi = ri/Db,   fo = ro/Db,   T = D − d − 2Db
D = 160, d = 90, Bw = 30, ri = ro = 11.033
0.5(D + d) ≤ Dm ≤ 0.6(D + d),   0.15(D − d) ≤ Db ≤ 0.45(D − d),   4 ≤ Z ≤ 50,
0.515 ≤ fi, fo ≤ 0.6,   0.4 ≤ KDmin ≤ 0.5,   0.6 ≤ KDmax ≤ 0.7,
0.3 ≤ e ≤ 0.4,   0.02 ≤ ε ≤ 0.1,   0.6 ≤ ξ ≤ 0.85
(32)
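The fc term of Eq. (32), which drives the dynamic load capacity being maximized, can be coded directly. This is an illustrative helper (the function name is ours), useful for checking candidate designs:

```python
def load_capacity_factor(gamma, fi, fo):
    """The fc term of Eq. (32) for the rolling element bearing problem.

    gamma = Db/Dm, fi = ri/Db, fo = ro/Db (all dimensionless ratios).
    """
    a = 1.04 * ((1.0 - gamma) / (1.0 + gamma)) ** 1.72 \
        * (fi * (2.0 * fo - 1.0) / (fo * (2.0 * fi - 1.0))) ** 0.41
    bracket = (1.0 + a ** (10.0 / 3.0)) ** -0.3
    geometry = gamma ** 0.3 * (1.0 - gamma) ** 1.39 / (1.0 + gamma) ** (1.0 / 3.0)
    groove = (2.0 * fi / (2.0 * fi - 1.0)) ** 0.41
    return 37.91 * bracket * geometry * groove
```

For a mid-range γ and the lower groove-curvature bound fi = fo = 0.515, fc lands in the low tens, which sets the scale of the capacity objective.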
Table 16 summarizes the solutions obtained by the proposed SETO and comparison
algorithms for the rolling element bearing design problem. Inspecting the results in
Table 16, we conclude that the SETO obtains superior results compared with other
optimizers and exposes the best design.
Figure 9c shows a schematic view of the speed reducer design problem. The objec-
tive is to design a simple gearbox with the minimum weight that is embedded
between the propeller and the engine in light aircraft [15]. The problem consists of
constraints on surface stress, bending stress of the gear teeth, stresses in the shafts,
and transverse deflections of the shafts. The mathematical formulation of the prob-
lem is as follows [15]:
subject to
g1(x⃗) = 27/(x1·x2²·x3) − 1 ≤ 0,   g2(x⃗) = 397.5/(x1·x2²·x3²) − 1 ≤ 0,
g3(x⃗) = 1.93·x4³/(x2·x3·x6⁴) − 1 ≤ 0,   g4(x⃗) = 1.93·x5³/(x2·x3·x7⁴) − 1 ≤ 0,
g5(x⃗) = (1/(110·x6³))·√((745·x4/(x2·x3))² + 16.9×10⁶) − 1 ≤ 0,
g6(x⃗) = (1/(85·x7³))·√((745·x5/(x2·x3))² + 157.5×10⁶) − 1 ≤ 0,       (33)
g7(x⃗) = x2·x3/40 − 1 ≤ 0,   g8(x⃗) = 5·x2/x1 − 1 ≤ 0,
g9(x⃗) = x1/(12·x2) − 1 ≤ 0,   g10(x⃗) = (1.5·x6 + 1.9)/x4 − 1 ≤ 0,
g11(x⃗) = (1.1·x7 + 1.9)/x5 − 1 ≤ 0
where 2.6 ≤ x1 ≤ 3.6, 0.7 ≤ x2 ≤ 0.8, 17 ≤ x3 ≤ 28,
7.3 ≤ x4 ≤ 8.3, 7.3 ≤ x5 ≤ 8.3, 2.9 ≤ x6 ≤ 3.9,
5.0 ≤ x7 ≤ 5.5
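The eleven constraints of Eq. (33) in code form (an illustrative sketch; the vector layout follows the x1…x7 order above). Evaluated at a near-optimal design commonly reported in the literature, all constraints hold to within numerical tolerance:

```python
from math import sqrt

def speed_reducer_constraints(x):
    """Constraints g1..g11 of Eq. (33); a design is feasible when all g <= 0."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return [
        27.0 / (x1 * x2 ** 2 * x3) - 1.0,                                    # g1: bending stress of gear teeth
        397.5 / (x1 * x2 ** 2 * x3 ** 2) - 1.0,                              # g2: surface stress
        1.93 * x4 ** 3 / (x2 * x3 * x6 ** 4) - 1.0,                          # g3: deflection of shaft 1
        1.93 * x5 ** 3 / (x2 * x3 * x7 ** 4) - 1.0,                          # g4: deflection of shaft 2
        sqrt((745.0 * x4 / (x2 * x3)) ** 2 + 16.9e6) / (110.0 * x6 ** 3) - 1.0,   # g5: stress in shaft 1
        sqrt((745.0 * x5 / (x2 * x3)) ** 2 + 157.5e6) / (85.0 * x7 ** 3) - 1.0,   # g6: stress in shaft 2
        x2 * x3 / 40.0 - 1.0,                                                # g7
        5.0 * x2 / x1 - 1.0,                                                 # g8
        x1 / (12.0 * x2) - 1.0,                                              # g9
        (1.5 * x6 + 1.9) / x4 - 1.0,                                         # g10
        (1.1 * x7 + 1.9) / x5 - 1.0,                                         # g11
    ]
```

At x ≈ (3.5, 0.7, 17, 7.3, 7.715, 3.350, 5.287), the shaft-stress constraints g5 and g11 are numerically active, which is the expected signature of a good design.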
As shown in Table 17, HBO obtains the best results. With a slight difference from
HBO, the proposed SETO takes the second rank. Except for HBO, the proposed
SETO attains the best results compared to other optimizers, which confirms that it
can be a suitable choice for designing the speed reducer.
Pressure vessels are widely used in industry structures such as gas tanks and cham-
pagne bottles. The goal is to design a cylindrical vessel with the minimum fabrica-
tion cost. The problem consists of four design parameters including the thickness of
the head ( Ts ), the thickness of the body ( Th ), the inner radius (R), and the length of
the cylindrical section (L). Figure 9d shows the overall structure of the pressure ves-
sel design problem. The problem is mathematically defined as follows [4]:
let x⃗ = [x1, x2, x3, x4] = [Ts, Th, R, L], where 0 ≤ x1, x2 ≤ 99 and 10 ≤ x3, x4 ≤ 200
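The full cost function and constraints of this problem are not reproduced in the text above; the sketch below uses the standard pressure vessel formulation from the engineering optimization literature (coefficients and constraints are from that standard formulation, not taken from this paper):

```python
from math import pi

def vessel_cost(Ts, Th, R, L):
    """Fabrication cost of the cylindrical pressure vessel
    (standard literature formulation)."""
    return (0.6224 * Ts * R * L + 1.7781 * Th * R ** 2
            + 3.1661 * Ts ** 2 * L + 19.84 * Ts ** 2 * R)

def vessel_constraints(Ts, Th, R, L):
    """g1..g4 of the standard formulation; feasible when all g <= 0."""
    g1 = -Ts + 0.0193 * R                                       # shell thickness
    g2 = -Th + 0.00954 * R                                      # head thickness
    g3 = -pi * R ** 2 * L - (4.0 / 3.0) * pi * R ** 3 + 1_296_000   # volume
    g4 = L - 240.0                                              # length limit
    return (g1, g2, g3, g4)
```

At the design (Ts, Th, R, L) ≈ (0.8125, 0.4375, 42.0984, 176.6366), often cited as near-optimal, the cost is about 6060 and the thickness and volume constraints are numerically active.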
Table 18 reports the results attained by SETO and the comparison optimizers. The
parameters and cost obtained by SETO are very competitive with those of the other
algorithms. This confirms that SETO is able to deal with the constrained search
space of the pressure vessel design problem.
• Tuning some of the control parameters with optimal values for different appli-
cations. Most of the control parameters of SETO are already known and con-
figured using the data drawn from the stock exchange and scientific resources
about technical analysis. This issue turns the SETO into an optimizer quite easy
to implement and execute. However, in some applications, different values for
the parameters can increase the performance of the algorithm. Parameter setting
is not specific to SETO; it is an issue shared by all metaheuristic algorithms.
• Increasing the execution time of the algorithm due to the calculation of the
Euclidean distance between shares in rising and falling phases. As the dimension
of the problem increases, the execution time of the algorithm also increases.
• The algorithm still traps in local optima on some benchmark functions and can-
not converge to the global optimum, as seen in the speed reducer design problem
and in some numerical functions such as F39 and F40. This suggests that further
increasing the exploitation and exploration power of the algorithm is needed.
• It can be used for both continuous and discrete problems with some easy modifi-
cations.
• It is simple and efficient. It achieves superior results on different groups of
numerical functions and engineering optimization problems.
• It can be applied to all problems that other algorithms can be applied for.
• It converges to the global optimum of the optimization problems faster than its
counterparts.
• It outperformed other algorithms on most benchmark functions. Out of 40
numerical optimization functions, SETO has achieved the global optimum on
36 functions, and out of 4 complex engineering problems, it obtained the best
results on 3 cases.
6 Conclusion
This paper presents a novel stock exchange trading optimization (SETO) algo-
rithm to solve numerical and engineering optimization problems. The algorithm is
based on the technical trading strategies of the stock market. Rising, falling, and
exchange are the three main phases of the algorithm, which drive the solutions
toward the global optimum of the cost function. SETO is easy to imple-
ment and conceptually simple. To test the performance of SETO, it is compared
with several state-of-the-art optimizers in solving a wide variety of numerical global
optimization and real-world problems. The results confirm that SETO attains
outstanding performance compared with its counterparts in most test cases, as
demonstrated by the experiments and the statistical analysis of the results. There remain sev-
eral directions for future research. One of the interesting works is to apply the SETO
to a variety of real-world applications to precisely determine the advantages and
References
1. Brammya G, Praveena S, Ninu Preetha NS, Ramya R, Rajakumar BR, Binu D (2019) Deer hunting
optimization algorithm: a new nature-inspired meta-heuristic paradigm. Comput J
2. Molina D, Poyatos J, Del Ser J, García S, Hussain A, Herrera F (2020) Comprehensive taxonomies
of nature- and bio-inspired optimization: inspiration versus algorithmic behavior, critical analysis
and recommendations. Cognit Comput 12(5):897–939
3. Abbasi M, Yaghoobikia M, Rafiee M, Jolfaei A, Khosravi MR (2020) Energy-efficient workload
allocation in fog-cloud based services of intelligent transportation systems using a learning classifier
system. IET Intell Transp Syst 14(11):1484–1490
4. Houssein EH, Saad MR, Hashim FA, Shaban H, Hassaballah H (2020) Lévy flight distribution: a
new metaheuristic algorithm for solving engineering optimization problems. Eng Appl Artif Intell
94:103731
5. Hussain K, Salleh M, Cheng S, Shi Y (2018) Metaheuristic research: a comprehensive survey. Artif
Intell Rev 52:2191–2233
6. Yang XS, Deb S, Zhao YX, Fong S, He X (2018) Swarm intelligence: past, present and future. Soft
Comput 22(18):5923–5933
7. Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization:
algorithm and applications. Futur Gener Comput Syst 97:849–872
8. Abdel-Basset M, Abdel-Fatah L, Sangaiah AK (2018) Meta-heuristic algorithms: a comprehensive
review. In: Computational intelligence for multimedia big data on the cloud with engineering appli-
cations. Elsevier Inc
9. Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN’95—Interna-
tional Conference on Neural Networks, Perth, WA, Australia, pp 1942–1948
10. Dorigo M, Birattari M, Stutzle T (2006) Ant colony optimization. IEEE Comput Intell Mag 1:28–39
11. Emami H, Derakhshan F (2015) Election algorithm: a new socio-politically inspired strategy. AI
Commun 28(3):591–603
12. Emami H (2019) Chaotic election algorithm. Comput Inform 38:1444–1478
13. Fadakar F, Ebrahimi M (2016) A new metaheuristic football game inspired algorithm. In: 1st Con-
ference on Swarm Intelligence and Evolutionary Computation CSIEC 2016—Proceedings, pp 6–11
14. Askari Q, Younas I, Saeed M (2020) Political optimizer: a novel socio-inspired meta-heuristic for
global optimization. Knowl Based Syst 195:105709
15. Askari Q, Saeed M, Younas I (2020) Heap-based optimizer inspired by corporate rank hierarchy for
global optimization. Expert Syst Appl 161:113702
16. Salih SQ, Alsewari ARA (2020) A new algorithm for normal and large-scale optimization prob-
lems: Nomadic People Optimizer. Neural Comput Appl 32(14):10359–10386
17. Sörensen K, Sevaux M, Glover F (2017) A history of metaheuristics. In: ORBEL29-29th Belgian
Conference on Operations Research, pp 791–808
18. Emami H (2020) Seasons optimization algorithm. Eng Comput 123456789:1–21
19. Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
20. Holland JH (1992) Genetic algorithms—computer programs that ‘evolve’ in ways that resemble
natural selection can solve complex problems even their creators do not fully understand. Sci Am
66–72
21. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput
3(2):82–102
22. Huang F, Wang L, He Q (2007) An effective co-evolutionary differential evolution for constrained
optimization. Appl Math Comput 186:340–356
23. Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12(6):702–713
24. Ghaemi M, Feizi-Derakhshi MR (2014) Forest optimization algorithm. Expert Syst Appl
41(15):6676–6687
25. Hayyolalam V, Pourhaji Kazem AA (2020) Black widow optimization algorithm: a novel meta-heu-
ristic approach for solving engineering optimization problems. Eng Appl Artif Intell 87:103249
26. Shayanfar H, Gharehchopogh FS (2018) Farmland fertility: a new metaheuristic algorithm for solv-
ing continuous optimization problems. Appl Soft Comput J 71:728–746
27. Karaboga D, Basturk B (2007) A powerful and efficient algorithm for numerical function optimiza-
tion: artificial bee colony (ABC) algorithm. J Glob Optim 39(3):459–471
28. Yang X (2010) Firefly algorithm, stochastic test functions and design optimisation. Int J Bio-Inspir
Comput 2(2):78–84
29. Gandomi AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun
Nonlinear Sci Numer Simul 17(12):4831–4845
30. Wang GG, Deb S, Coelho LDS (2016) Elephant herding optimization. In: Proceedings of 2015 3rd
International Symposium on Computational and Business Intelligence ISCBI, pp 1–5
31. Bansal JC, Sharma H, Jadon SS, Clerc M (2014) Spider monkey optimization algorithm for numeri-
cal optimization. Memet Comput 16(1):31–47
32. Mirjalili S, Mohammad S, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
33. Soleimanian F, Gholizadeh H (2019) A comprehensive survey: whale optimization algorithm and its
applications. Swarm Evol Comput 48:1–24
34. Arora S, Singh S (2018) Butterfly optimization algorithm: a novel approach for global optimization.
Soft Comput 23(3):715–734
35. Jain M, Singh V, Rani A (2019) A novel nature-inspired algorithm for optimization: squirrel search
algorithm. Swarm Evol Comput 44:148–175
36. Saremi S, Mirjalili S, Lewis A (2017) Grasshopper optimisation algorithm: theory and application.
Adv Eng Softw 105:30–47
37. Dhiman G, Kumar V (2019) Seagull optimization algorithm: theory and its applications for large
scale industrial engineering problems. Knowl Based Syst 165:169–196
38. Mohamad-saleh WTJ, Tan W (2019) Normative fish swarm algorithm (NFSA) for optimization.
Soft Comput 24(3):2083–2099
39. Fathollahi-Fard AM, Hajiaghaei-Keshteli M, Tavakkoli-Moghaddam R (2020) Red deer algorithm
(RDA): a new nature-inspired meta-heuristic. Soft Comput 24(19):14637–14665
40. Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science
220(4598):671–680
41. Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci
179(13):2232–2248
42. Erol OK, Eksin I (2006) A new optimization method: big bang–big crunch. Adv Eng Softw
37:106–111
43. Alatas B (2011) ACROA: artificial chemical reaction optimization algorithm for global optimiza-
tion. Expert Syst Appl 38(10):13170–13180
44. Shah-hosseini H (2011) Principal components analysis by the galaxy-based search algorithm: a
novel metaheuristic for continuous optimisation. Int J Comput Sci Eng 6(2):132–140
45. Feng X, Liu Y, Yu H, Luo F (2017) Physarum-energy optimization algorithm. Soft Comput
23(3):871–888
46. Kaveh A, Dadras A (2017) A novel meta-heuristic optimization algorithm: thermal exchange opti-
mization. Adv Eng Softw 110:69–84
47. Faramarzi A, Heidarinejad M, Stephens B, Mirjalili S (2019) Equilibrium optimizer: a novel optimi-
zation algorithm. Knowl Based Syst 191:105190
48. Kushwaha N, Pant M, Kant S, Jain VK (2018) Magnetic optimization algorithm for data clustering.
Pattern Recognit Lett 115:59–65
49. Tzanetos A, Dounias G (2017) Nature inspired optimization algorithms related to physical phenomena and
laws of science: a survey. Int J Artif Intell Tools 26(6):1–25
50. Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony
search. Simulation 76(2):60–68
51. Atashpaz-Gargari E, Lucas C (2007) Imperialist competitive algorithm: an algorithm for optimiza-
tion inspired by imperialistic competition. In: 2007 IEEE Congress on Evolutionary Computation,
CEC2007, Singapore, pp 4661–4667
52. Rao RV, Savsani VJ, Vakharia DP (2012) Teaching-learning-based optimization: an optimization
method for continuous non-linear large scale problems. Inf Sci 183(1):1–15
53. Husseinzadeh Kashan A (2014) League championship algorithm (LCA): an algorithm for global
optimization inspired by sport championships. Appl Soft Comput J 16:171–200
54. Das P, Das DK, Dey S (2018) A new class topper optimization algorithm with an application to data
clustering. IEEE Trans Emerg Top Comput 6750:1–11
55. Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl Based
Syst 96:120–133
56. Kumar M, Kulkarni AJ, Satapathy SC (2018) Socio evolution & learning optimization algorithm: a
socio-inspired optimization methodology. Future Gener Comput Syst 81:252–272
57. Mahmoodabadi MJ, Rasekh M, Zohari T (2018) TGA: team game algorithm. Future Comput
Inform J 3(2):191–199
58. Singh PR, Elaziz MA, Xiong S (2019) Ludo game-based metaheuristics for global and engineering
optimization. Appl Soft Comput J 84:105723
59. Martinez-Alvarez F et al (2020) Coronavirus optimization algorithm: a bio-inspired meta-heuristic
based on the COVID-19 propagation model. Big Data 8(4):308–322
60. Abbasi M, Yaghoobikia M, Rafiee M, Jolfaei A, Khosravi MR (2020) Energy-efficient workload
allocation in fog-cloud based services of intelligent transportation systems using a learning classifier
system. IET Intell Transp Syst 14(11):1484–1490
61. Zhou Z, Kearnes S, Li L, Zare RN, Riley P (2019) Optimization of molecules via deep reinforce-
ment learning. Sci Rep 9(1):1–10
62. Talbi EG (2019) Machine learning for metaheuristics—state of the art and perspectives. In: 11th
International Conference on Knowledge and Smart Technology (KST), pp XXIII–XXIII
63. Owoyele O, Pal P (2021) A novel machine learning-based optimization algorithm (ActivO) for
accelerating simulation-driven engine design. Appl Energy 285:116455
64. Nabipour M, Nayyeri P, Jabani H, Mosavi A, Salwana E, Shahab S (2020) Deep learning for stock
market prediction. Entropy 22(8):840
65. Das SR, Mishra D, Rout M (2019) Stock market prediction using Firefly algorithm with evolution-
ary framework optimized feature reduction for OSELM method. Expert Syst Appl 4:100016
66. Kelotra A, Pandey P (2020) Stock market prediction using optimized deep-ConvLSTM model. Big
Data 8(1):5–24
67. Thakkar A, Chaudhari K (2020) A comprehensive survey on portfolio optimization, stock price and
trend prediction using particle swarm optimization. Springer, pp 1–32
68. Kumar K, Haider MT (2021) Enhanced prediction of intra-day stock market using metaheuristic
optimization on RNN-LSTM network. New Gener Comput 39(1):231–272
69. Abedi M, Gharehchopogh FS (2020) An improved opposition based learning firefly algorithm with
dragonfly algorithm for solving continuous optimization problems. Intell Data Anal 24(2):309–338
70. Rahnema N, Gharehchopogh FS (2020) An improved artificial bee colony algorithm based on whale
optimization algorithm for data clustering. Multimed Tools Appl 79(44):32169–32194
71. Mohmmadzadeh H, Gharehchopogh FS (2021) Feature selection with binary symbiotic organisms
search algorithm for email spam detection. Int J Inf Technol Decis Mak 20(1):469–515
72. Gharehchopogh FS, Shayanfar H, Gholizadeh H (2020) A comprehensive survey on symbiotic organisms
search algorithms. Artif Intell Rev 53:2265–2312
73. Mohmmadzadeh H, Gharehchopogh FS (2021) An efficient binary chaotic symbiotic organisms search
algorithm approaches for feature selection problems. J Supercomput
74. Hosseinalipour A, Gharehchopogh FS, Masdari M, Khademi A (2021) A novel binary farmland fertility
algorithm for feature selection in analysis of the text psychology. Appl Intell 1–36
75. Darwish A (2018) Bio-inspired computing: algorithms review, deep analysis, and the scope of
applications. Future Comput Inform J 3(2):231–246
76. Murphy JJ (1999) Technical analysis of the financial markets: a comprehensive guide to trading
methods and applications. Penguin
77. Wilder JW (1978) New concepts in technical trading systems. Trend Research
78. Anderson B, Li S (2015) An investigation of the relative strength index. Banks Bank Syst
10(1):92–96
79. Wafi AS, Hassan H, Mabrouk A (2015) Fundamental analysis models in financial markets—review
study. Procedia Econ Finance 30(15):939–947
80. Civicioglu P (2013) Backtracking search optimization algorithm for numerical optimization prob-
lems. Appl Math Comput 219(15):8121–8144
81. Suganthan P, Ali M, Wu G, Mallipeddi R (2018) Special session & competitions on real-parameter
single objective optimization. In: CEC2018, Rio de Janeiro, Brazil
Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published
maps and institutional affiliations.