
The Journal of Supercomputing (2022) 78:2125–2174
https://doi.org/10.1007/s11227-021-03943-w

Stock exchange trading optimization algorithm:
a human-inspired method for global optimization

Hojjat Emami

Accepted: 8 June 2021 / Published online: 25 June 2021
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2021

Abstract
In this paper, a human-inspired optimization algorithm called stock exchange trading
optimization (SETO) for solving numerical and engineering problems is introduced.
The inspiration source of this optimizer is the behavior of traders and stock price
changes in the stock market. Traders use various fundamental and technical analy-
sis methods to gain maximum profit. SETO mathematically models the technical
trading strategy of traders to perform optimization. It contains three main actuators
including rising, falling, and exchange. These operators navigate the search agents
toward the global optimum. The proposed algorithm is compared with seven popular meta-heuristic optimizers on forty single-objective unconstrained numerical functions and four engineering design problems. The statistical results obtained on test
problems show that SETO is capable of providing competitive and promising per-
formances compared with counterpart algorithms in solving optimization problems
of different dimensions, especially 1000-dimension problems. Out of 40 numerical
functions, the SETO algorithm has achieved the global optimum on 36 functions,
and out of 4 engineering problems, it has obtained the best results on 3 problems.

Keywords: Human-inspired meta-heuristic · Numerical optimization · Engineering design problems · Stock exchange trading optimization (SETO) algorithm

1 Introduction

Optimization plays a crucial role in various domains, like industrial applications,


business, engineering, social science, and transportation [1–3]. Many problems in science and engineering are constrained or unconstrained optimization problems. Generally speaking, optimization is the process of selecting the best possible

* Hojjat Emami
[email protected]
University of Bonab, Bonab, Iran


solution for a given engineering/scientific problem [4]. An optimization problem P


can be formulated mathematically as follows [5]:
P ≐ (Q, C, f ) (1)
where Q denotes the solution space defined over a finite set of optimization varia-
bles, C denotes a set of problem-dependent constraints, and f is an objective function
that needs to be minimized or maximized. In minimization problems, the goal is to find an optimum solution q∗ with minimum objective function value, i.e., f(q∗) ≤ f(q), ∀q ∈ Q. In maximization problems, the objective value of solution q∗ is to be
maximized. According to their structure, optimization problems can be grouped into different classes: constrained or unconstrained, single- or multi-objective, and combinatorial problems [5]. Constrained optimization problems involve one or
several restrictions that cannot be violated during the optimization process. In contrast, unconstrained problems do not involve such limitations. In single-objective problems, there is only one specific objective, while in multi-objective
mode there is more than one objective to be maximized or minimized. In combinatorial optimization problems, the goal is to find a permutation of variables such that the objective function is minimized or maximized.
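The formulation P ≐ (Q, C, f) of Eq. (1) can be sketched in code. The sphere objective, the box constraint, and the small candidate set below are illustrative assumptions, not taken from the paper:

```python
# A minimal sketch of the P = (Q, C, f) formulation from Eq. (1).
# The sphere function and the box constraint are illustrative choices.

def objective(q):
    """f: the objective function to be minimized (sphere function)."""
    return sum(x * x for x in q)

def constraints_ok(q):
    """C: problem-dependent constraints; here a simple box constraint."""
    return all(-5.0 <= x <= 5.0 for x in q)

# Q: the solution space, reduced here to a small finite candidate set.
Q = [(0.0, 0.0), (1.0, 2.0), (3.0, -1.0), (6.0, 0.0)]

# Keep feasible solutions and pick the minimizer q*.
feasible = [q for q in Q if constraints_ok(q)]
q_star = min(feasible, key=objective)
print(q_star)  # (0.0, 0.0): satisfies f(q*) <= f(q) for every feasible q
```

In a real problem Q is a continuous space rather than a finite list, and the minimizer is found by a search procedure such as the meta-heuristics discussed below.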
The majority of real-world large-scale, multimodal, non-differentiable, and non-
continuous optimization problems are difficult to solve with conventional mathemat-
ical and deterministic methods such as quasi-Newton and sequential quadratic pro-
gramming. The classic deterministic and exact optimization methods often perform
an exhaustive search by simple calculus rules and tend to utilize problem-specific
information such as the gradients of the objective to guide the search process in
solution space [6]. These methods may get stuck at local optima, require the objective to be differentiable, and cannot efficiently balance exploitation and exploration [4]. Meta-heuristic optimizers are efficient alternatives when dealing with large-scale and non-differentiable problems [7]. They have gained immense popularity
amongst researchers due to their simplicity, durability, self-organization, coordina-
tion, easy implementation, robustness, and effectiveness in solving a variety of opti-
mization problems [6].
Meta-heuristics can solve optimization problems with limited complexity [4].
However, the performance of most meta-heuristics depends on the tuning of user-
defined parameters. Besides, meta-heuristics do not guarantee a global optimum
solution is ever found but try to find a near-optimal solution within a reasonable
time. Meta-heuristics are black-box optimizers that can be applied to a variety of
optimization problems with only limited modifications. To solve an optimization
problem, a meta-heuristic algorithm first creates one or multiple initial solutions.
While stopping criteria are not satisfied, the algorithm explores and exploits the
solution space with different actuators to generate new solutions. At each genera-
tion, the algorithm updates the solutions. Finally, a solution with the maximum fit-
ness is considered the optimal solution for the given problem. The key factor to effi-
cient search is the proper harmonization between exploitation (intensification) and
exploration (diversification) [8].
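The generic create/explore/update loop described above can be sketched as follows. The Gaussian perturbation operator, bounds, and parameter values are placeholders for illustration, not any specific meta-heuristic from the paper:

```python
import random

def generic_metaheuristic(fitness, dim, pop_size=20, iters=100,
                          lb=-5.0, ub=5.0, seed=1):
    """Skeleton of the generic meta-heuristic loop described in the text.
    The perturbation operator is a simple random step (illustrative only)."""
    rng = random.Random(seed)
    # 1. Create one or multiple initial solutions.
    pop = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    # 2. While stopping criteria are not satisfied, explore/exploit the space.
    for _ in range(iters):
        new_pop = []
        for sol in pop:
            cand = [min(ub, max(lb, x + rng.gauss(0, 0.3))) for x in sol]
            # 3. At each generation, update the solutions (greedy replacement).
            new_pop.append(cand if fitness(cand) < fitness(sol) else sol)
        pop = new_pop
        best = min(pop + [best], key=fitness)
    # 4. Return the solution with the best fitness.
    return best

best = generic_metaheuristic(lambda q: sum(x * x for x in q), dim=2)
print(sum(x * x for x in best))  # a small value near the optimum 0
```

Concrete algorithms differ mainly in step 2: how new candidate solutions are generated and how exploration is balanced against exploitation.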


The majority of meta-heuristic algorithms are inspired by the biological evolu-


tion, social behavior of humans, physics laws, and the survival and living systems of
animals, insects, and birds [2, 4–6, 8, 9]. For example, particle swarm optimization
(PSO) [9] models the social behavior of birds flocking. It starts the search process
with a collection of solutions dubbed particles. Each particle navigates in the solu-
tion space using its local best and the global best knowledge found by other parti-
cles. Another example is ant colony optimization (ACO) [10], which simulates the
searching of ants from the colony to the food source.
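As an illustration of the PSO mechanics mentioned above, the standard velocity/position update (with typical, not paper-specified, coefficients w, c1, c2) looks like:

```python
import random

def pso_step(positions, velocities, pbest, gbest,
             w=0.7, c1=1.5, c2=1.5, rng=None):
    """One iteration of the standard PSO update: each particle is pulled
    toward its personal best (pbest) and the global best (gbest) found
    by the swarm, as described in the text."""
    rng = rng or random.Random(0)
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            r1, r2 = rng.random(), rng.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]

# A particle at rest is attracted toward the global best at the origin.
positions = [[4.0, -3.0]]
velocities = [[0.0, 0.0]]
pso_step(positions, velocities, pbest=[[4.0, -3.0]], gbest=[0.0, 0.0])
print(positions)  # the particle has moved toward the origin
```

A full PSO run would wrap this step in a loop that also updates pbest and gbest after each move.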
In recent years, we have witnessed human-inspired algorithms becoming increas-
ingly one of the most important topics in the optimization field. Human-inspired
algorithms simulate the approaches that humans use to solve problems. The presi-
dential election algorithm (PEA) [11, 12] is a fundamental human behavior-inspired
algorithm that models the interaction between voters and candidates in the elec-
tion campaign. A few well-known human-inspired algorithms are the football game algorithm (FGA) [13], inspired by the behavior of players trying to score a goal under the supervision of a coach; the political optimizer (PO) [14], inspired by the multi-phased process of politics; the heap-based optimizer (HBO) [15], inspired by the rank hierarchy in organizations; the deer hunting optimization algorithm (DHOA) [1], which simulates how humans hunt deer; and the nomadic people optimizer (NPO) [16], which models the migration behavior of nomadic people in their search for sources of life, including grass for grazing and water. Humans are the most
intelligent creatures and always try to solve real-world problems in the best possible
way; thus, modeling their behavior and actions can be a successful method to solve
optimization problems. In the literature, human-inspired algorithms show outstand-
ing performance in solving optimization problems.
Recently, tremendous progress has been made in the development of meta-heu-
ristic algorithms. However, the field of meta-heuristics is far from maturity [5, 17].
According to the no-free-lunch (NFL) theorem, a given meta-heuristic may obtain better results on some problems and worse results on others. In other words, no algorithm can solve all kinds of optimization problems equally well [4, 18]. These observations show that there is still a need for introducing new algorithms or
improving the existing ones. This paper presents a new human-inspired meta-heuris-
tic for solving optimization problems. The proposed algorithm is referred to as stock
exchange trading optimization (SETO). The procedure of SETO attempts to find the
best share with maximum profit (optimal or near-optimal solution) in the stock market with the help of trading strategies. To the best of our knowledge, no prior research in the literature simulates stock trading strategies.
The stock exchange is a place where investors and traders can sell or buy their
ownership of stocks. Equities or stocks represent fractional ownership in a company.
The stock exchanges pursue two goals. The first objective is to provide capital to
companies to expand their businesses. The secondary goal is to allow investors to
share in the profits of publicly traded companies. Trade is the basic concept in the
stock market that means the transfer of a share from a seller to a buyer based on an
agreement on a price. Trading and investing are two different approaches to profit
in the stock market. Both traders and investors seek profits through buying and sell-
ing shares. Investors buy shares and hold them for an extended period to earn large


returns. In contrast, traders attempt to make transactions that can help them profit
quickly from price fluctuations over a shorter time frame. The objective of traders is
to gain returns that outperform buy-and-hold investing. Traders often use different
technical analysis tools such as stochastic oscillators and moving averages to find
optimal share buy and sell points. They try to maximize profit by adopting opti-
mal trading strategies and selecting the best shares. The procedure of the proposed
SETO algorithm attempts to find the most profitable share in the stock exchange
with the help of simple trading strategies. The best share corresponds to the opti-
mal solution to the given optimization problem. The SETO algorithm first creates a
population of candidate solutions. The algorithm improves the initial solutions using
three operators including rising, falling, and exchange. The individuals in the popu-
lation gradually converge to the optimal point.
Briefly speaking, the main contributions of this paper are as follows:

• A new meta-heuristic algorithm named stock exchange trading optimization


(SETO) algorithm is proposed for solving numerical and engineering optimization problems. Most of the control parameters of SETO are known in advance and are configured using data drawn from the stock exchange and the technical-analysis literature. This makes SETO an optimizer that is easy to implement and execute.
• Forty single-objective numerical optimization functions from CEC competitions
and four engineering design problems are used to evaluate the performance of
the SETO and comparison algorithms. The experimental results confirm the
superiority of the proposed SETO compared with counterparts.

The SETO algorithm is simple and easy to implement. It can be applied to any optimization problem to which other general-purpose optimizers apply. SETO is an efficient
choice to solve optimization problems in various disciplines such as physical sci-
ence, mathematics, agricultural science, economics, computer science, communi-
cation, mechanical applications, civil engineering applications, manufacturing, and
many other areas.
The remaining parts of this paper are structured as follows: Section 2 reviews the
literature. Section 3 describes the inspiration source, mathematical model, and the
working principle of the proposed SETO algorithm. Section 4 presents the experi-
mental results obtained by the SETO and counterpart algorithms in solving single-
objective numerical optimization problems. Section  5 evaluates the applicability
of SETO and comparison algorithms on real-world engineering design problems.
Finally, Sect. 6 concludes the paper and lists potential directions for future research.

2 Related work

According to the metaphor of the search procedures, the structure of the problem
under consideration, and the search strategy, optimization meta-heuristics can be
categorized into different classes. As shown in Fig.  1, two main groups of meta-
heuristics are metaphor-based and non-metaphor-based algorithms [8]. The former


Fig. 1  Broad classification of optimization meta-heuristic algorithms. Meta-heuristics divide into metaphor-based and non-metaphor-based algorithms; metaphor-based algorithms comprise biology-inspired (evolutionary and swarm intelligence), chemistry/physics-inspired, and human-inspired algorithms.

category consists of algorithms that model natural evolution, the collective or swarm intelligence of creatures, human actions in real life, chemical or physical operations, etc. The latter category does not simulate any natural phenomena or creatures' behavior when searching the solution space of optimization problems.
The metaphor-based algorithms can be categorized into three main paradigms:
biology-inspired, chemistry-/physics-inspired, and human-inspired algorithms.
Biology-inspired algorithms simulate the evolution of living organisms or the col-
lective intelligence of creatures such as ants, birds, and bees. Two classes of biol-
ogy-inspired algorithms are evolutionary and swarm intelligence algorithms. Evo-
lutionary algorithms are inspired by the laws of biological evolution in nature [15,
19]. The objective is to combine the best individuals to improve the survival and
reproduction ability of individuals throughout generations. Since the fittest individu-
als have a higher chance to survive and reproduce, the individuals in the next gen-
erations may probably be better than previous ones [18]. This idea forms the search
strategy of evolutionary algorithms, in which individuals will gradually reach the
global optimum. The most popular evolutionary algorithm is the genetic algorithm
(GA) [20] that follows Darwin’s theory of evolution. In GA, first, a population of
solutions is created randomly. The population evolves over iterations through selec-
tion, reproduction, combination, and mutation. Few popular evolutionary algorithms
are fast evolutionary programming (FEP) [21], differential evolution (DE) [22],
biogeography-based optimization (BBO) [23], forest optimization algorithm (FOA)
[24], black widow optimization (BWO) [25], farmland fertility algorithm (FFA)
[26], and seasons optimization algorithm (SOA) [18].
Swarm intelligence algorithms often model the interaction of living creatures in
a community, herds, flocks, colonies, and schools [6]. The core idea of swarm intel-
ligence algorithms is decentralization, in which the agents move toward the global
optimum through simulated social and collective intelligence, and local interac-
tion with their environment and with each other [8]. The algorithms in this cate-
gory memorize the best solutions found at each generation to produce the optimal


solutions for the next generations. The most popular algorithms in this category are
PSO [9], ACO [10], and artificial bee colony (ABC) [27]. Some recently developed
swarm intelligence algorithms are firefly algorithm (FA) [28], krill herd (KH) [29],
elephant herding optimization (EHO) [30], spider monkey optimization (SMO)
[31], grey wolf optimizer (GWO) [32], whale optimization algorithm (WOA) [19,
33], butterfly optimization algorithm (BOA) [34], squirrel search algorithm (SSA)
[35], grasshopper optimization algorithm (GOA) [36], seagull optimization algo-
rithm (SOA) [37], normative fish swarm algorithm (NFSA) [38], red deer algorithm
(RDA) [39], and Harris hawks optimization (HHO) [7]. For more detail and deep
discussion about swarm intelligence algorithms, refer to the survey given in [6].
Chemistry- and physics-based algorithms simulate the chemistry and physical
rules in the universe such as chemical reactions, gravitational force, inertia force,
and magnetic force [25]. The search agents navigate and communicate through the
search space following the chemistry and physical rules. Simulated annealing (SA)
[40] is one of the founding algorithms in this category. SA models the annealing
process in metallurgy. Other widely used chemistry- and physics-based algorithms
are gravitational search algorithm (GSA) [41], big bang–big crunch (BB–BC) [42],
artificial chemical reaction optimization algorithm (ACROA) [43], galaxy-based
search algorithm (GbSA) [44], physarum-energy optimization algorithm (PEO)
[45], thermal exchange optimization (TEO) [46], equilibrium optimizer (EO) [47],
and magnetic optimization algorithm (MOA) [48]. For a survey and discussion about
physics-inspired algorithms, refer to [49].
Human-based algorithms are developed based on metaphors from human life,
such as social relationships, political events, sports, music, and math. Since humans
are considered the smartest creatures in solving real-world problems, human-
inspired algorithms can also be more successful in solving optimization problems.
Some human-inspired algorithms are harmony search (HS) [50], imperialist com-
petitive algorithm (ICA) [51], teaching–learning-based optimization (TLBO) [52],
league championship algorithm (LCA) [53], class topper optimization (CTO) [54],
presidential election algorithm (PEA) [11], sine–cosine algorithm (SCA) [55], socio
evolution & learning optimization algorithm (SELO) [56], team game algorithm
(TGA) [57], ludo game-based swarm intelligence (LGSI) [58], heap-based optimizer
(HBO) [15], coronavirus optimization algorithm (CVOA) [59], political optimizer
(PO) [14], and Lévy flight distribution (LFD) [4].
Some algorithms are inspired by machine learning, reinforcement learning, and
learning classifier systems [60–62]. For example, ActivO is an ensemble machine
learning-based optimization algorithm [63]. ActivO combines strong and weak
learner strategies to perform a search for optimal solutions. The weak learner is
considered to explore the promising regions, and the strong learner is considered to
identify the exact location of the optimum within promising areas. Another exam-
ple is the molecule deep Q-networks (MolDQN) algorithm, which is developed by
combining domain knowledge of chemistry and reinforcement learning techniques
for molecule optimization. Researchers have proposed several methods for opti-
mizing trading strategies in the stock exchange [64–68]. For example, Thakkar and
Chaudhari [67] investigated the application of meta-heuristic algorithms for stock
portfolio optimization, and trend and stock price prediction along with implications


of PSO. In other work, Kumar and Haider [68] proposed RNN–LSTM and improved
its performance using PSO and flower pollination algorithm (FPA) for intraday stock
market prediction. It is important to note that this paper does not focus on optimization or prediction in the stock exchange itself. To the best of our knowledge, no research in the literature simulates stock trading strategies for developing numerical optimization meta-heuristics.
It should be noted that each of the meta-heuristic algorithms has been improved
over the years, and several enhanced versions of them are available. The extended
algorithms improve the basic operators or overcome deficiencies that exist in the conventional versions. For example, the chaotic election algorithm (CEA) [12]
embeds the chaos-based advertisement operator to the conventional PEA algorithm
[11] to improve its search capability and convergence speed. Some other algorithms
that have recently been proposed and used in different applications are the opposition-based learning firefly algorithm combined with the dragonfly algorithm (OFADA) [69], the random
memory and elite memory equipped artificial bee colony (ABCWOA) algorithm
[70], efficient binary symbiotic organisms search (EBSOS) [71, 72], efficient binary
chaotic symbiotic organisms search (EBCSOS) [73], and binary farmland fertility
algorithm (BFFA) [74].
After this short review, and from the experimental results reported in the litera-
ture, we can conclude that the obtained performances on most optimization prob-
lems are not perfect. This phenomenon clearly shows that a lot of effort is needed in
the field. Each algorithm is suitable for solving certain types of problems. It seems
that one of the interesting tasks in this field is to determine the best algorithms for
each type of optimization problem. For deep analysis about meta-heuristic algo-
rithms, refer to surveys given in [2, 8, 75]. Table 1 summarizes some of the recently
proposed meta-heuristic algorithms.

3 Stock exchange trading optimization (SETO) algorithm

This section discusses the inspiration source and describes the mathematical model
of the proposed stock exchange trading optimization (SETO) algorithm.

3.1 Inspiration

A stock exchange or bourse is an exchange where traders and investors sell and buy
all types of securities such as shares of stock, bonds, and other financial instruments
issued by listed companies [76]. The stock exchange often acts as a continuous auc-
tion market in which sellers and buyers perform transactions through electronic trad-
ing platforms and brokerages. People invest and trade with an efficient strategy in
mind to make the most profit. Shares price never goes up in a straight line. They rise
and fall on their way to higher prices. A rise occurs because more people want to
buy a share than sell it. In the rising phase, the price of shares moves up. When the
shares rise for a long period, correction may start. A correction and all types of mar-
ket declines occur because investors or traders are more motivated to sell than buy.

Table 1  Some recently proposed meta-heuristics

Title | Inspiration source | Year | References
Genetic algorithm (GA) | Darwin's theory of evolution | 1992 | [20]
Fast evolutionary programming (FEP) | Natural evolution | 1999 | [21]
Differential evolution (DE) | Natural evolution | 2007 | [22]
Biogeography-based optimization (BBO) | Geographical distribution of biological organisms | 2008 | [23]
Forest optimization algorithm (FOA) | Growth of trees in forests | 2014 | [24]
Black widow optimization (BWO) | Unique mating behavior of black widow spiders | 2020 | [25]
Farmland fertility optimization (FFO) | Farmland fertility in nature | 2018 | [26]
Seasons optimization algorithm (SOA) | Tree growth behavior | 2020 | [18]
Particle swarm optimization (PSO) | Motion of bird flocks and schooling fish | 1995 | [9]
Ant colony optimization (ACO) | Foraging behavior of natural ants | 2006 | [10]
Artificial bee colony (ABC) | Intelligent behavior of bees | 2007 | [27]
Firefly algorithm (FA) | Flashing behavior of fireflies | 2010 | [28]
Krill herd (KH) | Herding behavior of krill | 2012 | [29]
Elephant herding optimization (EHO) | Herding behavior of elephant groups | 2016 | [30]
Spider monkey optimization (SMO) | Fission–fusion social structure of spider monkeys in foraging | 2014 | [31]
Grey wolf optimizer (GWO) | Leadership hierarchy and hunting mechanism of grey wolves | 2014 | [32]
Whale optimization algorithm (WOA) | Humpback whales | 2016 | [19]
Butterfly optimization algorithm (BOA) | Food foraging behavior of butterflies | 2018 | [34]
Squirrel search algorithm (SSA) | Dynamic behavior of flying squirrels | 2019 | [35]
Grasshopper optimization algorithm (GOA) | Foraging and swarming behavior of grasshoppers | 2017 | [36]
Seagull optimization algorithm (SOA) | Migration and attacking behaviors of seagulls in nature | 2019 | [37]
Normative fish swarm algorithm (NFSA) | Behavior of fish swarms in the real environment | 2019 | [38]
Red deer algorithm (RDA) | Unusual mating behavior of Scottish red deer | 2020 | [39]
Harris hawks optimization (HHO) | Cooperative behavior and chasing style of Harris' hawks | 2019 | [7]
Simulated annealing (SA) | Annealing procedure in metalworking | 1983 | [40]
Gravitational search algorithm (GSA) | Newton's law of gravity and the law of motion | 2009 | [41]
Big bang–big crunch (BB–BC) | Big bang theory | 2006 | [42]
Artificial chemical reaction optimization algorithm (ACROA) | Natural chemical reactions | 2011 | [43]
Galaxy-based search algorithm (GbSA) | Spiral arms of spiral galaxies searching their surroundings | 2011 | [44]
Physarum-energy optimization algorithm (PEO) | Characteristic of ants' spatiotemporal variations | 2017 | [45]
Thermal exchange optimization (TEO) | Newton's law of cooling | 2017 | [46]
Equilibrium optimizer (EO) | Control volume mass balance | 2019 | [47]
Magnetic optimization algorithm (MOA) | Principles of magnetic field theory | 2018 | [48]
Harmony search (HS) | Music improvisation process | 2001 | [50]
Imperialist competitive algorithm (ICA) | Imperialistic competition | 2007 | [51]
Teaching–learning-based optimization (TLBO) | Teaching and learning process in a classroom | 2012 | [52]
League championship algorithm (LCA) | Sport championships | 2014 | [53]
Class topper optimization (CTO) | Learning intelligence of students in a class | 2018 | [54]
Presidential election algorithm (PEA) | Behavior of candidates in the presidential election | 2015 | [11]
Sine–cosine algorithm (SCA) | Mathematical form of the sine and cosine | 2016 | [55]
Socio evolution & learning optimization algorithm (SELO) | Social learning behavior of humans organized as families | 2018 | [56]
Team game algorithm (TGA) | Cooperation of individuals in a game | 2018 | [57]
Ludo game-based swarm intelligence (LGSI) | Ludo game playing strategies | 2019 | [58]
Heap-based optimizer (HBO) | Corporate rank hierarchy in organizations | 2020 | [15]
Coronavirus optimization algorithm (CVOA) | Coronavirus outbreak | 2020 | [59]
Political optimizer (PO) | Multi-phased process of politics | 2020 | [14]
Lévy flight distribution (LFD) | Lévy flight random walk for exploring unknown spaces | 2020 | [4]
Machine learning-based optimization algorithm (ActivO) | Machine learning strategies | 2021 | [71]

Fig. 2  A schematic view of the RSI indicator (http://forex-indicators.net/rsi)

At this time, sellers will start lowering prices until buyers tend to buy the shares.
Traders can sell their shares at any time they see fit or add to their number of shares.
They use various indicators to obtain the selling and buying signals and maximize
their gains through the analysis of stocks’ momentum. Some of the most commonly
used technical indicators are simple moving average (SMA), moving average con-
vergence divergence (MACD), relative strength index (RSI), stochastic oscillator,
and Bollinger bands among others [76].
The RSI [77] is a well-known momentum oscillator used in technical analysis. It
measures the magnitude of recent price changes to investigate overbought or over-
sold conditions in the price of a share. It produces signals that tell traders to sell
when the share is overbought and to buy when it is oversold. The RSI is often meas-
ured on a 14-day timeframe, and it oscillates between 0 and 100. The indicator has a
lower line typically at 30 and an upper line at 70. A share is often considered over-
sold when the RSI is at or below 30 and overbought when it is around 70 [78]. RSI
between the 30 and 70 levels is considered neutral. An oversold signal recommends
that short-term declines are reaching maturity, and a share may be in for a rally. In
contrast, an overbought signal could mean that short-term gains may be reaching a
point of maturity, and a share may be in for a price correction. As shown in Fig. 2,
RSI is often illustrated on a graph below the price chart.
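A simplified RSI computation over the standard 14-period window might look like the following; note that Wilder's exponential smoothing is replaced here by plain averages, an illustrative simplification:

```python
def rsi(prices, period=14):
    """Simplified RSI: ratio of average gain to average loss over the last
    `period` price changes, mapped to the range [0, 100]. Uses simple
    averages rather than Wilder's smoothing (an illustrative assumption)."""
    deltas = [b - a for a, b in zip(prices[:-1], prices[1:])][-period:]
    avg_gain = sum(d for d in deltas if d > 0) / period
    avg_loss = sum(-d for d in deltas if d < 0) / period
    if avg_loss == 0:
        return 100.0  # only gains in the window: fully overbought
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)

# A monotonically rising price series is fully overbought.
print(rsi(list(range(1, 16))))  # 100.0
```

With this definition, RSI ≤ 30 (oversold) and RSI ≥ 70 (overbought) give the buy and sell signals described above.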
In addition to the indicator signals, many investors use fundamental analysis
especially price-to-earnings (P/E) ratio to find out if a share is correctly valued [79].
P/E shows how cheap or expensive the share is. If all things are equal (the lower the
price, the higher the return), the lower P/E means the lower price of a share that is
suitable for investors. However, if all things are not equal, a lower P/E may not indi-
cate a good share for investing, because a share with a high P/E may provide a better


Fig. 3  Flowchart of the proposed SETO algorithm. The steps are: (1) initialize the algorithm parameters; (2) define the fitness function; (3) generate a population of shares; (4) find the fitness of the shares; (5) compute the growth of the shares (rising phase); (6) compute the correction of the shares (falling phase); (7) replace the shares (exchange phase); (8) calculate the RSI; (9) if the stop conditions are not met, set t = t + 1 and return to step 5; otherwise, return the best share.

return than a low P/E stock. Overall, in trading, it is better to compare the P/E of a share with that of its market peers to discover whether it is overvalued or undervalued.
Traders and shareholders try to maximize profits by looking for the best shares
with the highest earning. The behavior of traders in the stock market is an adaptive
optimization process.

3.2 Mathematical model

This section shows how the behavior of traders and the changes in share prices are mathematically modeled to design the stock exchange trading optimization (SETO) algorithm. Figure 3 shows the flowchart of the SETO algorithm. SETO
is a population-based optimization algorithm, which starts its work with an initial


population of shares. Each share (stock) in the population is a potential solution to the problem. The objective is to find the most profitable share in the population,
which corresponds to the optimal solution. The algorithm iteratively updates the
population by three main operators including rising, falling, and exchange. Finally,
the most profitable share is reported as the optimal solution. The rising phase models the growth of share prices in the stock exchange. The falling phase models the decline of share prices. In the exchange phase, traders replace their least profitable shares with more profitable ones. In the following, the components of the algorithm are described in more detail.

3.2.1 Create initial population

To solve any optimization problem, the first step in the SETO algorithm is to cre-
ate an initial population of candidate solutions. Each solution in the population is
referred to as a share or stock. In this paper, the terms “share” and “stock” are uti-
lized interchangeably in most cases. For an optimization problem F(x) with D vari-
ables {x1 , x2 , … , xD } , the initial population is defined as
S = [S1, S2, …, SN]^T    (2)

where N is the population size. Each share Si ∈ S is a vector of D real-valued variables presented as follows:

Si = {si1, si2, …, siD}    (3)
where sij contains a possible value for the corresponding variable xj of problem F(x).
The variable sij is initialized as
si,j = (ui,j − li,j) ⋅ 𝜙i,j + li,j    (4)

where 𝜙i,j is a random number in the range [0, 1] generated by a uniform distribution, and li,j and ui,j are the lower and upper bounds of si,j, respectively.
The profitability of shares is evaluated using a fitness function f, which is related
to the objective function of the problem. The fitness (profitability) of each share Si is
computed as follows:
fi = f(Si) = f(si1, si2, …, siD)    (5)
where fi is the fitness of share Si according to the objective function of the problem.
In minimization problems, the goal is to minimize the objective/cost function; how-
ever, in maximization problems, the goal is to maximize the objective function. In
the terminology of SETO, for minimization problems, the fitness function equals
the objective function, and for maximization problems, the fitness function has an
inverse relation with the objective function. If a share is valuable, then its fitness
will be greater and more traders will be attracted toward it. In this case, the share
grows more and reaches higher prices. In other words, the share gradually converges
to the optimal point.


At any given time, each share has a number of sellers and buyers. To identify the
initial traders, we use a random initialization mechanism. To do this, first the nor-
malized fitness ( nfi ) of each share Si is computed as follows:
nfi = (fi − min(M)) / ∑_{k=1}^{N} (fk − min(M)),   M = {fk | k = 1, 2, …, N}    (6)

The number of traders of Si is computed as follows:


Ti = ⌈nfi × T⌉ (7)
where T is the total number of traders, and Ti is the number of traders of share Si .
The number of traders can vary and change at any time. However, for simplicity,
in the current implementation of SETO, the number of traders is considered con-
stant, and during the running of the algorithm, the total number of traders does not
increase or decrease. The initial number of buyers and sellers of share Si is calcu-
lated as
bi = ⌈r × Ti⌉,   si = Ti − bi    (8)

where bi and si are the number of buyers and sellers of Si , respectively. The vari-
able r is a random number in the range [0, 1], which is generated by the uniform
distribution.
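The initialization steps of Eqs. (2)-(8) can be sketched in Python as follows. This is an illustrative sketch only, not the author's reference implementation; the function name and signature are our own, and it assumes the fitness values are not all identical (otherwise the denominator of Eq. (6) is zero).

```python
import numpy as np

def init_population(obj, N, D, lb, ub, T, rng):
    """Create N random shares (Eq. 4), evaluate them (Eq. 5), and assign
    traders, buyers and sellers to each share (Eqs. 6-8).
    Assumes higher fitness means a more valuable share."""
    S = lb + (ub - lb) * rng.random((N, D))                 # Eq. (4)
    f = np.array([obj(s) for s in S])                       # Eq. (5)
    nf = (f - f.min()) / np.sum(f - f.min())                # Eq. (6), normalized fitness
    traders = np.ceil(nf * T).astype(int)                   # Eq. (7)
    buyers = np.ceil(rng.random(N) * traders).astype(int)   # Eq. (8)
    sellers = traders - buyers
    return S, f, buyers, sellers
```

Because nfi ≤ 1 for every share, each share receives at most T traders, and the buyer/seller split always satisfies bi + si = Ti.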

3.2.2 Rising

The rising operator simulates the growth of shares’ prices in the market. In this
phase, shares can move to higher prices. Here the highest price that shares can reach
is considered as the optimal point. If the price of a share reaches its highest value,
then the traders who have that stock will make the most profit. To mathematically
model the rising phenomenon, we proposed the following equation:
Si (t + 1) = Si (t) + R × (Sg (t) − Si (t)) (9)
where Si (t) denotes the position of ith share at current iteration t, R is a 1 × D vector
of random numbers generated every iteration, and Sg (t) is the best solution found
until current iteration. The parameter R adds some amount of random deviations to
the direction of movement in hope of escaping local optimums and more exploring
solution space. Each element of vector rj ∈ R is defined as follows:
rj = U(0, pci × d1 ) (10)
where the function U generates a random number using uniform distribution in the
range [0, pci × d1 ] . The variable pci is the ratio of buyers to sellers of Si , and d1 is the
normalized distance between Si (t) and Sg (t) defined as




d1 = √( ∑_{j=1}^{D} (S^g_j(t) − S_{ij}(t))² ) / (ub − lb)    (11)
ub and lb are the upper and lower bound of the search space, respectively. The dis-
tance between shares is naturally related to the domain of the search space. Thus, the
distance is normalized using (ub − lb) in the denominator to avoid problem domain
dependency. Supply and demand are two important factors in share growth. The
higher the demand for a share, the more likely it is that the share will grow. For this
purpose, the pci is considered in Eq. (10) to determine the impact of demand on
share growth. Here, the demand for a share is indicated by the number of buyers. pci
is simply defined as follows:
pci = bi / (si + 1)    (12)

where bi and si are the number of buyers and sellers of share Si , respectively. To
avoid the search boundary violation, the parameter pci is limited to a value in the
range [0, 2]. So, Eq. (12) is revised as follows:
pci = min(bi / (si + 1), 2)    (13)

In the rising phase, the demand for shares increases. To model this phenomenon, at
each iteration of the algorithm and during rising, we remove a seller from the selling
queue of Si and add it to the buying queue as a buyer.
bi = bi + 1;
si = si − 1; (14)

In the implementation of SETO, it is assumed that any trader can buy or sell a share
at any time. Therefore, the buying and selling queue of each share Si are modeled as
variables bi and si.
In the rising phase, the algorithm spreads the solutions far from the current area of
the search space in order to explore different regions of the search space.
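A minimal sketch of one rising update for a single share, covering Eqs. (9)-(14), might look as follows. The function name and the exact queue bookkeeping are our own assumptions, not the paper's reference code:

```python
import numpy as np

def rising(Si, Sg, buyers, sellers, lb, ub, rng):
    """One rising step: move share Si toward the global best Sg (Eq. 9)
    with per-dimension step sizes drawn from U(0, pc*d1) (Eqs. 10-13)."""
    d1 = np.linalg.norm(Sg - Si) / (ub - lb)         # Eq. (11), normalized distance
    pc = min(buyers / (sellers + 1), 2.0)            # Eq. (13), demand pressure, capped at 2
    R = rng.uniform(0.0, pc * d1, size=Si.size)      # Eq. (10)
    Si_next = Si + R * (Sg - Si)                     # Eq. (9)
    return Si_next, buyers + 1, max(sellers - 1, 0)  # Eq. (14): one seller becomes a buyer
```

Note that when demand is high (many buyers per seller), pc approaches its cap of 2 and the random steps toward Sg become larger.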

3.2.3 Falling

The falling phase simulates shares’ prices decline. To mathematically model the
falling, we propose the following equation:

Si (t + 1) = Si (t) − W × (Sil (t) − Si (t)) (15)

where Sil (t) is the local best position the share Si has ever found. The local search
experience increases the convergence of the algorithm. W is a 1 × D vector of uni-
form random numbers. Each element wj ∈ W is computed as follows:


wj = U(0, nci × d2 ) (16)


where function U generates a uniform random number in the range [0, nci × d2 ] . d2 is
the normalized distance between Si (t) and Sil (t) , which is calculated as follows:

d2 = √( ∑_{j=1}^{D} (S^l_{ij}(t) − S_{ij}(t))² ) / (ub − lb)    (17)
nci is the ratio of sellers to buyers computed as
nci = min(si / (bi + 1), 2)    (18)

In the case of falling prices, the share supply increases. To model this issue, at each
iteration of the algorithm and during falling, we remove a buyer from buying queue
of Si and add it to the selling queue as a seller.
si = si + 1;
bi = bi − 1; (19)

At each iteration, the number of buyers and sellers of each share is controlled so that
the total number of buyers and sellers does not exceed the total number of traders.
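The falling update of Eqs. (15)-(19) mirrors the rising operator and can be sketched the same way; again, names and bookkeeping here are our own illustrative assumptions:

```python
import numpy as np

def falling(Si, Sl, buyers, sellers, lb, ub, rng):
    """One falling step: perturb share Si relative to its local best Sl
    (Eq. 15) with step sizes drawn from U(0, nc*d2) (Eqs. 16-18)."""
    d2 = np.linalg.norm(Sl - Si) / (ub - lb)         # Eq. (17), normalized distance
    nc = min(sellers / (buyers + 1), 2.0)            # Eq. (18), supply pressure, capped at 2
    W = rng.uniform(0.0, nc * d2, size=Si.size)      # Eq. (16)
    Si_next = Si - W * (Sl - Si)                     # Eq. (15)
    return Si_next, max(buyers - 1, 0), sellers + 1  # Eq. (19): one buyer becomes a seller
```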

3.2.4 Exchange

In the exchange phase, traders replace their lowest-profit shares with the most
profitable ones. To do this, traders sell the lowest-yielding shares and line up
to buy the best shares. We implement this phenomenon by picking one of the
sellers from the selling queue of the worst share and assigning it to the buying
queue of the best share. The competition to attract traders could be held among
all shares; however, for simplicity, we assign the seller to the best share. To
model this process mathematically, the worst share Sw , i.e., the share with the
lowest fitness, is identified first:
Sworst = Sw where f (Sw ) < f (Sj )
∀ j = 1, 2, … , N, w ≠ j (20)

Then, one of the sellers is removed from the selling queue of the worst share Sworst
and added to the buying queue of the best share. The best share Sbest is determined
as follows:
Sbest = Sb where f (Sb ) > f (Sj )
∀ j = 1, 2, … , N, b ≠ j (21)

The exchange operator improves the population because it allows the best and worst
shares eventually to grow. This reduces the number of sellers of the worst share and


increases the number of buyers of the best share. Therefore, the ratio of buyers to
sellers increases, which in turn increases the possibility that the share will rise.
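The queue transfer of Eqs. (20)-(21) reduces to a few lines; the sketch below is our own and treats fitness as "higher is better", matching the inequalities in Eqs. (20)-(21):

```python
import numpy as np

def exchange(fitness, buyers, sellers):
    """Exchange phase: one seller of the worst share joins the buying
    queue of the best share (Eqs. 20-21)."""
    worst = int(np.argmin(fitness))   # Eq. (20): lowest fitness
    best = int(np.argmax(fitness))    # Eq. (21): highest fitness
    if sellers[worst] > 0:            # guard: the selling queue may already be empty
        sellers[worst] -= 1
        buyers[best] += 1
    return buyers, sellers
```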

3.2.5 RSI calculation

We use the RSI indicator to identify whether a share should rise or fall. According
to the RSI value, SETO performs rising or falling as follows:

⎧ rising                               RSI ≤ 30
⎨ falling                              RSI ≥ 70        (22)
⎩ p × rising + (1 − p) × falling       30 < RSI < 70

where p is a binary random number with values 0 or 1 regenerated at every iteration.


p is computed as follows:
p = 1 if rand ≥ 0.5, and p = 0 otherwise    (23)

where function rand generates a random number in the range [0, 1] using uniform
distribution. For a share Si , the RSI is calculated as follows [78]:
RSI = 100 − 100 / (1 + RS)    (24)
A simple moving average (SMA) method [76] is used to compute relative strength
(RS) as follows:


RS = ( ∑_{i=1}^{K} Pi ) / ( ∑_{i=1}^{K} Ni )    (25)

where Pi and Ni are the upward and downward price changes, respectively. K indi-
cates the trading time frame of RSI. In the implementation of SETO, K is set to be
14 days (iterations). In the SETO algorithm, the price of shares is represented with
their fitness. Pi and Ni are computed as follows:
Pi = 1 if (fi(t) − fi(t − 1)) > 0, and Pi = 0 otherwise    (26)

Ni = 1 if (fi(t − 1) − fi(t)) > 0, and Ni = 0 otherwise    (27)

where fi (t) and fi (t − 1) are the fitness in the current and previous iterations, respec-
tively. Here, the fitness corresponds to the close price of the share. If the previous
fitness is the same as the last fitness, both Pi and Ni are set to zero. The RSI rises
as the number of positive closes increases, and it falls as the number of losses
increases.
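Putting Eqs. (24)-(27) together, the RSI of a share over a K-iteration window can be sketched as below; the function name and the handling of the all-gains case (RS → ∞, so RSI → 100) are our own illustrative choices:

```python
import numpy as np

def rsi(fitness_history, K=14):
    """RSI of a share from its last K fitness changes (Eqs. 24-27), with
    the 0/1 up/down counts of Eqs. (26)-(27) standing in for price moves."""
    h = np.asarray(fitness_history[-(K + 1):], dtype=float)
    diff = np.diff(h)
    P = np.sum(diff > 0)                 # Eq. (26): upward closes
    N = np.sum(diff < 0)                 # Eq. (27): downward closes
    if N == 0:                           # no losses: RS -> infinity, RSI -> 100
        return 100.0
    RS = float(P) / float(N)             # Eq. (25), SMA form of relative strength
    return 100.0 - 100.0 / (1.0 + RS)    # Eq. (24)
```

A monotonically improving share gives RSI = 100 (triggering falling), a monotonically worsening one gives RSI = 0 (triggering rising), and a balanced history gives RSI = 50 (random choice via Eq. 23).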


3.2.6 Stop condition

Until termination conditions are met, the algorithm iterates the rising, falling, and
exchange phases on the population. Finally, the fittest share is returned as an opti-
mal solution for the problem. The following termination conditions are considered
to stop the algorithm:

• A predefined number of generations (G) is reached.


• A specified number of fitness function evaluations (FEs) is reached.
• The fitness of the best share is unchanged in successive iterations.

Algorithm 1 summarizes the pseudo-code of the proposed SETO algorithm.


Algorithm 1: Pseudo code of the proposed SETO algorithm

Input: The maximum number of iterations (G), number of shares (N), number of
traders (T), RSI time frame (K)
Output: The fittest share Sg and its fitness

Initialize algorithm parameters;
Create an initial population of shares Si, i = 1, 2, ..., N by Eqs. (2)-(4);
Calculate the fitness of shares by Eq. (5);
Determine the traders of shares;
Identify the best share Sg in the population;
t = 0;
while (t ≤ G) do
    for i = 1 to N do
        if (t ≥ K and Si(t).RSI ≤ 30) then
            [S] = Rising(Si(t), Sg);
        else if (t ≥ K and Si(t).RSI ≥ 70) then
            [S] = Falling(Si(t), Sil(t));
        else
            r = rand;
            if (r > 0.5) then
                [S] = Rising(Si(t), Sg);
            else
                [S] = Falling(Si(t), Sil(t));
            end
        end

        // Exchange phase;
        [S] = Exchange(S);

        // RSI calculation;
        if (t ≥ K) then
            Compute Pi by Eq. (26);
            Compute Ni by Eq. (27);
            Calculate RSI by Eq. (24);
            Si(t).RSI = RSI;
        end
        Update the best solution Sg;
    end
    t = t + 1;
end
Return the fittest share Sg and its fitness;

3.3 An example to show the functioning of SETO

To show the functioning of the SETO, it is benchmarked using the peak function.
The purpose is to show how the shares move around the search space and gradually
converge to the global optimum. The peak function is defined as follows:

f(x, y) = x · e^(−(x² + y²)),   −2 ≤ x, y ≤ 2    (28)
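As a quick check of Eq. (28): setting both partial derivatives of f to zero gives the global minimum at (x, y) = (−1/√2, 0), where f ≈ −0.4289. A few lines of Python confirm the value:

```python
import math

def peak(x, y):
    """Peak test function of Eq. (28)."""
    return x * math.exp(-(x ** 2 + y ** 2))

x_star, y_star = -1.0 / math.sqrt(2.0), 0.0
print(round(peak(x_star, y_star), 4))   # -0.4289
```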


Fig. 4  The functioning of SETO on peak function

The global optimum of this problem is −0.4289, located at position
(x, y) ≈ (−0.7071, 0). Figure 4a shows the graphical plot of the test function.
Figure 4b shows the initial shares scattered throughout the search space. The
shares are shown with a blue circle marker and the best share with a red star marker.
Figure  4c, d shows the positions of shares at 5th, 10th, 15th, and 20th iterations,


respectively. Initially, the shares are scattered throughout the solution space and they
are not at the global optimum. In the 5th iteration, one share is close to the global
optimal point, while the other shares are trapped in local optima. In the 10th
iteration, the best share is closer to the global optimum, and in the 15th iteration,
most of the shares are near the global optimum. Finally, at the 20th iteration, the
majority of shares converge to the global optimum.

4 Experiments

This section presents the performance evaluation of the proposed algorithm on a


diverse set of unconstrained, single-objective numerical optimization functions. In
the following, characteristics of test problems, performance metrics, parameter tun-
ing, as well as numerical results are presented.

4.1 Test problems

To investigate the precision, convergence speed, and search capability of the pro-
posed SETO and comparison algorithms, forty well-studied test problems are cho-
sen from the literature [4, 7, 18, 25, 80, 81]. This test set covers four classes of func-
tions as follows:

• Group I F1−F10 are fixed-dimension problems. This test set investigates the
local optimum avoidance capacity of algorithms in solving problems with a fixed
number of variables [18].
• Group II F11−F22 are single-objective unimodal functions. These test cases
have a unique global best in their landscape. They are considered to measure the
exploitation (intensification) ability of the algorithms [18, 80].
• Group III F23−F32 are multimodal functions that contain multiple local optima
in their landscape. The dimensionality and multiple local optima make
multimodal functions more difficult and more complex to optimize. This group
of functions is considered to reveal the local-optimum avoidance and exploration
(diversification) capability of optimization algorithms [7].
• Group IV F33−F40 are shifted, rotated, hybrid and composite functions. This
test set is drawn from CEC 2018 competition [81] on single-objective real-
parameter numerical optimization problems. These functions evaluate the reli-
ability, accuracy, and ability of the algorithms in providing a balance between
exploration and exploitation.

The characteristics of test problems are summarized in Tables 2, 3, 4, 5, and 6. In


the tables, the parameter fmin means the global optimum of the test function, Vars
indicates the number of dimensions of the problem, and Range denotes the bound-
ary of search space.


Table 2  Descriptions of fixed-dimension test functions


Function Name Range Vars fmin

F1 Adjiman [– 1, 2] 2 – 2.02181
F2 Bartels Conn [– 500, 500] 2 1
F3 Brent [– 10, 10] 2 0
F4 Bukin 6 [(– 15, – 5), (– 5, – 3)] 2 180.3276
F5 Easom [– 100, 100] 2 –1
F6 Egg Crate [– 5, 5] 2 0
F7 Matyas [– 10, 10] 2 0
F8 Schaffer N. 4 [– 100, 100] 2 0.292579
F9 Three-Hump Camel [– 5, 5] 2 0
F10 Zettle [– 5, 10] 2 – 0.00379

Table 3  Descriptions of Function Name Range Vars fmin


unimodal test functions
F11 Brown [– 1, 4] 30 0
F12 Dixon and Price [– 10, 10] 30 0
F13 Powell Singular [– 4, 5] 30 0
F14 Powell Sum [– 1, 1] 30 0
F15 Rosenbrock [– 30, 30] 30 0
F16 Schwefel’s 2.20 [– 100, 100] 30 0
F17 Schwefel’s 2.21 [– 100, 100] 30 0
F18 Schwefel’s 2.22 [– 100, 100] 30 0
F19 Schwefel’s 2.23 [– 10, 10] 30 0
F20 Sphere [– 100, 100] 30 0
F21 Sum Squares [– 10, 10] 30 0
F22 Xin-She Yang 1 [– 20, 20] 30 0

Table 4  Descriptions of Function Name Range Vars fmin


multimodal test functions
F23 Ackley [– 32, 32] 30 0
F24 Alpine N. 1 [– 10, 10] 30 0
F25 Griewank [– 100, 100] 30 0
F26 Periodic [– 10, 10] 30 0.9
F27 Rastrigin [– 5.12, 5.12] 30 0
F28 Salomon [– 100, 100] 30 0
F29 Trignometric 2 [– 500, 500] 30 0
F30 Xin-She Yang 2 [– 5,5] 30 0
F31 Xin-She Yang N. 2 [– 2pi, 2pi] 30 0
F32 Xin-She Yang N. 4 [– 10, 10] 30 –1


Table 5  Descriptions of group IV test functions


Function Name Range Vars fmin

F33 Shifted and Rotated Rastrigin’s Function (CEC4) [– 100, 100] 10 400
F34 Shifted and Rotated Lunacek BiRastrigin Function (CEC6) [– 100, 100] 10 600
F35 Shifted and Rotated Non-Continuous Rastrigin’s Function [– 100, 100] 10 700
(CEC7)
F36 Shifted and Rotated Schwefel’s Function (CEC9) [– 100, 100] 10 900
F37 Hybrid Function 1 (N = 3) (CEC10) [– 100, 100] 10 1000
F38 Hybrid Function 6 (N=4) (CEC15) [– 100, 100] 10 1500
F39 Composite Function 1 (N = 3) (CEC20) [– 100, 100] 10 2000
F40 Composite Function 6 (N = 5) (CEC25) [– 100, 100] 10 2500

Table 6  Control parameters of the algorithms used in the tests


Algorithm Control parameters

GA [82] Pc = 0.67, Pm = 0.33


PSO [83] c1 = 2, c2 = 2, 𝜔 = 0.2
GSA [41] G0 = 100, 𝛼 = 20, k = [N → 1]
SCA [55] 𝛼 = 2, r1 = 𝛼 − t(𝛼∕G), r2 ∈ [0, 2𝜋], r3 ∈ [0, 2], r4 ∈ [0, 1]
SELO [56] P = 2, O = 3, rp = 0.999, rk = 0.1, follow_prob_factor_ownparent = 0.999
follow_prob_factor_otherkids = 0.9991, r = [0.95000 → 0.99995]
HBO [15] C = ⌊G∕25⌋, p1 = 1 − (t∕G), p2 = p1 + (1 − p1)∕2
LFD [4] Threshold = 2, CSV = 0.5,
𝛽 = 1.5, 𝛼1 = 10, 𝛼2 = 0.00005, 𝛼3 = 0.005, 𝜕1 = 0.9, 𝜕2 = 0.1
SETO Initial number of traders (T)=100

4.2 Comparison algorithms

The proposed SETO is compared with seven well-established optimization meta-


heuristics such as GA [82], PSO [83], GSA [41], SCA [55], SELO [56], HBO [15],
and LFD [4] algorithms. GA, PSO, and GSA are three well-studied algorithms in
science and engineering. SCA, SELO, HBO, and LFD are recently proposed efficient
human-inspired optimization algorithms that obtain competitive results on
single-objective unconstrained numerical functions and constrained real-world engineering
problems. SCA is an iterative math-inspired algorithm that uses the sine and cosine
relations to search the solution space. SELO is inspired by the social learning behav-
ior of humans organized as families. HBO models the organization of people in a
hierarchy called corporate rank hierarchy (CRH). It uses the heap data structure to
map the concept of CRH. LFD is inspired by the Levy flight motions and the wire-
less sensor networks environment.


4.3 Experimental setting

The experiments were performed using MATLAB 2016b on a Laptop machine with
8GB main memory and 64-bit i7 Intel (R) Core (TM) 2.2GHz processor. The popu-
lation size (N), the maximum iteration number (G), and the maximum number of fit-
ness function evaluations (FEs) for all the algorithms were set to be 25 and 103 × D ,
respectively. D indicates the dimension of problems. The configuration of control
parameters for comparison algorithms is summarized in Table 6. The parameters are
tuned as recommended in the corresponding literature. Most of the control param-
eters of SETO are already known and configured using the data drawn from the
stock exchange and scientific resources about technical analysis. This issue turns the
SETO into an optimizer quite easy to implement and execute. In the current imple-
mentation of the SETO algorithm, the only parameter that needs to be adjusted is the
initial number of traders (T). As given in Table 6, the parameter T is set to 100. Dif-
ferent values of the variable T do not affect the performance of the algorithm. The
parameter T is used to calculate the ratio of buyers to sellers (pc) and the ratio of
sellers to buyers (nc). The values of pc and nc do not change significantly as the total
number of traders increases or decreases. These parameters are limited to a value in
the range [0, 2]. Regarding population size, it is obvious that with increasing popu-
lation size, the performance of optimization algorithms improves, but also the exe-
cution time of the algorithms increases. However, the population size is considered
the same for all algorithms. To fair comparison, the basic standard versions of the
algorithms are used for tests. We used the source codes published by the authors and
customized them to be compatible with our experimental configuration. The quality
of solutions reported by the algorithms is calculated by the Mean and the standard
deviation (Std) measures. In an ideal state, the Mean is equal to the global optimum
of the problem, and the std is 0. As the std increases, the reliability of the algorithm
decreases. To obtain the statistical results, the algorithms were executed 30 times on
each test problem following the experimental instructions provided in [18, 84]. The
results at each run are recorded to calculate the mean and the standard deviation of
the best solutions found in 30 independent runs.

4.4 Numerical results and discussion

Tables  7, 8, 9, and 10 summarize the statistical results obtained by the proposed


SETO and comparison algorithms. The main objective is to evaluate the perfor-
mance of the comparison algorithms in finding the optimal solutions and measure
the quality of the found solutions. In the tables, the symbol ⊖ means that SETO
performs better than the counterpart algorithm on the specified test function, ⊕
indicates that the competing algorithm has performed better on the specified func-
tion than SETO, and ⊙ indicates that both the competing algorithm and SETO have
attained the same results. The best results are illustrated in boldface. Overall, SETO
outperforms its counterparts in terms of statistical tests on most benchmark prob-
lems. Inspecting the results reported in Tables 7, 8, 9, and 10, we have the following
observations:

13
Table 7  Statistical results on fixed-dimension functions

Function GA PSO GSA SCA


Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F1 – 2.01E–01 ± 4.30E–03 ⊙ – 2.02E–01 ± 9.11E–16 ⊙ – 2.02E–01 ± 4.45E–04 ⊙ – 2.02E–01 ± 3.35E–10 ⊙
F2 1.00E+00 ± 4.17E–02 ⊙ 𝟏.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.00E+00 ± 1.19E–04 ⊙ 𝟏.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙
F3 2.25E–05 ± 1.68E–04 ⊖ 1.38E–87 ± 5.17E–139 ⊙ 9.11E–06 ± 1.69E–05 ⊖ 1.38E–87 ± 3.11E–105 ⊙
F4 𝟏.𝟖𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟏.𝟖𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.80E+02 ± 4.38E–04 ⊙ 𝟏.𝟖𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙
F5 – 1.00E+00 ± 0.00E+00 ⊕ – 1.00E+00 ± 0.00E+00 ⊕ – 1.00E+00 ± 0.00E+00 ⊕ – 1.00E+00 ± 3.51E–04 ⊕
F6 8.03E–1 ± 9.30E–18 ⊖ 1.50E–130 ± 7.93E–129 ⊖ 4.92E–21 ± 3.16E–18 ⊖ 6.29E–160 ± 5.19E–159 ⊖
F7 2.06E–21 ± 5.32E–19 ⊖ 1.53E–65 ± 2.78E–61 ⊖ 2.39E–29 ± 1.22E–31 ⊖ 2.41E–85 ± 2.50E–86 ⊖
F8 2.17E–01 ± 1.62E–02 ⊖ 2.93E–01 ± 4.01E–17 ⊙ 2.01E–01 ± 1.50E–02 ⊖ 2.93E–01 ± 3.79E–09 ⊙
F9 2.70E–94 ± 3.17E–95 ⊖ 3.50E–68 ± 4.58E–64 ⊖ 1.44E–22 ± 9.60E–22 ⊖ 2.13E–140 ± 3.60E–139 ⊖
F10 – 3.79E–03 ± 1.91E–02 ⊙ – 3.79E–03 ± 6.47E–15 ⊙ – 3.79E–03 ± 5.02E–17 ⊙ – 3.79E–03 ± 3.97E–10 ⊙
⊖ 4 3 4 3
⊕ 1 1 1 1
⊙ 5 6 5 6
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F1 – 2.02E–01 ± 2.30E–04 ⊙ – 2.02E–01 ± 2.30E–04 ⊙ – 2.02E–01 ± 2.05E–09 ⊙ – 2.02E–01 ± 0.00E+00


F2 𝟏.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟏.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟏.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.00E+00 ± 0.00E+00
F3 1.38E–87 ± 5.10E–95 ⊙ 1.38E–87 ± 2.60E–105 ⊙ 1.38E–87 ± 0.00E+00 ⊙ 1.38E–87 ± 0.00E+00
F4 𝟏.𝟖𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟏.𝟖𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 2.29E+02 ± 2.84E–21 ⊖ 1.80E+02 ± 0.00E+00
F5 5.97E–03 ± 2.60E–03 ⊕ – 1.00E+00 ± 0.00E+00 ⊕ – 1.00E+00 ± 0.00E+00 ⊕ – 9.61E–01 ± 4.50E–01
F6 2.71E–100 ± 4.91E–100 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.86E–12 ± 6.87E–13 ⊖ 0.00E+00 ± 0.00E+00
F7 4.62E–43 ± 3.09E–41 ⊖ 2.19E–109 ± 7.24E–138 ⊖ 3.35E–19 ± 1.28E–18 ⊖ 0.00E+00 ± 0.00E+00
F8 2.93E–01 ± 3.41E–05 ⊙ 2.93E–01 ± 5.49E–19 ⊙ 2.93E–01 ± 1.28E–15 ⊙ 2.93E–01 ± 6.23E–21

F9 6.02E–74 ± 1.55E–73 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 8.27E–14 ± 2.69E–13 ⊖ 0.00E+00 ± 0.00E+00


Table 7  (continued)
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F10 – 3.79E–03 ± 5.53E–11 ⊙ – 3.79E–03 ± 2.03E–20 ⊙ – 3.79E–03 ± 4.31E–16 ⊙ – 3.79E–03 ± 6.14E–21


⊖ 4 1 4 –
⊕ 1 1 1 –
⊙ 5 8 5 –

Best results are illustrated in boldface


Table 8  Statistical results of 30D unimodal functions

Function GA PSO GSA SCA


Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F11 3.20E+01 ± 1.17E+00 ⊖ 2.74E+01 ± 1.02E+01 ⊖ 6.10E–18 ± 4.59E–19 ⊖ 4.18E–07 ± 2.98E–06 ⊖
F12 7.93E+01 ± 2.94E+01 ⊖ 4.31E+01 ± 9.22E+00 ⊖ 6.73E–01 ± 3.07E–02 ⊖ 2.19E+00 ± 1.42E+00 ⊖
F13 9.01E+03 ± 9.77E+02 ⊖ 8.44E+02 ± 3.50E+02 ⊖ 4.02E–03 ± 8.90E–02 ⊖ 2.60E+00 ± 6.32E+01 ⊖
F14 3.88E–09 ± 7.66E–10 ⊖ 5.17E–11 ± 3.08E–12 ⊖ 3.06E–18 ± 8.05E–19 ⊖ 3.91E–10 ± 5.66E–10 ⊖
F15 1.44E+04 ± 2.56E+04 ⊖ 2.36E+03 ± 1.08E+04 ⊖ 2.89E+01 ± 1.40E+01 ⊖ 6.92E+01 ± 8.01E+01 ⊖
F16 4.71E–05 ± 2.70E–05 ⊖ 7.33E–07 ± 2.96E–06 ⊖ 3.16E–09 ± 9.25E–09 ⊖ 3.11E–05 ± 2.98E–05 ⊖
F17 2.05E+01 ± 7.30E+00 ⊖ 3.30E–02 ± 7.92E–01 ⊖ 6.41E–03 ± 3.77E–02 ⊖ 2.11E+01 ± 1.13E+01 ⊖
F18 6.12E–01 ± 4.47E–01 ⊖ 2.17E–02 ± 3.99E–02 ⊖ 4.70E+01 ± 2.08E+01 ⊖ 9.11E–06 ± 3.55E–07 ⊖
F19 1.54E–03 ± 4.39E–02 ⊖ 4.17E–15 ± 2.78E–14 ⊖ 9.15E–87 ± 3.08E–88 ⊖ 2.19E+03 ± 7.33E+03 ⊖
F20 4.15E–20 ± 3.91E–20 ⊖ 3.07E–13 ± 4.79E–14 ⊖ 5.13E–16 ± 9.70E–17 ⊖ 8.53E–04 ± 1.78E–05 ⊖
F21 5.35E–01 ± 2.70E+02 ⊖ 7.15E–01 ± 2.68E+01 ⊖ 9.66E–17 ± 4.37E–18 ⊖ 6.91E–04 ± 2.73E–04 ⊖
F22 4.14E–55 ± 2.17E–48 ⊖ 6.09E–102 ± 2.81E–107 ⊖ 1.56E–41 ± 2.40E–42 ⊖ 5.32E–190 ± 1.91E–184 ⊖
⊖ 12 12 12 12
⊕ 0 0 0 0
⊙ 0 0 0 0
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F11 5.22E–78 ± 9.40E–81 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 9.22E–01 ± 1.61E–01 ⊖ 0.00E+00 ± 0.00E+00
F12 9.73E–01 ± 5.02E–04 ⊖ 6.67E–01 ± 3.17E–04 ⊖ 9.99E–01 ± 4.87E–02 ⊖ 6.66E–01 ± 4.61E–90
F13 7.38E–02 ± 5.49E–02 ⊖ 5.67E–07 ± 1.04E–07 ⊖ 4.52E–06 ± 2.33E–08 ⊖ 0.00E+00 ± 0.00E+00
F14 4.70E–25 ± 3.76E–24 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.61E–05 ± 1.14E–04 ⊖ 0.00E+00 ± 0.00E+00
F15 2.90E+01 ± 1.90E+01 ⊖ 7.76E+01 ± 4.02E+01 ⊖ 2.94E–02 ± 6.32E–01 ⊖ 2.86E+01 ± 0.00E+00
F16 3.66E–20 ± 7.70E–20 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.46E–01 ± 2.04E–01 ⊖ 0.00E+00 ± 0.00E+00

F17 1.94E–32 ± 5.44E–35 ⊖ 7.27E–47 ± 2.85E–55 ⊖ 7.08E–04 ± 6.13E–04 ⊖ 0.00E+00 ± 0.00E+00


Table 8  (continued)
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F18 5.24E–63 ± 6.14E–62 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 6.40E–32 ± 5.28E–33 ⊖ 0.00E+00 ± 0.00E+00
F19 4.85E–101 ± 1.16E–102 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.11E–40 ± 8.76E–41 ⊖ 0.00E+00 ± 0.00E+00
F20 3.54E–55 ± 1.67E–57 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 7.25E–05 ± 4.75E–25 ⊖ 0.00E+00 ± 0.00E+00
F21 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 5.00E–04 ± 2.71E–04 ⊖ 0.00E+00 ± 0.00E+00
F22 1.26E–94 ± 7.19E–96 ⊖ 4.34E–232 ± 0.00E+00 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 0.00E+00 ± 0.00E+00
⊖ 11 5 11 –
⊕ 0 0 0 –
⊙ 1 7 1 –

Best results are illustrated in boldface


Table 9  Statistical results of 30D multimodal functions

Function GA PSO GSA SCA


Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F23 1.70E+01 ± 2.57E+00 ⊖ 2.30E+00 ± 5.54E+01 ⊖ 7.20E–10 ± 6.31E–10 ⊖ 1.25E+01 ± 7.42E+00 ⊖
F24 5.10E–03 ± 8.11E–02 ⊖ 7.90E–02 ± 8.30E–01 ⊖ 3.22E–10 ± 1.55E–11 ⊖ 3.50E–02 ± 7.80E–02 ⊖
F25 1.16E+01 ± 4.33E+00 ⊖ 1.16E–02 ± 5.33E–03 ⊖ 5.42E–04 ± 8.12E–04 ⊖ 3.81E–01 ± 5.12E–01 ⊖
F26 6.80E–01 ± 2.05E–03 ⊖ 1.00E+00 ± 3.59E–09 ⊖ 1.00E+00 ± 7.36E–15 ⊖ 2.17E+00 ± 5.02E+00 ⊖
F27 8.37E+00 ± 4.68E+00 ⊖ 1.46E+01 ± 2.07E+01 ⊖ 1.63E+01 ± 5.18E+00 ⊖ 1.04E+01 ± 1.53E+01 ⊖
F28 9.12E–01 ± 4.29E–02 ⊖ 4.35E–01 ± 3.09E–02 ⊖ 1.40E+00 ± 3.30E–01 ⊖ 2.10E–01 ± 4.87E–02 ⊖
F29 4.32E+00 ± 2.25E+00 ⊕ 7.51E+00 ± 2.41E+00 ⊕ 6.56E+03 ± 7.13E+02 ⊖ 9.39E+01 ± 1.70E+01 ⊖
F30 3.75E+03 ± 6.56E+01 ⊖ 5.67E+04 ± 3.87E+02 ⊖ 7.13E–04 ± 6.64E–03 ⊖ 3.64E–04 ± 9.25E–05 ⊖
F31 5.52E–11 ± 3.14E+11 ⊖ 1.04E–09 ± 4.97E–12 ⊖ 9.14E–12 ± 3.32E–15 ⊖ 7.16E–11 ± 1.90E–12 ⊖
F32 7.41E–09 ± 6.63E–08 ⊖ 6.18E–11 ± 2.57E–10 ⊖ 4.60E–31 ± 2.77E–30 ⊖ 7.40E–11 ± 1.27E–10 ⊖
⊖ 9 9 10 10
⊕ 1 1 0 0
⊙ 0 0 0 0
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F23 4.67E–17 ± 9.13E–16 ⊖ 6.22E–15 ± 6.22E–15 ⊖ 2.87E–21 ± 3.41E–37 ⊖ 0.00E+00 ± 0.00E+00


F24 4.50E–02 ± 3.17E–03 ⊖ 8.09E–28 ± 7.13E–30 ⊖ 1.99E–07 ± 2.45E–06 ⊖ 0.00E+00 ± 0.00E+00
F25 6.23E–25 ± 8.70E–26 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 3.86E–06 ± 1.80E–06 ⊖ 0.00E+00 ± 0.00E+00
F26 9.00E–01 ± 4.13E–05 ⊙ 1.00E+00 ± 1.06E–06 ⊖ 9.00E–01 ± 0.00E+00 ⊙ 9.00E–01 ± 0.00E+00
F27 9.12E–27 ± 1.45E–30 ⊖ 𝟎.𝟎𝟎𝐄 + 𝟎𝟎 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 6.73E–07 ± 4.97E–07 ⊖ 0.00E+00 ± 0.00E+00
F28 1.03E+00 ± 3.46E–01 ⊖ 9.99E–02 ± 0.00E+00 ⊖ 3.40E–06 ± 8.22E–07 ⊖ 0.00E+00 ± 0.00E+00
F29 3.97E+02 ± 1.04E+02 ⊖ 1.00E+00 ± 5.60E–18 ⊕ 6.34E+01 ± 7.16E+01 ⊖ 1.13E+01 ± 3.71E+01
F30 2.88E–15 ± 1.64E–17 ⊖ 8.10E–26 ± 5.20E–254 ⊖ 2.39E–10 ± 5.90E+17 ⊖ 0.00E+00 ± 0.00E+00

F31 8.16E–04 ± 3.55E–05 ⊖ 3.51E–12 ± 1.82E–15 ⊖ 5.38E–12 ± 2.17E–36 ⊖ 0.00E+00 ± 5.10E–234


Table 9  (continued)
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F32 2.55E–06 ± 3.71E–06 ⊖ 7.35E–30 ± 4.37E–29 ⊖ – 1.00E+00 ± 0.00E+00 ⊙ – 1.00E+00 ± 0.00E+00


⊖ 9 7 8 –
⊕ 0 1 0 –
⊙ 1 2 2 –

Best results are illustrated in boldface


Table 10  Statistical results on 10D group IV test function

Function GA PSO GSA SCA


Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F33 4.08E+02 ± 5.18E+00 ⊖ 4.05E+02 ± 7.95E+00 ⊖ 4.05E+02 ± 4.71E+00 ⊖ 4.29E+02 ± 1.51E+01 ⊖
F34 6.41E+02 ± 1.25E+01 ⊖ 6.31E+02 ± 3.40E+00 ⊖ 6.19E+02 ± 6.03E+00 ⊖ 6.18E+02 ± 5.52E+00 ⊖
F35 7.27E+02 ± 3.59E+00 ⊖ 7.42E+02 ± 6.63E+00 ⊖ 7.39E+02 ± 2.92E+00 ⊖ 7.63E+02 ± 4.92E+00 ⊖
F36 9.03E+02 ± 4.70E+00 ⊖ 𝟗.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟗.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 9.56E+02 ± 3.97E+01 ⊖
F37 1.68E+03 ± 9.80E+01 ⊖ 1.63E+03 ± 1.89E+02 ⊖ 2.53E+03 ± 2.18E+02 ⊖ 2.24E+03 ± 1.07E+02 ⊖
F38 1.76E+03 ± 1.28E+02 ⊖ 1.80E+03 ± 1.99E+02 ⊖ 9.82E+03 ± 1.32E+03 ⊖ 2.13E+03 ± 2.16E+02 ⊖
F39 2.10E+03 ± 1.76E+01 ⊖ 2.16E+03 ± 4.91E+01 ⊖ 2.19E+03 ± 4.17E+01 ⊖ 2.07E+03 ± 1.94E+01 ⊖
F40 2.63E+03 ± 6.23E+01 ⊖ 2.65E+03 ± 9.24E+01 ⊖ 2.67E+03 ± 4.30E+00 ⊖ 2.62E+03 ± 9.71E+00 ⊖
⊖ 8 7 7 8
⊕ 0 0 0 0
⊙ 0 1 1 0
SELO HBO LFD SETO
Mean ± Std Mean ± Std Mean ± Std Mean ± Std

F33 4.02E+02 ± 1.32E+00 ⊖ 𝟒.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟒.𝟎𝟐𝐄 + 𝟎𝟐 ± 𝟐.𝟏𝟓𝐄 + 𝟎𝟎⊙ 4.00E+02 ± 0.00E+00
F34 6.02E+02 ± 1.43E+01 ⊖ 𝟔.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 6.05E+02 ± 6.22E+00 ⊖ 6.00E+02 ± 0.00E+00
F35 7.13E+02 ± 6.90E+00 ⊖ 7.10E+02 ± 3.79E–17 ⊖ 7.26E+02 ± 3.50E+01 ⊖ 7.00E+02 ± 2.41E–15
F36 𝟗.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟗.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 𝟗.𝟎𝟎𝐄 + 𝟎𝟐 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 9.00E+02 ± 0.00E+00
F37 1.62E+03 ± 5.20E+01 ⊖ 𝟏.𝟎𝟎𝐄 + 𝟎𝟑 ± 𝟎.𝟎𝟎𝐄 + 𝟎𝟎⊙ 1.34E+03 ± 6.76E+01 ⊖ 1.00E+03 ± 0.00E+00
F38 1.68E+03 ± 4.30E+01 ⊖ 1.60E+03 ± 1.15E+02 ⊖ 1.84E+03 ± 4.30E+01 ⊖ 1.58E+03 ± 5.60E+01
F39 2.04E+03 ± 2.60E+01 ⊖ 2.00E+03 ± 6.40E–01 ⊕ 2.05E+03 ± 1.13E+02 ⊖ 2.01E+03 ± 7.22E+01
F40 2.63E+03 ± 3.90E+01 ⊖ 2.56E+03 ± 8.30E+01 ⊕ 2.55E+03 ± 6.70E+01 ⊕ 2.58E+03 ± 7.64E+01
⊖ 7 2 5 –
⊕ 0 2 1 –
⊙ 1 4 2 –

Best results are illustrated in boldface



• In the case of fixed-dimension test cases, the SETO and HBO take 1st rank for
all test functions in terms of best mean results. However, in terms of std, the
first position belongs to SETO, which shows its stable convergence behavior in
solving fixed-dimension problems. Both SCA and PSO attain third rank among
others. GA, PSO, GSA, SCA, SELO, HBO, LFD, and SETO, respectively, gen-
erate 5, 7, 5, 7, 6, 9, 6, and 9 best mean results out of the total 10 functions.
From the results given in Table 7, it is evident that both SETO and HBO have
excellent exploitation ability; however, SETO is more stable than HBO. The high
exploitation power of SETO is due to two reasons. First, the algorithm updates
the position of shares in the search space if the next positions are better than
precedent positions. Second, shares move toward the best solution from different
directions at each generation that helps them jump out of local optima. Figure 5a
illustrates the results of the Friedman mean rank test [85] on fixed-dimension
functions. The Friedman mean rank value of SETO is minimum, which shows
that it obtains 1st rank compared with other algorithms.
• The results reported by SETO in solving unimodal functions are superior. It
generates the best mean results in all test functions. The second rank belongs to
HBO with 7 best mean results out of the total 12. This confirms that SETO has
superior exploitation power and convergence speed in solving unimodal func-
tions. GA, PSO, GSA, SCA, SELO, HBO, LFD, and SETO, respectively, gen-
erate 0, 0, 0, 0, 1, 7, 1, and 12 best mean results out of the total 12 functions.
Inspecting the std values shows that SETO attains the best standard deviations
among other algorithms, which confirms its stability in the searching process.
Figure  5b shows the results of the Friedman test on unimodal functions. As
shown in the plot, SETO obtains the best mean rank among others.
• As shown in Table 9, SETO is very powerful in solving multimodal functions. It
generates the best mean results for all test functions except F29. Inspecting the
results, we conclude that SETO significantly outperforms its counterparts due to
its high exploration power. The reason for this success lies in the position updat-
ing mechanism in the rising phase, in which the shares jump out of the local
optima and move toward the best solution from different directions. GA, PSO,
GSA, SCA, SELO, HBO, LFD, and SETO, respectively, generate 1, 1, 0, 0, 1,
3, 2, and 9 best mean results out of the total 10 functions. As shown in Fig. 5c,
SETO attains the 1st position and HBO the 2nd rank among all algorithms on
multimodal functions.
• The performance of SETO in solving the group IV shifted and rotated, hybrid,
and composite functions is superior: it outperformed the other algorithms on
functions F33–F38. For F39 and F40, HBO and LFD generate the best mean
results, respectively; still, SETO's mean results on these two functions are very
competitive with the best results attained by HBO and LFD. As illustrated in
Fig. 5d, SETO attains the best mean rank among the algorithms in solving group
IV functions. This confirms that SETO can provide a proper balance between
exploitation and exploration in solving complex and difficult problems. GA,
PSO, GSA, SCA, SELO, HBO, LFD, and SETO, respectively, generate 0, 1, 1,
0, 1, 5, 3, and 6 best mean results out of the total 8 functions.

2156
H. Emami

Fig. 5  Statistical result of Friedman mean rank test for benchmark functions: a fixed-dimension functions, b unimodal functions, c multimodal functions, d shifted, hybrid and composite functions

The key factor in efficient search is the proper harmonization between exploration
(diversification) and exploitation (intensification). In the SETO algorithm, the rising
operator is responsible for exploring the search space, and the falling operator is
responsible for exploiting the promising areas. The rising operator directs the search
agents (shares) through the solution space to explore unvisited areas and find the
promising regions, whilst the falling operator carefully examines the inside of the
promising areas via accumulated local knowledge. The rising operator moves the
solutions far from the current area of search so that explorative moves reach every
region of the search space at least once. On the other hand, using local experience,
the falling operator forces the solutions to converge quickly without wasting too
many moves. The results confirm that SETO provides a proper balance between
exploitation and exploration in the search and optimization process.
Figure 6 presents the mean and overall ranks of the comparison algorithms computed
by the nonparametric Friedman test [85] on all benchmark functions. The results
reveal that SETO obtains the 1st overall rank and HBO the 2nd rank among all
algorithms. The third and fourth ranks belong to SELO and LFD, respectively,
although the difference between LFD and SELO is minute. GA is ranked last. This
suggests that introducing new algorithms, or improving existing ones, is still needed
to solve classic and modern optimization problems.
Table 11 presents the results of the multi-problem-based Wilcoxon signed-rank
test [85] at significance level 𝛼 = 0.05 for the benchmark functions. This test is
performed to determine whether the differences between the results reported by the
comparison algorithms are significant. In Table 11, SETO is the control algorithm.
The results show that SETO is statistically more successful than its counterparts in
solving the test functions.
To show the quantitative differences between the results of SETO and those
attained by comparison algorithms on benchmark functions, we perform a contrast
estimation [85]. The objective is to determine by how far the SETO outperforms its
counterparts. As shown in Table  12, SETO has a significant difference with other
algorithms that shows its good optimization ability on different test problems.
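A multi-problem Wilcoxon comparison of this kind pairs the two algorithms' mean errors problem by problem; a hedged sketch using `scipy.stats.wilcoxon` (the paired values are illustrative, not the paper's data):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired mean errors of a rival algorithm and SETO on 10
# benchmark functions (one pair per problem, as in a multi-problem test).
rival = np.array([1.2e-1, 3.4e+0, 5.0e-3, 9.9e-1, 2.2e+1,
                  7.0e-4, 1.5e+2, 6.3e-2, 4.4e+0, 8.8e-1])
seto  = np.array([1.0e-3, 2.9e+0, 0.0,    1.0e-2, 2.1e+1,
                  1.0e-6, 9.0e+1, 1.1e-2, 3.8e+0, 2.0e-2])

# Two-sided test on the per-problem differences at alpha = 0.05; the
# statistic is the smaller of the signed-rank sums (T+ or T-).
stat, p = wilcoxon(rival, seto, alternative="two-sided")
significant = p < 0.05
print(stat, p, significant)
```

When every difference favors one side, as here, the smaller rank sum is 0 and the test is clearly significant.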

4.5 Scalability analysis

The convergence speed and optimization ability of algorithms typically decrease as
the dimension of the problem increases. To investigate this issue, we performed a
series of tests on 1000-dimension benchmark functions to evaluate the scalability of
the algorithms. The experiments are performed on the scalable unimodal functions
F11–F22 and multimodal functions F23–F32. The algorithms terminate when they
reach the global optimum, or when they have failed to find a better solution during
the last 50,000 FEs. The results are listed in Tables 13 and 14. From the results, it
can be concluded that SETO attains the best mean results on all 1000-dimension
problems except F15, and even there its mean result is very competitive with the
best one. SELO, LFD, and HBO also perform well; however, the gap between them
and SETO is considerable. The results confirm the superior scalability of SETO
compared with its counterparts. Figure 7 illustrates the execution time consumed by
the algorithms to reach the global optimum. From the figure, we observe that SETO
takes less execution time than the other algorithms on most test functions. SETO
performs exploration and exploitation at the same time and converges faster;
therefore, it requires less search time than the other algorithms. After reaching the
global optimum, the solutions do not change, and according to the termination
conditions described in Sect. 3.2.6, the algorithm stops.
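The stopping rule used in these scalability tests (stop at the global optimum, or after 50,000 FEs without improvement) can be sketched as a stagnation counter; the names below are illustrative, not part of the SETO specification:

```python
# Hedged sketch of the stopping rule described above: terminate when the
# known global optimum is reached within a tolerance, or when no strict
# improvement has been seen for `stall_budget` consecutive evaluations.
def should_stop(best_fitness, new_fitness, state,
                global_optimum=0.0, tol=1e-12, stall_budget=50_000):
    state["evals"] = state.get("evals", 0) + 1
    if new_fitness < best_fitness - tol:      # strict improvement resets the counter
        state["stall"] = 0
    else:
        state["stall"] = state.get("stall", 0) + 1
    reached = abs(min(best_fitness, new_fitness) - global_optimum) <= tol
    return reached or state["stall"] >= stall_budget

state = {}
stop = should_stop(1.0, 0.5, state)   # improving and far from the optimum: keep going
```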

4.6 Convergence test

To investigate the searching performance of the algorithms, we perform a
convergence test on five candidate functions, F6, F11, F28, F35, and F40, as
representatives of the fixed-dimension, unimodal, multimodal, shifted and rotated,
and composite benchmark groups, respectively. Figure 8 illustrates the graphical
representation, convergence plot, and distribution of solutions for these test
functions. The convergence plots confirm that SETO avoids premature convergence
while converging faster than the other optimizers on different types of test
functions. This is due to its efficient exploitation, exploration, and local optima
avoidance compared with the others. As shown in the box-and-whisker plots, SETO
generates solutions with the minimum dispersion, which demonstrates its stability
in the search process.

Fig. 6  The mean and overall ranks of optimization algorithms computed by Friedman test for all benchmark functions

Table 11  Results of multi-problem-based two-sided Wilcoxon signed-rank test at 0.05 significant level for SETO against counterpart algorithms on benchmark functions

SETO vs.   T+      T–      p value    Winner
GA         466     30      0.00001    SETO
PSO        358     20      0.00001    SETO
GSA        281     19      0.00001    SETO
SCA        427     8       0.00001    SETO
SELO       168     22      0.00001    SETO
HBO        65.5    25.5    0.00298    SETO
LFD        220     56      0.00016    SETO

4.7 Computational complexity

4.7.1 Time complexity

The time complexity of SETO is calculated as follows:

• The population initialization phase costs O(ND).


• Calculating the initial fitness of all shares needs O(NC), where C indicates the
cost of the objective function.
• The time complexity of the rising phase is bounded by O(ND + NC).
• The falling phase costs O(ND + NC).
• The exchange phase costs O(N).
• The time complexity of the RSI calculation phase is O(N).

The overall time complexity of SETO within one iteration in the worst case can be
calculated as


Table 12  Contrast estimation between optimization algorithms on all test problems

        GA        PSO       GSA       SCA       SELO      HBO       LFD       SETO
GA      0         –0.2255   –0.2400   –0.1660   –0.2435   –0.3139   –0.3520   –0.3792
PSO     0.2255    0         –0.0145   0.0595    –0.0181   –0.0884   –0.1265   –0.1538
GSA     0.2400    0.0145    0         0.0740    –0.0036   –0.0739   –0.1120   –0.1393
SCA     0.1660    –0.0595   –0.0740   0         –0.0775   –0.1479   –0.1860   –0.2132
SELO    0.2435    0.0181    0.0036    0.0775    0         –0.0704   –0.1085   –0.1357
HBO     0.3139    0.0884    0.0739    0.1479    0.0704    0         –0.0381   –0.0654
LFD     0.3520    0.1265    0.1120    0.1860    0.1085    0.0381    0         –0.0273
SETO    0.3792    0.1538    0.1393    0.2132    0.1357    0.0654    0.0273    0

O(ND) + O(NC) + O(ND + NC) + O(ND + NC) + O(N) + O(N) = O(ND) + O(NC) + 2O(ND + NC) + 2O(N)    (29)

Since the cost of computing objective function varies for each optimization problem,
Eq. (29) can be revised as follows:
O(ND) + O(ND) + 2O(ND + ND) + 2O(N) ≈ O(ND)    if D > C
O(NC) + O(NC) + 2O(NC + NC) + 2O(N) ≈ O(NC)    otherwise    (30)

The overall time complexity of SETO is O(GND) or O(GNC) when the algorithm
iterates for G generations. The overall time complexity of GA, PSO, GSA, SCA,
SELO, HBO, and LFD is O(GND) in the worst case; hence, the time complexity of
SETO is asymptotically equivalent to that of its counterparts. This shows that SETO
is computationally efficient compared with the other algorithms.

4.7.2 Space complexity

The proposed SETO needs O(N × D) space to store the population at each generation,
where N denotes the population size and D is the number of dimensions of the
problem. Besides, the algorithm uses O(N) space to store the fitness of the shares.
The overall space complexity of SETO is O(ND).

5 Engineering problems

To show the applicability of the SETO algorithm to real-world problems, we applied
it to four well-studied engineering problems: three-bar truss design, rolling element
bearing design, pressure vessel design, and speed reducer design. Since these
engineering problems include several constraints, SETO is equipped with a
constraint handling method: if a solution violates any of the design constraints, the
algorithm discards it and generates a valid solution instead.
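The rejection-based constraint handling described above can be sketched as follows; the bounds and the constraint below are illustrative placeholders, not taken from the paper:

```python
import random

# Hedged sketch of rejection-based constraint handling: candidate solutions
# violating any constraint are discarded and regenerated from scratch.
def random_solution(bounds):
    return [random.uniform(lo, hi) for lo, hi in bounds]

def generate_feasible(bounds, constraints, max_tries=10_000):
    """Resample until every constraint g(x) <= 0 is satisfied."""
    for _ in range(max_tries):
        x = random_solution(bounds)
        if all(g(x) <= 0 for g in constraints):
            return x
    raise RuntimeError("no feasible solution found within the retry budget")

bounds = [(0.0, 1.0), (0.0, 1.0)]
constraints = [lambda x: x[0] + x[1] - 1.5]   # feasible iff x1 + x2 <= 1.5
x = generate_feasible(bounds, constraints)
```

Rejection sampling is simple but can become slow when the feasible region is a small fraction of the search space; penalty-based handling is a common alternative in such cases.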

Table 13  Statistical results of 1000D unimodal functions

Function  GA (Mean ± Std)           PSO (Mean ± Std)          GSA (Mean ± Std)          SCA (Mean ± Std)
F11       8.66E+15 ± 1.52E+06 ⊖     4.40E+02 ± 7.51E+00 ⊖     8.10E+05 ± 3.86E+00 ⊖     4.86E+03 ± 6.84E+02 ⊖
F12       2.53E+09 ± 8.96E+05 ⊖     2.06E+05 ± 2.75E+02 ⊖     1.07E+07 ± 5.54E+03 ⊖     4.28E+08 ± 6.55E+06 ⊖
F13       9.31E+05 ± 8.64E+02 ⊖     9.44E+03 ± 9.80E+01 ⊖     2.08E+04 ± 9.34E+02 ⊖     5.23E–02 ± 1.70E–03 ⊖
F14       2.97E–02 ± 6.00E–04 ⊖     2.92E+00 ± 5.00E–03 ⊖     1.45E–17 ± 6.35E–18 ⊖     3.19E+05 ± 3.25E+03 ⊖
F15       1.12E+10 ± 6.47E+02 ⊖     1.95E+04 ± 3.65E+02 ⊖     1.51E+07 ± 9.17E+05 ⊖     2.16E+09 ± 6.32E+07 ⊖
F16       4.28E+04 ± 1.28E+02 ⊖     4.76E+02 ± 8.47E+01 ⊖     5.71E+03 ± 6.02E+02 ⊖     1.06E+03 ± 3.27E+02 ⊖
F17       9.78E+01 ± 1.65E+00 ⊖     9.95E–01 ± 5.00E–03 ⊖     3.18E+01 ± 1.52E+01 ⊖     9.91E+01 ± 1.60E+01 ⊖
F18       inf ⊖                     4.83E+02 ± 1.39E+01 ⊖     6.31E+01 ± 3.12E+00 ⊖     inf ⊖
F19       3.54E+03 ± 1.56E+01 ⊖     8.27E+01 ± 5.00E–02 ⊖     4.62E+06 ± 6.15E+03 ⊖     1.51E+11 ± 3.65E+06 ⊖
F20       2.64E+06 ± 5.61E+02 ⊖     3.11E+02 ± 3.50E+01 ⊖     0.00E+00 ± 0.00E+00 ⊙     3.08E+05 ± 2.49E+04 ⊖
F21       1.23E+07 ± 6.12E+03 ⊖     1.56E+05 ± 1.20E–02 ⊖     4.23E+05 ± 3.31E+02 ⊖     1.30E+06 ± 2.14E+04 ⊖
F22       0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙
⊖         11                        11                        10                        11
⊕         0                         0                         0                         0
⊙         1                         1                         2                         1

Function  SELO (Mean ± Std)         HBO (Mean ± Std)          LFD (Mean ± Std)          SETO (Mean ± Std)
F11       5.12E–04 ± 9.45E–05 ⊖     1.45E+01 ± 5.02E+00 ⊖     1.66E–06 ± 8.55E–08 ⊖     0.00E+00 ± 0.00E+00
F12       1.00E+00 ± 6.00E–03 ⊖     1.56E+05 ± 1.26E+03 ⊖     1.00E+00 ± 4.15E–28 ⊖     6.67E–01 ± 0.00E+00
F13       2.58E–05 ± 7.00E–04 ⊖     8.61E+02 ± 7.64E+00 ⊖     2.53E–06 ± 1.19E–08 ⊖     0.00E+00 ± 0.00E+00
F14       5.66E–06 ± 6.48E–08 ⊖     7.19E–09 ± 1.75E–10 ⊖     1.77E–03 ± 2.56E–05 ⊖     0.00E+00 ± 0.00E+00
F15       9.97E+02 ± 6.03E+01 ⊕     1.76E+03 ± 1.42E+02 ⊖     9.89E+02 ± 3.26E–02 ⊕     9.99E+02 ± 0.00E+00
F16       2.47E–04 ± 3.60E–04 ⊖     2.05E+01 ± 6.56E+00 ⊖     1.98E–01 ± 3.90E–04 ⊖     0.00E+00 ± 0.00E+00
F17       7.97E+01 ± 7.55E–01 ⊖     9.87E+01 ± 8.05E+00 ⊖     6.94E–04 ± 6.31E–07 ⊖     0.00E+00 ± 0.00E+00
F18       1.30E+193 ± 6.64E+85 ⊖    inf ⊖                     inf ⊖                     0.00E+00 ± 0.00E+00
F19       1.15E–03 ± 4.70E–05 ⊖     6.64E+06 ± 5.64E+03 ⊖     7.01E–43 ± 3.95E–45 ⊖     0.00E+00 ± 0.00E+00
F20       2.60E–08 ± 6.00E–06 ⊖     6.27E–03 ± 3.40E–04 ⊖     6.32E–05 ± 8.32E–07 ⊖     0.00E+00 ± 0.00E+00
F21       1.03E–07 ± 5.60E–04 ⊖     1.39E+03 ± 9.66E+01 ⊖     3.01E–04 ± 1.70E–05 ⊖     0.00E+00 ± 0.00E+00
F22       0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00
⊖         10                        11                        10                        –
⊕         1                         1                         1                         –
⊙         1                         1                         1                         –

Best results are illustrated in boldface
Table 14  Statistical results of 1000D multimodal functions

Function  GA (Mean ± Std)            PSO (Mean ± Std)          GSA (Mean ± Std)           SCA (Mean ± Std)
F23       2.09E+01 ± 5.19E+00 ⊖      3.83E+00 ± 1.70E–01 ⊖     9.41E+00 ± 2.15E+00 ⊖      2.08E+01 ± 2.16E+00 ⊖
F24       2.39E+03 ± 1.26E+01 ⊖      2.95E+02 ± 2.30E+01 ⊖     3.79E+02 ± 5.56E+01 ⊖      2.82E+02 ± 2.37E+00 ⊖
F25       2.29E+04 ± 7.52E+01 ⊖      7.00E–01 ± 0.00E+00 ⊖     6.64E+01 ± 1.53E+01 ⊖      5.78E+01 ± 7.01E+00 ⊖
F26       3.80E+02 ± 5.80E+01 ⊖      2.43E+02 ± 1.30E–01 ⊖     1.69E+02 ± 1.08E+01 ⊖      3.61E+02 ± 2.60E+01 ⊖
F27       1.56E+04 ± 8.24E+02 ⊖      4.64E+03 ± 6.70E+01 ⊖     3.78E+01 ± 2.40E+01 ⊖      2.66E+03 ± 9.60E+01 ⊖
F28       1.62E+02 ± 1.14E+01 ⊖      1.80E+00 ± 6.00E–10 ⊖     3.61E+01 ± 7.68E+00 ⊖      6.15E+01 ± 6.06E+00 ⊖
F29       6.31E+07 ± 9.52E+03 ⊖      2.84E+03 ± 4.60E+01 ⊖     3.13E+06 ± 3.55E+04 ⊖      7.57E+06 ± 5.82E+04 ⊖
F30       inf ⊖                      1.46E+00 ± 3.20E–03 ⊖     1.32E+102 ± 1.56E+36 ⊖     inf ⊖
F31       4.10E–182 ± 6.30E–185 ⊖    0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙      0.00E+00 ± 0.00E+00 ⊙
F32       0.00E+00 ± 0.00E+00 ⊙      0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙      0.00E+00 ± 0.00E+00 ⊙
⊖         9                          8                         8                          8
⊕         0                          0                         0                          0
⊙         1                          2                         2                          2

Function  SELO (Mean ± Std)          HBO (Mean ± Std)          LFD (Mean ± Std)           SETO (Mean ± Std)
F23       3.11E–04 ± 3.16E–07 ⊖      2.45E+00 ± 1.08E+00 ⊖     3.24E–04 ± 2.61E–04 ⊖      –8.88E–16 ± 0.00E+00
F24       1.99E–38 ± 4.74E–39 ⊖      5.82E–03 ± 3.64E–05 ⊖     2.58E–03 ± 3.20E–02 ⊖      0.00E+00 ± 0.00E+00
F25       2.22E–16 ± 2.38E–17 ⊖      4.39E–05 ± 7.00E–06 ⊖     1.34E–07 ± 6.80E–08 ⊖      0.00E+00 ± 0.00E+00
F26       9.00E–01 ± 5.53E–24 ⊙      8.74E+00 ± 2.10E+00 ⊖     9.00E–01 ± 0.00E+00 ⊙      9.00E–01 ± 0.00E+00
F27       6.36E–04 ± 3.16E–05 ⊖      1.09E+03 ± 6.52E+02 ⊖     8.24E–06 ± 1.20E–05 ⊖      0.00E+00 ± 0.00E+00
F28       3.00E–01 ± 5.60E–03 ⊖      1.44E+01 ± 2.50E+00 ⊖     1.21E–03 ± 4.80E–04 ⊖      0.00E+00 ± 0.00E+00
F29       3.09E+03 ± 9.90E+01 ⊖      6.32E+03 ± 5.53E+02 ⊖     3.59E+03 ± 9.75E+02 ⊖      3.59E+01 ± 1.13E+01
F30       1.56E–53 ± 3.51E–55 ⊖      inf ⊖                     inf ⊖                      0.00E+00 ± 0.00E+00
F31       1.75E–143 ± 3.07E–148 ⊖    0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙      0.00E+00 ± 0.00E+00
F32       0.00E+00 ± 0.00E+00 ⊙      0.00E+00 ± 0.00E+00 ⊙     0.00E+00 ± 0.00E+00 ⊙      0.00E+00 ± 0.00E+00
⊖         8                          8                         7                          –
⊕         0                          0                         0                          –
⊙         2                          2                         3                          –

Best results are illustrated in boldface
Fig. 7  Comparison of the execution time of algorithms in 1000D unimodal and multimodal functions

5.1 Three‑bar truss design problem

Figure 9a shows the structure of the three-bar truss design problem, one of the most
studied test cases in the literature [7, 36]. The objective is to design a truss with
three bars so that its weight is minimal. The problem has two parameters: the
cross-sectional area of bars 1 and 3 ( A1 ) and the area of bar 2 ( A2 ). Three
constraints should be considered in designing the truss: stress, deflection, and
buckling. The problem is mathematically defined as follows:
let X⃗ = [x1, x2] = [A1, A2],  0 ≤ x1, x2 ≤ 1
minimize    f(X⃗) = l × (2√2 x1 + x2)
subject to  g1(X⃗) = ((√2 x1 + x2) / (√2 x1² + 2 x1 x2)) P − 𝜎 ≤ 0
            g2(X⃗) = (x2 / (√2 x1² + 2 x1 x2)) P − 𝜎 ≤ 0
            g3(X⃗) = (1 / (√2 x2 + x1)) P − 𝜎 ≤ 0                    (31)
where l = 100 cm, P = 2 kN/cm², and 𝜎 = 2 kN/cm²
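Eq. (31) is straightforward to evaluate; the sketch below plugs in SETO's reported design from Table 15 (the values are rounded to four decimals, so tiny constraint residuals are expected):

```python
import math

# Minimal evaluation of the three-bar truss problem of Eq. (31),
# with l = 100 cm and P = sigma = 2 kN/cm^2 as stated in the text.
L_BAR, P, SIGMA = 100.0, 2.0, 2.0

def truss_weight(x1, x2):
    return L_BAR * (2.0 * math.sqrt(2.0) * x1 + x2)

def truss_constraints(x1, x2):
    denom = math.sqrt(2.0) * x1**2 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / denom * P - SIGMA
    g2 = x2 / denom * P - SIGMA
    g3 = 1.0 / (math.sqrt(2.0) * x2 + x1) * P - SIGMA
    return g1, g2, g3

# SETO's reported design from Table 15 (rounded to four decimals).
w = truss_weight(0.7886, 0.4083)
```

Evaluating the weight at this design reproduces the ~263.9 optimum reported in Table 15.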

Table 15 compares the optimization results obtained by the algorithms on the
three-bar truss design problem. To generate the results, each algorithm was run 30
times, each time with a different initial population. The population size and the
number of FEs are set to 25 and 100,000, respectively. The results confirm that
SETO outperforms the other algorithms in finding the optimal parameters and the
minimum weight of the truss.

5.2 Rolling element bearing design problem

Figure 9b shows the schematic view of the rolling element bearing design problem.
It is a maximization problem with ten geometric variables and nine design
constraints that control the assembly and geometry-based restrictions [15].

Fig. 8  Convergence graphs and solution distributions of comparison algorithms on F6, F11, F28, F35, and F40 test functions

The objective is to maximize the dynamic load-carrying capacity of a rolling element bearing. This problem is mathematically formulated as follows [15]:
maximize
Cd = fc Z^(2/3) Db^1.8          if D ≤ 25.4 mm
Cd = 3.647 fc Z^(2/3) Db^1.4    if D > 25.4 mm

subject to
g1(z⃗) = 𝜙0 / (2 sin⁻¹(Db/Dm)) − Z + 1 ≤ 0,    g2(z⃗) = 2Db − KDmin(D − d) > 0,
g3(z⃗) = KDmax(D − d) − 2Db ≥ 0,               g4(z⃗) = 𝜉Bw − Db ≤ 0,
g5(z⃗) = Dm − 0.5(D + d) ≥ 0,                  g6(z⃗) = (0.5 + e)(D + d) − Dm ≥ 0,
g7(z⃗) = 0.5(D − Dm − Db) − 𝜀Db ≥ 0,           g8(z⃗) = fi ≥ 0.515,
g9(z⃗) = fo ≥ 0.515

where
fc = 37.91 [1 + {1.04 ((1 − 𝛾)/(1 + 𝛾))^1.72 (fi(2fo − 1)/(fo(2fi − 1)))^0.41}^(10/3)]^(−0.3)
     × [𝛾^0.3 (1 − 𝛾)^1.39 / (1 + 𝛾)^(1/3)] × [2fi/(2fi − 1)]^0.41
x = {(D − d)/2 − 3(T/4)}² + {D/2 − T/4 − Db}² − {d/2 + T/4}²
y = 2{(D − d)/2 − 3(T/4)}{D/2 − T/4 − Db}
𝜙0 = 2𝜋 − 2cos⁻¹(x/y)
𝛾 = Db/Dm,  fi = ri/Db,  fo = ro/Db,  T = D − d − 2Db,
D = 160,  d = 90,  Bw = 30,  ri = ro = 11.033,
0.5(D + d) ≤ Dm ≤ 0.6(D + d),  0.15(D − d) ≤ Db ≤ 0.45(D − d),
4 ≤ Z ≤ 50,  0.515 ≤ fi, fo ≤ 0.6,  0.4 ≤ KDmin ≤ 0.5,  0.6 ≤ KDmax ≤ 0.7,
0.3 ≤ e ≤ 0.4,  0.02 ≤ 𝜀 ≤ 0.1,  0.6 ≤ 𝜉 ≤ 0.85                              (32)
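The objective of Eq. (32) can be evaluated directly from the design variables; a minimal sketch (the nine constraints are omitted for brevity, and since the reported variables are rounded, the value only approximates the Cd column of Table 16):

```python
# Sketch of the dynamic load capacity Cd from Eq. (32). The branch condition
# is written in terms of D as in the text; some formulations state it in
# terms of Db instead.
def load_capacity(Dm, Db, Z, fi, fo, D=160.0):
    gamma = Db / Dm
    term1 = (1 + (1.04 * ((1 - gamma) / (1 + gamma))**1.72
                  * (fi * (2 * fo - 1) / (fo * (2 * fi - 1)))**0.41)**(10.0 / 3.0))**-0.3
    term2 = gamma**0.3 * (1 - gamma)**1.39 / (1 + gamma)**(1.0 / 3.0)
    term3 = (2 * fi / (2 * fi - 1))**0.41
    fc = 37.91 * term1 * term2 * term3
    if D <= 25.4:
        return fc * Z**(2.0 / 3.0) * Db**1.8
    return 3.647 * fc * Z**(2.0 / 3.0) * Db**1.4

# SETO's reported design from Table 16.
cd = load_capacity(Dm=125.7227, Db=21.4233, Z=11, fi=0.515, fo=0.515)
```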
Table 16 summarizes the solutions obtained by the proposed SETO and comparison
algorithms for the rolling element bearing design problem. Inspecting the results in
Table 16, we conclude that the SETO obtains superior results compared with other
optimizers and exposes the best design.

5.3 Speed reducer design problem

Figure 9c shows a schematic view of the speed reducer design problem. The
objective is to design a simple gearbox with the minimum weight that is embedded
between the propeller and the engine of a light aircraft [15]. The problem includes
constraints on the surface stress, the bending stress of the gear teeth, the stresses in
the shafts, and the transverse deflections of the shafts. The mathematical
formulation of the problem is as follows [15]:

Fig. 9  Engineering design problems used in the tests

Table 15  Comparison of results obtained by algorithms for three-bar truss design

Algorithm   x1       x2       Optimum weight
GA          0.792    0.399    263.9037
PSO         0.7901   0.4042   263.8974
GSA         0.7898   0.4052   263.8967
SCA         0.7875   0.4117   263.8989
SELO        0.7878   0.4108   263.8964
HBO         0.7887   0.4082   263.8959
LFD         0.7879   0.4106   263.8963
SETO        0.7886   0.4083   263.8958

Best results are illustrated in boldface


Table 16  Comparison of results for rolling element bearing design problem

Variable   GA         PSO        GSA        SCA        SELO       HBO        LFD        SETO
Dm         127.4083   127.5557   125        125.812    126.3521   125.7189   126.3999   125.7227
Db         20.3698    20.2762    20.6628    20.8214    21.0299    21.4233    21         21.4233
Z          11         11         11         11.00      11.00      11.00      11         11
fi         0.515      0.515      0.515      0.515      0.515      0.515      0.515      0.515
fo         0.515      0.515      0.5333     0.5182     0.515      0.515      0.5251     0.515
KDmin      0.4        0.5        0.5        0.50       0.4        0.4        0.5        0.4
KDmax      0.6        0.6        0.60       0.63       0.6011     0.7        0.6        0.7
𝜀          0.3        0.7        0.3469     0.3003     0.3        0.3        0.3        0.3
e          0.1        0.3        0.02       0.0669     0.1        0.0998     0.1        0.1
𝜉          0.6        0.0956     0.6884     0.6001     0.6004     0.60       0.6        0.6
Cd         80863.22   80433.47   81373.29   81256.51   83805.29   85537.48   83670.78   85539.19

Best results are illustrated in boldface

minimize  f(x⃗) = 0.7854 x1 x2² (3.3333 x3² + 14.9334 x3 − 43.0934) − 1.508 x1 (x6² + x7²)
                + 7.4777 (x6³ + x7³) + 0.7854 (x4 x6² + x5 x7²)

subject to
g1(x⃗) = 27/(x1 x2² x3) − 1 ≤ 0,            g2(x⃗) = 397.5/(x1 x2² x3²) − 1 ≤ 0,
g3(x⃗) = 1.93 x4³/(x2 x3 x6⁴) − 1 ≤ 0,      g4(x⃗) = 1.93 x5³/(x2 x3 x7⁴) − 1 ≤ 0,
g5(x⃗) = (1/(110 x6³)) √((745 x4/(x2 x3))² + 16.9 × 10⁶) − 1 ≤ 0,
g6(x⃗) = (1/(85 x7³)) √((745 x5/(x2 x3))² + 157.5 × 10⁶) − 1 ≤ 0,
g7(x⃗) = x2 x3/40 − 1 ≤ 0,                  g8(x⃗) = 5 x2/x1 − 1 ≤ 0,
g9(x⃗) = x1/(12 x2) − 1 ≤ 0,                g10(x⃗) = (1.5 x6 + 1.9)/x4 − 1 ≤ 0,
g11(x⃗) = (1.1 x7 + 1.9)/x5 − 1 ≤ 0                                          (33)
where 2.6 ≤ x1 ≤ 3.6,  0.7 ≤ x2 ≤ 0.8,  17 ≤ x3 ≤ 28,  7.3 ≤ x4 ≤ 8.3,
      7.3 ≤ x5 ≤ 8.3,  2.9 ≤ x6 ≤ 3.9,  5.0 ≤ x7 ≤ 5.5
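Evaluating the objective of Eq. (33) at SETO's reported solution from Table 17 reproduces its cost; a minimal sketch (the eleven constraints are omitted for brevity):

```python
# Evaluation sketch of the speed reducer objective of Eq. (33).
def reducer_weight(x1, x2, x3, x4, x5, x6, x7):
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))

# SETO's reported solution from Table 17 yields a cost of ~2994.50.
w = reducer_weight(3.500013, 0.700001, 17, 7.300330, 7.715996, 3.350216, 5.286655)
```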

As shown in Table 17, HBO obtains the best result, and SETO takes the second
rank with only a slight difference from HBO. Apart from HBO, SETO attains the
best results compared with the other optimizers, which confirms that it can be a
suitable choice for designing the speed reducer.

5.4 Pressure vessel design problem

Pressure vessels are widely used in industrial structures such as gas tanks and
champagne bottles. The goal is to design a cylindrical vessel with the minimum
fabrication cost. The problem consists of four design parameters including the thickness of


Table 17  Comparison of results for speed reducer design problem


Algorithm   x1         x2        x3   x4         x5         x6         x7         Optimal cost

GA 3.599614 0.7 17 7.300000 7.715320 3.350238 5.286655 3033.6028


PSO 3.600000 0.7 17 8.299999 7.715358 3.352207 5.286655 3043.0812
GSA 3.500252 0.7 17 7.750236 7.715629 3.351082 5.286725 2998.8137
SCA 3.564661 0.7 17 7.300000 7.858052 3.356281 5.288056 3025.4368
SELO 3.500195 0.70002 17 7.307411 7.918465 3.350301 5.286724 2999.2274
HBO 3.500000 0.7 17 7.300000 7.715320 3.350210 5.286650 2994.4711
LFD 3.500006 0.7 17 7.304732 7.715321 3.350224 5.286655 2994.5173
SETO 3.500013 0.700001 17 7.300330 7.715996 3.350216 5.286655 2994.4991

Best results are illustrated in boldface

Table 18  Comparison of results for pressure vessel design problem

Algorithm   Ts        Th        R          L           Cost
GA          0.87591   0.43296   45.38407   139.77943   6074.4540
PSO         0.86919   0.43221   45.03562   143.43608   6071.4145
GSA         1.12500   0.62500   55.98870   84.45420    8538.8359
SCA         0.86390   0.42703   44.76160   146.21210   6048.6169
SELO        0.87871   0.43435   45.52902   138.30634   6080.3521
HBO         0.84186   0.41600   43.60000   159.00000   6003.3650
LFD         0.84308   0.41674   43.68298   157.94655   6005.8844
SETO        0.81268   0.40171   42.10791   176.53302   5947.3050

Best results are illustrated in boldface

the head ( Ts ), the thickness of the body ( Th ), the inner radius (R), and the length of
the cylindrical section (L). Figure 9d shows the overall structure of the pressure
vessel design problem. The problem is mathematically defined as follows [4]:
let x⃗ = [x1, x2, x3, x4] = [Ts, Th, R, L], where 0 ≤ x1, x2 ≤ 99 and 10 ≤ x3, x4 ≤ 200,

minimize    f(x⃗) = 0.6224 x1 x3 x4 + 1.7781 x2 x3² + 3.1661 x1² x4 + 19.84 x1² x3
subject to  c1(x⃗) = −x1 + 0.0193 x3 ≤ 0,
            c2(x⃗) = −x2 + 0.00954 x3 ≤ 0,
            c3(x⃗) = −𝜋 x3² x4 − (4/3) 𝜋 x3³ + 1,296,000 ≤ 0,
            c4(x⃗) = x4 − 240 ≤ 0                                  (34)
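The cost function of Eq. (34) can be checked against the designs in Table 18; a minimal sketch evaluating SETO's reported design:

```python
import math

# Evaluation sketch of the pressure vessel problem of Eq. (34):
# fabrication cost plus the four design constraints.
def vessel_cost(Ts, Th, R, L):
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

def vessel_constraints(Ts, Th, R, L):
    c1 = -Ts + 0.0193 * R
    c2 = -Th + 0.00954 * R
    c3 = -math.pi * R**2 * L - (4.0 / 3.0) * math.pi * R**3 + 1_296_000
    c4 = L - 240
    return c1, c2, c3, c4

# SETO's reported design from Table 18 reproduces its cost of ~5947.31.
cost = vessel_cost(0.81268, 0.40171, 42.10791, 176.53302)
```

At this design, c1 and c2 sit essentially on the constraint boundary, which is typical of cost-minimal vessel designs.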

Table 18 reports the results attained by SETO and the comparison optimizers. SETO
obtains the lowest fabrication cost among all algorithms, which confirms that it is
able to deal with the constrained search space of the pressure vessel design problem.


The current implementation of the SETO algorithm faces three challenges:

• Tuning some of the control parameters to optimal values for different
applications. Most of the control parameters of SETO are already known and
configured using data drawn from the stock exchange and scientific resources on
technical analysis, which makes SETO quite easy to implement and execute.
However, in some applications, different parameter values can increase the
performance of the algorithm. Parameter setting is not specific to SETO and
exists in all algorithms.
• Increased execution time due to the calculation of the Euclidean distance
between shares in the rising and falling phases. As the dimension of the problem
increases, the execution time of the algorithm also increases.
• The algorithm still gets trapped in local optima on some benchmark functions
and cannot converge to the global optimum, as seen in the speed reducer design
problem and some numerical functions such as F39 and F40. This suggests that
further increasing the exploitation and exploration power of the algorithm is
needed.

To summarize, the advantages of the SETO algorithm are as follows:

• It can be used for both continuous and discrete problems with some easy modifi-
cations.
• It is simple and efficient. It achieves superior results on different groups of
numerical functions and engineering optimization problems.
• It can be applied to all problems that other comparable algorithms can be applied to.
• It converges to the global optimum of the optimization problems faster than its
counterparts.
• It outperformed the other algorithms on most benchmark functions. Out of 40
numerical optimization functions, SETO achieved the global optimum on 36;
and of the 4 complex engineering problems, it obtained the best results on three
and a highly competitive second-best result on the speed reducer problem.

6 Conclusion

This paper presented a novel stock exchange trading optimization (SETO)
algorithm to solve numerical and engineering optimization problems. The algorithm
is based on technical trading strategies in the stock market. Rising, falling, and
exchange are its three main phases, which drive the solutions toward the global
optimum of the cost function. SETO is conceptually simple and easy to implement.
To test its performance, SETO was compared with several state-of-the-art
optimizers on a wide variety of numerical global optimization and real-world
problems. The results and their statistics confirm that SETO attained outstanding
performance compared with its counterparts in most test cases. There remain
several directions for future research. One interesting direction is to apply SETO
to a variety of real-world applications to precisely determine the advantages and


weaknesses of the algorithm. Another work is to develop a multi-objective version


of the SETO to employ it for solving multi-objective problems. Finally, modeling
various indicators and phenomena in the stock exchange such as options and share
portfolio, and improving the potential of algorithm operators can be helpful to guide
the search process and further improve the performance of the algorithm.

References
1. Brammya G, Praveena S, Ninu Preetha NS, Ramya R, Rajakumar BR, Binu D (2019) Deer hunting
optimization algorithm: a new nature-inspired meta-heuristic paradigm. Comput J
2. Molina D, Poyatos J, Del Ser J, García S, Hussain A, Herrera F (2020) Comprehensive taxonomies
of nature- and bio-inspired optimization: inspiration versus algorithmic behavior, critical analysis
and recommendations. Cognit Comput 12(5):897–939
3. Abbasi M, Yaghoobikia M, Rafiee M, Jolfaei A, Khosravi MR (2020) Energy-efficient workload
allocation in fog-cloud based services of intelligent transportation systems using a learning classifier
system. IET Intell Transp Syst 14(11):1484–1490
4. Houssein EH, Saad MR, Hashim FA, Shaban H, Hassaballah H (2020) Lévy flight distribution: a
new metaheuristic algorithm for solving engineering optimization problems. Eng Appl Artif Intell
94:103731
5. Hussain K, Salleh M, Cheng S, Shi Y (2018) Metaheuristic research: a comprehensive survey. Artif
Intell Rev 52:2191–2233
6. Yang XS, Deb S, Zhao YX, Fong S, He X (2018) Swarm intelligence: past, present and future. Soft
Comput 22(18):5923–5933
7. Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization:
algorithm and applications. Futur Gener Comput Syst 97:849–872
8. Abdel-Basset M, Abdel-Fatah L, Sangaiah AK (2018) Meta-heuristic algorithms: a comprehensive
review. In: Computational intelligence for multimedia big data on the cloud with engineering appli-
cations. Elsevier Inc
9. Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN’95—Interna-
tional Conference on Neural Networks, Perth, WA, Australia, pp 1942–1948
10. Dorigo M, Birattari M, Stutzle T (2006) Ant colony optimization. IEEE Comput Intell Mag 1:28–39
11. Emami H, Derakhshan F (2015) Election algorithm: a new socio-politically inspired strategy. AI
Commun 28(3):591–603
12. Emami H (2019) Chaotic election algorithm. Comput Inform 38:1444–1478
13. Fadakar F, Ebrahimi M (2016) A new metaheuristic football game inspired algorithm. In: 1st Con-
ference on Swarm Intelligence and Evolutionary Computation CSIEC 2016—Proceedings, pp 6–11
14. Askari Q, Younas I, Saeed M (2020) Political optimizer: a novel socio-inspired meta-heuristic for
global optimization. Knowl Based Syst 195:105709
15. Askari Q, Saeed M, Younas I (2020) Heap-based optimizer inspired by corporate rank hierarchy for
global optimization. Expert Syst Appl 161:113702
16. Salih SQ, Alsewari ARA (2020) A new algorithm for normal and large-scale optimization prob-
lems: Nomadic People Optimizer. Neural Comput Appl 32(14):10359–10386
17. Sörensen K, Sevaux M, Glover F (2017) A history of metaheuristics. In: ORBEL29-29th Belgian
Conference on Operations Research, pp 791–808
18. Emami H (2020) Seasons optimization algorithm. Eng Comput 123456789:1–21
19. Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
20. Holland JH (1992) Genetic algorithms—computer programs that ‘evolve’ in ways that resemble
natural selection can solve complex problems even their creators do not fully understand. Sci Am
66–72
21. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput
3(2):82–102
22. Huang F, Wang L, He Q (2007) An effective co-evolutionary differential evolution for constrained
optimization. Appl Math Comput 186:340–356
23. Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12(6):702–713


24. Ghaemi M, Feizi-Derakhshi MR (2014) Forest optimization algorithm. Expert Syst Appl 41(15):6676–6687
25. Hayyolalam V, Pourhaji Kazem AA (2020) Black widow optimization algorithm: a novel meta-heu-
ristic approach for solving engineering optimization problems. Eng Appl Artif Intell 87:103249
26. Shayanfar H, Gharehchopogh FS (2018) Farmland fertility: a new metaheuristic algorithm for solv-
ing continuous optimization problems. Appl Soft Comput J 71:728–746
27. Karaboga D, Basturk B (2007) A powerful and efficient algorithm for numerical function optimiza-
tion: artificial bee colony (ABC) algorithm. J Glob Optim 39(3):459–471
28. Yang X (2010) Firefly algorithm, stochastic test functions and design optimisation. Int J Bio-Inspir
Comput 2(2):78–84
29. Gandomia AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun
Nonlinear Sci Numer Simul 17(12):4831–4845
30. Wang GG, Deb S, Coelho LDS (2016) Elephant herding optimization. In: Proceedings of 2015 3rd
International Symposium on Computational and Business Intelligence ISCBI, pp 1–5
31. Bansal JC, Sharma H, Jadon SS, Clerc M (2014) Spider monkey optimization algorithm for numeri-
cal optimization. Memet Comput 16(1):31–47
32. Mirjalili S, Mohammad S, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
33. Soleimanian F, Gholizadeh H (2019) A comprehensive survey: whale optimization algorithm and its
applications. Swarm Evol Comput 48:1–24
34. Arora S, Singh S (2018) Butterfly optimization algorithm: a novel approach for global optimization.
Soft Comput 23(3):715–734
35. Jain M, Singh V, Rani A (2019) A novel nature-inspired algorithm for optimization: squirrel search
algorithm. Swarm Evol Comput 44:148–175
36. Saremi S, Mirjalili S, Lewis A (2017) Grasshopper optimisation algorithm: theory and application.
Adv Eng Softw 105:30–47
37. Dhiman G, Kumar V (2019) Seagull optimization algorithm: theory and its applications for large
scale industrial engineering problems. Knowl Based Syst 165:169–196
38. Tan WH, Mohamad-Saleh J (2019) Normative fish swarm algorithm (NFSA) for optimization.
Soft Comput 24(3):2083–2099
39. Fathollahi-Fard AM, Hajiaghaei-Keshteli M, Tavakkoli-Moghaddam R (2020) Red deer algorithm
(RDA): a new nature-inspired meta-heuristic. Soft Comput 24(19):14637–14665
40. Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220:671–680
41. Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci
179(13):2232–2248
42. Erol OK, Eksin I (2006) A new optimization method: big bang–big crunch. Adv Eng Softw
37:106–111
43. Alatas B (2011) ACROA: artificial chemical reaction optimization algorithm for global optimiza-
tion. Expert Syst Appl 38(10):13170–13180
44. Shah-Hosseini H (2011) Principal components analysis by the galaxy-based search algorithm: a
novel metaheuristic for continuous optimisation. Int J Comput Sci Eng 6(2):132–140
45. Feng X, Liu Y, Yu H, Luo F (2017) Physarum-energy optimization algorithm. Soft Comput 23(3):871–888
46. Kaveh A, Dadras A (2017) A novel meta-heuristic optimization algorithm: thermal exchange opti-
mization. Adv Eng Softw 110:69–84
47. Faramarzi A, Heidarinejad M, Stephens B, Mirjalili S (2019) Equilibrium optimizer: a novel optimi-
zation algorithm. Knowl Based Syst 191:105190
48. Kushwaha N, Pant M, Kant S, Jain VK (2018) Magnetic optimization algorithm for data clustering.
Pattern Recognit Lett 115:59–65
49. Alexandros GD (2017) Nature inspired optimization algorithms related to physical phenomena and
laws of science: a survey. Int J Artif Intell Tools 26(6):1–25
50. Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony
search. Simulation 76(2):60–68
51. Atashpaz-Gargari E, Lucas C (2007) Imperialist competitive algorithm: an algorithm for optimiza-
tion inspired by imperialistic competition. In: 2007 IEEE Congress on Evolutionary Computation,
CEC2007, Singapore, pp 4661–4667
52. Rao RV, Savsani VJ, Vakharia DP (2012) Teaching-learning-based optimization: an optimization
method for continuous non-linear large scale problems. Inf Sci 183(1):1–15

53. Husseinzadeh Kashan A (2014) League championship algorithm (LCA): an algorithm for global
optimization inspired by sport championships. Appl Soft Comput J 16:171–200
54. Das P, Das DK, Dey S (2018) A new class topper optimization algorithm with an application to data
clustering. IEEE Trans Emerg Top Comput 6750:1–11
55. Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl Based
Syst 96:120–133
56. Kumar M, Kulkarni AJ, Satapathy SC (2018) Socio evolution & learning optimization algorithm: a
socio-inspired optimization methodology. Futur Gener Comput Syst 81:252–272
57. Mahmoodabadi MJ, Rasekh M, Zohari T (2018) TGA: team game algorithm. Future Comput
Inform J 3(2):191–199
58. Singh PR, Elaziz MA, Xiong S (2019) Ludo game-based metaheuristics for global and engineering
optimization. Appl Soft Comput J 84:105723
59. Martinez-Alvarez F et al (2020) Coronavirus optimization algorithm: a bio-inspired meta-heuristic
based on the COVID-19 propagation model. Big Data 8(4):308–322
60. Abbasi M, Yaghoobikia M, Rafiee M, Jolfaei A, Khosravi MR (2020) Energy-efficient workload
allocation in fog-cloud based services of intelligent transportation systems using a learning classifier
system. IET Intell Transp Syst 14(11):1484–1490
61. Zhou Z, Kearnes S, Li L, Zare RN, Riley P (2019) Optimization of molecules via deep reinforce-
ment learning. Sci Rep 9(1):1–10
62. Talbi EG (2019) Machine learning for metaheuristics—state of the art and perspectives. In: 11th
International Conference on Knowledge and Smart Technology (KST), pp XXIII–XXIII
63. Owoyele O, Pal P (2021) A novel machine learning-based optimization algorithm (ActivO) for
accelerating simulation-driven engine design. Appl Energy 285:116455
64. Nabipour M, Nayyeri P, Jabani H, Mosavi A, Salwana E, Shahab S (2020) Deep learning for stock
market prediction. Entropy 22(8):840
65. Das SR, Mishra D, Rout M (2019) Stock market prediction using Firefly algorithm with evolutionary framework optimized feature reduction for OSELM method. Expert Syst Appl X 4:100016
66. Kelotra A, Pandey P (2020) Stock market prediction using optimized deep-ConvLSTM model. Big
Data 8(1):5–24
67. Thakkar A, Chaudhari K (2020) A comprehensive survey on portfolio optimization, stock price and
trend prediction using particle swarm optimization. Springer, pp 1–32
68. Kumar K, Haider MT (2021) Enhanced prediction of intra-day stock market using metaheuristic
optimization on RNN-LSTM network. New Gener Comput 39(1):231–272
69. Abedi M, Gharehchopogh FS (2020) An improved opposition based learning firefly algorithm with
dragonfly algorithm for solving continuous optimization problems. Intell Data Anal 24(2):309–338
70. Rahnema N, Gharehchopogh FS (2020) An improved artificial bee colony algorithm based on whale
optimization algorithm for data clustering. Multimed Tools Appl 79(44):32169–32194
71. Mohammadzadeh H, Gharehchopogh FS (2021) Feature selection with binary symbiotic organisms
search algorithm for email spam detection. Int J Inf Technol Decis Mak 20(1):469–515
72. Gharehchopogh FS, Shayanfar H, Gholizadeh H (2020) A comprehensive survey on symbiotic organisms
search algorithms. Artif Intell Rev 53:2265–2312
73. Mohmmadzadeh H, Gharehchopogh FS (2021) An efficient binary chaotic symbiotic organisms search
algorithm approaches for feature selection problems. J Supercomput
74. Hosseinalipour A, Gharehchopogh FS, Masdari M, Khademi A (2021) A novel binary farmland fertility
algorithm for feature selection in analysis of the text psychology. Appl Intell 1–36
75. Darwish A (2018) Bio-inspired computing: algorithms review, deep analysis, and the scope of
applications. Future Comput Inform J 3(2):231–246
76. Murphy JJ (1999) Technical analysis of the financial markets: a comprehensive guide to trading
methods and applications. Penguin
77. Wilder JW (1978) New concepts in technical trading systems. Trend Research
78. Anderson B, Li S (2015) An investigation of the relative strength index. Banks Bank Syst 10(1):92–96
79. Wafi AS, Hassan H, Mabrouk A (2015) Fundamental analysis models in financial markets—review
study. Procedia Econ Finance 30(15):939–947
80. Civicioglu P (2013) Backtracking search optimization algorithm for numerical optimization prob-
lems. Appl Math Comput 219(15):8121–8144
81. Suganthan P, Ali M, Wu G, Mallipeddi R (2018) Special session & competitions on real-parameter
single objective optimization. In: CEC2018, Rio de Janeiro, Brazil

82. Haupt RL, Haupt SE (2004) Practical genetic algorithms. Wiley
83. Thangaraj R, Pant M, Abraham A, Bouvry P (2011) Particle swarm optimization: hybridization per-
spectives and experimental illustrations. Appl Math Comput 217(12):5208–5226
84. Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S (2017) Salp swarm algorithm: a bio-inspired optimizer
for engineering design problems. Adv Eng Softw 114:1–29
85. Derrac J, García S, Molina D, Herrera F (2011) A practical tutorial on the use of nonparametric
statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms.
Swarm Evol Comput 1(1):3–18

Publisher’s Note  Springer Nature remains neutral with regard to jurisdictional claims in published
maps and institutional affiliations.
