Stock Price Forecasting Using Convolutional Neural Networks

Abstract—Forecasting the correct stock price is intriguing and difficult for investors due to its irregular, inherently dynamic, and tricky nature. Convolutional neural networks (CNN) have shown impressive performance in forecasting stock prices. One of the most crucial tasks when training a CNN on a stock dataset is identifying the optimal hyperparameters that increase accuracy. In this research, we propose the use of the Firefly algorithm to optimize CNN hyperparameters. The hyperparameters for the CNN were tuned with the Random Search (RS), Particle Swarm Optimization (PSO), and Firefly (FF) algorithms over different numbers of epochs, and the CNN was trained on the selected hyperparameters. Different evaluation metrics were calculated for the training and testing datasets. The experimental findings demonstrate that the FF method finds the ideal parameters with a minimal number of fireflies and epochs. The objective of the optimization techniques is to reduce the MSE. The PSO method delivers good results as the particle count increases, while the FF method gives good results with fewer fireflies. In comparison with PSO, the MSE of the FF approach converges as the number of epochs increases.

Keywords—Convolutional neural networks; swarm intelligence; random search; particle swarm optimization; firefly

I. INTRODUCTION

The non-linear characteristics of stock market data make it challenging to guess the next movement of a stock's value. Accurate stock forecasting can boost buyer and seller confidence in the stock market, which attracts investors to buy shares and grows the nation's economy [1]. The accuracy of neural networks and their variants in predicting stock prices is rising day by day [2, 3]. Several studies using time-series data have demonstrated that CNNs are useful for forecasting problems [4]. A CNN can accurately and efficiently identify changing trends in stock values, and it may be used in other financial transactions. Choosing the best CNN parameters is one of the difficulties encountered while constructing CNN architectures. The result may vary if we apply different parameter values to the same CNN architecture while solving the same problem. An optimization process is defined as determining the ideal combination of inputs to reduce or enhance the cost of an objective function without degrading training performance. Optimization is a computational problem whose aim is to extract the best solution from among all possible solutions. The hyperparameters that affect a CNN's architecture include filters, kernel size, stride, padding, pool size, batch size, epochs, and others. We would like to identify, in a reasonable amount of time, the ideal set of hyperparameter values for a particular dataset. To address this issue, several researchers have proposed various techniques based on evolutionary computation to automatically detect the best CNN structures and improve performance [5]. It can be difficult to determine the best parameters in a high-dimensional space.

The procedure used to set hyperparameter values is typically a random search, which involves running a number of trials or making manual adjustments. Random search is the process of selecting and evaluating inputs randomly for the objective function [6]. Swarm intelligence (SI) is inspired by social behavior found in nature, such as the movement of fish and birds. Based on a group's intelligence and behavior, SI algorithms have a significant ability to determine the optimal solution [7]. The PSO algorithm is considered one of the meta-heuristic evolutionary algorithms. Each particle in PSO has its own position, velocity, and fitness, and it also keeps track of its best fitness value and best fitness position. PSO additionally maintains a record of the global best fitness position and the global best fitness value [8]. The FF algorithm is a metaheuristic algorithm based on the attraction of fireflies towards brighter fireflies. The FF algorithm is able to identify optimal parameters for a CNN that minimize the error or fitness function with fewer iterations. Each firefly in the FF algorithm has a position in the search space that corresponds to a solution, and it progresses toward more brilliant solutions. In each iteration, the FF algorithm keeps track of the best position, which reduces the objective function cost [9].

This study evaluates optimization techniques such as RS, PSO, and FF on a CNN to forecast stock prices. The Tata Motors stock dataset was taken from Yahoo Finance for the period between January 1, 2003, and September 30, 2022. The CNN is trained on the hyperparameters returned by the RS, PSO, and FF algorithms, and the trained model is used to forecast the stock price for the next day as well as the stock prices for the entire month of September. The results show that the FF method returns the best hyperparameters with fewer fireflies, reduces the MSE, and requires the fewest training epochs. The remainder of the paper is structured as follows: In Section II, the stock price forecasting literature is discussed; the dataset used, the CNN architecture, and the different optimization techniques are explained in Section III. The evaluation metrics used and the accuracy of the CNN trained on the hyperparameters returned by the different optimization techniques are discussed in Section IV. We conclude the work in Section V with a summary and recommendations for further research.
II. LITERATURE SURVEY

Traditional stock value analysis is based on economics and finance, and it primarily focuses on external factors influencing stock prices, such as international relations, exchange rates, interest rates, business policy, financial institutions, political factors, etc. It is challenging to convince people of the accuracy of the traditional fundamental analysis method. The technical analysis approach primarily concentrates on the movement of the stock price, the psychological expectations of investors, trading volume, historical data, etc. [10]. In comparison to other deep learning models, CNN is more accurate and can detect both uptrend and downtrend stock movements [11]. Setting hyperparameters is necessary for CNN implementation, as they influence accuracy, learning time, and the CNN architecture. The effectiveness of machine learning would be significantly increased if an effective hyperparameter optimization algorithm could be designed for any specific machine learning method. Bayesian optimization, based on Bayes' theorem, can be applied to the hyperparameter tuning problem, as it can be treated as an optimization issue [12]. Manual search and RS are the most widely used strategies for hyperparameter optimization. Random search is efficient for hyperparameter optimization: random combinations of hyperparameters are chosen and used to train a model, and the combinations with the best costs are selected [13]. Finding the ideal collection of hyperparameter values in a reasonable amount of time is difficult because the values of the hyperparameters change when the dataset changes. The weighted random search approach, which locates the global optimum more quickly than RS, combines RS with a probabilistic greedy heuristic method [14]. Reinforcement learning (RL)-based optimization algorithms give good performance and highly competitive results for CNN hyperparameter tuning with fewer iterations [15].
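To make the random search strategy concrete, the following is a minimal sketch of the idea: sample hyperparameter combinations uniformly at random, score each with an objective function, and keep the cheapest. The search space, trial count, and `objective` callback are illustrative assumptions, not the exact configuration used in this study.

```python
import random

# Illustrative search space mirroring the bounds given later in Table II.
SEARCH_SPACE = {
    "kernel_size": [1, 2, 3, 4, 5],
    "pool_size": [1, 2, 3, 4, 5],
    "batch_size": [32, 64, 96, 128, 160, 192, 224, 256],
}

def random_search(objective, n_trials=20, seed=42):
    """Sample random hyperparameter combinations and keep the cheapest one."""
    rng = random.Random(seed)
    best_params, best_cost = None, float("inf")
    for _ in range(n_trials):
        params = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        cost = objective(params)  # e.g. validation MSE of a CNN trained with `params`
        if cost < best_cost:
            best_params, best_cost = params, cost
    return best_params, best_cost
```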
The hyperparameters include the number of layers, the size of the kernel in each layer, the loss function, the optimizer, and many others. Hyperparameters may be integers or real numbers, and there is an effectively infinite range of possible values for each, so CNN hyperparameter tuning is a challenging problem. Swarm intelligence algorithms have been used for decades to solve challenging optimization problems and give promising results [16]. The fish swarm optimization method utilizes the social behavior of fish to carry out a variety of tasks, with each fish serving as a potential solution to the optimization problem. The aquarium is the design space where the fish are found, and the food density is related to the objective function to be optimized [17]. The bee algorithm emulates honey bee foraging activities to find the optimal solution to an optimization problem: a food source or flower is considered a candidate solution, and a population of n bees searches the solution space [18]. Two important parameters of a deep learning network are the number of hidden layers and the number of neurons in each layer. The PSO method has great potential for optimizing parameters and saving precious computational resources when deep learning models are being tuned. Each particle in a PSO has three properties: a position that corresponds to a solution, a velocity that is a moving parameter, and a fitness value that is calculated by an objective function that takes the particle's position into consideration. The position of every particle in the swarm is updated such that it migrates toward the one with the best position [19]. One of the most effective metaheuristic algorithms for solving optimization problems is the FF algorithm, which is based on the flashing characteristic of fireflies. Fireflies are unisex and are attracted to brighter lights, and as distance increases, their attractiveness and brightness decrease [20]. Metaheuristic algorithms, particularly evolutionary and SI algorithms, are strong methods for addressing a wide range of challenging engineering issues that arise in the real world. The echolocation abilities of microbats served as the inspiration for the bat algorithm, in which each bat has a position and flies randomly with some velocity, a fixed frequency, varying wavelength, and loudness in order to search for prey [21]. While training a CNN on a stock dataset, it is necessary to select optimal parameters that improve training performance and reduce error. It is essential to choose the best optimization technique to boost the CNN's performance and reduce errors.

III. METHODOLOGY

A. Data Description

Historical stock information for Tata Motors from January 1, 2003, until September 30, 2022, was retrieved from Yahoo Finance [22]. Table I shows the features and example values for the Tata Motors stock data. The dataset contains features such as date, open value, low value, close value, high value, volume, and adjusted close value.

It is typical to assess profit or loss using the closing price of a stock on a given date; hence, we considered the closing price as the target variable. The plot of the target variable against time is shown in Fig. 1.
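As an illustration of this step, the snippet below retrieves the same date range with the community `yfinance` package and plots the closing price. The ticker symbol `TATAMOTORS.NS` is an assumption (the paper only states that the data came from Yahoo Finance), and this is a sketch rather than the authors' actual retrieval code.

```python
import matplotlib.pyplot as plt
import yfinance as yf

# Assumed NSE ticker for Tata Motors; the paper only names Yahoo Finance as the source.
data = yf.download("TATAMOTORS.NS", start="2003-01-01", end="2022-09-30")

# The closing price is the target variable (see Fig. 1).
close = data["Close"]
close.plot(title="Tata Motors closing price")
plt.show()
```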
D. PSO Algorithm

The particle swarm optimization technique is inspired by the social behavior of bird flocks and was designed by Kennedy and Eberhart in 1995. PSO simulates swarm behavior in the search for food: each bird in the swarm constantly changes its search pattern based on its own and other members' learning experiences [24]. In data science, the objective function entails feeding a candidate solution into a model and evaluating it against the training dataset. The cost may be an error score, also known as the loss of the model, which is to be decreased, or an accuracy score, which is to be increased. The optimization process aims to decrease or increase the cost of the objective function accordingly. There may be multiple local maxima and minima for an objective function, but only one global maximum or minimum. Each particle in a PSO has a position, which corresponds to the attributes of a solution; a velocity, which is a moving parameter; and a fitness value, which is determined by an objective function taking into account the position of the particle [25]. The particle's quality is measured by its fitness value. Each swarm particle's position is updated such that it moves closer to the one with the best position. Each particle maintains pbest, the best solution it has independently found, and gbest, the best solution found by all particles, to update its position and velocity in each iteration [26]. The processing steps of the PSO algorithm are given below [27].

Step 1: Generate a random position p and velocity v in all dimensions using the following equations:

$x_i = l + r(u - l)$ (7)

$v_i = l + r(u - l)$ (8)

where $x_i$ represents the position of particle i, r is a random number, l represents the lower bound, and u represents the upper bound. For the velocity calculation, the lower bound is 0 and the upper bound is 1.

Step 2: Calculate the objective function value $f(x_i)$ for each particle, where i = 1..n. The initial position of a particle is the pbest for that particle, i.e., pbest = $x_i$.

Step 3: Find gbest, i.e., the best position among all particles.

Step 4:
For t = 1 to max_iterations
    For i = 1 to no_of_particles
        Update particle velocity and position
        Calculate objective function value and pbest
    End for i
    Find the gbest position having the least cost
End for t

Equation (9) describes how to calculate the particle velocity, and Equation (10) describes how to calculate the particle position.

$v_i^{t+1} = W v_i^t + C_1 r_1 (P_i^t - x_i^t) + C_2 r_2 (P_g - x_i^t)$ (9)

$x_i^{t+1} = x_i^t + v_i^{t+1}$ (10)

where $v_i^{t+1}$ and $x_i^{t+1}$ are the newly calculated velocity and position, t represents the iteration number, $v_i^t$ is the velocity of the ith particle at iteration t, and $x_i^t$ is the position of the particle. W is the inertia weight that regulates the motion of the particles. The particle may continue travelling in the same direction if W = 1, because the particle's motion is then entirely determined by the preceding motion. If 0 ≤ W < 1, this influence is diminished, causing the particle to travel to different areas within the search domain. $C_1$ and $C_2$ are the correlation factors; if $C_1 = C_2 = 0$, all particles continue to move at their current speed until they collide with the search space border. $r_1$ and $r_2$ are random numbers between 0 and 1. $P_i^t$ is the particle's best position at iteration t, and $P_g$ is the global best position. The updated position is used to evaluate the objective function value in each iteration. The particle best position (pbest) and the global best position (gbest) are updated based on the objective function's return value. The best parameters returned by PSO were used to train the CNN, and performance was calculated using the evaluation metrics. The flowchart for PSO-CNN is shown in Fig. 4.

Fig. 4. Flowchart of the PSO-CNN Algorithm.
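A minimal, self-contained sketch of this loop is given below, using the control values listed later in Table III (W = 0.5, C1 = C2 = 0.5, 10 iterations). The `objective` callback is assumed to train a CNN on the candidate hyperparameters and return the validation MSE; clipping positions back into the bounds is also an assumption, since the paper does not describe its boundary handling.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(objective, lower, upper, n_particles=10, max_iter=10,
                 w=0.5, c1=0.5, c2=0.5):
    """Minimal PSO loop following Equations (7)-(10)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    # Eq. (7)/(8): random initial positions and velocities within the bounds.
    x = lower + rng.random((n_particles, dim)) * (upper - lower)
    v = rng.random((n_particles, dim))              # velocity bounds are [0, 1]
    pbest = x.copy()
    pbest_cost = np.array([objective(p) for p in x])
    g = pbest[pbest_cost.argmin()].copy()           # gbest
    for _ in range(max_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # Eq. (9)
        x = x + v                                               # Eq. (10)
        np.clip(x, lower, upper, out=x)             # keep candidates in bounds
        cost = np.array([objective(p) for p in x])
        better = cost < pbest_cost
        pbest[better], pbest_cost[better] = x[better], cost[better]
        g = pbest[pbest_cost.argmin()].copy()
    return g, pbest_cost.min()
```

For hyperparameter tuning, `lower` and `upper` would be the Table II bounds, with sampled positions rounded to valid discrete values before the CNN is trained.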
E. Firefly Algorithm

An optimization problem is one where an objective function is maximized or minimized by selecting appropriate values from a set of possible values for the variables. This research considers a minimization problem in which the objective function calculates and returns the difference between the actual and forecasted stock prices. The Firefly Algorithm, developed by Xin-She Yang, is a stochastic, metaheuristic optimization algorithm inspired by nature. It was designed to mimic the social behavior of fireflies based on their flashing and attraction properties [28].

The guidelines for the FF optimization algorithm are as follows:

- An objective function is used to determine the brilliance of a firefly.
- Any firefly is attracted towards more brilliant fireflies, as all fireflies are unisex, and a firefly will move randomly if there is no more brilliant firefly.
- Attractiveness is directly proportional to brilliance, and as distance grows, both attractiveness and brilliance decrease [29].

In an optimization problem, the objective function accepts input variables and returns a cost based on them. By finding the optimal parameters, an optimization algorithm aims to reduce or enhance the cost returned by the objective function [30]. The processing steps of the firefly algorithm are as follows.

Step 1: Generate a random position X in all dimensions using Equation (11):

$X_i = l + r(u - l)$ (11)

where $X_i$ is the position of firefly i, r is a random number, l is the lower bound, and u is the upper bound.

Step 2: Calculate the cost, determined by the objective function $f(X_i)$, for each firefly, where i = 1..n.

Step 3:
For t = 1 to max_iterations
    For i = 1 to n
        For j = 1 to n
            If f(Xi) > f(Xj)
                Move firefly i towards j in all d dimensions
                Evaluate the new position and update the cost
            Else
                Move firefly i randomly
            End if
        End for j
    End for i
    Find the best position having the least cost
End for t

The following equation describes how a firefly i migrates when it encounters a firefly j that is more illuminated [31]:

$X_i^{t+1} = X_i^t + \beta_0 e^{-\gamma r_{ij}^2}(X_j^t - X_i^t) + \alpha_t \epsilon_i$ (12)

where t is the iteration count, $X_i^{t+1}$ is the new position of the ith firefly, and $X_i^t$ is its current position. The terms $\beta_0$ and $\alpha_t$ are the attraction and randomization parameters, respectively. The recommended values used in most implementations are $\beta_0 = 1$ and $\alpha \in [0, 1]$. The term $\epsilon_i$ is a random number drawn from a Gaussian or uniform distribution. Theoretically, the light absorption coefficient ($\gamma$) varies from 0 to ∞, but in most applications it ranges from 0.01 to 100. The distance $r_{ij}$ between fireflies i and j can be calculated using the Euclidean distance formula [32]:

$r_{ij} = \sqrt{\sum_{k=1}^{d} (X_{i,k} - X_{j,k})^2}$ (13)

The flowchart for FF-CNN is shown in Fig. 5. Initially, the FF algorithm initializes a population, assigns random positions, and calculates the objective function cost. The initial position of each firefly is its best position. The cost of each firefly is compared with that of every other firefly in each iteration, and if the cost of firefly i is greater than the cost of firefly j, then firefly i moves towards firefly j. After n iterations, the best positions were used to train the CNN, and accuracy on the training and testing datasets was calculated.

Fig. 5. Flowchart of the FF-CNN Algorithm.
IV. RESULTS

The dataset contains 4902 records split into two sets: the training set has 4882 records covering the period from 01/01/2003 to 31/08/2022, and the testing set has 22 records covering the period from 01/09/2022 to 30/09/2022. The CNN models are evaluated on their ability to predict the stock price for the following day and for the entire month of September. The stock price for the following day is computed based on the past 1000 samples, and the stock price for the entire month of September is predicted using the entire training set.
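A sketch of this date-based split is given below, assuming the data sit in the pandas series downloaded earlier and are indexed by date; the variable names are illustrative.

```python
# Date-based split: everything through August 2022 for training,
# September 2022 (22 trading days) for testing.
train = close.loc["2003-01-01":"2022-08-31"]
test = close.loc["2022-09-01":"2022-09-30"]

# Next-day forecasts use a sliding window over the most recent 1000 samples.
window = train.iloc[-1000:]
```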
In a stock forecasting problem, which is a regression problem, the assessment metrics should show the difference between the actual and predicted values. The performance of the model is evaluated with the MSE, MAE, RMSE, and MAPE [33]. The formulas for the evaluation metrics are as follows:

$\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N}(Y_i - \hat{Y}_i)^2$ (14)

$\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|Y_i - \hat{Y}_i\right|$ (15)

$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(Y_i - \hat{Y}_i)^2}$ (16)

$\mathrm{MAPE} = \frac{100}{N}\sum_{i=1}^{N}\left|\frac{Y_i - \hat{Y}_i}{Y_i}\right|$ (17)

where N is the total number of samples in the stock dataset, $Y_i$ is the actual stock price, and $\hat{Y}_i$ is the stock price predicted by the model. The evaluation metrics show how close the forecasted values are to the actual values and the variation between them.
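A direct NumPy implementation of Equations (14) through (17) is sketched below; it assumes `y_true` and `y_pred` are arrays of actual and predicted closing prices.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """MSE, MAE, RMSE, and MAPE as in Equations (14)-(17)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    mse = np.mean(err ** 2)                        # Eq. (14)
    mae = np.mean(np.abs(err))                     # Eq. (15)
    rmse = np.sqrt(mse)                            # Eq. (16)
    mape = 100.0 * np.mean(np.abs(err / y_true))   # Eq. (17)
    return {"MSE": mse, "MAE": mae, "RMSE": rmse, "MAPE": mape}
```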
The goal of an optimization problem is to identify the optimal solution from a collection of possible options. The parameters for the CNN are the convolution layer filters, the kernel size, the pool size in the max pooling operation, the batch size, etc. If we apply different parameter values to the CNN architecture while solving the same problem, the result may vary. Selecting the right hyperparameters gives better accuracy with a minimal number of epochs. The hyperparameter values returned by the RS, PSO, and FF algorithms were used to train the CNN model. Table II displays the upper and lower bounds for the hyperparameters; the step size is used to increase or decrease their values. The experimental results show that the FF method achieves better outcomes with fewer fireflies and epochs. If the number of fireflies or epochs is increased, the results converge in the early stages. The evaluation metric values of the CNN trained on the hyperparameters returned by the various optimization approaches are shown in Figs. 6 and 7.

TABLE II. UPPER AND LOWER BOUNDS FOR THE HYPERPARAMETERS

Hyperparameter | Lower bound | Upper bound | Step size
kernel size    | 1           | 5           | 1
pool size      | 1           | 5           | 1
batch size     | 32          | 256         | 32

TABLE III. SUMMARY OF THE CONTROL PARAMETERS

PSO-CNN                                  | FF-CNN
Inertia weight (W): 0.5                  | Attraction parameter (β0): 0.97
Correlation factors (C1, C2): 0.5, 0.5   | Randomization parameter (αt): 1
Random numbers (r1, r2): [0, 1]          | Absorption coefficient (γ): 0.01
Maximum iteration number: 10             | Maximum iteration number: 10

TABLE IV. OPTIMIZATION METHODS AND EVALUATION METRICS FOR TRAINING DATASET

[Table values not recoverable from the extraction; columns: Metrics, CNN, RS-CNN, PSO-CNN (Particle = 5), PSO-CNN (Particle = 10), FF-CNN (Firefly = 5), FF-CNN (Firefly = 10).]
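To show how the tuned hyperparameters feed into training, the sketch below wires the Table II parameters into a small one-dimensional Keras CNN and returns the training MSE as the objective cost. The architecture is an assumption for illustration (the paper's exact CNN from Section III is not reproduced), as are the filter and dense-layer sizes and the windowed `X`/`y` arrays.

```python
import numpy as np
from tensorflow import keras

def cnn_objective(params, X, y, epochs=10):
    """Train a small 1-D CNN with the candidate hyperparameters and
    return the final training MSE as the cost to be minimized."""
    kernel_size, pool_size, batch_size = (int(round(p)) for p in params)
    model = keras.Sequential([
        keras.layers.Input(shape=X.shape[1:]),      # e.g. (window, 1)
        keras.layers.Conv1D(64, kernel_size, activation="relu", padding="same"),
        keras.layers.MaxPooling1D(pool_size),
        keras.layers.Flatten(),
        keras.layers.Dense(50, activation="relu"),
        keras.layers.Dense(1),                      # next-day closing price
    ])
    model.compile(optimizer="adam", loss="mse")
    history = model.fit(X, y, epochs=epochs, batch_size=batch_size, verbose=0)
    return history.history["loss"][-1]

# Bounds from Table II: kernel size, pool size, batch size.
lower, upper = np.array([1, 1, 32]), np.array([5, 5, 256])
# e.g. best, cost = firefly_minimize(lambda p: cnn_objective(p, X, y), lower, upper)
```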
[Table values not recoverable from the extraction; columns: Epoch, Evaluation Metrics, CNN, RS-CNN, PSO-CNN (Particle = 5), PSO-CNN (Particle = 10), FF-CNN (Firefly = 5), FF-CNN (Firefly = 10).]
REFERENCES

[3] W. Budiharto, "Data science approach to stock prices forecasting in Indonesia during Covid-19 using Long Short-Term Memory (LSTM)," Journal of Big Data, 8(47), 2021, doi: 10.1186/s40537-021-00430-0.
[4] R. Chauhan, H. Kaur, and B. Alankar, "Air Quality Forecast using Convolutional Neural Network for Sustainable Development in Urban Environments," Sustainable Cities and Society, 75, 2021, doi: 10.1016/j.scs.2021.103239.
[5] G. Habib and S. Qureshi, "Optimization and acceleration of convolutional neural networks: A survey," Journal of King Saud University – Computer and Information Sciences, 34(7), pp. 4244-4268, 2022, doi: 10.1016/j.jksuci.2020.10.004.
[6] D. G. Misganu and K. G. Arnt, "Mapping Seasonal Agricultural Land Use Types Using Deep Learning on Sentinel-2 Image Time Series," Remote Sensing, 13(2), pp. 289-305, 2021, doi: 10.3390/rs13020289.
[7] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering, 8(1), pp. 22-34, 2020, doi: 10.1080/21642583.2019.1708830.
[8] K. Shahana, S. Ghosh, and C. Jeganathan, "A Survey of Particle Swarm Optimization and Random Forest based Land Cover Classification," International Conference on Computing, Communication and Automation (ICCCA 2016), pp. 241-245, 2016, doi: 10.1109/CCAA.2016.7813756.
[9] S. Arora and S. Singh, "The Firefly Optimization Algorithm: Convergence Analysis and Parameter Selection," International Journal of Computer Applications, 69(3), pp. 48-52, 2013.
[10] L. Wenjie, L. Jiazheng, L. Yifan, S. Aijun, and W. Jingyang, "A CNN-LSTM-Based Model to Forecast Stock Prices," Complexity, vol. 2020, 2020, doi: 10.1155/2020/6622927.
[11] S. Chen and H. He, "Stock Prediction Using Convolutional Neural Network," IOP Conference Series: Materials Science and Engineering, 435, 2018, doi: 10.1088/1757-899X/435/1/012026.
[12] J. Wu, X. Chen, H. Zhang, L. Xiong, H. Lei, and S. Deng, "Hyperparameter Optimization for Machine Learning Models Based on Bayesian Optimization," Journal of Electronic Science and Technology, 17(1), pp. 26-40, 2019, doi: 10.11989/JEST.1674-862X.80904120.
[13] J. Bergstra and Y. Bengio, "Random Search for Hyper-Parameter Optimization," Journal of Machine Learning Research, 13, pp. 281-305, 2012.
[14] R. Andonie and A. C. Florea, "Weighted Random Search for CNN Hyperparameter Optimization," International Journal of Computers Communications & Control, 15(2), 2020, doi: 10.15837/ijccc.2020.2.3868.
[15] F. M. Talaat and S. A. Gamel, "RL based hyper-parameters optimization algorithm (ROA) for convolutional neural network," Journal of Ambient Intelligence and Humanized Computing, 2022, doi: 10.1007/s12652-022-03788-y.
[16] Y. Cai and A. Sharma, "Swarm Intelligence Optimization: An Exploration and Application of Machine Learning Technology," Journal of Intelligent Systems, 30(1), pp. 460-469, 2021, doi: 10.1515/jisys-2020-0084.
[17] F. Lobato and J. V. Steffen, "Fish Swarm Optimization Algorithm Applied to Engineering System Design," Latin American Journal of Solids and Structures, 11, pp. 143-156, 2014, doi: 10.1590/S1679-78252014000100009.
[18] M. Ahmad, A. A. Ikram, R. Lela, I. Wahid, and R. Ulla, "Honey bee algorithm–based efficient cluster formation and optimization scheme in mobile ad hoc networks," International Journal of Distributed Sensor Networks, 13(6), 2017, doi: 10.1177/1550147717716815.
[19] Z. Fouad, M. Alfonse, M. Roushdy, and A. M. Salem, "Hyper-parameter optimization of convolutional neural network based on particle swarm optimization algorithm," Bulletin of Electrical Engineering and Informatics, 10(6), pp. 3377-3384, 2021, doi: 10.11591/eei.v10i6.3257.
[20] S. Dhawan, "Optimization Performance Analysis of Firefly Algorithm using Standard Benchmark Functions," International Journal of Engineering Research & Technology, 11(2), pp. 385-392, 2022, doi: 10.17577/IJERTV11IS020165.
[21] X. Yang and A. H. Gandomi, "Bat Algorithm: A Novel Approach for Global Engineering Optimization," Engineering Computations, 29(5), pp. 464-483, 2012.
[22] Yahoo Finance - Business Finance Stock Market News, <https://in.finance.yahoo.com/> [accessed October 1, 2022].
[23] E. Elgeldawi, A. Sayed, A. R. Galal, and A. M. Zaki, "Hyperparameter Tuning for Machine Learning Algorithms Used for Arabic Sentiment Analysis," Informatics, 8(4), pp. 79-99, 2021, doi: 10.3390/informatics8040079.
[24] B. Qolomany, M. Maabreh, A. Al-Fuqaha, A. Gupta, and D. Benhaddou, "Parameters optimization of deep learning models using Particle swarm optimization," 2017 13th International Wireless Communications and Mobile Computing Conference (IWCMC), pp. 1285-1290, 2017, doi: 10.1109/IWCMC.2017.7986470.
[25] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization algorithm: an overview," Soft Computing, 22, pp. 387-408, 2018, doi: 10.1007/s00500-016-2474-6.
[26] A. G. Gad, "Particle Swarm Optimization Algorithm and Its Applications: A Systematic Review," Archives of Computational Methods in Engineering, 29, pp. 2531-2561, 2022, doi: 10.1007/s11831-021-09694-4.
[27] J. Kennedy and R. Eberhart, "Particle swarm optimization," Proceedings of ICNN'95 - International Conference on Neural Networks, 4, pp. 1942-1948, 1995, doi: 10.1109/ICNN.1995.488968.
[28] A. Abdulhadi, "Application of the Firefly Algorithm for Optimal Production and Demand Forecasting at Selected Industrial Plant," Open Journal of Business and Management, 8, pp. 2451-2459, 2020, doi: 10.4236/ojbm.2020.86151.
[29] A. Sharma, A. Zaidi, R. Singh, S. Jain, and A. Sahoo, "Optimization of SVM classifier using Firefly algorithm," 2013 IEEE Second International Conference on Image Information Processing (ICIIP-2013), pp. 198-202, 2013, doi: 10.1109/ICIIP.2013.6707582.
[30] M. Farrell, K. N. Ramadhani, and S. Suyanto, "Combined Firefly Algorithm-Random Forest to Classify Autistic Spectrum Disorders," 2020 3rd International Seminar on Research of Information Technology and Intelligent Systems (ISRITI), pp. 505-508, 2020, doi: 10.1109/ISRITI51436.2020.9315396.
[31] X.-S. Yang, "Firefly Algorithms for Multimodal Optimization," in O. Watanabe and T. Zeugmann (Eds.), SAGA 2009, Lecture Notes in Computer Science, 5792, pp. 169-178, 2009, doi: 10.1007/978-3-642-04944-6_14.
[32] L. Liberti, C. Lavor, N. Maculan, and A. Mucherino, "Euclidean Distance Geometry and Applications," SIAM Review, 56(1), pp. 3-69, 2014, doi: 10.1137/120875909.
[33] A. Botchkarev, "Evaluating performance of regression machine learning models using multiple error metrics in Azure Machine Learning Studio," SSRN Electronic Journal, 2018, doi: 10.2139/ssrn.3177507.