
Volume 8, Issue 11, November 2023 International Journal of Innovative Science and Research Technology

ISSN No:-2456-2165

Innovative Approaches to Enhance Data Science Optimization

Mohamed Abdeldaiem Mahboub1
1 Department of Information Systems, Faculty of Information Technology, University of Tripoli, Libya

Pyla Srinivasa Rao2*
2 Senior Manager, Cyber Security, Capgemini, India

T. Gopi Krishna3
3 Department of Computer Science & Engineering, School of Electrical Engineering and Computing, Adama Science & Technology University, Ethiopia

Abstract:- In today's context, there is a growing need for the introduction of innovative techniques and algorithms within the realm of data science. Optimization strategies provide a pathway for the development of data science models. Our main focus is on examining and enhancing state-of-the-art techniques and methodologies applied in data science to effectively tackle various challenges. These alternatives include rule-based systems and various preprocessing methods for data science that are independent of derivatives. We assert that the most effective approach to achieving our goals involves the application of machine learning. Utilizing optimization methods and algorithms enables the identification of improved solutions for challenges in machine learning optimization, with the potential to significantly enhance the learning capabilities and knowledge application of machines.

Keywords:- Data science, optimization, rule-based systems.

I. INTRODUCTION

Optimization methods, integrated into various algorithms, play a crucial role in numerous scientific and technological domains, particularly in data science. The rapid and efficient preprocessing of large datasets is essential in this field. This study begins with an exploration of traditional optimization methods, aiming to unveil new extensions or analyses deemed valuable in recent research. The primary objective is to enhance data science optimization by analysing theories and identifying the most effective methods for solving diverse problems within this domain [7, 8]. Leveraging mathematical concepts and operations, we have opted to use symbols from formal language theory and automata theory as our chosen approach. Formal language theory, an interdisciplinary field merging linguistics, mathematical logic, and computer science, is instrumental in designing programming languages through finite state machines [11]. Our research focuses on improving data science optimization through the application of innovative methods rooted in these mathematical concepts. Furthermore, we delve into soft set theory, exploring its theoretical foundations and practical applications, while introducing novel ideas for its utilization in data science optimization.

II. MOTIVATION

In essence, optimization in data science is crucial for refining models, enhancing accuracy, reducing redundancy, and making the most of available resources, ultimately leading to better decision-making and more valuable insights. Optimization is fundamental in data science for several reasons:

- Enhancing Model Performance: Data science involves building models to make predictions, classifications, or recommendations. Optimization techniques help improve these models, aiming to enhance their performance, accuracy, and efficiency.
- Efficiency Improvement: Optimization helps make processes more efficient. For instance, optimizing algorithms and computations reduces the time and resources required for analysis, allowing for quicker insights and decision-making.
- Resource Utilization: It aids in the effective utilization of available resources. Whether minimizing computational power, memory, or storage, optimization ensures that resources are used optimally, reducing costs and improving scalability.
- Feature Selection and Engineering: Optimization techniques assist in selecting the most relevant features for models. This process helps reduce overfitting and enhances model interpretability by focusing on the most impactful variables.
- Hyperparameter Tuning: Optimization is essential for tuning the hyperparameters of machine learning models. Finding the best combination of hyperparameters ensures that models are well tailored to the specific dataset, leading to better performance.
- Decision-Making: Optimization aids in making data-driven decisions. By optimizing business processes based on data insights, organizations can make more informed and effective decisions.
- Prediction and Forecasting: Optimization plays a crucial role in predictive analytics and forecasting. By optimizing models, the accuracy and reliability of predictions are enhanced, which is crucial for businesses in planning and strategizing.

IJISRT23NOV1204 www.ijisrt.com 964


- Risk Management: In various fields like finance and healthcare, optimizing models helps in risk management. By analyzing and optimizing risk factors, organizations can mitigate potential risks and make better decisions.
- Pattern Recognition: Optimization allows for the identification of underlying patterns in data, helping to recognize trends and anomalies that might not be evident without a thorough analysis phase. This capability scales up optimization toward a targeted high degree of performance in data science systems, and we have taken it in advance as the motivating rule for our optimization methods [6].

Our research aims to uncover the optimal outcomes from our recently proposed methods. We conducted a comprehensive examination of soft set theory, exploring both its theoretical underpinnings and practical applications [2, 3]. Additionally, we introduced innovative concepts for applying soft set theory [5]. This exploration has resulted in straightforward and efficient representations of potent tools, establishing a state-of-the-art foundation for decision-making in data science, data mining, and deriving conclusions from data.

Our findings suggest that incorporating the total function within the soft set transformation can yield optimal results in preprocessing methods, as illustrated in Table 1. This strategic integration enhances the effectiveness of the preprocessing methods employed in our research.

III. METHODOLOGY

In our exploration of preprocessing methods utilized in machine learning, we have adopted the optimized methods detailed in Table 1 as our designated methodology [14]. Our investigation spans a comprehensive examination of commonly used preprocessing methods and techniques in data science, including their optimized variations. To ensure a thorough comprehension of these selected methods from both mathematical and computational standpoints, we have systematically structured the entire data science landscape into coherent tables. These tables offer detailed insights into each studied method, presenting attribute names and values. Table 1 acts as a visual guide for the organization of methods in the ensuing stages of our research [6, 7].

Table 1: Optimized Preprocessing Methods in Data Science

| S. No | Procedure Title | Description of the Approach | Procedure Variables |
|---|---|---|---|
| 1 | Data Purification | The initial phase in various data processing methods involves data cleaning, a process that includes the elimination of missing values, outliers, and redundant data. This essential step is pivotal for ensuring data accuracy and emphasizes the importance of preprocessing in optimizing data science workflows. | a |
| 2 | Feature Standardization | Scaling and normalization are crucial techniques for standardizing features to a consistent scale. This procedural step guarantees that all features maintain equal significance in the model. | b |
| 3 | Variable Subset Determination | Feature selection entails choosing the most essential features for the model, aiding in reducing data size and enhancing the model's overall performance. | c |
| 4 | Feature Crafting | In feature engineering, new features are created from existing ones, contributing to the improvement of the model by providing additional information about the metadata. | d |
| 5 | Data Expansion | Augmenting data involves expanding the dataset by generating new data from existing sources, a procedural step aimed at enhancing model accuracy by offering additional learning material. | e |
| 6 | Concurrent Computing | Incorporating parallel processing accelerates the preprocessing phase through the simultaneous execution of multiple processes, decreasing the time needed to preprocess extensive datasets. By implementing these techniques, we can optimize the preprocessing stage, enhancing both the accuracy and efficiency of our model. | f |
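The six procedures in Table 1 can be pictured as a small pipeline. The sketch below is illustrative only, not the authors' implementation: the toy DataFrame, the column names, the outlier cut-off, and the jitter-based augmentation are all assumptions made for the example.

```python
import numpy as np
import pandas as pd

# Toy dataset with hypothetical columns, used to walk through Table 1's procedures.
df = pd.DataFrame({
    "length": [4.0, 5.0, np.nan, 5.0, 120.0, 6.0],   # one missing value, one outlier
    "width":  [1.0, 2.0, 2.0, 2.0, 3.0, 2.5],
})

# 1. Data Purification: drop missing values, duplicates, and an obvious outlier.
df = df.dropna().drop_duplicates()
df = df[df["length"] < 100]            # illustrative outlier cut-off

raw = df.copy()                        # keep raw values for later steps

# 2. Feature Standardization: rescale every feature to zero mean, unit variance.
df = (df - df.mean()) / df.std()

# 3. Variable Subset Determination: keep the highest-variance raw features
#    (a simple stand-in for a real feature selector).
df = df[raw.var().nlargest(2).index]

# 4. Feature Crafting: derive a new feature from the raw columns.
df["ratio"] = raw["length"] / raw["width"]

# 5. Data Expansion: augment with slightly jittered copies of each row.
rng = np.random.default_rng(0)
augmented = pd.concat([df, df + rng.normal(0, 0.01, df.shape)], ignore_index=True)

# 6. Concurrent Computing: in practice, steps 1-5 would run in parallel over
#    chunks of a large dataset (e.g. with multiprocessing or Dask).

print(augmented.shape)   # (6, 3): rows doubled by augmentation, three features
```

Each numbered step corresponds to one row of Table 1; a production pipeline would replace the toy heuristics with task-specific rules.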



IV. PRIOR RESEARCH

Several studies have investigated optimization algorithms and techniques in recent years, with a focus on models and frameworks aimed at improving the performance of various computer systems. Our examination of soft set theory has highlighted its versatility across diverse domains, particularly its effectiveness in information systems. Molodtsov [3] explored various applications of soft set theory, encompassing the study of function smoothness, game theory, operations research, and the theory of measurement. Maji [4] showcased the effectiveness of the neutrosophic soft set in solving decision-making problems. Andreas and colleagues delved into the relationship between vector optimization and financial risk measures. Zhong and X. Wang [5] introduced an innovative approach to parameter reduction using soft set theory. Nasef and collaborators [6] formulated a decision-making solution for real estate marketing strategy.

Endert and collaborators condensed noteworthy research findings, while Kaiwen L. et al [7] executed a comparative study on approaches for solving multi-objective problems. Radwa et al [8] conducted a comprehensive analysis of recent advancements in automated machine learning. Ebubeogu et al [9] scrutinized prior research to pinpoint essential issues in data quality and compiled a list of effective methods for data preprocessing. Amir Ahmad and Shehroz S. Khan [10, 11] proposed a methodology for investigating mixed data clustering algorithms by identifying crucial research topics.

Seba Susan and fellow researchers [12] offered insights into both traditional and modern techniques for intelligently representing samples from both majority and minority classes. Dharma and colleagues [13] introduced a spectrum of optimization algorithms, while Abdurakhmon Sadiev et al [14] introduced federated learning as a framework for distributed learning and optimization. Syed Muzamil Basha et al [15] conducted a study evaluating the performance of optimization algorithms through various learning strategies, considering factors such as time and space requirements, as well as solution accuracy. Ishaani Priyadarshini et al [16] explored various machine learning methods, including random forest, decision trees, k-nearest neighbors, convolutional neural networks, long short-term memory, and gated recurrent units, for the recognition of human activities.

Shubhkirti Sharma and collaborators [17] introduced strategies to improve outcomes in various contexts, shedding light on their advantages and disadvantages. Amit Sagu et al [18] formulated two innovative methods to enhance the performance of deep learning models for detecting and preventing cyber-attacks. Xiangning Chen et al [19] proposed a method for discovering new algorithms through program searches, with a particular emphasis on improving algorithms for training deep neural networks. Yandong Shi et al [20] explored techniques for "learning to optimize" in 6G wireless networks, utilizing machine learning frameworks to identify characteristics of optimization problems in diverse domains. B. Lavanya et al [21] delved into automatic genre classification, emphasizing its role in improving web searches and information retrieval, while also examining trends and stages in the field.

A. Math Preliminaries

The foundational principles of set theory play a pivotal role in algebra, with a significant concept known as the total function holding particular importance for our proposed optimization model [1]. Within the realm of set theory, the total function, a mathematical function, becomes instrumental in enhancing data science adaptation. By employing novel methods, it contributes to the overall optimization of the system, particularly in the selection of datasets during the preprocessing phase.

In our proposed application of total function properties in set theory, a total function F from X to Y is defined as a binary relation f on X × Y satisfying two key properties:

- For each x ∈ X there exists a y ∈ Y such that [x, y] ∈ f. (1)
- If [x1, y1] and [x2, y2] are in f and x1 = x2, then y1 = y2. (2)

Leveraging the benefits of total function properties, we have incorporated them into our proposed optimization model. The transformation of total function simplification aligns with the specific needs of information systems [3, 4]. The novelty of our research lies in the mathematical advancements applied to soft set theory applications, positioning it as a state-of-the-art approach rather than a mere demonstration of the total function in real-time systems. To illustrate our assumptions, consider the example where X = {1, 2, 3, 4, 5, 6} and Y = {a, b, c, d, e, f}. The relation between X and Y in the total function from X to Y is represented in Table 2.

Table 2: Total Function Representation in Set Theory

| F | Y1 | Y2 | Y3 | Y4 | Y5 | Y6 |
|---|---|---|---|---|---|---|
| X1 | a | a | a | a | a | a |
| X2 | b | b | b | b | b | b |
| X3 | c | c | c | c | c | c |
| X4 | d | d | d | d | d | d |
| X5 | e | e | e | e | e | e |
| X6 | f | f | f | f | f | f |
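The two defining properties above, totality (1) and single-valuedness (2), can be checked mechanically. The snippet below is an illustrative sketch, not part of the paper; the relation F shown encodes one natural reading of Table 2, pairing each xi with a single letter.

```python
def is_total_function(relation, X, Y):
    """Check that `relation` (a set of (x, y) pairs) is a total function from X to Y."""
    # Property (1): every x in X is paired with at least one y in Y.
    total = all(any((x, y) in relation for y in Y) for x in X)
    # Property (2): no x is paired with two different y values.
    single_valued = all(
        y1 == y2
        for (x1, y1) in relation
        for (x2, y2) in relation
        if x1 == x2
    )
    return total and single_valued

X = {1, 2, 3, 4, 5, 6}
Y = {"a", "b", "c", "d", "e", "f"}
F = {(1, "a"), (2, "b"), (3, "c"), (4, "d"), (5, "e"), (6, "f")}

print(is_total_function(F, X, Y))               # True
print(is_total_function(F - {(3, "c")}, X, Y))  # False: 3 has no image
```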



B. Enhancing Soft Set Theory through Total Function Integration

In our innovative model, we have harmonized the advantages of the total function in set theory with the implementation of soft set theory within information systems [1, 2, 3]. This fusion of principles from two theories has yielded inventive approaches for managing data preparation in data science, amplifying the practical efficacy of data science applications. Consider a set of six preprocessing methods (u1, u2, u3, u4, u5, u6) and a set A containing parameters (e1, e2, e3, e4, e5, e6), each denoting a level of fulfillment, such as 100%, 80%, and 0%.

Within our framework, a soft set (F, A) describes the "Preprocessing Methods," employing machine learning to pinpoint the most efficient methods for optimizing the entire system. Drawing inspiration from a situation akin to example 1, each soft set function with a distinct parameter (e1, e2, e3, e4, e5, e6) imposes different conditions on the fulfillment of methods (u1, u2, u3, u4, u5, u6).

For instance, the function soft set with parameter e1 must satisfy all six methods, while the function soft set with parameter e2 meets five conditions. Similarly, the function soft set with parameter e3 satisfies only four methods. The function soft set (F, e4) encompasses three methods (u4, u5, and u6). The function soft set with parameter e5 is obliged to satisfy only two methods (u5 and u6). Additionally, (F, e6) = {u6} signifies that the soft set function with the parameter e6 entails only one method.

Table 3 depicts the varied approaches utilized in the proposed model for data preparation, offering a structure to gauge and evaluate the efficiency of preprocessing the dataset. This model streamlines the storage of soft sets in a computer, optimizing the entire dataset both before and after processing. Table 3 represents a soft set with parameters (e1, e2, e3, e4, e5, e6) and methods (u1, u2, u3, u4, u5, u6), where the values indicate the fulfillment level of each method under different parameters.

Table 3: Binary Representation Table For Soft Set Data


| U | e1 | e2 | e3 | e4 | e5 | e6 |
|----|----|----|----|----|----|----|
| u1 | 1 | 1 | 1 | 1 | 1 | 1 |
| u2 | 1 | 1 | 1 | 1 | 1 | 0 |
| u3 | 1 | 1 | 1 | 1 | 0 | 0 |
| u4 | 1 | 1 | 1 | 0 | 0 | 0 |
| u5 | 1 | 1 | 0 | 0 | 0 | 0 |
| u6 | 1 | 0 | 0 | 0 | 0 | 0 |
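A soft set (F, A) over the universe U can be stored in a computer exactly as the 0/1 grid above. The sketch below is illustrative; it encodes the set-valued map column by column from Table 3, rebuilds the binary grid, and reads off which parameters each method satisfies.

```python
U = ["u1", "u2", "u3", "u4", "u5", "u6"]      # preprocessing methods
A = ["e1", "e2", "e3", "e4", "e5", "e6"]      # parameters

# Soft set F: A -> subsets of U, matching Table 3 column by column.
F = {
    "e1": {"u1", "u2", "u3", "u4", "u5", "u6"},
    "e2": {"u1", "u2", "u3", "u4", "u5"},
    "e3": {"u1", "u2", "u3", "u4"},
    "e4": {"u1", "u2", "u3"},
    "e5": {"u1", "u2"},
    "e6": {"u1"},
}

# Binary representation: row u, column e, entry 1 iff u belongs to F(e).
binary = {u: {e: int(u in F[e]) for e in A} for u in U}

print(binary["u1"]["e6"], binary["u6"]["e2"])   # 1 0, as in Table 3
print([e for e in A if binary["u3"][e]])        # parameters under which u3 is fulfilled
```

Storing only the nonzero memberships (the dictionary F) and expanding to the binary grid on demand is one simple way to keep the representation compact.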

C. Categorization of Preprocessing Approaches

In our newly devised approach, we have introduced a taxonomy for optimizing data science preprocessing [8]. Our research delves into cutting-edge issues, particularly innovative aspects such as the application of optimized mathematical methods employed in preprocessing the dataset within our proposed model. The infusion of innovative computational concepts and the assimilation of emerging soft set theory applications actively contribute to refining the preprocessing phase. The overarching goal is to attain optimal performance in the realm of data science [7, 9].

Fig. 1: Categorization of Preprocessing in the Context of Data Science



V. SUGGESTED FRAMEWORK

We have simplified the architecture of our model, prioritizing clarity, with the intent of advancing the preprocessing phase in both data science and machine learning. Our primary goal is to uncover innovative strategies that optimize essential processes through effective data utilization. This research is committed to introducing mathematical improvements to the implementation of soft set theory, a modern and forward-thinking methodology. The operational framework, depicted in Figure 2, outlines the essential components of our proposed model [5, 6].

In the wider scope of developing machine learning models, a sequence of iterative processes is usually indispensable, as illustrated in Figure 3. In the phase of selecting methods or algorithms, data scientists frequently explore possibilities such as Support Vector Machines, Neural Networks, Bayesian Models, and Decision Trees. Subsequent fine-tuning adjustments to the selected algorithm are often imperative. The evaluation of model performance encompasses diverse metrics, including accuracy, sensitivity, specificity, and F1-score [10, 14].
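The four evaluation metrics mentioned above follow directly from confusion-matrix counts. As a minimal sketch, not tied to any specific model in this paper and using hypothetical counts:

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity (recall), specificity, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, f1

# Hypothetical counts for illustration only.
acc, sens, spec, f1 = classification_metrics(tp=90, fp=10, tn=85, fn=15)
print(round(acc, 3), round(sens, 3), round(spec, 3), round(f1, 3))
# 0.875 0.857 0.895 0.878
```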

Fig. 2: Optimized Framework for Data Science Preprocessing

In our study, we utilized a machine learning model to evaluate the performance of the system in the preprocessing stage. A training set was created by compiling a dataset of described Arabic dialects. The corpus of the training dataset includes several dialects, as detailed in Table 5 (Libya-1, Morocco-2, Egypt-3, Jordan-4, Palestine-5, and Sudan-6). Notably, our simple training model produced well-optimized results for the proposed framework. To assess the model's reliability, we intentionally selected a small subset from the Arabic text corpus and manually organized the dialect words during this phase.

A. Dataset

We have conducted preprocessing on a moderately sized dataset of Arabic dialects, specifically aligned with Modern Standard Arabic. Our model was constructed using a machine-learning approach, building upon the foundation of a model developed for the dataset [9, 10]. Table 4 shows the transformations.

Table 4: Binary Table for Rule-Based Transformation

| Dataset | Codes | Text in Dialects (Total Words) | Text in MSA (Total Words) |
|---|---|---|---|
| 1 | Training | 6 | 1200 |
| 2 | Test | 6 | 600 |
| 3 | Total | 12 | 1800 |
B. Transformation Table Based on Rules

Utilizing conditional rules derived from soft set theory, we converted a standard table into a binary representation, offering an alternative presentation of the soft set. This preprocessing step is a straightforward and versatile approach at the initial stage [15]. Within our machine learning model, the rule-based Table 5 was employed. The accuracies of our optimization techniques were computed to assess the degree of optimization, aligning with the parameters of our proposed optimization methods.
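One way to picture the rule-based conversion described above is to threshold fulfillment levels (e.g. 100%, 80%, 0%, as in Section IV-B) into the binary soft-set form. This is an illustrative sketch with assumed levels and an assumed threshold, not the paper's actual rule table.

```python
# Hypothetical fulfillment levels (percent) for each (method, parameter) pair.
levels = {
    "u1": {"e1": 100, "e2": 80, "e3": 0},
    "u2": {"e1": 100, "e2": 0,  "e3": 0},
}

# Conditional rule: a method counts as satisfied under a parameter
# when its fulfillment level meets the threshold.
THRESHOLD = 80

binary = {u: {e: int(v >= THRESHOLD) for e, v in row.items()}
          for u, row in levels.items()}

print(binary)   # {'u1': {'e1': 1, 'e2': 1, 'e3': 0}, 'u2': {'e1': 1, 'e2': 0, 'e3': 0}}
```

The resulting 0/1 table has the same shape as Table 3 and can be stored or queried in exactly the same way.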




Fig. 3: The logical flow-chart for the dataset preprocessing evaluation

VI. ANALYSIS OF RESULTS

Our implementation of the chosen methods aimed to optimize the functionality of our machine learning model. The data presented in Table 6 outlines the contents of our dataset, which encompasses a variety of Arabic-language documents categorized across different topics. Additionally, we conducted model training using a basic rules-based table. This training process facilitated the conversion of the model's rules into a binary table format, representing the soft set. This transformation adapts the rules into conditional rules, aligning with the principles of soft set theory. This straightforward and versatile approach ensures that the data is appropriately prepared for subsequent use. The epoch, a crucial phase in training, utilizes all available information to refine parameters and enhance accuracy during testing. Table 6 provides a visual representation of the numerical values employed to instruct optimization techniques within the suggested model.

Table 5: Training Data Results Using Various Optimization Methods in the Proposed Model

| S.No | Iteration Progress | M1: Data Cleaning | M2: Scaling & Normalization | M3: Feature Selection | M4: Feature Engineering | M5: Data Augmentation | M6: Parallel Processing |
|---|---|---|---|---|---|---|---|
| 1 | 00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| 2 | 20 | 0.893 | 0.923 | 0.881 | 0.883 | 0.876 | 0.832 |
| 3 | 40 | 0.899 | 0.926 | 0.871 | 0.920 | 0.894 | 0.836 |
| 4 | 60 | 0.901 | 0.944 | 0.912 | 0.927 | 0.900 | 0.921 |
| 5 | 80 | 0.924 | 0.968 | 0.913 | 0.936 | 0.922 | 0.951 |
| 6 | 100 | 0.941 | 0.944 | 0.955 | 0.957 | 0.958 | 0.961 |
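The progression in Table 5 can be summarized programmatically. The sketch below is illustrative and simply re-uses the table's own values to report each method's final (epoch-100) accuracy.

```python
# Accuracy per method at epochs 0, 20, 40, 60, 80, 100 (values from Table 5).
history = {
    "M1-Data Cleaning":           [0.0, 0.893, 0.899, 0.901, 0.924, 0.941],
    "M2-Scaling & Normalization": [0.0, 0.923, 0.926, 0.944, 0.968, 0.944],
    "M3-Feature Selection":       [0.0, 0.881, 0.871, 0.912, 0.913, 0.955],
    "M4-Feature Engineering":     [0.0, 0.883, 0.920, 0.927, 0.936, 0.957],
    "M5-Data Augmentation":       [0.0, 0.876, 0.894, 0.900, 0.922, 0.958],
    "M6-Parallel Processing":     [0.0, 0.832, 0.836, 0.921, 0.951, 0.961],
}

# Final accuracy per method (the last entry of each trajectory).
final = {method: acc[-1] for method, acc in history.items()}
best = max(final, key=final.get)
print(best, final[best])   # M6-Parallel Processing 0.961
```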

Table 6 illustrates the effectiveness of our suggested methods, revealing a favorable trend around the 60th epoch, where the loss level stabilizes. The model underwent training for the first 100 rounds, showcasing improved performance at each 20th epoch, resulting in heightened accuracies through our optimized approaches [13, 15].

Table 6: Test Accuracy Results of Our Proposed Model

| S.No | Enhancement Technique | Accuracy Rate in Testing |
|---|---|---|
| 1 | M1: Data Cleaning | 0.941 |
| 2 | M2: Scaling & Normalization | 0.944 |
| 3 | M3: Feature Selection | 0.955 |
| 4 | M4: Feature Engineering | 0.957 |
| 5 | M5: Data Augmentation | 0.958 |
| 6 | M6: Parallel Processing | 0.961 |



VII. CONCLUSIONS

The optimization of data science is indispensable in advancing high-performance systems heavily reliant on machine learning techniques, ensuring the precision and dependability of information system applications. Constructing a machine learning model involves a comprehensive understanding of varied tools and algorithms, a necessity given the continuous influx of substantial data in the digital realm. In the contemporary landscape, the significance of artificial intelligence (AI) is paramount in expanding and refining our strategies for data handling.

Our research has meticulously scrutinized six distinct methods to assess their efficacy with trained data within a specific information domain. This ongoing exploration and practical application have significantly influenced the field, paving the way for potential advancements in enhancing model effectiveness. As we conclude this phase, our unwavering commitment to continuous research persists, with the subsequent stage of our group's research work poised for exploration.

ACKNOWLEDGEMENT

The authors extend their heartfelt appreciation to the faculty of IT and the Department of CSE for their invaluable guidance, constructive feedback, and the provision of laboratory services throughout the research process.

REFERENCES

[1]. Thomas A. Sudkamp, Languages and Machines: "An Introduction to the Theory of Computer Science", eBook, 1997.
[2]. Molodtsov, "Soft set theory - first results", Computers & Mathematics with Applications, (1999), 19-31.
[3]. Maji et al, "An Application of Soft Sets in a Decision Making Problem", Computers and Mathematics with Applications, Pergamon, 2002.
[4]. Andreas et al, "Set optimization - a rather short introduction", arXiv:1404.5928v2 [math.OC], 2 May 2014.
[5]. Q. Zhong and X. Wang, "A new parameter reduction method based on soft set theory", Vol. 9, No. 5 (2016), 99-108.
[6]. Nasef et al, "Soft Set Theory and Its Applications", https://www.researchgate.net/publication/326561107, July 2018.
[7]. FLEXChip Signal Processor (MC68175/D), Motorola, 1996.
[8]. Kaiwen L. et al, "Evolutionary Many-Objective Optimization: A Comparative Study of the State-of-the-Art", June 5, 2018, DOI: 10.1109/ACCESS.2018.2832181.
[9]. Radwa et al, "Automated Machine Learning: State-of-The-Art and Open Challenges", arXiv:1906.02287v2 [cs.LG], 11 Jun 2019.
[10]. Ebubeogu et al, "Systematic literature review of preprocessing techniques for imbalanced data", DOI: 10.1049/iet-sen.2018.5193, October 2019.
[11]. Amir Ahmad, Shehroz S. Khan, "Survey of State-of-the-Art Mixed Data Clustering Algorithms", DOI: 10.1109/ACCESS.2019.2903568.
[12]. Seba Susan et al, "The balancing trick: Optimized sampling of imbalanced data sets. A brief survey of the recent State of the Art", DOI: 10.1002/eng2.12298, 7 September 2020.
[13]. Dharma et al, "A Performance Comparison of Optimization Algorithms on a Generated Dataset", Chapter, January 2022, DOI: 10.1007/978-981-16-3690-5_135.
[14]. Abdurakhmon Sadiev et al, "Federated Optimization Algorithms with Random Reshuffling and Gradient Compression", arXiv:2206.07021v2 [cs.LG], 3 Nov 2022.
[15]. Syed Muzamil Basha et al, "A Comprehensive Study on Learning Strategies of Optimization Algorithms and Its Applications", DOI: 10.1109/ICSSS54381.2022.9782200, IEEE, 2022.
[16]. Ishaani Priyadarshini et al, "Human activity recognition in cyber-physical systems using optimized machine learning techniques", DOI: 10.1007/s10586-022-03662-8, Springer Nature, 2022.
[17]. Shubhkirti Sharma et al, "A Comprehensive Review on Multi-Objective Optimization Techniques: Past, Present, and Future", DOI: 10.1007/s11831-022-09778-9, June 2022.
[18]. Amit Sagu et al, "Design of Metaheuristic Optimization Algorithms for Deep Learning Model for Secure IoT Environment", Sustainability, 2023, DOI: 10.3390/su15032204.
[19]. Xiangning Chen et al, "Symbolic Discovery of Optimization Algorithms", arXiv:2302.06675v4 [cs.LG], 8 May 2023.
[20]. Yandong Shi et al, "Machine Learning for Large-Scale Optimization in 6G Wireless Networks", IEEE, arXiv:2301.03377v1 [eess.SP], 3 Jan 2023.
[21]. B. Lavanya et al, "Text Genre Classification: A Classified Study", Eur. Chem. Bull, DOI: 10.31838/ecb/2023.12.s1-B.383.

