Machine Learning at Resource Constraint Edge Device Using Bonsai Algorithm
All content following this page was uploaded by Manjunath R Kounte on 08 February 2021.
Abstract- In the worldwide billions of devices connected which carryout computation of substantial amount of
each other to interact with the surrounding environment data is referred to as Fog computing. In Edge
to collect the data based on the context. Using machine computing, computational process is enabled within
learning algorithm intelligence can be incorporated in the device or very close to the end user for storing and
these Internet of Things (IoT) devices to get valuable processing rather than transmitting the data to massive
insights from these data for accurate predictions. computational services provided by Cloud services.
Machine learning model is deployed onto the devices for
making the decisions locally. This enables fast, accurate
prediction within few milliseconds by evading data A. Motivation
transmission to the cloud and makes perfectly applicable
for real time applications. In this paper, the experiment Machine learning applications are successful in
is conducted with publicly available dataset with Bonsai various domains. Edge computing is a promising
algorithm. This algorithm is implemented in Linux technology pushing the computation to the End user.
environment with core i5 processor in python 2.7 and
achieved 92% accuracy with model size of 6.25KB, which Though IoT devices has limited computational power,
can be easily deployed on resource constraint IoT energy, and memory resources for deploying the
devices. machine learning on IoT devices, lot of research going
on embed intelligence to IoT devices.
Keywords-Machine Learning, Resource constraint, Edge
Computing, IoT, Bonsai
As shown in Fig.1 literature is conducted on IEEE
I. INTRODUCTION publications with the keyword “Edge computing and
Machine Learning” clearly depicts the in the last few
Internet of Things applications are used in Plethora of years dramatic increase in the interest of research
domain such as Healthcare, Smart Cities, smart
community for deploying machine learning in
precision agriculture etc. These IoT devices collects
resource constraint edge devices is the motivation for
data based on the context and transmits to the cloud
writing this paper.
for decision making. As decision making requires time
for processing the data and to produce the insights, this
approach will not be suitable for real time applications
wherein data needs to be processed and has to produce
the results within few milliseconds.
Authorized licensed use limited to: REVA UNIVERSITY. Downloaded on February 08,2021 at 18:26:57 UTC from IEEE Xplore. Restrictions apply.
The outline of the article is as follows. Section II introduces machine learning at IoT edge devices and various architectural designs. Following that, we discuss the Bonsai algorithm, followed by its applications. Finally, we present the results, followed by the conclusion in the last section.

II. EDGE MACHINE LEARNING

The enormous growth of IoT devices such as smart devices and smartphones generates a huge amount and variety of data. These smart devices deployed at the network edge are called edge devices. Edge devices with more computational capability are commonly called edge servers [1]. Edge computing [2,3] refers to the computing capabilities of edge devices and edge servers near end users [4]. This intelligent processing and data analytics happen through various machine learning techniques [5,6] at the IoT edge devices, closer to the data source. Machine learning at the edge is a combination of two paradigms, as shown in Fig. 2.

Machine learning models [8,9] are designed to be embedded within IoT devices to provide fast, accurate results with reduced latency on resource-constrained devices. In this context, a few architectural designs [10] and strategies developed for time-stringent applications are discussed below.

1) Device-centric computation (On-device computation)

Machine learning models are trained on the local system or in the cloud; training takes place offline. This is referred to as on-device computation [11] or edge computing [12]. As shown in Fig. 3, the trained model is later deployed onto the edge device for faster prediction through mathematical operations.
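The paper does not give code for the train-offline/predict-on-device pattern; the following is a minimal sketch of the idea, where a closed-form least-squares fit in pure NumPy stands in for the real (framework-based) training pipeline, and the toy data and threshold are illustrative assumptions only:

```python
import json
import numpy as np

# --- Offline phase (cloud / local workstation) --------------------------
# Train a tiny linear classifier on hypothetical toy data; this stands in
# for whatever training pipeline is actually used.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Closed-form least-squares fit of w on [X | 1] (no ML framework needed)
Xb = np.hstack([X, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# "Ship" only the learned parameters to the device (a few bytes)
model_blob = json.dumps({"w": w.tolist()})

# --- On-device phase ----------------------------------------------------
# The device only needs the tiny weight vector, not the training data.
w_dev = np.array(json.loads(model_blob)["w"])

def predict(x):
    """Local inference: one dot product, no cloud round trip."""
    return int(np.dot(np.append(x, 1.0), w_dev) > 0.5)

print(predict([-2.0, -2.0]))  # class 0
print(predict([2.0, 2.0]))    # class 1
```

The point of the split is that only the serialized parameter blob crosses the network once, after which every prediction is a local arithmetic operation.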
3) Distributed Computing (Joint Computing)

The main objective of pruning is to reduce the size of the network by removing weight connections that do not have much influence on the output, as shown in Fig. 7. Though pruning is an older concept, it has gained a lot of attention in the on-device and edge computing paradigms, because deploying smaller machine learning models on mobile or IoT devices increases speed and decreases model size while retaining high accuracy.

A pre-trained network, along with its weights, is the input for pruning. Pruning removes the connections and parameters of the model that do not affect the final output. The pruned model then needs to be recompiled.

Hyperparameters such as the number of epochs, the learning rate, and the activation function are tuned to maximise performance.

5) Performance Metrics

For most AI-driven edge device applications, designing a machine learning capability or inference model on edge devices for high accuracy involves many additional constraints imposed by device resources that affect the performance parameters [16], as shown in Table II.
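The paper does not show how connections are selected for removal; a minimal sketch of the classic magnitude-pruning heuristic it describes, using a hypothetical random weight matrix in place of a real pre-trained layer:

```python
import numpy as np

# Hypothetical pre-trained weight matrix (stands in for a real layer).
rng = np.random.default_rng(1)
weights = rng.normal(0, 1, (8, 8))

def magnitude_prune(w, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights.

    Connections with little influence on the output (small |w|) are
    removed; in practice the pruned model is then fine-tuned or
    recompiled to recover accuracy.
    """
    k = int(w.size * sparsity)               # number of weights to drop
    if k == 0:
        return w.copy()
    threshold = np.sort(np.abs(w), axis=None)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

pruned = magnitude_prune(weights, sparsity=0.75)
print(np.count_nonzero(pruned))  # 16 of 64 weights survive
```

A sparse matrix like `pruned` can then be stored in compressed form, which is where the on-device size reduction actually comes from.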
Table II: Performance Metrics
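Metrics such as model size and per-prediction latency can be measured directly on the target device; a minimal sketch, assuming a hypothetical linear model as the inference workload:

```python
import time
import numpy as np

# Hypothetical tiny model: a single weight matrix for a linear layer.
w = np.zeros((10, 64), dtype=np.float32)

def infer(x):
    return w @ x

# Model size: on a constrained device the dominant storage cost is
# the parameters themselves.
model_bytes = w.nbytes            # 10 * 64 * 4 = 2560 bytes (~2.5 KB)

# Latency: average wall-clock time per prediction over many runs.
x = np.ones(64, dtype=np.float32)
n = 1000
start = time.perf_counter()
for _ in range(n):
    infer(x)
latency_ms = (time.perf_counter() - start) / n * 1000

print(f"model size: {model_bytes} bytes")
print(f"latency: {latency_ms:.4f} ms/prediction")
```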
Generally, tree algorithms can be used for classification and regression problems. Though trees are suitable for IoT applications, pruning trees to fit into a resource-constrained IoT device leads to poor accuracy.

BONSAI [18] is a machine learning model designed for deployment on resource-constrained edge devices, taking into account the storage and latency requirements of real-time applications and keeping the model size small enough to fit on small IoT devices. Bonsai is a strong and shallow non-linear tree-based classifier model for supervised learning tasks such as binary and multi-class classification, regression, and ranking. The Bonsai algorithm [19] is designed to fit in a few KB of memory on small IoT devices. Bonsai can be trained on a laptop or in the cloud and then deployed onto severely resource-constrained IoT devices, which make predictions locally at the edge.

B. Flowchart

To reduce the model size, Bonsai learns a single shallow tree, a process that by itself would decrease accuracy. To retain accuracy, the algorithm allows each node to predict a score represented as a vector; the final predicted vector is the summation of all the individual predicted vectors, as shown in Fig. 8. The input vector is projected into a low-dimensional space using a learnt sparse projection matrix. In the flowchart, Ik(x) represents the indicator function, Wk and Vk represent the sparse predictors learnt at node k, Z represents the sparse projection matrix, and σ represents the tunable hyperparameter.
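In the Bonsai formulation of [18], the prediction is y(x) = Σk Ik(x) · (WkᵀZx) ∘ tanh(σ VkᵀZx), summed over the nodes on the path the tree routes x through. A minimal NumPy sketch of that prediction step follows; the dimensions, random parameters, and branching vectors θk are illustrative assumptions (real Bonsai learns Z, W, V, and θ jointly and keeps them sparse):

```python
import numpy as np

# Hypothetical, small dimensions: D input features, d projected dims,
# L labels, K nodes in a shallow complete binary tree (depth 2 here).
D, d, L, K = 16, 4, 3, 7

rng = np.random.default_rng(2)
Z = rng.normal(0, 0.1, (d, D))       # projection matrix (dense for brevity)
W = rng.normal(0, 0.1, (K, L, d))    # per-node predictors Wk
V = rng.normal(0, 0.1, (K, L, d))    # per-node predictors Vk
theta = rng.normal(0, 0.1, (K, d))   # per-node branching hyperplanes
sigma = 1.0                          # tunable hyperparameter

def bonsai_predict(x):
    """Sum of per-node score vectors along the root-to-leaf path.

    Ik(x) is 1 only for nodes on the path chosen by the branching
    hyperplanes; each such node contributes (Wk z) * tanh(sigma * Vk z),
    and the label with the highest summed score wins.
    """
    z = Z @ x                        # project input to low dimension
    score = np.zeros(L)
    k = 0                            # start at the root
    while k < K:
        score += (W[k] @ z) * np.tanh(sigma * (V[k] @ z))
        k = 2 * k + 1 + int(theta[k] @ z > 0)   # left or right child
    return int(np.argmax(score))

x = rng.normal(0, 1, D)
print(bonsai_predict(x))
```

Because only the nodes on one path contribute, prediction cost grows with tree depth rather than with the total number of nodes, which is what keeps inference within the latency budget of a small IoT device.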
Fig. 9. Experiment result screenshot
CONCLUSION
5) Manjunath R Kounte, Pratyush Kumar Tripathy, Pramod P, Harshit Bajpai, "Analysis of Intelligent Machine using Deep Learning and Natural Language Processing", 4th International Conference on Trends in Electronics and Informatics (ICEI), IEEE, 2020
19) https://github.com/Microsoft/EdgeML/wiki/Bonsai
20) https://www.kaggle.com/bistaumanga/usps-dataset
6) Shridevi J Kamble, Manjunath R Kounte, "On Road Intelligent Vehicle Path Prediction and Clustering using Machine Learning Approach", Third International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud), IEEE, 2019
7) Nandan Jha, Khaled Alwasel, Areeb Alshoshan,
Xianghua Huang, Ranesh Kumar Naha, Sudheer
Kumar Battula, Saurabh Garg, Deepak Puthal, Philip
James, Albert Zomaya, Schahram Dustdar, Rajiv
Ranjan, “IoTSim-Edge: A simulation Framework for
Modeling the Behaviour of IoT and Edge Computing
Environments”, Journal of Software: Practice
and Experience, pp. 844-867, Jan 2020
8) M.G.Sarwar Murshed, Christopher Murphy, Daqing
Hou, Nazar Khan, Ganesh Ananthanarayanan, Faraz
Hussain, “Machine Learning at the Network Edge: A
survey” 2020
9) Tomasz Szydlo, Joanna Sendorek, Robert Brzoza
woch, “Enabling machine learning on resource
constrained devices by source code generation of the
learned models”, International Conference on
Computational Science, Lecture Notes in Computer
Science book series, vol.10861, pp.682-694, 2018
10) Massimo Merenda, Carlo Porcaro, Demetrio Iero,
“Edge Machine Learning for AI-Enabled IoT
Devices: A Review”, Sensors(Basel)-Open Access
Journal, April 2020
11) Sauptik Dhar, Junyao Guo, Jiayi (Jason) Liu, Samarth
Tripathi, Unmesh Kurup, Mohak Shah, “ On-Device
Machine Learning: An Algorithms and Learning
Theory Perspectives”, vol.1, no.1, ACM, 2020
12) Sergej Svorobej, Patricia Takako Endo, Malika
Bendechache, Christos Fielis-Papadopoulos,
Konstantinos M. Giannotakis, George A.Gravvanis,
Dimitrios Tzovaras, James Byrne, Theo Lynn, “
Simulating Fog and Edge Computing Scenarios: An
Overview and Research Challenges”, Special Issues
Cloud Computing and Internet of Things, Future
Internet, vol.11, no.3, 2019
13) Hai Lin, Sherali Zeadally, Zhihong Chen, Houda
Labiod, Lusheng Wang, “A Survey on Computation
Offloading Modeling for Edge Computing”, Journal
of Network and Computer Applications, Elsevier,
vol.169, 2020
14) Soumyalatha Naveen, Manjunath R Kounte, “In
Search of the Future Technologies: Fusion of Machine
Learning, Fog and Edge Computing in the Internet of
Things”, Lecture Notes on Data Engineering and
Communications Technologies, Springer, vol.31,
2019
15) Mohit Kumar, Xingzhou Zhang, Liangkai Liu, Yifan
Wang, Weisong Shi, “Energy-Efficient Machine
Learning on the Edges”, IEEE International Parallel
and Distributed Processing Symposium Workshops,
2020
16) Jiasi Chen, Xukan Ran, “Deep Learning With Edge
Computing: A Review”, Proceedings of the IEEE,
vol.107, no.8, August 2019
17) Fangxin Wang, Miao Zhang, Xiangxiang Wang,
Xiaoqiang Ma, and Jiangchuan Liu, "Deep Learning
for Edge Computing Applications: A state of the Art
Survey”, IEEE Access, vol.8, pp.58322-58336, March
2020
18) Ashish Kumar, Saurabh Goyal, Manik Varma, "Resource-efficient Machine Learning in 2 KB RAM for the Internet of Things", Proceedings of the 34th International Conference on Machine Learning, 2017