
Neural Network Implementation in MicroPython for TinyML

This article describes a methodology for implementing neural networks in MicroPython on an embedded MCU. A fully connected feed-forward network is designed and tested in the MicroPython environment. The weights and biases are exported from a model trained in TensorFlow and imported into the developed model for validation in MicroPython. This approach can be useful for embedded machine learning applications where memory size is a major concern.

MicroPython is a lean and efficient implementation of the Python 3 programming language that includes a small subset of the Python standard library and is optimized to run on microcontrollers and in constrained environments.

The problem statement considered for this article is the same as in the previous one, i.e. detection of diabetes. With minor modifications, the approach can be applied to any real-time data.

Procedure:

1. Train the model with a large amount of available training data on a PC or in Google Colab using TensorFlow.

2. Export the weight and bias values of the trained network from TensorFlow and dump them into a text file.

3. Develop a feed-forward network using basic Python code without NumPy (because MicroPython does not have NumPy).

4. Develop code for all layers with activation functions.

5. Develop code from scratch for converting the output probability to a 0/1 class with a threshold of 0.5.

6. Write code to compute accuracy and print the confusion matrix.

*The diabetes dataset can be downloaded from here.

Step-1: Model Training in Keras


The training script develops and trains a feed-forward network with 4 neurons in the input layer, 2 neurons in the hidden layer, and a single neuron in the output layer. ReLU activation is used for the input and hidden layers, whereas sigmoid activation is used for the output layer.
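
A minimal Keras sketch consistent with this description might look like the following (the dataset file name, preprocessing, and training settings here are assumptions, not the author's exact code):

import numpy as np
from tensorflow import keras
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Load the diabetes dataset: 8 features, binary label in the last column
dataset = np.loadtxt("diabetes.csv", delimiter=",", skiprows=1)
X, y = dataset[:, :8], dataset[:, 8]

# Scale the features, then split into training and test sets
X = StandardScaler().fit_transform(X)
Xtrain, Xtest, ytrain, ytrue = train_test_split(X, y, test_size=0.2, random_state=1)

# 8 features -> 4 neurons (ReLU) -> 2 neurons (ReLU) -> 1 neuron (sigmoid)
model = keras.Sequential([
    keras.layers.Dense(4, activation="relu", input_shape=(8,)),
    keras.layers.Dense(2, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(Xtrain, ytrain, epochs=150, batch_size=16, verbose=0)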

Step-2: Export the weight and bias values of the trained network

weights = model.get_weights()
print(weights)

Output:

[array([[-0.0921422 , -0.02542371,  0.30698848, -0.25456974],
        [-0.50550294, -1.0066229 ,  0.9122949 , -0.26058084],
        [ 0.51205075,  0.43376786, -0.07458406, -0.37492675],
        [-0.6911025 ,  0.6660218 , -0.0747927 , -0.18643615],
        [ 0.30719382,  0.519332  ,  0.52301043, -0.15813994],
        [-0.1978444 , -0.0485612 ,  0.60282296, -1.2380371 ],
        [-0.260084  , -0.00271817,  0.7422599 ,  0.00921784],
        [-1.2720652 , -0.08174618, -0.15088758,  0.68464226]], dtype=float32),
 array([ 0.03986932,  0.7178214 ,  0.10937195, -0.07356258], dtype=float32),
 array([[ 1.000371  ,  1.5499793 ],
        [-2.840075  ,  0.2513094 ],
        [ 0.07223273, -0.5448719 ],
        [ 0.25136417,  0.60296375]], dtype=float32),
 array([0.4844855 , 0.61291546], dtype=float32),
 array([[ 1.1305474],
        [-1.3455248]], dtype=float32),
 array([0.91277504], dtype=float32)]
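
The list alternates kernel and bias arrays per layer: an 8×4 kernel and length-4 bias for the first layer, a 4×2 kernel and length-2 bias for the hidden layer, and a 2×1 kernel with a single bias for the output layer. Keras stores each kernel in (inputs × neurons) order, which is why the matrices are transposed before use in Step-3.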

Export the weights and biases to a text file (optional, for future use):

# Saving the array in a text file
file = open("hyper_param.txt", "w+")
content = str(weights)
file.write(content)
file.close()
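
One caveat: str(weights) embeds NumPy's array(...) repr, which is awkward to parse back later. A variant that writes plain Python literals instead (an assumption, not part of the original article) might be:

# Write each layer's array as a plain Python literal, one per line
with open("hyper_param.txt", "w") as f:
    for arr in weights:
        f.write(repr(arr.tolist()) + "\n")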

https://medium.com/@subirmaity/a-simple-neural-network-implementation-approach-in-micropython-for-deep-learning-application-760ab35cb538 4/15
9/19/22, 9:24 AM Neural Network Implementation in MicroPython for TinyML | by Subir Maity | Medium

Step-3: Feed-forward Network Design in MicroPython from Scratch

A. Fully connected Dense layer

def dense(nunit, x, w, b, activation):
    # Define a single dense layer followed by activation
    res = []
    for i in range(nunit):
        z = neuron(x, w[i], b[i], activation)
        # print(z)
        res.append(z)
    return res
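
Note the data layout: x is a 2-D list of shape (features × samples), so a single call to dense() processes the whole test batch at once, and w holds one weight row per neuron (hence the transposes in section E).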

B. Code for a single neuron with specific activation

def neuron(x, w, b, activation):
    # Perform the operation of a single neuron and return a 1-D array
    # (one output per sample in the batch)
    tmp = zeros1d(x[0])
    for i in range(len(x)):
        tmp = add1d(tmp, [(float(w[i]) * float(x[i][j])) for j in range(len(x[0]))])
    if activation == "sigmoid":
        yp = sigmoid([tmp[i] + b for i in range(len(tmp))])
    elif activation == "relu":
        yp = relu([tmp[i] + b for i in range(len(tmp))])
    else:
        print("Invalid activation function--->")
    return yp
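
The helper functions zeros1d, add1d, transpose, and print_matrix are used above but not shown on these pages; minimal versions consistent with how they are called might look like this (reconstructions, not the author's originals):

def zeros1d(x):  # return a list of zeros with the same length as x
    return [0.0 for _ in range(len(x))]

def add1d(a, b):  # element-wise sum of two equal-length 1-D lists
    return [a[i] + b[i] for i in range(len(a))]

def transpose(m):  # transpose a 2-D list (list of rows)
    return [[m[j][i] for j in range(len(m))] for i in range(len(m[0]))]

def print_matrix(m):  # print a 2-D list row by row
    for row in m:
        print(row)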

C. Activation functions

## ReLU activation
def relu(x):  # ReLU activation function, applied element-wise to a list
    y = []
    for i in range(len(x)):
        if x[i] >= 0:
            y.append(x[i])
        else:
            y.append(0)
    return y

## Sigmoid function
def sigmoid(x):
    import math
    z = [1 / (1 + math.exp(-x[kk])) for kk in range(len(x))]
    return z
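
A quick sanity check of the two activations in the REPL (illustrative values):

>>> relu([-1.5, 0.0, 2.3])
[0, 0.0, 2.3]
>>> sigmoid([0.0])
[0.5]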

D. Classification report

def classification_report(ytrue, ypred):
    # Print prediction results in terms of metrics and a confusion matrix
    tmp = 0
    TP = 0
    TN = 0
    FP = 0
    FN = 0
    for i in range(len(ytrue)):
        # For accuracy calculation
        if ytrue[i] == ypred[i]:
            tmp += 1
        ## True/false positive and negative counts
        if ytrue[i] == 1 and ypred[i] == 1:  # find true positive
            TP += 1
        if ytrue[i] == 0 and ypred[i] == 0:  # find true negative
            TN += 1
        if ytrue[i] == 0 and ypred[i] == 1:  # find false positive
            FP += 1
        if ytrue[i] == 1 and ypred[i] == 0:  # find false negative
            FN += 1
    accuracy = tmp / len(ytrue)
    conf_matrix = [[TN, FP], [FN, TP]]
    # print(TP, FP, FN, TN)
    print("Accuracy: " + str(accuracy))
    print("Confusion Matrix:")
    print_matrix(conf_matrix)  # print_matrix prints directly, so no outer print()
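
For example, with ytrue = [1, 0, 1, 0] and ypred = [1, 1, 1, 0], the counts are TP = 2, TN = 1, FP = 1, FN = 0, giving accuracy 3/4 = 0.75 and confusion matrix [[1, 1], [0, 2]].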

E. Extracted weights and biases from the network trained in TensorFlow on the PC

# Build the dense layers
# Structure: input layer: 4 neurons with 8 features each, ReLU activation
# 1st hidden layer: 2 neurons with 4 inputs each, ReLU activation
# output layer: 1 neuron with 2 inputs, sigmoid activation

w1 = [[-0.0921422 , -0.02542371,  0.30698848, -0.25456974],
      [-0.50550294, -1.0066229 ,  0.9122949 , -0.26058084],
      [ 0.51205075,  0.43376786, -0.07458406, -0.37492675],
      [-0.6911025 ,  0.6660218 , -0.0747927 , -0.18643615],
      [ 0.30719382,  0.519332  ,  0.52301043, -0.15813994],
      [-0.1978444 , -0.0485612 ,  0.60282296, -1.2380371 ],
      [-0.260084  , -0.00271817,  0.7422599 ,  0.00921784],
      [-1.2720652 , -0.08174618, -0.15088758,  0.68464226]]

b1 = [ 0.03986932,  0.7178214 ,  0.10937195, -0.07356258]

w2 = [[ 1.000371  ,  1.5499793 ],
      [-2.840075  ,  0.2513094 ],
      [ 0.07223273, -0.5448719 ],
      [ 0.25136417,  0.60296375]]

b2 = [0.4844855 , 0.61291546]

w3 = [[ 1.1305474],
      [-1.3455248]]

b3 = [0.91277504]

# Transpose all weight matrices so that each row holds one neuron's weights
w1 = transpose(w1)
w2 = transpose(w2)
w3 = transpose(w3)

F. Main program for validation

# Transpose Xtest before feeding it to the NN
yout1 = dense(4, transpose(Xtest), w1, b1, 'relu')  # input layer with 4 neurons
yout2 = dense(2, yout1, w2, b2, 'relu')  # hidden layer with 2 neurons
ypred = dense(1, yout2, w3, b3, 'sigmoid')  # output layer

print(ypred)

ypred_class = [1 if i > 0.5 else 0 for i in ypred[0]]

print(ypred_class)

classification_report(ytrue, ypred_class)
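
Here ypred is a nested list with a single row (one per output neuron), holding one probability per test sample, which is why the 0.5 thresholding iterates over ypred[0].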

The complete code combines the snippets from sections A–F above.


Note: In the main program, the Xtest and ytrue values are taken directly from the training code after scaling (normalization) and splitting. The same network architecture is used, i.e. 4 neurons in the input layer, 2 neurons in the hidden layer, and 1 neuron in the final output layer.

Step-4: Model validation

Copy the code and run it in MicroPython; it prints the predicted probabilities, the thresholded 0/1 class labels, and the accuracy along with the confusion matrix.

Verify the result against the values predicted by TensorFlow in Google Colab, using the code from the model-training section.
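
A sketch of that check in Colab (assuming model and Xtest from Step-1 are still in scope):

# Predict with the trained Keras model and apply the same 0.5 threshold
yprob = model.predict(Xtest)
ypred_tf = [1 if p > 0.5 else 0 for p in yprob[:, 0]]
print(ypred_tf)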

