
MicroPython - Neural Network

Implement a deep feed-forward neural network on a microcontroller using MicroPython

Olivier Lenoir
[email protected]

August 15, 2024

Abstract
Implement a deep feed-forward neural network on a microcontroller using MicroPython.
This project is written in pure MicroPython.

neuralnetwork.DFF((3, 4, 2))
neuralnetwork.DFF((3, 4, 5, 2))

Licence: MIT, Copyright © 2021, 2024 Olivier Lenoir


Project: https://gitlab.com/olivierlenoir/MicroPython-NeuralNetwork
Documentation: https://olivierlenoir.gitlab.io/MicroPython-NeuralNetwork/MicroPythonNeuralNetwork.pdf

1 Requirements
Download matrix.py¹ and neuralnetwork.py² and copy them to your MicroPython board. The same code can also be used on your computer with standard Python.

¹ matrix.py: https://gitlab.com/olivierlenoir/MicroPython-Matrix/-/blob/master/micropython/matrix.py
² neuralnetwork.py: https://gitlab.com/olivierlenoir/MicroPython-NeuralNetwork/-/blob/master/micropython/neuralnetwork.py

1.1 Downloading packages with wget


wget is a command-line tool used to download files from the internet. It supports various protocols
like HTTP, HTTPS, and FTP. You can use it to download single files, entire websites, and even resume
interrupted downloads. It’s particularly useful for automating downloads and working in environments
without a graphical interface.
$ wget https://gitlab.com/olivierlenoir/MicroPython-Matrix/-/raw/master/micropython/matrix.py
$ wget https://gitlab.com/olivierlenoir/MicroPython-NeuralNetwork/-/raw/master/micropython/neuralnetwork.py

Note the /-/raw/ path, which makes GitLab serve the file contents rather than the HTML page that the /-/blob/ view URL returns.

1.2 Installing packages with mip


Network-capable boards include the mip module, which can install packages from micropython-lib
and from third-party sites (including GitHub, GitLab).
mip (“mip installs packages”) is similar in concept to Python’s pip tool; however, it does not use the PyPI index, instead using micropython-lib as its index by default. mip will automatically fetch a compiled .mpy file when downloading from micropython-lib.
The most common way to use mip is from the REPL:
>>> import mip
>>> mip.install('gitlab:olivierlenoir/MicroPython-NeuralNetwork')

1.3 Installing packages with mpremote


The mpremote tool also includes the same functionality as mip and can be used from a host PC to install
packages to a locally connected device (e.g. via USB or UART):
$ mpremote mip install gitlab:olivierlenoir/MicroPython-NeuralNetwork

2 Get Started
I’m going to describe how MicroPython - Neural Network works with a very small and simple example.
In this classifier we use a sigmoid activation function σ(x) and its derivative σ′(x). Here is
what we want to predict, with inputs i1, i2, i3 and expected classifications s1, s2.

i1 i2 i3 s1 s2
0 0 0 1 0
0 0 1 0 1
0 1 1 0 1
0 1 0 1 0
1 1 0 0 1
1 1 1 1 0
1 0 1 0 1
1 0 0 1 0

Table 1: Training set

We use a (3, 4, 2) neural network: 3 values in the input layer, 4 values in the hidden layer, and
2 values in the output layer.

[Diagram: input layer I → hidden layer H → output layer O]

neuralnetwork.DFF((3, 4, 2))

Figure 1: Neural network (3, 4, 2) detail

The input layer is represented by the matrix I, the hidden layer by the matrix H, and the output layer by the matrix O. The weights between I and H form the matrix $W_{ih}$, and the weights between H and O form the matrix $W_{ho}$.

$$\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma'(x) = \sigma(x) \cdot (1 - \sigma(x)), \qquad \sigma'(x_\sigma) = x_\sigma \cdot (1 - x_\sigma) \tag{1}$$
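As a quick illustration, here is a minimal plain-Python sketch of equation (1); the helper names sigmoid and d_sigmoid are illustrative, not part of the neuralnetwork module:

from math import exp

def sigmoid(x):
    # sigma(x) = 1 / (1 + exp(-x)), squashes any real number into (0, 1)
    return 1 / (1 + exp(-x))

def d_sigmoid(y):
    # Derivative written in terms of an already-activated value y = sigma(x),
    # i.e. sigma'(x) = y * (1 - y), the last form in equation (1)
    return y * (1 - y)

print(sigmoid(0))      # 0.5
print(d_sigmoid(0.5))  # 0.25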


$$I = \begin{pmatrix} i_1 & i_2 & i_3 \end{pmatrix} \tag{2}$$

$$W_{ih} = \begin{pmatrix} w^{ih}_{1,1} & w^{ih}_{1,2} & w^{ih}_{1,3} & w^{ih}_{1,4} \\ w^{ih}_{2,1} & w^{ih}_{2,2} & w^{ih}_{2,3} & w^{ih}_{2,4} \\ w^{ih}_{3,1} & w^{ih}_{3,2} & w^{ih}_{3,3} & w^{ih}_{3,4} \end{pmatrix} \tag{3}$$

$$H = \begin{pmatrix} h_1 & h_2 & h_3 & h_4 \end{pmatrix} \tag{4}$$

$$W_{ho} = \begin{pmatrix} w^{ho}_{1,1} & w^{ho}_{1,2} \\ w^{ho}_{2,1} & w^{ho}_{2,2} \\ w^{ho}_{3,1} & w^{ho}_{3,2} \\ w^{ho}_{4,1} & w^{ho}_{4,2} \end{pmatrix} \tag{5}$$

$$O = \begin{pmatrix} o_1 & o_2 \end{pmatrix} \tag{6}$$

$$S = \begin{pmatrix} s_1 & s_2 \end{pmatrix} \tag{7}$$
From these matrices you can propagate the input I to the output O using the following calculations:

$$H = \sigma(I \cdot W_{ih}), \qquad O = \sigma(H \cdot W_{ho}) \tag{8}$$
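To make equation (8) concrete, here is a minimal plain-Python sketch of the forward pass, reusing sigmoid() from the sketch above; matmul() and forward() are illustrative helpers, not the library’s API:

def matmul(A, B):
    # Row-by-column product of two nested-list matrices
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def forward(I, W_ih, W_ho):
    # Equation (8): H = sigma(I . W_ih), then O = sigma(H . W_ho)
    H = [[sigmoid(x) for x in row] for row in matmul(I, W_ih)]
    O = [[sigmoid(x) for x in row] for row in matmul(H, W_ho)]
    return H, O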

If the network is properly trained, the output O should be very close to the expected matrix S.
If not, this means we need to train the artificial neural network with the training data set and
back-propagate the error to adjust the weights.
We are now going to use the propagated results to back-propagate the error of each layer. $E_o$ and
$E_h$ are the errors of layers O and H. The matrices $gW_{ho}$ and $gW_{ih}$ are the gradients used to adjust the weights.

$$E_o = (S - O) \times \sigma'(O), \qquad gW_{ho} = H^T \cdot E_o \tag{9}$$

$$E_h = (E_o \cdot W_{ho}^T) \times \sigma'(H), \qquad gW_{ih} = I^T \cdot E_h \tag{10}$$

$$W_{ho} = W_{ho} + gW_{ho}, \qquad W_{ih} = W_{ih} + gW_{ih} \tag{11}$$

Here $\cdot$ is the matrix product and $\times$ the element-wise product.

The weights are now updated. Training continues until we are satisfied with the result.
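Continuing the same illustrative sketch (again, not the library’s actual internals), one training step over equations (9) to (11) for a single sample could look like this:

def transpose(A):
    return [list(col) for col in zip(*A)]

def train_step(I, S, W_ih, W_ho):
    H, O = forward(I, W_ih, W_ho)
    # Equation (9): output error, then gradient for W_ho
    E_o = [[(s - o) * d_sigmoid(o) for s, o in zip(S[0], O[0])]]
    gW_ho = matmul(transpose(H), E_o)
    # Equation (10): back-propagated hidden error, then gradient for W_ih
    E_h = [[e * d_sigmoid(h)
            for e, h in zip(matmul(E_o, transpose(W_ho))[0], H[0])]]
    gW_ih = matmul(transpose(I), E_h)
    # Equation (11): add the gradients to the weights (learning rate of 1)
    for W, g in ((W_ho, gW_ho), (W_ih, gW_ih)):
        for i, row in enumerate(g):
            for j, v in enumerate(row):
                W[i][j] += v

Calling a step like this repeatedly over the training set should correspond to what ann.train() does below with lrate=1.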

2.1 Train Neural Network


I recommend training the neural network on a computer; otherwise you may quickly run into memory
errors on your MicroPython board, even if you use garbage collection.
from matrix import Matrix
import neuralnetwork as nn

# Create a neural network with an input layer of 3, a hidden layer of 4,
# and an output layer of 2. By default the activation function is sigmoid(),
# but a ReLU is also available as relu().
ann = nn.DFF((3, 4, 2))

Now let’s create a training set with input matrix and output matrix.

training_set = [
    [Matrix([[0, 0, 0]]), Matrix([[1, 0]])],
    [Matrix([[0, 0, 1]]), Matrix([[0, 1]])],
    [Matrix([[0, 1, 1]]), Matrix([[0, 1]])],
    [Matrix([[0, 1, 0]]), Matrix([[1, 0]])],
    [Matrix([[1, 1, 0]]), Matrix([[0, 1]])],
    [Matrix([[1, 1, 1]]), Matrix([[1, 0]])],
    [Matrix([[1, 0, 1]]), Matrix([[0, 1]])],
    [Matrix([[1, 0, 0]]), Matrix([[1, 0]])],
]

We train the network a thousand times with the training set. The learning rate (lrate) is set to one
by default.

print('Learning progress')
for i in range(1000):
    for a, s in training_set:
        ann.train(a, s, lrate=1)
    if i % 10 == 0:
        print('.', end='')

Check if you are satisfied with the training.


def short(a):
    return round(a, 3)

print('=' * 20)
score = True
for a, s in training_set:
    p = ann.predict(a)
    scr = str(p.map(round)) == str(s.map(short))
    print(a.map(short), p.map(short), p.map(round), s.map(short), scr)
    score &= scr
print('Good learning?', score)

Print out the weights.

print('=' * 20)
print('Rounded weights')
for i, w in enumerate(ann.weights):
    print('W{}'.format(i), w.map(round))

2.2 Predict
With the trained weights, we can now use our network.
from matrix import Matrix
from neuralnetwork import DFF

ann = DFF(
    (3, 4, 2),
    weights=[
        Matrix([[-1, -4, 6, -4], [5, 0, 6, 1], [4, -4, -8, 6]]),
        Matrix([[-10, 10], [5, -5], [7, -7], [6, -6]])
    ]
)

d = Matrix([[1, 0, 1]])
p = ann.predict(d)
print(p)
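For the input (1, 0, 1), Table 1 expects the classification (0, 1); with the weights above, the prediction should come out close to that, roughly (0.03, 0.97) before rounding.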

