MicroPython Neural Network
Olivier Lenoir
[email protected]
Abstract
This project implements a deep feed-forward neural network on a micro-controller using MicroPython. It is written in pure MicroPython.
[Figure: example network topologies created with neuralnetwork.DFF((3, 4, 2)) and neuralnetwork.DFF((3, 4, 5, 2))]
1 Requirements
Download matrix.py and neuralnetwork.py and copy them onto your MicroPython board. The same
code can also be used on your computer with standard Python.
mip (“mip installs packages”) is similar in concept to Python’s pip tool; however, it does not use the
PyPI index, but rather uses micropython-lib as its index by default. mip automatically fetches compiled
.mpy files when downloading from micropython-lib.
The most common way to use mip is from the REPL:
>>> import mip
>>> mip.install('gitlab:olivierlenoir/MicroPython-NeuralNetwork')
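Alternatively, if you prefer to drive the board from your computer, mpremote exposes the same installer (this assumes mpremote is installed on the host):

$ mpremote mip install gitlab:olivierlenoir/MicroPython-NeuralNetwork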
2 Get Started
I’m going to describe how MicroPython-NeuralNetwork works with a very small and simple example.
In this classifier we use a sigmoid activation function σ(x) and its derivative σ′(x). Here is
what we want to predict, with i1, i2, i3 as the inputs and s1, s2 as the expected classification.
i1  i2  i3 | s1  s2
-------------------
 0   0   0 |  1   0
 0   0   1 |  0   1
 0   1   1 |  0   1
 0   1   0 |  1   0
 1   1   0 |  0   1
 1   1   1 |  1   0
 1   0   1 |  0   1
 1   0   0 |  1   0
We use a (3, 4, 2) neural network: 3 values in the input layer, 4 values in the hidden layer and
2 values in the output layer.
[Figure: diagram of the (3, 4, 2) network with input layer I, hidden layer H and output layer O, as created by neuralnetwork.DFF((3, 4, 2))]
The input layer is represented by the matrix I, the hidden layer by the matrix H and the output layer by the matrix O.
The weights between I and H form the matrix Wih, and the weights between H and O form the matrix Who.
\[
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
\sigma'(x) = \sigma(x)\,(1 - \sigma(x)), \qquad
\sigma'(x_\sigma) = x_\sigma\,(1 - x_\sigma)
\tag{1}
\]
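The last identity gives the derivative in terms of an already-activated value xσ = σ(x), which is convenient in code. As an illustration, eq. (1) translates directly into a few lines of Python (a sketch for clarity; the library ships its own implementation):

from math import exp

def sigmoid(x):
    # logistic activation, eq. (1)
    return 1 / (1 + exp(-x))

def d_sigmoid(y):
    # derivative written in terms of an already-activated value y = sigmoid(x)
    return y * (1 - y)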
\[
I = \begin{pmatrix} i_1 & i_2 & i_3 \end{pmatrix}
\tag{2}
\]
\[
W_{ih} = \begin{pmatrix}
w^{ih}_{1,1} & w^{ih}_{1,2} & w^{ih}_{1,3} & w^{ih}_{1,4} \\
w^{ih}_{2,1} & w^{ih}_{2,2} & w^{ih}_{2,3} & w^{ih}_{2,4} \\
w^{ih}_{3,1} & w^{ih}_{3,2} & w^{ih}_{3,3} & w^{ih}_{3,4}
\end{pmatrix}
\tag{3}
\]
\[
H = \begin{pmatrix} h_1 & h_2 & h_3 & h_4 \end{pmatrix}
\tag{4}
\]
\[
W_{ho} = \begin{pmatrix}
w^{ho}_{1,1} & w^{ho}_{1,2} \\
w^{ho}_{2,1} & w^{ho}_{2,2} \\
w^{ho}_{3,1} & w^{ho}_{3,2} \\
w^{ho}_{4,1} & w^{ho}_{4,2}
\end{pmatrix}
\tag{5}
\]
\[
O = \begin{pmatrix} o_1 & o_2 \end{pmatrix}
\tag{6}
\]
\[
S = \begin{pmatrix} s_1 & s_2 \end{pmatrix}
\tag{7}
\]
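To relate these shapes to code, here is a sketch of how the same objects could be built with the project’s Matrix class; the list-of-lists constructor appears in the examples below, while the random initialisation is an assumption made for illustration:

from matrix import Matrix
from random import random

I = Matrix([[0, 1, 1]])  # 1x3 input row vector, eq. (2)
# weight matrices, shapes from eqs. (3) and (5); random values are illustrative
Wih = Matrix([[random() for _ in range(4)] for _ in range(3)])  # 3x4
Who = Matrix([[random() for _ in range(2)] for _ in range(4)])  # 4x2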
From those matrices you can propagate the input I to the output O using the following calculations:
\[
H = \sigma(I \cdot W_{ih}), \qquad
O = \sigma(H \cdot W_{ho})
\tag{8}
\]
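For readers who want to see eq. (8) spelled out, here is a minimal pure-Python sketch of the forward pass using plain lists of lists, independent of the project’s Matrix class:

from math import exp

def sigmoid(x):
    return 1 / (1 + exp(-x))

def matmul(a, b):
    # product of an m x n and an n x p list-of-lists matrix
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def forward(i_row, w_ih, w_ho):
    # H = sigmoid(I . Wih), O = sigmoid(H . Who), eq. (8)
    h = [[sigmoid(v) for v in row] for row in matmul(i_row, w_ih)]
    o = [[sigmoid(v) for v in row] for row in matmul(h, w_ho)]
    return h, o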
If the network is properly trained, the output O should be very close to the expected S matrix.
If not, this means we need to train the artificial neural network with the training data set and
back-propagate the error to adjust the weights.
We are now going to use the propagated results to back-propagate the error of each layer. Eo and
Eh are the errors of layers O and H. The matrices gWho and gWih are the gradients used to adjust the weights.
\[
E_o = (S - O) \times \sigma'(O), \qquad
gW_{ho} = H^{T} \cdot E_o
\tag{9}
\]
\[
E_h = (E_o \cdot W_{ho}^{T}) \times \sigma'(H), \qquad
gW_{ih} = I^{T} \cdot E_h
\tag{10}
\]
Each weight matrix is then updated with its gradient scaled by the learning rate, W ← W + lrate · gW. Training continues until we are satisfied with the result.
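Continuing the list-of-lists sketch above (it reuses matmul() and forward() from that block), one training step of eqs. (9) and (10) could look as follows; the library’s DFF.train() presumably performs the equivalent computation internally:

def transpose(m):
    return [list(col) for col in zip(*m)]

def train_step(i_row, s_row, w_ih, w_ho, lrate=1):
    h, o = forward(i_row, w_ih, w_ho)  # forward pass, eq. (8)
    # output error: Eo = (S - O) x sigma'(O), element-wise, eq. (9)
    e_o = [[(s - y) * y * (1 - y) for s, y in zip(s_row[0], o[0])]]
    # hidden error: Eh = (Eo . Who^T) x sigma'(H), eq. (10)
    e_lin = matmul(e_o, transpose(w_ho))[0]
    e_h = [[e * y * (1 - y) for e, y in zip(e_lin, h[0])]]
    # gradients: gWho = H^T . Eo and gWih = I^T . Eh
    g_who = matmul(transpose(h), e_o)
    g_wih = matmul(transpose(i_row), e_h)
    # update the weights in place: W <- W + lrate * gW
    for w, g in ((w_ho, g_who), (w_ih, g_wih)):
        for r in range(len(w)):
            for c in range(len(w[0])):
                w[r][c] += lrate * g[r][c]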
2.1 Train
Now let’s create a training set with the input and expected-output matrices.
from matrix import Matrix
from neuralnetwork import DFF

# create the (3, 4, 2) deep feed-forward network to train
ann = DFF((3, 4, 2))

training_set = [
    [Matrix([[0, 0, 0]]), Matrix([[1, 0]])],
    [Matrix([[0, 0, 1]]), Matrix([[0, 1]])],
    [Matrix([[0, 1, 1]]), Matrix([[0, 1]])],
    [Matrix([[0, 1, 0]]), Matrix([[1, 0]])],
    [Matrix([[1, 1, 0]]), Matrix([[0, 1]])],
    [Matrix([[1, 1, 1]]), Matrix([[1, 0]])],
    [Matrix([[1, 0, 1]]), Matrix([[0, 1]])],
    [Matrix([[1, 0, 0]]), Matrix([[1, 0]])],
]
We train the network a thousand times with the training set. The learning rate (lrate) is set to one
by default.
def short(x):
    # display helper rounding values to two decimals (assumed definition;
    # the original code defines a similar formatting helper)
    return round(x, 2)

print('Learning progress')
for i in range(1000):
    for a, s in training_set:
        ann.train(a, s, lrate=1)
    if i % 10 == 0:
        print('.', end='')
print('=' * 20)

score = True
for a, s in training_set:
    p = ann.predict(a)
    scr = str(p.map(round)) == str(s.map(short))
    print(a.map(short), p.map(short), p.map(round), s.map(short), scr)
    score &= scr
print('Good learning?', score)
2.2 Predict
With the trained weights, we can now use our network.
from matrix import Matrix
from neuralnetwork import DFF

# network rebuilt with the weights obtained after training
ann = DFF(
    (3, 4, 2),
    weights=[
        Matrix([[-1, -4, 6, -4], [5, 0, 6, 1], [4, -4, -8, 6]]),
        Matrix([[-10, 10], [5, -5], [7, -7], [6, -6]]),
    ],
)
d = Matrix([[1, 0, 1]])
p = ann.predict(d)
print(p)
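For the input (1, 0, 1), the training table above expects the classification (0, 1), so the printed prediction should be close to those values.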