Neucom Project Peters

The document describes a project using particle swarm optimization (PSO) to optimize the weights of a neural network implemented on a Silimann neurocomputer evaluation board. PSO is used to minimize the error of the neural network by adjusting the weights through iterations. Software was developed to interface with the Silimann hardware, implement PSO, and optimize weights offline for use on the neurocomputer. The project aims to efficiently train neural networks for applications on embedded hardware.


Institute of Integrated Sensor Systems
Dept. of Electrical Engineering and Information Technology

Weight Optimization for a Neural Network using Particle Swarm Optimization (PSO)

Stefanie Peters

October 27, 2006


Prof. Dr.-Ing. Andreas König

Presenter: Stefanie Peters Prof. Dr.-Ing. Andreas König


Lecture Information

Neurocomputing

Prof. Dr.-Ing. Andreas König


Institute of Integrated Sensor Systems

Dept. of Electrical Engineering and Information Technology


University of Kaiserslautern

Fall Semester 2006



What did we learn?

Back Propagation

Digital Neural Network Hardware

Analog Neural Network Hardware

Neural Network Visualization

Technical Real World Problems



Neurocomputing Project
Weight Optimization for a Neural Network using Particle Swarm Optimization (PSO)

Implementation of offline training of the weights for the Silimann neural network (Neurocomputer).

[Figure: project overview linking the Silimann evaluation board, the offline weight training, and the Silimann Trainer software]



Silimann 120cx evaluation board

[Photo: Silimann 120cx evaluation board]

● Neurocomputer
● Max. 10-6-10 network
  => 10 input neurons, 6 hidden neurons and 10 output neurons
● Feed-forward neural network
● Output: example screenshot from Silimann Neuromanual 1.2



Particle Swarm Optimization (PSO)

• Population-based search algorithm
• Developed to simulate the behavior of bird flocks searching for food on a cornfield, or of fish schools
• Evolutionary computation technique
• Each individual (here: particle) has a randomly initialized position X and velocity V in the search space



Particle Swarm Optimization (PSO)
Depending on
• its actual position and velocity (1),
• its own previous best position (2),
• and the previous best position of a particle in a defined neighborhood (e.g. the complete swarm) (2),
the new velocity (3) and position (4) in the search space are found.

[Figure: swarm in the search space, showing the particles, velocity (1), previous best position (2), global best position, new velocity (3) and new position (4)]
Particle Swarm Optimization (PSO)
[Figure: four panels of the search space — 1. actual position and velocity, 2. local and global best position, 3. new velocity, 4. new position; legend: particles, velocity, previous best position, global best position]
Particle Swarm Optimization (PSO)

V_i^{t+1} = V_i^t + C1 * R1 * (XLB_i^{t'} - X_i^t) + C2 * R2 * (XGB_n^{t''} - X_i^t)

X_i^{t+1} = X_i^t + V_i^{t+1}

V: velocity of particle i at time t / t+1
X: position in the search space
XLB: best previous position of particle i, with t' < t (X Local Best)
XGB: best previous position of a particle in the n-neighborhood of particle i (e.g. the complete swarm) (X Global Best)
C1 and C2 are positive constants, usually in the range [1,4].
R1 and R2 are two random functions taking values in the range [0,1].
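As a minimal sketch, the update rule above can be written out for one particle as follows. The data layout and function names here are illustrative (the slides do not show the implementation); the clamping steps mirror the SetVRange and SetXMinMax limits described later.

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// One particle: position X (here: the network weights), velocity V,
// and its own best previous position XLB.
struct Particle {
    std::vector<double> x;
    std::vector<double> v;
    std::vector<double> xlb;
};

double frand() { return std::rand() / (double)RAND_MAX; }  // R1, R2 in [0,1]

// V(t+1) = V(t) + C1*R1*(XLB - X) + C2*R2*(XGB - X);  X(t+1) = X(t) + V(t+1)
void updateParticle(Particle& p, const std::vector<double>& xgb,
                    double c1, double c2, double vmax) {
    for (size_t d = 0; d < p.x.size(); ++d) {
        double r1 = frand(), r2 = frand();
        p.v[d] += c1 * r1 * (p.xlb[d] - p.x[d]) + c2 * r2 * (xgb[d] - p.x[d]);
        if (p.v[d] >  vmax) p.v[d] =  vmax;   // velocity limit (cf. SetVRange)
        if (p.v[d] < -vmax) p.v[d] = -vmax;
        p.x[d] += p.v[d];
        if (p.x[d] >  1.0) p.x[d] =  1.0;     // search-space bounds (cf. SetXMinMax)
        if (p.x[d] < -1.0) p.x[d] = -1.0;
    }
}
```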



Software Implementation PSO

Hardware constraints:
• Max. 10-6-10 network
  => 10 input neurons, 6 hidden neurons and 10 output neurons
  => max. (10+1)*6 + (6+1)*10 = 136 weights (= parameters) for the PSO
  => These network parameters can be easily changed in the program
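The weight count follows directly from the layer sizes, with one bias input ("+1") per neuron layer. As a small illustration (the helper name is hypothetical):

```cpp
#include <cassert>

// Weights of a fully connected I-H-O feed-forward network with bias inputs:
// (I+1) weights per hidden neuron, (H+1) weights per output neuron.
int countWeights(int nIn, int nHidden, int nOut) {
    return (nIn + 1) * nHidden + (nHidden + 1) * nOut;
}
```

For the maximum 10-6-10 network this gives the 136 weights stated above; for the 4-6-3 Iris network used later it gives 51.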

• Input (data) and output files (basically: weights of the NN) of the software must be compatible with the "Silimann Trainer" files.
• These values must lie in the range [-1,1].

[Figure: part of a net-file for the "Silimann Trainer"]



Software Implementation PSO

Software design:
• Implementation of a standard PSO.
• The number of features, classes and samples must be known (constructor) before loading any data from a file.
• Adjustable PSO parameters:
  • Swarm size (constructor),
  • Constants C1 and C2 (constructor),
  • Maximum/minimum velocity,
  • Boundaries of the search space.

COptimizationPSO pso( nNoInputParam, nNoClasses, nNoTrainSamples, nNoParticles, fConst1, fConst2 );
Constructor call for a PSO run. Note: each parameter in the constructor has a default value; COptimizationPSO pso() will call pso(10, 10, 1, 20, 2, 2).

pso.LoadTrainingData("iris_train.txt");
Loading training data.

pso.SetVRange(0.6);
Set minimum/maximum velocity.

pso.SetXMinMax(-1, 1);
Set boundaries of the search space.



Software Implementation PSO

• Each layer of the NN has its own activity function (e.g. to limit the output to a defined range):
  • Linear: no restriction of the NN outputs.
  • Tanhyp: hyperbolic tangent -> restriction to values in the range [-1, 1].
  • Saltus: step function with global threshold s -> restriction to values {-1, 1}.
• More activity functions can be easily implemented.

pso.SetActivityFunctionInput("saltus", s);
Set activity function of the input layer.

pso.SetActivityFunctionHiddenNeuron("tanhyp");
Set activity function of the hidden neurons.

pso.SetActivityFunctionOutput("linear");
Set activity function of the output layer.
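The three activity functions named above can be sketched as follows; the C++ function names are illustrative (the software selects them via the strings "linear", "tanhyp", "saltus"):

```cpp
#include <cassert>
#include <cmath>

// Linear: pass the neuron output through unrestricted.
double actLinear(double x) { return x; }

// Tanhyp: hyperbolic tangent, restricts outputs to the range [-1, 1].
double actTanhyp(double x) { return std::tanh(x); }

// Saltus: step function with global threshold s, outputs in {-1, 1}.
double actSaltus(double x, double s) { return x >= s ? 1.0 : -1.0; }
```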



Software Implementation PSO

• Evaluation of the fitness of a swarm particle:

• PMSE: percentage of mean square error. The PMSE over all N neural network outputs o_i is calculated for all P pattern samples (t_pi: desired result).

[Figure: PMSE formula used for fitness evaluation]

• EC: sum of the classification errors EC_p for all pattern samples. If the maximum neural network output max[o_pi] corresponds to t_pi = 1 -> EC_p = 0, else EC_p = 1.


Software Implementation PSO

• Evaluation of the fitness of a swarm particle (continued):

• EC2: like EC, but the minimum difference mdiff = max[o_pi] - o_pj (j ≠ i) must be greater than a predefined threshold to yield EC_p = 0.

• Further fitness computation functions are implemented as well; see the source code for more information.

pso.SetFitnessFunction("PMSE");
Set fitness function.
Note: at the current state, only one of these parameters (PMSE, EC, …) can be applied to optimize the weights of the neural network (the other parameters are computed for each particle and displayed during the optimization as well). The PSO minimizes the selected fitness parameter.
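The two basic fitness measures can be sketched as below. Two assumptions are labeled in the comments: the exact PMSE normalization is not given on the slides (here it is the mean squared error over all N outputs and P patterns, expressed as a percentage), and EC counts a pattern as misclassified unless the maximum output falls on the neuron whose desired value is 1.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// o[p][i]: network output i for pattern p; t[p][i]: desired result t_pi.
// ASSUMPTION: PMSE = 100 * mean of (t_pi - o_pi)^2 over all outputs/patterns.
double pmse(const std::vector<std::vector<double>>& o,
            const std::vector<std::vector<double>>& t) {
    double sum = 0.0;
    size_t n = 0;
    for (size_t p = 0; p < o.size(); ++p)
        for (size_t i = 0; i < o[p].size(); ++i) {
            double d = t[p][i] - o[p][i];
            sum += d * d;
            ++n;
        }
    return 100.0 * sum / n;
}

// EC: EC_p = 0 if the maximum output max[o_pi] is at the neuron with
// t_pi = 1, else EC_p = 1; returns the sum over all patterns.
int ec(const std::vector<std::vector<double>>& o,
       const std::vector<std::vector<double>>& t) {
    int errors = 0;
    for (size_t p = 0; p < o.size(); ++p) {
        size_t imax = std::max_element(o[p].begin(), o[p].end()) - o[p].begin();
        if (t[p][imax] != 1.0) ++errors;
    }
    return errors;
}
```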



Software Implementation PSO

Further important functions:

• InitializePSO( Key ): sets the seed point for all used random numbers (Key == 0: the system time is used as seed).

Call these functions for every generation:
• ComputeFitness(): computes the fitness for all particles.
• UpdateVelocity(): computes and sets the new velocity and position for all particles.

[Screenshot: output of the fitness parameters during the optimization]
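Put together, the call sequence described on these slides looks roughly as follows. COptimizationPSO here is a compilable stub of the interface shown in the talk: the method bodies are placeholders, the generation loop and nGenerations are illustrative, and only the order of the calls is the point.

```cpp
#include <cassert>
#include <string>

// Stub of the interface from the slides; real bodies live in the project code.
struct COptimizationPSO {
    int generationsRun = 0;
    COptimizationPSO(int nIn, int nCls, int nSamp, int nPart,
                     double c1, double c2) {}
    void LoadTrainingData(const std::string& file) {}
    void SetVRange(double v) {}
    void SetXMinMax(double lo, double hi) {}
    void InitializePSO(int key) {}               // key == 0: seed from system time
    void ComputeFitness() {}                     // fitness of all particles
    void UpdateVelocity() { ++generationsRun; }  // new velocity and position
};

int runOptimization(int nGenerations) {
    COptimizationPSO pso(4, 3, 75, 40, 2.0, 2.0);  // Iris: 4 features, 3 classes
    pso.LoadTrainingData("iris_train.txt");
    pso.SetVRange(0.6);
    pso.SetXMinMax(-1, 1);
    pso.InitializePSO(0);
    for (int g = 0; g < nGenerations; ++g) {
        pso.ComputeFitness();
        pso.UpdateVelocity();
    }
    return pso.generationsRun;
}
```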



Software Implementation PSO

Multiple debugging options increase the comprehensibility of the software.

• For example: output of the hidden and output neurons (to screen or to file)



Iris Data

Iris Plants Database


• The data set contains 3 classes of 50 instances each, where each class refers to a type of iris plant.
• 4 parameters: sepal length and width, petal length and width
• => 4 – 6 – 3 neural network
• => Training set: 25 instances of each class
• => Test set: 25 different instances of each class
• Sources:
  (a) Creator: R.A. Fisher
  (b) Donor: Michael Marshall (MARSHALL%[email protected])
  (c) Date: July, 1988

[Figure: 4 – 6 – 3 feed-forward neural network]



Iris Data
A resulting net-file (weights):
(Optimization: 40 particles, PMSE, tanhyp activity for hidden and output neurons, C1 = C2 = 2)
Results of the offline optimization:
Training: 75/75 patterns, Test: 73/75 patterns correctly classified.
W1_TRAIN
0.1361;0.5477;0.3449;0.0754;-0.2446
-0.2446;0.3370;-0.8200;0.9504;0.2757
0.2757;0.4560;-0.4385;-0.1748;-0.0970
-0.0970;0.4672;0.3002;0.0073;0.8531
0.8531;0.5733;-0.5808;-0.0668;0.1156
0.1156;0.9243;-0.1257;-0.9057;0.8236
W2_TRAIN
0.1290;-0.7530;0.1069;0.9800;0.3467;-0.7851;0.9652
-0.8735;-0.0558;-0.3678;0.2691;0.3834;0.5740;-0.2593
0.1578;-0.3466;0.2025;0.1710;-0.7851;0.9503;-0.0766

[Figure: 4 – 6 – 3 feed-forward neural network]



Results: Iris Data 1

• Uploading of the trained weights to the Silimann evaluation board and testing of the training and test pattern samples:

75 of 75 training patterns are correctly classified by the Silimann evaluation board.

[Screenshots: Training Class2, Training Class3]





Results: Iris Data 1

• Uploading of the trained weights to the Silimann evaluation board and testing of the training and test pattern samples:

71 of 75 test patterns are correctly classified by the Silimann evaluation board.

[Screenshots: Test Class2, Test Class3]



Questions

