
International Journal of Computer Information Systems,

Vol. 3, No. 6, 2011



Male Neurons Use in Multilayer Neural Networks

Andrey Makarovskiy
Applied Mathematics Department
Moscow Institute of Electronics and Mathematics
Moscow, Russia
[email protected]


Abstract: Special software has been created to recognize noisy digits using a multilayer neural network. It helped to find that using the same number of male and female neurons in the hidden layers improves the correctness of digit recognition.
Keywords: male neuron, neural network
I. INTRODUCTION
The work [1] specifies the fundamental differences between the male and female psyche and suggests neuron activation functions modeling the male psyche type. For that purpose, a neuron needs a blind area for small input signal levels. A neuron whose activation function is a hyperbolic tangent with such a blind area can serve as an example (Figure 1).
Figure 1. Neuron activation function in the form of a hyperbolic tangent with a blind area.
This activation function is described by the formula:

$$
f(x) =
\begin{cases}
\tanh(x + l), & x < -l, \\
0, & -l \le x \le l, \\
\tanh(x - l), & x > l,
\end{cases}
$$

where $l$ is half of the blind area width.
These neurons can be used in multilayer neural networks.
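For illustration, a minimal C# sketch of this activation function follows (the applications described in Section IV were written in C#; the class and method names here are our own, and the derivative is included because the back propagation algorithm of Section III needs it):

```csharp
using System;

static class MaleActivation
{
    // Hyperbolic tangent with a blind area of half-width l: the neuron
    // gives no response to input signals with |x| <= l.
    public static double F(double x, double l)
    {
        if (x > l)  return Math.Tanh(x - l);
        if (x < -l) return Math.Tanh(x + l);
        return 0.0;                          // blind area
    }

    // Derivative of F, needed by back propagation (Section III):
    // (tanh u)' = 1 - tanh(u)^2 outside the blind area, 0 inside it.
    public static double DF(double x, double l)
    {
        if (x > l)  { double t = Math.Tanh(x - l); return 1.0 - t * t; }
        if (x < -l) { double t = Math.Tanh(x + l); return 1.0 - t * t; }
        return 0.0;
    }
}
```

Setting l = 0 turns F back into the ordinary hyperbolic tangent neuron.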

II. MULTILAYER NEURAL NETWORK

A multilayer neural network usually has one input layer, one output layer and one or two hidden layers. Its structure is described in detail in [2].
Each layer has $N_k$ neurons, $k = 1, \dots, L$, where $L$ is the number of layers; let $NR_i^k$ denote neuron $i$ of layer $k$. Input signals $x_i(t)$, $i = 1, \dots, N_1$, come to the input layer. Let $y$ denote the neuron output signals.
All the input signals come to every neuron input of the first layer. The neuron outputs of each layer are connected to the neuron inputs of the following layer according to the each-to-each principle, i.e. all the output signals of the previous layer's neurons come to the inputs of every neuron of the next layer. Each neuron multiplies its input signals by the weights characterizing it, sums the results, and applies the activation function to the summed signal, forming the output signal.
Neuron activation functions can differ. The present work takes the hyperbolic tangent as the neuron activation function for all the layers except the output one; the hyperbolic tangent with the blind area is applied to the hidden layer neurons, and a linear function is applied to the output neurons. The output of neuron $NR_i^k$ is calculated by equation (1):
$$
y_i^{(k)}(t) = f\left(s_i^{(k)}(t)\right) = f\left(\sum_{j=1}^{N_{k-1}} w_{ij}^{(k)}(t)\, x_j^{(k)}(t)\right), \quad k = 1, \dots, L - 1,
$$

$$
y_i^{(L)}(t) = f_{out}\left(s_i^{(L)}(t)\right) = \frac{1}{2} \sum_{j=1}^{N_{L-1}} w_{ij}^{(L)}(t)\, x_j^{(L)}(t), \qquad (1)
$$

where $w_{ij}^{(k)}$ is the weight of neuron $i$ of layer $k$ for input $j$.

Here $f$ and $f_{out}$ stand for the neuron activation functions in layers 1-3 and in the output layer respectively.
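A minimal sketch of the forward pass (1), assuming the MaleActivation class sketched in Section I and our own jagged-array naming; for simplicity it applies one activation per non-output layer, while the networks in Section IV mix ordinary and male neurons within the hidden layers:

```csharp
// Forward pass of equation (1), a sketch. w[k][i][j] is the weight of
// neuron i in layer k for input j; l is the blind area half-width
// (l = 0 gives ordinary tanh neurons). The output layer is linear,
// taken here as f_out(s) = s / 2 to match equation (1).
static double[] Forward(double[] input, double[][][] w, double l)
{
    double[] x = input;                      // x_j^(k): inputs of layer k
    for (int k = 0; k < w.Length; k++)
    {
        var y = new double[w[k].Length];     // y_i^(k): outputs of layer k
        for (int i = 0; i < y.Length; i++)
        {
            double s = 0.0;                  // weighted sum s_i^(k)
            for (int j = 0; j < x.Length; j++)
                s += w[k][i][j] * x[j];
            y[i] = (k == w.Length - 1) ? s / 2.0              // linear output
                                       : MaleActivation.F(s, l);
        }
        x = y;                               // outputs feed the next layer
    }
    return x;                                // network output signals (2)
}
```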
The neuron output signals of layer $L$,

$$
y_1^{(L)}(t),\; y_2^{(L)}(t),\; \dots,\; y_{N_L}^{(L)}(t), \qquad (2)
$$

are at the same time the network output signals. They are compared to the sample network signals:

$$
d_1^{(L)}(t),\; d_2^{(L)}(t),\; \dots,\; d_{N_L}^{(L)}(t). \qquad (3)
$$
The network error is calculated by the equation:

$$
Q(t) = \sum_{i=1}^{N_L} \left(\varepsilon_i^{(L)}(t)\right)^2 = \sum_{i=1}^{N_L} \left(d_i^{(L)}(t) - y_i^{(L)}(t)\right)^2. \qquad (4)
$$
In this case, the neural network solves a classification problem. The network has 10 outputs corresponding to the digits from 0 to 9. The sample signals (3) equal either -1 or +1: when the network is trained on digit $M$, the $M$-th sample signal equals +1 and all the other sample signals equal -1. The output network signals (2) can assume different values: it is assumed that if the biggest output value is at output $M$ and is bigger than 0.3, then the input signal corresponds to the digit $M$. If there is no signal on the network outputs that is bigger than 0.3, it is assumed that the network did not manage to recognize the digit.
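This decision rule, sketched in C# (the method name is ours; -1 marks an unrecognized digit):

```csharp
// The input is recognized as digit M when output M is the largest and
// exceeds 0.3; otherwise the network failed to recognize the digit.
static int Decode(double[] outputs)
{
    int best = 0;
    for (int i = 1; i < outputs.Length; i++)
        if (outputs[i] > outputs[best]) best = i;
    return outputs[best] > 0.3 ? best : -1;
}
```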
The input signals can also have two values (+1 and -1). The picture is divided into cells (a 12 by 12 grid, i.e. 144 cells, was used for the calculations; the value can be changed). The value +1 corresponds to the presence of a part of the digit picture in the cell, and -1 means that the cell lacks it. The whole input matrix, expanded line by line into a vector, is delivered to the network input.
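A sketch of this input encoding (the method name is ours):

```csharp
// A 12 x 12 grid of cells is expanded line by line into 144 input
// signals: +1 for a cell containing part of the digit, -1 otherwise.
static double[] Encode(bool[,] cells)
{
    int rows = cells.GetLength(0), cols = cells.GetLength(1);
    var x = new double[rows * cols];
    for (int r = 0; r < rows; r++)
        for (int c = 0; c < cols; c++)
            x[r * cols + c] = cells[r, c] ? 1.0 : -1.0;
    return x;
}
```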
III. THE ERROR BACK PROPAGATION ALGORITHM

Learning in the multilayer neural network reduces to selecting the weights that minimize the functional (4). For that purpose, the back propagation algorithm, based on the steepest descent method, is applied.
[2] provides the basic steepest descent method equations; they are changed a little in this work.
The weights are corrected by the equation:

$$
w_{ij}^{(k)}(t+1) = w_{ij}^{(k)}(t) - \eta\, \frac{\partial Q(t)}{\partial w_{ij}^{(k)}(t)},
$$

where $\eta$ stands for the learning rate. The error function derivative is calculated by the equation:
$$
\frac{\partial Q(t)}{\partial w_{ij}^{(k)}(t)} = \frac{\partial Q(t)}{\partial s_i^{(k)}(t)} \cdot \frac{\partial s_i^{(k)}(t)}{\partial w_{ij}^{(k)}(t)} = \frac{\partial Q(t)}{\partial s_i^{(k)}(t)}\, x_j^{(k)}(t).
$$

Let us introduce the value:

$$
\delta_i^{(k)}(t) = -\frac{1}{2}\, \frac{\partial Q(t)}{\partial s_i^{(k)}(t)}; \qquad (5)
$$
then

$$
w_{ij}^{(k)}(t+1) = w_{ij}^{(k)}(t) + 2\eta\, \delta_i^{(k)}(t)\, x_j^{(k)}(t).
$$
The way (5) is calculated depends on the layer number. For the last layer, we have:

$$
\delta_i^{(L)}(t) = -\frac{1}{2}\, \frac{\partial}{\partial s_i^{(L)}(t)} \sum_{m=1}^{N_L} \left(\varepsilon_m^{(L)}(t)\right)^2 = \varepsilon_i^{(L)}(t)\, f_{out}'\left(s_i^{(L)}(t)\right).
$$
For a layer $k$ different from $L$, we have:

$$
\delta_i^{(k)}(t) = -\frac{1}{2}\, \frac{\partial Q(t)}{\partial s_i^{(k)}(t)} = -\frac{1}{2} \sum_{m=1}^{N_{k+1}} \frac{\partial Q(t)}{\partial s_m^{(k+1)}(t)} \cdot \frac{\partial s_m^{(k+1)}(t)}{\partial s_i^{(k)}(t)} = f'\left(s_i^{(k)}(t)\right) \sum_{m=1}^{N_{k+1}} \delta_m^{(k+1)}(t)\, w_{mi}^{(k+1)}(t).
$$
For a large number of samples, the network learning is carried out by minimizing the total network error summed over all the samples.
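A minimal sketch of the learning equations above, assuming the MaleActivation.DF derivative from the sketch in Section I and our own naming:

```csharp
// Deltas of a hidden layer k from the deltas of layer k + 1:
// delta_i^(k) = f'(s_i^(k)) * sum_m delta_m^(k+1) * w_mi^(k+1).
static double[] HiddenDeltas(double[] s, double[] deltaNext,
                             double[][] wNext, double l)
{
    var delta = new double[s.Length];
    for (int i = 0; i < s.Length; i++)
    {
        double sum = 0.0;
        for (int m = 0; m < deltaNext.Length; m++)
            sum += deltaNext[m] * wNext[m][i];
        delta[i] = MaleActivation.DF(s[i], l) * sum;
    }
    return delta;
}

// Weight correction w := w + 2 * eta * delta * x for one layer.
static void UpdateWeights(double[][] wk, double[] delta, double[] x, double eta)
{
    for (int i = 0; i < wk.Length; i++)
        for (int j = 0; j < x.Length; j++)
            wk[i][j] += 2.0 * eta * delta[i] * x[j];
}
```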
IV. EXPERIMENTAL RESULTS
To test the performance of male neurons, we designed two similar C# applications with the possibility of transmitting digit pictures via the Clipboard. One of the applications was used to select the number of neurons in the hidden layers so that the network had a good generalization capacity. In the other application, the number of neurons in the hidden layers was doubled: half of the neurons were ordinary and the other half were male ones.
Altogether, 480 samples were used for the neural network learning.
The neural network consists of 4 layers. The input layer has 144 neurons, corresponding to the number of input signals; the hidden layers have 80 neurons each (160 each for the neural network with male neurons); and the output layer has 10 neurons. The weights were initialized with random numbers in the range from -0.05 to 0.05. The learning rate was set equal to 0.02, and to 0.01 for male neurons.
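The reported configuration, collected in one sketch (the constant and method names are ours):

```csharp
using System;

const int InputNeurons  = 144;   // 12 x 12 picture cells
const int HiddenNeurons = 80;    // 160 in the network with male neurons
const int OutputNeurons = 10;    // digits 0..9
const double Eta = 0.02;         // learning rate (0.01 for male neurons)

// Weight initialization with uniform random numbers in [-0.05, 0.05].
static double[][] InitLayer(int neurons, int inputs, Random rng)
{
    var w = new double[neurons][];
    for (int i = 0; i < neurons; i++)
    {
        w[i] = new double[inputs];
        for (int j = 0; j < inputs; j++)
            w[i][j] = (rng.NextDouble() - 0.5) * 0.1;  // [-0.05, 0.05]
    }
    return w;
}
```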
The error back propagation method for the network learning is quick enough, taking about 8 minutes on an Acer Extensa 5620 notebook made in 2008, which is not remarkable for great productivity. The learning time is not essential for this problem, as the network needs to be trained only once, but quick learning makes debugging the software much easier.
A neural network with male neurons demonstrates a better generalization capacity. The value of 0.8 was taken for $l$, the half-width of the blind area.
Figure 2. Only the neural network with male neurons managed to recognize these digits correctly.
Figure 2 presents the cases when an ordinary network
could not cope with the recognition task but the neural
network with male neurons provided the correct result (the
digits from 0 to 9).
V. CONCLUSIONS
The completed experiments allow us to say that the application of male neurons in artificial multilayer neural networks is very promising: in such a case, a higher recognition quality is provided.


REFERENCES
[1] A. Makarovskiy, "Neuron activation functions modeling the types of male psyche," International Journal of Computer Information Systems, Vol. 3, No. 4, 2011.
[2] S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed. Prentice Hall, 1999.

AUTHORS PROFILE

Mr. A. Makarovskiy was born in Novosibirsk, Russia on May 8, 1966. In 1986 he graduated from the Moscow Institute of Transport Engineers. Since 2011 he has been a postgraduate student in computer science at the Moscow State Institute of Electronics and Mathematics (Technical University).

