
Perceptron Networks and Applications

M. Ali Akcayol
Gazi University
Department of Computer Engineering
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

2
Self organizing maps
 Perceptron, Adaline, backpropagation networks, and radial basis
function networks are based on supervised learning.
 The self organizing map (SOM) is the most prominent type of
artificial neural network model that uses unsupervised learning.
 A SOM is trained using an unsupervised learning algorithm.
 SOMs (also called Kohonen maps) belong to the category of
competitive learning networks.
 No human intervention is required during the training phase.
 Very little information about the characteristics of the input
data is needed.

3
Self organizing maps
 Unlike other ANNs, SOMs use a neighborhood function to
preserve the topological properties of the input space.
 In other words, a SOM provides a topology-preserving mapping
from a high-dimensional space to a lower-dimensional space.
 The relative distances between points are preserved during
the mapping.
 Points that are close to each other in the input space are
mapped to adjacent portions of the SOM.
 The SOM can thereby serve as a cluster analysis technique
for high-dimensional data.
 The SOM is also capable of generalizing.

4
Self organizing maps
 SOMs include two operating modes: training and mapping.
 Training refers to building the map using input examples
(vector quantization).
 Mapping refers to the classification of a new input vector.
 A SOM consists of elements called nodes or neurons.
 Each node is assigned a weight vector and a position in the
map space.
 The nodes are usually arranged in a regular two-dimensional
spacing, such as a hexagonal or rectangular grid (see the sketch below).
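As an illustration, the following sketch (Python with NumPy; the grid size and input dimension are arbitrary assumptions) builds a rectangular lattice of node positions and assigns each node a random weight vector:

  import numpy as np

  rows, cols = 10, 10   # rectangular 10x10 lattice (arbitrary size)
  m = 3                 # dimension of the input space, e.g. RGB colors

  # Lattice position of each node (used later for lateral distances)
  positions = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

  # One weight vector per node, with the same dimension as the input space
  rng = np.random.default_rng(seed=0)
  weights = rng.random((rows * cols, m))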

5
Self organizing maps
 In simple terms, a SOM can be considered a two-dimensional
assembly of neurons.
 SOM topologies can be one-, two- (most common) or even
three-dimensional.
 The two most used two-dimensional grids in SOMs are the
rectangular and the hexagonal grid.
 Three-dimensional topologies can take the form of a cylinder
or a toroid.

6
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

7
Architecture
 SOM networks are mostly designed as 1-D or 2-D lattices.

8
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

9
Training
 Unlike many other types of networks, a SOM does not require
a target output value.
 Instead, the region of the map whose node weights best match
the input vector is selectively optimized to more closely
resemble the data of the class that the input vector belongs to.
 Starting from an initial set of random weights, and after
many iterations, the SOM settles into a map of stable zones.
 Each zone is effectively a feature classifier.
 The graphical output can be regarded as a type of feature
map of the input space.

10
Training
 The algorithm utilized for the self organization of the network
is based on three fundamental processes:
 Competitive process
 Cooperative process
 Synaptic adaptation

11
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

12
Competitive process
 First of all, each node’s weights are initialized.
 A vector is selected randomly from the set of training data and
presented to the lattice.
 Every node's weights are examined to determine which are
most like the input vector.
 The winning node is referred to as the Best Matching Unit
(BMU).
 To determine the BMU, the Euclidean distance for all the
nodes is calculated.
 The node having the weight vector nearest to the input vector
is categorized as the winning node or the BMU.

13
Competitive process
 The steps for the competition are:
 Let the dimension of the input space be denoted by m.
 A pattern x is chosen randomly from the input space:

  x = [x_1, x_2, ..., x_m]^T

 Each node's synaptic weight vector in the output layer has the same
dimension as the input space. The weight vector of neuron j is denoted as:

  w_j = [w_{j1}, w_{j2}, ..., w_{jm}]^T,   j = 1, 2, ..., n

where n is the number of neurons in the output layer (see the sketch below).
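A minimal sketch of these two steps in Python/NumPy (the training set X and the layer size n are hypothetical values):

  import numpy as np

  rng = np.random.default_rng(seed=1)
  X = rng.random((100, 3))          # hypothetical training set: 100 patterns, m = 3
  n = 25                            # number of neurons in the output layer

  x = X[rng.integers(len(X))]       # a pattern chosen randomly from the input space
  W = rng.random((n, X.shape[1]))   # each w_j has the same dimension m as the input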

14
Competitive process
 The steps for the competition are:
 In order to determine the best match of the input vector x with the
synaptic weights w_j, the Euclidean distance is used.
 The Euclidean distance between a pair of vectors x and w_j is
given by:

  d(x, w_j) = ||x - w_j|| = sqrt( Σ_{i=1}^{m} (x_i - w_{ji})^2 )

where x is the current input vector and w_j is the node's weight
vector.

15
Competitive process
 The steps for the competition are:
 For example, the distance between the input vector red (1, 0, 0)
and an arbitrary weight vector (0.1, 0.4, 0.5) is:

  d = sqrt( (1 - 0.1)^2 + (0 - 0.4)^2 + (0 - 0.5)^2 ) = sqrt(1.22) ≈ 1.10

 The neuron with the minimum distance is denoted i(x):

  i(x) = arg min_j ||x - w_j||,   j = 1, 2, ..., n

 The neuron i that fulfills the above condition is called the best-
matching or winning neuron for the input vector x (see the sketch below).
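A small sketch (Python/NumPy; the two extra weight vectors are made up for illustration) that reproduces this computation and selects the winning neuron by arg min:

  import numpy as np

  x = np.array([1.0, 0.0, 0.0])        # input vector: the color red
  W = np.array([[0.1, 0.4, 0.5],       # weight vector from the example above
                [0.9, 0.1, 0.1],       # two additional arbitrary weight vectors
                [0.5, 0.5, 0.5]])

  distances = np.linalg.norm(W - x, axis=1)  # Euclidean distance to every node
  i_x = int(np.argmin(distances))            # index of the winning neuron i(x)

  print(distances[0])  # sqrt(1.22) ≈ 1.1045, as in the example above
  print(i_x)           # node 1 is nearest to red, so it is the BMU here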

16
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

17
Cooperative process
 In this step, the radius of the neighborhood around the best
matching unit is calculated.
 This value starts large, typically set to the ‘radius’ of the
lattice, and diminishes with each iteration.
 All nodes that lie within this radius are considered to be
inside the BMU’s neighborhood.
 After determining the BMU, the other nodes within the BMU’s
neighborhood must be identified.
 This is done simply by calculating the radius of the
neighborhood.
 The weight vectors of all nodes inside it will be adjusted in
the next step.

18
Cooperative process
 At the start of training, almost all nodes fall within the radius.
 The winning neuron effectively locates the center of a topological
neighborhood.
 A winning neuron excites other neurons.
 The neurons in its immediate neighborhood are excited the most.
 Neurons at longer distances are increasingly inhibited.
19
Cooperative process
Cooperation – cont.
 The neighborhood includes only the excited neurons; inhibited
neurons lie outside of it.
 Let d_{j,i} be the lateral distance between neurons j and i
(assuming that i is the winner, located at the center of
the neighborhood).
 If h_{j,i} denotes the topological neighborhood around the winning
neuron i, then h_{j,i} is a unimodal function of this distance:

  h_{j,i} = exp( -d_{j,i}^2 / (2σ^2) )

where σ is the effective width of the neighborhood.
 σ measures the degree to which the excited neurons in the
area of the winning neuron participate in the learning process.
20
Cooperative process
Cooperation – cont.
 The distance between neurons is defined by the Euclidean
metric:

  d_{j,i}^2 = ||r_j - r_i||^2

where the discrete vector r_j describes the position of excited
neuron j and r_i the position of the winning neuron i in
the lattice.
 Another characteristic feature of the SOM algorithm is that the
size of the neighborhood shrinks with time.
 This requirement is satisfied by making the width σ of the
Gaussian function decrease with time (a sketch of the neighborhood computation follows).
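A sketch of these two formulas in Python/NumPy (node positions on the lattice as in the earlier sketch are assumed):

  import numpy as np

  def neighborhood(positions, winner, sigma):
      """Gaussian topological neighborhood h_{j,i} around winning neuron i."""
      d2 = np.sum((positions - positions[winner]) ** 2, axis=1)  # d_{j,i}^2 = ||r_j - r_i||^2
      return np.exp(-d2 / (2.0 * sigma ** 2))                    # h_{j,i} = exp(-d^2 / (2 sigma^2))

The function equals 1 at the winner itself and decays smoothly toward 0 for neurons farther away on the lattice; shrinking sigma over time narrows the excited region.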

21
Cooperative process
 A popular choice is the exponential decay function:

  σ(t) = σ_0 exp( -t / τ_1 ),   t = 0, 1, 2, ...

where σ_0 is the value of σ at the initialization of the SOM
algorithm and τ_1 is a time constant.
 Correspondingly, the neighborhood function assumes a time-dependent
form (see the sketch below):

  h_{j,i}(t) = exp( -d_{j,i}^2 / (2σ(t)^2) ),   t = 0, 1, 2, ...
22
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

23
Synaptic adaptation
 Every node in the lattice is checked to determine whether it lies
within the radius.
 If a node is present in the neighborhood, its weight vector
is adjusted.
 The amount by which a node's weights are altered depends on
its distance from the BMU.
 The adaptive process modifies the weights of the network so
as to achieve its self-organization.
 Only the weights of the winning neuron and of the neurons
inside its neighborhood are adapted.
 All other neurons keep their weights unchanged.

24
Synaptic adaptation
 The weights of the winning neuron and of the neurons inside its
neighborhood are changed according to the update rule:

  w_j(t+1) = w_j(t) + η(t) h_{j,i}(t) ( x - w_j(t) )

 The learning rate η is also required to be time varying:

  η(t) = η_0 exp( -t / τ_2 )

 A convergence phase is needed to fine-tune the feature map.
 In general, the number of iterations needed for this phase is
500 times the total number of neurons.
 During this phase, η must be kept at a small value (e.g., 0.01);
a sketch of one adaptation step follows.
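A sketch of one adaptation step under this rule in Python/NumPy (η_0, τ_2, and the helper name are assumptions; positions and weights as in the earlier sketches):

  import numpy as np

  eta0, tau2 = 0.1, 1000.0   # initial learning rate and its time constant (assumed)

  def adapt(weights, positions, x, winner, t, sigma_t):
      """One synaptic adaptation step: w_j <- w_j + eta(t) * h_{j,i}(t) * (x - w_j)."""
      eta_t = eta0 * np.exp(-t / tau2)                           # time-varying learning rate
      d2 = np.sum((positions - positions[winner]) ** 2, axis=1)  # squared lateral distances
      h = np.exp(-d2 / (2.0 * sigma_t ** 2))                     # neighborhood weighting
      weights += eta_t * h[:, None] * (x - weights)              # move weights toward x
      return weights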
25
Training
 Flow diagram of the Kohonen SOM algorithm.
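Putting the three processes together, a minimal end-to-end training loop might look as follows (a rough sketch, not the exact flow of the diagram; grid size, schedules, and data are assumed):

  import numpy as np

  rng = np.random.default_rng(seed=0)
  rows, cols, m = 10, 10, 3                     # assumed lattice and input dimensions
  n_iter = 500 * rows * cols                    # rule of thumb: 500 x number of neurons
  X = rng.random((200, m))                      # hypothetical training data

  positions = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
  weights = rng.random((rows * cols, m))
  sigma0 = max(rows, cols) / 2.0                # start with a wide neighborhood
  eta0, tau = 0.1, n_iter / 4.0                 # assumed learning rate and time constant

  for t in range(n_iter):
      x = X[rng.integers(len(X))]                               # competition: random pattern
      winner = np.argmin(np.linalg.norm(weights - x, axis=1))   # best matching unit
      sigma_t = sigma0 * np.exp(-t / tau)                       # cooperation: shrinking width
      d2 = np.sum((positions - positions[winner]) ** 2, axis=1)
      h = np.exp(-d2 / (2.0 * sigma_t ** 2))                    # Gaussian neighborhood
      eta_t = eta0 * np.exp(-t / tau)                           # adaptation: decaying rate
      weights += eta_t * h[:, None] * (x - weights)             # update toward the input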

26
Content
 Self organizing maps
 Architecture
 Training
 Competitive process
 Cooperative process
 Synaptic adaptation
 Applications

27
Applications
World Poverty Map
 The Self-Organizing Map (SOM) can be used to portray complex
correlations in statistical data.
 In this example, the data consisted of World Bank statistics of
countries in 1992.
 39 indicators describing various quality-of-life factors, such as
state of health, nutrition, and educational services, were used.
 The complex joint effect of these factors can be visualized by
organizing the countries using the SOM.
28
Applications
World Poverty Map – cont.
 Countries that had similar values of the indicators found a
place near each other on the map.
 The different clusters on the map were automatically encoded
with different bright colors.
 Each country was in fact automatically assigned a color
describing its poverty type in relation to other countries.
 The poverty structures of the world were thus visualized: each
country on the geographic map was colored according to its
poverty type.

29
Applications
World Poverty Map – cont.

30
Applications
World Poverty Map – cont.
 A map of the world in which each country has been colored
according to its poverty type.

31
Applications
Animal similarity
 Animals can be grouped using their attributes.
 In this example, 16 animals with 13 attributes are used.

32
Applications
Animal similarity – cont.
 After the SOM is trained, the animals are grouped according
to their similarities.

33
Applications
Color similarity
 In the figure, the similarity between thousands of randomly
distributed points, each with different properties, can be found
with a SOM network.

34
Homework

 Prepare a report on the use of self organizing maps in medical applications.

35
