
UNSUPERVISED LEARNING NETWORK:
KOHONEN NEURAL NETWORK

SUBMITTED BY
DIVYA S S
TVE23ECAI05
UNSUPERVISED LEARNING NETWORK
 It is a type of machine learning.
 It learns from data without human supervision.
 It finds the underlying structure of the dataset, groups the data according
to similarities, and represents the dataset in a compressed format.
Types
 Clustering: It is the method of grouping objects into clusters such
that objects with the most similarities remain in one group and those with
few or no similarities fall into other groups.
 Association: It is used for finding relationships between variables in a
large dataset. For example, people who buy item X also tend to buy item Y.
Contd…
 Dimensionality reduction: It is a way of converting a higher-dimensional
dataset into a lower-dimensional dataset while ensuring that it still conveys
similar information (a short code sketch of these techniques follows below).
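
As a rough illustration of the clustering and dimensionality-reduction types above, the following sketch uses scikit-learn (an assumption; the slides do not name any library) to group unlabeled data with KMeans and compress it with PCA. The data and variable names are purely illustrative.

# Minimal sketch, assuming scikit-learn is available; not taken from the slides.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # 200 unlabeled samples with 5 features

# Clustering: group similar samples without supervision.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Dimensionality reduction: compress 5 features down to 2
# while preserving as much variance as possible.
X_2d = PCA(n_components=2).fit_transform(X)

print(labels[:10], X_2d.shape)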
Kohonen Neural Network
 Kohonen networks are a type of neural network, also
known as self-organizing maps (SOMs).
 They are used for unsupervised learning and data visualization.
 These maps organize data onto a lower-dimensional grid.
 The network's neurons, arranged in a grid, learn to represent
different features of the input data through a competitive
learning process.
 In competitive learning, the artificial neurons in a network
compete to become the active response to a specific input.
 It has two layers: an input layer and an output layer.
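
To make the competitive-learning idea concrete, here is a minimal NumPy sketch (the grid size and names are my own choices, not from the slides): every output-layer neuron on a small 2D grid holds a weight vector of the same dimension as the input, and the neuron whose weights are closest to the input wins the competition.

# Illustrative sketch of competition on a Kohonen grid (assumed setup).
import numpy as np

rng = np.random.default_rng(0)
grid_h, grid_w, dim = 10, 10, 3
weights = rng.random((grid_h, grid_w, dim))   # one weight vector per neuron

x = rng.random(dim)                           # one input vector

# Competition: the neuron whose weight vector is closest to x "wins".
dists = np.linalg.norm(weights - x, axis=2)   # Euclidean distance per neuron
winner = np.unravel_index(np.argmin(dists), dists.shape)
print("best-matching unit:", winner)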
SOM Training Algorithm
1. Initialization: Initialize the SOM grid with random weights
for each neuron. These weights are typically small random
values.
2. Input data selection: Randomly select a data point from
the dataset.
3. Competitive phase (or Neuron Selection):
1. Compute the similarity between the input data point and
each neuron's weight vector.
2. Determine the winning neuron (the one with the weight
vector most similar to the input data).
3. This winning neuron and its neighboring neurons will be
adjusted in the next step.
Contd…
4. Cooperative phase (or Weight Update):
1. Update the weights of the winning neuron and its neighbors to be more
similar to the input data point.
2. Neighboring neurons are determined by a neighborhood function
that decreases over time (usually a Gaussian function).
3. The weights are adjusted to pull closer to the input data point in the input
space.
5. Iteration:
1. Repeat steps 2-4 for a specified number of iterations or until convergence
criteria are met.
2. Common convergence criteria include reaching a certain number of
iterations, achieving a small enough change in the map, or meeting a
threshold for the neighborhood size.
Contd…
6. Learning Rate and Neighborhood Size Decay:
1. Typically, the learning rate and the neighborhood size decrease over time
so that the SOM gradually converges to a stable solution.
2. This decay can be linear or exponential, and it ensures that the SOM
reaches a stable configuration without oscillations.
7. Finalization:
1. Once training is complete, the SOM can be used for tasks
such as data visualization, clustering, or further analysis
(a compact sketch of the whole training loop follows below).
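
The following is a compact NumPy sketch that strings steps 1-7 together. The grid size, exponential decay schedule, and iteration count are illustrative assumptions rather than values given in the slides.

# Minimal SOM training loop, assuming a 10x10 grid and exponential decay.
import numpy as np

def train_som(data, grid_h=10, grid_w=10, n_iters=1000, lr0=0.5, sigma0=3.0):
    rng = np.random.default_rng(0)
    dim = data.shape[1]
    # 1. Initialization: small random weights for every neuron on the grid.
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of every neuron, used by the neighborhood function.
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)
    tau = n_iters / np.log(sigma0)            # decay time constant

    for t in range(n_iters):
        # 2. Input data selection: pick a random data point.
        x = data[rng.integers(len(data))]
        # 3. Competitive phase: find the best-matching unit (BMU).
        dists = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # 6. Learning rate and neighborhood size decay (exponential here).
        lr = lr0 * np.exp(-t / n_iters)
        sigma = sigma0 * np.exp(-t / tau)
        # 4. Cooperative phase: Gaussian neighborhood around the BMU.
        grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_dist2 / (2 * sigma ** 2))
        # Pull the BMU and its neighbors toward the input point.
        weights += lr * h[..., None] * (x - weights)
    # 7. Finalization: the trained map can be used for visualization or clustering.
    return weights

# Example usage on random 3-dimensional data (step 5 is the loop above).
som = train_som(np.random.default_rng(1).random((500, 3)))
print(som.shape)   # (10, 10, 3)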
Thank you
