A Seminar On: Artificial Neural Network
NEURAL NETWORK…
What is a Neural Network?
Adaptive learning: An ability to learn how to do tasks based on the data given
for training or initial experience.
A NEURAL NETWORK (NN)
COMPARISON BETWEEN COMPUTER
SYSTEMS & NEURAL NETWORKS
Computer systems are good at fast arithmetic.
They are also good at doing precisely
what the programmer programs them to do.
But at the same time, computer systems are not
so good at –
• interacting with noisy data.
• massive parallelism.
• fault tolerance.
• adapting to circumstances.
CONTD..
A SIMPLE NEURON
Firing rules
• The firing rule is an important concept in neural networks and
accounts for their high flexibility. A firing rule determines how
one calculates whether a neuron should fire for any input
pattern. It relates to all the input patterns, not only the ones on
which the node was trained.
• A simple firing rule can be implemented using the Hamming
distance technique. The rule goes as follows:
• Take a collection of training patterns for a node, some of which
cause it to fire (the 1-taught set of patterns) and others which
prevent it from doing so (the 0-taught set). Then the patterns
not in the collection cause the node to fire if, on comparison,
they have more input elements in common with the 'nearest'
pattern in the 1-taught set than with the 'nearest' pattern in the
0-taught set. If there is a tie, then the pattern remains in the
undefined state.
CONTD…
• For example, a 3-input neuron is taught to output 1 when the input
(X1,X2 and X3) is 111 or 101 and to output 0 when the input is 000 or
001. Then, before applying the firing rule, the truth table is:
X1: 0 0 0 0 1 1 1 1
X2: 0 0 1 1 0 0 1 1
X3: 0 1 0 1 0 1 0 1
OUT: 0 0 0/1 0/1 0/1 1 0/1 1
As an example of the way the firing rule is applied, take the pattern
010.
It differs from 000 in 1 element, from 001 in 2 elements, from 101 in 3
elements and from 111 in 2 elements. Therefore, the 'nearest' pattern is
000 which belongs in the 0-taught set. Thus the firing rule requires that
the neuron should not fire when the input is 010. On the other hand,
011 is equally distant from two taught patterns that have different
outputs and thus the output stays undefined (0/1).
CONTD..
• By applying the firing rule to every column, the
following truth table is obtained:
X1: 0 0 0 0 1 1 1 1
X2: 0 0 1 1 0 0 1 1
X3: 0 1 0 1 0 1 0 1
OUT: 0 0 0 0/1 0/1 1 1 1
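The Hamming-distance firing rule described above can be sketched in a few lines of Python. This is an illustrative implementation, not code from the seminar: for each input pattern it finds the distance to the nearest 1-taught and nearest 0-taught pattern, fires on a strict win for the 1-taught set, and reports "0/1" on a tie.

```python
def firing_rule(pattern, one_taught, zero_taught):
    """Hamming-distance firing rule: output '1' if the pattern is closer
    to the nearest 1-taught pattern than to the nearest 0-taught pattern,
    '0' in the opposite case, and '0/1' (undefined) on a tie."""
    def hamming(a, b):
        # Number of positions in which the two bit strings differ.
        return sum(x != y for x, y in zip(a, b))

    d1 = min(hamming(pattern, p) for p in one_taught)
    d0 = min(hamming(pattern, p) for p in zero_taught)
    if d1 < d0:
        return "1"
    if d0 < d1:
        return "0"
    return "0/1"

one_taught = ["111", "101"]   # patterns taught to fire
zero_taught = ["000", "001"]  # patterns taught not to fire

# Reproduces the generalised truth table above, column by column.
for x in ["000", "001", "010", "011", "100", "101", "110", "111"]:
    print(x, firing_rule(x, one_taught, zero_taught))
```

Running this prints 0, 0, 0, 0/1, 0/1, 1, 1, 1 for the eight input patterns, matching the generalised table.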
• For example:
The network of figure 1 is trained to recognize the
patterns T and H. The associated output patterns are all
black and all white respectively, as shown below.
• If we represent black squares with 0 and white
squares with 1 then the truth tables for the 3 neurons
after generalisation are:
TOP NEURON
X11: 0 0 0 0 1 1 1 1
X12: 0 0 1 1 0 0 1 1
X13: 0 1 0 1 0 1 0 1
OUT: 0 0 1 1 0 0 1 1
• MIDDLE NEURON
X21: 0 0 0 0 1 1 1 1
X22: 0 0 1 1 0 0 1 1
X23: 0 1 0 1 0 1 0 1
OUT: 1 0/1 1 0/1 0/1 0 0/1 0
• BOTTOM NEURON
X31: 0 0 0 0 1 1 1 1
X32: 0 0 1 1 0 0 1 1
X33: 0 1 0 1 0 1 0 1
OUT: 1 0 1 1 0 0 1 0
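The three generalised truth tables can be applied directly as lookup tables. The sketch below assumes the wiring of figure 1 (not reproduced here): each neuron sees one 3-pixel row of the 3×3 image, with black encoded as 0 and white as 1. The example T drawing (black top row and centre column on a white background) is our assumption, not taken from the slides.

```python
# The three generalised truth tables, keyed by the neuron's three
# inputs concatenated as a bit string; "0/1" marks an undefined output.
top = {"000": "0", "001": "0", "010": "1", "011": "1",
       "100": "0", "101": "0", "110": "1", "111": "1"}
middle = {"000": "1", "001": "0/1", "010": "1", "011": "0/1",
          "100": "0/1", "101": "0", "110": "0/1", "111": "0"}
bottom = {"000": "1", "001": "0", "010": "1", "011": "1",
          "100": "0", "101": "0", "110": "1", "111": "0"}

def evaluate(image_rows):
    """Feed one 3-pixel row to each neuron (assumed wiring, one neuron
    per row of the 3x3 image) and return the three outputs."""
    return [table[row] for table, row in zip((top, middle, bottom), image_rows)]

# A 'T' drawn in black (0) on white (1): black top row, black centre column.
print(evaluate(["000", "101", "101"]))  # → ['0', '0', '0'], the all-black association
```

Under the same encoding, an H (rows "010", "000", "010") maps to the all-white association, ['1', '1', '1'].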
Perceptrons
The most influential work on neural nets in the 60's went under
the heading of 'Perceptrons', a term coined by Frank Rosenblatt.
The perceptron turns out to be an MCP model (neuron with
weighted inputs) with some additional, fixed pre-processing.
Units labelled A1, A2, Aj, Ap are called association units and
their task is to extract specific, localised features from the input
images. Perceptrons mimic the basic idea behind the
mammalian visual system. They were mainly used in pattern
recognition, even though their capabilities extended far beyond that.
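A single perceptron of this kind (a neuron with weighted inputs and a threshold) can be sketched and trained with the classic perceptron learning rule. The AND function, learning rate, and initial weights below are illustrative choices, not values from the seminar.

```python
def step(x):
    # Threshold activation: fire (1) when the weighted sum reaches 0.
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Perceptron learning rule: nudge each weight by
    lr * (target - output) * input after every sample."""
    w = [0.0, 0.0]  # weights for the two inputs
    b = 0.0         # bias (negative threshold)
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Teach the logical AND function, which is linearly separable.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
for (x1, x2), target in and_samples:
    print(x1, x2, "->", step(w[0] * x1 + w[1] * x2 + b))
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the rule settles on correct weights after finitely many updates; an XOR target, by contrast, would never converge with a single unit.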
The learning process
• The brain basically learns from experience. The
memorisation of patterns and the subsequent
response of the network can be categorised into two
general paradigms.
• Perhaps the most exciting aspect of neural networks is the possibility that
some day 'conscious' networks might be produced. A number of
scientists argue that consciousness is a 'mechanical' property and that
'conscious' neural networks are a realistic possibility.
• Finally, I would like to state that even though neural networks have huge
potential, we will only get the best out of them when they are integrated with
conventional computing, AI, fuzzy logic and related subjects.
THANK YOU…