NN Mat Workshop
By
Dr. M. Sivachitra
Dr. M. Karthik
Department of EEE
Contents
• Introduction to ANN
• Dataset information
• Dataset preparation – sample
• ELM for Prediction (Wind Profile)
• ELM for Classification (Haberman's, EEG and IS data) – both normalised and unnormalised
• Complex valued neural networks(CVNN)
• PE_CELM for Classification(IS data)
• Conclusion
Introduction to Artificial Neural
Networks
A new sort of computer
• What are (everyday) computer systems good at... and
not so good at?
Good at:
• Rule-based systems: doing what the programmer wants them to do

Not so good at:
• Dealing with noisy data
• Dealing with unknown environment data
• Massive parallelism
• Fault tolerance
• Adapting to circumstances
Neural networks
• Neural network: information processing
paradigm inspired by biological nervous
systems, such as our brain
• Structure: large number of highly
interconnected processing elements (neurons)
working together
• Like people, they learn from experience (by
example)
Neural networks
• Neural networks are configured for a specific
application, such as pattern recognition or
data classification, through a learning process
• In a biological system, learning involves
adjustments to the synaptic connections
between neurons
• The same holds for artificial neural networks (ANNs)
Where can neural network systems help?
ASSOCIATION OF BIOLOGICAL NET
WITH ARTIFICIAL NET
BUILDING BLOCKS OF ARTIFICIAL
NEURAL NET
Network Architecture (Connection
between Neurons)
Activation Function
ARTIFICIAL NEURAL NET
[Diagram: two inputs X1 and X2 with weights W1 and W2 feeding a single output neuron Y]
OPERATION OF A NEURAL NET
[Diagram: inputs x0 … xn with weights w0j … wnj (x0 carrying the bias), summed (Σ) and passed through activation f to produce output y]
Architecture
[Diagram: input units Xi connect to hidden units Zj through weights Vi,j; hidden units Zj connect to output units Zk through weights Wj,k]
Basics of NN
• Input Vector
• Output Vector
• Activation
• Signal function or Activation function
• Weight
• Bias
PROCESSING OF AN ARTIFICIAL NET
The neuron is the basic information processing
unit of a NN. It consists of:
1. A set of links, describing the neuron inputs, with
weights W1, W2, …, Wm.
2. An adder function (linear combiner) for computing
the weighted sum of the (real-valued) inputs:

   u = Σ_{j=1}^{m} W_j X_j

3. An activation function producing the output from the
biased sum:

   y = φ(u + b)
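As a minimal sketch, the adder-plus-activation step above can be written in Python; NumPy and the tanh activation are illustrative choices, not prescribed by the slide.

```python
import numpy as np

def neuron_output(x, w, b, f=np.tanh):
    """Single neuron: weighted sum of inputs plus bias, through activation f."""
    u = np.dot(w, x)      # u = sum_j W_j * X_j  (adder / linear combiner)
    return f(u + b)       # y = f(u + b)

# Example: two inputs with weights and a small bias
y = neuron_output(np.array([0.5, -0.2]), np.array([0.8, 0.4]), b=0.1)
```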
MULTI LAYER ARTIFICIAL
NEURAL NET
INPUT: records without the class attribute, with
normalized attribute values.
INPUT VECTOR: X = { x1, x2, …, xn}
where n is the number of (non-class) attributes.
INPUT LAYER: there are as many nodes as non-
class attributes, i.e. as the length of the input vector.
HIDDEN LAYER: the number of nodes in the hidden
layer and the number of hidden layers depends on
implementation.
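The input–hidden–output structure described above can be sketched as a forward pass; the layer sizes, tanh activation, and random weights below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 5, 2          # sizes chosen for illustration

V = rng.normal(size=(n_hidden, n_in))    # input-to-hidden weights V_ij
W = rng.normal(size=(n_out, n_hidden))   # hidden-to-output weights W_jk
b_h = np.zeros(n_hidden)                 # hidden biases
b_o = np.zeros(n_out)                    # output biases

def forward(x):
    z_j = np.tanh(V @ x + b_h)           # hidden-layer activations Z_j
    z_k = np.tanh(W @ z_j + b_o)         # output-layer activations Z_k
    return z_k

out = forward(np.array([0.2, -0.1, 0.4]))
```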
LAYER PROPERTIES
• Input Layer: Each input unit may be
designated by an attribute value possessed
by the instance.
TRAINING PROCESS
Supervised Training - Providing the network
with a series of sample inputs and
comparing the output with the expected
responses.
Activation functions:
(A) Identity
(B) Binary step
(C) Bipolar step
(D) Binary sigmoidal
(E) Bipolar sigmoidal
(F) Ramp
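The six activation functions listed above, sketched in NumPy; the ramp is assumed to be linear on [0, 1] and saturated outside, which is the common convention.

```python
import numpy as np

def identity(x):        return x
def binary_step(x):     return np.where(x >= 0, 1, 0)     # outputs {0, 1}
def bipolar_step(x):    return np.where(x >= 0, 1, -1)    # outputs {-1, 1}
def binary_sigmoid(x):  return 1.0 / (1.0 + np.exp(-x))   # range (0, 1)
def bipolar_sigmoid(x): return np.tanh(x / 2.0)           # range (-1, 1); equals 2*sigmoid(x) - 1
def ramp(x):            return np.clip(x, 0.0, 1.0)       # linear on [0, 1], saturated outside
```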
FEW APPLICATIONS OF NEURAL
NETWORKS
CONSTRUCTING ANN
• Determine the network properties:
  • Network topology
  • Types of connectivity
  • Order of connections
  • Weight range
• Determine the node properties:
  • Activation range
• Determine the system dynamics:
  • Weight initialization scheme
  • Activation-calculating formula
  • Learning rule
PROBLEM SOLVING
Select a suitable NN model based on the nature
of the problem.
Applications
SAMPLE - INFORMATION
Data has been collected from the UCI machine learning repository.
No. of classes: 2
Total samples: 305
Total training samples: 244
Total testing samples: 61

Attribute   Domain
Age         [30, 83]
Year        [1958, 1970]
Positive    [0, 52]
Survival    {Positive, Negative}
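A sketch of the 244/61 split with min-max normalisation (the slides cover both normalised and unnormalised variants); a randomly generated array stands in for the real UCI data here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the 305 Haberman-style samples (the real data comes from the
# UCI repository): columns = age, year, positive nodes, survival class.
data = rng.uniform(size=(305, 4))

rng.shuffle(data)                        # shuffle before splitting
train, test = data[:244], data[244:]     # 244 training / 61 testing samples

x_train, x_test = train[:, :3], test[:, :3]
y_train, y_test = train[:, 3], test[:, 3]

# Min-max normalisation using training-set statistics only
lo, hi = x_train.min(axis=0), x_train.max(axis=0)
x_train_n = (x_train - lo) / (hi - lo)
x_test_n  = (x_test - lo) / (hi - lo)
```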
Planning and relaxed data – UCI dataset
Input attributes:
12 wavelet coefficients obtained through wavelet
packet analysis in the 7–13 Hz frequency band
Output attribute:
Mental state (class attribute)
1 = Planning state
2 = Relaxed state
DATASET INFORMATION
Data has been collected from the UCI machine learning repository.
No. of classes: 2
Total samples: 182
Total training samples: 91
Total testing samples: 91
Training data (50%) and testing data (50%) with two-fold cross-validation
Wind Profile Prediction dataset information
Data source: http://mesonet.agron.iastate.edu/request/awos/1min.php
The data were averaged over 1-h intervals.

Inputs:  a1+ib1  a2+ib2  …  a5+ib5   (1st hr … 5th hr)
Target:  a6+ib6  (6th hr)
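One way to form the complex inputs a+ib is to combine hourly wind speed and direction as z = v·e^(iθ); this particular encoding is an assumption for illustration, not stated on the slide, and the hourly values below are made up.

```python
import numpy as np

def wind_to_complex(speed, direction_deg):
    """Encode wind speed and direction as one complex number z = v * exp(i*theta).
    This encoding is an assumed choice for complex-valued networks."""
    theta = np.deg2rad(direction_deg)
    return speed * np.exp(1j * theta)

# Hypothetical hourly averages for hours 1-6; the first five complex values
# are the network inputs, the sixth is the prediction target.
speeds = np.array([4.1, 4.5, 5.0, 5.2, 4.8, 4.6])
dirs   = np.array([120, 125, 130, 140, 150, 155])
z = wind_to_complex(speeds, dirs)
inputs, target = z[:5], z[5]
```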
EXTREME LEARNING MACHINE
ELM ARCHITECTURE
LEARNING STEPS FOR ELM
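A minimal sketch of the standard ELM learning steps: assign random input weights and biases, compute the hidden-layer output matrix H, then solve the output weights analytically as β = H⁺T (Moore-Penrose pseudoinverse). Hidden-layer size, tanh activation, and the toy regression target are illustrative.

```python
import numpy as np

def elm_train(X, T, n_hidden, rng=None):
    """Train an ELM: random hidden parameters, analytic output weights."""
    if rng is None:
        rng = np.random.default_rng(0)
    W = rng.normal(size=(X.shape[1], n_hidden))  # step 1: random input weights
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # step 2: hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                 # step 3: beta = pinv(H) @ T
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: learn y = x1 + x2 from 200 random samples
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
T = X.sum(axis=1, keepdims=True)
W, b, beta = elm_train(X, T, n_hidden=30)
pred = elm_predict(X, W, b, beta)
```

Because the output weights are solved in one least-squares step rather than by iterative gradient descent, training is fast; the trade-off is that the random hidden layer usually needs more units than a fully trained one.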
Introduction to Complex Valued Neural Networks (CVNNs)
Synthetic Function Approximation Problem