Week 3 - Lab

[Introduction to Fuzzy/Neural Systems]

Module 3 Exp 3: On Hebbian Learning Process Methods

Course Learning Outcomes:


C4. Conduct experiments using advanced tools related to fuzzy and neural
systems.
C5. Extend existing knowledge in designing a feedback fuzzy controller and
single/multilayer neural networks based on the given Industrial
requirements.
C7. Interpret and evaluate through written and oral forms.
C8. Lead multiple groups and projects with decision making responsibilities.
Topics
Exp 3: On Hebbian Learning Process methods
1. Aim/Objective:
What is the objective of this experiment?
2. Theory
Provide an explanation of Hebbian learning process methods
3. Required Components
Python programming software / https://www.kaggle.com/
4. Procedure:
Step-by-step implementation
5. Program
Python program to implement the experiment
6. Result
State the output with figures
7. Conclusion
Summarize what you have learned and performed in the experiment

Course Module
Hebb's rule

Hebb's rule was proposed as a conjecture in 1949 by the Canadian psychologist
Donald Hebb to describe the synaptic plasticity of natural neurons. A few years after its
publication, the rule was confirmed by neurophysiological studies, and many research
studies have since shown its validity in many applications of Artificial Intelligence. Before
introducing the rule, it is useful to describe the generic Hebbian neuron, as shown in the
following diagram:

Generic Hebbian neuron with a vectorial input

The neuron is a simple computational unit that receives an input vector x from the pre-
synaptic units (other neurons or perceptive systems) and outputs a single scalar value, y.
The internal structure of the neuron is represented by a weight vector, w, that models the
strength of each synapse. For a single multi-dimensional input, the output is obtained as
the weighted sum of the inputs:

y = w · x = Σᵢ wᵢ·xᵢ

Hebb's rule then strengthens each weight in proportion to the correlation between its
input and the output: Δwᵢ = η·y·xᵢ, where η is the learning rate (η = 1 in the programs
that follow).
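As a minimal illustration of the weighted-sum output (the input and weight values below are arbitrary, chosen only for the example):

```python
# Weighted-sum output of a generic Hebbian neuron: y = w . x
x = [1.0, -2.0, 1.5]   # hypothetical input vector
w = [0.5, -1.0, 2.0]   # hypothetical weight vector

y = sum(wi * xi for wi, xi in zip(w, x))
print(y)  # 0.5*1 + (-1)*(-2) + 2*1.5 = 5.5
```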

Python program

# Hebbian learning for a single neuron with two inputs
import math

x1 = int(input("enter x1: "))
x2 = int(input("enter x2: "))
w11 = int(input("enter w11: "))
w21 = int(input("enter w21: "))

# Net input with a bias of 1
yin1 = 1 + (x1 * w11 + x2 * w21)
print("Yin1 is", yin1)

# Sigmoid activation: y = 1 / (1 + e^(-net))
y = 1 / (1 + math.exp(-yin1))
print("y is", y)

# Hebbian weight update: delta_w = y * x (learning rate = 1)
wi1 = y * x1
wi2 = y * x2
w11new = w11 + wi1
w21new = w21 + wi2
print("w11new is", w11new)
print("w21new is", w21new)

# Second iteration with the updated weights
yin1new = 1 + (x1 * w11new + x2 * w21new)
ynew = 1 / (1 + math.exp(-yin1new))
print("value of y in second iteration is:", ynew)
or
# Learning Rules #
import math

def computeNet(input, weights):
    # Weighted sum of inputs
    net = 0
    for i in range(len(input)):
        net = net + input[i] * weights[i]
    print("NET:")
    print(net)
    return net

def computeFNetBinary(net):
    # Bipolar hard-limit activation
    f_net = 0
    if net > 0:
        f_net = 1
    if net < 0:
        f_net = -1
    return f_net

def computeFNetCont(net):
    # Bipolar continuous (sigmoid-based) activation
    f_net = (2 / (1 + math.exp(-net))) - 1
    return f_net

def hebb(f_net):
    # Hebb rule: r = f(net)
    return f_net

def perceptron(desired, actual):
    # Perceptron rule: r = d - f(net)
    return desired - actual

def widrow(desired, actual):
    # Widrow-Hoff (delta) rule: r = d - net
    return desired - actual

def adjustWeights(inputs, weights, last, binary, desired, rule):
    c = 1  # learning rate
    if last:
        print("COMPLETE")
        return
    current_input = inputs[0]
    inputs = inputs[1:]
    current_desired = None
    if desired:
        current_desired = desired[0]
        desired = desired[1:]
    if len(inputs) == 0:
        last = True
    net = computeNet(current_input, weights)
    if binary:
        f_net = computeFNetBinary(net)
    else:
        f_net = computeFNetCont(net)
    if rule == "hebb":
        r = hebb(f_net)
    elif rule == "perceptron":
        r = perceptron(current_desired, f_net)
    elif rule == "widrow":
        r = widrow(current_desired, net)
    del_weights = []
    for i in range(len(current_input)):
        x = (c * r) * current_input[i]
        del_weights.append(x)
        weights[i] = weights[i] + x  # w_new = w_old + c*r*x
    print("NEW WEIGHTS:")
    print(weights)
    adjustWeights(inputs, weights, last, binary, desired, rule)

if __name__ == "__main__":
    # total_inputs = int(input("Enter Total Number of Inputs: "))
    # vector_length = int(input("Enter Length of Vector: "))
    total_inputs = 3
    vector_length = 4
    # for i in range(vector_length):
    #     weights.append(float(input("Enter Initial Weight: ")))
    weights = [1, -1, 0, 0.5]
    inputs = [[1, -2, 1.5, 0], [1, -0.5, -2, -1.5], [0, 1, -1, 1.5]]
    desired = [1, 2, 1, -1]
    print("BINARY HEBB!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, True, None, "hebb")
    print("CONTINUOUS HEBB!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, False, None, "hebb")
    print("PERCEPTRON!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, True, desired, "perceptron")
    print("WIDROW HOFF!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, True, desired, "widrow")
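The recursive `adjustWeights` above applies one update per training vector. As a cross-check, the same binary Hebb pass can be sketched iteratively (this helper is not part of the listing above; it uses the same inputs, initial weights, and learning rate c = 1):

```python
def sign(net):
    # Bipolar hard-limit activation, 0 when net == 0
    return 1 if net > 0 else (-1 if net < 0 else 0)

def hebb_pass(inputs, weights, c=1):
    # One sweep of the binary Hebb rule: w <- w + c * f(net) * x
    w = list(weights)
    for x in inputs:
        net = sum(wi * xi for wi, xi in zip(w, x))
        f_net = sign(net)
        w = [wi + c * f_net * xi for wi, xi in zip(w, x)]
    return w

inputs = [[1, -2, 1.5, 0], [1, -0.5, -2, -1.5], [0, 1, -1, 1.5]]
print(hebb_pass(inputs, [1, -1, 0, 0.5]))  # [1, -3.5, 4.5, 0.5]
```

Tracing by hand: the first vector gives net = 3 (f = 1), the second net = -0.25 (f = -1), the third net = -3 (f = -1), producing the final weights shown.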

Exercise 1.
Lab Activity: Simulation
Design and develop the neural network system for the following experiment
Experiment 1: Hebbian learning
1. Design and train a neural network system which can perform AND and OR
operation.
2. Tune the neural network model and minimize the error by updating the
weights and perform the testing.
3. Run the simulation in groups and explain the working principles of the
algorithm.
4. Interpret the output of the designed neural network system by varying the
inputs.
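As a starting point for Experiment 1, the sketch below trains a single neuron on the AND operation with a supervised Hebb rule. Bipolar (+1/-1) encoding and a constant bias input are assumptions of this sketch, not requirements stated in the exercise; OR can be trained the same way by changing the target values:

```python
def train_hebb(samples, n_weights):
    # Supervised Hebb rule: w <- w + t * x (learning rate 1, targets t)
    w = [0.0] * n_weights
    for x, t in samples:
        w = [wi + t * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x):
    # Bipolar threshold output
    net = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net > 0 else -1

# Bipolar AND truth table; the third input component is a constant bias of 1
and_samples = [([1, 1, 1], 1), ([1, -1, 1], -1),
               ([-1, 1, 1], -1), ([-1, -1, 1], -1)]
w = train_hebb(and_samples, 3)
print(w)                                      # [2.0, 2.0, -2.0]
print([predict(w, x) for x, _ in and_samples])  # [1, -1, -1, -1]
```

Varying the inputs (step 4 of the exercise) then amounts to calling `predict` with new bipolar vectors and observing the output.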

References and Supplementary Materials

Books and Journals
1. Van Rossum, G. (2007). Python Programming Language. In USENIX Annual Technical
Conference (Vol. 41, p. 36).
2. SN, S. (2003). Introduction to Artificial Neural Networks.
3. Rashid, T. (2016). Make Your Own Neural Network. CreateSpace Independent Publishing
Platform.

Online Supplementary Reading Materials
1. Chaitanya Singh. How to Install Python. (n.d.). Retrieved 14 May 2020, from
https://beginnersbook.com/2018/01/python-installation/

Course Module
