Week 3 - Lab
[On Hebbian Learning Process methods]
Hebb's rule
Hebb's rule was proposed as a conjecture in 1949 by the Canadian psychologist
Donald Hebb to describe the synaptic plasticity of natural neurons. A few years after its
publication, the rule was confirmed by neurophysiological studies, and many research
studies have shown its validity in numerous applications of Artificial Intelligence. Before
introducing the rule, it is useful to describe the generic Hebbian neuron, as shown in the
following diagram:
The neuron is a simple computational unit that receives an input vector x from the
pre-synaptic units (other neurons or perceptive systems) and outputs a single scalar value, y.
The internal structure of the neuron is represented by a weight vector w that models the
strength of each synapse. For a single multi-dimensional input, the output is obtained as
follows:
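In the usual formulation, the net input is the weighted sum of the inputs, which is then passed through an activation function f (the program below also adds a bias of 1 to the net input and uses a sigmoid for f):

net = w1*x1 + w2*x2 + ... + wn*xn = w · x
y = f(net)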
Python program
import math

x1 = int(input("enter x1: "))
x2 = int(input("enter x2: "))
w11 = int(input("enter w11: "))
w21 = int(input("enter w21: "))

# Net input of the neuron (with a bias of 1)
yin1 = 1 + (x1 * w11 + x2 * w21)
print("Yin1 is", yin1)

# Sigmoid activation: f(net) = 1 / (1 + e^(-net))
y = 1 / (1 + math.exp(-yin1))
print("y is", y)

# Hebbian weight update: delta_w = y * x (learning rate = 1)
wi1 = y * x1
wi2 = y * x2
w11new = w11 + wi1
w21new = w21 + wi2

# Second iteration with the updated weights
yin1new = 1 + (x1 * w11new + x2 * w21new)
ynew = 1 / (1 + math.exp(-yin1new))
print("value of y in second iteration is:", ynew)
Alternatively, a more general program that implements the Hebb, perceptron, and Widrow-Hoff learning rules:
# Learning Rules: Hebb, Perceptron and Widrow-Hoff
import math

def computeNet(inputs, weights):
    # Net input: dot product of the input vector and the weight vector
    net = 0
    for i in range(len(inputs)):
        net += inputs[i] * weights[i]
    return net

def computeFNetBinary(net):
    # Bipolar binary (sign) activation
    f_net = 0
    if net > 0:
        f_net = 1
    if net < 0:
        f_net = -1
    return f_net

def computeFNetCont(net):
    # Bipolar continuous (sigmoid-based) activation
    f_net = (2 / (1 + math.exp(-net))) - 1
    return f_net

def hebb(f_net):
    # Hebbian learning signal: r = f(net)
    return f_net

def perceptron(desired, f_net):
    # Perceptron learning signal: r = d - f(net)
    return desired - f_net

def widrow(desired, net):
    # Widrow-Hoff (delta) learning signal: r = d - net
    return desired - net

def adjustWeights(inputs, weights, last, binary, desired, rule):
    c = 1  # learning rate
    if last:
        print("COMPLETE")
        return
    current_input = inputs[0]
    inputs = inputs[1:]
    current_desired = None
    if desired:
        current_desired = desired[0]
        desired = desired[1:]
    if len(inputs) == 0:
        last = True
    net = computeNet(current_input, weights)
    if binary:
        f_net = computeFNetBinary(net)
    else:
        f_net = computeFNetCont(net)
    if rule == "hebb":
        r = hebb(f_net)
    elif rule == "perceptron":
        r = perceptron(current_desired, f_net)
    elif rule == "widrow":
        r = widrow(current_desired, net)
    del_weights = []
    for i in range(len(current_input)):
        x = (c * r) * current_input[i]
        del_weights.append(x)
        weights[i] = weights[i] + x  # w_new = w_old + delta_w
    print("NEW WEIGHTS:")
    print(weights)
    adjustWeights(inputs, weights, last, binary, desired, rule)

if __name__ == "__main__":
    # total_inputs = int(input("Enter Total Number of Inputs: "))
    # vector_length = int(input("Enter Length of Vector: "))
    total_inputs = 3
    vector_length = 4
    # for i in range(vector_length):
    #     weights.append(float(input("Enter Initial Weight: ")))
    weights = [1, -1, 0, 0.5]
    inputs = [[1, -2, 1.5, 0], [1, -0.5, -2, -1.5], [0, 1, -1, 1.5]]
    desired = [1, 2, 1, -1]
    print("BINARY HEBB!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, True, None, "hebb")
    print("CONTINUOUS HEBB!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, False, None, "hebb")
    print("PERCEPTRON!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, True, desired, "perceptron")
    print("WIDROW HOFF!")
    adjustWeights(inputs, [1, -1, 0, 0.5], False, True, desired, "widrow")
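As a quick check of the first "BINARY HEBB!" step (assuming the corrected update w_new = w_old + delta_w used above): for input [1, -2, 1.5, 0] and weights [1, -1, 0, 0.5], net = (1)(1) + (-2)(-1) + (1.5)(0) + (0)(0.5) = 3, so f(net) = 1, r = 1, delta_w = [1, -2, 1.5, 0], and the new weights are [2, -3, 1.5, 0.5].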
Exercise 1.
Lab Activity: Simulation
Design and develop a neural network system for the following experiment.
Experiment 1: Hebbian learning
1. Design and train a neural network system that can perform the AND and OR
operations (a starting sketch is given after this list).
2. Tune the neural network model, minimize the error by updating the
weights, and perform the testing.
3. Run the simulation in groups and explain the working principles of the
algorithm.
4. Interpret the output of the designed neural network system by varying the
inputs.
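A minimal starting sketch for step 1, assuming bipolar (+1/-1) inputs and targets, a bias unit, and the plain Hebb update delta_w = x * t; the names train_hebb and predict are illustrative only, not part of the module:

def train_hebb(samples):
    # samples: list of ((x1, x2), target) pairs with bipolar (+1/-1) values
    w1 = w2 = b = 0.0
    for (x1, x2), t in samples:
        # Hebb rule: delta_w = x * t (the bias input is fixed at 1)
        w1 += x1 * t
        w2 += x2 * t
        b += t
    return w1, w2, b

def predict(w1, w2, b, x1, x2):
    # Bipolar threshold output on the net input
    net = w1 * x1 + w2 * x2 + b
    return 1 if net > 0 else -1

AND = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
OR = [((1, 1), 1), ((1, -1), 1), ((-1, 1), 1), ((-1, -1), -1)]

for name, data in [("AND", AND), ("OR", OR)]:
    w1, w2, b = train_hebb(data)
    outputs = [predict(w1, w2, b, x1, x2) for (x1, x2), _ in data]
    print(name, "weights:", (w1, w2, b), "outputs:", outputs)

With the bipolar encoding, a single pass of the plain Hebb rule already reproduces both truth tables; with 0/1 encodings the rule generally fails on these gates, which is one point worth discussing when interpreting the outputs in steps 3 and 4.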