Module 2 Hebb Net

This document discusses Hebb neural networks. It provides an overview of Hebb's rule from 1949, which is better suited for bipolar data rather than binary data. It also describes the architecture and training algorithm for Hebbian learning. Finally, it provides examples of using a Hebb network to learn the AND and XOR functions from bipolar input-output pairs, and exercises for designing a Hebb net for the OR function and classifying letter patterns.

Uploaded by

Vighnesh M

MODULE 2

HEBB NEURAL NETWORK


HEBB NETWORK
• Proposed by Donald Hebb in 1949
• The Hebb rule is better suited for bipolar data

• Limitation: the Hebb rule is not applicable to binary data, because a 0 input or 0 target produces zero weight change

• Weight update in the Hebb rule:

wi(new) = wi(old) + ∆wi
where ∆wi is the weight change
ARCHITECTURE
HEBBIAN LEARNING: TRAINING ALGORITHM
• Step 0: Initialize the weights. In this network they are set to zero, i.e., wi = 0 for i = 1 to n, where "n" is the total number of input neurons.
• Step 1: Perform Steps 2-4 for each input training vector and target output pair, s : t.
• Step 2: Set the activations of the input units:

xi = si for i = 1 to n

• Step 3: Set the activation of the output unit: y = t
• Step 4: Update the weights and bias:

wi(new) = wi(old) + xi y

b(new) = b(old) + y
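The steps above can be sketched as a short Python function (the name `hebb_train` is only illustrative):

```python
def hebb_train(samples, n):
    """Hebbian learning over (s, t) pairs of bipolar input vectors and targets."""
    w = [0] * n              # Step 0: weights initialized to zero
    b = 0                    # bias initialized to zero
    for s, t in samples:     # Step 1: loop over training pairs s : t
        x = s                # Step 2: x_i = s_i
        y = t                # Step 3: y = t
        for i in range(n):   # Step 4: w_i(new) = w_i(old) + x_i * y
            w[i] += x[i] * y
        b += y               #         b(new)   = b(old)   + y
    return w, b

# Training on the bipolar AND pairs yields w1 = w2 = 2, b = -2
and_pairs = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
print(hebb_train(and_pairs, 2))  # ([2, 2], -2)
```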
SOLVED PROBLEM-1
• Realize a Hebb net for the AND function with bipolar inputs and targets.

x1   x2   y (target)
 1    1    1
 1   -1   -1
-1    1   -1
-1   -1   -1

• Initialize:
w1 = w2 = 0 and bias b = 0

• Change in weights and bias:

∆wi = xi y
∆b = y
SOLVED PROBLEM-1

inputs        target   weight changes           new weights and bias
x1    x2      y        ∆w1=x1y  ∆w2=x2y  ∆b=y   w1    w2    b

 1     1      1         1        1        1      1     1    1
 1    -1     -1        -1        1       -1      0     2    0
-1     1     -1         1       -1       -1      1     1   -1
-1    -1     -1         1        1       -1      2     2   -2

Testing with the final weights w1 = 2, w2 = 2, b = -2:

input (1, 1): yin = x1w1 + x2w2 + b = 2 → y = f(yin) = 1, so y = t

input (1, -1): yin = x1w1 + x2w2 + b = -2 → y = f(yin) = -1, so y = t
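The testing step can be checked for all four input pairs, assuming the bipolar activation f(yin) = 1 if yin > 0, else -1:

```python
def f(y_in):
    # bipolar activation: +1 for positive net input, -1 otherwise
    return 1 if y_in > 0 else -1

w1, w2, b = 2, 2, -2  # final weights and bias from the training table
for (x1, x2), t in [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]:
    y_in = x1 * w1 + x2 * w2 + b
    print((x1, x2), y_in, f(y_in) == t)  # every pair matches its target
```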
SOLVED PROBLEM-2
• Realize a Hebb net for the XOR function with bipolar inputs and targets.

x1   x2   y (target)
 1    1   -1
 1   -1    1
-1    1    1
-1   -1   -1
SOLVED PROBLEM-2

inputs        target   weight changes           new weights and bias
x1    x2      y        ∆w1=x1y  ∆w2=x2y  ∆b=y   w1    w2    b
(initial)                                        0     0    0
 1     1     -1        -1       -1       -1     -1    -1   -1
 1    -1      1         1       -1        1      0    -2    0
-1     1      1        -1        1        1     -1    -1    1
-1    -1     -1         1        1       -1      0     0    0

The final weights and bias are all zero: the Hebb net fails to learn XOR, since XOR is not linearly separable and cannot be realized by a single-layer net.
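Accumulating the same Hebb updates over the XOR pairs shows the failure directly, as in this quick sketch: every weight returns to zero after one pass.

```python
# Hebb updates over the bipolar XOR pairs
pairs = [((1, 1), -1), ((1, -1), 1), ((-1, 1), 1), ((-1, -1), -1)]
w1 = w2 = b = 0
for (x1, x2), y in pairs:
    w1 += x1 * y
    w2 += x2 * y
    b += y
print(w1, w2, b)  # 0 0 0: the trained net gives the same output for every input
```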
EXERCISE PROBLEMS
1. Design a Hebb net to implement the OR function (consider bipolar inputs and targets).
EXERCISE PROBLEMS
2. Using the Hebb rule, find the weights required to perform the following classifications:
vectors (1 1 1 1) and (-1 1 -1 -1) are members of the class with target value 1;
vectors (1 1 1 -1) and (1 -1 -1 1) are members of the class with target value -1.

x1   x2   x3   x4   t
 1    1    1    1    1
-1    1   -1   -1    1
 1    1    1   -1   -1
 1   -1   -1    1   -1
APPLICATION BASED QUESTION
• Classify the two-dimensional input patterns (letters I and J) using a Hebb neural network.
