
Chp2 PDF

The document discusses neural network structures, including the McCulloch-Pitts model and its ability to perform basic logic functions through linear separation. It notes that the exclusive OR (XOR) function cannot be realized with a single neuron due to its nonlinear nature. The document then introduces multilayer feedforward neural networks as a way to realize arbitrary logic functions and approximations through the combination of linearly separable regions formed by hidden layers.


NEURAL NETWORK STRUCTURES

McCulloch-Pitts Model

Unipolar/bipolar case

Linear Separation

Logic Functions

Basic Limitation : EXOR Function

General Case

Multilayer Structures : Feedforward Case

Feedback Structures : Memory

[Neuron model : inputs → weights → linear sum → nonlinear block → output]

Basic assumptions (unipolar case) : f(v) = 1 if v ≥ 0, f(v) = 0 otherwise

Simple extension (bipolar case) : f(v) = sgn(v) ∈ {-1, +1}

Decision boundary v = 0 : a hyperplane in the input space

Basic logic operations are LINEARLY SEPARABLE

could be realized by McCulloch-Pitts neuron!
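A minimal sketch of this claim (assuming the unipolar convention f(v) = 1 for v ≥ 0; the particular weights and thresholds below are one possible choice, not the only one):

```python
import numpy as np

def mp_neuron(x, w, theta):
    """McCulloch-Pitts neuron (unipolar): fires 1 iff the weighted sum reaches the threshold."""
    return 1 if np.dot(w, x) >= theta else 0

# One possible weight/threshold choice for each basic gate:
AND = lambda x1, x2: mp_neuron([x1, x2], [1, 1], 2)   # fires only when both inputs are 1
OR  = lambda x1, x2: mp_neuron([x1, x2], [1, 1], 1)   # fires when at least one input is 1
NOT = lambda x1:     mp_neuron([x1],     [-1],  0)    # weight -1 inverts the input

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
```

Each gate corresponds to a single line separating the firing inputs from the rest, which is exactly the linear separability noted above.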


UNIPOLAR CASE (threshold fn.)

All (unipolar) logic operations can be realized by (possibly cascading) McCulloch-Pitts neurons !
Relation with Boolean Algebra

Computers

All of these operations are LINEARLY SEPARABLE !

Inspired von Neumann

Claim : Any (unipolar) logic operation can be realized by at most 3 layers of neurons !

NOT

AND

order of blocks may change!


not unique!
how many neurons do we have in each block?

Can we eliminate NOT block? (weight is -1 !)

OR

Unipolar AND combined with NOT

p : number of NOT operations

Unipolar OR combined with NOT

p : number of NOT operations

Claim : Any (unipolar) logic operation can be realized by at most 2 layers of neurons !
Reason : The NOT block can be eliminated (weight is -1 !)

NOT

AND

OR

instead of NOT, use weight -1, change threshold
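The threshold bookkeeping can be sketched as follows (assuming the unipolar convention f(v) = 1 for v ≥ 0; the helper names are illustrative): with p negated inputs, weight -1 on each negated input plus a threshold shifted by p realizes AND-with-NOT or OR-with-NOT in a single neuron.

```python
import numpy as np

def and_with_not(x, negate):
    """Unipolar AND over literals; negate[i] = True means input i enters as NOT x_i.
    The NOT is folded in as weight -1 plus a threshold shifted by p."""
    w = np.where(negate, -1, 1)
    p = int(np.sum(negate))          # p : number of NOT operations
    theta = len(x) - p               # fires iff every literal is true
    return 1 if np.dot(w, x) >= theta else 0

def or_with_not(x, negate):
    """Unipolar OR over literals, with NOT absorbed the same way."""
    w = np.where(negate, -1, 1)
    p = int(np.sum(negate))
    theta = 1 - p                    # fires iff at least one literal is true
    return 1 if np.dot(w, x) >= theta else 0

# x1 AND (NOT x2) : true only for (1, 0)
print([and_with_not([a, b], [False, True]) for a in (0, 1) for b in (0, 1)])  # [0, 0, 1, 0]
```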

BIPOLAR CASE (signum fn.)

Bipolar AND combined with NOT

p : number of NOT operations

Bipolar OR combined with NOT

p : number of NOT operations
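A sketch of the bipolar counterpart (assuming sgn(v) fires +1 for v ≥ 0; thresholds n-1 for AND and -(n-1) for OR with n inputs). Note that with ±1 inputs the weight -1 alone realizes the NOT, without the unipolar case's threshold correction by p:

```python
import numpy as np

def bipolar_gate(x, negate, kind):
    """Bipolar (+1/-1) AND / OR with NOT folded in as weight -1.
    For n inputs, threshold n-1 gives AND and -(n-1) gives OR."""
    w = np.where(negate, -1, 1)
    n = len(x)
    theta = (n - 1) if kind == "AND" else -(n - 1)
    return 1 if np.dot(w, x) >= theta else -1

# x1 AND (NOT x2) in bipolar coding: +1 only for (+1, -1)
print([bipolar_gate([a, b], [False, True], "AND")
       for a in (-1, 1) for b in (-1, 1)])  # [-1, -1, 1, -1]
```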

What can/cannot we perform with a SINGLE NEURON?


Basic limitation is LINEAR SEPARABILITY

Can we find a SINGLE NEURON realizing this logic function ?

(Note that with more than a single neuron, this is always possible!)

Contradiction!
Not realizable !

Not linearly separable !

Single neuron is not sufficient !
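A sketch of the two-layer fix: each hidden neuron carves out one of the two linearly separable regions of EXOR, and the output neuron ORs them (the weights below are one possible choice):

```python
def step(v):
    """Unipolar threshold: 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

def xor_net(x1, x2):
    """Two-layer EXOR: each hidden neuron fires on one of the two linearly
    separable regions; the output neuron is an OR of the two."""
    h1 = step(x1 - x2 - 1)        # fires only on (1, 0)
    h2 = step(x2 - x1 - 1)        # fires only on (0, 1)
    return step(h1 + h2 - 1)      # OR of the hidden outputs

print([xor_net(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```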


One Extension : SINGLE LAYER NETWORK


MULTILAYER NEURAL NETWORKS


Why multilayer?
Arbitrary logic function realization.
Single Neuron : LINEAR SEPARATION .
Remember EXOR Function

We can realize each output separately

logically

analytically

x1 ⊕ x2 = x1 x̄2 + x̄1 x2

Relation with EXOR


mathematical equations?


componentwise equations

Linear Sum

Output

vectorial notation

input

ith neuron weight

linear sum

output

number of neurons


extended notation

extended weight matrix (threshold absorbed as the last column)

extended input (input augmented with a constant -1 component)

input

Hidden Layer 1

Hidden Layer 2

Could also use extended notation

Output Layer
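The extended notation can be sketched in code: the threshold becomes the last column of the extended weight matrix and the input gains a constant -1 component, so a whole layer is one matrix product. The 2-2-1 EXOR weights below are an illustrative choice:

```python
import numpy as np

def layer(W_ext, x):
    """One layer in extended notation: thresholds sit in the last column of the
    extended weight matrix; the input is augmented with a constant -1."""
    x_ext = np.append(x, -1.0)            # extended input
    v = W_ext @ x_ext                     # linear sum for all neurons at once
    return np.where(v >= 0, 1.0, 0.0)     # unipolar threshold nonlinearity

# Hypothetical 2-2-1 network realizing EXOR in this notation:
W_hidden = np.array([[ 1.0, -1.0, 1.0],    # x1 AND (NOT x2), threshold 1
                     [-1.0,  1.0, 1.0]])   # (NOT x1) AND x2, threshold 1
W_out    = np.array([[ 1.0,  1.0, 1.0]])   # OR of the two hidden outputs

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    y = layer(W_out, layer(W_hidden, np.array(x, dtype=float)))
    print(x, int(y[0]))
```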

OR : UNION OF REGIONS

Lines

AND : INTERSECTION OF REGIONS

And

Linearly Separable Patterns :


[Figure : three lines (Line 1, Line 2, Line 3) partition the input space into regions R1–R6 ; Classes 1–3 occupy unions of these regions.]

At most 3 layers are sufficient to classify patterns formed as unions of linearly separable regions

Line Formation

Region Formation
AND

Region Union
OR


Line Weights

AND

OR
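The three-stage construction (line formation, region formation by AND, region union by OR) can be sketched as below. The unit-square example and all weight values are hypothetical illustrations, written in extended notation (last column = threshold):

```python
import numpy as np

def step(v):
    return np.where(v >= 0, 1.0, 0.0)

def three_layer_classifier(x, lines, regions, classes):
    """Layer 1 neurons realize the lines (half-planes), layer 2 neurons AND
    half-planes into convex regions, layer 3 neurons OR regions into classes."""
    h = step(lines @ np.append(x, -1.0))       # which side of each line
    r = step(regions @ np.append(h, -1.0))     # AND : intersection of half-planes
    return step(classes @ np.append(r, -1.0))  # OR : union of regions

# Toy example: the unit square [0,1]x[0,1] as the intersection of 4 half-planes.
lines = np.array([[ 1.0,  0.0,  0.0],    # x >= 0
                  [-1.0,  0.0, -1.0],    # x <= 1
                  [ 0.0,  1.0,  0.0],    # y >= 0
                  [ 0.0, -1.0, -1.0]])   # y <= 1
region = np.array([[1.0, 1.0, 1.0, 1.0, 3.5]])  # AND of all 4 (threshold between 3 and 4)
union  = np.array([[1.0, 0.5]])                  # single-region "union"

print(three_layer_classifier(np.array([0.5, 0.5]), lines, region, union))  # [1.]
print(three_layer_classifier(np.array([2.0, 0.5]), lines, region, union))  # [0.]
```

The AND threshold sits between n-1 and n so the region neuron fires only when every half-plane test passes; the OR threshold sits between 0 and 1 so any one region suffices.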

Nonlinear Functions

Approximation !

with signum

with bipolar sigmoidal

f(v)=sgn(v)

[Figure : a noisy/perturbed pattern presented to the feedback network converges to the memorized pattern : pattern recovery.]
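Pattern recovery by a feedback structure can be sketched with a signum iteration x ← sgn(Wx); the outer-product (Hebbian) weight construction below is one standard choice, assumed here purely for illustration:

```python
import numpy as np

def recover(W, x, n_iter=10):
    """Feedback iteration x <- sgn(W x): drives a perturbed pattern toward a
    memorized pattern (a minimal Hopfield-style sketch)."""
    for _ in range(n_iter):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1          # resolve sgn(0) to +1
        if np.array_equal(x_new, x):
            break                      # reached a fixed point (a memory)
        x = x_new
    return x

# Store one bipolar pattern with an outer-product (Hebbian) weight choice.
memory = np.array([1, -1, 1, -1, 1, -1])
W = np.outer(memory, memory)

# Flip one component (a noisy/perturbed pattern) and recover.
noisy = memory.copy()
noisy[0] = -noisy[0]
print(recover(W, noisy))   # recovers the memorized pattern
```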
