
The Perceptron

Exercises: Part I

Perceptrons are of interest to us because they learn to classify patterns. Learning involves changing the
weights of the perceptron's connections so as to reduce response errors to (ideally) zero. For
any particular pattern, the response error is simply the desired response to that pattern minus the
perceptron's actual response to it (desired - actual). In general, we are interested in eliminating
response errors to all patterns. For this reason, in this exercise we will examine the effect of changing
the weight vector w on the sum of squared error (SSE) for eight different patterns.
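
To make these definitions concrete, here is a minimal Python sketch of the response and SSE calculation. The eight patterns, their desired outputs, and the weight vector WT = (-0.67, 0.89) are taken from the table later in this document; the sign-threshold output rule and the function names are assumptions, chosen to be consistent with the tabulated NET and ACTUAL OUT values.

# Minimal sketch of the perceptron response and sum of squared error
# (SSE) described above. Patterns and the weight vector come from the
# table below; the sign-threshold output rule is an assumption
# consistent with the tabulated values.

# name: (x, y, desired output)
PATTERNS = {
    "A1": (0.3, 0.7, 1),
    "A2": (0.4, 0.9, 1),
    "A3": (0.5, 0.5, 1),
    "A4": (0.7, 0.3, 1),
    "B1": (-0.6, 0.3, -1),
    "B2": (-0.4, -0.2, -1),
    "B3": (0.3, -0.4, -1),
    "B4": (-0.2, -0.8, -1),
}

def response(w, x, y):
    """NET is the dot product w . (x, y); OUT is its sign (+1 or -1)."""
    net = w[0] * x + w[1] * y
    return 1 if net >= 0 else -1

def sse(w):
    """Sum over all patterns of (desired - actual) squared."""
    return sum((desired - response(w, x, y)) ** 2
               for x, y, desired in PATTERNS.values())

if __name__ == "__main__":
    wt = (-0.67, 0.89)
    for name, (x, y, desired) in PATTERNS.items():
        actual = response(wt, x, y)
        print(name, "actual:", actual, "desired:", desired,
              "error:", desired - actual)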

1) In this exercise, you will be the "learning rule" for the perceptron. Go to Sheet 1 of the spreadsheet.
Take your mouse and change the position of the endpoint of the weight vector. (If you prefer, go to the
table at the bottom of the spreadsheet and type in different weight coordinates.) Record the weight vector
and the sum of squared error that it produces in the perceptron. Then move the weight vector to a different
position, exploring another part of the pattern space. Use the table below to record your observations
(a code sketch of the same procedure follows the table):

Pattern   Weight X Value   Weight Y Value   SSE
0         -0.6             0.8              10
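
If you would rather explore programmatically, the sketch below reuses PATTERNS and sse() from the sketch above. It applies the plain (desired - actual) squared definition from the text, which may differ from the spreadsheet's own SSE convention; only the first candidate weight matches the recorded row above, the rest are arbitrary.

# Play the "learning rule" by hand: try several endpoint positions for
# the weight vector and record the SSE each produces, as in the table
# above. Candidate weights are arbitrary choices, except the first,
# which matches the recorded row.
for w in [(-0.6, 0.8), (0.5, 0.5), (-1.0, 1.0), (0.0, 1.0)]:
    print("w =", w, "SSE =", sse(w))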

[Chart: Pattern Space, showing the weight vector WT and the A and B patterns on axes from -2 to 2]
[Chart: Output Error For Each Pattern (SSE = 10)]

Name    X      Y      NET      ACTUAL OUT   DESIRED OUT   ERROR   SQUARED ERROR
A1      0.3    0.7    0.422    1            1             0       0
A2      0.4    0.9    0.533    1            1             0       0
A3      0.5    0.5    0.11     1            1             0       0
A4      0.7    0.3    -0.202   -1           1             2       6
B1      -0.6   0.3    0.669    1            -1            -2      2
B2      -0.4   -0.2   0.09     1            -1            -2      2
B3      0.3    -0.4   -0.557   -1           -1            0       0
B4      -0.2   -0.8   -0.578   -1           -1            0       0
ORIGIN  0      0
WT      -0.67  0.89
TOTAL                                                     0       10
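
As a check on the table, the NET column appears to be the dot product of the weight vector with each pattern; for A1, NET = (-0.67)(0.3) + (0.89)(0.7) = -0.201 + 0.623 = 0.422, and ACTUAL OUT is the sign of NET.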

