Digit Recognition

Yash Jain - 1412022

This document describes digit recognition using a perceptron neural network. A perceptron with a 15-element input vector and 10 output neurons (one per digit class) is trained on the training-set patterns and their one-hot target vectors, for up to 1000 epochs with a performance goal of 0. The trained network is then simulated on the test data to determine whether each test digit is correctly classified.

Uploaded by

yashj517
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as DOCX, PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
34 views3 pages

Digit Recognition: Yash Jain - 1412022

This document describes digit recognition using a perceptron neural network. It defines a perceptron with 10 input neurons and 10 output neurons to classify 10 different digits. The network is trained on input and output data from a training set to classify digits in a test set. The network is trained for 1000 epochs with a goal of 0 error. The trained network is then used to simulate the test data and determine if each test digit is correctly classified.

Uploaded by

yashj517
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as DOCX, PDF, TXT or read online on Scribd
You are on page 1/ 3

Digit Recognition

Yash Jain - 1412022

net = newp(pr,s,tf,lf)
This is the main function used in this code: it creates the perceptron network that is then trained on the digit patterns.

Description
Perceptrons are used to solve simple (i.e. linearly separable) classification problems.
net = newp(PR,S,TF,LF) takes these inputs,

PR - R x 2 matrix of min and max values for R input elements.

S - Number of neurons.

TF - Transfer function, default = 'hardlim'.

LF - Learning function, default = 'learnp'.

and returns a new perceptron.


The transfer function TF can be hardlim or hardlims. The learning function LF can be learnp or
learnpn.
Call newp without input arguments to define the network's attributes in a dialog window.
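
For illustration, the following minimal sketch (not taken from the report) shows how these arguments fit together for a two-input perceptron trained on the logical AND problem, which is linearly separable:

pr = [0 1; 0 1];                        % min/max range of each of the two inputs
net = newp(pr, 1, 'hardlim', 'learnp'); % one hardlim neuron with the learnp rule
p = [0 0 1 1; 0 1 0 1];                 % four input patterns, one per column
t = [0 0 0 1];                          % AND targets
net = train(net, p, t);                 % adjusts the weight and bias with learnp
y = sim(net, p);                        % after training, y should equal t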

Properties
Perceptrons consist of a single layer with the dotprod weight function, the netsum net input
function, and the specified transfer function.
The layer has a weight from the input and a bias.
Weights and biases are initialized with initzero.
Adaption and training are done with adaptwb and trainwb, which both update weight and bias
values with the specified learning function. Performance is measured with mae.
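
As a small illustration of these properties (the numbers below are assumed, not taken from the report), a single perceptron neuron forms its net input with dotprod and netsum and passes it through hardlim:

W = [1 -1];          % example weights of one neuron with two inputs
b = 0.5;             % example bias
p = [1; 0];          % one input vector
n = W*p + b;         % dotprod weight function plus netsum: n = 1.5
a = hardlim(n);      % hardlim returns 1 because n >= 0
perf = mae(1 - a);   % performance: mean absolute error against a target of 1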
Code
clc;
clear all;

% Load the training and test patterns stored in reg.mat
cd = open('reg.mat');

% Assemble the 10 training patterns (one 15-element column per digit 0-9)
input = [cd.input_data(:,1)'; cd.input_data(:,2)'; cd.input_data(:,3)'; ...
         cd.input_data(:,4)'; cd.input_data(:,5)'; cd.input_data(:,6)'; ...
         cd.input_data(:,7)'; cd.input_data(:,8)'; cd.input_data(:,9)'; ...
         cd.input_data(:,10)']';

% One-hot targets: training pattern j (digit j-1) should activate neuron j only
for i = 1:10
    for j = 1:10
        if i == j
            output(i,j) = 1;
        else
            output(i,j) = 0;
        end
    end
end

% Input range matrix for newp: each of the 15 inputs lies in [0, 1]
for i = 1:15
    for j = 1:2
        if j == 1
            aw(i,j) = 0;
        else
            aw(i,j) = 1;
        end
    end
end

% Assemble the 5 test patterns
test = [cd.test_data(:,1)'; cd.test_data(:,2)'; cd.test_data(:,3)'; ...
        cd.test_data(:,4)'; cd.test_data(:,5)']';

% Create and train a perceptron with 10 hardlim output neurons
net = newp(aw,10,'hardlim');
net.trainParam.epochs = 1000;
net.trainParam.goal = 0;
net = train(net,input,output);

% Classify the test patterns
y = sim(net,test);
x = y';

% A test pattern is recognized only if exactly one output neuron fires;
% neuron l corresponds to digit l-1
for i = 1:5
    k = 0;   % number of active output neurons
    l = 0;   % index of the last active neuron
    for j = 1:10
        if x(i,j) == 1
            k = k + 1;
            l = j;
        end
    end
    if k == 1
        s = sprintf('Test Pattern %d is Recognized as %d', i, l-1);
        disp(s);
    else
        s = sprintf('Test Pattern %d is not Recognized', i);
        disp(s);
    end
end
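
As an optional check that is not part of the original script, the learned weights and biases can be inspected and the trained network re-run on the training patterns to confirm that it reproduces the one-hot targets:

W = net.IW{1,1};                     % input weight matrix (one row per output neuron)
b = net.b{1};                        % bias vector
ytrain = sim(net, input);            % outputs for the 10 training patterns
ncorrect = sum(all(ytrain == output, 1));
s = sprintf('%d of 10 training patterns reproduce their targets', ncorrect);
disp(s);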
