NN Matlab - Examples

[Figure: a two-layer neural network with inputs x1 ... xn, a hidden layer of L
sigmoidal units sigma(.), and outputs y1 ... ym. The network computes
y = W' * sigma(V' * x). An RVFL network keeps V fixed at random values;
a one-layer network sets W = I, so that y = sigma(V' * x).]
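The forward pass in the figure is easy to state concretely. The sketch below (Python/NumPy, with illustrative layer sizes) evaluates y = W'*sigma(V'*x) for the two-layer case, the RVFL case (V fixed at random values), and the one-layer case (W = I):

```python
import numpy as np

def two_layer_forward(x, V, W, sigma=np.tanh):
    """Forward pass of the two-layer network: y = W' * sigma(V' * x)."""
    return W.T @ sigma(V.T @ x)

rng = np.random.default_rng(0)
n, L, m = 3, 5, 2                  # inputs, hidden units, outputs (illustrative)
V = rng.standard_normal((n, L))    # RVFL: V stays fixed at random values
W = rng.standard_normal((L, m))    # only W would be trained in an RVFL network
x = np.ones(n)

y = two_layer_forward(x, V, W)                    # two-layer output, shape (m,)
y_one_layer = two_layer_forward(x, V, np.eye(L))  # W = I gives y = sigma(V' x)
```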
Training Two-Layer Neural Network

I. Training

[Figure: the eight training patterns plotted in the plane, marked + and o.]

X = [ 1   1   2   2  -1  -2  -1  -2
      1   2  -1  -2   2   1  -1  -2 ]

Y = [ 0   0   0   0   1   1   1   1
      0   0   1   1   0   0   1   1 ]
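Note the structure of these targets: each output bit simply flags a negative input coordinate, so each output neuron only needs one axis-aligned boundary. A quick NumPy check (a sketch, with X and Y holding the patterns p1-p8 and targets t1-t8 of this example):

```python
import numpy as np

# Columns are the eight patterns p1..p8 and their targets t1..t8.
X = np.array([[ 1,  1,  2,  2, -1, -2, -1, -2],
              [ 1,  2, -1, -2,  2,  1, -1, -2]], dtype=float)
Y = np.array([[0, 0, 0, 0, 1, 1, 1, 1],
              [0, 0, 1, 1, 0, 0, 1, 1]], dtype=float)

# Each target bit flags a negative coordinate of the input, so a perceptron
# solving this task only has to realize W = -I, b = 0.
Y_check = (-X >= 0).astype(float)
```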
MATLAB Code

R=[-2 2;-2 2];   % define 2-D input space
netp=newp(R,2);  % define 2-neuron NN
p1=[1 1]'; p2=[1 2]'; p3=[2 -1]'; p4=[2 -2]'; p5=[-1 2]'; p6=[-2 1]'; p7=[-1 -1]'; p8=[-2 -2]';
t1=[0 0]'; t2=[0 0]'; t3=[0 1]'; t4=[0 1]'; t5=[1 0]'; t6=[1 0]'; t7=[1 1]'; t8=[1 1]';
P=[p1 p2 p3 p4 p5 p6 p7 p8];
T=[t1 t2 t3 t4 t5 t6 t7 t8];
netp.trainParam.epochs = 20;  % train for max 20 epochs
netp = train(netp,P,T);
Result: [figure showing the trained perceptron's decision boundaries over the input space.]
Solution. 1-a)

clc;clear all;close all;

% Patterns
P(:,1)=[0.1 1.2]'; P(:,2)=[0.7 1.8]';P(:,3)=[0.8 1.6]'; % Group 1
P(:,4)=[0.8 0.6]'; P(:,5)=[1 0.8]'; % Group 2
P(:,6)=[0.3 0.5]'; P(:,7)=[0 0.2]';P(:,8)=[-0.3 0.8]'; % Group 3
P(:,9)=[-.5 -1.5]'; P(:,10)=[-1.5 -1.3]'; % Group 4

% Targets
T(:,1)=[0 0]'; T(:,2)=[0 0]';T(:,3)=[0 0]';
T(:,4)=[0 1]'; T(:,5)=[0 1]';
T(:,6)=[1 0]'; T(:,7)=[1 0]';T(:,8)=[1 0]';
T(:,9)=[1 1]'; T(:,10)=[1 1]';

% Building a perceptron neural net
netp=newp(minmax(P),2,'hardlim','learnp');

% Verifying the output, weights and bias of the net before training.
Y=sim(netp,P)
W=netp.IW{1,1}
B=netp.b{1}

% Training phase
netp.trainParam.epochs = 20;
netp = train(netp,P,T);

% Verifying the output, weights and bias of the net after training.
Y=sim(netp,P)
W=netp.IW{1,1}
B=netp.b{1}

% Plotting data and Boundaries on the same plot
figure;
plotpv(P,T)
plotpc(W,B)
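The learnp rule that train applies here is simple enough to restate outside the toolbox. The NumPy sketch below runs the same update, e = t - hardlim(W*p + b), W <- W + e*p', b <- b + e, on the same ten patterns; it updates per example rather than per batch, so its epoch count need not match MATLAB's:

```python
import numpy as np

def hardlim(n):
    return (n >= 0).astype(float)

# Same ten patterns / targets as the MATLAB listing (columns are examples).
P = np.array([[0.1, 0.7, 0.8, 0.8, 1.0, 0.3, 0.0, -0.3, -0.5, -1.5],
              [1.2, 1.8, 1.6, 0.6, 0.8, 0.5, 0.2,  0.8, -1.5, -1.3]])
T = np.array([[0., 0., 0., 0., 0., 1., 1., 1., 1., 1.],
              [0., 0., 0., 1., 1., 0., 0., 0., 1., 1.]])

W = np.zeros((2, 2))
b = np.zeros((2, 1))
for epoch in range(1000):            # generous cap; stops once error-free
    mistakes = 0
    for k in range(P.shape[1]):
        p, tk = P[:, [k]], T[:, [k]]
        e = tk - hardlim(W @ p + b)  # perceptron error for this example
        if e.any():
            W += e @ p.T             # learnp weight update
            b += e                   # learnp bias update
            mistakes += 1
    if mistakes == 0:
        break

Y = hardlim(W @ P + b)               # classifies all ten patterns
```

The four groups are linearly separable, so the rule reaches zero error, matching the zero-performance training record in the solution.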
[Figure: classified patterns with the learned decision boundaries (plotpv/plotpc). Training record: 'Performance is 0, Goal is 0' after 11 epochs.]
Solution. 1-b)

clc;clear all;close all;

% Patterns
P(:,1)=[0.1 1.2]'; P(:,2)=[0.7 1.8]';P(:,3)=[0.8 1.6]'; % Group 1
P(:,4)=[0.8 0.6]'; P(:,5)=[1 0.8]'; % Group 2
P(:,6)=[0.3 0.5]'; P(:,7)=[0 0.2]';P(:,8)=[-0.3 0.8]'; % Group 3
P(:,9)=[-.5 -1.5]'; P(:,10)=[-1.5 -1.3]'; % Group 4

% Targets
T(:,1)=[0 0]'; T(:,2)=[0 0]';T(:,3)=[0 0]';
T(:,4)=[0 1]'; T(:,5)=[0 1]';
T(:,6)=[1 0]'; T(:,7)=[1 0]';T(:,8)=[1 0]';
T(:,9)=[1 1]'; T(:,10)=[1 1]';

% Training phase and verifying the output
W=T*pinv(P)
B=[0;0];
Y=hardlim(W*P)

% Plotting data and Boundaries on the same plot
plotpv(P,T)
plotpc(W,B)
[Figure: 'Vectors to be Classified': the patterns and the pseudoinverse solution's decision boundaries.]
Note that the result is not good: this one-shot Hebbian (pseudoinverse) rule could not achieve zero error because the patterns are not orthonormal.
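Why it cannot work here can be checked directly: W = T*pinv(P) is the best least-squares linear map through the origin, and no zero-bias linear boundary separates these groups. A NumPy sketch of the same computation:

```python
import numpy as np

# Same patterns / targets as the listing (columns are examples).
P = np.array([[0.1, 0.7, 0.8, 0.8, 1.0, 0.3, 0.0, -0.3, -0.5, -1.5],
              [1.2, 1.8, 1.6, 0.6, 0.8, 0.5, 0.2,  0.8, -1.5, -1.3]])
T = np.array([[0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 0, 0, 1, 1]], dtype=float)

W = T @ np.linalg.pinv(P)           # one-shot fit, as in W=T*pinv(P)
Y = (W @ P >= 0).astype(float)      # hardlim with zero bias (B=[0;0])
errors = int(np.abs(Y - T).sum())   # number of wrong output bits
```

Because the decision boundaries are forced through the origin (B = [0;0]), some patterns are always misclassified, regardless of W.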
Prepared by: Murad Abu-Khalaf
Sunday, October 24, 2004
Solution. 1-c):

clc;clear all;close all;
net.inputWeights{1}.learnFcn
W=net.IW{1,1}
B=net.b{1}

[Figure: classified patterns with decision boundaries, and the training record (log-scale performance) over 500 epochs.]
Solution. 1-d):

clc;clear all;close all;

% Patterns
P(:,1)=[0.1 1.2]'; P(:,2)=[0.7 1.8]';P(:,3)=[0.8 1.6]'; % Group 1
P(:,4)=[0.8 0.6]'; P(:,5)=[1.0 0.8]'; % Group 2
P(:,6)=[0.3 0.5]'; P(:,7)=[0.0 0.2]';P(:,8)=[-0.3 0.8]'; % Group 3
P(:,9)=[-0.5 -1.5]'; P(:,10)=[-1.5 -1.3]'; % Group 4

% Targets
T(:,1)=[-1 -1]'; T(:,2)=[-1 -1]';T(:,3)=[-1 -1]';
T(:,4)=[-1 1]'; T(:,5)=[-1 1]';
T(:,6)=[1 -1]'; T(:,7)=[1 -1]';T(:,8)=[1 -1]';
T(:,9)=[1 1]'; T(:,10)=[1 1]';

% Building a feedforward neural net
net=newff(minmax(P),[2],{'tansig'});
net.inputWeights{1}.learnFcn

% Verifying the output, weights and bias of the net before training.
Y=sim(net,P)
W=net.IW{1,1}
B=net.b{1}

% Training phase
net.trainParam.min_grad=1e-100;
net.trainParam.goal = 1e-15;
net.trainParam.epochs = 5000;
net = train(net,P,T);

% Verifying the output, weights and bias of the net after training.
Y=sim(net,P)
W=net.IW{1,1}
B=net.b{1}
hardlims(sim(net,P))
criterion=sum(sum(abs(T-Y)')');

% Plotting data and Boundaries on the same plot
figure;
T(:,1)=[0 0]'; T(:,2)=[0 0]';T(:,3)=[0 0]';
T(:,4)=[0 1]'; T(:,5)=[0 1]';
T(:,6)=[1 0]'; T(:,7)=[1 0]';T(:,8)=[1 0]';
T(:,9)=[1 1]'; T(:,10)=[1 1]';
plotpv(P,T)
plotpc(W,B)
legend(['criterion=' num2str(criterion)]);
[Figure: 'Vectors to be Classified': patterns and the learned decision boundaries; criterion=3.4528e-005.]

Excellent performance, better than in a); the decision boundaries are less sensitive to noise than in a).
[Figure: training record: performance (log scale) vs. epoch, goal reached after 14 epochs.]
Solution. 1-e)

clc;clear all;close all;

% Patterns
P(:,1)=[0.1 1.2]'; P(:,2)=[0.7 1.8]';P(:,3)=[0.8 1.6]'; % Group 1
P(:,4)=[0.8 0.6]'; P(:,5)=[1.0 0.8]'; % Group 2
P(:,6)=[0.3 0.5]'; P(:,7)=[0.0 0.2]';P(:,8)=[-0.3 0.8]'; % Group 3
P(:,9)=[-0.5 -1.5]'; P(:,10)=[-1.5 -1.3]'; % Group 4

% Targets
T(:,1)=[-1 -1]'; T(:,2)=[-1 -1]';T(:,3)=[-1 -1]';
T(:,4)=[-1 1]'; T(:,5)=[-1 1]';
T(:,6)=[1 -1]'; T(:,7)=[1 -1]';T(:,8)=[1 -1]';
T(:,9)=[1 1]'; T(:,10)=[1 1]';

% Building a feedforward neural net
net=newff(minmax(P),[10 2],{'radbas' 'tansig'});
net.inputWeights{1}.learnFcn

% Verifying the output, weights and bias of the net before training.
Y=sim(net,P)
W=net.IW{1,1}
B=net.b{1}

% Training phase
net.trainParam.goal = 0;
net.trainParam.epochs = 400;
net = train(net,P,T);

% Verifying the output, weights and bias of the net after training.
Y=sim(net,P)
W=net.IW{1,1}
B=net.b{1}
hardlims(sim(net,P))

% Plotting data and Boundaries on the same plot
criterion=sum(sum(abs(T-Y)')');
legend(['criterion=' num2str(criterion)]);
[Figure: training record: performance (log scale) vs. epoch, converged in 48 epochs; criterion=2.0558e-005.]
[Figure: Patterns 1, 2, and 3; their vector forms appear as P(:,1)-P(:,3) in the listing below.]
Solution. 2-a)

clc;clear all;close all;

% Patterns
P(:,1)= [1 -1 1 1]'; P(:,2)= [1 1 -1 1]';P(:,3)= [-1 -1 -1 1]';

% Targets
T(:,1)=[-1 -1]'; T(:,2)=[1 -1]';T(:,3)=[1 -1]';

% Building a perceptron neural net
netp=newp([-1 1;-1 1;-1 1;-1 1],2,'hardlims','learnp');

% Verifying the output, weights and bias of the net before training.
Y=sim(netp,P)
W=netp.IW{1,1}
B=netp.b{1}
res=Y-T

% Training phase
netp.trainParam.epochs = 20;
netp = train(netp,P,T);

% Verifying the output, weights and bias of the net after training.
Y=sim(netp,P)
W=netp.IW{1,1}
B=netp.b{1}
res=Y-T

% Plotting data and Boundaries on the same plot
criterion=sum(sum(abs(T-Y)')');
legend(['criterion=' num2str(criterion)]);

% Testing pt
pt=[1 -1 1 -1]';
Y=sim(netp,pt)

[Figure: training record: 'Performance is 0, Goal is 0', criterion=0, converged in 2 epochs.]
Solution. 2-b)
The test pattern produces t1, which corresponds to the first pattern.
Design a Hopfield network that will converge to the input patterns. Plot the input patterns. Test the
network with the input patterns, and then with the test matrix. Make the test matrix using the following
Matlab command: test=P+rand(size(P))/10; Plot the error points for both cases.
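The mechanism behind a Hopfield network can be sketched in a few lines: store patterns with the Hebbian outer-product rule, zero the self-connections, and iterate sign updates until a fixed point. The NumPy example below uses two illustrative 6-element patterns, not the assignment's data, and restores a pattern corrupted in one bit:

```python
import numpy as np

def hopfield_weights(patterns):
    """Hebbian outer-product rule; self-connections are zeroed."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    """Synchronous sign updates until a fixed point (or a step limit)."""
    x = x.copy()
    for _ in range(steps):
        nxt = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

# Two illustrative stored patterns (hypothetical, not the assignment's data).
p1 = np.array([ 1, -1,  1, -1,  1, -1])
p2 = np.array([ 1,  1,  1, -1, -1, -1])
W = hopfield_weights([p1, p2])

noisy = p1.copy()
noisy[0] = -1                    # flip one bit of p1
restored = recall(W, noisy)      # converges back to the stored pattern p1
```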
Solution. 3
clc;clear all;close all;

% Patterns
P(:,1)= [1 1]';   P(:,2)= [1 1]';
P(:,3)= [1 -1]';  P(:,4)= [1 -1]';
P(:,5)= [-1 1]';  P(:,6)= [-1 1]';
P(:,7)= [-1 -1]'; P(:,8)= [-1 -1]';

% Targets
T=P;

% Plotting trajectory
figure;hold on;axis([-2 2 -2 2]);
title('Convergence to Stored Patterns in Hopfield Network');
xlabel('P(1)');ylabel('P(2)');
plot(P(1,:),P(2,:),'ko','MarkerSize',10)
plot(Ai{1}(1,:),Ai{1}(2,:),'rx','MarkerSize',10)
legend('Stored Pattern','Distorted Pattern');
[Figure: 'Convergence to Stored Patterns in Hopfield Network': stored patterns (o) and distorted patterns (x) in the P(1)-P(2) plane.]
Solution. 4

clc;clear all;close all;

% Patterns and Targets
Ts=0.05;
t=0:Ts:4;
p=1;P=[p*ones(max(size(t)),1)';t];T=[0.5+sin(pi*p.*t)];
p=2;P=[P [p*ones(max(size(t)),1)';t]];T=[T 0.5+sin(pi*p.*t)];
p=3;P=[P [p*ones(max(size(t)),1)';t]];T=[T 0.5+sin(pi*p.*t)];
p=4;P=[P [p*ones(max(size(t)),1)';t]];T=[T 0.5+sin(pi*p.*t)];

% Building a feedforward neural net
net = newff([minmax(P)],[25 1],{'radbas' 'purelin'});
net.trainParam.goal= 0.0;
net.trainParam.epochs = 500;

% Train
net = train(net,P,T);

% Testing on p = 2
xlabel('time');ylabel('g(t)');
t=0:Ts:4;
p=2;Pp2=[p*ones(max(size(t)),1)';t];Tp2=0.5+sin(pi*p.*t);
Y=sim(net,Pp2);
subplot(2,1,1);hold on;plot(t,Y(1,:),'b');
subplot(2,1,1);plot(t,Tp2,'r');
subplot(2,1,2);plot(t,Y(1,:)-Tp2);

% Testing on p = 3
figure(4);hold on;axis([-2 2 -2 2]);
xlabel('time');ylabel('g(t)');
t=0:Ts:4;
p=3;Pp3=[p*ones(max(size(t)),1)';t];Tp3=0.5+sin(pi*p.*t);
Y=sim(net,Pp3);
subplot(2,1,1);hold on;plot(t,Y(1,:),'b');
subplot(2,1,1);plot(t,Tp3,'r');
subplot(2,1,2);plot(t,Y(1,:)-Tp3);
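The architecture above, a 'radbas' hidden layer feeding a 'purelin' output, is a radial-basis expansion with a linear readout, so its trained behavior can be approximated without backprop by fixing Gaussian features and solving the readout weights by least squares. A NumPy sketch on the same training set (the center positions and width below are illustrative choices, not the toolbox's):

```python
import numpy as np

# Training set, mirroring the MATLAB listing: inputs are (p, t) pairs and the
# target is g(t) = 0.5 + sin(pi*p*t) for p = 1..4.
Ts = 0.05
t = np.arange(0.0, 4.0 + Ts / 2, Ts)                  # t = 0:Ts:4 (81 samples)
P = np.vstack([np.column_stack([np.full_like(t, p), t]) for p in (1, 2, 3, 4)])
T = np.concatenate([0.5 + np.sin(np.pi * p * t) for p in (1, 2, 3, 4)])

# Hidden layer: Gaussian radial-basis features, a stand-in for the trained
# 'radbas' layer of newff (centers at every other training point).
centers = P[::2]
width = 0.2
d2 = ((P[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
H = np.exp(-d2 / (2 * width ** 2))
H = np.hstack([H, np.ones((len(P), 1))])              # bias for linear output

# Linear ('purelin') readout solved by least squares instead of backprop.
w, *_ = np.linalg.lstsq(H, T, rcond=None)
rmse = float(np.sqrt(np.mean((H @ w - T) ** 2)))      # small training residual
```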
[Figure: training record: performance (log scale) over 500 epochs.]
[Figures: for p = 1, 2, 3, and 4, the top panel plots the network output (blue) against the target 0.5+sin(pi*p*t) (red) for t in [0, 4]; the bottom panel plots the error Y - T.]