Lab 4 Etapi 2.mlx Part

This document defines a distributed delay neural network with 1 input, 2 layers, and 1 output. It generates input and target sequences, trains the network with Bayesian regularization backpropagation (trainbr) for up to 100 epochs with a performance goal of 1e-5, and evaluates the trained network on the input data. The root mean squared error of the network outputs versus the targets is 0.2940.


% Segment 1: sine wave, labelled as class -1
k1 = 0:0.01:1;
p1 = sin(4 * pi * k1);
t1(1:length(k1)) = -1;
% Segment 2: frequency-modulated sine, labelled as class +1
k2 = 1.86:0.01:3.86;
p2 = sin(sin(k2).*k2.^2+5*k2)

p2 = 1×201
0.0495 0.1248 0.1991 0.2718 0.3427 0.4113 0.4773 0.5403

t2(1:length(k2)) = 1;
% Training sequence: R(i) repetitions of segment 1 before each occurrence of segment 2
R = [4,3,0];
P = [repmat(p1,1,R(1)),p2,repmat(p1,1,R(2)),p2,repmat(p1,1,R(3)),p2];
T = [repmat(t1,1,R(1)),t2,repmat(t1,1,R(2)),t2,repmat(t1,1,R(3)),t2];
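
Before training it is worth sanity-checking the composite sequence. A minimal sketch (plain plotting of the P and T built above, nothing toolbox-specific):

% Sketch: overlay the composite input sequence and its class labels
figure;
plot(P); hold on;
plot(T,'--'); hold off;
grid on; legend('input P','target T');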
% Distributed delay network: tap delays 0:4 on input and layer weights, 8 hidden neurons
net = distdelaynet({0:4,0:4},8);
display(net);

net =

Neural Network

name: 'Distributed Delay Neural Network'
userdata: (your custom info)

dimensions:

numInputs: 1
numLayers: 2
numOutputs: 1
numInputDelays: 4
numLayerDelays: 4
numFeedbackDelays: 4
numWeightElements: 8
sampleTime: 1

connections:

biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]

subobjects:

input: Equivalent to inputs{1}
output: Equivalent to outputs{2}
inputs: {1x1 cell array of 1 input}
layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}

functions:

adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: 'dividerand'

divideParam: .trainRatio, .valRatio, .testRatio
divideMode: 'time'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization
plotFcns: {'plotperform', 'plottrainstate', 'ploterrhist',
'plotregression', 'plotresponse', 'ploterrcorr',
'plotinerrcorr'}
plotParams: {1x7 cell array of 7 params}
trainFcn: 'trainlm'
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max

weight and bias values:

IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors

methods:

adapt: Learn while in continuous use
configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
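
In the distdelaynet call above, the first argument {0:4,0:4} sets the tap delays on the input weight and on the layer weight (delays 0 through 4 on each), and the second argument sets the hidden layer to 8 neurons; this matches the numInputDelays, numLayerDelays, and dimensions reported in the display. A minimal sketch to confirm the configuration from the standard network properties:

% Sketch: read the delay and size configuration back from the net object
net.inputWeights{1,1}.delays   % tap delays on the input weight (0:4)
net.layerWeights{2,1}.delays   % tap delays on the layer weight (0:4)
net.layers{1}.size             % hidden layer size (8)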

% Shift sequences and extract initial delay states for the tapped delay lines
[Ps,Pi,Ai,Ts] = preparets(net,con2seq(P),con2seq(T));
net.trainFcn = 'trainbr';       % Bayesian regularization backpropagation
net.trainParam.epochs = 100;
net.trainParam.goal = 1e-5;
net = init(net);                % reinitialize weights and biases
net = train(net,Ps,Ts);
Y = net(Ps,Pi,Ai);              % simulate the trained network
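
preparets shifts the sequences so that the earliest samples become the initial delay states Pi and Ai rather than training targets; exactly how many steps are consumed depends on the network's delay configuration. A minimal sketch to inspect the bookkeeping:

% Sketch: compare sequence lengths before and after preparets
numel(con2seq(P))   % raw sequence length
numel(Ps)           % shifted sequence actually used for training
size(Pi)            % initial input-delay states (numInputs x numInputDelays)
size(Ai)            % initial layer-delay states (numLayers x numLayerDelays)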

W = net.IW{1}

W = 8×5
-3.2007 -1.1119 -0.1764 1.2976 3.1607
-2.3329 -2.1010 0.0572 1.3002 1.3344
-1.5540 1.1652 2.6261 0.6276 -4.2965
2.4393 1.9143 0.6022 -1.0530 -2.5124
-1.4265 0.2956 1.3185 0.1702 -2.9374
1.1486 -1.5957 -2.9059 -2.1267 3.4036
5.3593 2.4238 0.0411 -3.1355 -4.5925
5.5014 3.3610 0.4847 -2.8451 -6.4112

LW = net.LW{2,1}

LW = 1×40
3.3651 -4.7345 -2.7035 1.6227 -1.1030 -1.4919 -4.6019 1.7306
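
These shapes follow directly from the architecture: IW{1} is 8×5 because each of the 8 hidden neurons sees the current input plus its 4 delayed copies (delays 0:4), and LW{2,1} is 1×40 because the single output neuron sees all 8 hidden outputs at each of the 5 tap delays. A minimal sketch checking this:

% Sketch: weight matrix shapes = neurons x (sources x tap delays)
size(net.IW{1})     % expected [8 5]:  8 neurons, 1 input x 5 delays
size(net.LW{2,1})   % expected [1 40]: 1 output, 8 neurons x 5 delays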

b1 = net.b{1}

b1 = 8×1
-0.9468
0.1339
-1.2418
-1.5586
2.2757
1.8330
-2.4115
2.8177

b2 = net.b{2}

b2 = -1.5148

error = cell2mat(Y)-cell2mat(Ts);
mse_error = sqrt(mse(error))    % note: sqrt of the MSE, i.e. the RMSE

mse_error = 0.2940
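
Since the targets are the class labels ±1, a thresholded accuracy is a more direct quality measure than the raw error. A minimal sketch, assuming Y and Ts as computed above:

% Sketch: classification accuracy from the signs of the network outputs
Yc = sign(cell2mat(Y));              % threshold outputs at zero
Tc = cell2mat(Ts);
accuracy = sum(Yc == Tc)/numel(Tc)   % fraction of correctly labelled samples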

X = 1:length(Y);
plot(X,cell2mat(Ts),X,cell2mat(Y)), grid;   % reference vs. network output
legend('reference','output');

plot(X,error), grid;                        % pointwise error
legend('error');
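
The toolbox plot functions listed under plotFcns in the network display can produce similar diagnostics with less manual work. A minimal sketch using two of them:

% Sketch: built-in time-series diagnostics from net.plotFcns
e = gsubtract(Ts,Y);   % cell-array error, equivalent to Ts - Y
plotresponse(Ts,Y)     % targets vs. outputs over time with error trace
ploterrcorr(e)         % autocorrelation of the prediction error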

