

DEPARTMENT OF

COMPUTER SCIENCE & ENGINEERING

Experiment-3.2

Student Name: Mudit UID: 21BCS10092


Branch: CSE Section/Group: SC-906-A
Semester: 6th Date: 03-04-24
Subject Name: Soft Computing LAB Subject Code: 21CSP-377

1. Aim: To write a MATLAB program to train and test a back-propagation neural network for
the generation of the XOR function.

2. Objective: Generation of the XOR function using the back-propagation algorithm.


3. Algorithm:
1. Set up input and output data: Define the input-output pairs for the XOR function.
2. Initialize the neural network: Create a neural network with two input neurons, two hidden
neurons, and one output neuron. Initialize random weights and biases.
3. Train the neural network:
- Repeat until the mean squared error falls below a chosen threshold.
- Feed forward: pass the inputs through the network to obtain predicted outputs.
- Compute the error: measure the mean squared difference between the predicted and target outputs.
- Backpropagation: propagate the error backwards to compute corrections for the weights and biases.
- Update the weights and biases using these corrections scaled by the learning rate.
4. Test the trained network: Use the trained network to predict the output for the XOR inputs.
5. Display the predicted output: Show the predicted outputs obtained from the trained neural
network.
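The steps above can also be sketched in NumPy as a rough cross-check of the MATLAB script in Section 4 (this is an illustration, not part of the submitted program; the function name train_xor, the learning rate of 1.0, and the seed-retry loop are choices made here, and restarts are included because gradient descent on XOR with two hidden neurons can stall in a local minimum):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(seed, lr=1.0, max_epochs=20000, tol=0.01):
    rng = np.random.default_rng(seed)
    # Step 1: input-output pairs for XOR
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])
    # Step 2: 2-2-1 network with random weights and biases
    W1, b1 = rng.standard_normal((2, 2)), rng.standard_normal((1, 2))
    W2, b2 = rng.standard_normal((2, 1)), rng.standard_normal((1, 1))
    # Step 3: train until the mean squared error is small enough
    for _ in range(max_epochs):
        H = sigmoid(X @ W1 + b1)                  # feed forward
        Y = sigmoid(H @ W2 + b2)
        err = T - Y
        if np.mean(err ** 2) < tol:
            break
        d_out = err * Y * (1.0 - Y)               # backpropagation
        d_hid = (d_out @ W2.T) * H * (1.0 - H)
        W2 += lr * H.T @ d_out                    # update weights and biases
        b2 += lr * d_out.sum(axis=0, keepdims=True)
        W1 += lr * X.T @ d_hid
        b1 += lr * d_hid.sum(axis=0, keepdims=True)
    # Step 4: test the trained network
    Y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return Y, np.mean((T - Y) ** 2)

# Retry a few seeds in case one initialisation gets stuck
for seed in range(10):
    pred, mse = train_xor(seed)
    if mse < 0.01:
        break

# Step 5: display the predicted output (rounds to [0 1 1 0] once converged)
print(np.round(pred).ravel())
```

Note that an MSE below 0.01 forces every individual prediction to be within 0.2 of its target, so rounding the converged outputs recovers the exact XOR truth table.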

4. Script and output:

% Training data: input-output pairs for the XOR function
inputs = [0 0; 0 1; 1 0; 1 1];
targets = [0; 1; 1; 0];

% Network architecture and learning rate
input_neurons = 2;
hidden_neurons = 2;
output_neurons = 1;
learning_rate = 0.1;

% Random initialisation of weights and biases
hidden_weights = randn(input_neurons, hidden_neurons);
hidden_bias = randn(1, hidden_neurons);
output_weights = randn(hidden_neurons, output_neurons);
output_bias = randn(1, output_neurons);

% Train until the mean squared error drops below the threshold
error_threshold = 0.01;
error = Inf;
while error > error_threshold
    % Feed forward
    hidden_activation = sigmoid(inputs * hidden_weights + hidden_bias);
    output_activation = sigmoid(hidden_activation * output_weights + output_bias);

    % Mean squared error over all training examples
    error = sum((targets - output_activation).^2) / numel(targets);

    % Backpropagation: deltas for the output and hidden layers
    output_error = targets - output_activation;
    output_delta = output_error .* sigmoid_derivative(output_activation);
    hidden_error = output_delta * output_weights';
    hidden_delta = hidden_error .* sigmoid_derivative(hidden_activation);

    % Update weights and biases
    output_weights = output_weights + learning_rate * hidden_activation' * output_delta;
    output_bias = output_bias + learning_rate * sum(output_delta);
    hidden_weights = hidden_weights + learning_rate * inputs' * hidden_delta;
    hidden_bias = hidden_bias + learning_rate * sum(hidden_delta);
end

% Test the trained network on the XOR inputs
disp("Mudit 21BCS10092");
hidden_activation = sigmoid(inputs * hidden_weights + hidden_bias);
output_activation = sigmoid(hidden_activation * output_weights + output_bias);
disp('Predicted output:');
disp(output_activation);

% Sigmoid activation function
function sig = sigmoid(x)
    sig = 1 ./ (1 + exp(-x));
end

% Derivative of the sigmoid, expressed in terms of the activation value
function sig_prime = sigmoid_derivative(x)
    sig_prime = x .* (1 - x);
end
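A detail worth noting about sigmoid_derivative above: it is passed the neuron's activation, not its pre-activation input. Since the sigmoid satisfies s'(z) = s(z)(1 - s(z)), the derivative can be computed as x .* (1 - x) when x is already a sigmoid output. A small NumPy check against a central finite difference (the helper names here mirror the MATLAB ones but are otherwise illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(a):
    # expects an activation a = sigmoid(z), as in the MATLAB helper
    return a * (1.0 - a)

z = np.linspace(-4.0, 4.0, 9)
analytic = sigmoid_derivative(sigmoid(z))

# central finite difference of sigmoid at the same points
h = 1e-5
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2.0 * h)

print(np.max(np.abs(analytic - numeric)))  # agreement to ~1e-11
```

This identity is why the training loop can reuse hidden_activation and output_activation directly when forming the deltas, without recomputing any pre-activation values.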

5. Output:
