Ann 2221 3.ipynb - Colab

The document presents a Python implementation of a neural network that uses backpropagation to solve the XOR problem. It defines the sigmoid activation and its derivative, initializes random weights, and iteratively trains the network over 10,000 iterations, printing the mean absolute error every 1,000 iterations. The final output values show the network's predictions for the four XOR inputs after training.



Write a neural network and train it using backpropagation.

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Assumes x is already a sigmoid output, so this computes s * (1 - s)
    return x * (1 - x)

# XOR truth table: four input pairs and their targets
inputs = np.array([[0, 0],
                   [0, 1],
                   [1, 0],
                   [1, 1]])

expected_output = np.array([[0],
                            [1],
                            [1],
                            [0]])

np.random.seed(1)

# Random weights in [-1, 1): 2 inputs -> 2 hidden units -> 1 output (no bias terms)
weights0 = 2 * np.random.random((2, 2)) - 1
weights1 = 2 * np.random.random((2, 1)) - 1

num_iterations = 10000
lr = 1

for i in range(num_iterations):
    # Forward pass
    layer0 = inputs
    layer1 = sigmoid(np.dot(layer0, weights0))
    layer2 = sigmoid(np.dot(layer1, weights1))

    # Backward pass: error and delta at the output, then at the hidden layer
    layer2_error = expected_output - layer2
    layer2_delta = layer2_error * sigmoid_derivative(layer2)
    layer1_error = layer2_delta.dot(weights1.T)
    layer1_delta = layer1_error * sigmoid_derivative(layer1)

    # Weight updates
    weights1 += layer1.T.dot(layer2_delta) * lr
    weights0 += layer0.T.dot(layer1_delta) * lr

    if i % 1000 == 0:
        print("Error after {} iterations: {}".format(i, np.mean(np.abs(layer2_error))))

print("Final Values:")
print(layer2)
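Note that the error is defined as expected_output - layer2, so the += weight updates already move in the negative-gradient direction of the squared error; with the opposite sign convention for the error, the updates would be subtracted instead.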

Error after 0 iterations: 0.4993115461156098
Error after 1000 iterations: 0.2985255058351594
Error after 2000 iterations: 0.26474389606301
Error after 3000 iterations: 0.2502001031123378
Error after 4000 iterations: 0.24134872623191073
Error after 5000 iterations: 0.23514022766390627
Error after 6000 iterations: 0.23042991061348972
Error after 7000 iterations: 0.22667301824757588
Error after 8000 iterations: 0.22357082172432846
Error after 9000 iterations: 0.2209431280441855
Final Values:
[[0.04336566]
[0.86002551]
[0.86003252]
[0.55139487]]
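
The run plateaus around a mean error of 0.22, and the prediction of roughly 0.55 for the input [1, 1] is far from its target of 0. A likely cause is the lack of bias terms: without them, both hidden activations are fixed at 0.5 for the input [0, 0], which constrains what the network can represent. As a minimal sketch (the bias0 and bias1 variables are an addition for illustration, not part of the original notebook), the same network with per-layer biases typically drives all four outputs much closer to their targets:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
expected_output = np.array([[0], [1], [1], [0]])

np.random.seed(1)
weights0 = 2 * np.random.random((2, 2)) - 1
weights1 = 2 * np.random.random((2, 1)) - 1
bias0 = np.zeros((1, 2))  # hidden-layer bias (added for this sketch)
bias1 = np.zeros((1, 1))  # output-layer bias (added for this sketch)
lr = 1

for i in range(10000):
    layer1 = sigmoid(inputs.dot(weights0) + bias0)
    layer2 = sigmoid(layer1.dot(weights1) + bias1)

    layer2_delta = (expected_output - layer2) * sigmoid_derivative(layer2)
    layer1_delta = layer2_delta.dot(weights1.T) * sigmoid_derivative(layer1)

    weights1 += layer1.T.dot(layer2_delta) * lr
    bias1 += layer2_delta.sum(axis=0, keepdims=True) * lr
    weights0 += inputs.T.dot(layer1_delta) * lr
    bias0 += layer1_delta.sum(axis=0, keepdims=True) * lr

print(layer2)

Convergence still depends on the random seed and learning rate; a larger hidden layer (for example, 4 units) is another common fix for XOR.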

