ANN Practical 5

This document outlines a Python implementation of an Artificial Neural Network (ANN) trained with forward and backward propagation. It defines a SimpleNN class with methods for the forward pass, the backward pass, and training on a given dataset, and demonstrates the network learning the XOR problem, printing predictions after training.


PRACTICAL 5

Problem Statement:- Implement the training process of an Artificial Neural Network in Python using forward propagation and backpropagation.

The listing below defines a network with a single hidden layer and sigmoid activations. Each training epoch runs a forward pass to compute the outputs, then a backward pass that applies the chain rule to the mean squared error loss and updates the weights by gradient descent; the network is trained on the XOR truth table.

CODE:-
import numpy as np

# Sigmoid activation function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Derivative of the sigmoid, written in terms of the sigmoid's output
def sigmoid_derivative(x):
    return x * (1 - x)
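A note on sigmoid_derivative: it expects the sigmoid's output, not its input. Because sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), passing an activation a = sigmoid(x) makes a * (1 - a) the correct derivative; this is why the backward pass below applies it to self.A1 and self.A2 rather than self.Z1 and self.Z2.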

# Two-layer neural network: one hidden layer, sigmoid activations throughout
class SimpleNN:
    def __init__(self, input_size, hidden_size, output_size):
        # Random initial weights (unseeded, so results vary between runs)
        # and zero biases
        self.W1 = np.random.randn(input_size, hidden_size)
        self.b1 = np.zeros((1, hidden_size))
        self.W2 = np.random.randn(hidden_size, output_size)
        self.b2 = np.zeros((1, output_size))

    def forward(self, X):
        # Forward pass: cache pre-activations (Z) and activations (A)
        # for reuse in the backward pass
        self.Z1 = np.dot(X, self.W1) + self.b1        # hidden pre-activation
        self.A1 = sigmoid(self.Z1)                    # hidden activation
        self.Z2 = np.dot(self.A1, self.W2) + self.b2  # output pre-activation
        self.A2 = sigmoid(self.Z2)                    # network output
        return self.A2
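For the XOR data used below, X has shape (4, 2); with hidden_size=4, the hidden activation self.A1 has shape (4, 4) and the output self.A2 has shape (4, 1), one prediction per training example.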
    def backward(self, X, y, learning_rate):
        m = X.shape[0]  # number of training examples

        # Output-layer gradients (chain rule through the squared error and sigmoid)
        dZ2 = (self.A2 - y) * sigmoid_derivative(self.A2)
        dW2 = np.dot(self.A1.T, dZ2) / m
        db2 = np.sum(dZ2, axis=0, keepdims=True) / m

        # Hidden-layer gradients (error propagated back through W2)
        dA1 = np.dot(dZ2, self.W2.T)
        dZ1 = dA1 * sigmoid_derivative(self.A1)
        dW1 = np.dot(X.T, dZ1) / m
        db1 = np.sum(dZ1, axis=0, keepdims=True) / m

        # Gradient descent update
        self.W1 -= learning_rate * dW1
        self.b1 -= learning_rate * db1
        self.W2 -= learning_rate * dW2
        self.b2 -= learning_rate * db2
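These gradients follow from applying the chain rule to the mean squared error loss: dZ2 = (self.A2 - y) * sigmoid_derivative(self.A2) is the error signal at the output (the constant factor of 2 from differentiating the square is absorbed into the learning rate), np.dot(dZ2, self.W2.T) carries it back through the output weights, and each weight gradient is averaged over the m training examples before the gradient descent step.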

    def train(self, X, y, epochs, learning_rate):
        for epoch in range(epochs):
            self.forward(X)                     # forward pass
            self.backward(X, y, learning_rate)  # backward pass and weight update
            if epoch % 1000 == 0:               # log progress every 1000 epochs
                loss = np.mean((y - self.A2) ** 2)  # mean squared error
                print(f"Epoch {epoch}, Loss: {loss}")

# XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # inputs
y = np.array([[0], [1], [1], [0]])              # targets

nn = SimpleNN(input_size=2, hidden_size=4, output_size=1)
nn.train(X, y, epochs=25000, learning_rate=0.1)

print("\nPredictions after training:")
print(nn.forward(X))
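The raw outputs are sigmoid values between 0 and 1 rather than hard class labels. As a small, hypothetical addition (not part of the original listing), rounding them makes the comparison with the XOR targets explicit:

print("Rounded predictions:", np.round(nn.forward(X)).astype(int).ravel())

With enough training, the rounded predictions typically match the XOR targets [0, 1, 1, 0].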

OUTPUT:-
