FFNN
Implement a simple feed-forward neural network with a single hidden layer to classify a
set of data points based on their features and labels.
Solution:
import numpy as np
# Example training data (XOR problem): features X and labels y
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(a):
    return a * (1 - a)  # a is a sigmoid output

np.random.seed(42)
# Define hyperparameters
input_size = 2
hidden_size = 4
output_size = 1
learning_rate = 0.1
num_epochs = 1000
# Initialize weights and biases
W1 = np.random.randn(input_size, hidden_size)
b1 = np.zeros((1, hidden_size))
W2 = np.random.randn(hidden_size, output_size)
b2 = np.zeros((1, output_size))
# Training loop
for epoch in range(num_epochs):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    loss = np.mean((output - y) ** 2)  # mean squared error
    # Backward pass (chain rule; constant factors folded into the learning rate)
    d_output = (output - y) * sigmoid_derivative(output)
    d_hidden = d_output @ W2.T * sigmoid_derivative(hidden)
    # Update weights and biases
    W2 -= learning_rate * hidden.T @ d_output
    b2 -= learning_rate * d_output.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ d_hidden
    b1 -= learning_rate * d_hidden.sum(axis=0, keepdims=True)
    if (epoch + 1) % 100 == 0:
        print(f"Epoch {epoch + 1}, loss: {loss:.4f}")
predictions = (output > 0.5).astype(int)
print(predictions)
In this code example, we first define the training data points (the feature matrix X) and
their corresponding labels (y).
Next, we define the sigmoid activation function and its derivative, which will be used in the
forward and backward passes of the neural network.
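The derivative relies on the convenient identity sigma'(x) = sigma(x) * (1 - sigma(x)). As a quick sanity check (not part of the exercise; the helper name sigmoid_derivative_from_input is illustrative), one can compare the analytic derivative against a central finite difference:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Analytic derivative expressed in terms of the raw input x
def sigmoid_derivative_from_input(x):
    s = sigmoid(x)
    return s * (1 - s)

# Central finite-difference approximation at a few test points
x = np.array([-2.0, 0.0, 1.5])
eps = 1e-6
numeric = (sigmoid(x + eps) - sigmoid(x - eps)) / (2 * eps)
print(np.allclose(sigmoid_derivative_from_input(x), numeric))  # True
```

At x = 0 both sides evaluate to 0.25, the maximum of the sigmoid's derivative.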
We set the random seed for reproducibility and define the hyperparameters such as the
learning rate and number of epochs.
We initialize the weights (W1, W2) and biases (b1, b2) with random values.
The training loop iterates over the specified number of epochs. In each iteration, it performs a
forward pass to compute the output of the hidden layer and the output layer. Then, it
computes the loss using the mean squared error. After that, it performs a backward pass to
compute the gradients of the weights and biases using the chain rule. Finally, it updates the
weights and biases based on the gradients and the learning rate.
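A standard way to confirm that a backward pass like this is correct is a gradient check: compare the backpropagated gradient with a central finite difference of the loss. The sketch below does this for one entry of W1, using a small made-up dataset (the data and the names mse_loss and grad_W1 are illustrative, not from the exercise). It keeps the exact 2/n factor from the mean squared error, which implementations often fold into the learning rate:

```python
import numpy as np

np.random.seed(0)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Illustrative data and parameters, used only for the check
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[1.0], [0.0], [1.0]])
W1 = np.random.randn(2, 4); b1 = np.zeros((1, 4))
W2 = np.random.randn(4, 1); b2 = np.zeros((1, 1))

def mse_loss(W1):
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    return np.mean((output - y) ** 2)

# Backpropagated gradient of the MSE loss with respect to W1,
# keeping the exact 2/n factor from the mean
hidden = sigmoid(X @ W1 + b1)
output = sigmoid(hidden @ W2 + b2)
n = y.size
d_output = (2.0 / n) * (output - y) * output * (1 - output)
d_hidden = d_output @ W2.T * hidden * (1 - hidden)
grad_W1 = X.T @ d_hidden

# Central finite difference on a single entry of W1
i, j, eps = 0, 2, 1e-6
W1_plus, W1_minus = W1.copy(), W1.copy()
W1_plus[i, j] += eps
W1_minus[i, j] -= eps
numeric = (mse_loss(W1_plus) - mse_loss(W1_minus)) / (2 * eps)
print(np.isclose(grad_W1[i, j], numeric))  # True
```

If the two numbers disagree by more than roughly 1e-6, there is almost certainly a bug in the backward pass.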