Neural Networks MATH Explained
[Figure: two-layer network diagram — inputs X1 and X2 feed a hidden layer through weights W_xh and bias b_h; the hidden layer feeds the single output Y (0 or 1) through weights W_hy and bias b_y.]
The Math

Forward propagation

The hidden activations and the network output are computed by two affine maps, each followed by a sigmoid:

\[
h = \sigma(X W_{xh} + b_h), \qquad \hat{y} = \sigma(h W_{hy} + b_y)
\]

where \( \sigma(z) = \frac{1}{1 + e^{-z}} \) is the sigmoid activation.
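A minimal NumPy sketch of this forward pass (the shapes, data, and variable names here are illustrative, not from the notebook):

import numpy as np

def sigmoid(z):
    # logistic sigmoid, applied elementwise
    return 1.0 / (1.0 + np.exp(-z))

# illustrative setup: 4 samples, 3 input features, 4 hidden units, 1 output
X = np.random.randn(4, 3)
y = np.random.randint(0, 2, size=(4, 1)).astype(float)
W_xh, b_h = np.random.randn(3, 4), np.zeros(4)
W_hy, b_y = np.random.randn(4, 1), np.zeros(1)

h = sigmoid(np.dot(X, W_xh) + b_h)       # hidden activations, shape (4, 4)
y_hat = sigmoid(np.dot(h, W_hy) + b_y)   # network output, shape (4, 1)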
The Math contd.

Loss Function

\[
L(y, \hat{y}) = \sum_i \left( y_i - \hat{y}_i \right)^2
\]

where \( y \) is the target vector and \( \hat{y} \) is the network's prediction (the sum-of-squared-errors loss).
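Continuing the sketch above, the loss is a one-liner:

loss = np.sum((y - y_hat) ** 2)   # sum-of-squared-errors over all samples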
The Math contd.
Input Outer Weight Inner weight
matrix matrix
Output
The Math contd.

Weight Derivatives and Bias Terms

Applying the chain rule to the loss gives the weight updates. Writing \( z_h = X W_{xh} + b_h \) and \( z_y = h W_{hy} + b_y \):

\[
\delta_y = 2\,(y - \hat{y}) \odot \sigma'(z_y), \qquad \Delta W_{hy} = h^{T} \delta_y
\]
\[
\delta_h = \left( \delta_y W_{hy}^{T} \right) \odot \sigma'(z_h), \qquad \Delta W_{xh} = X^{T} \delta_h
\]

Since \( \partial L / \partial \hat{y} = -2(y - \hat{y}) \), each \( \Delta W \) is the negative gradient of the loss, so the update \( W \leftarrow W + \Delta W \) moves downhill on \( L \). The bias updates are the corresponding deltas summed over the samples (the implementation below omits the bias terms for simplicity).
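These expressions can be sanity-checked against a finite-difference approximation. A sketch, reusing the forward-pass variables defined earlier (all names here are illustrative):

eps = 1e-5
delta_y = 2 * (y - y_hat) * y_hat * (1 - y_hat)   # sigma'(z_y) via the activation
analytic = np.dot(h.T, delta_y)                   # Delta W_hy from the formula above
num_grad = np.zeros_like(W_hy)
for i in range(W_hy.shape[0]):
    for j in range(W_hy.shape[1]):
        W_plus, W_minus = W_hy.copy(), W_hy.copy()
        W_plus[i, j] += eps
        W_minus[i, j] -= eps
        loss_plus = np.sum((y - sigmoid(np.dot(h, W_plus) + b_y)) ** 2)
        loss_minus = np.sum((y - sigmoid(np.dot(h, W_minus) + b_y)) ** 2)
        num_grad[i, j] = (loss_plus - loss_minus) / (2 * eps)
# num_grad should closely match -analytic (Delta W is the negative gradient)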
The Math contd.

A convenient property of the sigmoid is that its derivative can be expressed through its own output:

\[
\sigma'(z) = \sigma(z)\left(1 - \sigma(z)\right)
\]

The helper below exploits this, so its argument x is assumed to already be a sigmoid activation rather than a raw pre-activation:
def sigmoid_derivative(x):
    # x is a sigmoid output, so this equals sigma'(z) = sigma(z) * (1 - sigma(z))
    return x * (1.0 - x)
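A quick numerical check of this identity (reusing sigmoid from the sketch above; the test point is arbitrary):

z = 0.3
eps = 1e-6
analytic = sigmoid_derivative(sigmoid(z))                   # pass the activation, not z
numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)
assert abs(analytic - numeric) < 1e-8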
Implement a basic two-layer NN with a single output: a hidden layer with 4 nodes.

Input-to-hidden weights: weights1 has shape [input dim of X, 4]. Hidden-to-output weights: weights2 has shape [4, 1].
class NeuralNetwork:
    def __init__(self, x, y):
        self.input = x
        # inner weights W_xh: [input dim, 4]; outer weights W_hy: [4, 1]
        self.weights1 = np.random.randn(self.input.shape[1], 4)
        self.weights2 = np.random.randn(4, 1)
        self.y = y
        self.output = np.zeros(self.y.shape)

    def forwardprop(self):
        # hidden activations and output, per the forward-propagation equations
        self.layer1 = sigmoid(np.dot(self.input, self.weights1))
        self.output = sigmoid(np.dot(self.layer1, self.weights2))

    def backprop(self):
        # Chain rule applied for loss derivative w.r.t. the W_hy and W_xh weight matrices.
        # delta2 is the negative gradient of the SSE loss at the output
        delta2 = 2 * (self.y - self.output) * sigmoid_derivative(self.output)
        # compute derivatives for the weights2 (outer) matrix
        d_weights2 = np.dot(self.layer1.T, delta2)
        # propagate the error back through weights2 to the hidden layer
        delta1 = np.dot(delta2, self.weights2.T) * sigmoid_derivative(self.layer1)
        d_weights1 = np.dot(self.input.T, delta1)
        # update weights: adding the negative gradient performs gradient descent
        # (no explicit learning rate; it is effectively 1)
        self.weights1 += d_weights1
        self.weights2 += d_weights2
# Instantiate the network (X and y are defined earlier in the notebook; not shown in this excerpt)
nn = NeuralNetwork(X, y)
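The training data and loop do not survive in this excerpt. A minimal sketch that would produce a four-row output like the one below, assuming the XOR-style toy dataset this example is typically paired with (the dataset and iteration count are assumptions):

X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

nn = NeuralNetwork(X, y)
for _ in range(1500):    # illustrative number of training iterations
    nn.forwardprop()
    nn.backprop()
print(nn.output)         # predictions approach [0, 1, 1, 0]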
[[0.01431983]
[0.97929455]
[0.97575642]
[0.02558122]]