Multi-Layer Perceptron
Neural Network
• Weight initialization
– It is necessary to set initial weights for the first forward
pass. Two basic options are to set the weights to zero or to
randomize them.
– However, naive choices can result in vanishing or exploding
gradients, which make the model difficult to train.
– To mitigate this problem, you can use a heuristic (a
formula tied to the number of neurons in adjacent layers) to
determine the scale of the initial weights.
– A common heuristic used with the Tanh activation is
called Xavier (Glorot) initialization.
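The heuristic above can be sketched as follows. This is a minimal illustration of Xavier (Glorot) uniform initialization, where weights are drawn from a uniform range whose limit depends on the fan-in and fan-out of the layer; the function name and layer sizes are illustrative, not part of the original slides.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Draw a (fan_in, fan_out) weight matrix from
    U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))."""
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Example: a layer with 256 inputs and 128 outputs.
W = xavier_init(256, 128)
print(W.shape)  # (256, 128)
```

Keeping the variance of activations roughly constant across layers is what prevents the gradients from shrinking or blowing up during the first training steps.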
Hyperparameters of the training algorithm