ANN notes
SUBJECT: Artificial Neural Networks
SECTION: Alpha
dataset.
Hidden Layers – Number of layers between input and output.
9. Forward Propagation
The process of computing the network's output from its inputs by applying weights and activation functions layer by layer.
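A minimal sketch of forward propagation through one hidden layer, assuming NumPy and sigmoid activations; the names (forward, W1, b1, W2, b2) and the layer sizes are illustrative choices, not taken from the notes.

    import numpy as np

    def sigmoid(z):
        # Squash pre-activations into the range (0, 1).
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x, W1, b1, W2, b2):
        # Hidden layer: weighted sum of inputs plus bias, then activation.
        h = sigmoid(W1 @ x + b1)
        # Output layer: the same pattern applied to the hidden activations.
        return sigmoid(W2 @ h + b2)

    # Illustrative shapes: 3 inputs, 4 hidden units, 1 output.
    rng = np.random.default_rng(0)
    x = rng.normal(size=3)
    W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
    W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)
    print(forward(x, W1, b1, W2, b2))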
10. Backpropagation
The process of propagating the output error backward through the network and updating weights.
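A hedged sketch of one backpropagation step for the same two-layer sigmoid network, assuming a squared-error loss and plain gradient descent; backprop_step and the learning rate are illustrative, not from the notes.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop_step(x, y, W1, b1, W2, b2, lr=0.1):
        # Forward pass, keeping intermediates for the backward pass.
        h = sigmoid(W1 @ x + b1)          # hidden activations
        y_hat = sigmoid(W2 @ h + b2)      # network output
        # Backward pass via the chain rule; sigmoid'(z) = s * (1 - s).
        delta2 = (y_hat - y) * y_hat * (1 - y_hat)   # output-layer error
        delta1 = (W2.T @ delta2) * h * (1 - h)       # hidden-layer error
        # Gradient-descent update: move weights against the gradient.
        W2 -= lr * np.outer(delta2, h); b2 -= lr * delta2
        W1 -= lr * np.outer(delta1, x); b1 -= lr * delta1
        return W1, b1, W2, b2

    # One update on toy data (shapes match the forward sketch above).
    rng = np.random.default_rng(1)
    x, y = rng.normal(size=3), np.array([1.0])
    W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
    W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)
    W1, b1, W2, b2 = backprop_step(x, y, W1, b1, W2, b2)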
Normalization: Scaling inputs to a common range so training is faster and more stable (see the sketch after this list).
Overfitting: Model memorizes the training data and performs poorly on new data.
Underfitting: Model is too simple and doesn't learn the data well.
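As referenced above, a small sketch of min-max normalization, assuming NumPy; min_max_normalize is an illustrative name, and the divide-by-zero guard for constant columns is one possible choice.

    import numpy as np

    def min_max_normalize(X):
        # Rescale each feature column to the [0, 1] range.
        X = np.asarray(X, dtype=float)
        mins, maxs = X.min(axis=0), X.max(axis=0)
        span = np.where(maxs > mins, maxs - mins, 1.0)  # avoid divide-by-zero
        return (X - mins) / span

    # Features with very different scales become comparable.
    X = np.array([[1.0, 200.0], [2.0, 400.0], [3.0, 600.0]])
    print(min_max_normalize(X))  # both columns now span 0..1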
6. Leaky ReLU
A modified version of ReLU that allows small negative values instead of zero, so negative inputs still pass a weak gradient and units don't "die".
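A one-function sketch of Leaky ReLU, assuming NumPy; the slope alpha = 0.01 is a common default, not something specified in the notes.

    import numpy as np

    def leaky_relu(x, alpha=0.01):
        # Positive inputs pass through; negative inputs are scaled by alpha.
        x = np.asarray(x, dtype=float)
        return np.where(x > 0, x, alpha * x)

    print(leaky_relu([-2.0, -0.5, 0.0, 3.0]))  # [-0.02 -0.005 0. 3.]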
9. Swish Function
A self-gated activation function proposed by Google researchers, defined as swish(x) = x * sigmoid(x).
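A minimal sketch of the Swish function under the definition above, assuming NumPy; swish is an illustrative name.

    import numpy as np

    def swish(x):
        # Each input gates itself through its own sigmoid: x * sigmoid(x).
        x = np.asarray(x, dtype=float)
        return x / (1.0 + np.exp(-x))

    print(swish([-2.0, 0.0, 2.0]))  # approx [-0.238 0. 1.762]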