Unit 5 - Notes
In a fully connected deep neural network, there is an input layer followed by one or more
hidden layers connected one after the other. Each neuron receives input from the
neurons in the previous layer (or from the input layer). The output of one neuron becomes
the input to neurons in the next layer, and this process continues until the final
layer produces the output of the network.
https://www.geeksforgeeks.org/introduction-deep-learning/
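To make the layer-by-layer flow concrete, here is a minimal sketch of a fully connected network's forward pass. The layer sizes, random weights, and use of NumPy are assumptions made for illustration, not part of the notes.

import numpy as np

def relu(z):
    # ReLU activation: keeps positive values, zeroes out negatives
    return np.maximum(0, z)

# Assumed layer sizes: 3 inputs -> 4 hidden neurons -> 2 outputs
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # hidden-layer weights and biases
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # output-layer weights and biases

x = np.array([0.5, -1.0, 2.0])                  # example input vector

# Each layer takes the previous layer's outputs, computes a weighted sum,
# adds the bias, and applies an activation function
h = relu(W1 @ x + b1)   # hidden-layer outputs
y = W2 @ h + b2         # final layer produces the output of the network
print(y)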
Activation Function:
It decides whether a neuron should be activated or not by calculating the weighted sum of its inputs and further
adding a bias to it. By introducing non-linearity, it also allows the model to learn complex patterns.
Weights: Parameters that are learned during training. They represent the strength of the
connection between neurons.
Biases: Additional learned parameters added to the weighted sum; they shift a neuron's output and allow the model to fit the data better.
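A small sketch of what a single neuron computes using the weighted sum, bias, and activation described above. The specific input values, weights, and the choice of sigmoid are illustrative assumptions.

import math

# Assumed example values for a single neuron with three inputs
inputs  = [0.2, 0.6, -0.4]
weights = [0.5, -0.3, 0.8]   # strength of each connection (learned in training)
bias    = 0.1                # learned offset added to the weighted sum

# Weighted sum of the inputs, plus the bias
z = sum(w * x for w, x in zip(weights, inputs)) + bias

# The activation function decides how strongly the neuron is activated
output = 1 / (1 + math.exp(-z))   # sigmoid chosen as an example
print(z, output)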
The following are popular activation functions used in deep learning (sketched below):
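The notes do not list the functions themselves at this point, so the sketch below fills in the ones most commonly cited (sigmoid, tanh, ReLU, and softmax); implementing them with NumPy is an assumption for the example.

import numpy as np

def sigmoid(z):
    # Squashes any value into the range (0, 1)
    return 1 / (1 + np.exp(-z))

def tanh(z):
    # Squashes any value into the range (-1, 1)
    return np.tanh(z)

def relu(z):
    # Keeps positive values, replaces negatives with 0
    return np.maximum(0, z)

def softmax(z):
    # Converts a vector of scores into probabilities that sum to 1
    e = np.exp(z - np.max(z))   # subtract the max for numerical stability
    return e / e.sum()

z = np.array([-2.0, 0.0, 3.0])
print(sigmoid(z), tanh(z), relu(z), softmax(z), sep="\n")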
During forward propagation, the input is fed into the neural network, and the network calculates the
output. During backward propagation, the error between the predicted output and the actual output is
calculated, and the weights and biases of each neuron are adjusted to reduce the error.
Forward Propagation
● The process where input data passes through the network layer by layer to generate an output.
Backward Propagation
● The process of updating the weights and biases by propagating the error back through the
network.
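As a rough illustration of these two steps, the sketch below trains a single sigmoid neuron with gradient descent. The toy data, squared-error loss, and learning rate are assumptions made for the example, not part of the notes.

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Assumed toy data: one input feature and one target value
x, target = np.array([0.5]), 1.0
w, b = np.array([0.1]), 0.0      # initial weight and bias
lr = 0.5                         # assumed learning rate

for step in range(20):
    # Forward propagation: input -> weighted sum -> activation -> prediction
    z = w @ x + b
    pred = sigmoid(z)
    error = pred - target              # difference from the actual output

    # Backward propagation: propagate the error back to get gradients
    # of the squared-error loss with respect to w and b (chain rule)
    dz = error * pred * (1 - pred)     # through the sigmoid
    dw, db = dz * x, dz

    # Adjust the weight and bias to reduce the error
    w -= lr * dw
    b -= lr * db

print(w, b, sigmoid(w @ x + b))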