Deep Learning QB-1
[ QB, UT-1 ]
11. Numerical to find the output of a single neuron (refer to class notes for a worked example; an illustrative sketch follows below).
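A minimal NumPy sketch of the computation (the inputs, weights, and bias below are invented for illustration and are not the class-notes values):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, 0.2])   # inputs (hypothetical)
w = np.array([0.4, 0.6])   # weights (hypothetical)
b = 0.1                    # bias (hypothetical)

z = np.dot(w, x) + b       # weighted sum: 0.4*0.5 + 0.6*0.2 + 0.1 = 0.42
y = sigmoid(z)             # neuron output: sigmoid(0.42) ≈ 0.60

print(z, y)

Whatever the numbers, the pattern is the same: compute the weighted sum of the inputs plus the bias, then apply the activation function.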
12. Numerical to find the output of a neural network (refer to class notes for a worked example; an illustrative sketch follows below).
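Extending the single-neuron sketch to a small 2-2-1 network, layer by layer (again, all numbers are invented for illustration):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x  = np.array([0.5, 0.2])          # input vector (hypothetical)
W1 = np.array([[0.4, 0.6],
               [0.3, 0.9]])        # hidden-layer weights (hypothetical)
b1 = np.array([0.1, 0.2])         # hidden-layer biases (hypothetical)
W2 = np.array([0.7, 0.5])          # output-layer weights (hypothetical)
b2 = 0.3                           # output-layer bias (hypothetical)

h = sigmoid(W1 @ x + b1)           # hidden activations: sigmoid([0.42, 0.53])
y = sigmoid(np.dot(W2, h) + b2)    # network output: sigmoid(≈ 1.04) ≈ 0.74

print(h, y)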
13. Different types of activation functions?
Ans - Activation Functions in Neural Networks
Activation functions introduce non-linearity into neural networks,
allowing them to learn complex patterns. Here are some common types:
Sigmoid: Maps any input to a value between 0 and 1; often used in output layers for binary classification.
ReLU (Rectified Linear Unit): Returns the input if it is positive, otherwise returns 0. It is popular for its computational efficiency and because it mitigates the vanishing gradient problem.
Tanh: Maps inputs to values between -1 and 1; similar in shape to sigmoid but zero-centered, which often helps training converge faster.
Leaky ReLU: A variant of ReLU that allows a small, non-zero slope for negative inputs, helping to avoid the "dying ReLU" problem.
Softmax: Typically used in the output layer for multi-class classification; it converts raw scores into probabilities that sum to 1.
The choice of activation function depends on the specific task and
network architecture.
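A compact NumPy sketch of the functions listed above (the function names and the test vector are mine, for illustration):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))       # squashes to (0, 1)

def relu(z):
    return np.maximum(0.0, z)             # 0 for negatives, identity for positives

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)  # small slope alpha for negatives

def softmax(z):
    e = np.exp(z - np.max(z))             # subtract max for numerical stability
    return e / e.sum()                    # outputs are positive and sum to 1

z = np.array([-2.0, 0.0, 3.0])
print(sigmoid(z), np.tanh(z), relu(z), leaky_relu(z), softmax(z))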