Neural Network Quiz Questions
D. Dropout
A. Hidden Layers
B. Activation Functions
A. Hyperbolic Tangent
B. Sigmoid
C. SoftMax
D. RELU
6. Which of the following is FALSE about the sigmoid and tanh activation functions?
B. Dropout is a hyper-parameter
D. Dropout should be implemented only during training phase, not in testing phase
D. Increasing number of hidden layers above a certain point may lead to overfit
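The dropout statements above (it is a hyper-parameter, and it is applied only during training, not testing) can be checked with a minimal inverted-dropout sketch in NumPy; the rate of 0.5 and the input shape are arbitrary assumptions for illustration:

```python
import numpy as np

def dropout(x, rate=0.5, training=True):
    """Inverted dropout: zero out units at rate `rate` during training,
    scaling survivors by 1/(1-rate) so the test-time pass is a no-op."""
    if not training:
        return x  # dropout is disabled at test time
    mask = (np.random.rand(*x.shape) >= rate) / (1.0 - rate)
    return x * mask

x = np.ones((4, 3))
train_out = dropout(x, rate=0.5, training=True)   # entries are 0.0 or 2.0
test_out = dropout(x, training=False)             # identical to the input
```

Because the surviving activations are rescaled at training time, no extra scaling is needed at inference, which is why dropout is simply switched off in the testing phase.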
B. RELU
C. Swish
D. Sigmoid
A. Convolutional Layer
B. Pooling Layer
C. Code Layer
A. Normalization
B. Whitening
C. PCA
D. Regularization
A. input
B. activation value
C. weight
D. bias
A. Xavier
B. SoftMax
C. PCA
D. Whitening
15. Which of the following is FALSE about ML (Machine Learning) and DL (Deep Learning) algorithms?
B. It helps in retaining the most useful information and throwing away useless information
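Statements like the one above ("retaining the most useful information and throwing away useless information") typically describe max pooling; a minimal 2x2 max-pooling sketch in NumPy (the feature map values are made up for illustration):

```python
import numpy as np

def max_pool2d(x, k=2):
    """Non-overlapping k x k max pooling over a 2-D feature map."""
    h, w = x.shape
    x = x[:h - h % k, :w - w % k]  # trim so dimensions divide evenly by k
    return x.reshape(h // k, k, w // k, k).max(axis=(1, 3))

fmap = np.array([[1, 3, 2, 0],
                 [4, 2, 1, 1],
                 [0, 1, 5, 6],
                 [2, 2, 7, 8]], dtype=float)
pooled = max_pool2d(fmap)  # keeps only the strongest response per 2x2 window
```

Each 2x2 window collapses to its maximum, halving each spatial dimension while preserving the strongest activations.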
18. Which of the following is TRUE about SoftMax and Sigmoid function?
A. SoftMax is usually used for hidden layers and sigmoid for outer layers
B. Sigmoid is usually used for hidden layers and SoftMax for outer layers
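The distinction in question 18 can be verified numerically: SoftMax produces a probability distribution over the output layer, while sigmoid squashes each unit independently into (0, 1). A minimal NumPy sketch (the example logits are arbitrary):

```python
import numpy as np

def softmax(z):
    """Convert logits to a probability distribution (sums to 1)."""
    e = np.exp(z - np.max(z))  # subtract max for numerical stability
    return e / e.sum()

def sigmoid(z):
    """Squash each value independently into the open interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)   # entries sum to 1 -> suited to a multi-class output layer
s = sigmoid(logits)   # independent values in (0, 1); they need not sum to 1
```

Only the SoftMax outputs form a valid distribution across classes, which is why it is the usual choice for the outer layer of a multi-class classifier.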
A. Filters (Kernels)
B. Forget Gates
C. Hyperparameter Optimization
D. Strides
22. Which of the following is FALSE about local and global minima?
B. Using random weights and adding noise sometimes help in getting global minima
B. Vanishing and exploding gradient issues may arise due to wrong weight initialization
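The claim above, that vanishing and exploding gradients may arise from wrong weight initialization, motivates schemes such as Xavier (Glorot) initialization, which appears as an option earlier in this quiz. A sketch of the uniform variant, assuming NumPy and arbitrary layer sizes:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=np.random.default_rng(0)):
    """Xavier/Glorot uniform init: U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out)), keeping activation and
    gradient variance roughly constant across layers."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

W = xavier_uniform(256, 128)  # weight matrix for a 256 -> 128 layer
```

The resulting weights have variance close to 2 / (fan_in + fan_out), which helps keep gradients from shrinking or blowing up as they propagate through many layers.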
24. Which of the following functions can be used as an activation function in the output layer if we wish to predict the probabilities of n classes such that the sum of the probabilities over all classes equals 1?
A. SoftMax
B. RELU
C. Sigmoid
D. Tanh
A. 0 to 1
B. -1 to 1
C. -1 to 0
D. 0 to 9
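The ranges listed above match the output ranges of common activation functions. A quick NumPy check of two standard cases, sigmoid with range (0, 1) and tanh with range (-1, 1):

```python
import numpy as np

z = np.linspace(-10, 10, 1001)       # a wide sweep of input values
sig = 1.0 / (1.0 + np.exp(-z))       # sigmoid: every output lies in (0, 1)
th = np.tanh(z)                      # tanh: every output lies in (-1, 1)
```

Sigmoid never quite reaches 0 or 1, and tanh never quite reaches -1 or 1; tanh is zero-centred (tanh(0) = 0) while sigmoid is centred at 0.5.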