Neural Network and Deep Learning Quiz Questions

1. Which of the following is NOT a hyperparameter in a neural network?

A. Number of hidden layers

B. Number of neurons in a layer

C. Weights and bias

D. Dropout

2. Which of the following components make a neural network non-linear in nature?

A. Hidden Layers

B. Activation Functions

C. Weights and Bias

D. Regularization and Dropout
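
To see why, here is a minimal NumPy sketch: without a non-linear activation function between them, two stacked linear layers collapse into a single linear map.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3))                    # a small batch of inputs
    W1 = rng.normal(size=(3, 5))
    W2 = rng.normal(size=(5, 2))

    # Without an activation, two linear layers collapse into one:
    print(np.allclose((x @ W1) @ W2, x @ (W1 @ W2)))   # True -> still linear

    # A non-linear activation (RELU) between the layers breaks this:
    print(np.allclose(np.maximum(x @ W1, 0) @ W2, x @ (W1 @ W2)))   # False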

3. Which of the following is FALSE about Deep Neural Networks?

A. These are computationally more complex as compared to shallow networks

B. These have less generalization capability as compared to shallow networks

C. These may suffer from overfitting problem

D. These may suffer from vanishing gradients problem

4. Output of which of the following activation functions is zero centered?

A. Hyperbolic Tangent

B. Sigmoid

C. SoftMax

D. RELU
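
A quick numerical check: tanh output is symmetric around zero, while sigmoid output is always positive.

    import numpy as np

    z = np.linspace(-5, 5, 1001)          # symmetric range of pre-activations
    print(np.tanh(z).mean())              # ~0.0 -> tanh output is zero centered
    print((1 / (1 + np.exp(-z))).mean())  # ~0.5 -> sigmoid output is not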

5. Which of the following is FALSE about Weights and Bias?

A. Biases are typically initialized to 0 (or close to 0)

B. Weights are typically initialized using Xavier technique

C. Both weight and bias are hyperparameters

D. Weights are updated by backpropagation

6. Which of the following is FALSE about sigmoid and tanh activation function?

A. Both are non-linear activation functions

B. Output of sigmoid ranges from -1 to 1 while output of tanh ranges from 0 to 1

C. Output of both sigmoid and tanh is smooth, continuous and differentiable

D. Both have the killing (vanishing) gradient problem
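
The "killing" (vanishing) gradient in option D comes from saturation: a quick check shows both derivatives shrink toward zero for large inputs, with sigmoid's peaking at only 0.25.

    import numpy as np

    z = np.array([0.0, 2.5, 5.0, 10.0])
    s = 1 / (1 + np.exp(-z))
    print(s * (1 - s))            # sigmoid'(z): 0.25 at z=0, ~0 for large |z|
    print(1 - np.tanh(z) ** 2)    # tanh'(z): 1 at z=0, ~0 for large |z|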

7. Which of the following is TRUE about SoftMax and Sigmoid function?

A. SoftMax is usually used for hidden layers and sigmoid for output layers

B. Sigmoid is usually used for hidden layers and SoftMax for output layers

C. SoftMax function is usually used for binary classification problem

D. They are both activation functions
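
A minimal sketch of the two functions: SoftMax maps a vector of scores to class probabilities that sum to 1, while sigmoid squashes a single score into (0, 1).

    import numpy as np

    def sigmoid(z):
        # Squashes a single score into (0, 1); common for binary outputs.
        return 1 / (1 + np.exp(-z))

    def softmax(z):
        # Turns a score vector into a probability distribution over classes.
        e = np.exp(z - z.max())          # subtract max for numerical stability
        return e / e.sum()

    scores = np.array([2.0, 1.0, 0.1])
    print(softmax(scores))               # three probabilities summing to 1
    print(sigmoid(2.0))                  # a single probability in (0, 1)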

8. Which of the following is FALSE about Dropout?

A. Dropout randomly switches off some neurons in the network

B. Dropout is a hyper-parameter

C. Dropout can be used in input, hidden and output layers

D. Dropout should be implemented only during the training phase, not in the testing phase
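
A minimal sketch of "inverted" dropout, which is active only during training and is a no-op at test time:

    import numpy as np

    def dropout(a, rate=0.5, training=True):
        # Inverted dropout: randomly zero activations during training and
        # rescale the survivors, so no change is needed at test time.
        if not training:
            return a                      # test phase: dropout is a no-op
        mask = np.random.rand(*a.shape) >= rate
        return a * mask / (1.0 - rate)

    activations = np.ones((2, 4))
    print(dropout(activations, rate=0.5, training=True))   # some zeros
    print(dropout(activations, training=False))            # unchanged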

9. Which of the following is FALSE about Hidden Layers in Neural Networks?

A. Abstract representation of the training data is stored in hidden layers

B. Feature extraction happens at hidden layers

C. Increasing the number of hidden layers always leads to higher accuracy

D. Increasing the number of hidden layers beyond a certain point may lead to overfitting

10. Which of the following is NOT an activation function?

A. SoftMax

B. RELU

C. Swish

D. Sigmoid

11. Which of the following layers is NOT part of a CNN?

A. Convolutional Layer

B. Pooling Layer

C. Code Layer

D. Fully connected Layer

12. Which of the following is NOT a data preprocessing technique in a neural network?

A. Normalization

B. Whitening

C. PCA

D. Regularization
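
Normalization (and PCA/whitening) transform the input data before training, while regularization constrains the model itself. A minimal standardization sketch:

    import numpy as np

    X = np.random.randn(100, 3) * np.array([1.0, 10.0, 100.0])
    X_norm = (X - X.mean(axis=0)) / X.std(axis=0)   # zero mean, unit variance
    print(X_norm.mean(axis=0).round(6))  # ~[0, 0, 0]
    print(X_norm.std(axis=0))            # ~[1, 1, 1]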

13. Gradient Descent computes the derivative of the loss function to find the:

A. input

B. activation value

C. weight

D. bias
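
In other words, the loss is differentiated with respect to the parameters (weights and biases), which are then updated. A minimal sketch for a single weight with squared loss:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])
    y = 2.0 * x                          # true relationship: y = 2x
    w, lr = 0.0, 0.1                     # initial weight and learning rate

    for _ in range(50):
        grad = np.mean(2 * (w * x - y) * x)   # dLoss/dw for squared loss
        w -= lr * grad                        # gradient descent update
    print(w)                                  # converges toward 2.0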

14. Which of the following is a weight initialization technique?

A. Xavier

B. SoftMax

C. PCA

D. Whitening
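
A minimal sketch of Xavier (Glorot) initialization, which scales the weight variance by 2 / (fan_in + fan_out):

    import numpy as np

    def xavier_init(fan_in, fan_out):
        # Xavier/Glorot initialization: variance scaled by layer fan-in/out
        # to keep activations and gradients from shrinking or exploding.
        std = np.sqrt(2.0 / (fan_in + fan_out))
        return np.random.normal(0.0, std, size=(fan_in, fan_out))

    W = xavier_init(256, 128)
    print(W.std())                       # close to sqrt(2 / (256 + 128))
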
15. Which of the following is FALSE about ML (Machine Learning) and DL (Deep Learning) algorithms?

A. Deep Learning algorithms work efficiently on high amount of data

B. Feature Extraction needs to be done manually in both ML and DL algorithms

C. Deep Learning algorithms are best suited for unstructured data

D. Deep Learning algorithms require high computational power

16. Which of the following is FALSE about Pooling Layer in CNN?

A. We can use Max, Min, Average or Sum pooling in CNN

B. It helps in retaining the most useful information and throwing away useless information

C. It reduces resolution and dimension and hence reduces computational complexity

D. Backpropagation cannot be applied when using pooling layers
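
A minimal sketch of non-overlapping 2x2 max pooling (backpropagation does work through pooling: the gradient simply routes to the selected elements, so option D is the false one):

    import numpy as np

    def max_pool_2x2(x):
        # Non-overlapping 2x2 max pooling: halves each spatial dimension
        # while keeping the strongest response in each window.
        h, w = x.shape
        return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

    feature_map = np.arange(16.0).reshape(4, 4)
    print(max_pool_2x2(feature_map))     # 2x2 output of window maxima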

17. What is the purpose of an activation function?

A. To decide whether a neuron will fire or not

B. To increase the depth of a neural network

C. To create connectivity among hidden layers

D. To normalize the inputs

18. Which of the following is TRUE about SoftMax and Sigmoid function?

A. SoftMax is usually used for hidden layers and sigmoid for output layers

B. Sigmoid is usually used for hidden layers and SoftMax for output layers

C. SoftMax function is usually used for binary classification problem

D. They are both activation functions

19. Which of the following is FALSE about Batch Normalization?

A. It may lead to overfitting

B. It solves the dying RELU problem

C. It helps in faster convergence

D. It reduces internal covariate shift
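
A minimal sketch of the batch normalization transform over a batch, with the learnable scale gamma and shift beta left at their initial values of 1 and 0:

    import numpy as np

    def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
        # Normalize each feature over the batch, then scale and shift.
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + eps)
        return gamma * x_hat + beta

    batch = np.random.randn(32, 4) * 10 + 5   # shifted, scaled activations
    out = batch_norm(batch)
    print(out.mean(axis=0))              # ~0 per feature
    print(out.std(axis=0))               # ~1 per feature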

20. Which of the following terms is NOT associated with CNN?

A. Filters (Kernels)

B. Forget Gates

C. Zero and Valid Padding

D. Strides

21. Which of the following is NOT part of training a neural network?

A. Regularization (Dropout etc.)

B. Prediction on new, unseen data

C. Hyperparameter Optimization

D. Transfer learning / fine-tuning

22. Which of the following is FALSE about local and global minima?

A. Using a constant learning rate is a good way to avoid local minima

B. Using random weights and adding noise sometimes help in getting global minima

C. We can avoid local minima by proper tuning of hyper-parameters

D. We will not get optimal weights if SGD is stuck in a local minimum
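
A small illustration of the noise idea from option B: plain gradient descent stays in whichever basin it starts in, while noisy updates can sometimes hop into a better one.

    import numpy as np

    def grad(x):
        # derivative of f(x) = x**4 - 3*x**2 + x, which has a local
        # minimum near x = 1.13 and a global minimum near x = -1.30
        return 4 * x**3 - 6 * x + 1

    rng = np.random.default_rng(1)
    plain, noisy = 1.0, 1.0              # both start in the local basin
    for _ in range(2000):
        plain -= 0.01 * grad(plain)                        # deterministic
        noisy -= 0.01 * grad(noisy) + 0.2 * rng.normal()   # noisy updates
    print(plain)                         # ~1.13: stuck in the local minimum
    print(noisy)                         # frequently near the global minimum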

23. Which of the following is FALSE about Weight Initialization?

A. Training period may increase due to wrong weight initialization

B. Vanishing and exploding gradient issues may arise due to wrong weight initialization

C. Initially, we should set all weights to zero while training

D. Model may never converge due to wrong weight initialization
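
On option C (the false statement): all-zero weights make every unit in a layer compute the same value and receive the same gradient, so the units never differentiate. A minimal sketch of this symmetry:

    import numpy as np

    x = np.array([[1.0, 2.0, 3.0]])      # one training example
    W = np.zeros((3, 4))                 # all-zero initialization
    h = x @ W
    print(h)                             # identical outputs for all units

    # Each column of W also receives an identical gradient, e.g. for a
    # simple squared loss against a target of ones:
    grad = x.T @ (h - np.ones((1, 4)))
    print(grad)                          # identical columns -> no divergence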

24. Which of the following functions can be used as an activation function in the output layer if we wish to predict the probabilities of n classes such that they sum to 1?

A. SoftMax

B. RELU

C. Sigmoid

D. Tanh

25. Output of sigmoid activation function ranges from:

A. 0 to 1

B. -1 to 1

C. -1 to 0

D. 0 to ∞
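
A quick check: sigmoid(z) = 1 / (1 + e^(-z)) tends to 0 for very negative z and to 1 for very positive z, so its output lies in (0, 1).

    import numpy as np

    z = np.array([-50.0, -1.0, 0.0, 1.0, 50.0])
    print(1 / (1 + np.exp(-z)))          # values range from ~0 to ~1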
