NPTEL Online Certification Courses
Indian Institute of Technology Kharagpur
Assignment - Week 6 (Neural Networks)
Type of Question: MCQ/MSQ
1. In training a neural network, we notice that the loss does not decrease during the first few
epochs. What could be the reason for this?
Answer: D
The loss can fail to decrease for any one of the reasons listed in the options (for example, the learning rate is too low, the regularization parameter is too high, or optimization is stuck at a local minimum).
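As a minimal sketch of one such cause (the too-small learning rate; the quadratic loss and values here are illustrative, not from the question):

```python
# Sketch: with too small a learning rate, the loss barely moves in the
# first few epochs even though the setup is otherwise correct.
def loss(w):
    return (w - 3.0) ** 2  # toy quadratic "loss" with minimum at w = 3

w = 0.0
lr = 1e-6                  # far too small a learning rate
start = loss(w)
for _ in range(10):        # the "first few epochs"
    w -= lr * 2 * (w - 3.0)

print(start, loss(w))      # loss is essentially unchanged
```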
A) I, II, III, IV
B) IV, III, II, I
C) III, I, II, IV
D) I, IV, III, II
Answer: D
D is the correct sequence.
3. Suppose you have inputs x, y, and z with values -2, 5, and -4 respectively. You have a
neuron ‘q’ and a neuron ‘f’ with functions:
q = x + y
f = q * z
What is the gradient of f with respect to x, y, and z?
A) (-3, 4, 4)
B) (4, 4, 3)
C) (-4, -4, 3)
D) (3, -4, -4)
Answer: C
To calculate the gradient, we need ∂f/∂x, ∂f/∂y, and ∂f/∂z. Forward pass: q = x + y = 3 and f = q · z = -12. By the chain rule, ∂f/∂x = (∂f/∂q)(∂q/∂x) = z · 1 = -4, ∂f/∂y = z · 1 = -4, and ∂f/∂z = q = 3, giving (-4, -4, 3).
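The forward and backward passes above can be sketched directly in code:

```python
# Tiny computational graph from the question: q = x + y, f = q * z.
x, y, z = -2.0, 5.0, -4.0

# Forward pass
q = x + y            # 3
f = q * z            # -12

# Backward pass (chain rule)
df_dz = q            # ∂f/∂z = q = 3
df_dq = z            # ∂f/∂q = z = -4
df_dx = df_dq * 1.0  # ∂q/∂x = 1, so ∂f/∂x = -4
df_dy = df_dq * 1.0  # ∂q/∂y = 1, so ∂f/∂y = -4

print((df_dx, df_dy, df_dz))  # (-4.0, -4.0, 3.0)
```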
4. A neural network can be viewed as multiple simple equations stacked together. Suppose
we want to replicate the function for the decision boundary shown below.
Answer: C
As the figure shows, combining the two hidden units h1 and h2 in an intelligent way produces a complex decision boundary from simple building blocks.
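The idea can be illustrated with a hypothetical pair of threshold units (the actual h1 and h2 from the figure are not reproduced here): two simple linear units, combined by a third, draw the XOR boundary, which no single linear unit can.

```python
# Hypothetical h1/h2: two threshold units plus a combining unit reproduce
# the XOR decision boundary from simple stacked equations.
def step(v):
    return 1 if v > 0 else 0

def h1(x1, x2):
    return step(x1 + x2 - 0.5)   # fires if at least one input is on

def h2(x1, x2):
    return step(x1 + x2 - 1.5)   # fires only if both inputs are on

def net(x1, x2):
    # "at least one, but not both" = XOR
    return step(h1(x1, x2) - h2(x1, x2) - 0.5)

print([net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```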
5. Which of the following is true about model capacity (where model capacity means the
ability of neural network to approximate complex functions)?
Answer: A
Model capacity grows with network size: larger networks can approximate more complex functions.
6. First-order gradient descent would not work correctly (i.e., may get stuck) in which of the
following graphs?
A)
B)
C)
D) None of These.
Answer: B
This is the classic saddle-point problem of gradient descent: at a saddle point the gradient is zero, so a first-order method can stall there even though the point is not a minimum.
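A small numerical sketch of the problem (the function f(x, y) = x² − y² and starting point are illustrative choices):

```python
# Gradient descent on f(x, y) = x**2 - y**2, which has a saddle at (0, 0).
# Starting exactly on the y = 0 axis, the y-gradient is always zero, so
# plain first-order gradient descent converges to the saddle and stops.
def grad(x, y):
    return 2 * x, -2 * y

x, y = 1.0, 0.0   # start on the axis running through the saddle
lr = 0.1
for _ in range(200):
    gx, gy = grad(x, y)
    x -= lr * gx
    y -= lr * gy

print(x, y)       # ends at (0, 0): stuck at the saddle point
```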
Answer: A
Single-layer neural networks are best at simple pattern recognition, but they cannot compute the parity of an image or determine whether two shapes are connected.
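This limitation can be demonstrated on the smallest parity problem, XOR (a sketch; the training loop and epoch count are illustrative): a single linear threshold unit can never classify all four XOR patterns correctly, no matter how long it trains.

```python
# XOR is the 2-bit parity function. A single-layer perceptron (one linear
# threshold unit) cannot represent it: no straight line separates
# {(0,0), (1,1)} from {(0,1), (1,0)}, so training never reaches 4/4.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w = [0.0, 0.0]
b = 0.0
for _ in range(1000):  # perceptron learning rule
    for (x1, x2), t in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        w[0] += (t - out) * x1
        w[1] += (t - out) * x2
        b += (t - out)

correct = sum(
    (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t
    for (x1, x2), t in data
)
print(f"{correct}/4 XOR patterns correct")  # never reaches 4/4
```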
8. The network that involves backward links from the outputs to the inputs and hidden layers is
called:
A) Self-organizing Maps
B) Perceptron
C) Recurrent Neural Networks
D) Multi-Layered Perceptron
Answer: C
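The defining feedback link of a recurrent network can be sketched in a few lines (the scalar weights `w_in` and `w_rec` are illustrative assumptions, not values from the question): each step's hidden state is fed back as an input to the next step.

```python
import math

# Minimal sketch of recurrence: the hidden state from step t-1 is fed
# back into step t, i.e. h_t = tanh(w_in * x_t + w_rec * h_{t-1}).
def rnn_steps(xs, w_in=0.5, w_rec=0.8):
    h = 0.0
    hs = []
    for x in xs:
        h = math.tanh(w_in * x + w_rec * h)  # backward link: uses previous h
        hs.append(h)
    return hs

# Even with zero inputs after the first step, the state persists (and
# gradually decays) because of the feedback connection.
print(rnn_steps([1.0, 0.0, 0.0]))
```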
End