Deep Learning - IIT Ropar - Unit 11 - Week 8
2) What are the challenges associated with using the Tanh(x) activation function? 1 point
It is not zero centered
Computationally expensive
Non-differentiable at 0
Saturation
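For intuition on the saturation option above, here is a minimal NumPy sketch (my own illustration, not from the course): tanh flattens out for large |x|, so its derivative 1 − tanh²(x) collapses toward zero and little gradient flows back through the unit.

```python
import numpy as np

# d/dx tanh(x) = 1 - tanh(x)**2, which vanishes as |x| grows (saturation)
x = np.array([0.0, 2.0, 5.0, 10.0])
grad = 1.0 - np.tanh(x) ** 2
print(grad)  # gradient shrinks toward 0 as |x| grows
```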
4) We train a feed-forward neural network and notice that all the weights for a particular neuron are equal. What could be the possible causes of this issue? 1 point

Weights were initialized randomly
Weights were initialized to high values
Weights were initialized to equal values
Weights were initialized to zero

Yes, the answer is correct.
Score: 1
Accepted Answers:
Weights were initialized to equal values
Weights were initialized to zero
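Why equal (or zero) initialization is the culprit: every hidden unit then computes the same activation and receives the same gradient, so gradient descent can never break the symmetry and the weights stay equal forever. A toy NumPy sketch of this (my own illustration, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)               # one input example
y = 1.0                              # scalar target (toy regression)

W = np.full((3, 4), 0.5)             # every weight initialized to the same value
v = np.ones(3)                       # fixed, equal output weights

for _ in range(10):                  # plain gradient-descent steps
    h = np.tanh(W @ x)               # all 3 hidden units compute the same value
    err = v @ h - y
    # Each row of W gets an identical gradient, so the rows move in lockstep.
    grad_W = np.outer(err * v * (1 - h ** 2), x)
    W -= 0.1 * grad_W

# The rows of W are still identical: symmetry was never broken.
print(np.allclose(W[0], W[1]), np.allclose(W[1], W[2]))
```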
5) Which of the following best describes the concept of saturation in deep learning? 1 point

When the activation function output approaches either 0 or 1 and the gradient is close to zero.
When the activation function output is very small and the gradient is close to zero.
When the activation function output is very large and the gradient is close to zero.
None of the above.

Partially Correct.
Score: 0.33
Accepted Answers:
When the activation function output approaches either 0 or 1 and the gradient is close to zero.
When the activation function output is very small and the gradient is close to zero.
When the activation function output is very large and the gradient is close to zero.
6) Which of the following is true about the role of unsupervised pre-training in deep learning? 1 point
Score: 1
Accepted Answers:
It is used to initialize the weights of a deep neural network
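A hedged sketch of what "initialize the weights" can mean in practice (the setup and names below are my own, not from the lecture): pre-train a layer as a tied-weight linear autoencoder on unlabeled data, then reuse the learned weights as the supervised network's first-layer initialization.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))            # unlabeled data

def recon_loss(W):
    """Mean squared reconstruction error of the tied-weight autoencoder."""
    E = X @ W.T @ W - X
    return float((E ** 2).mean())

W = rng.normal(scale=0.1, size=(4, 8))   # encoder weights to pre-train
loss_before = recon_loss(W)

for _ in range(200):                     # plain gradient descent
    H = X @ W.T                          # encode: 8 -> 4
    E = H @ W - X                        # decode (tied weights) minus input
    grad = (H.T @ E + (E @ W.T).T @ X) / len(X)
    W -= 0.01 * grad

# The pre-trained W now initializes the first layer of the supervised model.
W_layer1_init = W.copy()
print(loss_before, recon_loss(W))        # reconstruction error drops
```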
8) What is the main cause of the Dead ReLU problem in deep learning? 1 point
High variance
High negative bias
Overfitting
Underfitting
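Why a high negative bias kills a ReLU unit (a toy NumPy illustration, not from the course): the pre-activation stays negative for every input, so the output and the local gradient are both zero and the unit never updates again.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 4))       # typical inputs
w = rng.normal(size=4)
b = -100.0                           # large negative bias

pre = x @ w + b
out = relu(pre)
# The unit outputs 0 for every input, and the ReLU gradient (1 if pre > 0,
# else 0) is 0 everywhere, so no gradient ever flows back: a "dead" ReLU.
print(out.max())  # 0.0
```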
9) What is the mathematical expression for the ReLU activation function? 1 point
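For reference, the standard expression is f(x) = max(0, x), applied elementwise; a one-line NumPy check (illustrative):

```python
import numpy as np

# ReLU: f(x) = max(0, x), applied elementwise
x = np.array([-2.0, -0.5, 0.0, 1.5])
y = np.maximum(0.0, x)
print(y)
```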
Score: 1
Accepted Answers:
To ensure that the distribution of the inputs at different layers doesn't change