Deep Learning - IIT Ropar - Unit 11 - Week 8

NPTEL (https://swayam.gov.in/explorer?ncCode=NPTEL) » Deep Learning - IIT Ropar (course)

Week 8 : Assignment 8

The due date for submitting this assignment has passed.
Due on 2024-09-18, 23:59 IST.
Assignment submitted on 2024-09-14, 19:14 IST.

1) We have observed that the sigmoid neuron has become saturated. What might be the possible output values at this neuron? (1 point)

0.02
0.5
1
0.97

Yes, the answer is correct.
Score: 1
Accepted Answers:
0.02
1
0.97
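
A minimal NumPy sketch (added for illustration; not part of the original quiz) of why 0.02, 1, and 0.97 are plausible outputs of a saturated sigmoid while 0.5 is not: at large |z| the output is pinned near 0 or 1 and the gradient sigma'(z) = sigma(z)(1 - sigma(z)) collapses toward zero, whereas sigma(0) = 0.5 is exactly where the gradient is largest.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Saturated regime: large-magnitude pre-activations.
for z in [-4.0, 0.0, 3.5, 10.0]:
    s = sigmoid(z)
    grad = s * (1.0 - s)  # derivative of the sigmoid
    print(f"z = {z:5.1f}  sigmoid = {s:.4f}  gradient = {grad:.6f}")

# z = -4.0 gives ~0.02 and z = 3.5 gives ~0.97; both gradients are tiny.
# z = 0.0 gives 0.5 with the maximum gradient 0.25, i.e. NOT saturated.
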
2) What are the challenges associated with using the Tanh(x) activation function? (1 point)

It is not zero centered
Computationally expensive
Non-differentiable at 0
Saturation

Yes, the answer is correct.
Score: 1
Accepted Answers:
Computationally expensive
Saturation
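
For contrast, a similar sketch (again an illustrative addition): tanh is zero-centered, so "not zero centered" is wrong, but it saturates at ±1 just like the sigmoid, and its exponentials make it costlier to evaluate than a piecewise-linear unit such as ReLU, which matches the two accepted answers.

import numpy as np

z = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
t = np.tanh(z)
grad = 1.0 - t**2  # d/dz tanh(z) = 1 - tanh(z)^2

print(t.round(4))     # endpoints pinned near -1 and +1 (saturation)
print(grad.round(6))  # gradient ~0 at the saturated endpoints, 1 at z = 0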

3) What makes batch normalization effective in deep networks? (1 point)

It reduces the covariance shift
It accelerates training
It introduces regularization
It reduces the internal shift in activations

No, the answer is incorrect.
Score: 0
Accepted Answers:
It reduces the covariance shift
It accelerates training
It reduces the internal shift in activations
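
A minimal sketch of the batch-normalization forward pass (my own illustration, with gamma and beta as the usual names for the learnable scale and shift): each feature is standardized using the statistics of the current mini-batch, which is what the accepted answers describe as reducing the covariance shift and the internal shift in activations, and what question 10 below calls keeping the distribution of layer inputs stable.

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then rescale and shift."""
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta              # learnable scale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))  # shifted, spread-out activations
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))

print(y.mean(axis=0).round(4))  # ~0 for every feature
print(y.std(axis=0).round(4))   # ~1 for every feature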

4) We train a feed-forward neural network and notice that all the weights for a particular neuron are equal. What could be the possible causes of this issue? (1 point)

Weights were initialized randomly
Weights were initialized to high values
Weights were initialized to equal values
Weights were initialized to zero

Yes, the answer is correct.
Score: 1
Accepted Answers:
Weights were initialized to equal values
Weights were initialized to zero
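
A quick sketch of the symmetry problem behind the accepted answers (illustrative code, not from the course): when every weight starts equal, with zero as the extreme case, all hidden units compute the same output and therefore receive identical gradients, so no update can ever make them different.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 3))      # a mini-batch of inputs
W = np.full((3, 4), 0.5)         # every weight initialized to the same value

h = np.tanh(x @ W)               # all 4 hidden units produce identical outputs
print(np.allclose(h, h[:, :1]))  # True: perfect symmetry across units

g = x.T @ (1.0 - h**2)           # gradient of sum(h) w.r.t. W (d tanh = 1 - tanh^2)
print(np.allclose(g, g[:, :1]))  # True: every column of W gets the same update

# Random initialization breaks this tie, which is why it is not an accepted answer.
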
5) Which of the following best describes the concept of saturation in deep learning? (1 point)

When the activation function output approaches either 0 or 1 and the gradient is close to zero.
When the activation function output is very small and the gradient is close to zero.
When the activation function output is very large and the gradient is close to zero.
None of the above.

Partially Correct.
Score: 0.33
Accepted Answers:
When the activation function output approaches either 0 or 1 and the gradient is close to zero.
When the activation function output is very small and the gradient is close to zero.
When the activation function output is very large and the gradient is close to zero.

6) Which of the following is true about the role of unsupervised pre-training in deep learning? (1 point)

It is used to replace the need for labeled data
It is used to initialize the weights of a deep neural network
It is used to fine-tune a pre-trained model
It is only useful for small datasets

Yes, the answer is correct.
Score: 1
Accepted Answers:
It is used to initialize the weights of a deep neural network

7) Which of the following is an advantage of unsupervised pre-training in deep learning? (1 point)

It helps in reducing overfitting
Pre-trained models converge faster
It improves the accuracy of the model
It requires fewer computational resources

Yes, the answer is correct.
Score: 1
Accepted Answers:
It helps in reducing overfitting
Pre-trained models converge faster
It improves the accuracy of the model
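
A toy sketch of the idea in questions 6 and 7 (my own illustration, assuming a one-layer linear autoencoder; none of these names come from the course): the unsupervised phase learns to reconstruct unlabeled data, and the learned encoder weights then initialize the first layer of the supervised network instead of a purely random start, which tends to regularize the model and speed up convergence.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 16))               # unlabeled data, no targets needed

# Unsupervised phase: linear autoencoder trained to reconstruct X.
W_enc = rng.normal(scale=0.1, size=(16, 8))  # 16 -> 8 bottleneck
W_dec = rng.normal(scale=0.1, size=(8, 16))
lr = 1e-3
for step in range(500):
    H = X @ W_enc                            # encode
    R = H @ W_dec - X                        # reconstruction error
    g_dec = H.T @ R                          # grad of 0.5 * ||R||^2 w.r.t. W_dec
    g_enc = X.T @ (R @ W_dec.T)              # grad w.r.t. W_enc (through decoder)
    W_enc -= lr * g_enc
    W_dec -= lr * g_dec

print(np.square(X @ W_enc @ W_dec - X).mean())  # error shrinks during pre-training

# Supervised phase: W_enc initializes the first layer of the labeled-data network,
# which is the "initialize the weights" role named in the accepted answer.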

8) What is the main cause of the Dead ReLU problem in deep learning? (1 point)

High variance
High negative bias
Overfitting
Underfitting

No, the answer is incorrect.
Score: 0
Accepted Answers:
High negative bias

9) What is the mathematical expression for the ReLU activation function? (1 point)

f(x) = x if x < 0, 0 otherwise
f(x) = 0 if x > 0, x otherwise
f(x) = max(0,x)
f(x) = min(0,x)

Yes, the answer is correct.
Score: 1
Accepted Answers:
f(x) = max(0,x)
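
A small sketch tying questions 8 and 9 together (illustrative, with made-up numbers): with f(x) = max(0, x), a large negative bias keeps the pre-activation below zero for every input, so the unit outputs 0 everywhere, its gradient is 0 everywhere, and gradient descent can never revive it. That is the Dead ReLU problem.

import numpy as np

def relu(z):
    return np.maximum(0.0, z)      # f(x) = max(0, x), the accepted answer to Q9

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 4))     # plenty of varied inputs
w = rng.normal(scale=0.5, size=4)
b = -20.0                          # high negative bias, the accepted answer to Q8

z = x @ w + b                      # pre-activation is < 0 for every input
a = relu(z)
grad_mask = (z > 0).astype(float)  # ReLU passes gradient only where z > 0

print(a.max())                     # 0.0 -> the unit never fires ("dead")
print(grad_mask.sum())             # 0.0 -> no gradient ever reaches w or b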

10) What is the purpose of Batch Normalization in Deep Learning? (1 point)

To improve the generalization of the model
To reduce overfitting
To reduce bias in the model
To ensure that the distribution of the inputs at different layers doesn't change

Yes, the answer is correct.
Score: 1
Accepted Answers:
To ensure that the distribution of the inputs at different layers doesn't change
