

Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization (Week 1) Quiz

Akshay Daga (APDaga) | January 06, 2020 | Artificial Intelligence, Deep Learning, Machine Learning, Q&A

▸ Practical aspects of deep learning:

Improving Deep Neural Networks Week-1 (MCQ) Q&A

1. If you have 10,000,000 examples, how would you split the train/dev/test set?

 98% train, 1% dev, 1% test

 60% train, 20% dev, 20% test

 33% train, 33% dev, 33% test
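
With 10,000,000 examples, even a 1% slice is still 100,000 examples, which is usually plenty for evaluating a model, so large datasets lean heavily toward the training split. A minimal NumPy sketch of such a split (the function name and the 98/1/1 defaults are illustrative, not from the quiz):

```python
import numpy as np

def train_dev_test_split(X, Y, dev_frac=0.01, test_frac=0.01, seed=0):
    """Shuffle a dataset, then carve out dev and test sets."""
    m = X.shape[0]                                     # number of examples
    idx = np.random.default_rng(seed).permutation(m)   # shuffle so all splits share one distribution
    n_dev, n_test = int(m * dev_frac), int(m * test_frac)
    dev, test = idx[:n_dev], idx[n_dev:n_dev + n_test]
    train = idx[n_dev + n_test:]                       # remaining ~98% is for training
    return (X[train], Y[train]), (X[dev], Y[dev]), (X[test], Y[test])

X, Y = np.zeros((10_000, 5)), np.zeros(10_000)
train, dev, test = train_dev_test_split(X, Y)
print(train[0].shape, dev[0].shape, test[0].shape)  # (9800, 5) (100, 5) (100, 5)
```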

2. The dev and test set should:

 Come from the same distribution

 Come from different distributions

 Be identical to each other (same (x,y) pairs)

 Have the same number of examples

3. If your Neural Network model seems to have high bias, which of the following would be promising things to try? (Check all that apply.)

 Increase the number of units in each hidden layer

 Add regularization

 Get more training data

 Make the Neural Network deeper

 Get more test data

4. If your Neural Network model seems to have high variance, which of the following would be promising things to try?

 Make the Neural Network deeper

 Get more training data

 Add regularization

 Get more test data

 Increase the number of units in each hidden layer
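
Questions 3, 4, and 5 all reduce to comparing the training error and the dev error against a baseline (e.g. human-level performance). A small helper sketching that decision rule; the 0% baseline and the 1% thresholds are illustrative assumptions, not part of the course:

```python
def diagnose(train_err, dev_err, baseline_err=0.0):
    """Read bias/variance from train and dev errors (as fractions, e.g. 0.005 = 0.5%)."""
    avoidable_bias = train_err - baseline_err  # large gap -> underfitting (high bias)
    variance = dev_err - train_err             # large gap -> overfitting (high variance)
    issues = []
    if avoidable_bias > 0.01:                  # illustrative threshold
        issues.append("high bias: bigger/deeper network, train longer")
    if variance > 0.01:
        issues.append("high variance: regularization or more training data")
    return issues or ["looks fine"]

# The kiosk example in question 5: 0.5% train error, 7% dev error -> high variance.
print(diagnose(0.005, 0.07))
```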

5. You are working on an automated check-out kiosk for a supermarket, and are building
a classifier for apples, bananas and oranges. Suppose your classifier obtains a training
set error of 0.5%, and a dev set error of 7%. Which of the following are promising
things to try to improve your classifier? (Check all that apply.)

 Increase the regularization parameter lambda

 Decrease the regularization parameter lambda

 Get more training data

 Use a bigger neural network

6. What is weight decay?

 A technique to avoid vanishing gradient by imposing a ceiling on the values of the weights.

 A regularization technique (such as L2 regularization) that results in gradient descent shrinking the weights on every iteration.

 The process of gradually decreasing the learning rate during training.

 Gradual corruption of the weights in the neural network if it is trained on noisy data.
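
L2 regularization adds (lambda / 2m) * sum(||W||^2) to the cost, so the gradient picks up a (lambda / m) * W term and every gradient-descent step shrinks the weights by the factor (1 - alpha * lambda / m) before applying the data gradient, which is exactly why it is called weight decay. A minimal sketch of one update (the alpha, lambda, and m values are illustrative):

```python
import numpy as np

def l2_update(W, dW_data, alpha=0.1, lam=0.7, m=1000):
    """One gradient-descent step with L2 regularization (weight decay).

    dW_data is the gradient of the unregularized cost; L2 adds (lam / m) * W to it.
    """
    dW = dW_data + (lam / m) * W
    # Equivalent view: W <- W * (1 - alpha * lam / m) - alpha * dW_data
    return W - alpha * dW

W = np.array([[1.0, -2.0], [3.0, 0.5]])
print(l2_update(W, dW_data=np.zeros_like(W)))  # with zero data gradient, W is purely decayed
```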

7. What happens when you increase the regularization hyperparameter lambda?

 Weights are pushed toward becoming smaller (closer to 0)

 Weights are pushed toward becoming bigger (further from 0)

 Doubling lambda should roughly result in doubling the weights

 Gradient descent taking bigger steps with each iteration (proportional to lambda)

8. With the inverted dropout technique, at test time:

 You apply dropout (randomly eliminating units) and do not keep the 1/keep_prob factor in the calculations used in training

 You do not apply dropout (do not randomly eliminate units) and do not keep the 1/keep_prob factor in the calculations used in training

 You do not apply dropout (do not randomly eliminate units), but keep the 1/keep_prob factor in the calculations used in training.

 You apply dropout (randomly eliminating units) but keep the 1/keep_prob factor in the calculations used in training.
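
Inverted dropout does its bookkeeping at training time: surviving activations are divided by keep_prob so their expected value is unchanged, which is what lets the test-time forward pass skip both the random masking and the 1/keep_prob factor. A minimal sketch of one layer (the shapes, keep_prob value, and global generator are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(a, keep_prob=0.8, train=True):
    """Inverted dropout applied to an activation matrix `a`."""
    if train:
        mask = rng.random(a.shape) < keep_prob  # keep each unit with probability keep_prob
        return a * mask / keep_prob             # scale up survivors so E[a] is unchanged
    return a  # test time: no dropout, no 1/keep_prob factor

a = np.ones((3, 4))
print(dropout_layer(a, train=True))   # zeros and 1.25s (1 / 0.8); raising keep_prob
print(dropout_layer(a, train=False))  # weakens the regularization effect (cf. next question)
```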

9. Increasing the parameter keep_prob from (say) 0.5 to 0.6 will likely cause the
following: (Check the two that apply)

 Increasing the regularization effect

 Reducing the regularization effect

 Causing the neural network to end up with a higher training set error

 Causing the neural network to end up with a lower training set error

Check out our free tutorials on IOT (Internet of Things) on the APDaga DumpBox YouTube channel.

10. Which of these techniques are useful for reducing variance (reducing overfitting)?
(Check all that apply.)

 Gradient Checking

 L2 regularization

 Xavier initialization

 Exploding gradient

 Dropout

 Vanishing gradient

 Data augmentation
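
Data augmentation reduces variance by enlarging the training set with label-preserving transforms. A minimal sketch for image data (the horizontal-flip-only choice and the array shapes are illustrative):

```python
import numpy as np

def augment_flips(X, Y):
    """Double an image dataset by appending horizontal flips (labels unchanged)."""
    X_flipped = X[:, :, ::-1, :]  # flip the width axis of (m, H, W, C) images
    return np.concatenate([X, X_flipped]), np.concatenate([Y, Y])

X, Y = np.zeros((10, 28, 28, 3)), np.zeros(10)
X_aug, Y_aug = augment_flips(X, Y)
print(X_aug.shape, Y_aug.shape)  # (20, 28, 28, 3) (20,)
```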

11. Why do we normalize the inputs x?

 Normalization is another word for regularization; it helps to reduce variance

 It makes it easier to visualize the data

 It makes the cost function faster to optimize

 It makes the parameter initialization faster
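
Normalizing the inputs to zero mean and unit variance makes the cost surface rounder, so gradient descent can use a larger learning rate and reach the minimum in fewer steps. A minimal sketch; note the statistics are computed on the training set only and reused for dev/test (the epsilon guard is an added assumption):

```python
import numpy as np

def fit_normalizer(X_train):
    """Estimate per-feature mean and std on the training set only."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0) + 1e-8  # epsilon avoids division by zero
    return mu, sigma

def normalize(X, mu, sigma):
    return (X - mu) / sigma  # apply the same transform to train, dev, and test

X_train = np.random.default_rng(0).normal(5.0, 3.0, size=(1000, 4))
mu, sigma = fit_normalizer(X_train)
print(normalize(X_train, mu, sigma).mean(axis=0).round(3))  # ~0 for every feature
```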

Click here to see solutions for all Machine Learning Coursera Assignments.
&
Click here to see more codes for Raspberry Pi 3 and similar Family.
&
Click here to see more codes for NodeMCU ESP8266 and similar Family.
&
Click here to see more codes for Arduino Mega (ATMega 2560) and similar Family.

Feel free to ask doubts in the comment section. I will try my best to answer them.
If you find this helpful, please like, comment, and share the post.
This is the simplest way to encourage me to keep doing such work.

Thanks & Regards,


- APDaga DumpBox
