
Assignment Based on Deep Learning 1

The document outlines a comprehensive mid-semester assignment focused on deep learning topics, including perceptrons, regression, stochastic gradient descent, regularization, hyperparameter tuning, activation functions, and artificial neural networks (ANN). Each section poses multiple questions requiring detailed explanations and real-life applications. The assignment emphasizes understanding key concepts and their practical implications in the field of deep learning.


MID SEMESTER Assignment Based on Deep Learning

1) What is a Perceptron Training Rule?


2) What are its applications? Explain applications of the Perceptron using five different examples.
3) Explain loss functions for regression and classification in detail.
4) What are some possible problems with regression models?
5) Which one of these cannot be used for a regression problem?
6) How does regression help in data extraction?
7) How do you minimize a loss in a linear regression?
8) What are the 4 conditions for regression?
9) What are the limitations of regression?
10) How do you improve regression performance?
11) What are some real-life examples of regression?
12) What happens if you include too many variables in regression?
13) How do you know if a regression model is accurate?
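Questions 3–13 concern regression. As a hint for question 7, here is a minimal sketch of minimizing the mean squared error of a linear model by gradient descent; the data and learning rate below are illustrative, not part of the assignment:

```python
# Minimal sketch: fitting y = w*x + b by gradient descent on mean squared error.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]          # generated by y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
    n = len(xs)
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))    # converges near w = 2, b = 1
```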
14) Explain Stochastic Gradient Descent in detail.
15) Which problem is solved by stochastic gradient descent?
16) Explain five different applications of stochastic gradient descent using examples.
17) What is the time complexity of stochastic gradient descent?
18) Which models use stochastic gradient descent?
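For questions 14–18, a minimal sketch of the per-example updates that distinguish stochastic gradient descent from full-batch gradient descent; the data and hyperparameters are illustrative:

```python
import random

# Minimal sketch of stochastic gradient descent: update after each single
# example instead of after a pass over the full dataset.
random.seed(0)
data = [(x, 2 * x + 1) for x in [1.0, 2.0, 3.0, 4.0]]

w, b, lr = 0.0, 0.0, 0.02
for epoch in range(3000):
    random.shuffle(data)           # visit examples in random order
    for x, y in data:
        err = w * x + b - y        # gradient uses only this one example
        w -= lr * 2 * err * x
        b -= lr * 2 * err

print(round(w, 1), round(b, 1))    # approaches w = 2, b = 1
```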
19) Explain optimization and regularization in detail. Also explain five applications using examples.
20) What kind of problems does regularization solve?
21) Why do we do regularization when we train an optimization-based model?
22) Why does regularization reduce overfitting?
23) What happens if regularization is too high?
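For questions 20–23, a minimal sketch of how an L2 penalty changes the training objective and shrinks the learned weight; the model is a one-parameter fit with no bias term, and all numbers are illustrative:

```python
# Minimal sketch: L2 regularization adds lam * w^2 to the loss, which adds
# 2 * lam * w to the gradient and pulls the learned weight toward zero.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]               # an exact fit would be w = 2

def fit(lam):
    w, lr = 0.0, 0.01
    for _ in range(5000):
        n = len(xs)
        dw = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        dw += 2 * lam * w          # the regularization term
        w -= lr * dw
    return w

print(round(fit(0.0), 3))          # unregularized: close to 2
print(round(fit(5.0), 3))          # heavily regularized: noticeably smaller
```

A very large `lam` (question 23) would shrink `w` toward zero and underfit.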
24) Explain hyperparameter tuning in detail.
25) Also explain five applications using examples.
26) What is the purpose of hyperparameter tuning?
27) Which method is used for hyperparameter tuning?
28) Does hyperparameter tuning lead to overfitting?
29) What dataset is used to tune hyperparameters?
30) Can hyperparameter tuning decrease accuracy?
31) Is validation set used for hyperparameter tuning?
32) Which algorithm uses learning rate as hyperparameter?
33) Are all hyperparameters equally important?
34) How do I choose a good hyperparameter?
35) Why is hyperparameter tuning difficult? Can hyperparameter tuning help with underfitting?
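For questions 24–35, a minimal sketch of grid search, one common tuning method: each candidate value is scored on a held-out validation set (question 31) and the best is kept. The "model" is a toy one-parameter fit; the candidates and data are illustrative:

```python
# Minimal grid-search sketch: tune the learning rate of a toy model by
# picking the candidate with the lowest validation loss.
train = [(x, 2 * x) for x in [1.0, 2.0, 3.0]]
valid = [(4.0, 8.0), (5.0, 10.0)]

def train_model(lr, steps=200):
    w = 0.0
    for _ in range(steps):
        for x, y in train:
            w -= lr * 2 * (w * x - y) * x
    return w

def val_loss(w):
    return sum((w * x - y) ** 2 for x, y in valid) / len(valid)

candidates = [0.0001, 0.001, 0.01]
best_lr = min(candidates, key=lambda lr: val_loss(train_model(lr)))
print(best_lr)
```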
36) Explain the sigmoid neuron and fully connected networks in detail.
i) Discuss at least five different applications of them using examples.
ii) Why might it be preferable to use sigmoid neurons in a neural network rather than perceptrons?
iii) Why is the sigmoid function not used in neural networks?
iv) Why is the sigmoid function important in artificial neurons, and how does this relate to a biological neuron?
v) What is the maximum output value of the sigmoid function?
vi) What is the main problem with the sigmoid activation function?
vii) Why is it called sigmoid?
viii) Why is the sigmoid function nonlinear?
ix) What type of function is sigmoid?
x) Can sigmoid be used in the output layer? If yes, how?
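For question 36, a minimal sketch of a sigmoid neuron: a weighted sum of inputs followed by the logistic function 1 / (1 + e^(-z)). The weights and inputs below are illustrative:

```python
import math

# Minimal sketch of a sigmoid neuron.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

print(sigmoid(0.0))                # 0.5, the midpoint of the curve
print(round(neuron([1.0, 2.0], [0.4, -0.1], 0.0), 3))
```

Unlike a perceptron's hard step, the output varies smoothly between 0 and 1 (its maximum, approached but never reached), which is what makes gradient-based training possible.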
37. Explain Gradient Descent and Backpropagation in detail.
a) Discuss at least five different applications of them using examples.
b) Is gradient descent used in backpropagation learning?
c) Is backpropagation learning based on the gradient descent error surface?
d) Is backpropagation an efficient method to do gradient descent?
e) Does bias change in backpropagation?
f) What are the limitations of backpropagation?
g) Which rule is used in backpropagation algorithm?
h) Why is momentum used with backpropagation?
i) Why is it called backpropagation?
j) What are alternatives to backpropagation?
k) What is Delta in backpropagation?
l) What's the difference between feedforward and backpropagation?
m) Explain overfitting and underfitting in detail.
n) Discuss at least five different examples of these concepts.
o) How do you identify overfitting and underfitting in machine learning?
p) Can we have overfitting and underfitting at the same time?
q) What are the effects of underfitting and overfitting on the performance of a machine learning
model?
r) Is overfitting high bias or variance?
s) Why is overfitting more likely to occur on smaller datasets?
t) Which technique reduces overfitting?
u) Which model is mostly prone to overfitting?
v) Is bias same as overfitting?
w) Can bagging eliminate overfitting?
x) Does cross-validation prevent overfitting?
y) Explain Feature Selection in detail.
z) Discuss at least five different applications of it using examples.
aa) How do you perform feature selection in machine learning?
bb) Which feature selection method is best in machine learning?
cc) Which algorithm is used for feature selection?
dd) What is the goal of feature selection?
ee) Does feature selection reduce accuracy?
ff) Is feature selection always necessary?
gg) Does feature selection reduce dimensionality?
hh) Can clustering be used for feature selection?
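For the backpropagation items in question 37, a minimal sketch on the smallest possible network: one input, one sigmoid hidden unit, one sigmoid output. Gradients flow backwards through the chain rule, which is why the method is named as it is. All numbers are illustrative:

```python
import math

# Minimal backpropagation sketch on a 1-1-1 sigmoid network,
# trained to push the output toward a target of 0.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 1.0, 0.0
w1, w2, lr = 0.5, 0.5, 0.5

for _ in range(200):
    # forward pass
    h = sigmoid(w1 * x)
    out = sigmoid(w2 * h)
    # backward pass: chain rule on loss = (out - target)^2
    d_out = 2 * (out - target) * out * (1 - out)   # dL/d(output pre-activation)
    d_h = d_out * w2 * h * (1 - h)                 # propagated back to the hidden unit
    w2 -= lr * d_out * h
    w1 -= lr * d_h * x

print(round(sigmoid(w2 * sigmoid(w1 * x)), 2))     # output driven toward the target
```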
38. Explain ANN in detail.
39. Discuss at least five different applications of it using examples.
40. Explain activation functions in detail. Discuss at least five different applications of them using examples.
41. Explain the different types of activation functions.
42. Which activation function is used for multiclass classification?
43. Which activation function is used for binary classification?
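As a hint for questions 42 and 43, a minimal sketch contrasting the two standard output activations: softmax turns a vector of scores into a multiclass probability distribution, while sigmoid maps a single score to a binary-class probability. The scores below are illustrative:

```python
import math

# Minimal sketch: softmax for multiclass outputs, sigmoid for binary outputs.
def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])   # sums to 1 across the three classes
print(round(sigmoid(2.0), 3))         # single probability for the positive class
```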
44. What are the applications of NLP?
45. Explain batch and mini-batch processing in detail.
46. Discuss at least five different applications of them using examples.
47. What is mini-batch processing?
48. What is the difference between a batch and a mini-batch?
49. Why are mini-batches used?
50. What is the difference between batch gradient descent and mini-batch gradient descent?
51. Why is the best mini-batch size usually not 1 and not m?
52. Why do we use multiples of 2 for the mini-batch size?
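For questions 45–52, a minimal sketch contrasting batch and mini-batch gradient descent: the batch version uses all n examples per update, the mini-batch version uses a small slice and makes several updates per pass. The data and the batch size of 2 are illustrative:

```python
# Minimal sketch: batch vs mini-batch gradient descent on a toy fit of y = 2x.
data = [(x, 2 * x) for x in [1.0, 2.0, 3.0, 4.0]]

def gradient(w, batch):
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

# batch gradient descent: one update per pass over all the data
w_batch, lr = 0.0, 0.02
for _ in range(500):
    w_batch -= lr * gradient(w_batch, data)

# mini-batch gradient descent: two updates per pass, batch size 2
w_mini = 0.0
for _ in range(500):
    for i in range(0, len(data), 2):
        w_mini -= lr * gradient(w_mini, data[i:i + 2])

print(round(w_batch, 2), round(w_mini, 2))  # both approach w = 2
```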
53. Explain Cross-Validation in detail.
54. Discuss at least five different applications of it using examples.
55. Explain the different types of cross-validation.
56. What does it mean to apply 5-fold cross-validation?
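For question 56, a minimal sketch of 5-fold cross-validation: split the data into 5 folds, train on 4 and evaluate on the held-out fold, then average the 5 scores. The "model" here is a toy mean predictor and the data is illustrative:

```python
# Minimal sketch of 5-fold cross-validation with a mean-predictor "model".
data = list(range(10))   # 10 examples, so each of the 5 folds holds 2

def evaluate(train, test):
    mean = sum(train) / len(train)                  # "train" the mean predictor
    return sum((x - mean) ** 2 for x in test) / len(test)

k = 5
fold_size = len(data) // k
scores = []
for i in range(k):
    test = data[i * fold_size:(i + 1) * fold_size]  # held-out fold
    train = data[:i * fold_size] + data[(i + 1) * fold_size:]
    scores.append(evaluate(train, test))

print(len(scores), round(sum(scores) / k, 2))       # 5 fold scores, averaged
```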
57. Explain Regularization (L2 penalty, dropout, ensembles, data augmentation techniques) in detail.
58. What is the purpose of regularization?
59. Discuss at least five different applications of it using examples.
60. Differentiate between L1 and L2.
61. What is L2 loss?
62. Why is L2 better than L1?
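For questions 60–62, a minimal sketch of the L1 vs L2 difference on a one-parameter problem: minimize (w - a)² plus a penalty. The L2 penalty shrinks w proportionally, while the L1 penalty subtracts a constant and can set w exactly to zero, which is why L1 produces sparse models. The values of a and lam are illustrative:

```python
# Minimal sketch: closed-form minimizers of (w - a)^2 plus an L2 or L1 penalty.
def l2_minimizer(a, lam):
    # argmin (w - a)^2 + lam * w^2  =>  w = a / (1 + lam): proportional shrinkage
    return a / (1 + lam)

def l1_minimizer(a, lam):
    # argmin (w - a)^2 + lam * |w|  =>  soft-thresholding of a by lam / 2
    if a > lam / 2:
        return a - lam / 2
    if a < -lam / 2:
        return a + lam / 2
    return 0.0

print(round(l2_minimizer(0.3, 1.0), 2))   # shrunk toward zero but nonzero
print(l1_minimizer(0.3, 1.0))             # exactly zero: L1 induces sparsity
```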
63. Explain Gradient Descent and Backpropagation in detail.
64. Discuss at least five different applications of them using examples.
65. Is gradient descent used in backpropagation learning?
66. Is backpropagation learning based on the gradient descent error surface?
67. Is backpropagation an efficient method to do gradient descent?
68. Does bias change in backpropagation?
69. What are the limitations of backpropagation?
70. Which rule is used in backpropagation algorithm?
71. Why is momentum used with backpropagation?
72. Why is it called backpropagation?
73. What are alternatives to backpropagation?
74. What is Delta in backpropagation?
75. What's the difference between feedforward and backpropagation?
