Deep Learning: Modules 3, 4 & 5
Department of CSE
Continuous Internal Examination - II
Course Code: 21CS743    Course Title: Deep Learning    Semester: VII
Time: 1:30 Hrs    Max. Marks: 50
Module-3
1. (CO3, K3, M3) [10 Marks]
Explain the concept of Empirical Risk Minimization (ERM) in the context of deep learning. Discuss its advantages and limitations.
Or
Given a practical machine learning scenario, explain how you would apply Empirical Risk Minimization (ERM) to train a deep learning model. Discuss the key steps involved and the challenges you might encounter.
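For reference alongside the question above, a minimal sketch of what ERM looks like in code, assuming a linear model, a squared-error loss, and synthetic data (all illustrative choices, not prescribed by the question): the empirical risk is the average training loss, and training repeatedly takes gradient steps to reduce it.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                                      # training inputs
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)    # training targets

w = np.zeros(3)    # model parameters
lr = 0.1
for _ in range(200):
    preds = X @ w
    risk = np.mean((preds - y) ** 2)          # empirical risk: mean loss over the training set
    grad = 2.0 / len(y) * X.T @ (preds - y)   # gradient of the empirical risk
    w -= lr * grad                            # gradient step
print("final empirical risk:", risk)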
2. (CO3, K3, M3) [10 Marks]
How do vanishing and exploding gradients impact the training process? What techniques can be employed to mitigate these issues?
Or
Compare and contrast the effectiveness of gradient clipping and normalization techniques in mitigating the vanishing and exploding gradient problems. Discuss the trade-offs involved.
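As a reference point for the gradient-clipping alternative above, a minimal NumPy sketch of clipping by global norm; the function name and the default threshold are assumptions made for this example.

import numpy as np

def clip_by_global_norm(grads, max_norm=1.0):
    # Rescale all gradients by a common factor so their combined L2 norm
    # does not exceed max_norm (a standard remedy for exploding gradients).
    total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (total_norm + 1e-12))
    return [g * scale for g in grads]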
3. (CO3, K3, M3) [10 Marks]
Compare and contrast SGD with batch gradient descent and mini-batch gradient descent. Discuss the advantages and disadvantages of each approach.
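For reference, a sketch showing that the three variants in the question differ only in how many training examples contribute to each update; the linear model and MSE loss are illustrative assumptions.

import numpy as np

def gradient_descent_step(X, y, w, lr=0.01, batch_size=None):
    # batch_size=None        -> full-batch gradient descent
    # batch_size=1           -> stochastic gradient descent (SGD)
    # 1 < batch_size < len(y)-> mini-batch gradient descent
    if batch_size is None:
        idx = np.arange(len(y))
    else:
        idx = np.random.choice(len(y), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    grad = 2.0 / len(yb) * Xb.T @ (Xb @ w - yb)   # MSE gradient for a linear model
    return w - lr * grad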
4. (CO3, K2, M3) [10 Marks]
Explain the importance of proper parameter initialization in deep learning models.
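For reference, sketches of two widely used initialization schemes (Glorot/Xavier and He); the helper names are chosen for this example only.

import numpy as np

def glorot_uniform(fan_in, fan_out, rng=np.random.default_rng()):
    # Xavier/Glorot initialization: keeps activation variance roughly equal
    # across layers; commonly paired with tanh/sigmoid units.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_normal(fan_in, fan_out, rng=np.random.default_rng()):
    # He initialization: variance 2 / fan_in; commonly paired with ReLU units.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))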
5. (CO3, K3, M3) [10 Marks]
Describe the AdaGrad algorithm in detail.
Or
Explain the core concept of adaptive learning rates in optimization algorithms. How does AdaGrad utilize this concept to adjust learning rates for different parameters?
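For reference, a minimal sketch of one AdaGrad parameter update; the default hyperparameter values are illustrative assumptions.

import numpy as np

def adagrad_update(w, grad, accum, lr=0.01, eps=1e-8):
    # AdaGrad accumulates the squared gradient for every parameter and divides
    # the step by the square root of that accumulation, so parameters with a
    # history of large gradients receive smaller effective learning rates.
    accum = accum + grad ** 2
    w = w - lr * grad / (np.sqrt(accum) + eps)
    return w, accum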
6. (CO3, K3, M3) [10 Marks]
Explain the RMSProp algorithm and its relationship to the AdaGrad algorithm.
Or
Analyze the differences between AdaGrad and RMSProp in terms of their update rules and convergence behavior. Discuss the scenarios where one algorithm might outperform the other.
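For reference, a minimal sketch of one RMSProp update, which differs from the AdaGrad sketch above only in how the squared gradients are accumulated; the default hyperparameters are assumptions.

import numpy as np

def rmsprop_update(w, grad, avg_sq, lr=0.001, rho=0.9, eps=1e-8):
    # RMSProp replaces AdaGrad's ever-growing sum of squared gradients with an
    # exponentially decaying average, so the effective learning rate does not
    # shrink towards zero over long training runs.
    avg_sq = rho * avg_sq + (1.0 - rho) * grad ** 2
    w = w - lr * grad / (np.sqrt(avg_sq) + eps)
    return w, avg_sq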
7. (CO3, K4, M3) [10 Marks]
Discuss the factors to consider when selecting an optimization algorithm for a deep learning model.
Or
Analyze the trade-offs between different optimization algorithms in terms of convergence speed, generalization performance, and computational efficiency. Discuss how to select the most appropriate algorithm for a given deep learning task.
8. (CO3, K4, M3) [10 Marks]
How does momentum help accelerate convergence and overcome local minima? Discuss the role of momentum in algorithms like SGD with Momentum and Adam.
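For reference, a minimal sketch of the classical momentum update used by SGD with Momentum; the default coefficient is an illustrative assumption.

def momentum_update(w, grad, velocity, lr=0.01, beta=0.9):
    # Classical momentum: the velocity is an exponentially weighted sum of past
    # gradients, which damps oscillations and builds speed along directions of
    # consistent descent.
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity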
Module-4
1. (CO4, K3, M4) [10 Marks]
Describe the process of convolution in detail, using a 5x5 image and a 3x3 kernel as an example. Show the step-by-step results.
Image:
3 3 2 1 0
0 0 1 3 1
3 1 2 2 3
2 0 0 2 2
2 0 0 0 1
Kernel:
0 1 2
2 2 0
0 1 2
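For reference, a NumPy sketch that computes the result for the image and kernel above, assuming the usual CNN convention (cross-correlation, i.e. no kernel flip), valid padding, and stride 1.

import numpy as np

image = np.array([[3, 3, 2, 1, 0],
                  [0, 0, 1, 3, 1],
                  [3, 1, 2, 2, 3],
                  [2, 0, 0, 2, 2],
                  [2, 0, 0, 0, 1]])
kernel = np.array([[0, 1, 2],
                   [2, 2, 0],
                   [0, 1, 2]])

out = np.zeros((3, 3), dtype=int)   # output size: (5 - 3 + 1) x (5 - 3 + 1)
for i in range(3):
    for j in range(3):
        # one step: element-wise product of the 3x3 patch with the kernel, then sum
        out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
print(out)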
2. (CO4, K3, M4) [10 Marks]
What is the purpose of pooling layers in CNNs? Describe the difference between max pooling and average pooling with an example.
3. (CO4, K3, M4) [10 Marks]
Apply max pooling to the given feature map
1 2 3 4 5 6 7
8 9 10 11 12 13 14
15 16 17 18 19 20 21
22 23 24 25 26 27 28
29 30 31 32 33 34 35
36 37 38 39 40 41 42
with a stride of 1, 2 and. Comment on the result.
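For reference, a max-pooling sketch that can be run with the strides named in the question; the 2x2 pool size is an assumption, since the question fixes only the strides.

import numpy as np

def max_pool2d(x, pool=2, stride=2):
    # Slide a pool x pool window over x with the given stride and keep the
    # maximum of each window.
    h, w = x.shape
    out_h = (h - pool) // stride + 1
    out_w = (w - pool) // stride + 1
    out = np.zeros((out_h, out_w), dtype=x.dtype)
    for i in range(out_h):
        for j in range(out_w):
            r, c = i * stride, j * stride
            out[i, j] = x[r:r + pool, c:c + pool].max()
    return out

feature_map = np.arange(1, 43).reshape(6, 7)   # the 6x7 map from the question
print(max_pool2d(feature_map, stride=1))
print(max_pool2d(feature_map, stride=2))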
4. (CO4, K4, M4) [10 Marks]
How do convolution and pooling layers act as an infinitely strong prior for visual data? What are the implications of this?
5. (CO4, K2, M4) [10 Marks]
Discuss the following in the context of convolution: valid, same, full, unshared, and tiled convolution, with the required diagrams and mathematical equations.
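For reference, a quick 1-D NumPy check of how the valid, same, and full modes change the output size (unshared and tiled convolution need their own weight layouts and are not shown); the input and kernel values are arbitrary.

import numpy as np

x = np.arange(1.0, 8.0)          # 1-D input of length 7
k = np.array([1.0, 0.0, -1.0])   # kernel of length 3

# Output lengths for input length n and kernel length m:
#   valid -> n - m + 1,  same -> n,  full -> n + m - 1
for mode in ("valid", "same", "full"):
    print(mode, np.convolve(x, k, mode=mode).shape)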
6. (CO4, K2, M4) [10 Marks]
What is tiled convolution? Explain it in detail with a neat diagram.
7. (CO4, K3, M4) [10 Marks]
What do you understand by structured outputs? With a neat diagram, explain a recurrent convolutional network.
Or
How can you adapt a recurrent convolutional network for pixel labelling? Discuss with a neat diagram.
8. (CO4, K3, M4) [10 Marks]
Beyond images, what other types of data can CNNs process? How are these data types preprocessed and fed into the network?
9. (CO4, K2, M4) [10 Marks]
Explain data types in the context of convolutional networks, with examples.
10. (CO4, K2, M4) [10 Marks]
Write short notes on efficient convolution algorithms.
11. (CO4, K4, M4) [10 Marks]
Describe the architecture of LeNet-5. How did it contribute to the development of CNNs?
12. (CO4, K4, M4) [10 Marks]
What were the key innovations in AlexNet that significantly improved image classification accuracy? How did it overcome the limitations of previous CNN architectures?
Module-5
1. (CO5, K2, M5) [10 Marks]
Explain the concept of unfolding a computational graph in the context of RNNs.
2. (CO5, K2, M5) [10 Marks]
Discuss the advantages and disadvantages of unfolding RNNs for training and inference.
3. (CO5, K3, M5) [10 Marks]
Describe the basic architecture of a Recurrent Neural Network (RNN).
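For reference, a minimal sketch of the forward pass of a vanilla RNN, showing the shared weights and the hidden state carried across time steps; the weight shapes are assumptions made for this example.

import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    # Vanilla RNN forward pass: the same weights are reused at every time step
    # and the hidden state carries information forward:
    #   h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in x_seq:
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states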
4. (CO5, K3, M5) [10 Marks]
How do RNNs handle sequential data compared to traditional feedforward neural networks?
5. (CO5, K2, M5) [10 Marks]
Explain Bidirectional RNNs (Bi-RNNs) with a neat diagram.
6. (CO5, K3, M5) [10 Marks]
How do Bi-RNNs improve the performance of RNNs, especially for tasks like language modeling and machine translation?
7. (CO5, K2, M5) [10 Marks]
Discuss the challenges and limitations of Bi-RNNs.
8. (CO5, K2, M5) [10 Marks]
Explain the concept of Recursive Neural Networks (RecNNs).
9. (CO5, K2, M5) [10 Marks]
Discuss the role of the forget gate, input gate, and output gate in LSTM cells.
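For reference, a minimal sketch of one LSTM step showing where the three gates act; the dictionary-based parameter layout is an assumption made for compactness.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # One LSTM time step. W, U, b are dicts holding the input-to-hidden,
    # hidden-to-hidden, and bias parameters of each block.
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])   # forget gate
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])   # input gate
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])   # output gate
    g = np.tanh(W["g"] @ x_t + U["g"] @ h_prev + b["g"])   # candidate state
    c = f * c_prev + i * g     # forget gate erases old memory, input gate writes new
    h = o * np.tanh(c)         # output gate controls what the cell exposes
    return h, c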
10. (CO5, K2, M5) [10 Marks]
Write a short note on preprocessing, contrast normalization, and dataset augmentation in the context of computer vision.
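For reference, a minimal sketch of global contrast normalization, one common preprocessing step; the scale and regularization constants are illustrative assumptions.

import numpy as np

def global_contrast_normalization(img, s=1.0, lam=10.0, eps=1e-8):
    # Subtract the per-image mean and rescale so every image ends up with
    # roughly the same contrast.
    img = img - img.mean()
    contrast = np.sqrt(lam + np.mean(img ** 2))
    return s * img / max(contrast, eps)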
11. (CO5, K2, M5) [10 Marks]
Discuss the techniques used in deep learning for speech recognition.
12. (CO5, K2, M5) [10 Marks]
Write short notes on NLP and NLMs. List the key differences between NLP and NLMs.
13. (CO5, K2, M5) [10 Marks]
Explain recommender systems.
15. (CO5) [10 Marks]
Discuss the challenges and strategies for training large-scale deep learning models. How can techniques like distributed training and model parallelism be used to address these challenges?