
CM412(R-20)

R.V.R. & J.C. COLLEGE OF ENGINEERING, GUNTUR - 522019


(Autonomous)
B.TECH. Semester-VI [Third year] Degree Examination
MODEL QUESTION PAPER
Subject Name: DEEP LEARNING
Time: Three Hours Maximum Marks: 70
All Questions carry equal marks.
Question No. 1 is compulsory. (14 x 1 = 14)
Answer one question from each unit. (4 x 14 = 56)

1. Answer the following Marks CO BTL


a) What is meant by shallow learning? 1M CO1 L1
b) Define cost function. 1M CO1 L1
c) What are the challenges motivating Deep Learning? 1M CO1 L2
d) Give examples of Computational Graphs. 1M CO1 L3
e) What is Dataset Augmentation? 1M CO2 L2
f) Give the use of Norm Penalties. 1M CO2 L1
g) Define empirical risk minimization. 1M CO2 L3
h) Illustrate Polyak Averaging. 1M CO2 L1
i) Define convolution. 1M CO3 L2
j) Define Pooling. 1M CO3 L1
k) What are Structured Outputs? 1M CO3 L2
l) Differentiate between neural networks and recurrent neural networks. 1M CO4 L1
m) What are Echo State Networks? 1M CO4 L3
n) Draw the block diagram of the LSTM recurrent network “cell”. 1M CO4 L2

UNIT – I

2. a) Illustrate the working of deep learning. (7 M) CO1 L2


b) Explain the back-propagation algorithm with an example to train a multilayer perceptron. (7 M) CO1 L3

(OR)

3. a) Discuss the technical forces driving advances in machine learning. (7 M) CO1 L2
b) Identify the role of Symbol-to-Symbol Derivatives. (7 M) CO1 L3

UNIT – II
4. a) Discuss Multi-Task Learning with suitable illustrations. (7 M) CO2 L2
b) Summarize several of the most prominent challenges involved in optimization for training deep models. (7 M) CO2 L2

(OR)

5. a) Develop the concepts of early stopping and dropout. Further, identify their advantages over training with neither early stopping nor dropout. (7 M) CO2 L3
b) Discuss how learning differs from pure optimization. (7 M) CO2 L3

UNIT – III

6. a) Illustrate the convolution operation. (7 M) CO3 L2


b) Discuss the different formats of data that can be used with convolutional networks. (7 M) CO3 L3

(OR)

7. a) Explain the components of a typical convolutional neural network layer. (7 M) CO3 L2
b) Explain the neuroscientific basis for convolutional networks. (7 M) CO3 L3

UNIT – IV

8. a) Write about Unfolding Computational Graphs with suitable illustrations. (7 M) CO4 L2
b) Explain briefly about Deep Recurrent Networks. (7 M) CO4 L2

(OR)

9. a) Discuss some examples of important design patterns for recurrent neural networks. (7 M) CO4 L2
b) Write briefly about Bidirectional RNNs. (7 M) CO4 L2

*********
CM412 DEEP LEARNING
L: 3   T: -   P: -   I: 30   E: 70   C: 3
Course Objectives:
At the end of the course, the student will be able to:
1. Demonstrate the major technology trends driving Deep Learning
2. Build, train and apply fully connected deep neural networks
3. Implement efficient (vectorized) neural networks
4. Analyze the key parameters and hyperparameters in a neural network's architecture

Course Outcomes:
At the end of the course, the student will be able to
CO1. Differentiate architecture of deep neural network.
CO2. Analyze the challenges in optimization of a network.
CO3. Build a convolutional neural network.
CO4. Build and train RNN and LSTMs.
Course Content:
UNIT – I 12 Hours
What is deep learning?: Introduction, Historical trends in deep learning, Artificial intelligence, machine
learning, and deep learning - Artificial intelligence, Machine learning, Learning rules and representations
from data, The “deep” in “deep learning”, Understanding how deep learning works, Why deep learning? Why
now? Challenges Motivating Deep Learning.
Deep Feed-forward Networks: Example: Learning XOR, Gradient-Based Learning, Hidden Units,
Architecture Design, Back-Propagation and Other Differentiation Algorithms, Historical Notes.
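The gradient-based learning and back-propagation topics above (examined in Q.2(b) of the model paper) are easiest to grasp through a worked example. The following is a minimal sketch in NumPy, not prescribed by the syllabus: the 2-4-1 sigmoid network, random seed, learning rate, and step count are all arbitrary illustrative choices.

```python
import numpy as np

# Minimal sketch of back-propagation: a 2-4-1 sigmoid MLP learning XOR
# with plain gradient descent. All sizes and hyperparameters here are
# arbitrary illustrative choices.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)              # hidden activations
    out = sigmoid(h @ W2 + b2)            # network output
    # backward pass: gradients of the squared error, layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent parameter updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0, keepdims=True)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```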

UNIT – II 12 Hours
Regularization for Deep Learning: Parameter Norm Penalties, Norm Penalties as Constrained
Optimization, Regularization and Under-Constrained Problems, Dataset Augmentation, Noise
Robustness, Semi-Supervised Learning, Multi-Task Learning, Early Stopping, Parameter Tying and
Parameter Sharing, Sparse Representations, Bagging and Other Ensemble Methods, Dropout,
Adversarial Training, Tangent Distance, Tangent Prop and Manifold Tangent Classifier.
Optimization for Training Deep Models: Pure Optimization, Challenges in Neural Network
Optimization, Basic Algorithms, Parameter Initialization Strategies, Algorithms with Adaptive
Learning Rates, Approximate Second-Order Methods, Optimization Strategies and Meta-Algorithms.
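Two regularizers from the list above, dropout and early stopping, can be shown in miniature. The sketch below assumes NumPy; the dropout function, the keep_prob value, and the patience-loop helper names (train_one_epoch, validation_loss) are illustrative assumptions, not library APIs.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(h, keep_prob=0.8, train=True):
    """'Inverted' dropout on a batch of activations (sketch only)."""
    if not train:
        return h                               # no masking at test time
    mask = rng.random(h.shape) < keep_prob     # Bernoulli keep-mask
    return h * mask / keep_prob                # rescale to preserve the mean

# Early stopping as a patience loop (helpers below are hypothetical):
# best, wait, patience = np.inf, 0, 10
# for epoch in range(max_epochs):
#     train_one_epoch(); val = validation_loss()
#     if val < best: best, wait = val, 0       # improved: checkpoint weights
#     else:          wait += 1
#     if wait >= patience: break               # stop before overfitting

print(dropout(rng.normal(size=(2, 5))))
```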

UNIT – III 12 Hours


Convolutional Networks: The Convolution Operation, Pooling, Convolution and Pooling as an
Infinitely Strong Prior, Variants of the Basic Convolution Function, Structured Outputs, Data Types,
Efficient Convolution Algorithms, Random or Unsupervised Features, The Neuroscientific Basis for
Convolutional Networks, Convolutional Networks and the History of Deep Learning.
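The convolution operation and pooling listed above become concrete in a few lines of code. Below is a deliberately naive sketch assuming NumPy; conv2d and max_pool are hypothetical helpers computing a "valid" cross-correlation (what deep learning libraries implement as convolution) and non-overlapping max pooling, and the example kernel is arbitrary.

```python
import numpy as np

def conv2d(img, kernel):
    """Naive 'valid' cross-correlation, which DL libraries call convolution."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling; trailing rows/columns are dropped."""
    oh, ow = x.shape[0] // size, x.shape[1] // size
    return x[:oh * size, :ow * size].reshape(oh, size, ow, size).max(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)
edge = np.array([[1.0, -1.0]])       # a tiny horizontal edge detector
print(max_pool(conv2d(img, edge)))
```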
UNIT – IV 12 Hours
Sequence Modelling - Recurrent and Recursive Nets: Unfolding Computational Graphs, Recurrent
Neural Networks, Bidirectional RNNs, Encoder-Decoder Sequence-to-Sequence Architectures, Deep
Recurrent Networks, Recursive Neural Networks, The Challenge of Long-Term Dependencies, Echo
State Networks, Leaky Units and Other Strategies for Multiple Time Scales, The Long Short-Term
Memory and Other Gated RNNs, Optimization for Long-Term Dependencies, Explicit Memory.
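The LSTM "cell" named above (and sketched in Q.1(n) of the model paper) reduces to four gate equations per time step. The following is a minimal single-step sketch assuming NumPy; the dimensions and random weights are arbitrary, and the bias terms are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate, each acting on the concatenated [h, x] vector.
Wf, Wi, Wo, Wg = (rng.normal(size=(n_hid, n_hid + n_in)) for _ in range(4))

def lstm_step(x, h, c):
    z = np.concatenate([h, x])
    f = sigmoid(Wf @ z)        # forget gate: what to erase from the cell
    i = sigmoid(Wi @ z)        # input gate: what to write to the cell
    o = sigmoid(Wo @ z)        # output gate: what to expose as hidden state
    g = np.tanh(Wg @ z)        # candidate cell update
    c = f * c + i * g          # new cell state
    h = o * np.tanh(c)         # new hidden state
    return h, c

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):   # run the cell over a length-5 sequence
    h, c = lstm_step(x, h, c)
print(h)
```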

Learning Resources:
Text Books:

1. Deep Learning, Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, 2016.

Reference Books:

1. Deep Learning with Python, Francois Chollet, Second Edition, Manning Publications, 2021.
2. Deep Learning: A Practitioner's Approach, Josh Patterson and Adam Gibson, O'Reilly, 2017.
3. Neural Networks and Deep Learning, Charu C. Aggarwal, Springer, 2018.
List of Paper Setters:

Dr. G. Anuradha
Associate Professor, Dept. of CSE,
V.R. Siddhartha Engineering College,
Vijayawada.
Mail id: [email protected]
Mobile No.: 9441819173
Dr. K. Venkata Raju
Professor, Dept. of CSE,
KL University.
Mail id: [email protected]
Mobile No.: 9440477917
Dr. B. Tarakeswara Rao
Professor, Dept. of CSE(AI&DS),
KHIT, Guntur.
Mail id: [email protected]
Mobile No.: 9441045755
