20IT7301 - Deep Learning Syllabus (VRSEC)

20IT7301 - DEEP LEARNING

Course Category: Program Core
Credits: 3
Course Type: Theory
Lecture-Tutorial-Practice: 2-0-2
Prerequisites: 20IT6302 - Machine Learning
Continuous Evaluation: 30 marks
Semester End Evaluation: 70 marks
Total Marks: 100
Course Outcomes: Upon successful completion of the course, the student will be able to:
CO1: Analyze the performance of feed-forward neural networks with different hyperparameters
CO2: Apply CNNs, autoencoders, attention mechanisms, and GANs to image processing applications
CO3: Design a suitable RNN model for time-series applications
CO4: Create a suitable intelligent model for the given application
Contribution of Course Outcomes towards achievement of Program Outcomes (PO1-PO12)
and Program Specific Outcomes (PSO1, PSO2); 1 - Low, 2 - Medium, 3 - High:
CO1: 1 1 2 1 1 1 1 1
CO2: 2 2 2 1 1 1 2 2
CO3: 2 2 3 2 2 2 2 2 2
CO4: 3 2 3 2 3 2 2 3 3 3
Course Content:

UNIT I:
The Neural Network: Building Intelligent Machines, The Limits of Traditional
Computer Programs, The Mechanics of Machine Learning, The Neuron, Expressing
Linear Perceptrons as Neurons, Feed-Forward Neural Networks, Linear Neurons
and Their Limitations, Sigmoid, Tanh, and ReLU Neurons, Softmax Output Layers
Training Feed-Forward Neural Networks: Gradient Descent, The Delta Rule and
Learning Rates, Gradient Descent with Sigmoidal Neurons, The Backpropagation
Algorithm, Stochastic and Minibatch Gradient Descent, Test Sets, Validation
Sets, and Overfitting, Preventing Overfitting in Deep Neural Networks
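
Illustration (not part of the prescribed syllabus): a minimal Python/NumPy sketch of the UNIT I ideas of a feed-forward network with sigmoid neurons, trained by backpropagation and gradient descent on the toy XOR problem. The network size, learning rate, and variable names are illustrative assumptions, not taken from the textbook.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR inputs and targets (toy data chosen only for illustration)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros((1, 1))
lr = 1.0  # learning rate (assumed value)

for epoch in range(10000):
    # Forward pass with sigmoid activations
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: delta rule applied layer by layer (squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # typically approaches [0, 1, 1, 0]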
UNIT II:
Convolutional Neural Networks: Neurons in Human Vision, The Shortcomings of
Feature Selection, Filters and Feature Maps, The Convolutional Layer, Max
Pooling, Full Architectural Description of Convolutional Networks, Image
Preprocessing Pipelines, Accelerating Training with Batch Normalization
Embedding and Representation Learning: Learning Lower-Dimensional
Representations, Principal Component Analysis, Motivating the Autoencoder
Architecture, Denoising to Force Robust Representations, Sparsity in
Autoencoders
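
Illustration (not part of the prescribed syllabus): a minimal Python/NumPy sketch of the UNIT II notions of a convolutional filter producing a feature map, followed by ReLU and max pooling. The toy image and the 2x2 kernel are made-up assumptions for demonstration only.

import numpy as np

def conv2d(image, kernel):
    # Valid (no-padding) 2-D cross-correlation of one filter over one image
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Non-overlapping max pooling that downsamples each spatial dimension
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    pooled = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return pooled.max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)  # toy 6x6 "image" (assumed)
kernel = np.array([[1., 0.], [0., -1.]])          # hypothetical 2x2 edge-like filter
fmap = np.maximum(conv2d(image, kernel), 0)       # ReLU applied to the feature map
print(max_pool(fmap))                             # pooled feature map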

UNIT III:
Sequence Modeling: Recurrent and Recursive Nets: Unfolding Computational
Graphs, Recurrent Neural Networks, Bidirectional RNNs, Encoder-Decoder
Sequence-to-Sequence Architectures, Deep Recurrent Networks, Recursive Neural
Networks
The Challenge of Long-Term Dependencies: Echo State Networks, Leaky Units and
Other Strategies for Multiple Time Scales, The Long Short-Term Memory (LSTM)
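
Illustration (not part of the prescribed syllabus): a minimal Python/NumPy sketch of unfolding a vanilla recurrent network over time, as introduced in UNIT III. The sequence length, dimensions, and randomly drawn weights are arbitrary assumptions for the example.

import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 3, 5, 4  # assumed toy sizes

# Shared weights reused at every time step (the "unfolded" computational graph)
W_xh = rng.normal(0, 0.1, (input_dim, hidden_dim))
W_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

x_seq = rng.normal(size=(seq_len, input_dim))  # toy input sequence
h = np.zeros(hidden_dim)                       # initial hidden state

states = []
for t in range(seq_len):
    # The hidden state mixes the current input with the previous state
    h = np.tanh(x_seq[t] @ W_xh + h @ W_hh + b_h)
    states.append(h)

print(np.stack(states).shape)  # (4, 5): one hidden state per time step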
UNIT IV:
Advanced Topics in Deep Learning: Introduction, Attention Mechanisms,
Recurrent Models of Visual Attention, Attention Mechanisms for Machine
Translation, Neural Networks with External Memory: the Neural Turing Machine
Generative Adversarial Networks: Training a Generative Adversarial Network,
Using GANs for Generating Image Data, Conditional Generative Adversarial
Networks, Competitive Learning, Limitations of Neural Networks

Content Beyond the Syllabus: The Transformer Neural Network
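
Illustration (not part of the prescribed syllabus): a minimal Python/NumPy sketch of scaled dot-product attention, the operation underlying the attention mechanisms of UNIT IV and the Transformer noted above. The toy query/key/value shapes are assumptions made for the example.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Each query attends over all keys; the weights sum to 1 per query
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of queries to keys
    weights = softmax(scores, axis=-1)  # attention distribution
    return weights @ V                  # weighted sum of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 8))   # 2 queries of dimension 8 (assumed shapes)
K = rng.normal(size=(5, 8))   # 5 keys
V = rng.normal(size=(5, 8))   # 5 values
print(attention(Q, K, V).shape)  # (2, 8)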


Text Books and Reference Books:

Text Book(s):
[1]. Nikhil Buduma, Nicholas Locascio, "Fundamentals of Deep Learning:
     Designing Next-Generation Machine Intelligence Algorithms", O'Reilly
     Media, 2017
[2]. Ian Goodfellow, Yoshua Bengio, Aaron Courville, "Deep Learning"
     (Adaptive Computation and Machine Learning series), MIT Press, 2017
[3]. Charu C. Aggarwal, "Neural Networks and Deep Learning", Springer
     International Publishing AG, part of Springer Nature, 2018,
     ISBN 978-3-319-94462-3, ISBN 978-3-319-94463-0 (eBook)
Reference Book(s):
[1]. Li Deng and Dong Yu, "Deep Learning: Methods and Applications", Now
     Publishers, 2013
[2]. Michael Nielsen, "Neural Networks and Deep Learning", Determination
     Press, 2015
[3]. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN,
     Kaiser Ł, Polosukhin I, "Attention Is All You Need", Advances in Neural
     Information Processing Systems, 2017; 30.
E-resources and other digital material:
[1]. Mitesh Khapra, "Deep Learning", Sep 20, 2018,
     https://www.youtube.com/watch?v=4TC5s_xNKSs&list=PLH-xYrxjfO2VsvyQXfBvsQsufAzvlqdg9
[2]. Afshine Amidi and Shervine Amidi, "Deep Learning cheat sheets for
     Stanford's CS 230", 2018,
     https://github.com/afshinea/stanford-cs-230-deep-learning
[3]. Yoshua Bengio, "Deep Learning: Theoretical Motivations", Canadian
     Institute for Advanced Research, 2015,
     http://videolectures.net/deeplearning2015_bengio_theoretical_motivations/
[4]. Geoffrey Hinton's Google Tech Talk, "Recent Developments on Deep
     Learning", March 2010, https://www.youtube.com/watch?v=VdIURAu1-aU

Designation              Name in Capitals      Signature with Date

Course Coordinator       DR. T. ANURADHA
Program Coordinator      DR. G. KALYANI
Head of the Department   DR. M. SUNEETHA
