01-NDL Theory and Lab Syllabus
Module – 3
DYNAMICALLY DRIVEN RECURRENT NETWORKS AND RECURRENT HOPFIELD NETWORKS: 6 Hours
Introduction, Recurrent Network Architectures, Universal Approximation Theorem,
Controllability and Observability, Computational Power of Recurrent Networks,
Learning Algorithms, Back Propagation Through Time, Real-Time Recurrent
Learning. Operating Principles of the Hopfield Network, Stability Conditions of the
Hopfield Network, Associative Memories, Outer Product Method, Pseudoinverse Matrix
Method, Storage Capacity of Memories, Design Aspects of the Hopfield Network. Case
studies.
Text Book 1: 15.1-15.8, relevant journals. Text Book 2: 7.1-7.5, relevant journals.
Module – 4
FEEDFORWARD NEURAL NETWORKS: 5 Hours
Introduction, Pattern classification using perceptron, Multilayer feedforward neural
networks (MLFFNNs), Backpropagation learning, Empirical risk minimization,
Regularization, Autoencoders.
Text Book 3: Selected topics from chapter 6, chapter 7, chapter 14
Module – 5
DEEP NEURAL NETWORKS AND CONVOLUTIONAL NEURAL NETWORKS: 8 Hours
Introduction, Difficulty of training DNNs, Greedy layerwise training, Optimization for
training DNNs, Newer optimization methods for neural networks (AdaGrad, RMSProp,
Adam), Second order methods for training, Regularization methods (dropout, drop
connect, batch normalization). Convolutional neural networks (CNNs): Introduction to
CNNs – convolution, pooling, Deep CNNs, Different deep CNN architectures – LeNet,
AlexNet, VGG, PlacesNet, Training CNNs: weight initialization, batch normalization,
hyperparameter optimization, Understanding and visualizing CNNs.
Text Book 3: Selected topics from chapter 7, chapter 9, chapter 15
Module – 6
RECURRENT NEURAL NETWORKS: 5 Hours
Recurrent neural networks (RNNs): Sequence modeling using RNNs, Back propagation
through time, Long Short-Term Memory (LSTM), Generative models: Restricted
Boltzmann Machines (RBMs).
Text Book 3: Selected topics from chapter 10 and chapter 20
Course outcomes:
After studying this course, students will be able to:
• Understand the different types of learning, activation functions, and the simple perceptron.
• Analyze MLP and RBF networks.
• Model and design SOM systems for a given problem.
• Design deep learning networks.
• Analyze deep learning architectures.
Text Books:
1. Simon Haykin, Neural Networks and Learning Machines, 3rd ed, Pearson, Prentice Hall, 2009
2. Ivan Nunes da Silva, Danilo Hernane Spatti, Rogerio Andrade Flauzino, Luisa Helena Bartocci
Liboni, & Silas Franco dos Reis Alves, Artificial Neural Networks: A Practical Course,
Springer International Publishing, 2017
3. Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, 2016.
Available online: http://www.deeplearningbook.org
4. John Paul Mueller & Luca Massaron, Deep Learning For Dummies, Wiley & Sons, Inc., 2019
Reference Books:
1. Daniel Graupe, Principles of Artificial Neural Networks, 3rd ed, World Scientific Publishing,
2013
2. Eugene Charniak, Introduction to Deep Learning, MIT Press, 2018
NEURAL NETWORKS AND DEEP LEARNING LABORATORY
[ Revised Credit System ]
CREDITS - 01
Course objectives:
• Understand the implementation details of an artificial neural network: the types of
activation functions, learning algorithms, the types of perceptrons, and the functionality
of other related algorithms.
• Implement and design aspects of real-time systems.
• Design and apply deep neural networks for real-time problem solving.
1a. Implement the basic neuron and update the learning parameter using different
activation functions. Show the graph. (A sketch follows this list.)
1b. Implement the neuron using the error-correction learning algorithm and the
memory-based learning algorithm.
2. Implement the neuron using the Hebbian learning algorithm and show the graph that
differentiates the Hebbian and covariance learning hypotheses. (Sketch below.)
3a. Implement the gate operations using a single-layer perceptron. (Sketch below.)
3b. Implement the XOR gate operation using a multi-layer perceptron and show the
error propagation by iterating the learning rate. (Sketch below.)
4. Implement the XOR operation using a generalized radial basis function network.
Also show how the output changes when the regularization parameter is updated. (Sketch below.)
5. Implement the single linear neuron model with a Hebbian adaptation to describe the
principal component of an arbitrary input distribution. (Sketch below.)
6a. Implement the features used in self-organizing maps using the competitive learning
algorithm. (Sketch below.)
6b. Implement the back-propagation algorithm for training a recurrent network by
unfolding its temporal operation into a layered feedforward network.
7. Implement the Hopfield network using the associative memory method. (Sketch below.)
8. Design a neural network system by taking a suitable dataset and demonstrate training,
testing, and performance of the system.
9. Design a neural network system by taking a suitable dataset and demonstrate training,
testing, and performance of the system.
10. Implement and apply optimization methods for neural networks (AdaGrad, RMSProp,
Adam) on any relevant dataset. (Sketch below.)
11. Apply, train, and visualize different deep CNN architectures such as LeNet, AlexNet,
VGG, and PlacesNet on the MNIST dataset.
12a. Implement a single forward step of the RNN cell. (Sketch below.)
12b. Code the forward propagation of the RNN.
12c. Implement the LSTM cell.
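Illustrative sketches for selected experiments (Python/NumPy):
The sketches below are non-prescriptive starting points only; every dataset, dimension, seed, and hyperparameter in them is an assumption, not something fixed by the syllabus.

For experiment 1a, a minimal sketch that plots several common activation functions over an assumed range of the induced local field:

import numpy as np
import matplotlib.pyplot as plt

def step(v):    return np.where(v >= 0, 1.0, 0.0)   # hard limiter
def sigmoid(v): return 1.0 / (1.0 + np.exp(-v))     # logistic function
def relu(v):    return np.maximum(0.0, v)           # rectified linear unit

v = np.linspace(-5, 5, 200)                          # induced local field
for name, f in [("step", step), ("sigmoid", sigmoid),
                ("tanh", np.tanh), ("relu", relu)]:
    plt.plot(v, f(v), label=name)                    # one curve per activation
plt.xlabel("induced local field v"); plt.ylabel("neuron output")
plt.legend(); plt.show()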
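For experiment 2, a possible illustration of how the weight trajectories separate under Hebb's hypothesis and the covariance hypothesis, on assumed synthetic activities:

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.normal(1.0, 0.5, 200)                  # presynaptic activity (assumed)
y = 0.8 * x + rng.normal(0.0, 0.1, 200)        # correlated postsynaptic activity
eta = 0.01
x_bar, y_bar = x.mean(), y.mean()              # time-averaged activities
w_hebb, w_cov = [0.0], [0.0]
for xi, yi in zip(x, y):
    w_hebb.append(w_hebb[-1] + eta * xi * yi)                    # Hebb's hypothesis
    w_cov.append(w_cov[-1] + eta * (xi - x_bar) * (yi - y_bar))  # covariance hypothesis
plt.plot(w_hebb, label="Hebbian"); plt.plot(w_cov, label="covariance")
plt.xlabel("update step"); plt.ylabel("synaptic weight w")
plt.legend(); plt.show()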
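For experiment 3a, a minimal sketch of perceptron learning on the AND gate; the learning rate and epoch count are arbitrary assumptions:

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([0, 0, 0, 1], dtype=float)        # AND-gate targets
w, b, eta = np.zeros(2), 0.0, 0.1
for epoch in range(20):
    for x, t in zip(X, d):
        y = 1.0 if x @ w + b >= 0 else 0.0     # hard-limiter output
        w += eta * (t - y) * x                 # perceptron weight update
        b += eta * (t - y)                     # bias update
print("weights:", w, "bias:", b)
print("outputs:", [1.0 if x @ w + b >= 0 else 0.0 for x in X])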
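For experiment 3b, a hedged sketch of a 2-4-1 multilayer perceptron trained by batch backpropagation on XOR; the hidden width, seed, and learning rate are assumptions, and some seeds may need more epochs:

import numpy as np

rng = np.random.default_rng(3)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)    # hidden layer (4 units)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)    # output layer
sig = lambda v: 1.0 / (1.0 + np.exp(-v))
eta = 0.5
for epoch in range(5000):                          # batch backpropagation
    h = sig(X @ W1 + b1)                           # forward: hidden activations
    y = sig(h @ W2 + b2)                           # forward: network output
    delta2 = (d - y) * y * (1 - y)                 # output local gradients
    delta1 = (delta2 @ W2.T) * h * (1 - h)         # backpropagated gradients
    W2 += eta * h.T @ delta2; b2 += eta * delta2.sum(0)
    W1 += eta * X.T @ delta1; b1 += eta * delta1.sum(0)
print("XOR outputs:", y.ravel().round(3))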
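For experiment 4, one way to realize XOR with a generalized RBF network: two assumed Gaussian centres and a regularized least-squares (pseudoinverse) solve for the linear output weights, swept over the regularization parameter:

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([0, 1, 1, 0], dtype=float)            # XOR targets
centres = np.array([[0, 0], [1, 1]], dtype=float)  # assumed Gaussian centres

# Gaussian radial basis functions plus a bias column
G = np.exp(-np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) ** 2)
G = np.hstack([G, np.ones((len(X), 1))])

for lam in [0.0, 0.1, 1.0]:                        # regularization sweep
    # regularized least-squares solution for the linear output weights
    w = np.linalg.solve(G.T @ G + lam * np.eye(G.shape[1]), G.T @ d)
    print(f"lambda={lam}: outputs={G @ w}")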
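For experiment 5, a possible sketch of a single linear neuron with Oja's normalized Hebbian rule extracting the principal component of an assumed input distribution:

import numpy as np

rng = np.random.default_rng(2)
X = rng.multivariate_normal([0, 0], [[3, 1], [1, 1]], size=5000)  # assumed inputs
w, eta = rng.normal(size=2), 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)                    # Oja's rule: Hebbian term plus decay
pc = np.linalg.eigh(np.cov(X.T))[1][:, -1]        # leading eigenvector of the covariance
print("learned direction:", w / np.linalg.norm(w))
print("principal component (up to sign):", pc)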
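For experiment 6a, a minimal sketch of the winner-take-all competitive learning step that underlies self-organizing maps, on assumed clustered data; a full SOM would add a neighbourhood function around the winner:

import numpy as np

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(c, 0.1, (100, 2))       # three assumed input clusters
               for c in ([0, 0], [1, 0], [0, 1])])
rng.shuffle(X)
W = rng.normal(0.5, 0.2, (3, 2))                  # three competing units
eta = 0.1
for x in X:
    i = np.argmin(np.linalg.norm(W - x, axis=1))  # winner-take-all selection
    W[i] += eta * (x - W[i])                      # move only the winner toward the input
print("learned prototypes:\n", W.round(2))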
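For experiment 7, a minimal sketch of a Hopfield associative memory: outer-product storage of bipolar patterns and asynchronous recall from a corrupted probe (the patterns and network size are assumptions):

import numpy as np

patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1, -1, -1,  1,  1]])   # bipolar patterns to store
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n     # outer product (Hebbian) storage
np.fill_diagonal(W, 0)                            # no self-connections

def recall(x, sweeps=10):
    x = x.copy()
    for _ in range(sweeps):                       # asynchronous updates
        for i in np.random.permutation(n):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

probe = patterns[0].copy()
probe[0] *= -1                                    # corrupt one bit
print("recalled:", recall(probe))
print("stored  :", patterns[0])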
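For experiment 10, a hedged sketch of the AdaGrad, RMSProp, and Adam update rules minimizing a toy quadratic; all step sizes and decay constants are assumed, not recommended values:

import numpy as np

def grad(w):                                       # gradient of 0.5*(w1^2 + 10*w2^2)
    return np.array([w[0], 10.0 * w[1]])

def run(update, steps=200):
    w, state = np.array([3.0, 2.0]), {}
    for t in range(1, steps + 1):
        w = update(w, grad(w), state, t)
    return w

def adagrad(w, g, s, t, eta=0.5, eps=1e-8):
    s["G"] = s.get("G", 0) + g ** 2                # accumulate squared gradients
    return w - eta * g / (np.sqrt(s["G"]) + eps)

def rmsprop(w, g, s, t, eta=0.05, rho=0.9, eps=1e-8):
    s["E"] = rho * s.get("E", 0) + (1 - rho) * g ** 2  # exponential moving average
    return w - eta * g / (np.sqrt(s["E"]) + eps)

def adam(w, g, s, t, eta=0.1, b1=0.9, b2=0.999, eps=1e-8):
    s["m"] = b1 * s.get("m", 0) + (1 - b1) * g         # first-moment estimate
    s["v"] = b2 * s.get("v", 0) + (1 - b2) * g ** 2    # second-moment estimate
    m_hat = s["m"] / (1 - b1 ** t)                     # bias correction
    v_hat = s["v"] / (1 - b2 ** t)
    return w - eta * m_hat / (np.sqrt(v_hat) + eps)

for name, u in [("AdaGrad", adagrad), ("RMSProp", rmsprop), ("Adam", adam)]:
    print(name, "final w:", run(u))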
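For experiment 12a, a minimal sketch of a single forward step of a vanilla RNN cell, applied repeatedly over a short assumed sequence (dimensions and initialization are assumptions):

import numpy as np

def rnn_cell_forward(x_t, h_prev, Wxh, Whh, b):
    # one step: h_t = tanh(Wxh @ x_t + Whh @ h_prev + b)
    return np.tanh(Wxh @ x_t + Whh @ h_prev + b)

rng = np.random.default_rng(1)
n_x, n_h = 3, 5                                   # assumed input / hidden sizes
Wxh = rng.normal(0, 0.1, (n_h, n_x))              # input-to-hidden weights
Whh = rng.normal(0, 0.1, (n_h, n_h))              # hidden-to-hidden weights
b = np.zeros(n_h)
h = np.zeros(n_h)
for x_t in rng.normal(size=(4, n_x)):             # a short assumed input sequence
    h = rnn_cell_forward(x_t, h, Wxh, Whh, b)     # reuse the same cell each step
print("final hidden state:", h)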
Course outcomes:
1. Ability to understand the basic motivation and functioning of the most common types of
neural networks and their activation functions.
2. Ability to understand learning by implementing the basic learning algorithms.
3. Ability to understand the choices between types of perceptrons and the limitations of a model
for a given setting.
4. Ability to critically evaluate a model's performance and interpret the results using various
methods and concepts.
1. All laboratory experiments (TWELVE nos) are to be included for practical examination.
2. Strictly follow the instructions printed on the cover page of the answer script.
3. Marks distribution: Procedure + Conduction: 15 + 25 (40 marks)
4. Change of experiment is not allowed.
References:
1. Leonardo De Marchi and Laura Mitchell, Hands-On Neural Networks: Learn how to
build and train your first neural network model using Python, 1st ed, Packt Publishing,
2019.
2. Michael Taylor, A Neural network: A Visual Introduction for Beginners, Blue Windmill
Media, 2017.
3. Simon Haykin, Neural Networks and Learning Machines, 3rd ed, Pearson, Prentice Hall,
2009.
4. Jeff Heaton, Introduction to Neural Networks with Java, 2nd ed, Heaton Research, Inc.,
2008.