
NEURAL NETWORKS AND DEEP LEARNING

[Revised Credit System]


(Effective from the academic year 2022 onwards)
SEMESTER - VI

Subject Code                     CSE_3273      IA Marks      50
Number of Lecture Hours/Week     03            Exam Marks    50
Total Number of Lecture Hours    36            Exam Hours    03
CREDITS                          03

Course objectives: This course will enable students to:

 Understand different neural network architectures.
 Train and test neural networks.
 Design a neural network system for a given problem.
 Appreciate the need for deep learning.
Module 1 (Teaching Hours: 6)
INTRODUCTION TO NEURAL NETWORKS AND ROSENBLATT’S PERCEPTRON, LMS & MULTILAYER PERCEPTRON:

What is a Neural Network? The Human Brain, Models of a Neuron, Network Architectures, Learning Processes and Learning Tasks. Rosenblatt’s Perceptron, Perceptron Convergence Theorem, Wiener Filter, Least Mean Square Algorithm, Markov Model Portraying the Deviation. Introduction to Multilayer Perceptron, Some Preliminaries, Batch Learning and On-Line Learning, The Back-Propagation Algorithm, XOR Problem, Heuristics for Making the Back-Propagation Algorithm Perform Better, Multilayer Perceptron Applications.

Text Book 1: Selected topics from Chapters 1 to 4.
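To make the perceptron material concrete, here is a minimal sketch of Rosenblatt’s learning rule in Python/NumPy; the toy AND dataset and the learning rate are illustrative assumptions, not prescribed by the syllabus:

import numpy as np

# Minimal Rosenblatt perceptron: learns the (linearly separable) AND
# function with the update rule w <- w + eta * (target - output) * x.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # toy inputs (assumed)
t = np.array([0, 0, 0, 1])                      # AND targets
eta = 0.1                                       # illustrative learning rate

w, b = np.zeros(2), 0.0
for epoch in range(20):
    for x, target in zip(X, t):
        y = 1 if np.dot(w, x) + b > 0 else 0    # hard-limiter activation
        w += eta * (target - y) * x             # weight update
        b += eta * (target - y)                 # bias update

print(w, b)  # a separating hyperplane for AND

The Perceptron Convergence Theorem covered in this module guarantees that this loop terminates with a separating hyperplane whenever the data are linearly separable.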


Module 2 (Teaching Hours: 6)
KERNEL METHODS & RADIAL-BASIS FUNCTION NETWORKS AND SELF-ORGANIZING MAPS:
Introduction, Cover’s Theorem on the Separability of Patterns, The Interpolation Problem, Radial-Basis-Function Networks, K-Means Clustering, Recursive Least-Squares Estimation of the Weight Vector, Hybrid Learning Procedure for RBF Networks. Feature-Mapping Models, Self-Organizing Map, Properties of the Feature Map, Case Studies.

Text Book 1: Sections 5.1-5.7. Text Book 2: Chapter 15, relevant journals.
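As an illustration of the Interpolation Problem and RBF networks listed above, the following sketch solves the exact interpolation problem by placing one Gaussian basis function on each training point and solving the resulting linear system; the kernel width and the data are assumptions for demonstration:

import numpy as np

# Exact RBF interpolation: one Gaussian basis per training point,
# with the weights obtained by solving Phi @ w = t.
def gaussian(r, sigma=1.0):                  # sigma is an assumed width
    return np.exp(-r**2 / (2 * sigma**2))

X = np.array([[0.0], [1.0], [2.0], [3.0]])   # toy 1-D inputs
t = np.array([0.0, 0.8, 0.9, 0.1])           # toy targets

Phi = gaussian(np.abs(X - X.T))              # Phi[i, j] = phi(|x_i - x_j|)
w = np.linalg.solve(Phi, t)                  # interpolation weights

def predict(x):
    return gaussian(np.abs(x - X.ravel())) @ w

print(predict(1.5))                          # value between training points

A practical RBF network would use fewer centres (chosen, for instance, by the K-Means clustering named in this module) and a regularized least-squares solve rather than exact interpolation.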

Module 3 (Teaching Hours: 6)
DYNAMICALLY DRIVEN RECURRENT NETWORKS AND RECURRENT HOPFIELD NETWORKS:
Introduction, Recurrent Network Architectures, Universal Approximation Theorem,
Controllability and Observability, Computational Power of Recurrent Networks,
Learning Algorithms, Back Propagation Through Time, Real-Time Recurrent
Learning. Operating Principles of the Hopfield Network, Stability Conditions of the
Hopfield Network, Associative Memories, Outer Product Method, Pseudoinverse Matrix
Method, Storage Capacity of Memories, Design Aspects of the Hopfield Network. Case
studies.
Text Book 1: Sections 15.1-15.8, relevant journals. Text Book 2: Sections 7.1-7.5, relevant journals.
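The outer-product storage method and associative recall named in this module can be sketched briefly; the stored patterns and the number of update sweeps below are illustrative:

import numpy as np

# Hopfield network with outer-product (Hebbian) storage:
# W = (1/n) * sum_p x_p x_p^T with zero diagonal; recall runs
# asynchronous sign updates starting from a corrupted probe.
patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1, -1, -1, -1]])   # toy bipolar patterns

n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)                            # no self-connections

state = np.array([1, -1, 1, -1, 1, 1])            # first pattern, one bit flipped
for sweep in range(5):                            # asynchronous updates
    for i in range(n):
        state[i] = 1 if W[i] @ state >= 0 else -1

print(state)  # converges to the first stored pattern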
Module 4 (Teaching Hours: 5)
FEEDFORWARD NEURAL NETWORKS:
Introduction, Pattern classification using perceptron, Multilayer feedforward neural
networks (MLFFNNs), Backpropagation learning, Empirical risk minimization,
Regularization, Autoencoders.
Text Book 3: Selected topics from Chapters 6, 7, and 14.
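As a pointer to what backpropagation learning and empirical risk minimization look like in code, here is a minimal one-hidden-layer MLFFNN trained by full-batch gradient descent on a toy regression task; the layer sizes, data, and learning rate are all assumptions for illustration:

import numpy as np

# One-hidden-layer feedforward network trained with backpropagation
# to minimize the empirical (mean squared) risk on toy data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (64, 2))
t = (X[:, :1] * X[:, 1:]).reshape(-1, 1)        # toy target: x1 * x2

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
eta = 0.1

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                    # forward pass
    y = h @ W2 + b2
    err = y - t                                 # dL/dy for MSE
    dW2 = h.T @ err / len(X); db2 = err.mean(0) # backward pass (chain rule)
    dh = err @ W2.T * (1 - h**2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
    W2 -= eta * dW2; b2 -= eta * db2
    W1 -= eta * dW1; b1 -= eta * db1

print(float(np.mean(err**2)))                   # empirical risk after training

Adding a weight-decay term to each gradient gives the simplest form of the regularization this module discusses.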

Module 5 (Teaching Hours: 8)
DEEP NEURAL NETWORKS AND CONVOLUTIONAL NEURAL NETWORKS:
Introduction, Difficulty of training DNNs, Greedy layerwise training, Optimization for training DNNs, Newer optimization methods for neural networks (AdaGrad, RMSProp, Adam), Second-order methods for training, Regularization methods (dropout, drop connect, batch normalization). Convolutional neural networks (CNNs): Introduction to CNNs – convolution, pooling, Deep CNNs, Different deep CNN architectures – LeNet, AlexNet, VGG, PlacesNet, Training CNNs: weight initialization, batch normalization, hyperparameter optimization, Understanding and visualizing CNNs.
Text Book 3: Selected topics from Chapters 7, 9, and 15.
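Since AdaGrad, RMSProp, and Adam are named explicitly, the sketch below shows the Adam update rule in isolation; the hyperparameters are the commonly used defaults, and the one-dimensional quadratic objective is an assumed stand-in for a network loss:

import numpy as np

# Adam: exponentially decayed first/second moment estimates with
# bias correction, here minimizing the toy objective (w - 3)^2.
def grad(w):
    return 2.0 * (w - 3.0)

w, m, v = 0.0, 0.0, 0.0
beta1, beta2, eps, eta = 0.9, 0.999, 1e-8, 0.1   # common defaults

for step in range(1, 201):
    g = grad(w)
    m = beta1 * m + (1 - beta1) * g              # first moment
    v = beta2 * v + (1 - beta2) * g**2           # second moment
    m_hat = m / (1 - beta1**step)                # bias corrections
    v_hat = v / (1 - beta2**step)
    w -= eta * m_hat / (np.sqrt(v_hat) + eps)

print(w)  # approaches the minimizer w = 3

RMSProp is the same update without the first moment and bias correction; AdaGrad replaces the decayed second moment with a running sum of squared gradients.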

Module 6 (Teaching Hours: 5)
RECURRENT NEURAL NETWORKS:
Recurrent neural networks (RNNs): Sequence modeling using RNNs, Back-propagation through time, Long Short-Term Memory (LSTM), Generative models: Restricted Boltzmann Machines (RBMs).
Text Book 3: Selected topics from Chapters 10 and 20.
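A single forward step of the LSTM cell covered here can be sketched as follows; the gate ordering, weight shapes, and random inputs are illustrative conventions rather than a prescribed implementation:

import numpy as np

# One LSTM step: forget, input, and output gates plus a tanh candidate,
# all computed from the concatenated previous hidden state and input.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    z = W @ np.concatenate([h_prev, x]) + b      # all four blocks at once
    H = len(h_prev)
    f = sigmoid(z[:H])                           # forget gate
    i = sigmoid(z[H:2*H])                        # input gate
    o = sigmoid(z[2*H:3*H])                      # output gate
    g = np.tanh(z[3*H:])                         # candidate cell state
    c = f * c_prev + i * g                       # new cell state
    h = o * np.tanh(c)                           # new hidden state
    return h, c

rng = np.random.default_rng(0)
H, D = 4, 3                                      # assumed hidden/input sizes
W = rng.normal(0.0, 0.1, (4 * H, H + D))
b = np.zeros(4 * H)
h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H), W, b)
print(h, c)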
Course outcomes:
After studying this course, students will be able to:
• Understand the different types of learning, activation functions, and the simple perceptron.
• Analyze MLP and RBF networks.
• Model and design SOM systems for a given problem.
• Design deep learning networks.
• Analyze deep learning architectures.
Text Books:
1. Simon Haykin, Neural Networks and Learning Machines, 3rd ed., Pearson, Prentice Hall, 2010.
2. Ivan Nunes da Silva, Danilo Hernane Spatti, Rogerio Andrade Flauzino, Luisa Helena Bartocci Liboni, and Silas Franco dos Reis Alves, Artificial Neural Networks: A Practical Course, Springer International Publishing, 2017.
3. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016. Available online: http://www.deeplearningbook.org
4. John Paul Mueller and Luca Massaron, Deep Learning For Dummies, Wiley & Sons, Inc., 2019.

Reference Books:
1. Daniel Graupe, Principles of Artificial Neural Networks, 3rd ed., World Scientific Publishing, 2013.
2. Eugene Charniak, Introduction to Deep Learning, MIT Press, 2018.
NEURAL NETWORKS AND DEEP LEARNING LABORATORY
[Revised Credit System]

(Effective from the academic year 2022 onwards)


SEMESTER - VI
Subject Code                      CSE_3283      IA Marks      60
Number of Practical Hours/Week    03            Exam Marks    40
Total Number of Practical Hours   36            Exam Hours    02
CREDITS                           01

Course objectives:

This laboratory course enables students to:

 Understand the implementation details of an artificial neural network: the types of activation functions, learning algorithms, the types of perceptrons, and other related algorithms and their functionality.
 Design and implement neural networks for real-time systems.
 Design and apply deep neural networks for real-time problem solving.

Implement the different learning algorithms of an artificial neural network.

1. a) Implement the basic neuron and update the learning parameter using different activation functions. Show the graph.
   b) Implement the neuron using the error-correction learning algorithm and the memory-based learning algorithm.
2. Implement the neuron using the Hebbian learning algorithm and show the graph to differentiate the hypotheses of Hebbian and covariance learning.
3. a) Implement the gate operations using a single-layer perceptron.
   b) Implement the XOR gate operation using a multi-layer perceptron and show the error propagation by iterating the learning rate.
4. Implement the XOR operation using a generalized radial-basis function network. Also show how the value changes when updated with the regularization parameter.
5. Implement the single linear neuron model with Hebbian adaptation to describe the principal component of an arbitrary input distribution.
6. a) Implement the features used in self-organizing maps using the competitive learning algorithm.
   b) Implement the back-propagation algorithm for training a recurrent network, using the temporal operation as a parameter in a layered feedforward network.
7. Implement the Hopfield network using the associative memory method.
8. Design a neural network system by taking a suitable data set and demonstrate training, testing, and performance of the system.
9. Design a neural network system by taking a suitable data set and demonstrate training, testing, and performance of the system.
10. Implement and apply optimization methods for neural networks (AdaGrad, RMSProp, Adam) on any relevant dataset.
11. Apply, train, and visualize different deep CNN architectures such as LeNet, AlexNet, VGG, and PlacesNet on the MNIST dataset.
12. a) Implement a single forward step of the RNN cell (a minimal sketch of parts a and b follows this list).
    b) Code the forward propagation of the RNN.
    c) Implement the LSTM cell.
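For experiment 12, a minimal sketch of the vanilla RNN step and its forward propagation over a sequence might look like the following; the sizes and random data are illustrative, and the lab may prescribe its own interfaces:

import numpy as np

# Experiment 12a: one forward step of a vanilla RNN cell,
# h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b).
def rnn_step(x_t, h_prev, W_xh, W_hh, b):
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

rng = np.random.default_rng(0)
D, H = 3, 5                                   # assumed input/hidden sizes
W_xh = rng.normal(0.0, 0.1, (H, D))
W_hh = rng.normal(0.0, 0.1, (H, H))
b = np.zeros(H)

# Experiment 12b: forward propagation over a toy sequence.
h = np.zeros(H)
for x_t in rng.normal(size=(6, D)):
    h = rnn_step(x_t, h, W_xh, W_hh, b)
print(h)                                      # final hidden state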
Course outcomes:

At the end of this course, students will gain:

1. The ability to understand the basic motivation and functioning of the most common types of neural networks and their activation functions.
2. The ability to understand basic learning algorithms by implementing them.
3. The ability to understand the choices between types of perceptrons and the limitations of a model for a given setting.
4. The ability to critically evaluate model performance and interpret the results using various methods and concepts.

Conduction of Practical Examination:

1. All laboratory experiments (TWELVE in number) are to be included for the practical examination.
2. Strictly follow the instructions as printed on the cover page of the answer script.
3. Marks distribution: Procedure + Conduction: 15 + 25 (total 40).
4. Change of experiment is not allowed.

References:

1. Leonardo De Marchi and Laura Mitchell, Hands-On Neural Networks: Learn How to Build and Train Your First Neural Network Model Using Python, 1st ed., Packt Publishing, 2019.
2. Michael Taylor, Neural Networks: A Visual Introduction for Beginners, Blue Windmill Media, 2017.
3. Simon Haykin, Neural Networks and Learning Machines, 3rd ed., Pearson, Prentice Hall, 2009.
4. Jeff Heaton, Introduction to Neural Networks with Java, 2nd ed., Heaton Research, Inc., 2008.
