Deep Learning Handout


BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE, PILANI

WORK INTEGRATED LEARNING PROGRAMMES


Digital
Part A: Content Design
Course Title Deep Learning

Course No(s) DSECLZG524

Credit Units 5

Credit Model 1 - 0.5 - 1.5


1 unit for classroom hours, 0.5 unit for webinars, 1.5 units for
student preparation. 1 unit = 32 hours.

Content Authors Ms. Seetha Parameswaran

Version 1.0

Date August 07th, 2019

Course Objectives
No Course Objective

CO1 Introduce students to the basic concepts and techniques of Deep Learning.

CO2 Students will be able to apply deep learning models to applications.

CO3 Students will be able to evaluate deep learning algorithms.

Text Book(s)
T1 Deep Learning by Ian Goodfellow, Yoshua Bengio, Aaron Courville. MIT Press
2016.

Reference Book(s) & other resources


R1 Introduction to Deep Learning by Eugene Charniak. The MIT Press 2019

R2 Deep Learning with Python by Francois Chollet. 1st Edition. Manning Publications
Co 2018.

R3 Dive into Deep Learning by Aston Zhang, Zachary C. Lipton, Mu Li, Alex J. Smola. 2019

Content Structure
1. Introduction
1.1. Objective of the course
1.2. Review of Machine Learning and Neural Network

2. Deep Feedforward Network


2.1. Multilayer Perceptron
2.2. Gradient based learning
2.3. Architecture design
2.4. Back propagation

3. Regularization for Deep models


3.1. L2 and L1 Regularization
3.2. Constrained Optimization and Under-Constrained problems
3.3. Early Stopping
3.4. Parameter Tying and Parameter Sharing
3.5. Sparse representations
3.6. Dropout

4. Optimization of Deep models


4.1. Challenges in Neural Network Optimization
4.2. Basic algorithms in optimization
4.3. Parameter Initialization Strategies
4.4. Algorithms with Adaptive Learning Rates
4.5. Approximate Second-Order Methods
4.6. Optimization Strategies and Meta-Algorithms

5. Convolutional Networks
5.1. The Convolution Operation
5.2. Pooling
5.3. Convolution and Pooling as an Infinitely Strong Prior
5.4. Structured Outputs

6. Recurrent and Recursive Nets


6.1. Computational Graphs
6.2. Recurrent Neural Networks
6.3. Bidirectional RNNs
6.4. Encoder-Decoder Sequence-to-Sequence Architectures
6.5. Deep Recurrent Networks
6.6. Recursive Neural Networks
6.7. The Long Short-Term Memory and Other Gated RNNs

7. Autoencoders
7.1. Regularized Autoencoders
7.2. Representational Power, Layer Size and Depth
7.3. Stochastic Encoders and Decoders
7.4. Denoising Autoencoders
7.5. Applications of Autoencoders
8. Generative Adversarial Networks
8.1. Overview
8.2. Applications of GAN

9. Applications
9.1. Computer Vision
9.2. Speech Recognition
9.3. Natural Language Processing
9.4. Recommender Systems
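The gradient-based learning and back-propagation items in Section 2 can be previewed with a framework-free sketch. The example below is illustrative only (it is not taken from the course materials): it trains a single sigmoid neuron by gradient descent on the OR function, with all names and hyperparameters chosen for the illustration.

```python
# Illustrative sketch: gradient-based learning (Sections 2.2 and 2.4)
# for a single sigmoid neuron, using only the Python standard library.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    """data: list of ((x1, x2), target) pairs; returns learned (w1, w2, b)."""
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x1, x2), t in data:
            y = sigmoid(w1 * x1 + w2 * x2 + b)
            # Gradient of the squared error 0.5*(y - t)^2 via the chain rule
            # (back-propagation reduced to the single-neuron case).
            g = (y - t) * y * (1.0 - y)
            w1 -= lr * g * x1
            w2 -= lr * g * x2
            b  -= lr * g
    return w1, w2, b

# Learn logical OR, which a single neuron can represent.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = train(data)
preds = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
print(preds)  # expected: [0, 1, 1, 1]
```

A multilayer perceptron (Section 2.1) repeats exactly this chain-rule step layer by layer.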

Learning Outcomes:
No Learning Outcomes

LO1 Able to understand the basics of Deep Learning.

LO2 Able to understand and apply techniques related to Deep Learning to
applications.

LO3 Able to identify appropriate tools for problems related to Deep Learning
and implement solutions with them.

Part B: Learning Plan


Academic Term 2019 Semester 1

Course Title Deep Learning

Course No DSECL ZG524

Lead Instructor Dr. Sugata Ghosal

Session No. | Topic Title | Study / HW Resource Reference

Session 1: Objective of the course, logistics, applications of neural
networks, historical motivation and background, Perceptron and multilayer
perceptrons, characteristics of deep learning. [T1 – Ch1]

Session 2: Deep Feedforward Networks. Deep networks for universal Boolean
function representation, classification and approximation; perceptron
learning; perceptrons with differentiable activation functions;
optimization refresher. [T1 – Ch6]

Session 3: Deep Feedforward Networks. Gradient-based learning, cost
functions, output units, hidden units, computational graphs. [T1 – Ch6]

Session 4: Regularization for Deep models. L2 and L1 regularization,
constrained optimization and under-constrained problems, early stopping,
parameter tying and parameter sharing, sparse representations,
dropout. [T1 – Ch7]

Session 5: Optimization of Deep models. Challenges in neural network
optimization, basic algorithms in optimization, stochastic gradient
descent, momentum, parameter initialization strategies. [T1 – Ch8]

Session 6: Optimization of Deep models (contd). Algorithms with adaptive
learning rates: AdaGrad, RMSProp, Adam; approximate second-order methods;
conjugate gradient; batch normalization. [T1 – Ch8]

Session 7: Convolutional Networks. The convolution operation, pooling,
convolution and pooling as an infinitely strong prior, structured outputs,
demo of CNN on a computer vision application. [T1 – Ch9; R2 – Ch5]

Session 8: Review of Sessions 1 to 7. [Books, slides, web references]

Session 9: Convolutional Networks (contd); Recurrent and Recursive Nets.
Variants of CNN (ImageNet, VGG16, Inception, ResNet, AlexNet); unfolding
computational graphs, recurrent neural networks.
[T1 – Ch10; R2 – Ch6; web references]

Session 10: Recurrent and Recursive Nets (contd). Bidirectional RNNs, deep
recurrent networks, encoder-decoder sequence-to-sequence architectures,
recursive neural networks. [T1 – Ch10; R2 – Ch6]

Session 11: Recurrent and Recursive Nets (contd). The long short-term
memory, optimization for long-term dependencies. [T1 – Ch10; R2 – Ch6]

Session 12: Autoencoders. PCA, ICA, regularized autoencoders, sparse
encoders; representational power, layer size and depth. [T1 – Ch14; R2 – Ch8]

Session 13: Autoencoders (contd). Stochastic encoders and decoders,
denoising autoencoders, applications of autoencoders. [T1 – Ch14; R2 – Ch8]

Session 14: Generative Adversarial Networks. An overview, applications of
GAN. [T1 – Ch20]

Session 15: Applications of Deep Learning. Case studies related to Natural
Language Processing, Recommender Systems, etc. [T1 – Ch12]

Session 16: Review of Sessions 9 to 15. [Books, slides, web references]
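For self-study of the adaptive learning-rate methods named in Session 6, the Adam update rule can be written in a few lines of plain Python. The sketch below is illustrative rather than taken from the course materials; the function name, hyperparameters, and the quadratic test problem are all chosen for the example.

```python
# Illustrative sketch of the Adam optimizer (Session 6) on a 1-D quadratic,
# using only the Python standard library.
import math

def adam_minimize(grad, x, steps=5000, lr=0.01,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """Minimize a 1-D function given its gradient, using Adam."""
    m, v = 0.0, 0.0                          # first and second moment estimates
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g       # biased first moment
        v = beta2 * v + (1 - beta2) * g * g   # biased second moment
        m_hat = m / (1 - beta1 ** t)          # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x_min = adam_minimize(lambda x: 2 * (x - 3), x=0.0)
print(round(x_min, 2))  # settles near the minimum at x = 3
```

Setting beta1 = 0 recovers an RMSProp-style update, which makes the relationship between the Session 6 algorithms easy to see.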

Detailed Plan for Lab Work

Lab No. | Lab Objective | Lab Sheet Access URL | Session Reference
1 | Introduction to Tensorflow and Keras | | 2, 3
2 | Deep Neural Network with Back-propagation and optimization | | 4
3 | CNN | | 7
4 | RNN | | 10
5 | LSTM | | 11
6 | Autoencoders | | 13
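Dropout (syllabus Section 3.6), exercised alongside the regularization material, can also be sketched without any framework. The inverted-dropout example below is illustrative only; it assumes nothing beyond the Python standard library, and all names are chosen for the example.

```python
# Illustrative sketch of inverted dropout (Section 3.6) at training time.
import random

def inverted_dropout(activations, p_drop, rng):
    """Zero each activation with probability p_drop and rescale survivors
    by 1/(1 - p_drop) so the expected activation is unchanged."""
    keep = 1.0 - p_drop
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

rng = random.Random(0)
layer = [0.5] * 10_000                      # a layer of identical activations
dropped = inverted_dropout(layer, p_drop=0.2, rng=rng)
zeros = sum(1 for a in dropped if a == 0.0)
mean = sum(dropped) / len(dropped)
print(zeros)            # roughly 20% of units are zeroed
print(round(mean, 1))   # mean stays close to the original 0.5
```

The rescaling is why no extra correction is needed at test time, when dropout is simply switched off.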

Evaluation Scheme:
Legend: EC = Evaluation Component; AN = After Noon Session; FN = Fore Noon Session

No | Name | Type | Duration | Weight | Day, Date, Session, Time
EC-1 | Quizzes | Online | | 5% |
EC-2 | Assignments (2) | Take Home | | 25% |
EC-3 | Mid-Semester Test | Closed Book | 1.5 Hrs | 30% |
EC-4 | Comprehensive Exam | Open Book | 2.5 Hrs | 40% |

Note:
Syllabus for Mid-Semester Test (Closed Book): Topics in Session Nos. 1 to 8
Syllabus for Comprehensive Exam (Open Book): All topics (Session Nos. 1 to 16)
Important links and information:

Elearn portal: https://elearn.bits-pilani.ac.in or Canvas


Students are expected to visit the Elearn portal on a regular basis and stay up to date
with the latest announcements and deadlines.

Contact sessions: Students should attend the online lectures as per the schedule provided
on the Elearn portal.

Evaluation Guidelines:
1. EC-1 consists of two Quizzes. Students will attempt them through the course
pages on the Elearn portal. Announcements will be made on the portal, in a
timely manner.
2. EC-2 consists of either one or two Assignments. Students will attempt them
through the course pages on the Elearn portal. Announcements will be made
on the portal, in a timely manner.
3. For Closed Book tests: No books or reference material of any kind will be
permitted.
4. For Open Book exams: Use of books and any printed / written reference
material (filed or bound) is permitted. However, loose sheets of paper will
not be allowed. Use of calculators is permitted in all exams. Laptops/Mobiles
of any kind are not allowed. Exchange of any material is not allowed.
5. If a student is unable to appear for the Regular Test/Exam due to genuine
exigencies, the student should follow the procedure to apply for the Make-Up
Test/Exam which will be made available on the Elearn portal. The Make-Up
Test/Exam will be conducted only at selected exam centres on the dates to be
announced later.

It shall be the responsibility of the individual student to be regular in
maintaining the self-study schedule as given in the course hand-out, attend
the online lectures, and take all the prescribed evaluation components such
as Assignment/Quiz, Mid-Semester Test and Comprehensive Exam according to
the evaluation scheme provided in the hand-out.
