
APEX INSTITUTE OF TECHNOLOGY

DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING

DEEP LEARNING (20CSF-432)


Faculty: Dr. Amit Kukker (E16298)

Lecture – 4, 5
Functional units of ANN for pattern recognition tasks, Pattern classification
using perceptron, Multilayer feed forward neural networks
Deep Learning: Course Objectives
The course aims to:
1. Understand the key features in a neural network's architecture.
2. Understand the main fundamentals that drive Deep Learning.
3. Be able to build, train and apply fully connected deep neural networks.
4. Know how to implement efficient CNNs, RNNs, LSTMs, Bi-LSTMs, Autoencoders, Generative
Adversarial Networks, etc.
5. Implement the fundamental methods involved in deep learning, including the underlying
optimization concepts (gradient descent and backpropagation) and how they can be combined to
solve real-world problems.



COURSE OUTCOMES

On completion of this course, the students shall be able to:

CO1: Understand neural networks, their working and parameters, and various optimization methods
for neural networks.

CO2: Differentiate between the major types of neural network architectures and their use cases for
different problems (classification/recognition).

CO3: Understand different deep neural network model architectures and the tuning of their
parameters.

CO4: Design sequence models using different neural network architectures for new data problems,
based on their requirements and problem characteristics, and analyse their performance.

CO5: Describe the latest research being conducted in the field and the open problems that are yet to
be solved.



Unit-1 Syllabus
Unit-1: Basics of Artificial Neural Network

Computational models of neurons, Structure of neural networks, Functional
units of ANN for pattern recognition tasks, Pattern classification using
perceptron, Multilayer feed forward neural networks (MLFFNNs),
Backpropagation learning, Empirical risk minimization, Regularization,
Difficulty of training DNNs, Greedy layer-wise training, Optimization for
training DNNs, Newer optimization methods for neural networks (AdaGrad,
RMSProp, Adam)



SUGGESTIVE READINGS
TEXT BOOKS:
• T1: Deep Learning with Python by Francois Chollet, Publisher: Manning Publications
• T2: Deep Learning from Scratch: Building with Python from First Principles by Seth Weidman,
published by O'Reilly
• T3: Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, published by MIT Press

REFERENCE BOOKS:
• R1: Fundamentals of Deep Learning by Nithin Buduma, Nikhil Buduma and Joe Papa, O'Reilly
Publication, Second Edition
• R2: Deep Learning: A Practitioner's Approach by Josh Patterson and Adam Gibson, O'Reilly
Publication
• R3: Deep Learning for Coders with fastai and PyTorch by Jeremy Howard and Sylvain Gugger, O'Reilly
Publication
• R4: Deep Learning Using Python by S Lovelyn Rose, L Ashok Kumar, D Karthika Renuka, Wiley
Publication
Pattern Recognition Problem

Functional units form the building blocks for developing neural architectures to solve complex
pattern recognition problems.

• Patterns are everywhere in the digital world. A pattern can either be observed physically or
detected mathematically by applying algorithms.

Example: the colors on clothes, speech patterns, etc. In computer science, a pattern is
represented as a vector of feature values.

• In any pattern recognition task, we have a set of input patterns and the corresponding
output patterns.

• Depending on the nature of the output patterns and the nature of the task environment,
the problem can be identified as one of association, classification, or mapping.

• The given set of input-output pattern pairs forms only a few samples of an unknown
system.

• From these samples, the pattern recognition model should capture the characteristics of
the system.
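As a minimal sketch (with values invented purely for illustration), a pattern and a set of input-output pattern pairs might be represented in code as follows:

import numpy as np

# Hypothetical example: each input pattern is a feature vector (e.g. measured
# attributes of an object) and each output pattern is the class label that the
# model should associate with it.
input_patterns = np.array([
    [0.90, 0.10, 0.30],   # sample 1
    [0.20, 0.80, 0.70],   # sample 2
    [0.85, 0.15, 0.40],   # sample 3
])
output_patterns = np.array([0, 1, 0])   # labels for the three samples

# The pattern recognition model must capture the input-output behaviour of the
# unknown system from such sample pairs alone.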
An example of pattern recognition is classification, which attempts to
assign each input value to one of a given set of classes (for example,
determining whether a given email is "spam" or "non-spam"). This is in
contrast to pattern matching algorithms, which look for exact matches of
pre-existing patterns in the input.

• Pattern association has been widely used in distributed memory modeling.

• Pattern mapping employs image processing and syntactic pattern
recognition principles to recognize a known pattern before and after
motion.

• Speech spectra of steady vowels generated by a person, or hand-printed
characters, can be considered examples of patterns for pattern mapping
problems.
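Pattern mapping asks the network to realize a given input-output function. As a hedged sketch that is not taken from the slides, the example below uses hand-chosen (purely illustrative) weights in a small two-layer feedforward network to compute XOR, a mapping that a single perceptron cannot represent; it previews the multilayer feedforward networks (MLFFNNs) covered later in the unit.

import numpy as np

def step(z):
    # Threshold (step) activation used by each unit.
    return (z > 0).astype(int)

# Hand-chosen weights (illustrative only) for a 2-2-1 feedforward network.
W1 = np.array([[1.0, 1.0],     # hidden unit 1 behaves like OR(x1, x2)
               [1.0, 1.0]])    # hidden unit 2 behaves like AND(x1, x2)
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -1.0])     # output fires when OR is on but AND is off
b2 = -0.5

def forward(x):
    h = step(W1 @ x + b1)      # hidden layer
    return step(W2 @ h + b2)   # output layer

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, forward(np.array(x)))   # prints the XOR of the two inputs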
Pattern Association Problem


Example of Hetero Association


Example of Auto Association
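The original slides illustrate hetero-association and auto-association with figures only. As a hedged sketch of the underlying idea (the bipolar patterns below are invented for illustration, not taken from the slides), a linear correlation-matrix (Hebbian outer-product) memory can store hetero-associative pairs (an input pattern mapped to a different output pattern) or auto-associative pairs (a pattern mapped to itself, so that a stored pattern can be recalled from a noisy version):

import numpy as np

# Bipolar (+1/-1) patterns invented purely for this illustration.
a1 = np.array([1, -1, 1, -1, 1, -1])            # input pattern 1
a2 = np.array([1, 1, -1, -1, 1, 1])             # input pattern 2 (orthogonal to a1)
b1, b2 = np.array([1, -1]), np.array([-1, 1])   # outputs to be associated

# Hetero-association: Hebbian outer-product rule, W = sum_k (b_k a_k^T).
W_hetero = np.outer(b1, a1) + np.outer(b2, a2)
print(np.sign(W_hetero @ a1))        # recalls b1: [ 1 -1]

# Auto-association: store each pattern against itself, W = sum_k (a_k a_k^T).
W_auto = np.outer(a1, a1) + np.outer(a2, a2)
noisy_a1 = a1.copy()
noisy_a1[0] = -1                     # corrupt one component of a1
print(np.sign(W_auto @ noisy_a1))    # recovers a1: [ 1 -1  1 -1  1 -1]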


Basic functional units:
• There are three basic types of artificial neural network structures.

• The simplest networks of each of these types form the basic functional units.

• They are functional because they can, by themselves, perform some simple pattern
recognition tasks.

• They are basic because they form building blocks for developing neural network
architectures for complex pattern recognition tasks.

• They are (a minimal perceptron sketch for the feedforward case follows this list):

1) Feed Forward NNs
2) Feedback NNs
3) Competitive NNs
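Since the lecture topic includes pattern classification using the perceptron, here is a minimal sketch of the perceptron learning rule on a toy linearly separable problem; the data set, learning rate and number of epochs are arbitrary choices for illustration, not values taken from the slides.

import numpy as np

# Toy linearly separable data (AND-like problem), labels in {0, 1}.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w = np.zeros(2)    # weights
b = 0.0            # bias
lr = 0.1           # learning rate (arbitrary)

def predict(x):
    # Step activation: output 1 if the weighted sum exceeds the threshold.
    return 1 if np.dot(w, x) + b > 0 else 0

# Perceptron learning rule: w <- w + lr * (target - output) * x
for epoch in range(20):
    for x_i, t in zip(X, y):
        err = t - predict(x_i)
        w += lr * err * x_i
        b += lr * err

print([predict(x_i) for x_i in X])   # expected output: [0, 0, 0, 1]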



References
Main text books:
• "Neural Networks: A Comprehensive Foundation", S. Haykin (very good, theoretical)
• "Neural Networks for Pattern Recognition", C. Bishop (very good, accessible)
• "Neural Network Design" by Hagan, Demuth and Beale (introductory)
Books emphasizing the practical aspects:
• "Neural Smithing", Reed and Marks
• "Practical Neural Network Recipes in C++", T. Masters
Seminal paper (but now quite old!):
• "Parallel Distributed Processing", Rumelhart and McClelland et al.
Deep Learning books and tutorials:
• http://www.deeplearningbook.org/
• Introduction to Learning Rules in Neural Network - DataFlair (data-flair.training)
Neural Networks Literature
Review Articles:
• R. P. Lippmann, "An Introduction to Computing with Neural Nets", IEEE ASSP Magazine, pp. 4-22, April 1987.
• T. Kohonen, "An Introduction to Neural Computing", Neural Networks, 1, pp. 3-16, 1988.
• A. K. Jain, J. Mao, K. Mohiuddin, "Artificial Neural Networks: A Tutorial", IEEE Computer, March 1996, pp. 31-44.
Journals:
• IEEE Transactions on NN
• Neural Networks
• Neural Computation
• Biological Cybernetics
THANK YOU

For queries
Email: [email protected]
