ANN Self Slides
Deep Learning
History of computing technology needed to process ANNs
The exponential growth in storage and
processing capacities
Geoffrey Hinton (inventor of deep learning in the 1980s)
To mimic how the human brain acts
100 billion neurons in the human brain
Each neuron is connected to about 1000
neighbors
The Neuron
Basic Neuron Structure
Individual neurons are almost useless
Synapse (the connection)
The signal passing process
Inputs (i/p) and outputs (o/p)
Input layer
Hidden Layers
Output Layer
Inputs are the independent variables for a single observation (e.g., one person: age, job, qualification, etc.)
One input row
Standardization/Normalization on input variables
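A minimal sketch of standardizing and normalizing the input variables, assuming NumPy; the feature values (age, years of experience, qualification level) are made up purely for illustration:

import numpy as np

# Each row is one observation of independent variables
# (hypothetical columns: age, years of experience, qualification level)
X = np.array([
    [25.0,  2.0, 1.0],
    [40.0, 15.0, 3.0],
    [31.0,  7.0, 2.0],
])

# Standardization: zero mean and unit variance per column
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Normalization (min-max): scale each column into the [0, 1] range
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))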
Synapse
Learning?
To give the program the ability to learn on its own
Single Layer Feed Forward NN
(Perceptron)
y = Actual Value
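A minimal sketch of the forward pass of such a single-layer network, assuming NumPy and a sigmoid activation (the slides do not fix a particular activation function); ŷ below stands for the predicted value that gets compared with the actual value y:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One input row (independent variables) and one weight per synapse
x = np.array([0.2, 0.5, 0.1])   # hypothetical standardized inputs
w = np.array([0.4, -0.3, 0.8])  # synapse weights
b = 0.0                         # bias term

# Weighted sum of the inputs, passed through the activation function
y_hat = sigmoid(np.dot(w, x) + b)  # predicted value ŷ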
Updating the weights from a single observation
We keep iterating until the cost C reaches its minimum
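The slides do not spell out which cost function C is; a common choice for this kind of example (see the list of cost functions linked below) is the squared-error cost, summed over all observations:

C = \frac{1}{2}(\hat{y} - y)^2 \quad \text{for one row}, \qquad C = \sum_i \frac{1}{2}(\hat{y}_i - y_i)^2 \quad \text{over all rows}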
So in this simple example, it all comes down to adjusting the weights until they predict an accurate result
So we go back and update the weights
All the rows share the same weights
So we arrive at the optimal weights
This whole process is called backpropagation
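A minimal sketch of that loop in code: go through the rows, compute ŷ, measure the error, and go back to adjust the shared weights. This uses a plain gradient-descent update with an assumed learning rate and the squared-error cost above; it is an illustration, not the exact procedure from the slides:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical training data: rows of inputs X and their actual values y
X = np.array([[0.2, 0.5, 0.1],
              [0.9, 0.1, 0.4],
              [0.3, 0.8, 0.6]])
y = np.array([0.0, 1.0, 1.0])

w = np.zeros(X.shape[1])  # weights shared by all rows
b = 0.0
lr = 0.1                  # assumed learning rate

for epoch in range(1000):
    for x_i, y_i in zip(X, y):               # one observation at a time
        y_hat = sigmoid(np.dot(w, x_i) + b)  # forward pass: predicted ŷ
        error = y_hat - y_i                  # compare ŷ with the actual y
        # Gradient of C = ½(ŷ − y)² through the sigmoid, back to the weights
        grad = error * y_hat * (1.0 - y_hat)
        w -= lr * grad * x_i                 # go back and update the weights
        b -= lr * grad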
https://stats.stackexchange.com/questions/154879/a-list-of-cost-functions-used-in-neural-networks-alongside-applications
The Gradient Descent
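The heading only names gradient descent; for reference, the standard update rule it refers to (with learning rate η) is:

w \leftarrow w - \eta \, \frac{\partial C}{\partial w}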