
ANN Self Slides

This document discusses the basics of artificial neural networks and deep learning. It covers the history and key concepts such as neurons, layers, weights, activation functions, and learning algorithms. The main points:
- Artificial neural networks were inspired by the human brain and made practical by growing computing power.
- A basic neural network has an input layer, hidden layers, and an output layer connected by weighted links between neurons. It learns by adjusting the weights through algorithms such as gradient descent.
- Common activation functions include the sigmoid, tanh, and rectified linear unit (ReLU). The choice depends on whether the output is binary or real-valued.
- Neural networks learn by minimizing a cost function through iterative weight updates.

Uploaded by Haider

Artificial Neural Networks

Deep Learning
 The history of computing technology that made ANNs practical
 The exponential growth in storage and processing capacity
 Geoffrey Hinton (a pioneer of deep learning since the 1980s)
 The goal: to mimic how the human brain works
 Roughly 100 billion neurons in the human brain
 Each neuron is connected to about 1,000 neighbors
The Neuron
Basic Neuron Structure
 An individual neuron is almost useless on its own
 The synapse (the connection between neurons)
 The signal-passing process
Inputs and Outputs
 Input layer
 Hidden layers
 Output layer
 The inputs are the independent variables for one single observation (e.g., one person: age, job, qualification, etc.)
 One input row at a time
 Apply standardization/normalization to the input variables
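The standardization step above can be sketched in plain Python. The age values here are hypothetical, purely for illustration; standardization rescales a variable to zero mean and unit variance so that no single input dominates the weighted sums:

```python
# Hypothetical 'Age' column for five observations
ages = [25.0, 32.0, 47.0, 51.0, 62.0]

# Compute mean and (population) standard deviation
mean = sum(ages) / len(ages)
var = sum((a - mean) ** 2 for a in ages) / len(ages)
std = var ** 0.5

# Standardize: subtract the mean, divide by the standard deviation
standardized = [(a - mean) / std for a in ages]
```

After this step the column has mean 0 and variance 1, which is the usual precondition before feeding values into the input layer.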
Synapse

 Each weight determines the strength of the signal that passes along a synapse
 Weights are adjusted through the process of learning
 (e.g., gradient descent, backpropagation)
Step 1: The Weighted Sum
 The neuron computes the weighted sum of its inputs
Step 2: The Activation Function
 The activation function is applied to the weighted sum
Step 3: Passing the Signal
 The neuron decides whether to pass the signal on or not
 This process is repeated throughout the NN hundreds and thousands of times
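The three steps above can be sketched for a single neuron. The sigmoid is chosen here only as an illustrative activation; the inputs, weights, and bias are hypothetical:

```python
import math

def neuron(inputs, weights, bias):
    # Step 1: the weighted sum of the inputs
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Step 2: apply the activation function (here, sigmoid)
    a = 1.0 / (1.0 + math.exp(-z))
    # Step 3: the activated value is the signal passed on to the next layer
    return a
```

With a weighted sum of zero the sigmoid outputs exactly 0.5; any other input is squashed into the open interval (0, 1).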
The Activation Functions
 Assuming the dependent variable is binary, which of the discussed functions should we use?
 Hidden layers: rectifier (ReLU)
 Output layer: sigmoid
 This combination is very common in ANNs
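A minimal sketch of these two activation functions (plain Python, no libraries):

```python
import math

def relu(z):
    # Rectifier: zero for negative inputs, identity for positive ones;
    # the common choice for hidden layers
    return max(0.0, z)

def sigmoid(z):
    # Squashes any real number into (0, 1); the common choice for a
    # binary output layer, since the result reads as a probability
    return 1.0 / (1.0 + math.exp(-z))
```

ReLU passes positive signals through unchanged and kills negative ones; the sigmoid turns the final weighted sum into a value between 0 and 1.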
How do Neural Networks Work?

 The case of property evaluation
 A simple example of calculating the price
 The main power of a NN lies in its hidden layers
A Simple Price Calculation (No Hidden Layer)
How do Neural Networks Work?

 Not every input carries a non-zero weight into every neuron
 First neuron: only X1 and X2 are critical
 The other inputs are not in the scope of this neuron
 Third neuron:
 Fifth neuron: deals with the age of the property
 A threshold of, say, 100 years
 The rectifier function can be applied
 The price jumps once this threshold is crossed
 All the neurons combined give the network its power to predict the price (y)
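The idea of hidden neurons that each specialize in a subset of inputs can be sketched as follows. The two inputs (area, age), all weights, and the threshold behavior are hypothetical, purely for illustration:

```python
def relu(z):
    return max(0.0, z)

def predict_price(area, age, w_hidden, w_out):
    # Each hidden neuron applies ReLU to its own weighted combination
    # of the inputs; a zero weight means that input is out of its scope
    h = [relu(area * w1 + age * w2) for (w1, w2) in w_hidden]
    # A linear output neuron combines the hidden activations into a price
    return sum(hi * wo for hi, wo in zip(h, w_out))

# Hypothetical weights: neuron 1 looks only at area, neuron 2 only at age
w_hidden = [(0.8, 0.0), (0.0, -0.5)]
w_out = [100.0, 50.0]
price = predict_price(120.0, 10.0, w_hidden, w_out)
```

Here the second neuron stays silent (ReLU clips its negative sum to zero) and the first one drives the prediction, which is exactly the "each neuron has its own scope" idea from the slide.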
How do Neural Networks Learn?

 Learning?
 Giving the program the means to learn on its own
Single-Layer Feed-Forward NN (Perceptron)
 y = the actual value
 Weights are updated
 Based on a single observation at a time
 We keep iterating until the cost C reaches its minimum
 In this simple example, it is all about adjusting the weights until the prediction is accurate
 So we go back and update the weights
 All the rows share the same weights
 This is how we arrive at the optimal weights
 This whole process is called backpropagation
 https://stats.stackexchange.com/questions/154879/a-list-of-cost-functions-used-in-neural-networks-alongside-applications
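The learning loop described above can be sketched for a single linear neuron minimizing the squared cost C = ½(ŷ − y)² by gradient descent. The data, learning rate, and iteration count are hypothetical:

```python
# Hypothetical training rows (inputs, actual value); consistent with y = x1 + 2*x2
data = [([1.0, 2.0], 5.0), ([2.0, 1.0], 4.0), ([3.0, 3.0], 9.0)]
w = [0.0, 0.0]   # shared weights, updated across all rows
lr = 0.05        # learning rate (hypothetical)

for epoch in range(500):
    for x, y in data:
        # Forward pass: prediction from the current weights
        yhat = sum(xi * wi for xi, wi in zip(x, w))
        # Error term: derivative of the cost with respect to yhat
        err = yhat - y
        # Backward pass: dC/dw_i = err * x_i; step against the gradient
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
```

Each row nudges the shared weights a little; iterating until the cost stops shrinking recovers the underlying relation, which is the "go back and update the weights" loop from the slide.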
The Gradient Descent

 How are the weights adjusted?
 The brute-force approach
 Easy enough when there is only a single weight
 Gradient descent: calculate the slope of the cost curve and move toward the best weight
 Two-dimensional space
 Three-dimensional space
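The brute-force idea for a single weight can be sketched as a grid search over candidate values (hypothetical data). It works in one dimension but becomes hopeless as the number of weights grows, which is why gradient descent is used instead:

```python
# Hypothetical (x, y) pairs consistent with y = 2x
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

def cost(w):
    # Mean squared error of the one-weight model yhat = w * x
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

# Brute force: try every candidate weight on a grid from -5 to 5
candidates = [i / 100.0 for i in range(-500, 501)]
best_w = min(candidates, key=cost)
```

With 1,001 candidates per weight, two weights would already need about a million cost evaluations, and the count multiplies with every extra weight.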
The Problem of Local Optima
Stochastic Gradient Descent
 https://iamtrask.github.io/2015/07/27/python-network-part2/
 http://neuralnetworksanddeeplearning.com/chap2.html
Training the ANN with Stochastic
Gradient Descent
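The contrast between batch gradient descent and stochastic gradient descent can be sketched with a hypothetical one-weight model. Batch descent sums the gradient over all rows before one update; SGD updates after each row, in shuffled order, which is cheaper per step and can help escape shallow local minima:

```python
import random

# Hypothetical (x, y) pairs consistent with y = 3x
data = [(1.0, 3.0), (2.0, 6.0), (4.0, 12.0)]

def batch_step(w, lr=0.01):
    # One update from the gradient averaged over ALL rows
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def sgd_epoch(w, lr=0.01):
    # One update per ROW, visited in random order
    rows = data[:]
    random.shuffle(rows)
    for x, y in rows:
        w -= lr * 2 * (w * x - y) * x
    return w

w_batch = w_sgd = 0.0
for _ in range(200):
    w_batch = batch_step(w_batch)
    w_sgd = sgd_epoch(w_sgd)
```

Both variants converge here; the practical difference shows up on large datasets, where SGD starts improving the weights long before a full pass over the data is finished.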
