ANN Theory

This document discusses artificial neural networks and how they learn through backpropagation. It begins by explaining the biological basis of neural networks and how artificial neurons mimic biological neurons by having inputs, weights, and outputs. The document then describes how multi-layer neural networks are constructed and how gradient descent and backpropagation are used to adjust the weights to minimize error and optimize the network's predictions based on examples in the training data.


Machine Learning with Python

Dr. Ahmed Lawgali


Artificial Neural Network

❑ Artificial Neural Networks (ANNs) actually have a basis in biology!
❑ The biological neuron: (figure of a biological neuron)
❑ The artificial neuron also has inputs and outputs!
❑ Here we have two inputs and one output.
❑ The inputs will be the values of the features.
❑ Each input is multiplied by a weight.
❑ The weights initially start off as random values.
❑ Once the inputs have been multiplied by their weights, the results are passed to an activation function.
❑ There are many activation functions to choose from. (figure of common activation functions)
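
To make this concrete, here is a minimal sketch (not taken from the slides) of a single artificial neuron in Python with NumPy: two feature inputs, randomly initialised weights, a bias term, and a sigmoid activation. The bias term and the choice of sigmoid are illustrative assumptions.

import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)

x = np.array([0.7, 0.2])   # two input feature values
w = rng.normal(size=2)     # weights start off as random values
b = 0.0                    # bias term (assumed here; commonly used alongside the weights)

z = np.dot(x, w) + b       # inputs multiplied by their weights, then summed
output = sigmoid(z)        # the result is passed through the activation function
print(output)
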


❑ Multiple Perceptron Networks
➢ Example layout: an input layer, two hidden layers, and an output layer.
❑ Input layer
➢ Real values from the data
❑ Hidden layers
➢ The layers in between the input and output layers
❑ Output layer
➢ The final estimate of the output
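
As a sketch of the layout above (input layer, two hidden layers, output layer), the following forward pass uses NumPy only; the layer sizes, sigmoid activations, and input values are illustrative assumptions rather than anything specified in the slides.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
sizes = [3, 4, 4, 1]   # input layer, hidden layer 1, hidden layer 2, output layer

# One weight matrix and one bias vector per connection between consecutive layers
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    # Propagate an input vector through every layer of the network.
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)   # weighted sum, then activation
    return a

x = np.array([0.5, -1.2, 0.3])   # real values from the data (input layer)
print(forward(x))                # final estimate of the output
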
❑ How can we evaluate the performance of a neuron?
❑ We can use a cost function to measure how far off we are from the expected value.
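
As a small sketch, one common cost function is the mean squared error (MSE); the target and prediction values below are made up purely for illustration.

import numpy as np

def mse_cost(y_pred, y_true):
    # Mean squared error: the average squared distance from the expected values.
    return np.mean((y_pred - y_true) ** 2)

y_true = np.array([1.0, 0.0, 1.0])   # expected values
y_pred = np.array([0.8, 0.1, 0.6])   # network outputs
print(mse_cost(y_pred, y_true))      # 0.0 only when every prediction is exact
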
❑ We are still missing a key step: actually "learning".
❑ We need to figure out how we can use our neurons and the measurement of error (our cost function) to correct our predictions, in other words, to "learn".

❑ Gradient descent is an optimization algorithm for finding the minimum of a function.
❑ To find a local minimum, we take steps proportional to the negative of the gradient.

❑ Gradient Descent (series of figures in the original slides)

❑ Visually we can see what parameter value to choose to minimize our cost!

❑ Using gradient descent we can figure out the best parameters for minimizing our cost, for example, finding the best values for the weights of the neuron inputs.
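
A minimal gradient descent sketch: start from an arbitrary parameter value and repeatedly step in the direction of the negative gradient until the parameter settles near a minimum. The simple quadratic cost is an illustrative stand-in for a real network's cost surface.

def cost(w):
    return (w - 3.0) ** 2        # a toy cost with its minimum at w = 3

def gradient(w):
    return 2.0 * (w - 3.0)       # derivative of the cost with respect to w

w = 10.0                         # arbitrary starting guess
learning_rate = 0.1              # size of each step (an assumed value)

for step in range(100):
    w -= learning_rate * gradient(w)   # step proportional to the negative gradient

print(w)   # approaches 3.0, the parameter value that minimises the cost
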
❑ How can we quickly find the optimal parameters or weights across our entire network?
❑ This is where backpropagation comes in!

❑ Backpropagation is used to calculate the error contribution of each neuron after a batch of data is processed.
❑ It relies heavily on the chain rule to go back through the network and calculate these errors.
❑ Backpropagation works by calculating the error at the output and then distributing it back through the network layers.
❑ It requires a known desired output for each input value (supervised learning).
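
Below is a sketch of backpropagation for a tiny one-hidden-layer network, showing the chain rule carrying the output error back through the layers and a gradient descent update on every weight. The network size, data, learning rate, and the use of sigmoid units with a squared-error loss are all illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 2))                                  # a small batch of inputs
y = ((X[:, 0] + X[:, 1]) > 0).astype(float).reshape(-1, 1)   # known desired outputs (supervised)

W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)                # input -> hidden weights
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)                # hidden -> output weights
lr = 0.5

for epoch in range(1000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)              # hidden layer activations
    out = sigmoid(h @ W2 + b2)            # network output

    # Backward pass: error at the output, distributed back through the layers
    d_out = (out - y) * out * (1 - out)   # chain rule through the output activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # chain rule back into the hidden layer

    # Gradient descent step on every weight and bias
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(np.round(out.ravel(), 2))           # predictions move toward the 0/1 targets
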
How do Neural Networks Learn?
(series of figures in the original slides)

Backpropagation
(series of figures in the original slides)
