7 - Neural Networks
Amer S. Zakaria
Department of Electrical Engineering
College of Engineering
Neurons
Neural Network
Implementation in Python
Brain versus Computer
Linear Neurons

The neuron first computes a weighted sum of its inputs:

$z = w_0 + \sum_i x_i w_i$

where $w_0$ is the bias, $x_i$ is the $i$th input, $w_i$ is the weight on the $i$th input, and the index $i$ runs over the input connections. A linear neuron simply outputs the weighted sum:

$y = f(z) = z$

[Figure: the linear activation, y versus z, is a straight line through the origin.]
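As a minimal sketch of this computation (the inputs, weights, and bias below are made-up values):

import numpy as np

x = np.array([2.0, -1.0, 0.5])   # inputs x_1..x_3 (illustrative)
w = np.array([0.4, 0.3, -0.2])   # weights w_1..w_3 (illustrative)
w0 = 0.1                         # bias

z = w0 + np.dot(x, w)  # weighted sum z = w0 + sum_i x_i * w_i
y = z                  # linear neuron: y = f(z) = z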
Linear Neurons: Example
Sigmoid Neurons: Used for Classification

The neuron again computes the weighted sum $z = w_0 + \sum_i x_i w_i$, but its output is

$y = \sigma(z) = \dfrac{1}{1 + e^{-z}}$

which squashes $z$ into the interval $(0, 1)$ and equals $0.5$ at $z = 0$.

[Figure: the S-shaped sigmoid curve of y versus z, crossing y = 0.5 at z = 0.]
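A one-line Python version of this activation (a sketch; the function name is my own):

import numpy as np

def sigmoid(z):
    # Squashes any real z into (0, 1); sigmoid(0) == 0.5
    return 1.0 / (1.0 + np.exp(-z))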
Sigmoid Neurons: Example
Using Sigmoid Neurons for Classification
[Figure: a feedforward network with an input layer, a hidden layer, and an output layer.]
Operation of Neural Networks
In the forward operation, you provide the feature vector representing an instance at the input of the network.

The numbers in the feature vector are then multiplied by the weights of the network (the input-to-hidden-layer weights), and a bias is added for each of the nodes in the hidden layer.

The results are the "activations" (weighted sums) of the neurons in the hidden layer. These activations are passed through an activation function (e.g., sigmoid, ReLU, etc.), and the outputs of the hidden-layer neurons then become the inputs to the neuron(s) in the subsequent layers.

These outputs are in turn multiplied by the hidden-to-output-layer weights, biases are added, and the results make up the activation(s) of the output neuron(s). These activation(s) are then passed through the activation function of the output layer (e.g., sigmoid for classification or linear for regression, which is enough at this level).

For classification, if the output value is greater than or equal to 0.5, you say that you detected Class 1. Otherwise, you say that you detected Class 0.
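The sketch below walks through this forward operation for a single instance, assuming one hidden layer of sigmoid neurons and a single sigmoid output neuron; all shapes and values are illustrative, not from the slides:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 4 features, 3 hidden neurons, 1 output neuron.
x  = np.array([5.1, 3.5, 1.4, 0.2])   # feature vector of one instance
W1 = np.random.randn(3, 4)            # input-to-hidden weights
b1 = np.random.randn(3)               # hidden-layer biases
W2 = np.random.randn(1, 3)            # hidden-to-output weights
b2 = np.random.randn(1)               # output-layer bias

z1 = W1 @ x + b1    # weighted sums ("activations") of the hidden neurons
a1 = sigmoid(z1)    # hidden-layer outputs
z2 = W2 @ a1 + b2   # weighted sum of the output neuron
y  = sigmoid(z2)    # network output in (0, 1)

predicted_class = 1 if y[0] >= 0.5 else 0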
Note: You also compute the cost function value to see whether your network is doing a good job. One example cost function is the mean squared error computed over the training, validation, or testing examples.

For example, if one input instance is from Class 1 (in a binary classification problem) and the output of the network is 0.8, then the squared error for this instance is (1 − 0.8)² = 0.04.
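A minimal sketch of the mean-squared-error computation over a few examples (the targets and outputs below are made up):

import numpy as np

targets = np.array([1, 0, 1])        # true class labels
outputs = np.array([0.8, 0.3, 0.6])  # network outputs
mse = np.mean((targets - outputs) ** 2)  # mean squared error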
Forward Operation
https://towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6
Backward Operation (Backpropagation)
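Backpropagation applies the chain rule to pass the cost's gradient backwards through the network; each weight is then nudged in the direction opposite its gradient. A minimal single-sigmoid-neuron sketch of one such update loop (all values and the learning rate are made up):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, t = np.array([2.0, -1.0]), 1.0      # one instance and its target
w, w0, lr = np.array([0.1, 0.2]), 0.0, 0.5

for _ in range(100):
    z = w0 + x @ w
    y = sigmoid(z)
    # Chain rule: dE/dw_i = dE/dy * dy/dz * dz/dw_i
    dE_dy = 2 * (y - t)    # derivative of the squared error (y - t)^2
    dy_dz = y * (1 - y)    # derivative of the sigmoid
    delta = dE_dy * dy_dz
    w  -= lr * delta * x   # gradient-descent weight update
    w0 -= lr * delta       # bias update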
NNs Regression in Python: Example (cont.)

# Initializing a regressor
from sklearn.neural_network import MLPRegressor
regr = MLPRegressor(max_iter=10000, random_state=1)
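A possible way to continue the example, fitting the regressor to synthetic data (the data and the sine target are my own, not from the slides):

import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.linspace(0, 1, 100).reshape(-1, 1)  # one input feature
y = np.sin(2 * np.pi * X).ravel()          # target values

regr = MLPRegressor(max_iter=10000, random_state=1)
regr.fit(X, y)          # train the network
y_pred = regr.predict(X)  # forward operation on new inputs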
Input (Sepal Length, Sepal Width, Petal Length, Petal Width) → NN Classifier → Output (Iris Species)
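A sketch of this classifier using sklearn's built-in Iris dataset (the train/test split and the parameters are illustrative choices, not necessarily the slides' exact setup):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)   # 4 features -> species label
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

clf = MLPClassifier(max_iter=10000, random_state=1)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))    # accuracy on held-out flowers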
TensorFlow Playground
Extra: Binary Threshold Neurons

McCulloch-Pitts (1943):
First compute a weighted sum of the inputs.
Then send out a fixed-size spike of activity if the weighted sum exceeds a threshold.
McCulloch and Pitts thought that each spike is like the truth value of a proposition, and each neuron combines truth values to compute the truth value of another proposition!

[Figure: step plot of output (0 or 1) versus weighted input, jumping from 0 to 1 at the threshold.]
$z = w_0 + \sum_i x_i w_i$

$y = \begin{cases} 1 & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}$

[Figure: step plot of y versus z, jumping from 0 to 1 at z = 0.]
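With suitable weights, such a neuron computes the truth value of a proposition, as McCulloch and Pitts envisioned; the sketch below picks weights so that it computes logical AND (the weights and bias are my own choice):

import numpy as np

def threshold_neuron(x, w, w0):
    # y = 1 if z >= 0, else 0
    z = w0 + np.dot(x, w)
    return 1 if z >= 0 else 0

# With w = [1, 1] and w0 = -1.5, the neuron fires only when both inputs are 1.
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, threshold_neuron(np.array(x), np.array([1.0, 1.0]), -1.5))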
Extra: Binary Threshold Neurons: Example
Extra: Rectified Linear (ReLU) Neurons (sometimes called Linear Threshold Neurons)
$z = w_0 + \sum_i x_i w_i$

$y = \begin{cases} z & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}$

[Figure: y versus z is zero for z < 0 and rises linearly with slope 1 for z ≥ 0.]
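A minimal NumPy version of this activation (a sketch; the function name is my own):

import numpy as np

def relu(z):
    # y = z if z >= 0, else 0
    return np.maximum(z, 0.0)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # -> [0. 0. 0. 1.5]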
Extra: ReLU Neurons: Example