AIDL03 EvolutionOfAI
Neuron
[Figure: a neuron sums its inputs x1, x2, …, xn (z = Σ xi); a threshold unit f, aka the activation, produces the output y = 1 when z >= 𝞱, else 0.]
Compute OR with a neuron — truth table:

 x1  x2  …  xn | OR
  0   0  …   0 |  0
  1   0  …   0 |  1
  1   1  …   0 |  1
  …   …  …   … |  …
  1   1  …   1 |  1
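The OR neuron above can be sketched in a few lines of Python. The unit weights and the threshold 𝞱 = 1 are assumptions chosen so that the neuron fires whenever any input is 1, matching the truth table.

```python
# A hard-threshold neuron computing logical OR.
# Assumed setup: every input weight is 1 and the threshold theta is 1,
# so z = sum(inputs) and the neuron fires (y = 1) when z >= theta.

def or_neuron(inputs, theta=1):
    """Return 1 if the sum of the inputs reaches the threshold, else 0."""
    z = sum(inputs)
    return 1 if z >= theta else 0

# Reproduce rows of the truth table
for row in [(0, 0), (1, 0), (1, 1)]:
    print(row, "->", or_neuron(row))
```

With threshold n instead of 1, the same neuron would compute AND rather than OR.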
Activation functions:
❖ Hard threshold (aka hardlims)
❖ Sigmoid: 1 / (1 + e^−x)
❖ sin(x)
❖ Gaussian: e^(−x²)
❖ Linear: x
❖ tanh: (e^x − e^−x) / (e^x + e^−x)
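The activation functions listed above can be written directly from their formulas; a plain-Python sketch:

```python
import math

# The slide's activation functions, implemented from their formulas.

def step(x):        # hard threshold ("hardlims"): 1 if x >= 0, else 0
    return 1 if x >= 0 else 0

def sigmoid(x):     # 1 / (1 + e^-x), squashes into (0, 1)
    return 1 / (1 + math.exp(-x))

def gaussian(x):    # e^(-x^2), peaks at x = 0
    return math.exp(-x * x)

def linear(x):      # identity: x
    return x

def tanh(x):        # (e^x - e^-x) / (e^x + e^-x), squashes into (-1, 1)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))
```

Sigmoid and tanh are smooth (differentiable) alternatives to the hard threshold, which is what makes gradient-based training possible.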
Weights
[Figure: perceptron with weights — inputs x1, x2, …, xn are multiplied by weights w1, w2, …, wn, and a bias w0 (on a constant input 1) is added; Σ produces the logit z, and the activation f(z) outputs 1 when z >= 0, else 0.]

z = w0 + w1·x1 + w2·x2 + … + wn·xn
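The weighted perceptron reduces to one line of code. The AND weights in the example are an assumption for illustration, not from the slide:

```python
# Perceptron matching the slide's formula:
#   z = w0 + w1*x1 + w2*x2 + ... + wn*xn,  y = 1 if z >= 0 else 0

def perceptron(x, w, w0):
    """x: inputs, w: weights, w0: bias. Returns the hard-threshold output."""
    z = w0 + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if z >= 0 else 0

# Example (assumed weights): w = [1, 1], w0 = -1.5 computes AND —
# z only reaches 0 when both inputs are 1.
print(perceptron([1, 1], [1, 1], -1.5))
print(perceptron([1, 0], [1, 1], -1.5))
```

Note how the bias replaces the explicit threshold 𝞱 of the earlier neuron: comparing z against 𝞱 is the same as adding w0 = −𝞱 and comparing against 0.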
Application: decision boundary
[Figure: two perceptrons (Σ|f) share the inputs x1, x2; each output defines its own linear decision boundary.]
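For a two-input perceptron, the decision boundary is the line where the logit is zero: w0 + w1·x1 + w2·x2 = 0. A sketch with illustrative weights (assumed, not from the slide) whose boundary is the line x1 + x2 = 1:

```python
# The decision boundary of a two-input perceptron is the line
#   w0 + w1*x1 + w2*x2 = 0
# Points are classified by which side of that line they fall on.
# Assumed weights for illustration: boundary is x1 + x2 = 1.
w0, w1, w2 = -1.0, 1.0, 1.0

def classify(x1, x2):
    z = w0 + w1 * x1 + w2 * x2
    return 1 if z >= 0 else 0

print(classify(0.2, 0.3))  # below the line x1 + x2 = 1
print(classify(0.8, 0.9))  # above the line
```

A single perceptron can only draw one straight line; combining several (as in the figure) carves the plane into more complex regions, which is what hidden layers build on.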
Hidden layers
[Figure: a network with hidden layers of neurons between the inputs and the outputs. Image: fdeloche]
❖ An example from Melanie Mitchell:
❖ My mother said that the cat that flew with her sister to Hawaii the year before you
started at that new high school is now living with my cousin.
❖ Q: Who is living with my cousin?
❖ RNNs have trouble processing this - need longer memory
❖ In 1995, Hochreiter and Schmidhuber proposed Long Short-Term Memory
(LSTM) as a solution
❖ In 2016, Google Translate used 8 encoder + 8 decoder layers of 1024 LSTM cells each
Classification
❖ playground.tensorflow.org
❖ Introduce noise, increase train-to-test ratio
❖ Change the pattern and observe
❖ Play with the controls and train the neural network to classify
❖ No. of input & output nodes, layers
Regression

 x | -1  0  1  2  3  4
 y | -3 -1  1  3  5  7

[Figure: a single neuron — input x with weight w1, bias w0 on a constant input 1, Σ, output y.]
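The table above follows y = 2x − 1, so the single neuron y = w0 + w1·x can fit it exactly. A minimal gradient-descent sketch (learning rate and step count are assumptions, not from the slide):

```python
# Fit the slide's regression data with one linear neuron: y = w0 + w1*x
xs = [-1, 0, 1, 2, 3, 4]
ys = [-3, -1, 1, 3, 5, 7]   # exactly y = 2x - 1

w0, w1, lr = 0.0, 0.0, 0.05  # assumed learning rate
for _ in range(2000):
    # gradients of mean squared error with respect to w0 and w1
    g0 = sum(2 * ((w0 + w1 * x) - y) for x, y in zip(xs, ys)) / len(xs)
    g1 = sum(2 * ((w0 + w1 * x) - y) * x for x, y in zip(xs, ys)) / len(xs)
    w0 -= lr * g0
    w1 -= lr * g1

print(round(w0, 2), round(w1, 2))  # converges near w0 = -1, w1 = 2
```

This is the same perceptron as before, but with a linear activation instead of a hard threshold, which turns classification into regression.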
❖ Video: And the Turing Award for 2018 goes to… (up to 3:45)
❖ Geoffrey Hinton: “If you have an idea, and it seems to you it has to be right,
don’t let people tell you it is silly. Just ignore them!”