Introduction To Deep Learning
Anna Petrovicheva
IOTG Computer Vision
Image credit: DeepMind, Prisma, Yayvo, Google Translate, Redmond Pie, TechRepublic, Brit
Tesla autopilot
Image credit: Autopilot Full Self Driving Demonstration Nov 18 2016 Realtime Speed
Brief history
● 1965: first idea
● AI winter
● 1998: LeNet-5
● 2000s: “The biggest issue of this paper is that it relies on neural networks”
● 2012: groundbreaking results in ImageNet contest
○ Old algorithms
○ Big dataset
○ Compute power
[Diagram: a single neuron combines inputs v1, v2, v3 with weights w1, w2, w3 into a new value vnew; neurons are grouped into layers, and the final layer produces class scores such as “dog” and “cat”.]
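A minimal NumPy sketch of what the diagram shows (the input values, the bias, and the ReLU activation are illustrative assumptions, not from the slides): a neuron is a weighted sum of its inputs passed through a non-linearity.

```python
import numpy as np

def neuron(v, w, b=0.0):
    """One neuron: weighted sum of inputs v with weights w, plus a bias, then a non-linearity."""
    return np.maximum(0.0, np.dot(w, v) + b)   # ReLU used as an example activation

v = np.array([0.2, 0.5, 0.1])    # inputs v1, v2, v3
w = np.array([0.7, -1.3, 0.4])   # weights w1, w2, w3
v_new = neuron(v, w)             # output passed on to the next layer
```

A layer is simply many such neurons sharing the same inputs; stacking layers gives the network that finally scores “dog” vs. “cat”.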
W1 = Wstart + α * F’(Wstart)
α: learning rate
Too small: training takes too long
Too large: training diverges
[Plot: a single update step from Wstart to W1 toward Woptimal in (w1, w2) parameter space.]
W1 = Wstart + α * F’(Wstart)
W2 = W1 + α * F’(W1)
W3 = W2 + α * F’(W2)
W4 = W3 + α * F’(W3)
W5 = W4 + α * F’(W4)
[Plot: successive updates move the weights from Wstart toward Woptimal in (w1, w2) parameter space.]
Parameter update ΔW
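A minimal sketch of the update rule above on a toy one-parameter objective (the function F, its derivative, and the learning rate value are illustrative assumptions). It follows the slide’s form W_{k+1} = W_k + α * F’(W_k); when F is a loss to be minimized, the gradient term is subtracted instead.

```python
def F(w):            # toy objective, maximal at w = 2
    return -(w - 2.0) ** 2

def F_prime(w):      # its derivative
    return -2.0 * (w - 2.0)

alpha = 0.1          # learning rate
w = 0.0              # Wstart
for step in range(50):
    w = w + alpha * F_prime(w)   # W_{k+1} = W_k + alpha * F'(W_k)

# w is now close to Woptimal = 2; too small an alpha converges slowly,
# too large an alpha overshoots and diverges.
```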
[Diagram: a fully connected layer; every input v1 … vn feeds every output fc1 … fcm through weights w11 … wnm and biases b1 … bm.]
● ≈95 % of the parameters in the network
● “Classic” layer
● Usually used before the final classifier
Image credit: Feature Evaluation of Deep Convolutional Neural Networks for Object Recognition and Detection
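A minimal NumPy sketch of the fully connected layer above (the 4096-input / 1000-output shape is an illustrative assumption): every output fc_j is a weighted sum of all inputs v_i plus a bias b_j, which is why this single layer can dominate the parameter count.

```python
import numpy as np

n, m = 4096, 1000                   # e.g. 4096 inputs, 1000 outputs
W = np.random.randn(m, n) * 0.01    # weights w_ij: n * m parameters
b = np.zeros(m)                     # biases b_1 ... b_m

def fully_connected(v):
    """fc_j = sum_i w_ij * v_i + b_j for every output j."""
    return W @ v + b

v = np.random.randn(n)              # activations from the previous layer
fc = fully_connected(v)             # feeds the final classifier
```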
Very deep
▪ 50-, 101- and 152-layer variants
Image credit: Deep Residual Learning for Image Recognition
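What makes networks this deep trainable is the residual (skip) connection. A minimal NumPy sketch of one residual block (the two-matrix form of F and the sizes are illustrative; the real blocks use convolutions): the block learns a correction F(x) that is added back onto its input.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    """Output = F(x) + x: the block only has to learn the residual F(x)."""
    F_x = W2 @ relu(W1 @ x)   # small stack of layers (convolutions in the real network)
    return relu(F_x + x)      # skip connection adds the input back

d = 64
x = np.random.randn(d)
W1 = np.random.randn(d, d) * 0.05
W2 = np.random.randn(d, d) * 0.05
y = residual_block(x, W1, W2)
```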
[Diagram: input → backbone → task-specific layers (head) → output; e.g. a detection head on top of the backbone labels the elephant and the trees in the image.]
Backbone examples: Inception, ResNet
Detection head examples: R-FCN, SSD
Image credit: Savanna
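The backbone/head split can be read as simple function composition. A structural sketch with placeholder bodies (the function names and the returned boxes are purely illustrative; in practice the backbone is e.g. ResNet or Inception and the head is e.g. SSD or R-FCN): the backbone turns the image into generic features, and a task-specific head turns those features into the task output.

```python
def backbone(image):
    """Generic feature extractor (a deep convolutional network in practice)."""
    features = image   # placeholder: a real backbone is a stack of conv layers
    return features

def detection_head(features):
    """Task-specific layers that turn features into boxes and labels."""
    return [("elephant", (10, 20, 200, 180)), ("tree", (220, 15, 300, 190))]  # illustrative output

def detector(image):
    return detection_head(backbone(image))
```

Swapping the head (detection, segmentation, classification) reuses the same backbone features.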
Image credit: DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs
Image credit: Feature Space Optimization for Semantic Video Segmentation - CityScapes Demo 02
▪ Object detection
State of the art: Mask R-CNN
● A trained GAN gives:
○ A good generator of new objects
○ A good estimator of object quality (the discriminator)
Image credit: StackGAN: Text to Photo-realistic Image Synthesis with Stacked Generative Adversarial Networks
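A minimal GAN training sketch, assuming PyTorch is available (the toy 1-D “real” data, the network sizes and the learning rates are all illustrative): the generator learns to produce realistic samples while the discriminator learns to score how real a sample looks, which is exactly where the two properties above come from.

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))   # generator: noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # discriminator: sample -> realism logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0      # toy "real" data: N(3, 0.5)
    fake = G(torch.randn(64, 8))               # generated samples

    # Train discriminator: real -> 1, fake -> 0
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Train generator: fool the discriminator into outputting 1 on fake samples
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()
```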
[Plots: train and val accuracy (0.5–0.9) over training iterations; the left panel illustrates overfitting, the right panel generalization.]
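One practical consequence of these curves is early stopping: keep the weights from the iteration where validation accuracy peaks rather than from the last iteration. A small sketch on illustrative accuracy curves (the numbers are made up for the example):

```python
def best_checkpoint(val_accuracy):
    """Return the iteration with the highest validation accuracy (early-stopping point)."""
    return max(range(len(val_accuracy)), key=lambda i: val_accuracy[i])

# Train accuracy keeps rising, validation accuracy peaks and then degrades (overfitting)
train_acc = [0.60, 0.72, 0.81, 0.88, 0.93, 0.96]
val_acc   = [0.58, 0.70, 0.78, 0.80, 0.79, 0.76]

stop_at = best_checkpoint(val_acc)   # -> 3: keep the weights saved at this iteration
```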
● Accuracy: optimizing metric
● Time: satisficing metric

Example:
Model 1: 98 % accuracy, 2 seconds
Model 2: 93 % accuracy, 0.5 seconds
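With one optimizing and one satisficing metric, model selection becomes: drop every model that breaks the satisficing constraint, then take the best remaining model on the optimizing metric. A small sketch with the two models above (the 1-second time budget is an assumed threshold):

```python
models = [
    {"name": "Model 1", "accuracy": 0.98, "time_s": 2.0},
    {"name": "Model 2", "accuracy": 0.93, "time_s": 0.5},
]

TIME_BUDGET_S = 1.0   # satisficing threshold (assumption for illustration)

fast_enough = [m for m in models if m["time_s"] <= TIME_BUDGET_S]
best = max(fast_enough, key=lambda m: m["accuracy"])   # optimize accuracy among feasible models
print(best["name"])   # -> "Model 2"
```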
Papers submitted to arXiv categories cs.AI, cs.LG, cs.CV, cs.CL, cs.NE, stat.ML over time