Notes Conv Nets Slides

The document discusses the impact of Convolutional Neural Networks (CNNs) on image classification, particularly highlighting their performance in the ImageNet challenge since 2010. It outlines the potential applications of CNNs in various fields such as medical imaging and autonomous driving, and introduces the concept of using grayscale images as input for neural networks. The document also explains the convolutional layer's function in processing image data, emphasizing spatial locality and translation invariance.

6.036: Convolutional Neural Networks (CNNs, ConvNets)
Prof. Tamara Broderick
EECS, MIT
Impact of CNNs: ImageNet results
• Since 2010: large-scale image classification challenge
• Recent AI boom
• 1960s, 1980s, today: neural networks
• Since 1980s: CNNs

[Figure: ImageNet error rate by year, 2011-2016, on a 0.0-0.5 axis, with AlexNet marked at 2012]
[ https://en.wikipedia.org/wiki/ImageNet#History_of_the_ImageNet_Challenge ]
Images
• Potential uses of image classification: detect tumor (type) from medical scans, image search online, autonomous driving
• Recall: images are made of pixels

[ https://en.wikipedia.org/wiki/Pixel#/media/File:Pixel-example.png ]
Images
• We'll focus on grayscale images
• Each pixel takes a value between 0 and P
• Here, 0: black, 1: white
• Larger P in Lab Week 08

A 5 x 5 example image:
1 0 1 0 0
1 0 1 0 1
1 1 1 0 0
1 0 1 0 1
1 0 1 0 1

• How do we use an image as an input for a neural net?
Previous neural nets in this class
• Recall Lab Week 07
• Input x (n x 1)
• Fully connected layer: every input is connected to every output by a weight

But we know more about images:
• Spatial locality
• Translation invariance
Convolutional Layer: 1D example
A 1D image:          0 0 1 1 1 0 1 0 0 0
A filter:            -1 1 -1

Lining the filter up with the first three pixels gives the first output:
0 * -1 + 0 * 1 + 1 * -1 = -1
Sliding the filter along one pixel at a time gives the rest:

After convolution*:  -1 0 -1 0 -2 1 -1 0
After ReLU:           0 0 0 0 0 1 0 0

What does the filter do?

Padding the image with a zero at each end makes the output the same length as the original image:

A 1D image:          0 0 0 1 1 1 0 1 0 0 0 0
A filter:            -1 1 -1

After convolution*:  0 -1 0 -1 0 -2 1 -1 0 0
After ReLU:          0 0 0 0 0 0 1 0 0 0

*correlation
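A minimal sketch of this 1D operation in Python (NumPy assumed available; correlate1d and relu are illustrative names, not functions from any particular library):

    import numpy as np

    def correlate1d(image, filt, bias=0.0):
        """Slide `filt` across `image` (no padding, stride 1); add `bias` to each dot product."""
        image, filt = np.asarray(image, float), np.asarray(filt, float)
        k = len(filt)
        return np.array([image[i:i + k] @ filt + bias
                         for i in range(len(image) - k + 1)])

    def relu(x):
        return np.maximum(x, 0.0)

    img = [0, 0, 1, 1, 1, 0, 1, 0, 0, 0]
    filt = [-1, 1, -1]
    conv = correlate1d(img, filt)
    print(conv)        # -> [-1.  0. -1.  0. -2.  1. -1.  0.]
    print(relu(conv))  # -> [ 0.  0.  0.  0.  0.  1.  0.  0.]

    # Zero-padding one pixel on each end keeps the output as long as the image:
    print(relu(correlate1d([0] + img + [0], filt)))
    # -> [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]

In this example the only positive response appears where a bright pixel has dark pixels on both sides, which hints at what the filter detects.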
Convolutional Layer: 1D example
A 1D image:          0 0 0 1 1 1 0 1 0 0 0 0
A filter:            -1 1 -1  with bias +1

After convolution*:  1 0 1 0 1 -1 2 0 1 1
After ReLU:          1 0 1 0 1 0 2 0 1 1

*correlation
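The same numbers come out of the sketch above by passing the slide's bias of +1 (again illustrative code, not a library call):

    conv = correlate1d([0, 0, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0], [-1, 1, -1], bias=1)
    print(conv)        # -> [ 1.  0.  1.  0.  1. -1.  2.  0.  1.  1.]
    print(relu(conv))  # -> [1. 0. 1. 0. 1. 0. 2. 0. 1. 1.]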
Convolutional Layer: 1D example
A 1D image:  0 0 0 1 1 1 0 1 0 0 0 0
A filter:    w1 w2 w3  with bias b

After convolution*:
After ReLU:

• How many weights (including bias)? 4
• How many weights (including biases) for a fully connected layer with 10 inputs & 10 outputs? 10 x 11 = 110

*correlation
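A quick check of those two counts (the formulas are the point; the variable names are just for illustration):

    filter_size = 3
    conv_params = filter_size + 1        # 3 shared weights + 1 shared bias = 4
    n_in, n_out = 10, 10
    fc_params = n_out * (n_in + 1)       # each output: 10 weights + 1 bias -> 110
    print(conv_params, fc_params)        # -> 4 110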
Convolutional Layer: 2D example
A 2D image:    A filter:
1 0 1 0 0      -1 -1 -1
1 0 1 0 1      -1  1 -1
1 1 1 0 0      -1 -1 -1
1 0 1 0 1
1 0 1 0 1

Lining the filter up with the top-left 3 x 3 patch of the image:
-1 + 0 + -1
+ -1 + 0 + -1
+ -1 + -1 + -1
= -7
Sliding the filter across and down the image gives the rest:

After convolution:
-7 -2 -4
-5 -2 -5
-7 -2 -5
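A 2D version of the earlier sketch (NumPy again; correlate2d is an illustrative name, not a library API):

    import numpy as np

    def correlate2d(image, filt, bias=0.0):
        """Slide a 2D filter over a 2D image (no padding, stride 1)."""
        image, filt = np.asarray(image, float), np.asarray(filt, float)
        kh, kw = filt.shape
        H, W = image.shape
        out = np.empty((H - kh + 1, W - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * filt) + bias
        return out

    img2d = [[1, 0, 1, 0, 0],
             [1, 0, 1, 0, 1],
             [1, 1, 1, 0, 0],
             [1, 0, 1, 0, 1],
             [1, 0, 1, 0, 1]]
    filt2d = [[-1, -1, -1],
              [-1,  1, -1],
              [-1, -1, -1]]
    print(correlate2d(img2d, filt2d))
    # -> [[-7. -2. -4.]
    #     [-5. -2. -5.]
    #     [-7. -2. -5.]]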
Convolutional Layer: 2D example
A 2D image (zero-padded):    A filter:
0 0 0 0 0 0 0                -1 -1 -1
0 1 0 1 0 0 0                -1  1 -1
0 1 0 1 0 1 0                -1 -1 -1
0 1 1 1 0 0 0
0 1 0 1 0 1 0
0 1 0 1 0 1 0
0 0 0 0 0 0 0

After convolution:
 0 -4  0 -3 -1
-2 -7 -2 -4  1
-2 -5 -2 -5 -2
-2 -7 -2 -5  0
 0 -4  0 -4  0

After convolution & ReLU:
0 0 0 0 0
0 0 0 0 1
0 0 0 0 0
0 0 0 0 0
0 0 0 0 0
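Zero padding plus the ReLU, using the same sketch (np.pad is a real NumPy function that pads with zeros by default; the rest reuses correlate2d and img2d from above):

    padded2d = np.pad(np.asarray(img2d, float), 1)   # one-pixel border of zeros
    print(np.maximum(correlate2d(padded2d, filt2d), 0.0))
    # Only the entry that was +1 before the ReLU survives.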
Convolutional Layer: 2D example
A 2D image (zero-padded):    A filter:
0 0 0 0 0 0 0                -1 -1 -1
0 1 0 1 0 0 0                -1  1 -1
0 1 0 1 0 1 0                -1 -1 -1
0 1 1 1 0 0 0                with bias 2
0 1 0 1 0 1 0
0 1 0 1 0 1 0
0 0 0 0 0 0 0

After convolution: 2 -2 ... (each output is the corresponding entry from the previous slide plus the bias of 2)
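The bias can be checked with the same sketch (hypothetical usage of the illustrative correlate2d above):

    print(correlate2d(padded2d, filt2d, bias=2))
    # First two entries: 2. and -2., matching the slide; every other entry is
    # the corresponding unbiased value plus 2.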
Convolutional Layer: 2D example
A 2D image (zero-padded):    A filter:
0 0 0 0 0 0 0                w11 w12 w13
0 1 0 1 0 0 0                w21 w22 w23
0 1 0 1 0 1 0                w31 w32 w33
0 1 1 1 0 0 0                with bias b
0 1 0 1 0 1 0
0 1 0 1 0 1 0
0 0 0 0 0 0 0

After convolution:
Convolutional Layer: 2D example
A 2D image:    A filter:
1 0 1 0 0      w11 w12 w13
1 0 1 0 1      w21 w22 w23
1 1 1 0 0      w31 w32 w33
1 0 1 0 1      with bias b
1 0 1 0 1
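Written out, the value the layer computes at output position (i, j), before the ReLU, is (a generic restatement of the operation on these slides, with indices starting at 1):

    \mathrm{output}_{i,j} = b + \sum_{k=1}^{3} \sum_{l=1}^{3} w_{kl}\,\mathrm{image}_{\,i+k-1,\;j+l-1}

with the ReLU, max(0, .), then applied to each entry.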
Convolutional Layer: 3D example

A 3D image:                          A filter:
[diagram: the image is a 3D box with axes labeled width, height,
and depth; a smaller 3D filter slides over it]

• Tensor: generalization of a matrix
• E.g. 1D: vector, 2D: matrix

[ https://helpx.adobe.com/photoshop/key-concepts/skew.html ]
[ https://en.wikipedia.org/wiki/TensorFlow ]

13
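To make the tensor idea concrete: a 3D image is a depth x height x width array, and a filter in this setting is itself a small 3D tensor that spans the full depth, so it slides only over height and width, and each position still produces a single number. The sketch below is illustrative only; the shapes (a 3 x 32 x 32 image, a 3 x 3 x 3 filter) are assumptions, not taken from the slide.

import numpy as np

# 1D: vector, 2D: matrix, 3D and up: tensor (here, just NumPy arrays).
vector = np.zeros(5)            # shape (5,)
matrix = np.zeros((4, 5))       # shape (4, 5)
image  = np.zeros((3, 32, 32))  # depth x height x width, e.g. an RGB image

def conv3d_filter(image, filt):
    """Apply one filter that spans the image's full depth.
    The filter slides over height and width only, so the result
    is a single 2D map (one number per spatial position)."""
    d, H, W = image.shape
    fd, fh, fw = filt.shape
    assert fd == d, "filter depth must match image depth"
    out = np.zeros((H - fh + 1, W - fw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[:, i:i + fh, j:j + fw] * filt)
    return out

filt = np.random.randn(3, 3, 3)           # a hypothetical 3x3 filter over all 3 channels
print(conv3d_filter(image, filt).shape)   # (30, 30)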
Convolutional Layer: multiple filters

An image:
[diagram: one input image convolved with filters F1, F2, F3;
each filter produces its own output image]

• Collection of filters in the layer: filter bank
• Each resulting image is a channel

14
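A sketch of the filter-bank idea, under the same illustrative assumptions as above: each filter F1, F2, F3 spans the input's full depth and yields one 2D output map, and stacking those maps gives the output channels of the layer. The shapes and random values are made up for illustration.

import numpy as np

def apply_filter(image, filt):
    """One filter spanning the full depth -> one 2D output map."""
    d, H, W = image.shape
    _, fh, fw = filt.shape
    out = np.zeros((H - fh + 1, W - fw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[:, i:i + fh, j:j + fw] * filt)
    return out

def conv_layer(image, filter_bank):
    """A filter bank (a list of filters F1, F2, ...) produces one
    output map per filter; stacking the maps gives the channels."""
    return np.stack([apply_filter(image, f) for f in filter_bank])

image = np.random.randn(3, 32, 32)                          # depth x height x width
filter_bank = [np.random.randn(3, 5, 5) for _ in range(3)]  # F1, F2, F3
out = conv_layer(image, filter_bank)
print(out.shape)   # (3, 28, 28): 3 channels, one per filter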
Max pooling layer: 2D example

Output from the convolutional layer & ReLU:

0 0 0 0 0 0
0 0 0 0 1 0
0 0 0 0 0 0
0 1 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0

Max pooling: returns max of its arguments
• size 3x3 (“size 3”)
• stride 1

After max pooling:

0 0 1 1
1 1 1 1
1 1 0 0
1 1 0 0

15
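The pooled values on this slide can be reproduced directly. Below is a minimal NumPy sketch of 2D max pooling with a window size and a stride; calling it with size 3 and stride 1 on the slide's 6x6 map recovers the 4x4 result above.

import numpy as np

def max_pool2d(x, size, stride):
    """Max pooling: take the max over each size x size window,
    moving the window by `stride` in each direction."""
    H, W = x.shape
    out_h = (H - size) // stride + 1
    out_w = (W - size) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            r, c = i * stride, j * stride
            out[i, j] = x[r:r + size, c:c + size].max()
    return out

# The 6x6 post-ReLU map from the slide.
x = np.array([[0, 0, 0, 0, 0, 0],
              [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 0],
              [0, 1, 0, 0, 0, 0],
              [0, 0, 0, 0, 0, 0],
              [0, 0, 0, 0, 0, 0]])

print(max_pool2d(x, size=3, stride=1))
# [[0. 0. 1. 1.]
#  [1. 1. 1. 1.]
#  [1. 1. 0. 0.]
#  [1. 1. 0. 0.]]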
Max pooling layer: 2D example

Output from the convolutional layer & ReLU:

0 0 0 0 0 0
0 0 0 0 1 0
0 0 0 0 0 0
0 1 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0

Max pooling: returns max of its arguments
• size 3x3 (“size 3”)
• stride 3

After max pooling:

0 1
1 0

• Can use stride with filters too
• No weights in max pooling

16
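With the same size-3 window but stride 3, the windows tile the input without overlapping, and the output shrinks to the 2x2 map above. In general, for an n x n input, the output side length is floor((n - size) / stride) + 1, for pooling and for strided filters alike; a quick self-contained check:

# Output side length for pooling (or a strided filter) on an n x n input:
#   floor((n - size) / stride) + 1
def pooled_side(n, size, stride):
    return (n - size) // stride + 1

print(pooled_side(6, size=3, stride=1))  # 4: the 4x4 stride-1 result
print(pooled_side(6, size=3, stride=3))  # 2: the 2x2 stride-3 result above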
CNNs: typical architecture

input → Conv + ReLU → Max pool → Conv + ReLU → Max pool → flatten → fully connected → softmax

• feature learning: the Conv + ReLU and Max pool stages
• classification: the flatten, fully connected, and softmax stages

Recall: we wanted to encode
• Spatial locality
• Translation invariance

17 [ https://www.mathworks.com/solutions/deep-learning/convolutional-neural-network.html ]
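This pipeline maps almost one-to-one onto layers in a standard deep-learning library. The sketch below uses PyTorch purely for illustration (the slides do not name a framework); the input size (1 x 28 x 28 grayscale), channel counts, and class count are assumptions, and in practice the final softmax is often folded into the loss function.

import torch
import torch.nn as nn

# Hypothetical shapes: one 1 x 28 x 28 grayscale image, 10 output classes.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3),   # Conv            -> 16 x 26 x 26
    nn.ReLU(),                         # + ReLU
    nn.MaxPool2d(2),                   # Max pool        -> 16 x 13 x 13
    nn.Conv2d(16, 32, kernel_size=3),  # Conv            -> 32 x 11 x 11
    nn.ReLU(),                         # + ReLU
    nn.MaxPool2d(2),                   # Max pool        -> 32 x 5 x 5
    nn.Flatten(),                      # flatten         -> 800
    nn.Linear(32 * 5 * 5, 10),         # fully connected -> 10
    nn.Softmax(dim=1),                 # softmax over classes
)

x = torch.randn(1, 1, 28, 28)          # a batch of one image
print(model(x).shape)                  # torch.Size([1, 10])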
CNNs: a taste of backpropagation

[diagram: input image → convolutional layer (one filter, no bias) →
identity/no activation → fully connected layer → output]

18
Cat neurons [Hubel, Wiesel 1959, 1962]
(Be careful with biology analogies)

receptive field
• simple cells
• complex cells

[ http://fourier.eng.hmc.edu/e180/lectures/v1/node7.html ]

19
