CV Lecture 4: CNN and Its Applications
By Dr. Eng. Wafaa Shalash
Objectives
• Basics of CNNs.
• Structure of the different CNN layers.
• Using deep learning (hands-on examples).
• Investigating topics related to deep learning:
  • CNN architecture
  • Preparing data
  • Data augmentation
  • Building a CNN from scratch
  • Optimizers
Deep Learning
To build a deep model we need:
• Net design
• Data
• Training
Neural Networks vs. Deep Learning
Deep Learning
Deep Network Output Example
[Figure: example network pipeline ending in two fully connected layers, a softmax layer, and the output class]
How do Convolutional Neural Networks work?
Step 1: Convolve the input with a mask (filter).
Convolution with Padding
Convolution with Striding
Example: stride by 2.
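To make padding and striding concrete, here is a minimal MATLAB sketch; the input, mask, and sizes are illustrative, not from the slides:

% Sketch: convolve a 6x6 input with a 3x3 mask, zero-padding 1, stride 2
I = magic(6);                          % example 6x6 input image
K = [1 0 -1; 1 0 -1; 1 0 -1];          % example 3x3 mask (filter)
Ip = zeros(size(I) + 2);               % zero-padding of 1 on each side
Ip(2:end-1, 2:end-1) = I;
C = conv2(Ip, rot90(K, 2), 'valid');   % conv2 flips the kernel; rot90 undoes
                                       % the flip, giving CNN-style correlation
out = C(1:2:end, 1:2:end);             % stride of 2 = keep every 2nd position
                                       % output is 3x3: floor((6+2-3)/2)+1 = 3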
Convolution with RGB Channels
[Figure: a filter slides over the three-channel RGB input]
Activation Function
A function (for example, ReLU or sigmoid) that takes the weighted sum of all the inputs from the previous layer and then generates and passes an output value (typically nonlinear) to the next layer.
[Figure: the sigmoid activation curve]
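As a quick illustration (not from the slides), both activations can be evaluated elementwise in MATLAB:

% Sketch: evaluating ReLU and sigmoid on example pre-activation values
z = linspace(-4, 4, 81);        % example inputs
relu = max(0, z);               % ReLU: max(0, z)
sig  = 1 ./ (1 + exp(-z));      % sigmoid: 1/(1 + e^(-z))
plot(z, relu, z, sig); legend('ReLU', 'Sigmoid');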
Net Design
• Select layers
• Select activation function
Data
• Labelling
• Augmentation
• Splitting
Approaches to Design a CNN
• Design layers from scratch.
• Reuse a pretrained network with transfer learning.
Design Layers from Scratch.
[Figure: network pipeline ending in two fully connected layers, a softmax layer, and the output class]
Building blocks:
• 2-D convolutional layer
• ReLU
• Pooling
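A minimal sketch of these three building blocks as MATLAB layer objects (the filter count is illustrative; the full network used in this lecture appears later):

% Sketch: the three building blocks as Deep Learning Toolbox layers
block = [
    convolution2dLayer(3, 8, 'Padding', 'same')  % 3x3 conv, 8 filters
    reluLayer                                    % ReLU activation
    maxPooling2dLayer(2, 'Stride', 2)];          % 2x2 max pooling, stride 2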
Practical Example with MATLAB
How to design a CNN with MATLAB:
• Use MATLAB Deep Network Designer.
• Write the code directly.
Coding From Scratch
Preparing Data
1. Labeling
2. Splitting
3. Augmentation
MATLAB Datastore
A datastore looks at the data and records information about it without loading it into memory. The datastore object manages memory and loads only the required data.
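A minimal sketch of this lazy-loading behavior, assuming a hypothetical image folder myImages:

% Sketch: the datastore records file paths; images load only on read
imds = imageDatastore('myImages', ...   % 'myImages' is a hypothetical folder
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
img = read(imds);   % the first image is loaded into memory only here
reset(imds);        % rewind the datastore to the first file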
Image Browser is a tool in the MATLAB Apps tab for browsing image data.
You can browse individual image properties.
Augmentation
Splitting Data
[Figure: the image data is split into training, test, and validation sets]
MATLAB Code: Import Data
imds = imageDatastore('flower_photos', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
% Data labels are taken from the folder names.
MATLAB Code: Split Data
% 70% training, 20% test, and the remaining 10% validation, per label
[trainImgs, testImgs, ValidateImg] = splitEachLabel(imds, 0.7, 0.2, 'randomized');
MATLAB Code
digitDatasetPath = 'flower_photos';
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[trainImgs, testImgs, ValidateImg] = splitEachLabel(imds, 0.7, 0.2, 'randomized');
(MNIST dataset)
Example 1: Load and Explore Data
% Path to the digit image data set shipped with MATLAB (adjust if yours differs)
digitDatasetPath = fullfile(matlabroot, 'toolbox', 'nnet', ...
    'nndemos', 'nndatasets', 'DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
% Display 20 random sample images
figure;
perm = randperm(10000, 20);
for i = 1:20
    subplot(4, 5, i);
    imshow(imds.Files{perm(i)});
end
Define Network Architecture
labelCount = countEachLabel(imds)
img = readimage(imds, 1);
size(img)
numTrainFiles = 750;
[imdsTrain, imdsValidation] = splitEachLabel(imds, numTrainFiles, 'randomize');
layers = [
    imageInputLayer([28 28 1])                    % 28x28 grayscale input
    convolution2dLayer(3, 8, 'Padding', 'same')   % 3x3 conv, 8 filters
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)             % downsample by 2
    convolution2dLayer(3, 16, 'Padding', 'same')  % 3x3 conv, 16 filters
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    convolution2dLayer(3, 32, 'Padding', 'same')  % 3x3 conv, 32 filters
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)                       % one output per digit class
    softmaxLayer
    classificationLayer];
Network Analyzer
net = alexnet;  % requires the Deep Learning Toolbox Model for AlexNet support package
analyzeNetwork(net)
plot(layerGraph(layers))
Training Options and Training
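The training call itself is not shown on this slide; a minimal sketch, assuming the layers and datastores defined above and illustrative option values:

% Sketch: training the network (option values are illustrative)
opts = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.01, ...
    'MaxEpochs', 4, ...
    'ValidationData', imdsValidation, ...
    'Plots', 'training-progress');
net = trainNetwork(imdsTrain, layers, opts);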
% After training, evaluate on the validation set:
testPreds = classify(net, imdsValidation);
acc = (nnz(testPreds == imdsValidation.Labels) / length(testPreds)) * 100;
confusionchart(imdsValidation.Labels, testPreds);
Hands-on with MATLAB
Use MATLAB Deep Network Designer
Hands-on with MATLAB
Adjust Training Options
Adjust Training Parameters (Tuning)
The training process is like searching for the minimum among mountains: the main target is to minimize the loss function.
Learning Rate
• A small learning rate makes you arrive slowly; a learning rate that is too large can overshoot the minimum.
Comparison between different Optimizers
[Figure: accuracy and loss vs. iterations for the SGDM, Adam, and RMSProp optimizers]
Optimizers
What is a Mini-Batch?
• At each iteration, a subset of the training images, known as a mini-batch, is used to update the weights. Each iteration uses a different mini-batch. Once the whole training set has been used, that is known as an epoch.
• The maximum number of epochs (MaxEpochs) and the size of the mini-batches (MiniBatchSize) are parameters you can set in the training algorithm options, as shown in the sketch below.
• Note that the loss and accuracy reported during training are for the mini-batch used in the current iteration.
• By default, the images are shuffled once prior to being divided into mini-batches. You can control this behavior with the Shuffle option.
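A minimal sketch of these options, with illustrative values:

% Sketch: controlling epochs, mini-batch size, and shuffling
opts = trainingOptions('sgdm', ...
    'MaxEpochs', 10, ...         % full passes through the training set
    'MiniBatchSize', 64, ...     % images used per weight update
    'Shuffle', 'every-epoch');   % reshuffle before every epoch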
Using GPUs
• A GPU (graphics processing unit) can significantly speed up the many computations required for deep learning. If the computer doesn't have a supported GPU, the training can be performed on the CPU, but it may take a long time. If you're going to get serious about deep learning, you'll want to train your network on a computer with a GPU that can handle the processing.
• If you have an appropriate GPU and Parallel Computing Toolbox installed, the trainNetwork function will automatically perform the training on the GPU; no special coding is required.
• If not, the training will be done on your computer's CPU instead. This gives you the option to experiment a little before committing to purchasing the needed hardware and software.
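A quick sketch for checking GPU availability before training (requires Parallel Computing Toolbox):

% Sketch: verify a supported GPU is visible
g = gpuDevice;   % errors if no supported GPU or driver is present
disp(g.Name)
% trainNetwork selects the GPU automatically; to request it explicitly:
opts = trainingOptions('sgdm', 'ExecutionEnvironment', 'gpu');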
Augmentation … more (see the sketch below):
• Resize
• Displacement (translation)
• Rotate
• Shear
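A minimal sketch of these augmentations using imageDataAugmenter; the ranges are illustrative, and imdsTrain is the training datastore from earlier:

% Sketch: random augmentations applied on the fly during training
aug = imageDataAugmenter( ...
    'RandScale',        [0.9 1.1], ...   % resize
    'RandXTranslation', [-3 3], ...      % displacement (pixels)
    'RandYTranslation', [-3 3], ...
    'RandRotation',     [-20 20], ...    % rotation (degrees)
    'RandXShear',       [-10 10]);       % shear (degrees)
augimds = augmentedImageDatastore([28 28 1], imdsTrain, ...
    'DataAugmentation', aug);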
Training
options = trainingOptions('adam', ...
    'GradientDecayFactor', 0.6, ...
    'LearnRateSchedule', 'piecewise', ...
    'ExecutionEnvironment', 'gpu', ...
    'LearnRateDropFactor', 0.5, ...
    'LearnRateDropPeriod', 5, ...
    'SquaredGradientDecayFactor', 0.6, ...
    ...
Next time: Using Transfer Learning