Lecture 01


Image Classification: From Shallow Models to Deep Models
Today’s Agenda
• Classical Image Classification Methods

• Shallow Architectures vs. Deep Architectures

• Convolutional Neural Networks


Classical Image Classification Methods

• Features are key to recent progress in recognition

• A multitude of hand-designed features are currently in use, such as SIFT features

[Figure: Illustration of SIFT feature calculation]

Classical Image Classification Methods

• Widely-used classifiers:

• Multi-layer Perceptrons

• Support Vector Machines (SVM)

• Random Forests

[Figure: Illustration of SVM]
Classical Image Classification Methods

• Researchers pay much attention to the engineering of feature design

• Carefully designed features are effective for a specific task; however, they may not work well on a new task

• Is there a better way to design features, such as learning them directly from data?
Convolutional Neural Networks
• A Convolutional Neural Network (CNN) is an efficient deep-learning method widely used in many image-related tasks, including image classification.

• A CNN is a special kind of neural network.

• CNNs apply convolutions to input images and to intermediate results to produce powerful feature representations.
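To make "applying convolution" concrete, here is a minimal sketch in Python/NumPy of valid 2-D cross-correlation (the operation deep-learning libraries call "convolution"). The function name `conv2d` and the edge-detecting kernel are illustrative choices, not from the lecture:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image,
    taking an elementwise product and sum at each position."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Tiny image whose right half is bright; the kernel responds to
# left-to-right intensity changes (a vertical-edge detector).
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[1, -1],
                   [1, -1]], dtype=float)

out = conv2d(image, kernel)
print(out)  # strongest response (-2) appears along the vertical edge
```

A learned CNN works the same way, except the kernel values are parameters fitted from data rather than designed by hand.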
Nowadays: CNNs are Everywhere
Typical CNN Architecture
• A typical CNN is composed of:
• Convolutional Layer with rectified linear units (ReLU)
• Pooling Layer
• Fully Connected (FC) Layer
Rectified Linear Units (ReLU)
• ReLU avoids or rectifies the vanishing gradient problem

• It is only used within the hidden layers of deep neural networks

• 1 × 1 convolution layers are usually adopted for channel compression
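ReLU itself is a one-line function. A minimal NumPy sketch (function names illustrative) also shows its gradient, which equals 1 for every positive input; this is why it mitigates the vanishing gradients that saturating activations such as the sigmoid suffer from:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: zero for negative inputs, identity otherwise."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Gradient of ReLU: 1 for positive inputs, 0 otherwise.
    It does not shrink toward zero through many layers."""
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # negatives are clipped to 0
print(relu_grad(x))  # gradient is 1 wherever the unit is active
```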
Max Pooling & Average Pooling

Input (4 × 4):
1 1 2 4
5 6 7 8
3 2 1 0
1 2 3 4

Pooling with 2 × 2 filters and stride 2:

Max Pooling:
6 8
3 4

Average Pooling:
3.25 5.25
2    2
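The numbers on this slide can be reproduced with a small NumPy routine; this is a sketch for illustration (`pool2d` is a hypothetical helper, not a library function):

```python
import numpy as np

x = np.array([[1, 1, 2, 4],
              [5, 6, 7, 8],
              [3, 2, 1, 0],
              [1, 2, 3, 4]], dtype=float)

def pool2d(x, size=2, stride=2, mode="max"):
    """Pool non-overlapping windows: keep the max (max pooling)
    or the mean (average pooling) of each size x size window."""
    h, w = x.shape
    out = np.zeros((h // stride, w // stride))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = x[i * stride:i * stride + size,
                       j * stride:j * stride + size]
            out[i, j] = window.max() if mode == "max" else window.mean()
    return out

print(pool2d(x, mode="max"))  # [[6, 8], [3, 4]]
print(pool2d(x, mode="avg"))  # [[3.25, 5.25], [2, 2]]
```

Each 2 × 2 window is reduced to a single value, halving the spatial resolution while keeping the strongest (or average) response.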
Fully Connected (FC) Layer
• An FC layer fully connects all inputs with all outputs, as in ordinary neural networks

• It is usually adopted as the last layer, with a softmax activation function
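A minimal NumPy sketch of this final step: a fully connected layer is a matrix multiply plus a bias, and softmax turns the resulting scores into class probabilities. The weights here are random placeholders for illustration, not trained values:

```python
import numpy as np

def softmax(z):
    """Convert scores (logits) into probabilities that sum to 1."""
    z = z - z.max()  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
features = rng.standard_normal(8)   # flattened output of the last pooling layer
W = rng.standard_normal((3, 8))     # hypothetical weights for 3 classes
b = np.zeros(3)

logits = W @ features + b           # the fully connected layer
probs = softmax(logits)
print(probs)  # one probability per class, summing to 1
```

In training, the class with the highest probability is compared against the true label, and the error is backpropagated through all layers.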
