
"Understanding ResNet:

Deep Residual Learning


for Image Recognition"
Introduction to Deep Learning
• Deep learning trains neural networks with many layers to learn features directly from data
• Convolutional Neural Networks (CNNs) are the standard deep architecture for image tasks
• As networks get deeper, however, new challenges arise...
The Problem with Deep Networks

• Vanishing/exploding gradients
• Degradation problem: accuracy gets worse with deeper networks
• [Figure: performance drop once networks grow beyond a certain depth]
Deep Convolutional Neural Networks

• A deep convolutional neural network is one with many hidden layers. For example, AlexNet has 8 weight layers, of which the first 5 are convolutional and the last 3 fully connected, and VGG-16 has 16 weight layers (13 convolutional and 3 fully connected).

• The problem with these deep networks is that adding layers brings a degradation problem: as depth increases, accuracy saturates and then degrades rapidly. During back-propagation, gradients are computed by repeated multiplication through the layers, which can shrink them toward zero (or, conversely, let them blow up). This is the vanishing/exploding gradient problem, as the sketch below illustrates.
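
A minimal sketch of this effect (PyTorch is an assumption; the slides name no framework and the layer sizes are illustrative): a deep "plain" stack whose small per-layer derivatives compound during back-propagation.

```python
import torch
import torch.nn as nn

# 50 stacked linear layers with sigmoid activations, whose small
# derivatives compound as gradients are multiplied backward.
layers = []
for _ in range(50):
    layers += [nn.Linear(64, 64), nn.Sigmoid()]
net = nn.Sequential(*layers)

x = torch.randn(8, 64)
net(x).sum().backward()

# The first layer's gradient is typically orders of magnitude smaller
# than the last layer's -- the vanishing-gradient effect.
print("first layer grad norm:", net[0].weight.grad.norm().item())
print("last layer grad norm: ", net[-2].weight.grad.norm().item())
```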
What is ResNet?
• Introduced by Microsoft Research in 2015
• Won ImageNet 2015
• Concept: Residual Learning
• Allows training of networks with 100+ layers
Residual Block Explained
• ResNet solves this degradation problem with skip connections (shortcuts). Consider an input x that is passed through a stack of layers producing a residual F(x); this F(x) is then added back to the original input x, so the block's output is (a minimal implementation follows below):
• Equation: F(x) + x
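
A minimal sketch of a basic residual block (PyTorch assumed; the 3x3-conv plus batch-norm layout follows the paper, and the channel count is illustrative):

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # The residual branch F(x): two 3x3 convolutions with batch norm.
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + x              # the skip connection: F(x) + x
        return self.relu(out)

# Usage: the block maps a feature map to one of the same shape.
block = BasicBlock(64)
y = block(torch.randn(1, 64, 32, 32))   # shape stays (1, 64, 32, 32)
```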
ResNet Architecture
ResNet Architecture Variants
• ResNet-18, ResNet-34, ResNet-50, ResNet-101, ResNet-152
• Differ in depth and in the use of bottleneck blocks (ResNet-50 and deeper)
• Table comparing number of layers and parameters (the sketch below prints the parameter counts)
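
One way to build that comparison is with torchvision's reference implementations, as in this sketch (exact counts depend on the torchvision version):

```python
import torchvision.models as models

# Instantiate each variant and count its trainable parameters.
variants = {
    "ResNet-18":  models.resnet18(),
    "ResNet-34":  models.resnet34(),
    "ResNet-50":  models.resnet50(),
    "ResNet-101": models.resnet101(),
    "ResNet-152": models.resnet152(),
}
for name, model in variants.items():
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.1f}M parameters")
```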
ResNet Architecture Table
Applications of ResNet

• Image classification
• Object detection (used in Faster R-CNN, Mask R-CNN)
• Medical imaging, facial recognition, etc.
Why ResNet Works
• Solves vanishing gradients with skip connections (see the derivation below)
• Enables extremely deep networks to converge
• Simpler training, better generalization
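
A one-line derivation makes the first point concrete: if a residual block computes y = F(x) + x, then by the chain rule
• Equation: dy/dx = dF/dx + I
The identity term I gives the gradient a direct path back through the skip connection, so it cannot vanish even when dF/dx becomes very small.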
