Resnet50 Summary
So one of the problems when using hundreds of layers in a deep neural network is the vanishing gradient problem. The vanishing gradient problem is a phenomenon that occurs during the training of deep neural networks, where the gradients used to update the network become extremely small, or "vanish," as they are backpropagated from the output layers to the earlier layers.
This problem is "solved" by a model called ResNet50, which uses skip connections. A skip connection adds the original input to the output of a convolutional block. In this way, the output of hundreds of layers does not drift too far from your original input; it acts like a feedback loop.
These are the skip connections used by ResNet50. They allow the preservation of information from earlier layers, which helps the network learn better representations of the input data. With the ResNet architecture, researchers were able to train networks with as many as 152 layers.

ResNet50 Architecture:
The "50" in ResNet50 refers to the total number of layers in the network: it consists of 50 layers in total.
Input Layer - This layer takes the input image as input. In the case of ResNet50, the input image typically has dimensions of 224x224 pixels with three color channels (RGB).
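The skip-connection idea described above can be sketched in a few lines. This is a minimal illustration, not ResNet50 itself: the `conv_block` stand-in is a hypothetical transformation F(x) (here just a shrinking linear map) used to show how adding the input back keeps the output close to the original signal.

```python
import numpy as np

# Hypothetical "convolutional block" stand-in: any transformation F(x).
# A simple linear map with tiny weights mimics a layer that shrinks the signal.
def conv_block(x, weight):
    return x @ weight

x = np.ones((1, 4))                # original input
w = np.full((4, 4), 0.01)          # weights that shrink the signal
out = conv_block(x, w)             # plain output: F(x), far from x (0.04)
out_skip = conv_block(x, w) + x    # skip connection: F(x) + x stays close to x (1.04)
```

Without the skip connection the signal collapses toward zero; with it, the block only has to learn a small residual on top of the input.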
Convolutional Layers - ResNet50 starts with a series of convolutional layers, which are responsible for extracting features from the input image. These layers capture various patterns and features such as edges, textures, and shapes.
Pooling Layers - After a few initial convolutional
layers, ResNet50 uses max pooling layers to reduce
the spatial dimensions of the feature maps.
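How the stride-2 stem convolution and max pooling shrink the 224x224 input can be checked with the standard output-size formula. A minimal sketch, assuming the usual ResNet50 stem parameters (7x7 conv with stride 2, then 3x3 max pool with stride 2):

```python
# Spatial output size of a conv or pooling layer:
# floor((n + 2*padding - kernel) / stride) + 1
def out_size(n, kernel, stride, padding):
    return (n + 2 * padding - kernel) // stride + 1

h = out_size(224, kernel=7, stride=2, padding=3)  # 7x7 stem conv, stride 2 -> 112
h = out_size(h, kernel=3, stride=2, padding=1)    # 3x3 max pool, stride 2 -> 56
```

So after just the stem, the 224x224 feature maps are already down to 56x56, which keeps the later residual stages cheap.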
[Figure: this is how data normally propagates from one convolutional layer to the next; with hundreds of such layers, this can cause the vanishing gradient problem.]
Residual Blocks - The core component of ResNet50 is the residual block. These blocks introduce skip connections, which allow information to bypass one or more layers and propagate more directly through the network.
Fully Connected Layers - These fully connected
layers perform classification tasks, such as
identifying the object present in the input image.
Output Layer - The output layer of ResNet50
typically uses a softmax activation function to
convert the raw output of the neural network into
probabilities for each class. The class with the
highest probability is considered the predicted
class for the input image.
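The softmax step in the output layer can be sketched directly. A minimal example with three made-up logits (the class count and values are illustrative, not from an actual ResNet50 run):

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max()     # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])    # raw network outputs for 3 classes
probs = softmax(logits)               # probabilities that sum to 1
pred = int(np.argmax(probs))          # class with the highest probability
```

Here class 0 has the largest logit, so it gets the largest probability and becomes the predicted class.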
Another ResNet50 Architecture Diagram:
[Diagram showing Pooling Layers, Residual Blocks, and Fully Connected Layers]
So in the case of training ResNet50 with a custom dataset, you fine-tune it so that your OWN ResNet50 model works well.
Code: