
Batch Normalization

• We have sophisticated optimization procedures such as SGD+momentum, RMSProp, or Adam to improve training without changing the architecture
• Another strategy is to change the architecture
of the network to make it easier to train
• One idea is batch normalization
• Machine learning models work better when their input data consists of uncorrelated features with zero mean and unit variance
• When training a neural network, we can preprocess the data before feeding it to the network to explicitly decorrelate its features (see the sketch below)
• This ensures that the first layer of the network sees data that follows a nice distribution
• However, the activations at deeper layers of the network will likely no longer be decorrelated and will no longer have zero mean or unit variance
• The fix is to insert batch normalization layers into the network
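
Below is a minimal sketch of the preprocessing idea mentioned above: standardizing each input feature to zero mean and unit variance. The array names, shapes, and dummy data are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Illustrative data: 1000 training and 200 test samples with 20 features each
X_train = np.random.randn(1000, 20) * 5.0 + 3.0
X_test = np.random.randn(200, 20) * 5.0 + 3.0

# Per-feature statistics computed on the training set only
mean = X_train.mean(axis=0)
std = X_train.std(axis=0) + 1e-5   # small constant avoids division by zero

# Center and scale so each feature has roughly zero mean and unit variance
X_train_norm = (X_train - mean) / std
X_test_norm = (X_test - mean) / std   # reuse training statistics on test data
```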
• At training time, a batch normalization layer
uses a minibatch of data to estimate the mean
and standard deviation of each feature
• These estimated means and standard deviations
are then used to center and normalize the features
of the minibatch
• A running average of these means and standard
deviations is kept during training, and at test time
these running averages are used to center and
normalize features
• The batch normalization layer also includes learnable shift and scale parameters for each feature dimension (see the sketch below)
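
The following is a minimal NumPy sketch of this forward pass for a (batch, features) input, covering both the training-time use of minibatch statistics and the test-time use of running averages. The function name, momentum value, and epsilon are illustrative assumptions, not from the slides.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, running_mean, running_var,
                      train=True, momentum=0.9, eps=1e-5):
    """Per-feature batch normalization for a (batch, features) array."""
    if train:
        mu = x.mean(axis=0)                      # minibatch mean per feature
        var = x.var(axis=0)                      # minibatch variance per feature
        x_hat = (x - mu) / np.sqrt(var + eps)    # center and normalize
        # Keep running averages of the statistics for use at test time
        running_mean = momentum * running_mean + (1 - momentum) * mu
        running_var = momentum * running_var + (1 - momentum) * var
    else:
        # At test time, normalize with the running statistics instead
        x_hat = (x - running_mean) / np.sqrt(running_var + eps)
    out = gamma * x_hat + beta                   # learnable scale and shift
    return out, running_mean, running_var
```

With gamma initialized to ones and beta to zeros, the layer starts out as a plain normalization; training can then learn to rescale and shift each feature if that helps.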
Keras
• epsilon is a small constant added for numerical stability
• gamma is a learned scaling factor (initialized to 1)
• beta is a learned offset factor (initialized to 0), as in the example below
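
As a sketch of how this looks in Keras, the snippet below places a BatchNormalization layer between a Dense layer and its activation; the layer sizes, epsilon value, and input shape are assumptions for illustration.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Dense -> BatchNormalization -> ReLU; sizes are illustrative
model = keras.Sequential([
    layers.Dense(128, use_bias=False, input_shape=(20,)),
    layers.BatchNormalization(
        epsilon=1e-3,                # small constant for numerical stability
        gamma_initializer="ones",    # learned scale, starts at 1
        beta_initializer="zeros",    # learned offset, starts at 0
    ),
    layers.Activation("relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```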
