Deep Learning
https://colab.research.google.com
Cloud Configurations
# GPU
!nvidia-smi
# Processor
!lscpu | grep "MHz"
# Usable Memory
!cat /proc/meminfo | grep 'MemAvailable'
Disk space stays in sync with Google Drive. We can upload the dataset to Google
Drive and mount the drive from Colab to use it in our implementations. All
notebooks and source code are saved to Google Drive automatically, and everything
stays within the user's own Google account.
# CPU and RAM statistics via psutil (preinstalled on Colab)
from psutil import cpu_stats, virtual_memory
cpu_stats()
# Processor details
!cat /proc/cpuinfo
# Disk Space
!df -h
# RAM
virtual_memory()
Extraction of Data Files
# Install 7-Zip
!apt-get install p7zip-full
# Decompress the .7z archive (produces file_name.tar)
!p7zip -d file_name.tar.7z
# Extract the tar archive
!tar -xvf file_name.tar
# Upload files from the local machine into the Colab session
from google.colab import files
Sync Google Drive
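A minimal sketch of mounting Google Drive in a Colab notebook; /content/drive is the conventional mount point.
from google.colab import drive
# Mount Google Drive so datasets and notebooks stay in sync with the Colab VM
drive.mount('/content/drive')
# List the Drive root (standard location after mounting)
!ls "/content/drive/My Drive"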
Deep Learning and Transfer Functions in Keras
1. An activation function (or transfer function) is used to determine the output of a node.
If no activation function is applied, the output signal is a simple linear function, a
polynomial of degree one. Deep networks are complex, and a purely linear model is not
fit for the complexity and the many parameters of the real-world data that is fed to the
neural network.
Why ReLU?
ReLU gives a big improvement with respect to the vanishing gradient: its gradient is 1
for every positive input, so gradients do not shrink as they are propagated back through
many layers.
Activation Function: Tanh - Hyperbolic Tangent
• Mathematical formula: f(x) = (1 - exp(-2x)) / (1 + exp(-2x)).
• ReLU is the default activation when developing MLPs and CNNs. The model takes less
time to train and run because ReLU is cheap to compute: it simply outputs max(0, x).
• As ReLU is 0 for all negative inputs, a given unit may not activate at all. This yields
sparse activations, which is useful for missing data or data sparsity.
• The downside of being zero for all negative values is a problem called the dying
ReLU: a neuron can die and remain inactive no matter what input is supplied, so no
gradient flows through it.
• Leaky ReLU adds a small slope a on the negative side (f(x) = ax for x < 0). The leak
helps to increase the range of the ReLU function to (-infinity, infinity). Usually the
value of a is 0.01 or so; when a is chosen at random instead of being fixed, it is called
Randomized ReLU. A NumPy sketch of these activations follows below.
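A minimal NumPy sketch (not from the original slides) of the activation functions discussed above: sigmoid, tanh, ReLU and Leaky ReLU.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # same formula as above: (1 - exp(-2x)) / (1 + exp(-2x))
    return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, a=0.01):
    # the small slope a keeps a gradient flowing for negative inputs
    return np.where(x > 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x), leaky_relu(x))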
Avoidance of Vanishing Gradient in ReLU
# create model
from keras.models import Sequential
from keras.layers import Dense
model = Sequential()
model.add(Dense(12, input_dim=8, activation='sigmoid'))
model.add(Dense(6, activation='relu'))
model.add(Dense(4, activation='sigmoid'))
model.add(Dense(2, activation='relu'))
model.add(Dense(1, activation='relu'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
https://keras.io/activations/
# Fit the model (X, Y: training features and labels prepared earlier)
model.fit(X, Y, epochs=100, batch_size=10)
# Test Data: upload testdata.csv from the local machine
import numpy
testdata = files.upload()
testdataset = numpy.loadtxt("testdata.csv", delimiter=",")
X2 = testdataset[:, 0:8]
predictions = model.predict(X2)
# Round predictions to 0/1 class labels
rounded = [round(x[0]) for x in predictions]
print(rounded)
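When the true labels are available, model.evaluate() reports the compiled loss and metrics; a minimal sketch, shown here on the training data:
# Evaluate the compiled loss and accuracy (here on the training set)
loss, accuracy = model.evaluate(X, Y, verbose=0)
print("loss: %.4f, accuracy: %.4f" % (loss, accuracy))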
Loss Functions
A loss function (or objective function, or optimization
score function) is one of the two parameters required
to compile a model:
model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=['mae', 'acc'])
from keras import metrics
model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=[metrics.mae, metrics.categorical_accuracy])
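Losses can also be passed as callables from keras.losses instead of string names; a minimal sketch:
from keras import losses
model.compile(loss=losses.mean_squared_error, optimizer='sgd')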
• The number of weights will be even bigger for images of size 225x225x3 = 151,875
input values: a fully connected neuron taking the whole image as input already needs
151,875 weights.
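A back-of-the-envelope weight count for a fully connected first layer on such an image (the 1,000-unit layer width is only an assumed figure for illustration):
# Weight count for a dense layer on a 225x225x3 input
inputs = 225 * 225 * 3          # 151,875 input values per image
hidden_units = 1000             # assumed layer width, for illustration only
print(inputs)                   # 151875 weights per neuron
print(inputs * hidden_units)    # ~152 million weights in the layer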
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential([
    Dense(32, input_shape=(784,)),
    Activation('relu'),
    Dense(10),
    Activation('softmax'),
])
You can also simply add layers via the .add() method:
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation('relu'))
https://keras.io/getting-started/sequential-model-guide/
Pre-Trained Models in Keras
• … classical models available in Keras as Applications.
Xception, VGG16, VGG19, ResNet, ResNetV2, ResNeXt, InceptionV3, InceptionResNetV2,
MobileNet, MobileNetV2, DenseNet, NASNet
Network              Depth (Layers)   Weight Size   Parameters (Millions)   Image Input Size
alexnet 8 227 MB 61.0 227-by-227
vgg16 16 515 MB 138 224-by-224
vgg19 19 535 MB 144 224-by-224
squeezenet 18 4.6 MB 1.24 227-by-227
googlenet 22 27 MB 7.0 224-by-224
inceptionv3 48 89 MB 23.9 299-by-299
densenet201 201 77 MB 20.0 224-by-224
mobilenetv2 53 13 MB 3.5 224-by-224
resnet18 18 44 MB 11.7 224-by-224
resnet50 50 96 MB 25.6 224-by-224
resnet101 101 167 MB 44.6 224-by-224
xception 71 85 MB 22.9 299-by-299
inceptionresnetv2 164 209 MB 55.9 299-by-299
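A minimal sketch of loading one of these pre-trained networks through Keras Applications (VGG16 with ImageNet weights is used here only as an example):
from keras.applications.vgg16 import VGG16
# Download VGG16 pre-trained on ImageNet; include_top keeps the classifier layers
model = VGG16(weights='imagenet', include_top=True)
model.summary()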