f8194544 Microsoft PowerPoint DeepLearning
https://colab.research.google.com
Cloud Configurations
!nvidia-smi
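`!nvidia-smi` is a Colab shell escape that prints the GPU report. From plain Python, a minimal stdlib sketch of the same check (locating the driver CLI is an assumption about the runtime; Colab GPU instances normally have it on PATH):

```python
import shutil
import subprocess

# locate the NVIDIA driver CLI; None means no GPU driver is visible
smi = shutil.which("nvidia-smi")
if smi:
    # prints the same report the !nvidia-smi cell produces in Colab
    print(subprocess.run([smi], capture_output=True, text=True).stdout)
else:
    print("nvidia-smi not found; no NVIDIA GPU driver on PATH")
```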
Not fit for the complexity or the many parameters of typical real-world (real-time) data
that is fed to the neural networks.
• ReLU is the default activation when developing MLPs and CNNs; models
take less time to train and run
• Because ReLU outputs 0 for all negative inputs, a given unit may not activate
at all, giving sparse activations (useful for missing data or data sparsity)
• The downside of being zero for all negative values is a problem called
dying ReLU: a neuron dies and remains inactive no matter what input is
supplied, so no gradient flows through it
• The leak extends the range of the ReLU function. Usually the slope a is
0.01 or so. When a is instead drawn at random during training, the variant
is called Randomized ReLU. The range of Leaky ReLU is therefore
(-infinity, infinity)
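The ReLU and Leaky ReLU behaviour described above can be sketched in plain NumPy (leak slope a = 0.01 as stated; the sample inputs are illustrative):

```python
import numpy as np

def relu(x):
    # zero for all negative inputs, identity for positive
    return np.maximum(0.0, x)

def leaky_relu(x, a=0.01):
    # small slope a for negative inputs instead of a hard zero
    return np.where(x > 0, x, a * x)

x = np.array([-3.0, -1.0, 0.0, 2.0])
print(relu(x))        # negatives clamp to 0.0
print(leaky_relu(x))  # negatives keep the small slope a*x
```

Because the leaky variant has a nonzero slope everywhere, a gradient always flows, which is how it avoids the dying-ReLU problem.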
Avoidance of Vanishing Gradient in ReLU
from keras.models import Sequential
from keras.layers import Dense

# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(2, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
https://keras.io/activations/
# Fit the model
model.fit(X, Y, epochs=100, batch_size=10)
# Test data: upload testdata.csv via the Colab file picker, then load it
from google.colab import files
import numpy

testdata = files.upload()
testdataset = numpy.loadtxt("testdata.csv", delimiter=",")
X2 = testdataset[:,0:8]
predictions = model.predict(X2)
# Round sigmoid outputs to 0/1 class labels
rounded = [round(x[0]) for x in predictions]
print(rounded)
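The rounding step above thresholds each sigmoid output at 0.5 to get a 0/1 class label. A minimal sketch on mock prediction values (the array below is illustrative, standing in for `model.predict(X2)`):

```python
# mock sigmoid outputs, shaped like Keras predict() output: one value per row
predictions = [[0.12], [0.91], [0.33], [0.76]]

# round() maps values below 0.5 to 0 and above 0.5 to 1
rounded = [round(p[0]) for p in predictions]
print(rounded)  # [0, 1, 0, 1]
```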
Loss Functions
A loss function (also called an objective function or
optimization score function) is one of the two required
arguments for compiling a model:
model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=['mae', 'acc'])
from keras import metrics

model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=[metrics.mae, metrics.categorical_accuracy])
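The named loss and metric can be checked by hand. A minimal NumPy sketch of mean squared error (`loss='mean_squared_error'`) and mean absolute error (the `'mae'` metric) on illustrative target/prediction values:

```python
import numpy as np

y_true = np.array([1.0, 0.0, 1.0, 1.0])  # illustrative targets
y_pred = np.array([0.9, 0.2, 0.7, 0.4])  # illustrative predictions

mse = np.mean((y_true - y_pred) ** 2)    # mean squared error
mae = np.mean(np.abs(y_true - y_pred))   # mean absolute error
print(mse, mae)  # 0.125 0.3
```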
E-mail : [email protected]
http://www.gauravkumarindia.com