Neural Network
Suppose we use a linear activation function: what is the output?
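The question above has a well-known answer: with a linear activation, stacked layers collapse into a single linear map, so depth adds no expressive power. A minimal NumPy sketch (weights chosen arbitrarily for illustration):

```python
import numpy as np

# Two "layers" with linear (identity) activation: h = W1 x, y = W2 h.
# Composing them equals one linear layer with weight matrix W2 @ W1.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

deep = W2 @ (W1 @ x)       # two linear layers applied in sequence
shallow = (W2 @ W1) @ x    # one collapsed linear layer
print(np.allclose(deep, shallow))  # True
```

This is why a nonlinearity between layers is essential.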
Sigmoid
Tanh
ReLU
y = max(x, 0)
Leaky ReLU
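The four activations listed above can be sketched in a few lines of NumPy (the 0.01 leak slope is a common default, not mandated by the slides):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))      # squashes to (0, 1)

def tanh(x):
    return np.tanh(x)                     # squashes to (-1, 1), zero-centred

def relu(x):
    return np.maximum(x, 0.0)             # y = max(x, 0)

def leaky_relu(x, alpha=0.01):
    # Small slope alpha for x < 0 keeps gradients flowing ("dead ReLU" fix).
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x), tanh(x), relu(x), leaky_relu(x))
```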
Training set/Validation set/Test set
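A sketch of the three-way split, assuming a typical 70/15/15 ratio (the slides do not specify one): the training set fits the weights, the validation set tunes hyperparameters, and the test set gives the final unbiased estimate.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100
idx = rng.permutation(n)  # shuffle indices before splitting
# Split at 70% and 85% -> 70 train / 15 validation / 15 test examples.
train, val, test = np.split(idx, [int(0.7 * n), int(0.85 * n)])
print(len(train), len(val), len(test))  # 70 15 15
```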
Softmax function
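The softmax function turns a vector of raw scores (logits) into a probability distribution: \(\mathrm{softmax}(z)_k = e^{z_k} / \sum_j e^{z_j}\). A minimal sketch:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability; the result is unchanged
    # because the shift cancels in numerator and denominator.
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p, p.sum())  # probabilities that sum to 1
```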
One-hot encoding
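One-hot encoding maps each class label k to a vector with a 1 in position k and 0 elsewhere, matching the shape of a softmax output. A small sketch:

```python
import numpy as np

def one_hot(labels, num_classes):
    # Row i gets a 1 in column labels[i], zeros elsewhere.
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

print(one_hot([0, 2, 1], 3))
```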
Cross-entropy loss
Reduces to binary_crossentropy in the two-class case
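Cross-entropy loss is \(L = -\sum_k y_k \log p_k\); with a one-hot target only the true class's term survives, so the loss is just the negative log-probability assigned to the correct class. A sketch:

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions so log(0) never occurs.
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred))

y_true = np.array([0.0, 1.0, 0.0])   # one-hot target: class 1
y_pred = np.array([0.1, 0.8, 0.1])   # e.g. a softmax output
print(cross_entropy(y_true, y_pred))  # -log(0.8) ≈ 0.223
```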
Backpropagation
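Backpropagation applies the chain rule layer by layer, reusing values from the forward pass. A sketch for a one-hidden-layer network with sigmoid activation and squared-error loss (the architecture is an illustrative assumption, not specified by the slides), checked against a numerical derivative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
x = rng.standard_normal(3)
y = 1.0
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal(4)

# Forward pass: keep intermediates for the backward pass.
z1 = W1 @ x
h = sigmoid(z1)
y_hat = W2 @ h
loss = 0.5 * (y_hat - y) ** 2

# Backward pass (chain rule, output to input).
d_yhat = y_hat - y            # dL/dy_hat
dW2 = d_yhat * h              # dL/dW2
dh = d_yhat * W2              # dL/dh
dz1 = dh * h * (1 - h)        # sigmoid'(z1) = h * (1 - h)
dW1 = np.outer(dz1, x)        # dL/dW1

# Gradient check: compare one analytic entry with a finite difference.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
num = (0.5 * (W2 @ sigmoid(W1p @ x) - y) ** 2 - loss) / eps
print(np.isclose(dW1[0, 0], num, atol=1e-4))  # True
```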
Q&A
The end