Introduction To Keras

This document introduces the Keras deep learning library. Keras is a high-level neural network API written in Python that can run on top of TensorFlow, CNTK, or Theano. It was developed with a focus on fast experimentation. Keras allows users to define and train neural network models in a modular and extensible way. The document discusses how to install Keras, common layers like Dense, Conv2D and MaxPool2D, optimizers, activation functions, cost functions, and defines two ways to define a neural network architecture in Keras. It provides exercises to design a LeNet-5 model and evaluate performance based on learning rate, activation functions, and dropout values.


INTRODUCTION TO KERAS

R.Q. Feitosa, J.D. Bermúdez, L.E. Cué, P. S. Vega.


March 2019
Keras: The Python deep learning library
 High-level neural network API.
 Written in Python.
 Runs on top of TensorFlow, CNTK, and Theano.
 TensorFlow has since absorbed Keras (as tf.keras).
 Developed with a focus on enabling fast experimentation.

Keras: Install…
 Anaconda environment
 Windows: run the installer (*.exe) available at:
https://www.anaconda.com/download/
 Linux: run the installer script from the same link via the bash command
in the terminal.

Keras: Install…
 Using the Anaconda environment:
 Install the tensorflow and keras packages:
conda install -c anaconda tensorflow
conda install -c anaconda keras (uses tensorflow as backend)
 In general, a package is installed with:
conda install -c channel_name package_name
 More information at:
https://keras.io/

Keras: Layers
 Input:
input_img = Input(shape=(rows, cols, channels))
 Dense:
x = Dense(num_of_units, activation='activation_function')
 Conv2D:
x = Conv2D(num_of_filters, kernel_size, strides=stride,
activation='activation_function', padding='type_of_padding')
 MaxPool2D:
x = MaxPool2D(pool_size)
 Flatten:
x = Flatten()
 Dropout:
x = Dropout(value_of_dropout)
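The layer constructors above return callables that are applied to tensors. A minimal sketch wiring them into a small model with tf.keras (the input size, filter counts, and unit counts here are illustrative choices, not from the slides):

```python
from tensorflow.keras.layers import (Input, Conv2D, MaxPool2D,
                                     Flatten, Dense, Dropout)
from tensorflow.keras.models import Model

rows, cols, channels = 28, 28, 1              # e.g. an MNIST-sized image
input_img = Input(shape=(rows, cols, channels))
x = Conv2D(16, (3, 3), activation='relu', padding='same')(input_img)
x = MaxPool2D((2, 2))(x)                      # 28x28 -> 14x14
x = Flatten()(x)                              # 14*14*16 features
x = Dense(64, activation='relu')(x)
x = Dropout(0.5)(x)                           # drop half the units in training
output = Dense(10, activation='softmax')(x)   # 10-class prediction

model = Model(input_img, output)
```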

Keras: Optimizers
 SGD

 RMSProp

 AdaGrad

 Adam

 …

Keras: Activation Functions
 Sigmoid

 Tanh

 ReLU

 LeakyReLU

 ELU

 Softmax
 …

Keras: Cost Functions
 Mean Squared Error (‘mse’)

 Binary Cross Entropy (‘binary_crossentropy’)

 Kullback Leibler Divergence (‘kullback_leibler_divergence’)

 …

Keras: Defining the architecture
There are two ways to define the architecture: the Sequential API and the
functional API.
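The two approaches can be sketched as follows with tf.keras (layer sizes are illustrative):

```python
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import Input, Dense

# Way 1: the Sequential API -- a linear stack of layers.
seq_model = Sequential([
    Input(shape=(784,)),
    Dense(32, activation='relu'),
    Dense(10, activation='softmax'),
])

# Way 2: the functional API -- layers are called on tensors, which also
# allows non-linear topologies (branches, multiple inputs or outputs).
inputs = Input(shape=(784,))
h = Dense(32, activation='relu')(inputs)
outputs = Dense(10, activation='softmax')(h)
fn_model = Model(inputs, outputs)
```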
Exercises:
 Exercise 1:
Define the network architecture following the LeNet-5 model.
 Exercise 2:
Evaluate the network's accuracy with respect to changes in:
1. Learning rate: 0.1 and 0.001.
2. Activation functions: ReLU and Sigmoid.
3. Dropout values: 0.25, 0.5 and 0.75.
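For Exercise 1, one possible tf.keras sketch of LeNet-5 (the filter counts and unit sizes follow the classic architecture; the tanh activations, average pooling, and compile settings are assumptions you would vary for Exercise 2):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Input, Conv2D, AveragePooling2D,
                                     Flatten, Dense)

model = Sequential([
    Input(shape=(32, 32, 1)),                 # 32x32 grayscale input
    Conv2D(6, (5, 5), activation='tanh'),     # C1: 6 feature maps, 28x28
    AveragePooling2D((2, 2)),                 # S2: subsample to 14x14
    Conv2D(16, (5, 5), activation='tanh'),    # C3: 16 feature maps, 10x10
    AveragePooling2D((2, 2)),                 # S4: subsample to 5x5
    Flatten(),                                # 5*5*16 = 400 features
    Dense(120, activation='tanh'),            # C5
    Dense(84, activation='tanh'),             # F6
    Dense(10, activation='softmax'),          # output: 10 classes
])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
```

Dropout layers can be inserted after the dense layers to run the dropout comparison in Exercise 2.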
