In the name of God
Introduction To Using TensorFlow
Presentation by:
Professor:
2Introduction To Using TensorFlow
Overview
■ TensorFlow
– What is TensorFlow
– TensorFlow Code Basics
– TensorFlow Use Case
■ Deep Learning
– CNN & RNN
– Example of MNIST Data Set Classification
What is TensorFlow
3Introduction To Using TensorFlow
What are Tensors?
As shown in the figure on this slide, tensors are simply multidimensional arrays that allow you to represent data
with higher dimensions.
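To make the idea concrete, here is a minimal sketch (not from the original slides) of tensors of increasing rank built as TensorFlow constants; the values are purely illustrative:

import tensorflow as tf

scalar = tf.constant(7)                        # rank 0: a single number
vector = tf.constant([1.0, 2.0, 3.0])          # rank 1: shape (3,)
matrix = tf.constant([[1, 2], [3, 4]])         # rank 2: shape (2, 2)
cube = tf.constant([[[1], [2]], [[3], [4]]])   # rank 3: shape (2, 2, 1)
print(scalar.shape, vector.shape, matrix.shape, cube.shape)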
What is TensorFlow
■ VGG Network
■ Plain Network
■ Residual Network
■ Experiments
■ Conclusion
4Introduction To Using TensorFlow
What are Tensors & Flow?
TensorFlow is a Python-based library that provides many kinds of functionality for implementing
deep learning models. The term TensorFlow is made up of two terms – Tensor & Flow: in fact, the name
is derived from the operations (data flows) that neural networks perform on tensors.
What is TensorFlow
5Introduction To Using TensorFlow
TensorFlow (data flow) graph
TensorFlow Code Basics
6Introduction To Using TensorFlow
■ Basically, the overall process of writing a TensorFlow program
involves two steps:
■ Building a Computational Graph
■ Running a Computational Graph
Let me explain these two steps one by one:
TensorFlow Code Basics
7Introduction To Using TensorFlow
■ Building & Running The Computational Graph
■ Example: Tensor & Flow OR Data & Flow
import tensorflow as tf
# Build a graph
a = tf.constant(8.0)
b = tf.constant(9.0)
c = a * b
# Create the session object
sess = tf.Session()
output_c = sess.run(c)
print(output_c)
sess.close()
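The snippet above uses the TensorFlow 1.x graph-and-session API and prints 72.0. For comparison only (not part of the original slides), the same computation under TensorFlow 2.x, where eager execution removes the explicit session:

import tensorflow as tf  # TensorFlow 2.x

a = tf.constant(8.0)
b = tf.constant(9.0)
c = a * b            # executed eagerly; no graph building or Session required
print(c.numpy())     # 72.0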
What is TensorFlow
8Introduction To Using TensorFlow
■ Main Components of TensorFlow:
A. Variables: Retain values between session runs; used for weights and biases
B. Nodes: The operations
C. Tensors: The signals that pass from node to node
D. Placeholders: Used to send data between your program and the TensorFlow graph
E. Session: The environment in which the graph is executed
Points to remember about placeholders:
• Placeholders are not initialized and contain no data.
• You must provide inputs (feeds) for a placeholder, which are consumed at runtime.
• Executing a placeholder without feeding it an input raises an error.
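Constants and placeholders are demonstrated on the following slides; since Variables are listed above but not shown, here is a minimal TF 1.x sketch (the variable name and values are illustrative, not from the deck):

import tensorflow as tf

w = tf.Variable(0.3, dtype=tf.float32, name="w")   # retains its value across session.run calls
update_w = tf.assign(w, w + 1.0)                   # an op that modifies the variable

init = tf.global_variables_initializer()           # variables must be initialized explicitly
with tf.Session() as sess:
    sess.run(init)
    print(sess.run(w))       # 0.3
    sess.run(update_w)
    print(sess.run(w))       # 1.3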
TensorFlow Code Basics
9Introduction To Using TensorFlow
■ Building & Running The Computational Graph
■ Constants, Placeholder and Variables
import tensorflow as tf
# Creating placeholders
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
# Assigning the multiplication operation on a & b to node mul
mul = a * b
# Create the session object
sess = tf.Session()
# Executing mul by feeding the values [1, 3] and [2, 4] for a and b respectively
output = sess.run(mul, {a: [1, 3], b: [2, 4]})
print('Multiplying a and b:', output)
Output: [2. 12.]
TensorFlow Code Basics
■ Example: Linear Regression in TensorFlow (slides 10-14; the code is shown only as figures, see the sketch below)
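Since the linear-regression listing appears only as figures on slides 10-14, the exact code is not recoverable from this transcript. Below is a minimal TF 1.x sketch of the usual pattern (placeholders for the data, Variables for the parameters, a mean-squared-error loss, gradient descent); the data and hyperparameters are illustrative:

import numpy as np
import tensorflow as tf

# Toy data: y ≈ 2x + 1 with a little noise
x_train = np.linspace(0, 1, 50).astype(np.float32)
y_train = (2.0 * x_train + 1.0 + np.random.normal(0, 0.05, 50)).astype(np.float32)

x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)

W = tf.Variable(0.0, name="weight")
b = tf.Variable(0.0, name="bias")
pred = W * x + b

loss = tf.reduce_mean(tf.square(pred - y))                       # mean squared error
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(200):
        sess.run(train_op, feed_dict={x: x_train, y: y_train})
    print(sess.run([W, b]))   # should approach roughly [2.0, 1.0]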
15Introduction To Using TensorFlow
Deep Learning
CNN & RNN
Artificial Intelligence
16Introduction To Using TensorFlow
Deep Learning & Machine Learning
Deep Learning
17Introduction To Using TensorFlow
■ Deep Learning vs Machine Learning
Deep Learning
■ Deep Learning with Neural Network (slides 18-20; figures only)
TensorFlow Code Basics
21Introduction To Using TensorFlow
■ Example of Neural Network:
Deep Learning
22Introduction To Using TensorFlow
■ Neural Network (NN)
Forward Pass Backward Pass
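To relate the forward/backward picture to TensorFlow: in the graph API the forward pass is simply the chain of ops you build, while the backward pass is generated automatically (for example by tf.gradients, or by an optimizer's minimize). A minimal sketch with illustrative values:

import tensorflow as tf

x = tf.constant([[1.0, 2.0]])               # 1x2 input
W = tf.Variable([[0.5], [-0.3]])            # 2x1 weights
y = tf.nn.sigmoid(tf.matmul(x, W))          # forward pass
loss = tf.reduce_mean(tf.square(y - 1.0))   # squared error against a target of 1.0

grads = tf.gradients(loss, [W])             # backward pass: d(loss)/dW

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run([loss, grads]))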
Deep Learning
23Introduction To Using TensorFlow
■ Convolutional Neural Network (CNN)
The three main processing stages in a CNN
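The three stages (convolution, non-linearity, pooling) map directly onto TensorFlow ops. A minimal sketch, with illustrative input and filter shapes (not taken from the slides):

import tensorflow as tf

images = tf.placeholder(tf.float32, [None, 28, 28, 1])    # NHWC batch of grayscale images
filters = tf.Variable(tf.random_normal([5, 5, 1, 32]))    # 5x5 kernels, 32 feature maps

conv = tf.nn.conv2d(images, filters, strides=[1, 1, 1, 1], padding='SAME')              # 1) convolution
act = tf.nn.relu(conv)                                                                  # 2) non-linearity
pool = tf.nn.max_pool(act, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')    # 3) pooling
print(pool.get_shape())   # (?, 14, 14, 32)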
Deep Learning
24Introduction To Using TensorFlow
■ Convolutional Neural Network (CNN)
Deep Learning
25Introduction To Using TensorFlow
■ Convolutional Neural Network (CNN)
Example filters learned by Krizhevsky et al. Each of the 96 filters shown here is of size [11x11x3], and each
one is shared by the 55*55 neurons in one depth slice.
Deep Learning
■ Convolutional Neural Network (CNN) (slides 26-29; figures only)
Deep Learning
30Introduction To Using TensorFlow
■ Convolutional Neural Network (CNN) Denoising
Deep Learning
■ Convolutional Neural Network (CNN) (slides 31-35; figures only)
Deep Learning
36Introduction To Using TensorFlow
■ Convolutional Neural Network (CNN)
http://scs.ryerson.ca/~aharley/vis/conv/
The three main processing stages in a CNN
Deep Learning
■ Example: (slides 37-39; figures only)
TensorFlow Code Basics
40Introduction To Using TensorFlow
■ Example :
Multi Layer Perceptron MNIST on tensorflow
The MNIST database (Modified National Institute of Standards and Technology database) is a large database of
handwritten digits that is commonly used for training various image processing systems.
TensorFlow Code Basics
41Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
1. Load tensorflow library and MNIST data
2. Neural network parameters
3. Build graph
4. Initialize weights and construct the model
5. Define Loss function, and Optimizer
6. Launch graph
TensorFlow Code Basics
42Introduction To Using TensorFlow
# Parameters
learning_rate = 0.001; training_epochs = 15; batch_size = 100
# Network Parameters
n_hidden_1 = 256; n_hidden_2 = 256; n_input = 784; n_classes = 10
# Cross-entropy loss function (defined before the optimizer that minimizes it)
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))
# In this case we choose the AdamOptimizer
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
■ Example :
TensorFlow Code Basics
43Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
1. Load tensorflow library and MNIST data
import tensorflow as tf
# Import MNIST data
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("/tmp/data/", one_hot=True)
print('Test shape:',mnist.test.images.shape)
print('Train shape:',mnist.train.images.shape)
Test shape: (10000, 784)
Train shape: (55000, 784)
TensorFlow Code Basics
44Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
# Parameters
learning_rate = 0.001
training_epochs = 15
batch_size = 100
display_step = 1
# Network Parameters
n_hidden_1 = 256 # 1st layer number of features
n_hidden_2 = 256 # 2nd layer number of features
n_input = 784 # MNIST data input (img shape: 28*28)
n_classes = 10 # MNIST total classes (0-9 digits)
2. Neural network parameters
TensorFlow Code Basics
45Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
x = tf.placeholder("float", [None, n_input])
y = tf.placeholder("float", [None, n_classes])
# Create model
def multilayer_perceptron(x, weights, biases):
    print('x:', x.get_shape(), 'W1:', weights['h1'].get_shape(), 'b1:', biases['b1'].get_shape())
    # Hidden layer 1 with ReLU activation
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])
    layer_1 = tf.nn.relu(layer_1)
    print('layer_1:', layer_1.get_shape(), 'W2:', weights['h2'].get_shape(), 'b2:', biases['b2'].get_shape())
    # Hidden layer 2 with ReLU activation
    layer_2 = tf.add(tf.matmul(layer_1, weights['h2']), biases['b2'])
    layer_2 = tf.nn.relu(layer_2)
    print('layer_2:', layer_2.get_shape(), 'W3:', weights['out'].get_shape(), 'b3:', biases['out'].get_shape())
    # Output layer with linear activation (logits)
    out_layer = tf.matmul(layer_2, weights['out']) + biases['out']
    print('out_layer:', out_layer.get_shape())
    return out_layer
3. Build graph
TensorFlow Code Basics
46Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
# Store layers weight & bias
weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),     # 784x256
    'h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),  # 256x256
    'out': tf.Variable(tf.random_normal([n_hidden_2, n_classes]))   # 256x10
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1])),  # 256
    'b2': tf.Variable(tf.random_normal([n_hidden_2])),  # 256
    'out': tf.Variable(tf.random_normal([n_classes]))   # 10
}
# Construct model
pred = multilayer_perceptron(x, weights, biases)
4. Initialize weights and construct the model
TensorFlow Code Basics
47Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
5. Define Loss function, and Optimizer
# Cross-entropy loss function (softmax is applied to the logits internally)
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))
# In this case we choose the AdamOptimizer
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
TensorFlow Code Basics
48Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
6.1 Launch graph
# Initializing the variables (tf.initialize_all_variables() is deprecated)
init = tf.global_variables_initializer()
# Launch the graph
with tf.Session() as sess:
    sess.run(init)
    # Training cycle
    for epoch in range(training_epochs):
        avg_cost = 0.
        total_batch = int(mnist.train.num_examples / batch_size)
        # Loop over all batches
        for i in range(total_batch):
            batch_x, batch_y = mnist.train.next_batch(batch_size)
            # Run optimization op (backprop) and cost op (to get the loss value)
            _, c = sess.run([optimizer, cost], feed_dict={x: batch_x, y: batch_y})
            # Compute average loss
            avg_cost += c / total_batch
TensorFlow Code Basics
49Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
6.2 Launch graph
        # Display logs per epoch step
        if epoch % display_step == 0:
            print("Epoch:", '%04d' % (epoch + 1), "cost=", "{:.9f}".format(avg_cost))
    print("Optimization Finished!")
    # Test model
    correct_prediction = tf.equal(tf.argmax(pred, 1), tf.argmax(y, 1))
    # Calculate accuracy
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
    print("Accuracy:", accuracy.eval({x: mnist.test.images, y: mnist.test.labels}))
TensorFlow Code Basics
50Introduction To Using TensorFlow
■ Example : Multi Layer Perceptron MNIST on tensorflow
6.3 Output of executing the graph (Multi Layer Perceptron)
Epoch: 0001 cost= 152.289635962
Epoch: 0002 cost= 39.134648348
...
Epoch: 0015 cost= 0.850344581
Optimization Finished!
Accuracy: 0.9464
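For comparison only (not part of the original deck), the same multi-layer perceptron can be written far more compactly with the Keras API that ships with TensorFlow 2.x; the layer sizes and hyperparameters mirror the slides, and the exact accuracy will differ from run to run:

import tensorflow as tf

# Load MNIST (28x28 grayscale digits) and flatten each image to a 784-dim vector
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10),   # logits for the 10 digit classes
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.001),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=100, epochs=15, verbose=2)
print(model.evaluate(x_test, y_test))   # [test loss, test accuracy]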
Deep Learning Layer
51Introduction To Using TensorFlow
Does increasing the depth (number of layers) and the number of parameters make the network overfit?
GoogleNet
52Introduction To Using TensorFlow
ResNet
53Introduction To Using TensorFlow
54Introduction To Using TensorFlow
Results :
CapsNet
55Introduction To Using TensorFlow
Parallel Processing
56Introduction To Using TensorFlow
Parallel Processing
57Introduction To Using TensorFlow
Reference
58Introduction To Using TensorFlow
https://www.edureka.co/blog/tensorflow-tutorial/
http://howsam.org/1396/08/11/%D8%A2%D9%85%D9%88%D8%B2%D8%B4-%D8%AA%D9%86%D8%B3%D9%88%D8%B1%D9%81%D9%84%D9%88/
http://www.7khatcode.com/7677/%D8%AA%D9%86%D8%B3%D9%88%D8%B1%D9%81%D9%84%D9%88-tensorflow-%DA%86%DB%8C%D8%B3%D8%AA%D8%9F
https://blog.faradars.org/cnn-convolution-perceptron-neural-network-2/
https://leonardoaraujosantos.gitbooks.io/artificial-inteligence/content/loss-function.html
https://medium.com/machine-learning-in-practice/over-150-of-the-best-machine-learning-nlp-and-python-tutorials-ive-found-ffce2939bd78
https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/
http://howsam.org/1396/08/12/%D9%86%D8%B5%D8%A8-%D8%AA%D9%86%D8%B3%D9%88%D8%B1%D9%81%D9%84%D9%88/
Reference
59Introduction To Using TensorFlow
http://howsam.org/1396/08/16/%D8%B4%D8%B1%D9%88%D8%B9-%DA%A9%D8%A7%D8%B1-%D8%A8%D8%A7-%D8%AA%D9%86%D8%B3%D9%88%D8%B1%D9%81%D9%84%D9%88/
https://stanford.edu/~shervine/l/fa/teaching/cs-229/cheatsheet-supervised-learning
https://stanford.edu/~shervine/l/fa/teaching/cs-229/cheatsheet-deep-learning
https://stanford.edu/~shervine/l/fa/teaching/cs-229/cheatsheet-machine-learning-tips-and-tricks
https://chistio.ir/%D9%BE%D8%B3-%D8%A7%D9%86%D8%AA%D8%B4%D8%A7%D8%B1-%D8%AE%D8%B7%D8%A7-back-propagation-%D8%B4%D8%A8%DA%A9%D9%87-%D8%B9%D8%B5%D8%A8%DB%8C/
http://deeplearning.ir/%D9%BE%DB%8C%D8%B4%DB%8C%D9%86%D9%87-%D9%88-%D9%85%D8%B1%D9%88%D8%B1%DB%8C-%D8%A8%D8%B1-%D8%B1%D9%88%D8%B4%D9%87%D8%A7%DB%8C-%D9%85%D8%AE%D8%AA%D9%84%D9%81-%DB%8C%D8%A7%D8%AF%DA%AF%DB%8C%D8%B1%DB%8C/
https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/
60Introduction To Using TensorFlow
Reference
61Introduction To Using TensorFlow
https://www.youtube.com/watch?v=FmpDIaiMIeA
https://www.youtube.com/watch?v=2-Ol7ZB0MmU
https://brohrer.github.io/how_convolutional_neural_networks_work.html
http://www.rtc.us.es/nullhop-a-flexible-convolutional-neural-network-accelerator-based-on-sparse-representations-of-feature-maps/