S06 DNN TensorFlow PyTorch (WIP)
TensorFlow… PyTorch…
2 Agenda
What is a Tensor
Overview of libraries
TensorFlow
PyTorch
Summary
3 Story So far…
What is a Tensor…
What is a vector?
Tensor of Rank 1
Tensor of Rank 2
Intuitively, a tensor represents a physical entity that may be characterized by a magnitude and multiple directions simultaneously (Fleisch 2012). The number of simultaneous directions is denoted R and is called the rank of the tensor in question.
A tensor looks similar to a NumPy array, and it even behaves in a similar way in some respects.
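For instance, here is a minimal sketch (assuming NumPy and TensorFlow 2.x) of how a tensor and a NumPy array expose the same kind of shape and rank information and support the same elementwise arithmetic:

import numpy as np
import tensorflow as tf

np_vec = np.array([1.0, 2.0, 3.0])      # rank-1: a vector
tf_vec = tf.constant([1.0, 2.0, 3.0])

np_mat = np.zeros((2, 3))               # rank-2: a matrix
tf_mat = tf.zeros((2, 3))

print(np_vec.ndim, tf_vec.ndim)         # 1 1
print(np_mat.shape, tf_mat.shape)       # (2, 3) (2, 3)
print(np_vec * 2 + 1)                   # elementwise ops behave the same way
print(tf_vec * 2 + 1)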
13 As a Data Scientist…
A tensor is a type of multidimensional array with certain transformation properties
You toss a ball in the air; how many numbers do you need to describe its velocity?
1… 2… 6! Right?
v_x, v_y, v_z, r_x, r_y, r_z
What if I am standing outside the Earth?
Outside our galaxy… My head is spinning already!
It is the rules for changing representation when switching between frames of reference that make a multidimensional array a tensor.
15 "Why Tensors!"
27 TensorFlow
NumPy has ndarray support, but it doesn't offer methods to create tensor functions or to automatically compute derivatives.
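As an illustration, here is a minimal sketch (TF 2.x) of the automatic differentiation that NumPy alone does not provide, using tf.GradientTape:

import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x                   # y = x^2
dy_dx = tape.gradient(y, x)     # derivative computed automatically
print(dy_dx)                    # 6.0, i.e. 2*x at x = 3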
NumPy          | TensorFlow
a.shape        | a.get_shape()
b*5+1          | b*5+1
np.dot(a, b)   | tf.matmul(a, b)
TensorFlow computations define a computation graph that has no numerical value until evaluated!

ta = tf.zeros((2, 2))     # a TensorFlow tensor; 'a' below is the corresponding NumPy array (e.g. np.zeros((2, 2)))

print(a)
[[ 0. 0.]
[ 0. 0.]]

print(ta)
Tensor("zeros_1:0", shape=(2, 2), dtype=float32)

print(ta.eval())          # TF 1.x: evaluates the graph node (inside a session)
[[ 0. 0.]
[ 0. 0.]]
31 Session Object
Until version 1: “A Session object encapsulates the environment in which Tensor objects are evaluated” - TensorFlow Docs
a = tf.constant(5.0)
b = tf.constant(6.0)
c = a * b
Not available in Version 2.0!
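A minimal sketch of the difference, assuming TF 1.x for the commented part and TF 2.x for the rest:

# TF 1.x style (the Session API quoted above):
#   with tf.Session() as sess:
#       print(sess.run(c))            # 30.0
#
# TF 2.x: there is no Session object; the multiplication runs eagerly.
import tensorflow as tf

a = tf.constant(5.0)
b = tf.constant(6.0)
c = a * b
print(c.numpy())                      # 30.0
# Legacy graph/session code can still be run through tf.compat.v1 if needed.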
33 TensorFlow
“TensorFlow programs are usually structured into a construction phase, that assembles a graph, and
an execution phase that uses a session to execute ops in the graph.” - TensorFlow docs
“When you train a model you use variables to hold and update parameters. Variables are in-memory
buffers containing tensors” - TensorFlow Docs.
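A minimal sketch (TF 2.x) of a Variable holding and updating a parameter, as described in the quote above:

import tensorflow as tf

w = tf.Variable(tf.zeros((2, 2)), name="weights")   # in-memory buffer containing a tensor
w.assign_add(tf.ones((2, 2)))                        # update the parameter in place
print(w.numpy())
# [[1. 1.]
#  [1. 1.]]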
34 PyTorch
35 PyTorch
An open source machine learning framework that accelerates the path from research prototyping to
production deployment
36 What is PyTorch?
It’s a Python-based scientific computing package targeted at two sets of audiences:
A replacement for NumPy to use the power of GPUs
A deep learning research platform that provides maximum flexibility and speed
https://pytorch.org/tutorials/beginner/basics/intro.html
https://pytorch.org/tutorials/beginner/examples_tensor/two_layer_net_tensor.html
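A minimal sketch of both roles: NumPy-like tensor arithmetic, moved to a GPU when one is available (the "cuda" device name is the usual assumption):

import torch

x = torch.rand(3, 3)                       # behaves much like a NumPy array
y = torch.ones(3, 3)
z = x @ y                                  # matrix multiplication

device = "cuda" if torch.cuda.is_available() else "cpu"
z_gpu = x.to(device) @ y.to(device)        # same computation on the GPU if present
print(z_gpu.device)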
37 PyTorch
Three levels of abstraction:
Tensor: an imperative n-dimensional array that runs on the GPU.
Variable: a node in the computational graph; it stores data and gradients.
Module: a neural network layer; it may store state or learnable weights.
torch.from_numpy():
Creates a tensor from a numpy.ndarray. The ndarray and the returned tensor share the same memory, so any change made to the returned tensor is reflected in the ndarray as well.
torch.autograd:
Provides classes and functions for implementing automatic differentiation of arbitrary scalar-valued functions.
We only need to declare the tensors for which gradients should be computed with requires_grad=True.
Calculating a gradient (see the sketch below):
Initialize the function for which we will calculate the derivatives.
Set the value of the variable used in the function.
Compute the derivative of the function using the backward() method.
Print the value of the derivative using grad.
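A minimal sketch of the two points above: the memory sharing of torch.from_numpy() and the four gradient-calculation steps (the function y = x² + 3x is only an illustrative choice):

import numpy as np
import torch

# torch.from_numpy(): the ndarray and the returned tensor share memory
arr = np.ones(3)
t = torch.from_numpy(arr)
t[0] = 5.0
print(arr)                                  # [5. 1. 1.] -- the change is reflected in the ndarray

# Calculating a gradient with autograd
x = torch.tensor(2.0, requires_grad=True)   # 1. variable for which gradients are computed
y = x ** 2 + 3 * x                          # 2. the function
y.backward()                                # 3. compute the derivative
print(x.grad)                               # 4. dy/dx = 2*x + 3 = 7 at x = 2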
39 Comparison

Feature            | PyTorch                                          | TensorFlow 2.0
Created by         | FAIR Lab (Facebook AI Research Lab)              | Google Brain Team
Based on           | Torch                                            | Theano
Production         | Research-focused                                 | Industry-focused
Visualization      | Visdom                                           | TensorBoard
Deployment         | TorchServe (experimental)                        | TensorFlow Serving
Mobile Deployment  | Yes (experimental)                               | Yes
Device Management  | CUDA                                             | Automated
Graph Generation   | Dynamic and static mode                          | Eager and static mode
Learning Curve     | Easier for developers and scientists             | Easier for industry-level projects
Use Cases          | Facebook, CheXNet, Tesla Autopilot, Uber (PYRO)  | Google, Sinovation Ventures, PayPal, China Mobile
40 Comparison

TENSORFLOW
Pros:
Simple built-in high-level API
Visualizing training with TensorBoard
Production-ready thanks to TensorFlow Serving
Easy mobile support
Open source
Good documentation and community support
Cons:
Static graph
Debugging method
Hard to make quick changes

PYTORCH
Pros:
Python-like coding
Dynamic graph
Easy & quick editing
Good documentation and community support
Open source
Plenty of projects out there using PyTorch
Cons:
Third-party needed for visualization
API server needed for production
41 Visualization
PyTorch and TensorFlow both have tools for quick visual analysis.

TensorFlow
TensorBoard is used for visualizing data.
The interface is interactive and visually appealing.
TensorBoard provides a detailed overview of metrics and training data.
The data is easily exported and looks great for presentation purposes.
Plugins make TensorBoard available for PyTorch as well.
However, TensorBoard is cumbersome and complicated to use.

PyTorch
PyTorch uses Visdom for visualization.
The interface is lightweight and straightforward to use.
Visdom is flexible and customizable.
Direct support for PyTorch tensors makes it simple to use.
Visdom lacks interactivity and many essential features for overviewing data.
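For example, a minimal sketch of the plugin route for PyTorch via torch.utils.tensorboard (the log directory name and the logged value are placeholders):

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("runs/demo")              # arbitrary log directory
for step in range(100):
    loss = 1.0 / (step + 1)                      # placeholder standing in for a real training loss
    writer.add_scalar("train/loss", loss, step)
writer.close()
# Inspect with:  tensorboard --logdir runs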
42 Production Deployment
When it comes to deploying trained models to production, TensorFlow is the clear winner.

TensorFlow
We can directly deploy models in TensorFlow using TensorFlow Serving, a framework that exposes a REST client API.

PyTorch
In PyTorch, production deployment became easier to handle with its latest 1.0 stable release, but it doesn't provide any framework to deploy models directly onto the web. You'll have to use Flask, FastAPI, or Django as the backend server. So TensorFlow Serving may be a better option if performance is a concern.
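A minimal sketch of the Flask option mentioned above; the model file name and the JSON payload format are hypothetical:

import torch
from flask import Flask, request, jsonify

app = Flask(__name__)
model = torch.jit.load("model.pt")               # hypothetical TorchScript model file
model.eval()

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]    # assumed JSON payload: {"features": [...]}
    with torch.no_grad():
        output = model(torch.tensor([features], dtype=torch.float32))
    return jsonify({"prediction": output.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)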
44 Reflect…
Loss function:
Used to evaluate how well our algorithm models the training data.
If our prediction is completely off, the function outputs a higher number; otherwise it outputs a lower number.
Activation function:
Whether a neuron should be activated or not is determined by an activation function.
A neuron has two functions: an aggregation function and an activation function.
Perceptron:
A perceptron is a single-neuron neural network. It is a binary classifier used in supervised learning.
A perceptron is a simple model of a biological neuron in an artificial neural network.
Backpropagation:
Backpropagation of error is used to learn the model parameters.
Backpropagation algorithms are a set of methods for solving the neural network as an optimization problem.
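A minimal sketch tying these four ideas together: a single-neuron perceptron in PyTorch with an aggregation step (nn.Linear), an activation (sigmoid), a loss function (binary cross-entropy), and backpropagation; the OR-style toy data is only illustrative:

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(2, 1), nn.Sigmoid())          # aggregation + activation
loss_fn = nn.BCELoss()                                         # loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])     # toy inputs
y = torch.tensor([[0.], [1.], [1.], [1.]])                     # toy labels (OR)

for _ in range(1000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)   # high when predictions are off, low otherwise
    loss.backward()               # backpropagation of the error
    optimizer.step()              # update the model parameters

print(model(X).detach().round()) # should approach [[0.], [1.], [1.], [1.]]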
45 Reflect…
Which of the following statement(s) correctly describes a real neuron in TensorFlow?
a. A neuron has a single input and a single output only
b. A neuron has multiple inputs but a single output only
c. A neuron has a single input but multiple outputs
d. A neuron has multiple inputs and multiple outputs
e. All of the above statements are valid
46 Let’s Code…
47 Next Session