INT527 Unit1 1

TensorFlow is an open-source library developed by Google for building, training, and deploying deep learning models using dataflow graphs and tensors. It supports various optimization techniques, including Stochastic Gradient Descent and Adam, to enhance machine learning computations. The library is widely used in AI applications and addresses challenges like the vanishing gradient in deep neural networks.


Introduction to TensorFlow

 Open-source library for deep learning
 Developed by the Google Brain Team
 Build, train, and deploy complex neural networks
 Scales to large datasets and complex models
 Uses dataflow graphs for computations
 Widely used in AI applications
TensorFlow Ranks, Tensors, and Computation Graphs
•Tensors: multidimensional arrays for data representation
•Rank defines a tensor's dimensionality: scalars are rank 0, vectors rank 1, matrices rank 2
•Computation graphs define TensorFlow's flow of operations
•Nodes represent operations; edges carry tensors
•Enables efficient, scalable machine learning computations
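As a small illustration (a sketch assuming TensorFlow 2.x, imported as tensorflow), the code below builds tensors of rank 0, 1, and 2 and traces a simple computation into a dataflow graph with tf.function; the concrete values and the helper name affine are made up for the example.

```python
import tensorflow as tf

scalar = tf.constant(3.0)                  # rank 0: a single value
vector = tf.constant([1.0, 2.0, 3.0])      # rank 1: a 1-D array
matrix = tf.constant([[1.0, 2.0],
                      [3.0, 4.0]])         # rank 2: a 2-D array

print(tf.rank(scalar).numpy(), scalar.shape)   # 0 ()
print(tf.rank(vector).numpy(), vector.shape)   # 1 (3,)
print(tf.rank(matrix).numpy(), matrix.shape)   # 2 (2, 2)

# tf.function traces the Python code into a graph:
# nodes are operations (matmul, add), edges carry tensors.
@tf.function
def affine(x, w, b):
    return tf.matmul(x, w) + b

w = tf.constant([[1.0], [2.0]])
b = tf.constant([0.5])
print(affine(matrix, w, b))
```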
TensorFlow ranks and tensors
TensorFlow's computation graphs
Variables in TensorFlow
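A brief sketch of TensorFlow variables (again assuming TensorFlow 2.x): tf.Variable holds mutable state such as model weights, and tf.GradientTape tracks it automatically when computing gradients. The shapes and variable names here are illustrative only.

```python
import tensorflow as tf

# tf.Variable holds mutable state (e.g. model weights) that
# optimizers update during training.
w = tf.Variable(tf.random.normal([2, 1]), name="weight")
b = tf.Variable(tf.zeros([1]), name="bias")

w.assign_add(tf.ones([2, 1]))   # in-place update
print(w.numpy(), b.numpy())

# Variables are tracked automatically by tf.GradientTape:
x = tf.constant([[1.0, 2.0]])
with tf.GradientTape() as tape:
    y = tf.matmul(x, w) + b
    loss = tf.reduce_sum(y ** 2)
grads = tape.gradient(loss, [w, b])
print(grads)
```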
•Gradient Descent Optimizers
Stochastic Gradient Descent (SGD)
•Iteratively updates weights using the gradient of the loss function.
•Adaptive Optimizers
Adam (Adaptive Moment Estimation)
•Combines momentum and RMSprop; adapts the learning rate per parameter.
RMSprop
•Divides the learning rate by a moving average of recent gradient magnitudes.
AdaGrad (Adaptive Gradient)
•Adapts the learning rate per parameter: larger updates for infrequently updated parameters, smaller updates for frequent ones.
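The sketch below shows one common way to plug these optimizers into a Keras model via model.compile (TensorFlow 2.x assumed); the tiny model, random data, and learning rates are placeholders chosen only for illustration, not recommended settings.

```python
import numpy as np
import tensorflow as tf

# A toy regression model, used only to show optimizer wiring.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Any of the optimizers above can be swapped in here:
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)
# optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
# optimizer = tf.keras.optimizers.RMSprop(learning_rate=0.001)
# optimizer = tf.keras.optimizers.Adagrad(learning_rate=0.01)

model.compile(optimizer=optimizer, loss="mse")

# Random data, just so the example runs end to end.
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(x, y, epochs=2, verbose=0)
```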
Transforming Tensors as Multidimensional Data Arrays
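A short example of treating tensors as multidimensional arrays and transforming them (TensorFlow 2.x assumed); the concrete shapes and values are arbitrary.

```python
import tensorflow as tf

t = tf.constant([[1, 2, 3],
                 [4, 5, 6]])          # shape (2, 3)

print(tf.reshape(t, [3, 2]))          # reshape to (3, 2)
print(tf.reshape(t, [-1]))            # flatten to rank 1
print(tf.transpose(t))                # swap axes -> (3, 2)
print(tf.expand_dims(t, axis=0))      # add a batch axis -> (1, 2, 3)
print(tf.cast(t, tf.float32) / 2.0)   # change dtype, then elementwise math
```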
Visualization with TensorBoard
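A minimal TensorBoard sketch, assuming TensorFlow 2.x: scalars are written with tf.summary to a log directory and then viewed by running `tensorboard --logdir logs`. The log paths and the logged values are made up for illustration.

```python
import tensorflow as tf

# Write scalar summaries that TensorBoard can plot over training steps.
writer = tf.summary.create_file_writer("logs/demo")
with writer.as_default():
    for step in range(100):
        tf.summary.scalar("loss", 1.0 / (step + 1), step=step)

# With Keras, the TensorBoard callback logs metrics automatically:
# model.fit(x, y, callbacks=[tf.keras.callbacks.TensorBoard(log_dir="logs/fit")])
```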
Introduction to Deep Learning
•Mimics the human brain to perform intelligent tasks.
•Employs stacked layers that learn feature representations.
•Revolutionized AI with superior predictive performance.
•Requires substantial data for accurate results.
•Uses neural networks to solve complex problems.
•Applications include healthcare, finance, and image recognition.
The Vanishing Gradient

•Occurs when training deep neural networks.
•Gradients shrink, slowing weight updates.
•Affects learning in the earlier network layers.
•Leads to poor performance and slow convergence.
•Mitigated by activation functions such as ReLU (see the sketch below).
•Optimization techniques improve gradient flow stability.
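The sketch below illustrates the effect on a toy stack of dense layers (TensorFlow 2.x assumed): with sigmoid activations the first layer's gradient is typically much smaller than with ReLU, since sigmoid's derivative is at most 0.25 and shrinks as it is multiplied back through the layers. The depth, layer sizes, and data are arbitrary choices for the demo.

```python
import tensorflow as tf

# Build a deep stack of Dense layers with a chosen activation.
def make_net(activation):
    return tf.keras.Sequential(
        [tf.keras.layers.Dense(32, activation=activation) for _ in range(10)]
        + [tf.keras.layers.Dense(1)]
    )

x = tf.random.normal([64, 8])
for act in ("sigmoid", "relu"):
    net = make_net(act)
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(net(x) ** 2)
    # Gradient that reaches the first (earliest) layer's weights.
    first_layer_grad = tape.gradient(loss, net.layers[0].kernel)
    print(act, float(tf.reduce_mean(tf.abs(first_layer_grad))))
```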
Deep Learning Libraries
