Random number generation using TensorFlow
Last Updated: 26 Feb, 2024
In the field of machine learning, random number generation plays an important role by providing the stochasticity essential for model training, weight initialization, and data augmentation. TensorFlow, a powerful open-source machine learning library, provides the tf.random module, which helps us implement reproducible random number generation.
In this article, we will understand how we can generate random numbers by using the tools provided by TensorFlow. Two major approaches that we will discuss are:
- tf.random.Generator Class
- Stateless Random Functions
Let us discuss each of these approaches in detail.
tf.random.Generator Class
TensorFlow provides us with tf.random.Generator, a powerful tool for managing random number generation. It allows us to create distinct generators, each with its own internal state. This internal state is updated each time a random number is generated, which provides a level of control and reproducibility.
Creating a Generator
A generator can be created using methods such as Generator.from_seed or Generator.from_non_deterministic_state. You can check the code given below.
Instances of the tf.random.Generator class represent individual random number generators that can produce random numbers according to various distributions and configurations.
In the following code, we use tf.random.Generator to create a random number generator with a specific seed, and then generate random numbers from a normal (Gaussian) distribution using this generator.
- We first create a random number generator with a specified seed value; the practice of using a fixed seed ensures that the generated random numbers will be the same every time we run the code.
- Then, we generate random numbers from a normal distribution using the generator's normal() method.
Python3
import tensorflow as tf
# Create a generator with a specific seed
generator = tf.random.Generator.from_seed(42)
# Generate random numbers using the generator
random_numbers = generator.normal(shape=(4, 2))
print(random_numbers)
Output:
tf.Tensor(
[[-0.7565803 -0.06854702]
[ 0.07595026 -1.2573844 ]
[-0.23193763 -1.8107855 ]
[ 0.09988727 -0.50998646]], shape=(4, 2), dtype=float32)
The output contains random numbers drawn from a normal distribution with mean 0 and standard deviation 1.
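The other creation method mentioned above, Generator.from_non_deterministic_state, seeds the generator from entropy gathered by the runtime, so each run produces a different sequence. A minimal sketch:

```python
import tensorflow as tf

# Seed the generator non-deterministically: every run of this code
# produces a different sequence (useful when reproducibility is not needed)
generator = tf.random.Generator.from_non_deterministic_state()

# Draw a 2x3 tensor from a uniform distribution on [0, 1)
random_uniform = generator.uniform(shape=(2, 3))
print(random_uniform)
```

Because no seed is fixed, the printed values change between runs.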
Device Placement
In TensorFlow, device placement refers to the assignment of computational operations to specific devices, such as CPUs or GPUs, for execution. Device placement becomes relevant when we obtain the global generator using tf.random.get_global_generator().
The tf.random.get_global_generator() function in TensorFlow is used to obtain the global random number generator instance.
It is important to consider device placement when working with generators: the first time you invoke tf.random.get_global_generator(), the global generator is constructed and placed on the default device.
Python3
# Obtain the global generator
global_generator = tf.random.get_global_generator()
# Generate random numbers using the global generator
random_numbers_global = global_generator.normal(shape=())
print(random_numbers_global)
Output:
tf.Tensor(-0.1616769, shape=(), dtype=float32)
As output we get a single random number. Note that the global generator is set up on the default device.
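You can also place a generator's state on a device explicitly by creating it inside a tf.device scope. A sketch, using "/CPU:0" (present on any machine; substitute "/GPU:0" where available):

```python
import tensorflow as tf

# Create the generator inside a device scope so that its internal
# state variable lives on that device
with tf.device("/CPU:0"):
    cpu_generator = tf.random.Generator.from_seed(7)

# Draws from this generator are produced on that device
sample = cpu_generator.normal(shape=(2, 2))
print(sample.device)  # shows the device the tensor was produced on
print(sample)
```

This avoids surprises where the global generator was silently constructed on a device you did not intend.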
Switching Global Generator
It is possible to replace the global generator with a different one using tf.random.set_global_generator. However, use this function cautiously: changing the global generator can affect any tf.function that captured the old generator.
Python3
# Create a new generator
new_generator = tf.random.Generator.from_seed(123)
# Switch out the global generator
tf.random.set_global_generator(new_generator)
# Generate random numbers using the new global generator
random_numbers_new_global = tf.random.normal(shape=())
print(random_numbers_new_global)
Output:
tf.Tensor(-0.054140557, shape=(), dtype=float32)
In the above code, we have created a new generator and set it as the global generator, then generated a random number. Note that op-level functions such as tf.random.normal use their own legacy seeding mechanism rather than drawing from tf.random.Generator; to draw from the global generator explicitly, call tf.random.get_global_generator().
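Drawing through tf.random.get_global_generator() makes the dependence on the newly installed global generator explicit and reproducible; a minimal sketch:

```python
import tensorflow as tf

# Install a fresh, seeded generator as the global generator
tf.random.set_global_generator(tf.random.Generator.from_seed(123))

# Draw through the global generator explicitly; because it was just
# reseeded, this value is reproducible across runs
value = tf.random.get_global_generator().normal(shape=())
print(value)
```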
Stateless Random Functions
In TensorFlow, Stateless Random Number Generators (RNGs) are the functions that generate random numbers without maintaining any internal state. They are purely functional as they produce the same output for the same set of input arguments.
Stateless random number generation is achieved using cryptographic hash functions to generate random numbers based on input seeds. This ensures that the same input seeds will always produce the same random numbers, making the process deterministic and reproducible.
TensorFlow provides stateless functions such as tf.random.stateless_uniform and tf.random.stateless_normal for stateless random number generation.
Because stateless RNGs are deterministic, they are useful in cases where reproducible results are required. Check the code given below, where we generate uniformly distributed random numbers using the tf.random.stateless_uniform function.
- After importing the TensorFlow library, we define a stateless RNG key from two seed values. This key is the input that determines the random numbers produced.
- Then, we generate uniformly distributed random numbers using the stateless RNG.
Python3
import tensorflow as tf
# Define a stateless RNG key
key = tf.constant([1, 2], dtype=tf.int32)
# Generate random uniform numbers using the stateless RNG
random_uniform_numbers = tf.random.stateless_uniform(shape=(4, 2), seed=key)
print("Random Uniform Numbers:")
print(random_uniform_numbers)
Output:
Random Uniform Numbers:
tf.Tensor(
[[0.8440604 0.19204533]
[0.9962232 0.1603874 ]
[0.27199018 0.649346 ]
[0.10694444 0.06516242]], shape=(4, 2), dtype=float32)
In the above code, the tf.random.stateless_uniform function generates a 4x2 tensor of uniformly distributed random numbers. The function takes a shape argument and a stateless RNG key (seed) as input. If you use the same key, the same set of random numbers will always be produced.
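Stateless counterparts exist for other distributions as well; for example, tf.random.stateless_normal works the same way. A sketch:

```python
import tensorflow as tf

# The same two-element integer key used for stateless_uniform
seed = tf.constant([1, 2], dtype=tf.int32)

# Same seed -> same normal draws, on every run
normal_numbers = tf.random.stateless_normal(shape=(4, 2), seed=seed)
print(normal_numbers)

# A different seed gives a different, but equally reproducible, draw
other_numbers = tf.random.stateless_normal(shape=(4, 2), seed=[3, 4])
print(other_numbers)
```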
Deterministic Behavior
Stateless RNGs behave deterministically based on their input arguments: if you re-run the same code with the same input arguments, you will get identical random numbers.
Python3
# Re-run with the same key
random_uniform_numbers_again = tf.random.stateless_uniform(shape=(4, 2), seed=key)
print("Random Uniform Numbers (Again):")
print(random_uniform_numbers_again)
Output:
Random Uniform Numbers (Again):
tf.Tensor(
[[0.8440604 0.19204533]
[0.9962232 0.1603874 ]
[0.27199018 0.649346 ]
[0.10694444 0.06516242]], shape=(4, 2), dtype=float32)
You can see that we got the same numbers because we ran the code with the same seed (key).
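When you want the numbers to vary between steps while staying fully deterministic, a common pattern is to derive a fresh seed per step. The [run_id, step] layout below is a hypothetical convention for illustration, not a TensorFlow requirement:

```python
import tensorflow as tf

# Derive a distinct, deterministic seed for each training step
run_id = 1
for step in range(3):
    noise = tf.random.stateless_uniform(shape=(2,), seed=[run_id, step])
    print(step, noise.numpy())
```

Re-running this loop reproduces the same three draws, yet each step within the loop gets different numbers.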
Difference Between Stateless RNGs and tf.random.Generator
A major difference between stateless RNGs and the tf.random.Generator class is that stateless RNGs maintain no internal state: the randomness is determined solely by the input seed. This makes these functions useful in scenarios where you want to control randomness deterministically. Let's explore more differences:
| Feature | Stateful RNGs (e.g., tf.random.Generator) | Stateless RNGs (e.g., tf.random.stateless_uniform()) |
|---|---|---|
| Internal state | Maintains an internal state that evolves as random numbers are generated | Maintains no internal state; random numbers are determined solely by input seeds |
| Reproducibility | Sequence of random numbers may vary depending on internal state, but can be seeded for reproducibility | Produces deterministic, reproducible random numbers based solely on input seeds |
| Parallelism and independence | May introduce dependencies among parallel random number sequences | Generates independent random number sequences from distinct seed values |
| Performance | Optimized for generating large numbers of random samples | Slightly higher computational overhead due to the hash computation performed for each call |
| Use cases | General-purpose random number generation where exact reproducibility is not required | Scenarios requiring deterministic reproducibility and parallelism, such as distributed computing or model training |
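The contrast between the two approaches can be seen directly in code; a small sketch:

```python
import tensorflow as tf

# Stateful: consecutive calls give different numbers, because the
# generator's internal state advances after every draw
gen = tf.random.Generator.from_seed(0)
print(gen.uniform(shape=(2,)))
print(gen.uniform(shape=(2,)))  # differs from the first draw

# Stateless: repeated calls with the same seed give identical numbers
seed = tf.constant([0, 0], dtype=tf.int32)
print(tf.random.stateless_uniform(shape=(2,), seed=seed))
print(tf.random.stateless_uniform(shape=(2,), seed=seed))  # identical
```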
Conclusion
Thus, TensorFlow provides us with a variety of tools for introducing randomness into machine learning workflows. We studied two approaches to achieve this. The tf.random.Generator class allows the creation of distinct random number generators with manageable states, providing control and reproducibility. Stateless RNGs, on the other hand, offer a functional and deterministic approach, ensuring consistent results based solely on input arguments.