# tf.config.experimental.get_memory_usage

[View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v2.16.1/tensorflow/python/framework/config.py#L661-L697)

Get the current memory usage, in bytes, for the chosen device. (deprecated)

Compat alias for migration: [`tf.compat.v1.config.experimental.get_memory_usage`](https://www.tensorflow.org/api_docs/python/tf/config/experimental/get_memory_usage)

    tf.config.experimental.get_memory_usage(
        device
    )

**Deprecated:** This function will be removed in a future version. Use `tf.config.experimental.get_memory_info(device)['current']` instead.

This function is deprecated in favor of
`tf.config.experimental.get_memory_info`. Calling this function is equivalent
to calling `tf.config.experimental.get_memory_info(device)['current']`.
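The migration the deprecation note describes can be sketched as a small wrapper. This is a minimal sketch, not part of the TensorFlow API: the function name `current_memory_usage` and the import/GPU guards are mine, added so the snippet also runs on machines without TensorFlow or without a GPU.

```python
def current_memory_usage(device="GPU:0"):
    """Return bytes currently in use on `device`, or None if unavailable."""
    try:
        import tensorflow as tf
    except ImportError:
        return None  # TensorFlow not installed
    if not tf.config.list_physical_devices("GPU"):
        return None  # no GPU to query
    # Replacement for the deprecated call:
    #   tf.config.experimental.get_memory_usage(device)
    return tf.config.experimental.get_memory_info(device)["current"]
```

On a GPU machine this returns the same value the deprecated function would; elsewhere it returns `None` instead of raising.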
For GPUs, TensorFlow allocates all of the available memory by default, unless
this is changed with `tf.config.experimental.set_memory_growth`. This function
returns only the memory that TensorFlow is actually using, not the memory that
TensorFlow has allocated on the GPU.
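The distinction between used and allocated memory can be seen with `get_memory_info`, which also reports a `'peak'` high-water mark. A minimal sketch, assuming TensorFlow 2.x; the function name `gpu_memory_report` and the guards are mine. Note `set_memory_growth` must be called before any GPU is initialized.

```python
def gpu_memory_report():
    """Return (current, peak) bytes in use on GPU:0, or None if unavailable."""
    try:
        import tensorflow as tf
    except ImportError:
        return None
    gpus = tf.config.list_physical_devices("GPU")
    if not gpus:
        return None
    for gpu in gpus:
        # Allocate GPU memory on demand instead of all up front.
        tf.config.experimental.set_memory_growth(gpu, True)
    info = tf.config.experimental.get_memory_info("GPU:0")
    # 'current' is what TensorFlow is using now; 'peak' is the high-water mark.
    return info["current"], info["peak"]
```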
| Args | |
|----------|-----------------------------------------------------------|
| `device` | Device string to get the bytes in use for, e.g. `"GPU:0"` |
See https://www.tensorflow.org/api_docs/python/tf/device for specifying device
strings. Does not work for CPU.

#### For example:

    gpu_devices = tf.config.list_physical_devices('GPU')
    if gpu_devices:
      tf.config.experimental.get_memory_usage('GPU:0')

| Returns | |
|------------------------------|---|
| Total memory usage in bytes. | |

| Raises | |
|--------------|---------------------------------------|
| `ValueError` | Non-existent or CPU device specified. |

Last updated 2024-04-26 UTC.
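Since a CPU or non-existent device raises `ValueError`, callers that probe devices opportunistically may prefer a tolerant wrapper. A minimal sketch; the name `safe_memory_info` and the guards are mine, not part of the TensorFlow API.

```python
def safe_memory_info(device):
    """Return the memory-info dict for `device`, or None on any lookup failure."""
    try:
        import tensorflow as tf
    except ImportError:
        return None  # TensorFlow not installed
    try:
        return tf.config.experimental.get_memory_info(device)
    except ValueError:
        # Raised for CPU or non-existent devices.
        return None
```

For example, `safe_memory_info("GPU:99")` returns `None` rather than raising when no such GPU exists.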