# tf.io.encode_proto
The op serializes protobuf messages provided in the input tensors.
```python
tf.io.encode_proto(
    sizes: Annotated[Any, _atypes.Int32],
    values,
    field_names,
    message_type: str,
    descriptor_source: str = 'local://',
    name=None
) -> Annotated[Any, _atypes.String]
```
The types of the tensors in `values` must match the schema for the fields specified in `field_names`. All the tensors in `values` must have a common shape prefix, *batch_shape*.

The `sizes` tensor specifies repeat counts for each field. The repeat count (last dimension) of each tensor in `values` must be greater than or equal to the corresponding repeat count in `sizes`.
A `message_type` name must be provided to give context for the field names. The actual message descriptor can be looked up either in the linked-in descriptor pool or in a file whose name is provided by the caller using the `descriptor_source` attribute.
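
For example, here is a minimal sketch of encoding a batch of messages for a hypothetical proto type `example.TestMessage` with a single repeated `int64` field named `values` (the message name, field name, and descriptor file below are assumptions for illustration, not part of TensorFlow):

```python
import tensorflow as tf

# Hypothetical schema, assumed to be compiled with:
#   protoc --descriptor_set_out=descriptors.pb --include_imports test.proto
# where test.proto contains:  message TestMessage { repeated int64 values = 1; }

# batch_shape is [2]; one column per entry in `field_names`.
sizes = tf.constant([[3], [2]], dtype=tf.int32)            # shape [2, 1]

# Values are padded to the maximum repeat count (3); the last dimension
# must be >= the corresponding entry in `sizes`.
field_values = tf.constant([[10, 20, 30],
                            [40, 50, 0]], dtype=tf.int64)  # shape [2, 3]

serialized = tf.io.encode_proto(
    sizes=sizes,
    values=[field_values],
    field_names=["values"],
    message_type="example.TestMessage",     # assumed message name
    descriptor_source="descriptors.pb")     # assumed descriptor file

# `serialized` has shape [2]; each element is one serialized TestMessage.
```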
For the most part, the mapping between Proto field types and TensorFlow dtypes is straightforward. However, there are a few special cases:

- A proto field that contains a submessage or group can only be converted to `DT_STRING` (the serialized submessage). This is to reduce the complexity of the API. The resulting string can be used as input to another instance of the decode_proto op (see the sketch after this list).
- TensorFlow lacks support for unsigned integers. The ops represent uint64 types as a `DT_INT64` with the same twos-complement bit pattern (the obvious way). Unsigned int32 values can be represented exactly by specifying type `DT_INT64`, or using twos-complement if the caller specifies `DT_INT32` in the `output_types` attribute.
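
As a sketch of the submessage case above, a nested message can be built by encoding the inner message first and passing the resulting serialized strings as a `DT_STRING` values tensor for the parent field. All message and field names here are hypothetical:

```python
# Encode the inner message (hypothetical type example.Inner with field `id`).
inner = tf.io.encode_proto(
    sizes=tf.constant([[1]], dtype=tf.int32),
    values=[tf.constant([[7]], dtype=tf.int64)],
    field_names=["id"],
    message_type="example.Inner",                # assumed inner type
    descriptor_source="descriptors.pb")          # assumed descriptor file

# `inner` has shape [1]; add a repeat dimension so it can be used as the
# DT_STRING value of the parent's submessage field `inner`.
outer = tf.io.encode_proto(
    sizes=tf.constant([[1]], dtype=tf.int32),
    values=[tf.reshape(inner, [1, 1])],
    field_names=["inner"],
    message_type="example.Outer",                # assumed parent type
    descriptor_source="descriptors.pb")
```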
The `descriptor_source` attribute selects the source of protocol descriptors to consult when looking up `message_type`. This may be:

- An empty string or `"local://"`, in which case protocol descriptors are created for C++ (not Python) proto definitions linked to the binary.
- A file, in which case protocol descriptors are created from the file, which is expected to contain a `FileDescriptorSet` serialized as a string. NOTE: You can build a `descriptor_source` file using the `--descriptor_set_out` and `--include_imports` options to the protocol compiler `protoc`.
- A `"bytes://<bytes>"` string, in which case protocol descriptors are created from `<bytes>`, which is expected to be a `FileDescriptorSet` serialized as a string (see the sketch after this list).
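
As a sketch of the file and `"bytes://"` options, a `FileDescriptorSet` produced by `protoc` can also be passed inline. The descriptor file name and message type are the same assumptions as above, and this reuses the `sizes` and `field_values` tensors from the earlier sketch; passing bytes for the string attribute is assumed to be accepted:

```python
# descriptors.pb is assumed to come from:
#   protoc --descriptor_set_out=descriptors.pb --include_imports test.proto
with open("descriptors.pb", "rb") as f:
    descriptor_bytes = f.read()

serialized = tf.io.encode_proto(
    sizes=sizes,
    values=[field_values],
    field_names=["values"],
    message_type="example.TestMessage",                 # assumed message name
    descriptor_source=b"bytes://" + descriptor_bytes)   # inline descriptor set
```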
| Args | |
|---|---|
| `sizes` | A `Tensor` of type `int32`. Tensor of int32 with shape `[batch_shape, len(field_names)]`. |
| `values` | A list of `Tensor` objects. List of tensors containing values for the corresponding field. |
| `field_names` | A list of `strings`. List of strings containing proto field names. |
| `message_type` | A `string`. Name of the proto message type to decode. |
| `descriptor_source` | An optional `string`. Defaults to `"local://"`. |
| `name` | A name for the operation (optional). |
| Returns |
|---|
| A `Tensor` of type `string`. |
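
Since the serialized output can be fed back to the decode_proto op, here is a hedged round-trip sketch using the same assumed message name, field name, and descriptor file as the earlier examples:

```python
# Decode the messages produced above back into (sizes, values) tensors.
decoded_sizes, decoded_values = tf.io.decode_proto(
    bytes=serialized,
    message_type="example.TestMessage",     # assumed message name
    field_names=["values"],
    output_types=[tf.int64],
    descriptor_source="descriptors.pb")     # assumed descriptor file
```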