tfm.optimization.StepCosineDecayWithOffset
Stepwise cosine learning rate decay with offset.
tfm.optimization.StepCosineDecayWithOffset(
boundaries,
values,
offset: int = 0,
name: str = 'StepCosineDecayWithOffset'
)
The learning rate follows one cosine decay per interval: each decay starts at the
beginning of its interval and ends at that interval's boundary.
Example:
boundaries: [100000, 110000]
values: [1.0, 0.5]
lr_decayed_fn = (
    lr_schedule.StepCosineDecayWithOffset(
        boundaries,
        values))

From step 0 to 100000, the learning rate cosine-decays from 1.0 to 0.5; from step
100000 to 110000, it cosine-decays from 0.5 to 0.0.
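The piecewise behavior above can be sketched in plain Python. This is a hypothetical re-derivation of the schedule's math from the example, not the tfm implementation: each `values[i]` cosine-decays across its interval toward the next value, and the last value decays toward 0.0.

```python
import math

def step_cosine_decay(step, boundaries, values, offset=0):
    # Hypothetical helper mirroring the example above, not the tfm code.
    # Interval i runs from starts[i] to boundaries[i]; values[i] decays
    # toward the next value (0.0 after the last boundary).
    starts = [offset] + list(boundaries[:-1])
    targets = list(values[1:]) + [0.0]
    for start, end, value, target in zip(starts, boundaries, values, targets):
        if step < end:
            frac = (step - start) / (end - start)
            return target + 0.5 * (value - target) * (1.0 + math.cos(math.pi * frac))
    return targets[-1]  # past the final boundary

print(step_cosine_decay(0, [100000, 110000], [1.0, 0.5]))       # 1.0
print(step_cosine_decay(50000, [100000, 110000], [1.0, 0.5]))   # 0.75
print(step_cosine_decay(100000, [100000, 110000], [1.0, 0.5]))  # 0.5
```

At step 50000, halfway through the first interval, the cosine term is at its midpoint, so the rate sits halfway between 1.0 and 0.5.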
Args:
  boundaries: A list of `Tensor`s or `int`s with strictly increasing entries, with
    all elements having the same type as the optimizer step.
  values: A list of `Tensor`s or `float`s that specifies the learning-rate values
    for the intervals defined by `boundaries`. It should have the same number of
    elements as `boundaries` (one value per interval, as in the example above),
    and all elements should have the same type.
  offset: The offset when computing the cosine decay.
  name: Optional name of the learning rate schedule.
Methods
from_config
@classmethod
from_config(
config
)
Instantiates a `LearningRateSchedule` from its config.
Args:
  config: Output of `get_config()`.

Returns:
  A `LearningRateSchedule` instance.
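The `get_config`/`from_config` pair follows the standard Keras serialization pattern: `get_config` returns the constructor arguments as a dict, and `from_config` rebuilds an equivalent schedule from that dict. A minimal sketch of the round trip using a toy stand-in class (not the real schedule, which requires TensorFlow):

```python
class ToySchedule:
    # Stand-in illustrating the Keras-style config round trip; the real
    # class here would be tfm.optimization.StepCosineDecayWithOffset.
    def __init__(self, boundaries, values, offset=0, name='ToySchedule'):
        self.boundaries = boundaries
        self.values = values
        self.offset = offset
        self.name = name

    def get_config(self):
        # Return the constructor arguments so the schedule can be rebuilt.
        return {'boundaries': self.boundaries, 'values': self.values,
                'offset': self.offset, 'name': self.name}

    @classmethod
    def from_config(cls, config):
        # Rebuild the schedule from a config dict produced by get_config().
        return cls(**config)

sched = ToySchedule([100000, 110000], [1.0, 0.5], offset=500)
clone = ToySchedule.from_config(sched.get_config())
print(clone.boundaries, clone.offset)  # [100000, 110000] 500
```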
get_config
get_config()
__call__
__call__(
global_step
)
Calls the schedule as a function, returning the learning rate for the given `global_step`.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2024-02-02 UTC.