A LearningRateSchedule that uses a cosine decay schedule.
Inherits From: LearningRateSchedule
tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate, decay_steps, alpha=0.0, name=None
)
See Loshchilov & Hutter, ICLR2016, SGDR: Stochastic Gradient Descent with Warm Restarts.
When training a model, it is often useful to lower the learning rate as
training progresses. This schedule applies a cosine decay function to an
optimizer step, given a provided initial learning rate.
The schedule requires a step value to compute the decayed learning rate. You
can simply pass a TensorFlow variable that you increment at each training step.
The schedule is a 1-arg callable that produces a decayed learning rate when
passed the current optimizer step. This can be useful for changing the
learning rate value across different invocations of optimizer functions. It is
computed as:
import math

def decayed_learning_rate(step):
    # initial_learning_rate, decay_steps, and alpha are the constructor arguments.
    step = min(step, decay_steps)
    cosine_decay = 0.5 * (1 + math.cos(math.pi * step / decay_steps))
    decayed = (1 - alpha) * cosine_decay + alpha
    return initial_learning_rate * decayed
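As a quick sanity check (a sketch; the concrete values below are illustrative
assumptions, not part of the API), the schedule's output matches this formula.
With alpha=0.0, the learning rate halfway through decay_steps is half the
initial value:

import numpy as np
import tensorflow as tf

initial_learning_rate = 0.1
decay_steps = 1000
alpha = 0.0
schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate, decay_steps, alpha=alpha)

step = 500  # halfway through decay_steps
expected = initial_learning_rate * (
    (1 - alpha) * 0.5 * (1 + np.cos(np.pi * step / decay_steps)) + alpha)
print(float(schedule(step)), expected)  # both ~0.05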
Example usage:
initial_learning_rate = 0.1
decay_steps = 1000
lr_decayed_fn = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate, decay_steps)
You can pass this schedule directly into a tf.keras.optimizers.Optimizer
as the learning rate. The learning rate schedule is also serializable and
deserializable using tf.keras.optimizers.schedules.serialize and
tf.keras.optimizers.schedules.deserialize.
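For example, a minimal sketch (the choice of SGD and the constructor values
are illustrative, not prescribed by this API):

import tensorflow as tf

lr_decayed_fn = tf.keras.optimizers.schedules.CosineDecay(0.1, 1000)

# Pass the schedule where an optimizer expects a learning rate.
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_decayed_fn)

# Round-trip the schedule through serialize/deserialize.
config = tf.keras.optimizers.schedules.serialize(lr_decayed_fn)
restored_fn = tf.keras.optimizers.schedules.deserialize(config)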
Returns
    A 1-arg callable learning rate schedule that takes the current optimizer
    step and outputs the decayed learning rate, a scalar Tensor of the same
    type as initial_learning_rate.
Methods
from_config
@classmethod
from_config(config)
Instantiates a LearningRateSchedule from its config.
Args
    config: Output of get_config().
Returns
    A LearningRateSchedule instance.
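A minimal round-trip sketch (the constructor values are illustrative):

schedule = tf.keras.optimizers.schedules.CosineDecay(0.1, 1000)
config = schedule.get_config()
restored = tf.keras.optimizers.schedules.CosineDecay.from_config(config)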
get_config
get_config()
Returns the configuration of the schedule as a dict, which can be passed to
from_config() to re-instantiate it.
__call__
__call__(step)
Call self as a function.
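For instance (a sketch with illustrative values), calling the schedule with a
step variable yields the decayed learning rate for that step:

schedule = tf.keras.optimizers.schedules.CosineDecay(0.1, 1000)
step = tf.Variable(0, trainable=False)  # increment this at each training step
lr = schedule(step)  # scalar Tensor of the same type as initial_learning_rate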