mindspore.nn.InverseDecayLR

class mindspore.nn.InverseDecayLR(learning_rate, decay_rate, decay_steps, is_stair=False)

Calculates the learning rate based on the inverse-time decay function.

For the current step, the decayed learning rate is computed as:

\[decayed\_learning\_rate = learning\_rate / (1 + decay\_rate * p)\]

where:

\[p = \frac{current\_step}{decay\_steps}\]

If is_stair is True, the formula is:

\[p = floor(\frac{current\_step}{decay\_steps})\]
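
For intuition, here is a minimal plain-Python sketch of both variants (decayed_lr is a hypothetical helper for illustration, not the MindSpore implementation):

>>> import math
>>> def decayed_lr(learning_rate, decay_rate, decay_steps, current_step, is_stair=False):
...     p = current_step / decay_steps           # fraction of a decay period elapsed
...     if is_stair:
...         p = math.floor(p)                    # staircase: p only increases every decay_steps steps
...     return learning_rate / (1 + decay_rate * p)
>>> round(decayed_lr(0.1, 0.9, 4, 2), 6)         # continuous: p = 0.5
0.068966
>>> decayed_lr(0.1, 0.9, 4, 2, is_stair=True)    # staircase: p = floor(0.5) = 0
0.1
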
Parameters
  • learning_rate (float) – The initial value of learning rate.

  • decay_rate (float) – The decay rate.

  • decay_steps (int) – Number of steps to decay over.

  • is_stair (bool) – If True, the learning rate is decayed once every decay_steps steps (see the staircase illustration after the examples below); if False, it decays at every step. Default: False.

Inputs:
  • global_step (Tensor) - The current step number.

Outputs:

Tensor. The learning rate value for the current step, a scalar Tensor with shape \(()\).

Raises
  • TypeError – If learning_rate or decay_rate is not a float.

  • TypeError – If decay_steps is not an int or is_stair is not a bool.

  • ValueError – If decay_steps is less than 1.

  • ValueError – If learning_rate or decay_rate is less than or equal to 0.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, nn
>>>
>>> learning_rate = 0.1
>>> decay_rate = 0.9
>>> decay_steps = 4
>>> global_step = Tensor(2, mindspore.int32)
>>> inverse_decay_lr = nn.InverseDecayLR(learning_rate, decay_rate, decay_steps, True)
>>> lr = inverse_decay_lr(global_step)
>>> print(lr)
0.1
>>> # Pass the schedule itself to an optimizer so the learning rate decays per step.
>>> net = nn.Dense(2, 3)
>>> optim = nn.SGD(net.trainable_params(), learning_rate=inverse_decay_lr)
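
To make the staircase behavior concrete, the schedule from the example can be evaluated at a few steps. Since decay_steps is 4 and is_stair is True, steps 0-3 share \(p = 0\), steps 4-7 share \(p = 1\), and so on. This is a hypothetical snippet: values are rounded to sidestep float32 printing differences, and it assumes scalar Tensors convert with float(), which holds for single-element MindSpore tensors.

>>> [round(float(inverse_decay_lr(Tensor(s, mindspore.int32))), 4) for s in (0, 3, 4, 8)]
[0.1, 0.1, 0.0526, 0.0357]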