mindspore.nn.CosineDecayLR

class mindspore.nn.CosineDecayLR(min_lr, max_lr, decay_steps)[source]

Calculates the learning rate based on the cosine decay function.

For the i-th step, the formula for computing decayed_learning_rate[i] is:

decayed_learning_rate[i] = min_learning_rate + 0.5 * (max_learning_rate - min_learning_rate) * (1 + cos(π * current_step / decay_steps))

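The following is a minimal pure-Python sketch of this formula (not part of the MindSpore API); evaluated with the same values used in the Examples below (min_lr = 0.01, max_lr = 0.1, decay_steps = 4, current step 2), it reproduces the 0.055 result:

>>> import math
>>> min_lr, max_lr, decay_steps, current_step = 0.01, 0.1, 4, 2
>>> round(min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * current_step / decay_steps)), 3)
0.055
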
Parameters
  • min_lr (float) – The minimum value of learning rate.

  • max_lr (float) – The maximum value of learning rate.

  • decay_steps (int) – The number of steps used to compute the decayed learning rate; the cosine term is evaluated at current_step / decay_steps.

Inputs:

Tensor. The current step number.

Outputs:

Tensor. The learning rate value for the current step.

Examples

>>> from mindspore import Tensor, nn
>>> from mindspore import dtype as mstype
>>>
>>> min_lr = 0.01
>>> max_lr = 0.1
>>> decay_steps = 4
>>> global_steps = Tensor(2, mstype.int32)
>>> cosine_decay_lr = nn.CosineDecayLR(min_lr, max_lr, decay_steps)
>>> result = cosine_decay_lr(global_steps)
>>> print(result)
0.055
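
As a usage sketch (the surrounding training code is an assumption, not shown on this page): a CosineDecayLR instance can also be passed as the learning_rate of a MindSpore optimizer, which evaluates it at the current global step during training. Here `net` is a hypothetical nn.Cell model:

>>> # Hypothetical usage: `net` is assumed to be an existing nn.Cell model.
>>> optim = nn.Momentum(net.trainable_params(), learning_rate=cosine_decay_lr, momentum=0.9)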