mindspore.nn.piecewise_constant_lr
- mindspore.nn.piecewise_constant_lr(milestone, learning_rates)[source]
Get piecewise constant learning rate. The learning rate for each step will be stored in a list.
Calculate the learning rate from the given milestone and learning_rates. Let the value of milestone be \((M_1, M_2, ..., M_t, ..., M_N)\) and the value of learning_rates be \((x_1, x_2, ..., x_t, ..., x_N)\). N is the length of milestone. Let the output learning rate be y; then for the i-th step, decayed_learning_rate[i] is computed as:
\[y[i] = x_t,\ for\ i \in [M_{t-1}, M_t)\]
- Parameters
milestone (Union[list[int], tuple[int]]) – A list of milestones. This list must be monotonically increasing, and every element must be greater than 0.
learning_rates (Union[list[float], tuple[float]]) – A list of learning rates.
- Returns
list[float]. The size of the list is \(M_N\).
- Raises
TypeError – If milestone or learning_rates is neither a tuple nor a list.
ValueError – If the lengths of milestone and learning_rates are not the same.
ValueError – If the values in milestone are not monotonically increasing.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore.nn as nn
>>>
>>> milestone = [2, 5, 10]
>>> learning_rates = [0.1, 0.05, 0.01]
>>> output = nn.piecewise_constant_lr(milestone, learning_rates)
>>> print(output)
[0.1, 0.1, 0.05, 0.05, 0.05, 0.01, 0.01, 0.01, 0.01, 0.01]
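For clarity, the schedule above can be reproduced with a short pure-Python sketch. This is only an illustration of the formula, not the library's implementation; piecewise_constant_lr_sketch is a hypothetical helper name.
>>> def piecewise_constant_lr_sketch(milestone, learning_rates):
...     lrs = []
...     last = 0
...     for step, rate in zip(milestone, learning_rates):
...         # every step i in [last, step) uses the same constant rate
...         lrs += [rate] * (step - last)
...         last = step
...     return lrs
...
>>> piecewise_constant_lr_sketch([2, 5, 10], [0.1, 0.05, 0.01])
[0.1, 0.1, 0.05, 0.05, 0.05, 0.01, 0.01, 0.01, 0.01, 0.01]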