mindspore.experimental.optim.lr_scheduler.LambdaLR
- class mindspore.experimental.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda, last_epoch=-1)
Sets the learning rate of each parameter group to the initial lr multiplied by the value of a given function at the current epoch index. When last_epoch=-1, the optimizer's lr is taken as the initial lr.
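Conceptually, each step scales the initial lr by lr_lambda evaluated at the current epoch index. A minimal sketch of that rule in plain Python, not the actual implementation (the values mirror the Examples below):

# Illustrative sketch of the LambdaLR update rule.
initial_lr = 0.01
lr_lambda = lambda epoch: 0.9 ** epoch
for last_epoch in range(1, 4):
    lr = initial_lr * lr_lambda(last_epoch)
    print(lr)  # 0.009, 0.0081, 0.00729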
Warning
This is an experimental lr scheduler module that is subject to change. This module must be used together with the optimizers in mindspore.experimental.optim.
- Parameters
optimizer (mindspore.experimental.optim.Optimizer) – Wrapped optimizer.
lr_lambda (Union(function, list)) – A function which computes a multiplicative factor given the parameter last_epoch, or a list of such functions, one for each group in optimizer.param_groups.
last_epoch (int, optional) – The index of the last epoch. Default: -1.
- Raises
ValueError – If lr_lambda is a list and its length is not equal to the number of param groups.
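When lr_lambda is a list, it must contain exactly one function per parameter group, otherwise the ValueError above is raised. A sketch of per-group schedules, assuming the optimizer accepts parameter groups in the usual dict form (the two sub-networks here are purely illustrative):

from mindspore import nn
from mindspore.experimental import optim

backbone = nn.Dense(2, 3)
head = nn.Dense(3, 1)
# Two parameter groups -> lr_lambda must be a list of two functions.
optimizer = optim.Adam([
    {'params': backbone.trainable_params()},
    {'params': head.trainable_params(), 'lr': 0.1},
], lr=0.01)
# Decay the backbone gently and the head aggressively.
scheduler = optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=[lambda epoch: 0.95 ** epoch,
               lambda epoch: 0.5 ** epoch])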
- Supported Platforms:
Ascend GPU CPU
Examples
>>> from mindspore import nn
>>> from mindspore.experimental import optim
>>> net = nn.Dense(2, 3)
>>> optimizer = optim.Adam(net.trainable_params(), 0.01)
>>> lmbda = lambda epoch: 0.9 ** epoch
>>> scheduler = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=[lmbda])
>>> for i in range(3):
...     scheduler.step()
...     current_lr = scheduler.get_last_lr()
...     print(current_lr)
[Tensor(shape=[], dtype=Float32, value= 0.009)]
[Tensor(shape=[], dtype=Float32, value= 0.0081)]
[Tensor(shape=[], dtype=Float32, value= 0.00729)]
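Because lr_lambda receives the epoch index, custom schedules such as linear warmup need only a small function. A sketch, where warmup_epochs is an arbitrary illustrative choice:

>>> # Linear warmup over the first 5 epochs, then a constant factor of 1.0.
>>> warmup_epochs = 5
>>> def warmup(epoch):
...     return min(1.0, (epoch + 1) / warmup_epochs)
>>> scheduler = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warmup)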