mindspore.amp.DynamicLossScaleManager
- class mindspore.amp.DynamicLossScaleManager(init_loss_scale=2 ** 24, scale_factor=2, scale_window=2000)[source]
Loss scale (magnification factor of gradients when mixed precision is used) manager with the loss scale dynamically adjusted. Inherits from mindspore.amp.LossScaleManager.
- Parameters
init_loss_scale (float) - Initial loss scale value. Default: 2 ** 24.
scale_factor (int) - Factor by which the loss scale is multiplied or divided when it is adjusted. Default: 2.
scale_window (int) - Maximum number of consecutive overflow-free steps before the loss scale is increased. Default: 2000.
- Supported Platforms:
Ascend
GPU
Examples
>>> import mindspore as ms
>>> from mindspore import amp, nn
>>>
>>> # Define the network structure of LeNet5. Refer to
>>> # https://gitee.com/mindspore/docs/blob/r2.1/docs/mindspore/code/lenet.py
>>> net = LeNet5()
>>> loss_scale_manager = amp.DynamicLossScaleManager()
>>> optim = nn.Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
>>> model = ms.Model(net, loss_scale_manager=loss_scale_manager, optimizer=optim)
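The manager decreases the loss scale when an overflow occurs and increases it by scale_factor after scale_window consecutive overflow-free steps. The following is a minimal plain-Python sketch of that adjustment rule, for illustration only; it is not MindSpore's implementation, whose counter bookkeeping may differ in detail.
>>> def adjust_loss_scale(loss_scale, overflow, good_steps,
...                       scale_factor=2, scale_window=2000):
...     """Illustrative dynamic loss-scale rule (not MindSpore's code):
...     divide by scale_factor on overflow (never below 1), multiply by
...     scale_factor after scale_window consecutive overflow-free steps."""
...     if overflow:
...         return max(loss_scale / scale_factor, 1.0), 0
...     good_steps += 1
...     if good_steps >= scale_window:
...         return loss_scale * scale_factor, 0
...     return loss_scale, good_steps
>>> adjust_loss_scale(2 ** 24, True, 0)
(8388608.0, 0)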
- get_drop_overflow_update()[source]
Whether to drop the optimizer update for the current step when there is an overflow.
- Returns
bool, always True.
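Since the dynamic manager always skips the optimizer update on an overflowed step, the call trivially returns True:
>>> from mindspore import amp
>>> manager = amp.DynamicLossScaleManager()
>>> manager.get_drop_overflow_update()
True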
- get_update_cell()[source]
Returns the instance of mindspore.nn.Cell that is used to update the loss scale, which will be called by mindspore.nn.TrainOneStepWithLossScaleCell.
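A sketch of how the update cell is typically wired into a custom training step; the loss function and the LeNet5 network (reused from the example above) are illustrative choices, not requirements:
>>> from mindspore import amp, nn
>>>
>>> net = LeNet5()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')
>>> net_with_loss = nn.WithLossCell(net, loss_fn)
>>> optim = nn.Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
>>> manager = amp.DynamicLossScaleManager()
>>> # The returned cell updates the loss scale inside each training step.
>>> train_net = nn.TrainOneStepWithLossScaleCell(net_with_loss, optim,
...                                              scale_sense=manager.get_update_cell())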