mindspore.DynamicLossScaleManager
- class mindspore.DynamicLossScaleManager(init_loss_scale=2 ** 24, scale_factor=2, scale_window=2000)[source]
Loss scale manager that dynamically adjusts the loss scale value, inherits from LossScaleManager.
- Parameters
  - init_loss_scale (float) – Initial loss scale value. Default: 2 ** 24.
  - scale_factor (int) – Factor by which the loss scale is increased or decreased. Default: 2.
  - scale_window (int) – Maximum number of consecutive steps without overflow before the loss scale is increased. Default: 2000.
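The adjustment rule these parameters describe can be illustrated with a small pure-Python sketch. This is an illustration of the documented behavior, not MindSpore source: on an overflow step the scale is divided by scale_factor, and after scale_window consecutive overflow-free steps it is multiplied by scale_factor. The helper name simulate is hypothetical.

>>> def simulate(overflow_flags, init_loss_scale=2 ** 24, scale_factor=2, scale_window=2000):
...     """Hypothetical sketch of the dynamic loss scale adjustment rule."""
...     loss_scale = init_loss_scale
...     good_steps = 0
...     for overflow in overflow_flags:
...         if overflow:
...             # Overflow detected: shrink the scale (not below 1) and reset the counter.
...             loss_scale = max(loss_scale / scale_factor, 1)
...             good_steps = 0
...         else:
...             good_steps += 1
...             if good_steps % scale_window == 0:
...                 # A full window of clean steps: grow the scale again.
...                 loss_scale *= scale_factor
...     return loss_scale
...
>>> # Two clean windows double the scale twice; the final overflow halves it once.
>>> simulate([False] * 4 + [True], init_loss_scale=1024, scale_factor=2, scale_window=2)
2048.0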
Examples
>>> from mindspore import Model, nn, DynamicLossScaleManager
>>>
>>> net = Net()
>>> loss_scale_manager = DynamicLossScaleManager()
>>> optim = nn.Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
>>> model = Model(net, loss_scale_manager=loss_scale_manager, optimizer=optim)
- get_drop_overflow_update()[source]
Get the flag indicating whether to drop the optimizer update for the current step when an overflow occurs.
- Returns
bool, always returns True for DynamicLossScaleManager.
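For reference, a brief sketch of how this flag fits into the manager's step-level behavior, assuming the companion methods get_loss_scale() and update_loss_scale(overflow) on this class; the exact printed values are illustrative:

>>> manager = DynamicLossScaleManager(init_loss_scale=2 ** 10, scale_factor=2, scale_window=3)
>>> manager.get_drop_overflow_update()
True
>>> # On an overflow step the optimizer update is dropped and the scale is reduced.
>>> manager.update_loss_scale(True)
>>> manager.get_loss_scale()  # reduced by scale_factor: 1024 -> 512.0
512.0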