mindspore.FixedLossScaleManager

class mindspore.FixedLossScaleManager(loss_scale=128.0, drop_overflow_update=True)[source]

Loss scale manager with a fixed loss scale value, inherits from LossScaleManager.

Parameters
  • loss_scale (float) – Loss scale value. Note that if drop_overflow_update is set to False, the loss_scale of the optimizer you use must be set to the same value as this one. Default: 128.0.

  • drop_overflow_update (bool) – Whether to drop the optimizer update when an overflow occurs. If True, the optimizer will not be executed when an overflow occurs. Default: True.

Examples

>>> from mindspore import Model, nn, FixedLossScaleManager
>>>
>>> # Net is assumed to be a user-defined training network (an nn.Cell subclass)
>>> net = Net()
>>> #1) Drop the parameter update if there is an overflow
>>> loss_scale_manager = FixedLossScaleManager()
>>> optim = nn.Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
>>> model = Model(net, loss_scale_manager=loss_scale_manager, optimizer=optim)
>>>
>>> #2) Execute parameter update even if overflow occurs
>>> loss_scale = 1024.0
>>> loss_scale_manager = FixedLossScaleManager(loss_scale, False)
>>> optim = nn.Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9, loss_scale=loss_scale)
>>> model = Model(net, loss_scale_manager=loss_scale_manager, optimizer=optim)
get_drop_overflow_update()[source]

Get the flag indicating whether the optimizer update is dropped when an overflow occurs.

Returns

bool, drop_overflow_update value.
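
A minimal usage sketch (the constructor arguments are illustrative):

>>> from mindspore import FixedLossScaleManager
>>> manager = FixedLossScaleManager(1024.0, drop_overflow_update=False)
>>> manager.get_drop_overflow_update()
False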

get_loss_scale()[source]

Get the loss scale value.

Returns

float, loss_scale value.
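
For example, reading back the fixed value passed to the constructor:

>>> from mindspore import FixedLossScaleManager
>>> manager = FixedLossScaleManager(1024.0)
>>> manager.get_loss_scale()
1024.0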

get_update_cell()[source]

Return the update cell used by TrainOneStepWithLossScaleCell.

Returns

None or Cell. A Cell instance used to update the loss scale when drop_overflow_update is True; None when drop_overflow_update is False.
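
A sketch of both configurations; net_with_loss and optim are placeholders for a user-defined loss-wrapped network and optimizer, and the concrete Cell type returned is an implementation detail:

>>> from mindspore import nn, FixedLossScaleManager
>>>
>>> # drop_overflow_update=True: a Cell is returned and can serve as scale_sense
>>> manager = FixedLossScaleManager()
>>> update_cell = manager.get_update_cell()
>>> train_net = nn.TrainOneStepWithLossScaleCell(net_with_loss, optim, scale_sense=update_cell)
>>>
>>> # drop_overflow_update=False: the optimizer handles the loss scale, so None is returned
>>> manager = FixedLossScaleManager(1024.0, False)
>>> manager.get_update_cell() is None
True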

update_loss_scale(overflow)[source]

Update the loss scale value. For FixedLossScaleManager, this interface does nothing because the loss scale is fixed.

Parameters

overflow (bool) – Whether an overflow occurs.
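
For example, calling it has no effect on the value returned by get_loss_scale (a minimal sketch):

>>> from mindspore import FixedLossScaleManager
>>> manager = FixedLossScaleManager(1024.0)
>>> manager.update_loss_scale(True)  # no-op: the loss scale stays fixed
>>> manager.get_loss_scale()
1024.0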