mindspore.amp.StaticLossScaler

class mindspore.amp.StaticLossScaler(scale_value)[source]

Static loss scaler class.

Scales and unscales the loss or gradients by a fixed constant factor.

Warning

This is an experimental API that is subject to change or deletion.

Parameters

scale_value (Union(float, int)) – The initial loss scale value.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import amp, Tensor
>>> import numpy as np
>>> loss_scaler = amp.StaticLossScaler(scale_value=2**10)
>>> loss_value = Tensor([1.], mindspore.float32)
>>> scaled_loss_value = loss_scaler.scale(loss_value)
>>> print(scaled_loss_value)
[1024.]
>>> grads = (Tensor(np.array([1.5, 1.0]), mindspore.float16),
...      Tensor(np.array([1.2]), mindspore.float16))
>>> unscaled_grads = loss_scaler.unscale(grads)
>>> print(unscaled_grads)
(Tensor(shape=[2], dtype=Float16, value= [ 1.4648e-03,  9.7656e-04]),
Tensor(shape=[1], dtype=Float16, value= [ 1.1721e-03]))
adjust(grads_finite)[source]

The scale_value is fixed, so this method performs no adjustment; it exists for interface compatibility with dynamic loss scalers.

Parameters

grads_finite (Tensor) – a scalar bool Tensor indicating whether the grads are finite.
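Unlike a dynamic loss scaler, a static scaler never reacts to overflow. A minimal pure-Python sketch of this no-op behavior (illustrative only, not MindSpore's implementation):

```python
# Sketch of a static loss scaler whose adjust() is a no-op.
# This is an illustration, not MindSpore's actual implementation.
class StaticScalerSketch:
    def __init__(self, scale_value):
        self.scale_value = float(scale_value)

    def adjust(self, grads_finite):
        # Static scaling: the scale factor never changes,
        # regardless of whether the gradients were finite.
        return

scaler = StaticScalerSketch(2**10)
scaler.adjust(grads_finite=False)  # overflow reported, but...
print(scaler.scale_value)          # ...the scale is still 1024.0
```

A dynamic scaler would instead halve `scale_value` when `grads_finite` is false and grow it after a run of finite steps; the static variant simply ignores the signal.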

scale(inputs)[source]

Scale the inputs by multiplying them by scale_value.

Parameters

inputs (Union(Tensor, tuple(Tensor))) – the input loss value or gradients.

Returns

Union(Tensor, tuple(Tensor)), the scaled value.

unscale(inputs)[source]

Unscale the inputs by dividing them by scale_value.

Parameters

inputs (Union(Tensor, tuple(Tensor))) – the input loss value or gradients.

Returns

Union(Tensor, tuple(Tensor)), the unscaled value.
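The arithmetic behind scale and unscale is elementwise multiplication and division by the fixed factor, applied to each tensor in a tuple. A NumPy sketch under that assumption (illustrative, not MindSpore's implementation):

```python
import numpy as np

# Fixed scale factor, matching the 2**10 used in the Examples above.
scale_value = 2.0 ** 10

def scale(inputs):
    # Multiply each array (loss or gradient) by the fixed factor.
    return tuple(x * scale_value for x in inputs)

def unscale(inputs):
    # Divide each array by the same fixed factor to recover the values.
    return tuple(x / scale_value for x in inputs)

grads = (np.array([1.5, 1.0]), np.array([1.2]))
recovered = unscale(scale(grads))
print(recovered[0])  # [1.5 1. ]
```

In float32 this round trip is exact for a power-of-two scale; in float16 (as in the Examples above) the unscaled values are rounded to the nearest representable half-precision number.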