mindspore.mint.nn.SmoothL1Loss

class mindspore.mint.nn.SmoothL1Loss(reduction='mean', beta=1.0)

Computes the smooth L1 loss, a variant of L1 loss that is robust to outliers.

Warning

This is an experimental API that may be changed or removed in the future.

See mindspore.mint.nn.functional.smooth_l1_loss() for more details.
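The loss is piecewise: quadratic when the absolute residual is below beta, linear otherwise. A minimal NumPy sketch of this standard definition (an illustration, not the MindSpore implementation) reproduces the example outputs below:

```python
import numpy as np

def smooth_l1(input_, target, beta=1.0, reduction='mean'):
    # Quadratic near zero (0.5 * d^2 / beta), linear (d - 0.5 * beta) beyond beta
    d = np.abs(input_ - target)
    loss = np.where(d < beta, 0.5 * d ** 2 / beta, d - 0.5 * beta)
    if reduction == 'mean':
        return loss.mean()
    if reduction == 'sum':
        return loss.sum()
    return loss  # reduction == 'none'

print(smooth_l1(np.array([2., 2., 3.]), np.array([2., 2., 2.]),
                reduction='none'))  # [0.  0.  0.5]
```

With a residual of 1.0 at the boundary `d == beta`, the quadratic branch gives 0.5, so 'mean' yields 0.5 / 3 and 'sum' yields 0.5, matching the examples.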

Supported Platforms:

Ascend

Examples:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> input = Tensor(np.array([2, 2, 3]), mindspore.float32)
>>> target = Tensor(np.array([2, 2, 2]), mindspore.float32)
>>> beta = 1.0
>>> reduction_1 = 'none'
>>> loss1 = mint.nn.SmoothL1Loss(reduction=reduction_1, beta=beta)
>>> output = loss1(input, target)
>>> print(output)
[0.  0.  0.5]
>>> reduction_2 = 'mean'
>>> loss2 = mint.nn.SmoothL1Loss(reduction=reduction_2, beta=beta)
>>> output = loss2(input, target)
>>> print(output)
0.16666667
>>> reduction_3 = 'sum'
>>> loss3 = mint.nn.SmoothL1Loss(reduction=reduction_3, beta=beta)
>>> output = loss3(input, target)
>>> print(output)
0.5