mindspore.nn.HuberLoss
- class mindspore.nn.HuberLoss(reduction='mean', delta=1.0)
HuberLoss calculates the error between the predicted value and the target value. It combines the advantages of both L1Loss and MSELoss.
Assuming that \(x\) and \(y\) are 1-D Tensors of length \(N\), the loss of \(x\) and \(y\) without dimensionality reduction (the reduction parameter set to "none") is calculated as follows:
\[\ell(x, y) = L = \{l_1,\dots,l_N\}^\top\]

with

\[\begin{split}l_n = \begin{cases} 0.5 * (x_n - y_n)^2, & \text{if } |x_n - y_n| < delta; \\ delta * (|x_n - y_n| - 0.5 * delta), & \text{otherwise. } \end{cases}\end{split}\]

where \(N\) is the batch size. If reduction is not "none", then:

\[\begin{split}\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{"mean";}\\ \operatorname{sum}(L), & \text{if reduction} = \text{"sum".} \end{cases}\end{split}\]

- Parameters
reduction (str, optional) – Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.
  - 'none': no reduction will be applied.
  - 'mean': compute and return the mean of the elements in the output.
  - 'sum': the output elements will be summed.
delta (Union[int, float]) – The threshold at which to change between the two types of loss. The value must be positive. Default: 1.0.
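The piecewise formula and the reduction options can be sketched in plain NumPy as a reference (a sketch that follows the definitions above, not the MindSpore kernel; the helper name `huber_ref` is hypothetical):

```python
import numpy as np

def huber_ref(x, y, reduction="mean", delta=1.0):
    """Reference Huber loss following the piecewise formula above."""
    diff = np.abs(x - y)
    # Quadratic branch inside the delta threshold, linear branch outside.
    l = np.where(diff < delta,
                 0.5 * (x - y) ** 2,
                 delta * (diff - 0.5 * delta))
    if reduction == "mean":
        return l.mean()
    if reduction == "sum":
        return l.sum()
    return l  # reduction == "none"
```

For instance, `huber_ref(np.array([1., 2., 3.]), np.array([1., 2., 2.]))` reproduces the default-reduction behavior shown in the Examples section below.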
- Inputs:
logits (Tensor) - Predicted value, Tensor of any dimension. The data type must be float16 or float32.
labels (Tensor) - Target value, with the same dtype and shape as logits in common cases. However, logits and labels may have different shapes, as long as they can be broadcast to each other.
- Outputs:
Tensor or Scalar. If reduction is "none", returns a Tensor with the same shape and dtype as logits. Otherwise, a scalar value is returned.
- Raises
TypeError – If data type of logits or labels is neither float16 nor float32.
TypeError – If data types of logits and labels are not the same.
TypeError – If dtype of delta is neither float nor int.
ValueError – If delta is less than or equal to 0.
ValueError – If reduction is not one of "none", "mean", "sum".
ValueError – If logits and labels have different shapes and cannot be broadcast to each other.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> import numpy as np
>>> # Case 1: logits.shape = labels.shape = (3,)
>>> loss = nn.HuberLoss()
>>> logits = ms.Tensor(np.array([1, 2, 3]), ms.float32)
>>> labels = ms.Tensor(np.array([1, 2, 2]), ms.float32)
>>> output = loss(logits, labels)
>>> print(output)
0.16666667
>>> # Case 2: logits.shape = (3,), labels.shape = (2, 3)
>>> loss = nn.HuberLoss(reduction="none")
>>> logits = ms.Tensor(np.array([1, 2, 3]), ms.float32)
>>> labels = ms.Tensor(np.array([[1, 1, 1], [1, 2, 2]]), ms.float32)
>>> output = loss(logits, labels)
>>> print(output)
[[0.  0.5 1.5]
 [0.  0.  0.5]]
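The broadcast case above (Case 2) can be cross-checked against the piecewise formula with plain NumPy, whose broadcasting of a `(3,)` array against a `(2, 3)` array follows the same rules (a verification sketch, not MindSpore code):

```python
import numpy as np

# Same data as Case 2: logits (3,) is broadcast against labels (2, 3).
logits = np.array([1., 2., 3.], dtype=np.float32)
labels = np.array([[1., 1., 1.], [1., 2., 2.]], dtype=np.float32)

delta = 1.0
diff = np.abs(logits - labels)            # broadcasts to shape (2, 3)
loss = np.where(diff < delta,
                0.5 * (logits - labels) ** 2,
                delta * (diff - 0.5 * delta))
print(loss)
# [[0.  0.5 1.5]
#  [0.  0.  0.5]]
```

The result matches the documented output of `nn.HuberLoss(reduction="none")` for the same inputs.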