mindspore.ops.huber_loss
- mindspore.ops.huber_loss(input, target, reduction='mean', delta=1.0)
Calculates the error between the predicted value and the target value, combining the advantages of mindspore.ops.l1_loss() and mindspore.ops.mse_loss().

Assuming that \(x\) and \(y\) are 1-D Tensors of length \(N\) and the reduction parameter is set to
'none'
, the loss of \(x\) and \(y\) is calculated without dimensionality reduction. The formula is as follows:

\[\ell(x, y) = L = \{l_1,\dots,l_N\}^\top\]
with

\[\begin{split}l_n = \begin{cases} 0.5 * (x_n - y_n)^2, & \text{if } |x_n - y_n| < delta; \\ delta * (|x_n - y_n| - 0.5 * delta), & \text{otherwise. } \end{cases}\end{split}\]

where \(N\) is the batch size.
If reduction is "mean" or "sum", then:

\[\begin{split}\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{"mean";}\\ \operatorname{sum}(L), & \text{if reduction} = \text{"sum".} \end{cases}\end{split}\]

- Parameters
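The piecewise definition and the reduction step can be checked with a short NumPy sketch. This is an illustrative reference implementation, not the MindSpore kernel; the function name `huber_loss_ref` is ours:

```python
import numpy as np

def huber_loss_ref(x, y, reduction="mean", delta=1.0):
    # Absolute element-wise residual between prediction and target.
    diff = np.abs(x - y)
    # Quadratic branch inside the delta threshold, linear branch outside.
    losses = np.where(diff < delta,
                      0.5 * diff ** 2,
                      delta * (diff - 0.5 * delta))
    if reduction == "mean":
        return losses.mean()
    if reduction == "sum":
        return losses.sum()
    return losses  # reduction == "none": element-wise losses

x = np.array([1.0, 2.0, 10.0, 2.0])
y = np.array([1.0, 5.0, 1.0, 20.0])
print(huber_loss_ref(x, y, reduction="mean", delta=2.0))   # 13.5
print(huber_loss_ref(x, y, reduction="none", delta=2.0))   # [ 0.  4. 16. 34.]
```

With delta=2, the first residual (0) falls on the quadratic branch and the others (3, 9, 18) on the linear branch, giving per-element losses 0, 4, 16, 34 and a mean of 13.5.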
input (Tensor) – Predicted value, Tensor of any dimension.
target (Tensor) – Target value, which in common cases has the same dtype and shape as input. When the shape of target differs from the shape of input, the two must be broadcastable to each other.
reduction (str, optional) – Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.

'none': no reduction will be applied.
'mean': compute and return the mean of elements in the output.
'sum': the output elements will be summed.
delta (Union[int, float]) – The threshold at which to change between the two types of loss. The value must be greater than zero. Default: 1.0.
- Returns
Tensor or Scalar. If reduction is 'none', returns a Tensor with the same shape and dtype as input. Otherwise, a scalar value is returned.

- Raises
TypeError – If input or target is not a Tensor.
TypeError – If dtype of delta is neither float nor int.
ValueError – If delta is less than or equal to 0.
ValueError – If reduction is not one of 'none', 'mean', 'sum'.
ValueError – If input and target have different shapes and cannot be broadcast to each other.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor([1, 2, 10, 2], mindspore.float32)
>>> target = Tensor([1, 5, 1, 20], mindspore.float32)
>>> output = ops.huber_loss(x, target, reduction="mean", delta=2)
>>> print(output)
13.5