mindspore.mint.nn.functional.l1_loss
- mindspore.mint.nn.functional.l1_loss(input, target, reduction='mean')
Calculate the mean absolute error between the input value and the target value.
Assuming that \(x\) and \(y\) are the predicted value and the target value respectively, both one-dimensional tensors of length \(N\), and reduction is set to 'none', then the loss of \(x\) and \(y\) is calculated without dimensionality reduction. The formula is as follows:

\[\ell(x, y) = L = \{l_1,\dots,l_N\}^\top, \quad \text{with } l_n = \left| x_n - y_n \right|,\]

where \(N\) is the batch size.
If reduction is 'mean' or 'sum', then:

\[\begin{split}\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{'mean';}\\ \operatorname{sum}(L), & \text{if reduction} = \text{'sum'.} \end{cases}\end{split}\]

- Parameters
input (Tensor) – Predicted value, Tensor of any dimension.
target (Tensor) – Target value, which usually has the same shape as input. If the shapes of input and target are different, make sure they can be broadcast to each other.
reduction (str, optional) – Apply a specific reduction method to the output: 'none', 'mean' or 'sum'. Default: 'mean'.
'none': no reduction will be applied.
'mean': compute and return the mean of the elements in the output. Note: at least one of input and target must be of float type when reduction is 'mean'.
'sum': the output elements will be summed.
- Returns
Tensor or Scalar. If reduction is 'none', a Tensor with the same shape and dtype as input is returned. Otherwise, a scalar value is returned.
- Raises
TypeError – If input is not a Tensor.
TypeError – If target is not a Tensor.
ValueError – If reduction is not one of 'none', 'mean' or 'sum'.
- Supported Platforms:
Ascend
Examples
>>> from mindspore import Tensor, mint
>>> from mindspore import dtype as mstype
>>> x = Tensor([[1, 2, 3], [4, 5, 6]], mstype.float32)
>>> target = Tensor([[6, 5, 4], [3, 2, 1]], mstype.float32)
>>> output = mint.nn.functional.l1_loss(x, target, reduction="mean")
>>> print(output)
3.0
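For comparison, here is a minimal sketch of the other two reduction modes on the same inputs. The printed values are derived from the formula above (the element-wise losses \(|x_n - y_n|\) are 5, 3, 1, 1, 3, 5, which sum to 18) rather than copied from an actual run, so the exact print formatting may differ.

>>> # 'none' keeps the element-wise losses, same shape as input
>>> output_none = mint.nn.functional.l1_loss(x, target, reduction="none")
>>> print(output_none)
[[5. 3. 1.]
 [1. 3. 5.]]
>>> # 'sum' adds all element-wise losses into a scalar
>>> output_sum = mint.nn.functional.l1_loss(x, target, reduction="sum")
>>> print(output_sum)
18.0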