mindspore.ops.l1_loss
- mindspore.ops.l1_loss(input, target, reduction='mean')[source]
Calculate the mean absolute error between the input value and the target value.
Assuming that $x$ and $y$ are the 1-D input and target Tensors of length $N$, and reduction is set to "none", the loss of $x$ and $y$ is calculated without dimensionality reduction. The formula is as follows:

$$\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad l_n = \left| x_n - y_n \right|,$$

where $N$ is the batch size. If reduction is "mean" or "sum", then:

$$\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{"mean"}; \\ \operatorname{sum}(L), & \text{if reduction} = \text{"sum"}. \end{cases}$$
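For illustration, the formulas above can be sketched in plain NumPy (this is only a numerical sketch of the math, not MindSpore's implementation; the helper name `l1_loss_sketch` is hypothetical):

```python
import numpy as np

def l1_loss_sketch(x, y, reduction="mean"):
    # l_n = |x_n - y_n|: elementwise absolute error
    loss = np.abs(x - y)
    if reduction == "mean":
        return loss.mean()   # mean(L)
    if reduction == "sum":
        return loss.sum()    # sum(L)
    return loss              # "none": no dimensionality reduction

x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 2.0, 1.0])
# |x - y| = [2, 0, 2], so mean is 4/3 and sum is 4
mean_val = l1_loss_sketch(x, y)
sum_val = l1_loss_sketch(x, y, reduction="sum")
```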
- Parameters
input (Tensor) – Predicted value, Tensor of any dimension.
target (Tensor) – Target value, usually with the same shape as the input. If input and target have different shapes, make sure they can broadcast to each other.
reduction (str, optional) – Type of reduction to be applied to the loss. The optional values are "mean", "sum" and "none". Default: 'mean'.
- Returns
Tensor or Scalar, if reduction is “none”, return a Tensor with same shape and dtype as input. Otherwise, a scalar value will be returned.
- Raises
TypeError – If input is not a Tensor.
TypeError – If target is not a Tensor.
ValueError – If reduction is not one of “none”, “mean” or “sum”.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import mindspore as ms
>>> from mindspore import ops
>>> from mindspore import dtype as mstype
>>> x = ms.Tensor([[1, 2, 3], [4, 5, 6]], mstype.float32)
>>> target = ms.Tensor([[6, 5, 4], [3, 2, 1]], mstype.float32)
>>> output = ops.l1_loss(x, target, reduction="mean")
>>> print(output)
3.0
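The broadcasting behavior described under Parameters, and the shape of the "none" result, can be illustrated with a NumPy analogue (an illustrative sketch only, not the MindSpore implementation):

```python
import numpy as np

# A target of shape (3,) broadcasts against an input of shape (2, 3).
x = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
t = np.array([1.0, 1.0, 1.0])

# reduction="none": elementwise |x - t|, keeping the broadcast shape (2, 3)
none_loss = np.abs(x - t)

# reduction="mean": a scalar, the mean of all six elementwise errors
# |x - t| = [[0, 1, 2], [3, 4, 5]], so the mean is 15 / 6 = 2.5
mean_loss = np.abs(x - t).mean()
```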