mindspore.nn.MAELoss
- class mindspore.nn.MAELoss(reduction='mean')
MAELoss creates a criterion that measures the average absolute error between \(x\) and \(y\) element-wise, where \(x\) is the input and \(y\) is the target.
For simplicity, let \(x\) and \(y\) be 1-dimensional Tensors with lengths \(M\) and \(N\) respectively. The unreduced loss (i.e. with the argument reduction set to 'none') of \(x\) and \(y\) is given as:
\[\begin{split}MAE = \begin{cases} \frac{1}{M}\sum_{m=1,n=1}^{M,N}{|x_m-y_n|}, & \text{if } M > N \\\\ \frac{1}{N}\sum_{m=1,n=1}^{M,N}{|x_m-y_n|}, & \text{if } M < N \end{cases}\end{split}\]
- Parameters
reduction (str) – Type of reduction to be applied to loss. The optional values are “mean”, “sum”, and “none”. Default: “mean”.
- Inputs:
logits (Tensor) - Tensor of shape \((x_1, x_2, ..., x_M)\).
labels (Tensor) - Tensor of shape \((y_1, y_2, ..., y_N)\).
- Outputs:
Tensor, the loss as a float tensor. A scalar if reduction is “mean” or “sum”; otherwise it has the same shape as the broadcast inputs.
- Raises:
ValueError – If reduction is not one of ‘none’, ‘mean’, ‘sum’.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> import mindspore.nn as nn
>>> from mindspore import Tensor
>>> loss = nn.MAELoss()
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
0.33333334
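The three reduction modes can be sketched in plain NumPy; `mae_loss` below is an illustrative helper for equal-length inputs, not part of the MindSpore API:

```python
import numpy as np

def mae_loss(logits, labels, reduction="mean"):
    """Pure-NumPy sketch of element-wise mean absolute error.

    `reduction` mirrors the MAELoss parameter: "none" returns the
    element-wise |x - y|, "sum" their total, "mean" their average.
    """
    diff = np.abs(np.asarray(logits, dtype=np.float32)
                  - np.asarray(labels, dtype=np.float32))
    if reduction == "none":
        return diff
    if reduction == "sum":
        return diff.sum()
    if reduction == "mean":
        return diff.mean()
    raise ValueError("reduction must be one of 'none', 'mean', 'sum'")

# Same data as the doctest above: |1-1| + |2-2| + |3-2| = 1
print(mae_loss([1, 2, 3], [1, 2, 2]))           # mean -> 0.33333334
print(mae_loss([1, 2, 3], [1, 2, 2], "sum"))    # 1.0
print(mae_loss([1, 2, 3], [1, 2, 2], "none"))   # [0. 0. 1.]
```

With reduction='mean' this matches the `0.33333334` printed by the MAELoss example above.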