mindspore.mint.nn.MSELoss

class mindspore.mint.nn.MSELoss(reduction='mean')

Calculates the mean squared error between the predicted value and the label value.

For simplicity, let x and y be 1-dimensional Tensors of length N. The unreduced loss (i.e., with the reduction argument set to 'none') of x and y is given as:

$$\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad \text{with } l_n = (x_n - y_n)^2,$$

where N is the batch size. If reduction is not 'none', then:

$$\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{'mean'}; \\ \operatorname{sum}(L), & \text{if reduction} = \text{'sum'}. \end{cases}$$
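The formulas above can be checked numerically with a short sketch. NumPy is used here instead of the MindSpore API so the snippet runs standalone; it mirrors the math, not the operator implementation:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])  # predicted values
y = np.array([1.0, 1.0, 1.0])  # labels

# Unreduced loss: l_n = (x_n - y_n)^2
L = (x - y) ** 2          # [0., 1., 4.]

mean_loss = L.mean()      # 'mean' reduction -> 5/3 ≈ 1.6666667
sum_loss = L.sum()        # 'sum'  reduction -> 5.0
```

The mean result matches the output of Case 1 in the Examples section below.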
Parameters

reduction (str, optional) –

Specifies the reduction to apply to the output: 'none', 'mean', or 'sum'. Default: 'mean'.

  • 'none': no reduction will be applied.

  • 'mean': compute and return the mean of elements in the output.

  • 'sum': the output elements will be summed.

Inputs:
  • logits (Tensor) - The predicted value of the input. Tensor of any dimension. The data type needs to be consistent with the labels. It should also be broadcastable with the labels.

  • labels (Tensor) - The input label. Tensor of any dimension. The data type needs to be consistent with the logits. It should also be broadcastable with the logits.

Outputs:
  • Tensor. If reduction is 'mean' or 'sum', the output is a scalar Tensor.

  • If reduction is 'none', the shape of the output is the broadcasted shape of logits and labels.
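As a sketch of the broadcasted output shape, assuming standard NumPy-style broadcasting rules apply (NumPy is used here so the snippet runs standalone):

```python
import numpy as np

logits = np.array([1.0, 2.0, 3.0])        # shape (3,)
labels = np.array([[1.0, 1.0, 1.0],
                   [1.0, 2.0, 2.0]])      # shape (2, 3)

# With reduction='none', the element-wise squared error is
# computed on the broadcasted shape (2, 3).
loss = (logits - labels) ** 2
print(loss.shape)   # (2, 3)
```

This corresponds to Case 2 in the Examples section below, where a (3,) logits Tensor is broadcast against (2, 3) labels.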

Raises
  • ValueError – If reduction is not one of 'mean', 'sum' or 'none'.

  • ValueError – If logits and labels are not broadcastable.

  • TypeError – If logits and labels have different data types.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> # Case 1: logits.shape = labels.shape = (3,)
>>> loss = mint.nn.MSELoss()
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 1, 1]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
1.6666667
>>> # Case 2: logits.shape = (3,), labels.shape = (2, 3)
>>> loss = mint.nn.MSELoss(reduction='none')
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([[1, 1, 1], [1, 2, 2]]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
[[0. 1. 4.]
 [0. 0. 1.]]