mindspore.nn.BCELoss

class mindspore.nn.BCELoss(weight=None, reduction="none")[source]

BCELoss creates a criterion to measure the binary cross entropy between the true labels and predicted labels.

Let the predicted labels be \(x\), the true labels be \(y\), and the output loss be \(\ell(x, y)\). Define

\[L = \{l_1,\dots,l_N\}^\top, \quad l_n = - w_n \left[ y_n \cdot \log x_n + (1 - y_n) \cdot \log (1 - x_n) \right]\]

where \(N\) is the batch size and \(w_n\) is the rescaling weight of the \(n\)-th element (see the weight parameter).

Then,

\[\begin{split}\ell(x, y) = \begin{cases} L, & \text{if reduction} = \text{'none';}\\ \operatorname{mean}(L), & \text{if reduction} = \text{'mean';}\\ \operatorname{sum}(L), & \text{if reduction} = \text{'sum'.} \end{cases}\end{split}\]
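
As a concrete check of the formulas above, a minimal NumPy sketch (using the same values as the Examples section at the bottom of this page) reproduces the ‘mean’-reduced loss up to float32 rounding:

>>> import numpy as np
>>> x = np.array([[0.1, 0.2, 0.3], [0.5, 0.7, 0.9]])    # predicted probabilities
>>> y = np.array([[0., 1., 0.], [0., 0., 1.]])           # true labels
>>> w = np.array([[1.0, 2.0, 3.0], [4.0, 3.3, 2.2]])     # per-element rescaling weights
>>> L = -w * (y * np.log(x) + (1 - y) * np.log(1 - x))   # element-wise loss l_n
>>> print(np.round(L.mean(), 4))                         # reduction='mean'
1.8953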

Note

The predicted labels should always be probabilities, typically the output of a sigmoid function, and the true labels should be numbers between 0 and 1.
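
For illustration, a minimal sketch of squashing unbounded network scores with mindspore.nn.Sigmoid before applying BCELoss (the raw score and label values here are arbitrary, chosen only for demonstration):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> raw_scores = Tensor(np.array([[-1.2, 0.5, 2.0]]), mindspore.float32)  # unbounded network outputs
>>> probs = nn.Sigmoid()(raw_scores)                                      # map scores into (0, 1)
>>> labels = Tensor(np.array([[0., 1., 1.]]), mindspore.float32)
>>> loss = nn.BCELoss(reduction='mean')
>>> output = loss(probs, labels)                                          # scalar 'mean'-reduced loss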

Parameters
  • weight (Tensor, optional) – A rescaling weight applied to the loss of each batch element. If given, it must have the same shape and data type as logits. Default: None.

  • reduction (str) – Specifies the reduction to be applied to the output. Its value must be one of ‘none’, ‘mean’, ‘sum’. Default: ‘none’.

Inputs:
  • logits (Tensor) - The input tensor with shape \((N, *)\), where \(*\) means any number of additional dimensions. The data type must be float16 or float32.

  • labels (Tensor) - The label tensor with shape \((N, *)\), the same shape and data type as logits.

Outputs:

Tensor or Scalar. If reduction is ‘none’, the output is a tensor with the same shape as logits. Otherwise, the output is a scalar.

Raises
  • TypeError – If dtype of logits, labels or weight (if given) is neither float16 nor float32.

  • ValueError – If reduction is not one of ‘none’, ‘mean’, ‘sum’.

  • ValueError – If the shape of logits is not the same as that of labels or weight (if given).

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> weight = Tensor(np.array([[1.0, 2.0, 3.0], [4.0, 3.3, 2.2]]), mindspore.float32)
>>> loss = nn.BCELoss(weight=weight, reduction='mean')
>>> logits = Tensor(np.array([[0.1, 0.2, 0.3], [0.5, 0.7, 0.9]]), mindspore.float32)
>>> labels = Tensor(np.array([[0, 1, 0], [0, 0, 1]]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
1.8952923
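
For comparison, continuing the example above with reduction='none' returns the per-element losses, so the output keeps the shape of logits (a minimal sketch; only the shape is shown here):

>>> loss_none = nn.BCELoss(weight=weight, reduction='none')
>>> output_none = loss_none(logits, labels)    # per-element losses, no reduction
>>> print(output_none.shape)
(2, 3)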