mindspore.nn.BCELoss

class mindspore.nn.BCELoss(weight=None, reduction="none")

BCELoss creates a criterion to measure the binary cross entropy between the true labels and predicted labels.

Set the predicted labels as x, the true labels as y, and the output loss as ℓ(x, y). Let

L = \{l_1, \dots, l_N\}^\top, \quad l_n = -w_n \left[ y_n \cdot \log x_n + (1 - y_n) \cdot \log(1 - x_n) \right]

Then,

\ell(x, y) =
\begin{cases}
L, & \text{if reduction} = \text{'none'}; \\
\operatorname{mean}(L), & \text{if reduction} = \text{'mean'}; \\
\operatorname{sum}(L), & \text{if reduction} = \text{'sum'}.
\end{cases}
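As a rough illustration of the formula (a minimal NumPy sketch, not the MindSpore implementation; the array values are made up):

>>> import numpy as np
>>> x = np.array([0.2, 0.7, 0.9])   # predicted labels (sigmoid outputs)
>>> y = np.array([0.0, 1.0, 1.0])   # true labels
>>> w = np.array([1.0, 1.0, 1.0])   # rescaling weight w_n
>>> l = -w * (y * np.log(x) + (1 - y) * np.log(1 - x))   # elementwise l_n
>>> l, np.mean(l), np.sum(l)        # 'none', 'mean', 'sum' reductions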

Note

The predicted labels should always be the output of a sigmoid function, and the true labels should be numbers between 0 and 1.
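For example, raw network outputs can be mapped into (0, 1) with mindspore.nn.Sigmoid before the loss is computed (a minimal sketch; the tensor values here are made up):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> z = Tensor(np.array([-1.0, 0.5, 2.0]), mindspore.float32)   # raw, unbounded outputs
>>> probs = nn.Sigmoid()(z)                                     # squashed into (0, 1)
>>> labels = Tensor(np.array([0.0, 1.0, 1.0]), mindspore.float32)
>>> output = nn.BCELoss(reduction='mean')(probs, labels)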

Parameters
  • weight (Tensor, optional) – A rescaling weight applied to the loss of each batch element. It must have the same shape and data type as logits. Default: None.

  • reduction (str) – Specifies the reduction to be applied to the output. Its value must be one of ‘none’, ‘mean’, ‘sum’. Default: ‘none’.

Inputs:
  • logits (Tensor) - The input tensor with shape (N, *), where * means any number of additional dimensions. The data type must be float16 or float32.

  • labels (Tensor) - The label tensor with shape (N, *), with the same shape and data type as logits.

Outputs:

Tensor or Scalar. If reduction is ‘none’, the output is a tensor with the same shape as logits. Otherwise, the output is a scalar.
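Continuing the sketch from the note above, a rough illustration of how reduction affects the output shape (values made up):

>>> out_none = nn.BCELoss(reduction='none')(probs, labels)
>>> out_mean = nn.BCELoss(reduction='mean')(probs, labels)
>>> out_none.shape   # (3,), the same shape as probs
>>> out_mean.shape   # scalar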

Raises
  • TypeError – If dtype of logits, labels or weight (if given) is neither float16 nor float32.

  • ValueError – If reduction is not one of ‘none’, ‘mean’, ‘sum’.

  • ValueError – If the shape of logits is not the same as that of labels or weight (if given).

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> weight = Tensor(np.array([[1.0, 2.0, 3.0], [4.0, 3.3, 2.2]]), mindspore.float32)
>>> loss = nn.BCELoss(weight=weight, reduction='mean')
>>> logits = Tensor(np.array([[0.1, 0.2, 0.3], [0.5, 0.7, 0.9]]), mindspore.float32)
>>> labels = Tensor(np.array([[0, 1, 0], [0, 0, 1]]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
1.8952923
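Applying the formula above elementwise to this example gives six weighted terms that sum to roughly 11.372, so the ‘mean’ reduction returns 11.372 / 6 ≈ 1.8953, consistent with the printed value.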