mindspore.ops.soft_margin_loss

mindspore.ops.soft_margin_loss(input, target, reduction='mean')[source]

Calculate the soft margin loss of input and target.

Creates a criterion that optimizes a two-class classification logistic loss between input tensor \(x\) and target tensor \(y\) (containing 1 or -1).

\[\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{x.\text{nelement}()}\]

where \(x.\text{nelement}()\) is the number of elements of \(x\). The formula shows the default 'mean' reduction; see reduction below for the other options.
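
As a quick sanity check, the 'mean'-reduced formula can be evaluated directly with NumPy. The snippet below is an illustrative sketch, not part of the MindSpore API; it reproduces the result of the example at the bottom of this page.

>>> import numpy as np
>>> x = np.array([[0.3, 0.7], [0.5, 0.5]], dtype=np.float32)
>>> y = np.array([[-1, 1], [1, -1]], dtype=np.float32)
>>> # log(1 + exp(-y * x)) per element, averaged over all elements of x
>>> loss = np.log1p(np.exp(-y * x)).mean()
>>> print(round(float(loss), 4))
0.6764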

Warning

This is an experimental API that is subject to change or deletion.

Parameters
  • input (Tensor) – Predicted data. Data type must be float16 or float32.

  • target (Tensor) – Ground truth data, with the same dtype and shape as input.

  • reduction (str, optional) – The reduction to apply to the output: 'none' (no reduction), 'mean' (mean of the elementwise losses), or 'sum' (sum of the elementwise losses). Default: 'mean'.

Returns

Tensor or Scalar. If reduction is 'none', its shape is the same as that of input. Otherwise, a scalar value is returned.

Raises
  • TypeError – If input or target is not a Tensor.

  • TypeError – If dtype of input or target is neither float16 nor float32.

  • ValueError – If shape of input is not the same as that of target.

  • ValueError – If reduction is not one of 'none', 'mean' or 'sum'.

Supported Platforms:

Ascend GPU

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
>>> labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
>>> output = ops.soft_margin_loss(logits, labels)
>>> print(output)
0.6764238
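
The reduction argument controls the shape of the result. Reusing the tensors above (a sketch of expected behavior; the commented value follows from the formula, and exact float32 printing may differ in the last digits):

>>> output_none = ops.soft_margin_loss(logits, labels, reduction='none')
>>> print(output_none.shape)
(2, 2)
>>> output_sum = ops.soft_margin_loss(logits, labels, reduction='sum')
>>> print(output_sum)  # approximately 2.7057, i.e. 4 * 0.6764238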