mindspore.nn.SoftMarginLoss
- class mindspore.nn.SoftMarginLoss(reduction='mean')
A loss class for two-class classification problems.
SoftMarginLoss creates a criterion that optimizes a two-class classification logistic loss between input tensor \(x\) and labels tensor \(y\) (containing 1 or -1).
\[\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i] * x[i]))}{\text{x.nelement}()}\]
where \(x.nelement()\) represents the number of elements in \(x\).
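The formula can be expressed directly in NumPy. The following is a minimal sketch for intuition only; the function name soft_margin_loss is illustrative and not part of the MindSpore API:
>>> import numpy as np
>>> def soft_margin_loss(x, y):
...     # Element-wise logistic loss, averaged over all elements of x;
...     # log1p(exp(t)) computes log(1 + exp(t))
...     return np.log1p(np.exp(-y * x)).mean()
...
>>> # A logit that agrees with its label contributes a small loss,
>>> # a logit that disagrees contributes a large one
>>> print(np.round(soft_margin_loss(np.array([1.0, -2.0]), np.array([1.0, 1.0])), 4))
1.2201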
- Parameters
reduction (str) – Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.
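The reduction modes can be compared as follows; this is a sketch with illustrative data, where 'none' preserves the input shape while 'mean' (the default) and 'sum' reduce the element-wise losses to a scalar:
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
>>> labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
>>> # With reduction='none', one loss value is kept per element
>>> unreduced = nn.SoftMarginLoss(reduction='none')(logits, labels)
>>> print(unreduced.shape)
(2, 2)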
- Inputs:
logits (Tensor) - Predicted data. Data type must be float16 or float32.
labels (Tensor) - Ground truth data, with the same type and shape as logits.
- Outputs:
Tensor or Scalar. If reduction is 'none', its shape is the same as that of logits; otherwise, a scalar value is returned.
- Raises
TypeError – If logits or labels is not a Tensor.
TypeError – If dtype of logits or labels is neither float16 nor float32.
ValueError – If shape of logits is not the same as that of labels.
ValueError – If reduction is not one of 'none', 'mean', 'sum'.
- Supported Platforms:
Ascend GPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> loss = nn.SoftMarginLoss()
>>> logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
>>> labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
0.6764238
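As a cross-check, the printed value can be reproduced from the formula above with plain NumPy (rounded here only to keep the output stable):
>>> import numpy as np
>>> x = np.array([[0.3, 0.7], [0.5, 0.5]], dtype=np.float32)
>>> y = np.array([[-1, 1], [1, -1]], dtype=np.float32)
>>> # Mean of log(1 + exp(-y * x)) over all elements of x
>>> print(np.round(np.log1p(np.exp(-y * x)).mean(), 4))
0.6764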