mindspore.ops.multilabel_soft_margin_loss

mindspore.ops.multilabel_soft_margin_loss(input, target, weight=None, reduction='mean')[source]

Calculates the MultiLabelSoftMarginLoss. The multi-label soft margin loss is a commonly used loss function in multi-label classification tasks, where a single input sample can belong to multiple classes. Given an input \(input\) and binary labels \(output\) of size \((N,C)\), where \(N\) denotes the number of samples and \(C\) denotes the number of classes, the loss is computed as:

\[\text{loss}\left( input, output \right) = - \frac{1}{N}\frac{1}{C}\sum_{i = 1}^{N} \sum_{j = 1}^{C}\left(output_{ij}\log\frac{1}{1 + e^{- input_{ij}}} + \left( 1 - output_{ij} \right)\log\frac{e^{-input_{ij}}}{1 + e^{-input_{ij}}} \right)\]

where \(input_{ij}\) represents the predicted score of sample \(i\) for class \(j\), and \(output_{ij}\) represents the binary label of sample \(i\) for class \(j\): sample \(i\) belongs to class \(j\) if \(output_{ij}=1\), and does not belong to class \(j\) if \(output_{ij}=0\). In a multi-label classification task, each sample may have multiple labels with a value of 1 in the binary label \(output\). If weight is given, it is multiplied by the loss of each class.
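The formula above can be checked with a small NumPy reference implementation (a sketch for illustration, not the MindSpore source; the helper name `multilabel_soft_margin_loss_ref` is hypothetical). It uses the numerically stable identities \(\log\sigma(x) = -\log(1+e^{-x})\) and \(\log(1-\sigma(x)) = -x-\log(1+e^{-x})\):

```python
import numpy as np

def multilabel_soft_margin_loss_ref(x, y, weight=None, reduction='mean'):
    # log(sigmoid(x)) = -log(1 + e^{-x}), computed stably via logaddexp
    log_sig = -np.logaddexp(0.0, -x)
    # log(1 - sigmoid(x)) = -x - log(1 + e^{-x})
    log_one_minus_sig = -x - np.logaddexp(0.0, -x)
    loss = -(y * log_sig + (1.0 - y) * log_one_minus_sig)
    if weight is not None:
        loss = loss * weight          # per-class rescaling weight
    loss = loss.mean(axis=1)          # average over the C classes -> shape (N,)
    if reduction == 'mean':
        return loss.mean()
    if reduction == 'sum':
        return loss.sum()
    return loss                       # reduction == 'none'

x = np.array([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
y = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(multilabel_soft_margin_loss_ref(x, y))  # matches the Examples output below
```

With these inputs the result agrees with `ops.multilabel_soft_margin_loss` in the Examples section.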

Parameters
  • input (Tensor) – A tensor of shape \((N, C)\), where \(N\) is the batch size and \(C\) is the number of classes.

  • target (Tensor) – The target label Tensor, which has the same shape as input.

  • weight (Union[Tensor, int, float], optional) – The manual rescaling weight given to each class. Default: None.

  • reduction (str, optional) –

    Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.

    • 'none': no reduction will be applied.

    • 'mean': compute and return the weighted mean of elements in the output.

    • 'sum': the output elements will be summed.

Returns

Tensor, with the same data type as input. If reduction is 'none', its shape is \((N)\); otherwise, it is a zero-dimensional (scalar) Tensor.

Supported Platforms:

Ascend GPU CPU

Examples

>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> target = Tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> loss = ops.multilabel_soft_margin_loss(input, target, reduction='mean')
>>> print(loss.asnumpy())
0.84693956