mindspore.ops.multilabel_soft_margin_loss

mindspore.ops.multilabel_soft_margin_loss(input, target, weight=None, reduction='mean')[source]

Calculates the MultiLabelSoftMarginLoss. The multi-label soft margin loss is a commonly used loss function in multi-label classification tasks, where a single input sample can belong to multiple classes. Given an input tensor input and a binary label tensor output, both of size (N, C), where N denotes the number of samples and C denotes the number of classes:

$$\text{loss}(input, output) = -\frac{1}{N}\frac{1}{C}\sum_{i=1}^{N}\sum_{j=1}^{C}\left(output_{ij}\log\frac{1}{1+e^{-input_{ij}}} + (1-output_{ij})\log\frac{e^{-input_{ij}}}{1+e^{-input_{ij}}}\right)$$

where $input_{ij}$ represents the predicted score of sample i for class j, and $output_{ij}$ represents the binary label of sample i for class j: sample i belongs to class j if $output_{ij}=1$, and does not belong to class j if $output_{ij}=0$. In a multi-label classification task, each sample may have multiple labels with a value of 1 in the binary label tensor output. If weight is given, it is multiplied with the loss of each class.
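The formula above can be checked with a small NumPy reimplementation. This is only a sketch, independent of the MindSpore op; it uses the identity $e^{-x}/(1+e^{-x}) = \mathrm{sigmoid}(-x)$, and assumes weight rescales each class term elementwise before the per-class average, per the description above:

```python
import numpy as np

def multilabel_soft_margin_loss_np(inputs, targets, weight=None, reduction="mean"):
    """NumPy sketch of the multi-label soft margin loss formula."""
    def log_sigmoid(x):
        # log(1 / (1 + e^{-x})); fine for moderate x
        return -np.log1p(np.exp(-x))

    # -(y * log(sigmoid(x)) + (1 - y) * log(sigmoid(-x))), elementwise over (N, C)
    per_element = -(targets * log_sigmoid(inputs)
                    + (1.0 - targets) * log_sigmoid(-inputs))
    if weight is not None:
        per_element = per_element * weight      # rescale each class term
    per_sample = per_element.mean(axis=1)       # average over the C classes
    if reduction == "mean":
        return per_sample.mean()
    if reduction == "sum":
        return per_sample.sum()
    return per_sample                           # 'none': shape (N,)

inputs = np.array([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
targets = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(multilabel_soft_margin_loss_np(inputs, targets))  # ≈ 0.84693956
```

With the inputs from the Examples section below, this sketch reproduces the documented result of about 0.84693956.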

Parameters
  • input (Tensor) – A tensor of shape (N, C), where N is the batch size and C is the number of classes.

  • target (Tensor) – The label target Tensor which has the same shape as input.

  • weight (Union[Tensor, int, float]) – The manual rescaling weight given to each class. Default: None.

  • reduction (str, optional) –

    Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.

    • 'none': no reduction will be applied.

    • 'mean': compute and return the weighted mean of elements in the output.

    • 'sum': the output elements will be summed.
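The three reduction modes can be sketched with a NumPy stand-in for the per-sample losses (the array values below are illustrative assumptions, not outputs of the MindSpore op):

```python
import numpy as np

# Hypothetical per-sample losses for a batch of N = 4 (illustrative values)
per_sample = np.array([0.78, 0.92, 0.15, 0.40])

none_out = per_sample         # 'none': no reduction, shape (N,)
mean_out = per_sample.mean()  # 'mean': mean of the elements -> scalar
sum_out = per_sample.sum()    # 'sum': sum of the elements -> scalar

print(none_out.shape)   # (4,)
print(float(mean_out))  # 0.5625
print(float(sum_out))   # 2.25
```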

Returns

Tensor, with the same data type as input. If reduction is 'none', its shape is (N,); otherwise, it is a scalar.

Supported Platforms:

Ascend GPU CPU

Examples

>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> target = Tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> loss = ops.multilabel_soft_margin_loss(input, target, reduction='mean')
>>> print(loss.asnumpy())
0.84693956