mindspore.nn.MultiLabelSoftMarginLoss

class mindspore.nn.MultiLabelSoftMarginLoss(weight=None, reduction='mean')[source]

Calculates the MultiLabelSoftMarginLoss. The multi-label soft margin loss is a commonly used loss function in multi-label classification tasks, where a single input sample can belong to multiple classes. Given an input x and binary labels y of shape (N, C), where N denotes the number of samples and C denotes the number of classes, the loss is computed as:

\text{loss}(x, y) = -\frac{1}{N}\frac{1}{C}\sum_{i=1}^{N}\sum_{j=1}^{C}\left(y_{ij}\log\frac{1}{1+e^{-x_{ij}}}+(1-y_{ij})\log\frac{e^{-x_{ij}}}{1+e^{-x_{ij}}}\right)

where x_{ij} represents the predicted score of sample i for class j, and y_{ij} represents the corresponding binary label: sample i belongs to class j if y_{ij} = 1, and does not belong to class j if y_{ij} = 0. In a multi-label classification task, each sample may have multiple labels with a value of 1 in the binary label y. If weight is given, it is multiplied with the loss of each class.
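As a sanity check, the formula above can be reproduced with plain NumPy. This is a hedged sketch, not the MindSpore implementation; the inputs are taken from the Examples section below, with reduction='mean' and no weight:

```python
import numpy as np

# Inputs matching the Examples section: N=2 samples, C=3 classes.
x = np.array([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
y = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])

# sigmoid(x) = 1 / (1 + e^{-x}); note that
# log(1 - sigmoid(x)) equals log(e^{-x} / (1 + e^{-x})) from the formula.
sigmoid = 1.0 / (1.0 + np.exp(-x))
per_class = y * np.log(sigmoid) + (1.0 - y) * np.log(1.0 - sigmoid)

# Average over classes (1/C), then over samples (1/N), with the leading minus sign.
loss = -per_class.mean(axis=1).mean()
print(loss)  # ~0.84694, matching the example output below
```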

Parameters
  • weight (Union[Tensor, int, float]) – The manual rescaling weight given to each class. Default: None.

  • reduction (str, optional) –

    Apply a specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.

    • 'none': no reduction will be applied.

    • 'mean': compute and return the weighted mean of elements in the output.

    • 'sum': the output elements will be summed.
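The effect of the three reduction modes can be sketched in NumPy over per-sample losses. The values here are hypothetical (they match the per-sample losses of the Examples section below), and the mean shown is the unweighted case:

```python
import numpy as np

# Hypothetical per-sample losses (one value per sample, after averaging over classes).
per_sample = np.array([0.7764, 0.9175])

none_out = per_sample          # 'none': no reduction, shape (N,)
mean_out = per_sample.mean()   # 'mean': scalar average over the batch
sum_out = per_sample.sum()     # 'sum':  scalar total over the batch

print(none_out.shape, mean_out, sum_out)
```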

Inputs:
  • x (Tensor) - A tensor of shape (N,C), where N is batch size and C is number of classes.

  • target (Tensor) - The label target Tensor which has the same shape as x.

Outputs:

Tensor, with the same data type as x. If reduction is 'none', its shape is (N,); otherwise, the output is a scalar Tensor.

Raises

ValueError – If the rank of x or target is not 2.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> target = ms.Tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> loss = nn.MultiLabelSoftMarginLoss(reduction='mean')
>>> out = loss(x, target)
>>> print(out.asnumpy())
0.84693956