mindspore.nn.MultiLabelSoftMarginLoss
- class mindspore.nn.MultiLabelSoftMarginLoss(weight=None, reduction='mean')[source]
Calculates the MultiLabelSoftMarginLoss. The multi-label soft margin loss is a commonly used loss function in multi-label classification tasks, where an input sample can belong to multiple classes. Given an input $x$ and binary labels $y$ of size $(N, C)$, where $N$ denotes the number of samples and $C$ denotes the number of classes, the loss for sample $i$ is

$$\mathcal{loss}_i(x, y) = -\frac{1}{C}\sum_{j=1}^{C}\left(y_{ij}\log\frac{1}{1+e^{-x_{ij}}}+(1-y_{ij})\log\frac{e^{-x_{ij}}}{1+e^{-x_{ij}}}\right)$$

where $x_{ij}$ represents the predicted score of sample $i$ for class $j$, and $y_{ij}$ represents the binary label of sample $i$ for class $j$: sample $i$ belongs to class $j$ if $y_{ij}=1$, and does not belong to class $j$ if $y_{ij}=0$. For a multi-label classification task, each sample may have multiple labels with a value of 1 in the binary label $y$. If given, weight rescales the loss of each class.
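To make the formula concrete, the following sketch recomputes the loss by hand in plain NumPy (used here only for illustration; this is not the library implementation) and reproduces the result of the Examples section below under the default reduction='mean':

>>> import numpy as np
>>> x = np.array([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> y = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> log_sig = -np.log1p(np.exp(-x))   # log(1 / (1 + e^-x))
>>> log_neg = -x + log_sig            # log(e^-x / (1 + e^-x))
>>> per_sample = -(y * log_sig + (1 - y) * log_neg).mean(axis=1)  # mean over the C classes
>>> print(per_sample)         # one loss per sample; this is what reduction='none' returns
>>> print(per_sample.mean())  # ≈ 0.8469, matching the Examples output below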
- Parameters
  - weight (Union[Tensor, int, float]) – The manual rescaling weight given to each class. Default: None.
  - reduction (str, optional) – Apply a specific reduction method to the output: 'none', 'mean' or 'sum' (see the sketch after this list). Default: 'mean'.
    - 'none': no reduction will be applied.
    - 'mean': compute and return the weighted mean of the elements in the output.
    - 'sum': the output elements will be summed.
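The three reduction modes differ only in how the per-sample losses are aggregated, as the following sketch illustrates (it reuses the inputs from the Examples section below; the printed shapes follow the Outputs description):

>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> target = ms.Tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> for reduction in ('none', 'mean', 'sum'):
...     loss = nn.MultiLabelSoftMarginLoss(reduction=reduction)
...     print(reduction, loss(x, target).shape)  # 'none' -> (2,); 'mean' and 'sum' -> ()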
- Inputs:
  - x (Tensor) - A tensor of shape $(N, C)$, where N is the batch size and C is the number of classes.
  - target (Tensor) - The label target Tensor which has the same shape as x.
- Outputs:
  - Tensor, the data type is the same as x. If the reduction is 'none', its shape is (N); otherwise, it is a scalar.
- Raises
  - ValueError – If the rank of x or target is not 2.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> target = ms.Tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> loss = nn.MultiLabelSoftMarginLoss(reduction='mean')
>>> out = loss(x, target)
>>> print(out.asnumpy())
0.84693956