mindspore.ops.multilabel_soft_margin_loss
- mindspore.ops.multilabel_soft_margin_loss(input, target, weight=None, reduction='mean')[source]
Calculates the MultiLabelSoftMarginLoss. The multi-label soft margin loss is a commonly used loss function in multi-label classification tasks, where an input sample can belong to multiple classes. Given an input \(x\) and binary labels \(y\) of size \((N, C)\), where \(N\) denotes the number of samples and \(C\) denotes the number of classes, the loss for sample \(i\) is

\[\mathcal{loss}(x_i, y_i) = -\frac{1}{C}\sum_{j=1}^{C}\left(y_{ij}\log\frac{1}{1 + e^{-x_{ij}}} + (1 - y_{ij})\log\frac{e^{-x_{ij}}}{1 + e^{-x_{ij}}}\right)\]

where \(x_{ij}\) represents the predicted score of sample \(i\) for class \(j\), and \(y_{ij}\) represents the binary label of sample \(i\) for class \(j\): sample \(i\) belongs to class \(j\) if \(y_{ij} = 1\), and does not belong to class \(j\) if \(y_{ij} = 0\). For a multi-label classification task, each sample may have multiple labels with a value of 1 in the binary label \(y\). If weight is given, it will be multiplied by the loss of each class.

- Parameters
  - input (Tensor) – A tensor of shape \((N, C)\), where \(N\) is the batch size and \(C\) is the number of classes.
  - target (Tensor) – The label target Tensor which has the same shape as input.
  - weight (Union[Tensor, int, float]) – The manual rescaling weight given to each class. Default: None.
  - reduction (str, optional) – Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.
    - 'none': no reduction will be applied.
    - 'mean': compute and return the weighted mean of elements in the output.
    - 'sum': the output elements will be summed.
- Returns
  Tensor, the data type is the same as input. If reduction is 'none', its shape is \((N)\); otherwise, it is a scalar (zero-dimensional Tensor).
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]])
>>> target = Tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
>>> loss = ops.multilabel_soft_margin_loss(input, target, reduction='mean')
>>> print(loss.asnumpy())
0.84693956
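The formula above can be cross-checked against the example with a plain NumPy sketch. This is an illustrative reimplementation, not MindSpore's actual kernel: the per-sample average over \(C\) classes and the reduction handling follow the description above, and the weight handling (element-wise multiplication of the per-class terms) is an assumption based on the parameter description.

```python
import numpy as np

def multilabel_soft_margin_loss(x, y, weight=None, reduction="mean"):
    """NumPy sketch of the formula above (not MindSpore's implementation)."""
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    sig = 1.0 / (1.0 + np.exp(-x))  # sigmoid(x_ij) = 1 / (1 + e^{-x_ij})
    # per-class terms: y*log(sigmoid(x)) + (1-y)*log(1 - sigmoid(x)),
    # since e^{-x} / (1 + e^{-x}) = 1 - sigmoid(x)
    per_class = y * np.log(sig) + (1.0 - y) * np.log(1.0 - sig)
    if weight is not None:
        # assumed semantics: rescale each class's term by its weight
        per_class = per_class * np.asarray(weight, dtype=np.float64)
    loss = -per_class.mean(axis=-1)  # average over the C classes, shape (N,)
    if reduction == "mean":
        return loss.mean()
    if reduction == "sum":
        return loss.sum()
    return loss  # 'none': one loss value per sample

x = [[0.3, 0.6, 0.6], [0.9, 0.4, 0.2]]
y = [[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]]
print(multilabel_soft_margin_loss(x, y))  # ≈ 0.8469396, matching the doctest
```

With reduction='none' the sketch returns a vector of per-sample losses of shape \((N)\), mirroring the Returns section.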