mindspore.nn.NLLLoss
- class mindspore.nn.NLLLoss(weight=None, ignore_index=-100, reduction='mean')[source]
Gets the negative log likelihood loss between logits and labels.
The NLL loss with `reduction='none'` can be described as:

$$\ell(x, t) = L = \{l_1, \dots, l_N\}^\top, \quad l_n = -w_{t_n} x_{n, t_n}, \quad w_c = \text{weight}[c] \cdot \mathbb{1}\{c \ne \text{ignore\_index}\}$$

where $x$ is the logits, $t$ is the labels, $w$ is the weight, $N$ is the batch size, and $c \in [0, C-1]$ is the class index, with $C$ the number of classes.

If `reduction` is not `'none'` (default `'mean'`), then

$$\ell(x, t) = \begin{cases} \displaystyle\sum_{n=1}^{N} \frac{1}{\sum_{n=1}^{N} w_{t_n}} l_n, & \text{if reduction} = \text{'mean'}, \\ \displaystyle\sum_{n=1}^{N} l_n, & \text{if reduction} = \text{'sum'}. \end{cases}$$
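The per-element and reduction formulas above can be mirrored with a small NumPy sketch (NumPy is used here only to illustrate the arithmetic; this is not MindSpore's implementation, and `nll_loss` is a hypothetical helper name):

```python
import numpy as np

def nll_loss(x, t, weight=None, reduction="mean"):
    """Reference NLL loss: x is log-probabilities (N, C), t is class indices (N,)."""
    n = x.shape[0]
    w = np.ones(x.shape[1]) if weight is None else weight
    wt = w[t]                          # w_{t_n}: weight of each sample's target class
    ln = -wt * x[np.arange(n), t]      # l_n = -w_{t_n} * x_{n, t_n}
    if reduction == "none":
        return ln
    if reduction == "sum":
        return ln.sum()
    return ln.sum() / wt.sum()         # 'mean': weighted average over the batch

x = np.log(np.array([[0.7, 0.2, 0.1],
                     [0.1, 0.8, 0.1]]))
t = np.array([0, 1])
print(nll_loss(x, t))                  # -(log 0.7 + log 0.8) / 2
```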
- Parameters
  - **weight** (Tensor) – The rescaling weight applied to each class. If not None, the shape is $(C,)$, where $C$ is the number of classes. The data type only supports float32 or float16. Default: `None`.
  - **ignore_index** (int) – Specifies a target value that is ignored (typically a padding value) and does not contribute to the gradient. Default: `-100`.
  - **reduction** (str) – Apply a specific reduction method to the output: `'none'`, `'mean'`, or `'sum'`. Default: `'mean'`.
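The effect of `ignore_index` can be sketched in NumPy, assuming it acts like a zero weight for the ignored class (as in the formula $w_c = \text{weight}[c] \cdot \mathbb{1}\{c \ne \text{ignore\_index}\}$); `nll_loss_ignore` is an illustrative name, not a MindSpore API:

```python
import numpy as np

def nll_loss_ignore(x, t, ignore_index=-100):
    """Mean NLL loss that skips targets equal to ignore_index (padding)."""
    n = x.shape[0]
    mask = (t != ignore_index)
    safe_t = np.where(mask, t, 0)           # avoid indexing with -100
    ln = -x[np.arange(n), safe_t] * mask    # ignored positions contribute 0
    return ln.sum() / mask.sum()            # average over non-ignored targets only

x = np.log(np.full((3, 4), 0.25))           # uniform log-probabilities over 4 classes
t = np.array([2, -100, 1])                  # middle target is padding
print(nll_loss_ignore(x, t))                # -log 0.25, averaged over the 2 valid targets
```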
- Inputs:
  - **logits** (Tensor) - Tensor of shape $(N, C)$ or $(N, C, d_1, d_2, ..., d_K)$ for $K$-dimensional data, where $C$ = number of classes. Data type must be float16 or float32. logits needs to be log-probabilities.
  - **labels** (Tensor) - Tensor of shape $(N,)$ or $(N, d_1, d_2, ..., d_K)$ for $K$-dimensional data. Data type must be int32.
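The $K$-dimensional shape convention can be sketched with NumPy for a 1-D spatial case (a stand-in for the MindSpore call, shown only to clarify how logits and labels line up):

```python
import numpy as np

# logits: (N, C, d1) log-probabilities; labels: (N, d1) class indices.
N, C, d1 = 2, 3, 4
logits = np.log(np.full((N, C, d1), 1.0 / C))   # uniform log-probabilities
labels = np.random.randint(0, C, size=(N, d1))

# Pick the log-probability of the target class at every spatial position:
picked = np.take_along_axis(logits, labels[:, None, :], axis=1).squeeze(1)
loss = -picked.mean()                           # 'mean' reduction, no weights
print(picked.shape, loss)                       # per-position picks are (N, d1); loss is log C here
```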
- Returns
Tensor, the computed negative log likelihood loss value.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> import numpy as np
>>> logits = ms.Tensor(np.random.randn(3, 5), ms.float32)
>>> labels = ms.Tensor(np.array([1, 0, 4]), ms.int32)
>>> loss = nn.NLLLoss()
>>> output = loss(logits, labels)