mindspore.nn.PoissonNLLLoss
- class mindspore.nn.PoissonNLLLoss(log_input=True, full=False, eps=1e-08, reduction='mean')
Poisson negative log likelihood loss.
The loss is:

$$\mathcal{L}_{D} = \sum_{i=0}^{|D|}\left( x_{i} - y_{i}\ln x_{i} + \ln(y_{i}!) \right)$$

where $\mathcal{L}_{D}$ is the loss, $y_{i}$ is the target and $x_{i}$ is the input. If log_input is True, $e^{x_{i}} - y_{i} x_{i}$ is used instead of $x_{i} - y_{i}\ln x_{i}$. When calculating logarithms, the lower bound of the input is set to eps to avoid numerical errors. If full is False, the last term $\ln(y_{i}!)$ will be omitted; otherwise it will be approximated using the Stirling formula:

$$n! \approx \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n}$$

Note
On Ascend, calculating the logarithm of a negative number or the exponential of a large positive number has a different range of return values, and the results differ from those on GPU and CPU.
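As an informal cross-check of the definition above (not MindSpore's implementation), the NumPy sketch below evaluates the per-element term in both input modes with full=False, clamping the rate at eps before taking the logarithm as described; the first value matches the result in the Examples section below:

>>> import numpy as np
>>> x = np.array([[0.3, 0.7], [0.5, 0.5]])
>>> y = np.array([[1.0, 2.0], [3.0, 4.0]])
>>> # log_input=True: the input is a log-rate, per-element term e^x - y*x
>>> print(round(float(np.mean(np.exp(x) - y * x)), 7))
0.3652635
>>> # log_input=False: the input is a rate, per-element term x - y*ln(max(x, eps))
>>> print(round(float(np.mean(x - y * np.log(np.maximum(x, 1e-8)))), 4))
2.1923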
- Parameters
  - log_input (bool, optional) – Whether to use log input. Default: True.
  - full (bool, optional) – Whether to include the Stirling approximation term in the loss calculation. Default: False.
  - eps (float, optional) – Lower bound of the input when calculating logarithms. Default: 1e-08.
  - reduction (str, optional) – Apply a specific reduction method to the output: 'none', 'mean' or 'sum'. Default: 'mean'. The effect of each mode is sketched below this list.
    - 'none': no reduction will be applied.
    - 'mean': compute and return the mean of elements in the output.
    - 'sum': the output elements will be summed.
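A quick illustration of how the reduction modes change the output shape (illustrative sketch, not verified doctest output; the tensors reuse the values from the Examples section below):

>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.7], [0.5, 0.5]])
>>> target = ms.Tensor([[1.0, 2.0], [3.0, 4.0]])
>>> # 'none' keeps one loss value per element; 'mean' and 'sum' reduce to a scalar
>>> nn.PoissonNLLLoss(reduction='none')(x, target).shape
(2, 2)
>>> nn.PoissonNLLLoss(reduction='mean')(x, target).shape
()
>>> nn.PoissonNLLLoss(reduction='sum')(x, target).shape
()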
- Inputs:
  - input (Tensor) - The input Tensor. The shape can be any number of dimensions.
  - target (Tensor) - The label Tensor, which has the same shape as input.
- Outputs:
  Tensor or Scalar. If reduction is 'none', the output is a tensor with the same shape as input; otherwise it is a scalar.
- Raises
- Supported Platforms:
  Ascend GPU CPU
Examples
>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.7], [0.5, 0.5]])
>>> target = ms.Tensor([[1.0, 2.0], [3.0, 4.0]])
>>> loss = nn.PoissonNLLLoss()
>>> output = loss(x, target)
>>> print(output.asnumpy())
0.3652635
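A further sketch with a non-default argument, where log_input=False treats the input as a rate rather than a log-rate (illustrative usage following the formula above; the commented value is not verified doctest output):

>>> loss_rate = nn.PoissonNLLLoss(log_input=False)
>>> output_rate = loss_rate(x, target)
>>> print(output_rate.asnumpy())   # roughly 2.1923 by the formula x - y*ln(max(x, eps))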