mindspore.nn.PoissonNLLLoss

class mindspore.nn.PoissonNLLLoss(log_input=True, full=False, eps=1e-08, reduction='mean')

Poisson negative log likelihood loss.

The loss is:

$\mathcal{L}_{D} = \sum_{i = 0}^{|D|} \left( x_i - y_i \ln x_i + \ln{y_i!} \right)$

where $\mathcal{L}_{D}$ is the loss, $y_i$ is the target, and $x_i$ is the input.

If log_input is True, $e^{x_i} - y_i x_i$ is used instead of $x_i - y_i \ln x_i$. When calculating logarithms, the lower bound of the input is set to eps to avoid numerical errors.

If full is False, the last term $\ln{y_i!}$ is omitted; otherwise it is approximated using the Stirling formula:

$n! \approx \sqrt{2\pi n} \left( \frac{n}{e} \right)^{n}$
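
As a rough illustration of the elementwise formula above, the unreduced loss can be sketched in NumPy. This is not the library implementation: the helper name poisson_nll_elementwise is made up, and the Stirling term is applied unconditionally here, whereas the actual operator may handle small targets differently.

>>> import numpy as np
>>> def poisson_nll_elementwise(x, y, log_input=True, full=False, eps=1e-8):
...     # log_input=True: e^x - y*x; log_input=False: x - y*ln(max(x, eps))
...     if log_input:
...         loss = np.exp(x) - y * x
...     else:
...         loss = x - y * np.log(np.maximum(x, eps))
...     if full:
...         # Stirling approximation of ln(y!): 0.5*ln(2*pi*y) + y*ln(y) - y
...         # (illustrative only; applied to every element here)
...         loss = loss + 0.5 * np.log(2 * np.pi * y) + y * np.log(y) - y
...     return loss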

Note

On Ascend, computing the logarithm of a negative number or the exponent of a large positive number has a different supported range of return values, and the results may differ from those on GPU and CPU.

Parameters
  • log_input (bool, optional) – Whether to use log input. Default: True.

  • full (bool, optional) – Whether to include the Stirling approximation term in the loss calculation. Default: False.

  • eps (float, optional) – Lower bound of input when calculating logarithms. Default: 1e-08.

  • reduction (str, optional) –

    Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.

    • 'none': no reduction will be applied.

    • 'mean': compute and return the mean of elements in the output.

    • 'sum': the output elements will be summed.

Inputs:
  • input (Tensor) - The input Tensor. The shape can be any number of dimensions.

  • target (Tensor) - The label Tensor which has the same shape as input.

Outputs:

Tensor or Scalar. If reduction is 'none', the output is a Tensor with the same shape as input; otherwise, it is a scalar.
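
For example, the effect of reduction on the output shape can be checked as below; the element values are omitted, and the shapes noted in the comments follow from the description above.

>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.7], [0.5, 0.5]])
>>> target = ms.Tensor([[1.0, 2.0], [3.0, 4.0]])
>>> unreduced = nn.PoissonNLLLoss(reduction='none')(x, target)
>>> print(unreduced.shape)  # (2, 2): same shape as the input
>>> reduced = nn.PoissonNLLLoss(reduction='sum')(x, target)
>>> print(reduced.shape)  # (): a scalar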

Raises
  • TypeError – If reduction is not a str.

  • TypeError – If input or target is not a Tensor.

  • TypeError – If dtype of input or target is not currently supported.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore as ms
>>> import mindspore.nn as nn
>>> x = ms.Tensor([[0.3, 0.7], [0.5, 0.5]])
>>> target = ms.Tensor([[1.0, 2.0], [3.0, 4.0]])
>>> loss = nn.PoissonNLLLoss()
>>> output = loss(x, target)
>>> print(output.asnumpy())
0.3652635
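
With the defaults (log_input=True, full=False, reduction='mean'), this value can be reproduced by hand as mean(exp(x) - target * x). A quick NumPy cross-check (an illustrative addition, not part of the official example):

>>> import numpy as np
>>> manual = np.mean(np.exp(x.asnumpy()) - target.asnumpy() * x.asnumpy())
>>> # manual is approximately 0.3652635, matching the output above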