mindspore.ops.log_softmax

mindspore.ops.log_softmax(logits, axis=-1)

Applies the Log Softmax function to the input tensor along the specified axis. For a slice x along the given axis, the Log Softmax of each element x_i is computed as:

output(x_i) = log(exp(x_i) / Σ_{j=0}^{N-1} exp(x_j)),

where N is the size of the Tensor along the given axis.
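The formula above is log(softmax(x)); in practice it is evaluated in the numerically stable form x_i - max(x) - log(Σ exp(x_j - max(x))). A minimal NumPy sketch of this computation (a reference for the math, not the MindSpore kernel itself):

```python
import numpy as np

def log_softmax_np(x, axis=-1):
    # Subtract the max before exponentiating to avoid overflow;
    # this does not change the result because softmax is shift-invariant.
    x = np.asarray(x, dtype=np.float32)
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

logits = np.array([1, 2, 3, 4, 5], dtype=np.float32)
print(log_softmax_np(logits))
# [-4.4519143 -3.4519143 -2.4519143 -1.4519143 -0.4519143]
```

Exponentiating the output recovers the softmax probabilities, which sum to 1 along the chosen axis.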

Parameters
  • logits (Tensor) – The input Tensor, the x in the formula above. Its shape is (N, *), where * means any number of additional dimensions, with float16 or float32 data type.

  • axis (int) – The axis along which to perform the Log Softmax operation. Default: -1.

Returns

Tensor, with the same type and shape as the logits.

Raises
  • TypeError – If axis is not an int.

  • TypeError – If dtype of logits is neither float16 nor float32.

  • ValueError – If axis is not in range [-len(logits.shape), len(logits.shape)).

  • ValueError – If dimension of logits is less than 1.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> output = ops.log_softmax(logits)
>>> print(output)
[-4.4519143 -3.4519143 -2.4519143 -1.4519144 -0.4519144]
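The example above uses the default axis=-1. For a multi-dimensional input, axis selects which dimension is normalized, and every slice along that axis is treated independently. A NumPy sketch of the axis semantics (an illustration under the formula above, not the MindSpore implementation):

```python
import numpy as np

def log_softmax_np(x, axis=-1):
    # Numerically stable log-softmax along the given axis.
    x = np.asarray(x, dtype=np.float32)
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

m = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]], dtype=np.float32)

rows = log_softmax_np(m, axis=-1)  # each row normalized independently
cols = log_softmax_np(m, axis=0)   # each column normalized independently

print(np.exp(rows).sum(axis=-1))  # each row of probabilities sums to 1
print(np.exp(cols).sum(axis=0))   # each column of probabilities sums to 1
```

Passing an axis outside [-len(logits.shape), len(logits.shape)) raises a ValueError, as listed under Raises above.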