mindspore.ops.LogSoftmax

class mindspore.ops.LogSoftmax(axis=-1)

Log Softmax activation function.

Applies the Log Softmax function to the input tensor along the specified axis. Given a slice \(x\) along that axis, for each element \(x_i\) the Log Softmax function is computed as follows:

\[\text{output}(x_i) = \log \left(\frac{\exp(x_i)} {\sum_{j = 0}^{N-1}\exp(x_j)}\right),\]

where \(N\) is the length of the slice along the given axis.
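
The formula can be checked directly with NumPy. The following sketch is an illustration added here, not part of the operator's API; it recomputes the values printed in the Examples section, subtracting the maximum before exponentiating, a standard trick that improves numerical stability without changing the result:

>>> import numpy as np
>>> x = np.array([1., 2., 3., 4., 5.], dtype=np.float32)
>>> shifted = x - x.max()  # stabilize: exp() of large inputs would overflow
>>> manual = shifted - np.log(np.exp(shifted).sum())
>>> print(np.round(manual, 4))
[-4.4519 -3.4519 -2.4519 -1.4519 -0.4519]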

Parameters

axis (int) – The axis along which the Log Softmax operation is performed. Default: -1 (the last axis).

Inputs:
  • logits (Tensor) - The input of Log Softmax, with float16 or float32 data type.

Outputs:

Tensor, with the same dtype and shape as logits.

Raises
  • TypeError – If axis is not an int.

  • TypeError – If dtype of logits is neither float16 nor float32.

  • ValueError – If axis is not in range [-len(logits.shape), len(logits.shape)).

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> log_softmax = ops.LogSoftmax()
>>> output = log_softmax(input_x)
>>> print(output)
[-4.4519143 -3.4519143 -2.4519143 -1.4519144 -0.4519144]
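
As a further illustration (a sketch added here, assuming the imports from the example above), the axis parameter selects which dimension is normalized. With a 2-D input, the default axis=-1 normalizes each row, while axis=0 normalizes each column:

>>> x2 = Tensor(np.array([[1., 2.], [3., 4.]]), mindspore.float32)
>>> row_wise = ops.LogSoftmax()(x2)        # normalize along the last axis (rows)
>>> col_wise = ops.LogSoftmax(axis=0)(x2)  # normalize along the first axis (columns)
>>> print(row_wise.shape, col_wise.shape)
(2, 2) (2, 2)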