mindspore.mint.special.log_softmax

mindspore.mint.special.log_softmax(input, dim=None, *, dtype=None)

Applies the Log Softmax function to the input tensor along the specified dim. For a slice \(x\) along that dim, the Log Softmax of each element \(x_i\) is computed as:

\[\text{output}(x_i) = \log \left(\frac{\exp(x_i)} {\sum_{j = 0}^{N-1}\exp(x_j)}\right),\]

where \(N\) is the length of the slice along dim.
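For intuition, this is equivalent to \(x_i - \log \sum_{j=0}^{N-1} \exp(x_j)\). A minimal NumPy sketch of the naive form (actual implementations typically subtract the maximum first for numerical stability):

>>> import numpy as np
>>> x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
>>> ref = x - np.log(np.exp(x).sum())  # log(exp(x_i) / sum_j exp(x_j))

Up to float32 rounding, ref matches the MindSpore output shown in the Examples below.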

Parameters
  • input (Tensor) – The input Tensor.

  • dim (int, optional) – The axis along which to perform the Log Softmax operation. Default: None.

Keyword Arguments

dtype (mindspore.dtype, optional) – The desired dtype of the returned Tensor. If not None, the input Tensor is cast to dtype before the operation is performed, which is useful for preventing overflows. If None, the output keeps the same dtype as the input Tensor. Default: None.
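As a hedged sketch of the dtype keyword (values illustrative): casting a float16 input to float32 carries out the reduction in higher precision, and the result is returned in float32:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> x16 = Tensor(np.array([1, 2, 3]), mindspore.float16)
>>> out = mint.special.log_softmax(x16, dim=-1, dtype=mindspore.float32)
>>> print(out.dtype)
Float32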

Returns

Tensor, with the same shape as the input.

Raises
  • TypeError – If dim is not an int.

  • ValueError – If dim is not in range [-len(input.shape), len(input.shape)).

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> logits = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> output = mint.special.log_softmax(logits, dim=-1)
>>> print(output)
[-4.4519143 -3.4519143 -2.4519143 -1.4519144 -0.4519144]
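A further sketch showing how dim selects the axis that is normalized on a 2-D input (rows with dim=1, columns with dim=0); the output shape always matches the input:

>>> logits2d = Tensor(np.array([[1, 2, 3], [4, 5, 6]]), mindspore.float32)
>>> rowwise = mint.special.log_softmax(logits2d, dim=1)  # normalize within each row
>>> colwise = mint.special.log_softmax(logits2d, dim=0)  # normalize within each column
>>> print(rowwise.shape, colwise.shape)
(2, 3) (2, 3)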