mindspore.mint.nn.LogSoftmax

class mindspore.mint.nn.LogSoftmax(dim=None)

Applies the Log Softmax function to the input tensor on the specified axis. Suppose a slice along the given axis; then, for each element x_i, the Log Softmax function is shown as follows:

\text{output}(x_i) = \log\left(\frac{\exp(x_i)}{\sum_{j=0}^{N-1} \exp(x_j)}\right),

where N is the length of the Tensor along the specified axis.
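As a quick cross-check of the formula (a NumPy sketch, not part of the official API), the same values can be reproduced by subtracting the log of the per-slice sum of exponentials; shifting by the slice maximum is only a numerical-stability detail and does not change the result:

>>> import numpy as np
>>> x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
>>> shifted = x - np.max(x, axis=-1, keepdims=True)  # stabilize before exp
>>> ref = shifted - np.log(np.sum(np.exp(shifted), axis=-1, keepdims=True))

Up to floating-point rounding, ref matches the output shown in the Examples section below.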

Parameters

dim (int, optional) – The axis along which to apply the Log Softmax operation. Default: None.

Returns

Tensor, with the same shape as the input.

Raises

ValueError – If dim is not in range [-len(input.shape), len(input.shape)).

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> log_softmax = mint.nn.LogSoftmax(dim=-1)
>>> output = log_softmax(x)
>>> print(output)
[[-5.00672150e+00 -6.72150636e-03 -1.20067215e+01]
 [-7.00091219e+00 -1.40009127e+01 -9.12250078e-04]]
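As a follow-up sketch (not part of the official examples), passing dim=0 normalizes along the first axis instead of the last; per the Returns section, the output keeps the input shape:

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> log_softmax_dim0 = mint.nn.LogSoftmax(dim=0)
>>> print(log_softmax_dim0(x).shape)
(2, 3)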