mindspore.mint.nn.LogSoftmax
- class mindspore.mint.nn.LogSoftmax(dim=None)[source]
Applies the Log Softmax function to the input tensor on the specified axis. Suppose a slice along the given axis is \(x\); then for each element \(x_i\), the Log Softmax function is defined as follows:
\[\text{output}(x_i) = \log \left(\frac{\exp(x_i)}{\sum_{j = 0}^{N-1}\exp(x_j)}\right),\]

where \(N\) is the length of the Tensor along the given axis.
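The formula above can be checked with a plain NumPy sketch (the helper name `log_softmax` below is ours, not part of the MindSpore API; the max-subtraction step is a standard numerical-stability trick that does not change the result):

```python
import numpy as np

def log_softmax(x, dim=-1):
    # Subtract the per-slice max before exponentiating; this is
    # mathematically a no-op for log-softmax but avoids overflow.
    shifted = x - x.max(axis=dim, keepdims=True)
    # log(exp(x_i) / sum_j exp(x_j)) = x_i - log(sum_j exp(x_j))
    return shifted - np.log(np.exp(shifted).sum(axis=dim, keepdims=True))

x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
print(log_softmax(x, dim=-1))
```

Each output row sums to 1 after exponentiation, since log-softmax is the logarithm of a probability distribution.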
- Parameters
dim (int, optional) – The axis on which to perform the Log Softmax operation. Default: None.
- Returns
Tensor, with the same shape as the input.
- Raises
ValueError – If dim is not in range [-len(input.shape), len(input.shape)).
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> log_softmax = mint.nn.LogSoftmax(dim=-1)
>>> output = log_softmax(x)
>>> print(output)
[[-5.00672150e+00 -6.72150636e-03 -1.20067215e+01]
 [-7.00091219e+00 -1.40009127e+01 -9.12250078e-04]]