mindspore.ops.LogSoftmax
- class mindspore.ops.LogSoftmax(axis=-1)
Log Softmax activation function.
Applies the Log Softmax function to the input tensor on the specified axis. Given a slice along the specified axis, for each element \(x_i\), the Log Softmax function is computed as follows:
\[\text{output}(x_i) = \log \left(\frac{\exp(x_i)}{\sum_{j = 0}^{N-1}\exp(x_j)}\right),\]
where \(N\) is the length of the Tensor.
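The same values can be reproduced with plain NumPy (an illustrative reference sketch, not part of the MindSpore API; shifting by the max is a standard numerical-stability step assumed here, and it does not change the result because Log Softmax is invariant to adding a constant along the axis):

>>> import numpy as np
>>> x = np.array([1., 2., 3., 4., 5.])
>>> shifted = x - x.max()                        # stability shift; result is unchanged
>>> ref = shifted - np.log(np.exp(shifted).sum())
>>> np.round(ref, 4)
array([-4.4519, -3.4519, -2.4519, -1.4519, -0.4519])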
- Parameters
axis (int) – The axis on which to perform the Log Softmax operation. Default: -1.
- Inputs:
logits (Tensor) - Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions, with float16 or float32 data type.
- Outputs:
Tensor with the same data type and shape as logits.
- Raises
TypeError – If axis is not an int.
TypeError – If dtype of logits is neither float16 nor float32.
ValueError – If axis is not in range [-len(logits.shape), len(logits.shape)).
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> log_softmax = ops.LogSoftmax()
>>> output = log_softmax(logits)
>>> print(output)
[-4.4519143 -3.4519143 -2.4519143 -1.4519144 -0.4519144]
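As a further illustration (not taken from the original page), the axis argument selects the dimension that is normalized; with axis=0, each column of a 2-D input is normalized independently:

>>> logits = Tensor(np.array([[1, 2, 3], [4, 5, 6]]), mindspore.float32)
>>> log_softmax = ops.LogSoftmax(axis=0)
>>> output = log_softmax(logits)
>>> print(output.shape)
(2, 3)

Exponentiating the output and summing along axis 0 gives 1 for every column, consistent with the definition above.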