mindspore.mint.nn.functional.softmax
- mindspore.mint.nn.functional.softmax(input, dim=None, dtype=None)[source]
Applies the Softmax operation to the input tensor on the specified axis. Suppose a slice in the given axis is $x$; then for each element $x_i$, the Softmax function is:

$$\text{softmax}(x_i) = \frac{\exp(x_i)}{\sum_{j=0}^{N-1} \exp(x_j)}$$

where $N$ is the length of the tensor.
- Parameters
  - input (Tensor) – Tensor of shape $(N, *)$, where $*$ means any number of additional dimensions.
  - dim (int, optional) – The dim along which to perform the Softmax operation. Default: None.
  - dtype (mindspore.dtype, optional) – When set, input is converted to the specified dtype before execution, and the dtype of the returned Tensor is also dtype. Default: None.
- Returns
Tensor, with the same shape as the input, and the same type unless dtype is specified.
- Raises
TypeError – If dim is not an int.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> input = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> output = mint.nn.functional.softmax(input)
>>> print(output)
[0.01165623 0.03168492 0.08612854 0.23412167 0.6364086 ]
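The formula above can be sketched in plain NumPy (a reference sketch of the Softmax semantics, not the MindSpore implementation; `softmax_ref` is a hypothetical helper name):

```python
import numpy as np

def softmax_ref(x, dim=-1):
    # Reference sketch of the formula above:
    # softmax(x_i) = exp(x_i) / sum_j exp(x_j), taken along `dim`.
    # Subtracting the per-slice max is a standard numerical-stability
    # step; it leaves the result unchanged mathematically.
    shifted = x - np.max(x, axis=dim, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=dim, keepdims=True)

x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
print(softmax_ref(x))  # close to the doctest output above

# With a 2-D input, `dim` selects which axis each slice is
# normalized over, mirroring the `dim` parameter described above.
m = np.array([[1.0, 2.0], [3.0, 4.0]], dtype=np.float32)
print(softmax_ref(m, dim=1))  # each row sums to 1
```

Each slice along `dim` sums to 1, which is why Softmax outputs are commonly read as probabilities.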