mindspore.ops.Softmax
- class mindspore.ops.Softmax(axis=-1)[source]
Softmax operation.
Applies the Softmax operation to the input tensor on the specified axis. Suppose a slice in the given axis is \(x\); then for each element \(x_i\), the Softmax function is as follows:
\[\text{output}(x_i) = \frac{\exp(x_i)}{\sum_{j=0}^{N-1} \exp(x_j)},\]
where \(N\) is the length of the tensor.
- Parameters:
axis (Union[int, tuple]) – The axis to perform the Softmax operation. Default: -1.
- Inputs:
logits (Tensor) - Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions, with float16 or float32 data type.
- Outputs:
Tensor, with the same type and shape as the logits.
- Raises:
TypeError – If axis is neither an int nor a tuple.
TypeError – If dtype of logits is neither float16 nor float32.
ValueError – If axis is a tuple whose length is less than 1.
ValueError – If axis is a tuple whose elements are not all in the range [-len(logits.shape), len(logits.shape)) (see the sketch after this list).
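The half-open range above means that, for a tensor with n dimensions, valid axes run from -n through n - 1. A tiny stand-alone helper (hypothetical, not part of MindSpore) mirrors the check:
>>> def axis_in_range(axis, ndim):
...     # Hypothetical helper: valid axes lie in [-ndim, ndim).
...     return -ndim <= axis < ndim
>>> axis_in_range(-2, 2), axis_in_range(2, 2)
(True, False)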
- Supported Platforms:
Ascend GPU CPU
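For intuition, the formula above can be cross-checked with a plain NumPy sketch (not MindSpore code; softmax_ref and the max-subtraction stabilization are illustrative assumptions, not part of this API):
>>> import numpy as np
>>> def softmax_ref(x, axis=-1):
...     # Subtracting the max is a standard numerical-stability trick;
...     # Softmax is unchanged by adding a constant to every element.
...     shifted = x - np.max(x, axis=axis, keepdims=True)
...     return np.exp(shifted) / np.sum(np.exp(shifted), axis=axis, keepdims=True)
>>> probs = softmax_ref(np.array([1., 2., 3., 4., 5.], dtype=np.float32))
>>> # probs sums to 1 and matches the MindSpore example output below.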
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> softmax = ops.Softmax()
>>> output = softmax(logits)
>>> print(output)
[0.01165623 0.03168492 0.08612854 0.23412167 0.6364086 ]
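Since axis selects the dimension to normalize, a further hedged sketch (reusing the imports above) applies Softmax down the columns of a 2-D tensor:
>>> logits_2d = Tensor(np.array([[1., 2., 3.], [4., 5., 6.]]), mindspore.float32)
>>> softmax_axis0 = ops.Softmax(axis=0)
>>> output_2d = softmax_axis0(logits_2d)
>>> # Each of the three columns of output_2d sums to 1.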