mindspore.ops.Softmax

class mindspore.ops.Softmax(axis=-1)

Softmax operation.

Applies the Softmax operation to the input tensor on the specified axis. Suppose a slice along the given axis is \(x\); then for each element \(x_i\), the Softmax function is defined as follows:

\[\text{output}(x_i) = \frac{\exp(x_i)}{\sum_{j = 0}^{N-1}\exp(x_j)},\]

where \(N\) is the length of the tensor along the given axis.
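For intuition, the formula can be checked directly with NumPy (a minimal sketch, independent of the MindSpore operator; subtracting the maximum before exponentiating is a common numerical-stability trick that does not change the result):

>>> import numpy as np
>>> x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
>>> shifted = x - x.max()  # stabilizes exp() without changing the output
>>> probs = np.exp(shifted) / np.exp(shifted).sum()
>>> expected = np.array([0.01165623, 0.03168492, 0.08612854, 0.23412167, 0.6364086], dtype=np.float32)
>>> print(np.allclose(probs, expected))  # matches the operator output in Examples below
True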

Parameters

axis (Union[int, tuple]) – The axis or axes along which to perform the Softmax operation. Default: -1.

Inputs:
  • logits (Tensor) - The input of Softmax, with float16 or float32 data type.

Outputs:

Tensor, with the same type and shape as the logits.

Raises
  • TypeError – If axis is neither an int nor a tuple.

  • TypeError – If dtype of logits is neither float16 nor float32.

  • ValueError – If axis is a tuple whose length is less than 1.

  • ValueError – If axis is a tuple whose elements are not all in range [-len(logits.shape), len(logits.shape)) (see the sketch below).
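A minimal sketch of the tuple-axis ValueError above, assuming the imports from the Examples section below; the exact message and the point at which the check fires may vary across MindSpore versions:

>>> try:
...     _ = ops.Softmax(axis=(5,))(Tensor(np.ones((2, 3)), mindspore.float32))
... except ValueError:
...     print("axis element out of range")
axis element out of range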

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> softmax = ops.Softmax()
>>> output = softmax(input_x)
>>> print(output)
[0.01165623 0.03168492 0.08612854 0.23412167 0.6364086 ]
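As a follow-up usage sketch, a non-default axis applies Softmax along that dimension; here each column of a 2-D input is normalized (the column values in the comment are approximate):

>>> x2 = Tensor(np.array([[1, 2, 3], [4, 5, 6]]), mindspore.float32)
>>> softmax_axis0 = ops.Softmax(axis=0)
>>> out2 = softmax_axis0(x2)
>>> print(out2.shape)
(2, 3)
>>> # Each column now sums to 1; e.g. column 0 is approximately [0.0474, 0.9526].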