mindspore.nn.Softmax
- class mindspore.nn.Softmax(axis=-1)
Softmax activation function.
Applies the Softmax function to an n-dimensional input Tensor.
It rescales the input Tensor of logits by taking the exponential of each element and normalizing, so that the outputs lie in the range [0, 1] and sum to 1.
Softmax is defined as:
\[\text{softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_{j=0}^{n-1}\exp(x_j)},\]
where \(x_{i}\) is the \(i\)-th slice in the given dimension of the input Tensor.
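For intuition, the formula can be checked directly with NumPy, independently of MindSpore (a minimal sketch on the same logits as the Examples below; subtracting the maximum is a standard numerical-stability step that leaves the result unchanged):

>>> import numpy as np
>>> x = np.array([-1., -2., 0., 2., 1.])
>>> e = np.exp(x - x.max())  # stable: softmax(x) == softmax(x - c) for any constant c
>>> print(np.round(e / e.sum(), 4))
[0.0317 0.0117 0.0861 0.6364 0.2341]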
- Parameters
axis (Union[int, tuple[int]]) – The axis along which to apply the Softmax operation; -1 means the last dimension. Default: -1.
- Inputs:
x (Tensor) - The input of Softmax with data type of float16 or float32.
- Outputs:
Tensor, which has the same type and shape as x, with values in the range [0, 1].
- Raises
TypeError – If axis is neither an int nor a tuple.
TypeError – If dtype of x is neither float16 nor float32.
ValueError – If axis is a tuple whose length is less than 1.
ValueError – If axis is a tuple whose elements are not all in range [-len(x.shape), len(x.shape)).
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> softmax = nn.Softmax()
>>> output = softmax(input_x)
>>> print(output)
[0.03168 0.01166 0.0861 0.636 0.2341 ]
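The axis parameter selects which dimension is normalized; for a 2-D input, axis=0 makes each column sum to 1. A sketch using the same public API (the output values below are computed by hand from the formula, so exact float32 formatting may differ):

>>> x2 = Tensor(np.array([[1., 2.], [3., 4.]]), mindspore.float32)
>>> softmax_axis0 = nn.Softmax(axis=0)
>>> print(softmax_axis0(x2))
[[0.11920293 0.11920293]
 [0.880797   0.880797  ]]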