mindspore.ops.softmin

mindspore.ops.softmin(x, axis=-1, *, dtype=None)[source]

Applies the Softmin function to the input tensor along the specified axis. Suppose a slice along the given axis is \(x\); then for each element \(x_i\), the Softmin function is defined as follows:

\[\text{output}(x_i) = \frac{\exp(-x_i)}{\sum_{j = 0}^{N-1}\exp(-x_j)},\]

where \(N\) is the length of the tensor along the given axis.
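The formula above can be sketched in plain NumPy. Note that `np_softmin` is a hypothetical reference helper for illustration, not part of the MindSpore API; it also shows the standard numerical-stability trick of shifting the input before exponentiation, which does not change the result.

```python
import numpy as np

def np_softmin(x, axis=-1):
    # exp(-x_i) / sum_j exp(-x_j), stabilized by shifting by the minimum
    # along `axis` (the shift cancels in the ratio).
    z = np.exp(-(x - x.min(axis=axis, keepdims=True)))
    return z / z.sum(axis=axis, keepdims=True)

x = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])
out = np_softmin(x)
print(out.round(4))  # the smallest input receives the largest weight
print(out.sum())     # the outputs form a probability distribution
```

Softmin is equivalent to Softmax applied to the negated input, which is visible directly in the formula: negating \(x\) turns \(\exp(-x_i)\) into \(\exp(x_i)\).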

Parameters
  • x (Tensor) – Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions, with float16 or float32 data type.

  • axis (Union[int, tuple[int]], optional) – The axis along which to perform the Softmin operation. Default: -1.

Keyword Arguments

dtype (mindspore.dtype, optional) – When set, x is converted to the specified dtype before execution, and the returned Tensor also has that dtype. Default: None.

Returns

Tensor, with the same dtype and shape as x.

Raises
  • TypeError – If axis is not an int or a tuple.

  • TypeError – If dtype of x is neither float16 nor float32.

  • ValueError – If axis is a tuple whose length is less than 1.

  • ValueError – If axis is a tuple whose elements are not all in range [-len(x.shape), len(x.shape)).

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> output = ops.softmin(x)
>>> print(output)
[0.2341  0.636  0.0862  0.01165  0.03168 ]
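For multi-dimensional input, the `axis` argument selects which dimension is normalized. The semantics can be sketched with a small NumPy helper (`np_softmin` is hypothetical, for illustration only, not the MindSpore implementation):

```python
import numpy as np

def np_softmin(x, axis=-1):
    # Reference formula: exp(-x_i) / sum_j exp(-x_j) along `axis`,
    # shifted by the minimum for numerical stability.
    z = np.exp(-(x - x.min(axis=axis, keepdims=True)))
    return z / z.sum(axis=axis, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [3.0, 2.0, 1.0]])
row_sums = np_softmin(x, axis=-1).sum(axis=-1)  # each row sums to 1
col_sums = np_softmin(x, axis=0).sum(axis=0)    # each column sums to 1
print(row_sums)
print(col_sums)
```

With `axis=-1` (the default) each row is normalized independently; with `axis=0` normalization runs down each column instead.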