mindspore.ops.Softplus
- class mindspore.ops.Softplus
Softplus activation function.
Softplus is a smooth approximation to the ReLU function. It can be used to constrain the output of a network to always be positive. The function is defined as follows:
\[\text{output} = \log(1 + \exp(\text{x}))\]
- Inputs:
input_x (Tensor) - Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions, with float16 or float32 data type.
- Outputs:
Tensor, with the same type and shape as input_x.
- Raises:
TypeError - If input_x is not a Tensor.
TypeError - If the dtype of input_x is neither float16 nor float32.
- Supported Platforms:
Ascend GPU CPU
- Examples:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> softplus = ops.Softplus()
>>> output = softplus(input_x)
>>> print(output)
[1.3132615 2.126928 3.0485873 4.01815 5.0067153]
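As a cross-check of the formula above, the following standalone NumPy sketch computes \(\log(1 + \exp(\text{x}))\) for the same input. It is a reference computation, not the MindSpore kernel; np.logaddexp(0, x) is used because it evaluates \(\log(\exp(0) + \exp(\text{x}))\) without overflowing for large x.

import numpy as np

x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
# softplus(x) = log(1 + exp(x)), computed stably as logaddexp(0, x)
y = np.logaddexp(0.0, x)
print(y)  # should closely match the MindSpore output above, up to float32 rounding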