mindspore.ops.Softplus
- class mindspore.ops.Softplus
Softplus activation function.
Softplus is a smooth approximation to the ReLU function. It can be used to constrain the output of a model to always be positive. The function is defined as follows:
\[\text{output} = \log(1 + \exp(\text{x}))\]
- Inputs:
input_x (Tensor) - Tensor of any dimension, with data type of float16, float32, or float64 (float64 is supported on CPU and GPU only).
- Outputs:
Tensor, with the same type and shape as input_x.
- Raises:
TypeError - If input_x is not a Tensor.
TypeError - If the dtype of input_x is not float16, float32 or float64.
- Supported Platforms:
Ascend GPU CPU
- Examples:
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> softplus = ops.Softplus()
>>> output = softplus(input_x)
>>> print(output)
[1.3132615 2.126928 3.0485873 4.01815 5.0067153]
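The printed values can be cross-checked against the formula directly. The snippet below is a minimal NumPy sketch, not part of the MindSpore API; np.log1p(np.exp(x)) evaluates log(1 + exp(x)), with log1p giving better accuracy for arguments near zero:

>>> import numpy as np
>>> x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
>>> # log(1 + exp(x)), compared against the Softplus output above
>>> np.allclose(np.log1p(np.exp(x)), [1.3132615, 2.126928, 3.0485873, 4.01815, 5.0067153])
True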