mindspore.ops.Softplus
- class mindspore.ops.Softplus(*args, **kwargs)
Softplus activation function.
Softplus is a smooth approximation to the ReLU function. It can be used to constrain the output of a network to always be positive. The function is defined as follows:
\[\text{output} = \log(1 + \exp(\text{input_x}))\]
- Inputs:
input_x (Tensor) - The input tensor, whose data type must be float16 or float32.
- Outputs:
Tensor, with the same type and shape as input_x.
- Supported Platforms:
Ascend GPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> softplus = ops.Softplus()
>>> output = softplus(input_x)
>>> print(output)
[1.3132615 2.126928 3.0485873 4.01815 5.0067153]
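As a sanity check, the output above can be reproduced directly from the formula with plain NumPy. This is a minimal sketch, not part of the MindSpore API; it uses np.logaddexp(0, x), which computes log(1 + exp(x)) in a numerically stable way.

>>> import numpy as np
>>> x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
>>> # log(1 + exp(x)) computed as logaddexp(0, x) to avoid overflow for large x
>>> reference = np.logaddexp(0.0, x)
>>> np.allclose(reference, [1.3132615, 2.126928, 3.0485873, 4.01815, 5.0067153])
True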