mindspore.ops.softshrink
- mindspore.ops.softshrink(x, lambd=0.5)
Applies the Softshrink function element-wise.
\[\begin{split}\text{SoftShrink}(x) = \begin{cases} x - \lambda, & \text{ if } x > \lambda \\ x + \lambda, & \text{ if } x < -\lambda \\ 0, & \text{ otherwise } \end{cases}\end{split}\]
SoftShrink Activation Function Graph: (figure omitted)
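The piecewise definition above can be sketched as a NumPy reference implementation (an illustrative sketch only, not the MindSpore kernel; the helper name `softshrink_ref` is made up here):

```python
import numpy as np

def softshrink_ref(x, lambd=0.5):
    """NumPy sketch of SoftShrink: shrink values toward zero by lambd,
    and zero out anything whose magnitude is at most lambd."""
    return np.where(x > lambd, x - lambd,          # x > lambd  -> x - lambd
                    np.where(x < -lambd, x + lambd,  # x < -lambd -> x + lambd
                             0.0))                   # otherwise  -> 0

out = softshrink_ref(np.array([0.9, -0.9, 0.3]))
print(out)
```

With the default `lambd=0.5`, the entries 0.9 and -0.9 are shrunk by 0.5 toward zero, while 0.3 falls inside the dead zone `[-0.5, 0.5]` and maps to 0.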
- Parameters
x (Tensor) – The input of SoftShrink, with data type of float16 or float32.
lambd (float) – The λ value in the SoftShrink formulation. It should be no less than 0. Default: 0.5.
- Returns
Tensor, has the same shape and data type as x.
- Raises
TypeError – If lambd is not a float.
TypeError – If x is not a Tensor.
TypeError – If dtype of x is neither float16 nor float32.
ValueError – If lambd is less than 0.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor
>>> from mindspore import ops
>>> import numpy as np
>>> x = Tensor(np.array([[ 0.5297, 0.7871, 1.1754], [ 0.7836, 0.6218, -1.1542]]), mindspore.float16)
>>> output = ops.softshrink(x)
>>> print(output)
[[ 0.02979  0.287    0.676  ]
 [ 0.2837   0.1216  -0.6543 ]]