mindspore.nn.SoftShrink
- class mindspore.nn.SoftShrink(lambd=0.5)
Applies the SoftShrink function element-wise.
\[\begin{split}\text{SoftShrink}(x) = \begin{cases} x - \lambda, & \text{ if } x > \lambda \\ x + \lambda, & \text{ if } x < -\lambda \\ 0, & \text{ otherwise } \end{cases}\end{split}\]
A plain NumPy sketch of this element-wise rule is given just before the Examples section below.
- Parameters
lambd (float) – the \(\lambda\) value in the SoftShrink formulation. It must be no less than zero. Default: 0.5.
- Inputs:
input_x (Tensor) - The input of SoftShrink, with data type of float16 or float32 and any number of dimensions.
- Outputs:
Tensor, has the same shape and data type as input_x.
- Raises:
TypeError – If lambd is not a float.
TypeError – If input_x is not a Tensor.
TypeError – If dtype of input_x is neither float16 nor float32.
ValueError – If lambd is less than 0.
- Supported Platforms:
Ascend
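For reference, the element-wise rule above can be reproduced with plain NumPy. The helper soft_shrink_ref below is a hypothetical illustration of the formula only, not part of the MindSpore API; framework usage is shown under Examples.
>>> import numpy as np
>>> def soft_shrink_ref(x, lambd=0.5):
...     # x - lambd where x > lambd, x + lambd where x < -lambd, 0 elsewhere
...     return np.where(x > lambd, x - lambd, np.where(x < -lambd, x + lambd, 0.0))
...
>>> soft_shrink_ref(np.array([0.3, 0.9, -0.9]))  # -> [0.0, 0.4, -0.4]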
Examples
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> from mindspore import dtype as mstype
>>> input_x = Tensor(np.array([[0.5297, 0.7871, 1.1754], [0.7836, 0.6218, -1.1542]]), mstype.float16)
>>> softshrink = nn.SoftShrink()
>>> output = softshrink(input_x)
>>> print(output)
[[ 0.02979  0.287    0.676  ]
 [ 0.2837   0.1216  -0.6543 ]]
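The lambd argument sets the shrink threshold; a larger value zeroes out more of the input. The variant below is a sketch based on the documented parameter, with the commented values worked out from the formula rather than captured from a run.
>>> softshrink_wide = nn.SoftShrink(lambd=1.0)
>>> output_wide = softshrink_wide(input_x)
>>> # only entries with |x| > 1.0 stay non-zero, roughly:
>>> # [[0, 0, 0.175], [0, 0, -0.154]]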