mindspore.nn.HShrink

class mindspore.nn.HShrink(lambd=0.5)

Applies the Hard Shrink activation function element-wise.

The formula is defined as follows:

\[\begin{split}\text{HardShrink}(x) = \begin{cases} x, & \text{ if } x > \lambda \\ x, & \text{ if } x < -\lambda \\ 0, & \text{ otherwise } \end{cases}\end{split}\]

HShrink Activation Function Graph: (see ../../_images/HShrink.png)
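
As a quick sanity check of the formula, the same element-wise mapping can be reproduced with NumPy and compared against nn.HShrink. This is a minimal illustrative sketch with arbitrary sample values, not part of the official example:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = np.array([0.3, -0.3, 0.7, -0.7], dtype=np.float32)
>>> # NumPy reference: keep entries whose magnitude exceeds lambda (0.5 by default), zero the rest
>>> reference = np.where(np.abs(x) > 0.5, x, 0.0).astype(np.float32)
>>> output = nn.HShrink()(Tensor(x, mindspore.float32))
>>> print(np.allclose(output.asnumpy(), reference))
True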
Parameters

lambd (number, optional) – The threshold \(\lambda\) defined by the Hard Shrink formula. Default: 0.5. A non-default threshold is shown in the last example below.

Inputs:
  • input (Tensor) - The input of Hard Shrink. Supported dtypes:

    • Ascend: float16, float32, bfloat16.

    • CPU/GPU: float16, float32.

Outputs:

Tensor, the same shape and data type as the input.

Raises
  • TypeError – If lambd is not a float, int or bool.

  • TypeError – If input is not a tensor.

  • TypeError – If dtype of input is not float16, float32 or bfloat16.
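
For instance, per the Raises list above, an input tensor with an unsupported dtype such as int32 is expected to be rejected with a TypeError (illustrative sketch only):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> try:
...     _ = nn.HShrink()(Tensor(np.array([1, 2, 3]), mindspore.int32))
... except TypeError:
...     print("int32 input rejected with TypeError")
int32 input rejected with TypeError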

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> input = Tensor(np.array([[0.5, 1, 2.0], [0.0533, 0.0776, -2.1233]]), mindspore.float32)
>>> hshrink = nn.HShrink()
>>> output = hshrink(input)
>>> print(output)
[[ 0.      1.      2.    ]
 [ 0.      0.     -2.1233]]
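
Continuing the example above, the threshold can be changed through lambd. With lambd=1.0, entries whose magnitude does not exceed 1.0 are also zeroed (illustrative sketch, expected output shown):

>>> hshrink_1 = nn.HShrink(lambd=1.0)
>>> output_1 = hshrink_1(input)
>>> print(output_1)
[[ 0.      0.      2.    ]
 [ 0.      0.     -2.1233]]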