mindspore.nn.HSwish

class mindspore.nn.HSwish

Hard swish activation function.

Applies the hard swish activation function element-wise. The input is a Tensor of any valid shape.

Hard swish is defined as:

\[\text{hswish}(x_{i}) = x_{i} \cdot \frac{\text{ReLU6}(x_{i} + 3)}{6},\]

where \(x_{i}\) is an element of the input Tensor, and \(\text{ReLU6}(x) = \min(\max(x, 0), 6)\).
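
For reference, the formula can be checked outside MindSpore with plain NumPy, where ReLU6 amounts to clipping to the range \([0, 6]\) (a minimal sketch; the helper name hard_swish is chosen here for illustration):

>>> import numpy as np
>>> def hard_swish(x):
...     # ReLU6(x + 3) clamps x + 3 to the range [0, 6]
...     return x * np.clip(x + 3, 0, 6) / 6
>>> hard_swish(np.array([-1.0, -2.0, 0.0, 2.0, 1.0]))
array([-0.33333333, -0.33333333,  0.        ,  1.66666667,  0.66666667])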

Inputs:
  • x (Tensor) - The input of HSwish, data type must be float16 or float32. The shape is \((N, *)\) where \(*\) means any number of additional dimensions.

Outputs:

Tensor, with the same dtype and shape as x.

Raises:
  • TypeError – If dtype of x is neither float16 nor float32.

Supported Platforms:

GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> hswish = nn.HSwish()
>>> result = hswish(x)
>>> print(result)
[-0.3333  -0.3333  0  1.666  0.6665]
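
Like other activation cells, HSwish can also be used as a layer inside a network, for example in an nn.SequentialCell (a minimal sketch; the layer sizes here are arbitrary):

>>> net = nn.SequentialCell([nn.Dense(4, 8), nn.HSwish(), nn.Dense(8, 2)])
>>> out = net(Tensor(np.ones((3, 4)), mindspore.float32))
>>> print(out.shape)
(3, 2)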