mindspore.nn.HSwish
- class mindspore.nn.HSwish[source]
Applies the Hard Swish (hswish) activation function element-wise. The input is a Tensor with any valid shape.
Hard swish is defined as:
\[\text{hswish}(x_{i}) = x_{i} * \frac{\text{ReLU6}(x_{i} + 3)}{6}\]
HSwish Activation Function Graph: (figure omitted)
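For reference, the formula can be reproduced directly with NumPy. The helper hswish_ref below is an illustrative sketch, not part of the MindSpore API:

>>> import numpy as np
>>> def hswish_ref(x):
...     # ReLU6(x + 3) = min(max(x + 3, 0), 6)
...     return x * np.clip(x + 3, 0, 6) / 6
>>> print(hswish_ref(np.array([-1., -2., 0., 2., 1.])))
[-0.33333333 -0.33333333  0.          1.66666667  0.66666667]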
- Inputs:
x (Tensor) - The input of HSwish, data type must be float16 or float32. The shape is \((N,*)\), where \(*\) means any number of additional dimensions.
- Outputs:
Tensor, with the same data type and shape as x.
- Raises:
TypeError – If dtype of x is neither float16 nor float32.
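For example, a tensor with an integer dtype is rejected; a minimal sketch assuming PyNative execution (the exact error message may vary across versions):

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> try:
...     _ = nn.HSwish()(Tensor(np.array([1, 2, 3]), mindspore.int32))
... except TypeError:
...     print("TypeError raised as documented")
TypeError raised as documented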
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> hswish = nn.HSwish()
>>> result = hswish(x)
>>> print(result)
[-0.3333 -0.3333  0.      1.667   0.6665]