mindspore.nn.HSwish
- class mindspore.nn.HSwish[source]
Hard swish activation function.
Applies the hard swish activation element-wise. The input is a Tensor of any valid shape.
Hard swish is defined as:
\[\text{hswish}(x_{i}) = x_{i} * \frac{ReLU6(x_{i} + 3)}{6},\]where \(x_{i}\) is the \(i\)-th element of the input Tensor.
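The formula can be sketched in plain NumPy (a minimal illustration, not the MindSpore implementation; `hswish` here is a hypothetical helper, and `ReLU6` is expressed with `np.clip`):

```python
import numpy as np

def hswish(x):
    # hswish(x) = x * ReLU6(x + 3) / 6, where ReLU6(y) = min(max(y, 0), 6)
    return x * np.clip(x + 3, 0, 6) / 6

x = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])
print(hswish(x))  # matches x * relu6(x + 3) / 6 element-wise
```

Note that hswish is smooth everywhere except at the clip boundaries \(x = -3\) and \(x = 3\), equals \(0\) for \(x \le -3\), and equals \(x\) for \(x \ge 3\).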
- Inputs:
input_data (Tensor) - The input of HSwish, data type must be float16 or float32.
- Outputs:
Tensor, with the same type and shape as the input_data.
- Raises:
TypeError – If dtype of input_data is neither float16 nor float32.
- Supported Platforms:
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> hswish = nn.HSwish()
>>> result = hswish(input_x)
>>> print(result)
[-0.3333 -0.3333  0      1.666   0.6665]