mindspore.ops.HSwish
- class mindspore.ops.HSwish[source]
Hard swish activation function.

Applies the hard swish activation element-wise. The input is a Tensor with any valid shape.

Hard swish is defined as:

\[\text{HSwish}(x_i) = x_i \cdot \frac{\text{ReLU6}(x_i + 3)}{6},\]

where \(x_i\) is an element of the input Tensor.
- Inputs:
  - input_x (Tensor) - Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions, with float16 or float32 data type.
- Outputs:
  Tensor, with the same data type and shape as input_x.
- Raises:
  - TypeError - If the data type of input_x is neither float16 nor float32.
- Supported Platforms:
  `GPU` `CPU`
Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> hswish = ops.HSwish()
>>> input_x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> result = hswish(input_x)
>>> print(result)
[-0.3333 -0.3333  0.      1.666   0.6665]
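As a sanity check, the definition above can be reproduced with plain NumPy. This is a minimal sketch, not MindSpore code; the helper name `hswish_ref` is chosen here for illustration:

```python
import numpy as np

def hswish_ref(x):
    # Hard swish: x * ReLU6(x + 3) / 6, where ReLU6(y) = min(max(y, 0), 6)
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0

x = np.array([-1, -2, 0, 2, 1], dtype=np.float32)
out = hswish_ref(x)
print(out)
```

Evaluated in float32, the exact values are [-1/3, -1/3, 0, 5/3, 2/3]; the float16 output shown in the Examples section above is the same result after half-precision rounding.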