mindspore.mint.nn.functional.hardswish

mindspore.mint.nn.functional.hardswish(input)[source]

Hard Swish activation function, applied element-wise. The input is a Tensor of any valid shape.

Hard swish is defined as:

HardSwish(input) =
    0,                         if input ≤ −3,
    input,                     if input ≥ +3,
    input * (input + 3) / 6,   otherwise.
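The piecewise definition above can be sketched with a plain NumPy reference function (a hypothetical helper for illustration, not the MindSpore implementation); the three branches collapse into a single clipped expression:

```python
import numpy as np

def hardswish_ref(x):
    # clip(x + 3, 0, 6) is 0 when x <= -3, 6 when x >= 3,
    # and x + 3 in between -- reproducing the three branches.
    return x * np.clip(x + 3, 0, 6) / 6

print(hardswish_ref(np.array([-4.0, -1.0, 0.0, 1.0, 4.0])))
```

Note that for x ≥ 3 the expression reduces to x * 6 / 6 = x, and for x ≤ −3 it is x * 0 / 6 = 0, matching the first two branches exactly.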

HardSwish Activation Function Graph:

../../_images/Hardswish.png
Parameters

input (Tensor) – The input Tensor.

Returns

Tensor, with the same type and shape as the input.

Raises
  • TypeError – If input is not a Tensor.

  • TypeError – If the dtype of input is neither int nor float.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> input = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> output = mint.nn.functional.hardswish(input)
>>> print(output)
[-0.3333  -0.3333  0  1.667  0.6665]