mindspore.nn.SiLU

class mindspore.nn.SiLU[source]

Sigmoid Linear Unit activation function.

Applies the sigmoid linear unit function element-wise.

SiLU(x_i) = x_i * σ(x_i),

where x_i is an element of the input and σ is the Sigmoid function:

σ(x_i) = 1 / (1 + exp(-x_i)),

A plot of the SiLU activation function is available in the MindSpore documentation.
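As a quick illustration of the formula above (not the MindSpore implementation), the element-wise computation can be sketched in plain NumPy; the helper name numpy_silu is hypothetical and used only for this example.

>>> import numpy as np
>>> def numpy_silu(x):
...     # Element-wise SiLU: x * sigmoid(x), with sigmoid(x) = 1 / (1 + exp(-x))
...     return x * (1.0 / (1.0 + np.exp(-x)))
>>> out = numpy_silu(np.array([-1., 2., -3., 2., -1.], dtype=np.float32))
>>> # out is approximately [-0.269, 1.762, -0.142, 1.762, -0.269]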

Inputs:
  • x (Tensor) - Input with the data type float16 or float32.

Outputs:

Tensor, with the same dtype and shape as x.

Raises:

TypeError – If dtype of x is neither float16 nor float32.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> silu = nn.SiLU()
>>> output = silu(x)
>>> print(output)
[-0.269  1.762  -0.1423  1.762  -0.269]
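
Continuing the example above, the same result can be composed manually from nn.Sigmoid and element-wise Tensor multiplication, as a sanity check on the definition SiLU(x) = x * sigmoid(x); this is a minimal sketch assuming standard Tensor arithmetic, not an officially recommended pattern.

>>> sigmoid = nn.Sigmoid()
>>> manual = x * sigmoid(x)   # element-wise x * sigmoid(x), per the definition above
>>> # manual is expected to match `output` up to float16 rounding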