mindspore.nn.SiLU

class mindspore.nn.SiLU[source]

Sigmoid Linear Unit activation function.

Applies the SiLU (Sigmoid Linear Unit) function element-wise.

\[\text{SiLU}(x_i) = x_i * \sigma(x_i),\]

where \(x_i\) is an element of the input, and \(\sigma(x_i)\) is the Sigmoid function:

\[\text{sigmoid}(x_i) = \frac{1}{1 + \exp(-x_i)}.\]
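As a reference, the formula above can be checked with plain NumPy. The `silu` helper below is an illustrative sketch of the element-wise computation, not part of the MindSpore API:

```python
import numpy as np

def silu(x):
    # SiLU(x) = x * sigmoid(x), computed element-wise
    return x * (1.0 / (1.0 + np.exp(-x)))

x = np.array([-1.0, 2.0, -3.0, 2.0, -1.0], dtype=np.float32)
print(silu(x))  # approximately [-0.2689  1.7616 -0.1423  1.7616 -0.2689]
```

Note that for large negative inputs the output approaches 0, and for large positive inputs it approaches the identity, which is what gives SiLU its smooth, non-monotonic shape.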

SiLU Activation Function Graph:

[image: SiLU.png]
Inputs:
  • x (Tensor) - Input with the data type float16 or float32.

Outputs:

Tensor, with the same data type and shape as x.

Raises:

TypeError – If the dtype of x is neither float16 nor float32.

Supported Platforms:

Ascend GPU CPU

Examples:

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> x = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> silu = nn.SiLU()
>>> output = silu(x)
>>> print(output)
[-0.269  1.762  -0.1423  1.762  -0.269]