mindspore.nn.SiLU

class mindspore.nn.SiLU[source]

Applies the Sigmoid Linear Unit (SiLU) function element-wise.

\[\text{SiLU}(x_i) = x_i * \sigma(x_i),\]

where \(x_i\) is an element of the input and \(\sigma(x_i)\) is the Sigmoid function:

\[\sigma(x_i) = \frac{1}{1 + \exp(-x_i)}.\]
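The formula can be checked numerically with plain NumPy, independent of MindSpore (a minimal sketch; the function name `silu` here is local, not part of any API):

```python
import numpy as np

def silu(x):
    # SiLU(x) = x * sigmoid(x) = x / (1 + exp(-x)), element-wise
    return x / (1.0 + np.exp(-x))

x = np.array([-1.0, 2.0, -3.0, 2.0, -1.0])
print(silu(x))
```

The values match the example output further below (e.g. silu(2) ≈ 1.7616, silu(-1) ≈ -0.2689), up to float16 rounding.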

SiLU Activation Function Graph:

(figure: ../../_images/SiLU.png)
Inputs:
  • input (Tensor) - \(x\) in the preceding formula. A Tensor of any dimension, with data type float16 or float32.

Outputs:

Tensor, with the same type and shape as the input.

Raises:

TypeError – If the dtype of input is neither float16 nor float32.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> input = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> silu = nn.SiLU()
>>> output = silu(input)
>>> print(output)
[-0.269  1.762  -0.1423  1.762  -0.269]