mindspore.nn.SiLU
- class mindspore.nn.SiLU
Sigmoid Linear Unit activation function.
Applies the sigmoid linear unit function element-wise.
\[\text{SiLU}(x_i) = x_i * \sigma(x_i),\]
where \(x_i\) is an element of the input and \(\sigma(x_i)\) is the Sigmoid function, defined as:
\[\sigma(x_i) = \frac{1}{1 + \exp(-x_i)}.\]
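As a quick sanity check of the definition, the computation can be reproduced element-wise with plain NumPy (a minimal sketch, independent of the MindSpore operator):

>>> import numpy as np
>>> x = np.array([-1.0, 2.0, -3.0, 2.0, -1.0])
>>> x * (1.0 / (1.0 + np.exp(-x)))  # x * sigmoid(x), element-wise
array([-0.26894142,  1.76159416, -0.14227762,  1.76159416, -0.26894142])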
- Inputs:
x (Tensor) - Input of any dimension, with data type float16 or float32.
- Outputs:
Tensor, with the same data type and shape as x.
- Raises:
TypeError – If dtype of x is neither float16 nor float32.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> silu = nn.SiLU()
>>> output = silu(x)
>>> print(output)
[-0.269 1.762 -0.1423 1.762 -0.269]
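Like other activation cells, SiLU can be composed with layers inside a network. A minimal sketch follows; the nn.Dense layer sizes are illustrative assumptions, not part of this API:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # Small feed-forward stack with SiLU between two dense layers
>>> net = nn.SequentialCell([nn.Dense(4, 8), nn.SiLU(), nn.Dense(8, 2)])
>>> x = Tensor(np.ones((3, 4)), mindspore.float32)
>>> net(x).shape
(3, 2)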