mindspore.mint.nn.SiLU
- class mindspore.mint.nn.SiLU[source]
Calculates the SiLU activation function element-wise. It is also sometimes referred to as the Swish function.
The SiLU function is defined as follows:
\[\text{SiLU}(x_i) = x_i * \sigma(x_i),\]
where \(x_i\) is an element of the input and \(\sigma\) is the Sigmoid function:
\[\sigma(x_i) = \frac{1}{1 + \exp(-x_i)}.\]
SiLU Activation Function Graph:
Warning
This is an experimental API that is subject to change or deletion.
- Inputs:
input (Tensor) - The input of SiLU, whose elements are the \(x_i\) in the preceding formula. Tensor of any dimension with data type float16 or float32.
- Outputs:
Tensor, with the same type and shape as the input.
- Raises:
TypeError – If dtype of input is neither float16 nor float32.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> input = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> silu = mint.nn.SiLU()
>>> output = silu(input)
>>> print(output)
[-0.269 1.762 -0.1423 1.762 -0.269]
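As a quick numerical cross-check of the formula above, the same values can be reproduced with plain NumPy. This sketch only assumes numpy and is independent of the MindSpore API; it is illustrative, not part of the official example.

>>> import numpy as np
>>> x = np.array([-1, 2, -3, 2, -1], dtype=np.float32)
>>> # SiLU(x_i) = x_i * sigmoid(x_i), computed element-wise
>>> reference = x * (1.0 / (1.0 + np.exp(-x)))
>>> # Compare against the values printed by mint.nn.SiLU above (float16 rounding tolerated)
>>> print(bool(np.allclose(reference, [-0.269, 1.762, -0.1423, 1.762, -0.269], atol=1e-3)))
True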