mindspore.ops.silu
- mindspore.ops.silu(input)[source]
Computes Sigmoid Linear Unit of input element-wise, also known as the Swish function. The SiLU function is defined as:

    SiLU(x_i) = x_i * sigmoid(x_i)

where x_i is an element of the input and sigmoid is the Sigmoid function:

    sigmoid(x_i) = 1 / (1 + exp(-x_i))

SiLU Function Graph: (figure omitted)
- Parameters
input (Tensor) – input is x_i in the preceding formula, with data type float16 or float32.
- Returns
Tensor, with the same type and shape as the input.
- Raises
TypeError – If dtype of input is neither float16 nor float32.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> import numpy as np
>>> input = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> output = ops.silu(input)
>>> print(output)
[-0.269 1.762 -0.1423 1.762 -0.269]
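For readers without a MindSpore install, the formula above can be cross-checked with a minimal pure-Python sketch; `silu_ref` is a hypothetical helper name, not part of the MindSpore API:

```python
import math

def silu_ref(x):
    """Reference SiLU: x * sigmoid(x), computed in float64."""
    return x * (1.0 / (1.0 + math.exp(-x)))

values = [-1.0, 2.0, -3.0, 2.0, -1.0]
# Matches the float16 doctest output above up to rounding:
print([round(silu_ref(v), 4) for v in values])
# [-0.2689, 1.7616, -0.1423, 1.7616, -0.2689]
```

The small differences from the doctest output (e.g. -0.2689 vs. -0.269) come from the float16 precision used in the example, not from the formula.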