mindspore.ops.swiglu
- mindspore.ops.swiglu(input, dim=-1)
Computes SwiGLU (Swish-Gated Linear Unit activation function) of the input tensor. SwiGLU is a variant of the mindspore.ops.GLU activation function and is defined as:

SwiGLU(a, b) = Swish(a) ⊗ b

where a is the first half of the input (split along dim) and b is the second half, Swish(a) = a * sigmoid(a), sigmoid is the mindspore.ops.sigmoid() activation function, and ⊗ is the Hadamard product. A NumPy sketch of this formula is given after the Examples below.

Warning
This is an experimental API that is subject to change or deletion.

- Parameters
input (Tensor) – Tensor to be split. It has shape (*1, N, *2), where * means any number of additional dimensions and N must be even.
dim (int, optional) – the dimension along which to split the input in half. Default: -1, the last dimension.
- Returns
Tensor, the same dtype as the input, with the shape (*1, M, *2), where M = N / 2.
- Raises
TypeError – If dtype of input is not float16, float32 or bfloat16.
TypeError – If input is not a Tensor.
RuntimeError – If the dimension specified by dim is not divisible by 2.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input = Tensor([[-0.12, 0.123, 31.122], [2.1223, 4.1212121217, 0.3123]], dtype=mindspore.float32)
>>> output = ops.swiglu(input, 0)
>>> print(output)
[[-0.11970687 0.2690224 9.7194 ]]
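For reference, the formula above can be reproduced with plain NumPy; this is a minimal sketch (np_swiglu is a hypothetical helper written for illustration, not part of MindSpore) that is handy for sanity-checking results on machines without an Ascend device:

>>> import numpy as np
>>> def np_swiglu(x, dim=-1):
...     # Split x in half along `dim`: a is the first half, b the second half.
...     a, b = np.split(x, 2, axis=dim)
...     swish_a = a * (1.0 / (1.0 + np.exp(-a)))  # Swish(a) = a * sigmoid(a)
...     return swish_a * b                        # Hadamard (element-wise) product
...
>>> x = np.array([[-0.12, 0.123, 31.122], [2.1223, 4.1212121217, 0.3123]], dtype=np.float32)
>>> print(np.allclose(np_swiglu(x, 0), [[-0.11970687, 0.2690224, 9.7194]], atol=1e-4))
True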