mindspore.ops.glu
- mindspore.ops.glu(x, axis=-1)
Computes the GLU (Gated Linear Unit) activation function of the input tensor.
\[\text{GLU}(a, b) = a \otimes \sigma(b)\]

where \(a\) is the first half of the input matrix and \(b\) is the second half. Here \(\sigma\) is the sigmoid function, and \(\otimes\) is the Hadamard (element-wise) product. See Language Modeling with Gated Convolutional Networks.
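For illustration, the formula can be reproduced directly in NumPy (a minimal reference sketch of the math above, not MindSpore's kernel; glu_ref is a hypothetical helper name):

>>> import numpy as np
>>> def glu_ref(x, axis=-1):
...     # hypothetical reference helper: split x into two equal halves along `axis`
...     a, b = np.split(x, 2, axis=axis)
...     # a * sigmoid(b), i.e. the Hadamard product with the gate
...     return a / (1.0 + np.exp(-b))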
- Parameters
  - x (Tensor) – Tensor to be split. Its shape is \((\ast_1, N, \ast_2)\), where \(\ast\) means any number of additional dimensions and \(N\) must be even.
  - axis (int, optional) – the dimension on which to split the input. Default: -1, the last dimension.
- Returns
Tensor, with the same dtype as x and shape \((\ast_1, M, \ast_2)\), where \(M = N/2\).
- Raises
  - TypeError – If x is not a Tensor.
  - RuntimeError – If the size of x along axis is not even.
- Supported Platforms:
Ascend
CPU
Examples
>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]])
>>> output = ops.glu(input)
>>> print(output)
[[0.05744425 0.11973753]
 [0.33409387 0.41398472]]
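The axis argument chooses which dimension is halved. As an illustrative shape check on the same input (assuming axis accepts any valid dimension, here the first):

>>> output0 = ops.glu(input, axis=0)
>>> print(output0.shape)
(1, 4)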