mindspore.ops.glu

mindspore.ops.glu(x, axis=-1)[source]

Computes GLU (Gated Linear Unit activation function) of input tensors.

GLU(a, b) = a ⊗ σ(b)

where a is the first half of the input matrices and b is the second half.

Here σ is the sigmoid function, and ⊗ is the Hadamard (element-wise) product. See Language Modeling with Gated Convolutional Networks.

Parameters
  • x (Tensor) – Tensor to be split. Its dtype is Number, and its shape is (∗₁, N, ∗₂), where ∗ means any number of additional dimensions. N must be even.

  • axis (int, optional) – the axis along which to split the input. Default: -1, the last axis of x.

Returns

Tensor, the same dtype as x, with shape (∗₁, M, ∗₂), where M = N/2.

Supported Platforms:

Ascend GPU CPU

Examples

>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.1,0.2,0.3,0.4],[0.5,0.6,0.7,0.8]])
>>> output = ops.glu(input)
>>> print(output)
[[0.05744425 0.11973753]
 [0.33409387 0.41398472]]
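
The output above can be reproduced with a short NumPy sketch of the same formula: split the input into two equal halves a and b along the given axis, then return a ⊗ σ(b). This is a reference illustration, not the MindSpore implementation; the helper name glu_ref is assumed here.

```python
import numpy as np

def glu_ref(x, axis=-1):
    # Split x into two equal halves a, b along `axis`,
    # then compute a * sigmoid(b) element-wise.
    a, b = np.split(x, 2, axis=axis)
    return a * (1.0 / (1.0 + np.exp(-b)))

x = np.array([[0.1, 0.2, 0.3, 0.4],
              [0.5, 0.6, 0.7, 0.8]])
print(glu_ref(x))
```

For the example input, a is [[0.1, 0.2], [0.5, 0.6]] and b is [[0.3, 0.4], [0.7, 0.8]], which yields the same values printed above (e.g. 0.1 · σ(0.3) ≈ 0.05744425).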