mindspore.ops.glu
- mindspore.ops.glu(x, axis=-1)
Computes GLU (Gated Linear Unit activation function) of input tensors.

$$\mathrm{GLU}(a, b) = a \otimes \sigma(b)$$

where $a$ is the first half of the input matrices and $b$ is the second half. Here $\sigma$ is the sigmoid function, and $\otimes$ is the Hadamard product. See Language Modeling with Gated Convolutional Networks. (A NumPy cross-check of this computation follows the example below.)
- Parameters
  - x (Tensor) – Tensor to be split into two halves along axis. Its shape is $(\ast_1, N, \ast_2)$, where $\ast$ means any number of additional dimensions and $N$ must be even.
  - axis (int, optional) – The axis on which to split the input x. Default: -1, the last axis of x.
- Returns
Tensor, the same dtype as the x, with the shape $(\ast_1, M, \ast_2)$, where $M = N / 2$.
- Raises
  - TypeError – If dtype of x is not Number.
  - TypeError – If x is not a Tensor.
- Supported Platforms:
  Ascend GPU CPU
Examples
>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]])
>>> output = ops.glu(input)
>>> print(output)
[[0.05744425 0.11973753]
 [0.33409387 0.41398472]]
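For readers who want to verify the formula by hand, the following NumPy sketch reproduces the same computation. It is a minimal illustrative reference under the formula above, not MindSpore's implementation; the helper name glu_ref is an assumption made here.

>>> import numpy as np
>>> def glu_ref(x, axis=-1):
...     # Split x into two equal halves a and b along `axis`,
...     # then take the Hadamard product of a with sigmoid(b).
...     a, b = np.split(x, 2, axis=axis)
...     return a * (1.0 / (1.0 + np.exp(-b)))
...
>>> x = np.array([[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]])
>>> print(glu_ref(x))  # matches the float32 output above up to precision

Passing a different axis splits along that dimension instead; whichever axis is chosen, its size must be even so the input can be halved.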