mindspore.ops.GeLU
- class mindspore.ops.GeLU(*args, **kwargs)
Gaussian Error Linear Units activation function.
GeLU is described in the paper Gaussian Error Linear Units (GELUs). See also BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
GeLU is defined as follows:
\[\text{output} = 0.5 * x * (1 + \operatorname{erf}(x / \sqrt{2})),\]
where \(\operatorname{erf}\) is the Gauss error function.
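As a sanity check, this formula can be reproduced with plain NumPy and Python's math module. The sketch below is only a reference computation, not the MindSpore kernel (which may apply a faster internal approximation), so it matches the Examples section to roughly three decimal places:
>>> import math
>>> import numpy as np
>>> # Element-wise reference: 0.5 * x * (1 + erf(x / sqrt(2)))
>>> x = np.array([1.0, 2.0, 3.0], dtype=np.float32)
>>> ref = 0.5 * x * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))
>>> np.allclose(ref, [0.841192, 1.9545976, 2.9963627], atol=1e-3)
True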
- Inputs:
input_x (Tensor) - Input on which to compute GeLU, with data type float16 or float32.
- Outputs:
Tensor, with the same type and shape as input_x.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> tensor = Tensor(np.array([1.0, 2.0, 3.0]), mindspore.float32)
>>> gelu = ops.GeLU()
>>> result = gelu(tensor)
>>> print(result)
[0.841192 1.9545976 2.9963627]
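For use inside a network definition, MindSpore also provides a Cell-level wrapper with the same behavior; a brief sketch, assuming a MindSpore version that ships mindspore.nn.GELU (reusing tensor from the example above):
>>> from mindspore import nn
>>> gelu_cell = nn.GELU()
>>> output = gelu_cell(tensor)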