mindspore.ops.GeLU
- class mindspore.ops.GeLU[source]
Gaussian Error Linear Units activation function.
GeLU is described in the paper Gaussian Error Linear Units (GELUs). It is also used in BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
GeLU is defined as follows:
\[\text{output} = x * P(X \le x) \approx 0.5 * x * (1 + \tanh(\sqrt{2 / \pi} * (x + 0.044715 * x^{3}))),\]where \(X\) follows the standard normal distribution, \(P(X \le x)\) is its cumulative distribution function, and \(\tanh\) is the hyperbolic tangent used in the approximation.
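For reference, the tanh approximation above can be evaluated directly with NumPy. The snippet below is a minimal illustrative sketch; gelu_tanh_reference is a hypothetical helper name, not part of the MindSpore API, and the operator's own results may differ slightly in low-order digits across backends.
>>> import numpy as np
>>> def gelu_tanh_reference(x):
...     # tanh approximation of x * P(X <= x) for a standard normal X
...     return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))
...
>>> ref = gelu_tanh_reference(np.array([1.0, 2.0, 3.0], dtype=np.float32))
>>> # ref is approximately [0.8412, 1.9546, 2.9964]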
- Inputs:
x (Tensor) - The input tensor on which GeLU is computed, with data type float16 or float32.
- Outputs:
Tensor, with the same type and shape as x.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([1.0, 2.0, 3.0]), mindspore.float32)
>>> gelu = ops.GeLU()
>>> result = gelu(x)
>>> print(result)
[0.841192 1.9545976 2.9963627]
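As a sanity check, the printed values agree with the NumPy reference sketched after the formula above (gelu_tanh_reference is the illustrative helper defined there, not a MindSpore API) to within float32 precision:
>>> ref = gelu_tanh_reference(np.array([1.0, 2.0, 3.0], dtype=np.float32))
>>> print(np.allclose(result.asnumpy(), ref, rtol=1e-5))
True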