mindspore.ops.FastGeLU
- class mindspore.ops.FastGeLU(*args, **kwargs)
Fast Gaussian Error Linear Units activation function.
FastGeLU is defined as follows:
\[\text{output} = \frac{x}{1 + \exp(-1.702 * \left| x \right|)} * \exp(0.851 * (x - \left| x \right|)),\]
where \(x\) is an element of the input.
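For intuition, the formula can be checked directly in NumPy. The helper below is a minimal sketch of the same computation; `fast_gelu_ref` is a hypothetical name used here for illustration and is not part of the MindSpore API:

>>> import numpy as np
>>> def fast_gelu_ref(x):
...     # output = x / (1 + exp(-1.702 * |x|)) * exp(0.851 * (x - |x|))
...     abs_x = np.abs(x)
...     return x / (1.0 + np.exp(-1.702 * abs_x)) * np.exp(0.851 * (x - abs_x))
>>> out = fast_gelu_ref(np.array([-1.0, 4.0, -8.0], dtype=np.float32))
>>> # out ≈ [-1.542e-01, 3.996e+00, -9.766e-06], matching the first row of the
>>> # Examples below up to float32 rounding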
- Inputs:
input_x (Tensor) - Input tensor for FastGeLU, with data type of float16 or float32.
- Outputs:
Tensor, with the same type and shape as input_x.
- Raises:
TypeError – If dtype of input_x is neither float16 nor float32.
- Supported Platforms:
Ascend
- Examples:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor
>>> from mindspore.ops import operations as P
>>> tensor = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> fast_gelu = P.FastGeLU()
>>> output = fast_gelu(tensor)
>>> print(output)
[[-1.5420423e-01  3.9955849e+00 -9.7664278e-06]
 [ 1.9356585e+00 -1.0070159e-03  8.9999981e+00]]