mindspore.ops.FastGeLU

class mindspore.ops.FastGeLU[source]

Fast Gaussian Error Linear Units activation function.

FastGeLU is defined as follows:

\[\text{output} = \frac {x} {1 + \exp(-1.702 * \left| x \right|)} * \exp(0.851 * (x - \left| x \right|)),\]

where \(x\) is the element of the input.
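
For reference, here is a minimal NumPy sketch of the formula above (the helper name fast_gelu_ref is hypothetical and for illustration only; the actual operator runs as a fused kernel on device, so low-precision arithmetic may produce slightly different values):

>>> import numpy as np
>>> def fast_gelu_ref(x):
...     # Elementwise FastGeLU following the formula above.
...     return x / (1 + np.exp(-1.702 * np.abs(x))) * np.exp(0.851 * (x - np.abs(x)))
...
>>> print(np.round(fast_gelu_ref(np.array([-1.0, 4.0], dtype=np.float32)), 4))
[-0.1542  3.9956]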

Inputs:
  • x (Tensor) - Input tensor for computing FastGeLU, with data type of float16 or float32.

Outputs:

Tensor, with the same type and shape as x.

Raises:

TypeError – If dtype of x is neither float16 nor float32.

Supported Platforms:

Ascend

Examples:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> fast_gelu = ops.FastGeLU()
>>> output = fast_gelu(x)
>>> print(output)
[[-1.5418735e-01  3.9921875e+00 -9.7473649e-06]
 [ 1.9375000e+00 -1.0052517e-03  8.9824219e+00]]