mindspore.nn.FastGelu

class mindspore.nn.FastGelu

Fast Gaussian error linear unit activation function.

Applies the FastGelu function to each element of the input. The input is a Tensor with any valid shape.

FastGelu is defined as:

\[\text{FastGelu}(x_i) = \frac{x_i}{1 + \exp(-1.702 \left| x_i \right|)} \cdot \exp\left(0.851 \left(x_i - \left| x_i \right|\right)\right)\]

where \(x_i\) is an element of the input.
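
As a quick sanity check, the definition can be evaluated directly with NumPy (a minimal sketch; the helper fast_gelu_ref is hypothetical and not part of the MindSpore API):

>>> import numpy as np
>>> def fast_gelu_ref(x):
...     # Elementwise FastGelu per the definition above:
...     # x / (1 + exp(-1.702 * |x|)) * exp(0.851 * (x - |x|))
...     return x / (1 + np.exp(-1.702 * np.abs(x))) * np.exp(0.851 * (x - np.abs(x)))
...
>>> print(np.round(fast_gelu_ref(np.array([-1.0, 4.0], dtype=np.float32)), 4))
[-0.1542  3.9956]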

Inputs:
  • input_data (Tensor) - The input of FastGelu, with data type float16 or float32.

Outputs:

Tensor, with the same type and shape as input_data.

Raises:

TypeError – If dtype of input_data is neither float16 nor float32.
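
For instance, passing a Tensor with an unsupported dtype such as int32 should trigger this error (a minimal illustration; the printed line comes from the except branch, and exact behavior may vary by version):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> bad_input = Tensor(np.array([1, 2]), mindspore.int32)
>>> try:
...     _ = nn.FastGelu()(bad_input)
... except TypeError:
...     print("raised TypeError as documented")
raised TypeError as documented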

Supported Platforms:

Ascend

Examples:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> fast_gelu = nn.FastGelu()
>>> output = fast_gelu(input_x)
>>> print(output)
[[-1.5420423e-01  3.9955850e+00 -9.7664279e-06]
 [ 1.9356586e+00 -1.0070159e-03  8.9999981e+00]]