mindspore.nn.GELU
- class mindspore.nn.GELU
Gaussian error linear unit activation function.
Applies the GELU function to each element of the input. The input is a Tensor with any valid shape.
GELU is defined as:
\[GELU(x_i) = x_i * P(X < x_i),\]
where \(P\) is the cumulative distribution function of the standard Gaussian distribution and \(x_i\) is an element of the input.
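As a minimal numeric sketch (not part of the MindSpore API), the definition can be evaluated directly by writing the standard Gaussian CDF with the error function, \(P(X < x) = \tfrac{1}{2}\left(1 + \operatorname{erf}(x / \sqrt{2})\right)\). The low-order digits may differ slightly from the library output in the example below, which is consistent with the common tanh-based approximation of GELU:

>>> from math import erf, sqrt
>>> def gelu_reference(x):
...     # exact definition: x * P(X < x), with P the standard Gaussian CDF
...     return x * 0.5 * (1.0 + erf(x / sqrt(2.0)))
>>> round(gelu_reference(-1.0), 4)
-0.1587
>>> round(gelu_reference(4.0), 4)
3.9999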
- Inputs:
input_data (Tensor) - The input of GELU with data type of float16 or float32.
- Outputs:
Tensor, with the same dtype and shape as input_data.
- Raises:
TypeError – If dtype of input_data is neither float16 nor float32.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> gelu = nn.GELU()
>>> output = gelu(input_x)
>>> print(output)
[[-1.5880802e-01  3.9999299e+00 -3.1077917e-21]
 [ 1.9545976e+00 -2.2918017e-07  9.0000000e+00]]
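As a quick follow-up check (a minimal sketch continuing the session above, not from the original page), the output keeps the shape and dtype of the input, as stated under Outputs:

>>> print(output.shape == input_x.shape, output.dtype == input_x.dtype)
True True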