mindspore.mint.nn.GELU

class mindspore.mint.nn.GELU(approximate='none')

Gaussian Error Linear Unit (GELU) activation function.

For a detailed description of GELU, refer to the paper Gaussian Error Linear Units (GELUs), or the paper BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.

For more details, see mindspore.mint.nn.functional.gelu().
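
For reference, approximate='none' computes the exact erf-based form, while approximate='tanh' uses the cubic tanh approximation from the GELU paper (standard definitions, sketched here in LaTeX; the exact numerical behavior is defined by mindspore.mint.nn.functional.gelu()):

\text{GELU}(x) = x\,\Phi(x) = \frac{x}{2}\left(1 + \operatorname{erf}\left(\frac{x}{\sqrt{2}}\right)\right)

\text{GELU}_{\text{tanh}}(x) = \frac{x}{2}\left(1 + \tanh\left(\sqrt{\frac{2}{\pi}}\left(x + 0.044715\,x^{3}\right)\right)\right)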

GELU function plot:

../../_images/GELU.png

Supported Platforms:

Ascend

Examples:

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> input = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
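>>> # approximate='none' (default): exact, erf-based GELU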
>>> gelu = mint.nn.GELU()
>>> output = gelu(input)
>>> print(output)
[[-1.58655241e-01  3.99987316e+00 -0.00000000e+00]
 [ 1.95449972e+00 -1.41860323e-06  9.00000000e+00]]
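>>> # approximate='tanh': cubic tanh approximation of GELU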
>>> gelu = mint.nn.GELU(approximate="tanh")
>>> output = gelu(input)
>>> print(output)
[[-1.58808023e-01  3.99992990e+00 -3.10779147e-21]
 [ 1.95459759e+00 -2.29180174e-07  9.00000000e+00]]
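
As a plain-NumPy cross-check of the two modes against the outputs above (a minimal sketch only: it assumes SciPy is available for erf, and the helper names gelu_exact and gelu_tanh are illustrative, not part of the MindSpore API):

>>> import numpy as np
>>> from scipy.special import erf
>>> def gelu_exact(x):
...     # approximate='none': x * Phi(x), with Phi the standard Gaussian CDF
...     return 0.5 * x * (1.0 + erf(x / np.sqrt(2.0)))
>>> def gelu_tanh(x):
...     # approximate='tanh': 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))
...     return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))
>>> x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
>>> ref_none = np.array([[-1.58655241e-01, 3.99987316e+00, -0.00000000e+00],
...                      [1.95449972e+00, -1.41860323e-06, 9.00000000e+00]])
>>> np.allclose(gelu_exact(x), ref_none, atol=1e-6)
True
>>> ref_tanh = np.array([[-1.58808023e-01, 3.99992990e+00, -3.10779147e-21],
...                      [1.95459759e+00, -2.29180174e-07, 9.00000000e+00]])
>>> np.allclose(gelu_tanh(x), ref_tanh, atol=1e-6)
True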