mindspore.nn.CELU

class mindspore.nn.CELU(alpha=1.0)

Continuously differentiable exponential linear units activation function.

Applies the continuously differentiable exponential linear units function element-wise.

CELU(x) = max(0, x) + min(0, α(exp(x/α) − 1))
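
As a quick sanity check of the formula, the following minimal NumPy sketch (an illustration only, not MindSpore's actual implementation) reproduces the values shown in the Examples section below:

>>> import numpy as np
>>> x = np.array([-2.0, -1.0, 1.0, 2.0], dtype=np.float32)
>>> alpha = 1.0
>>> print(np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x / alpha) - 1.0)))
[-0.86466473 -0.63212055  1.          2.        ]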

Parameters

alpha (float) – The α value for the CELU formulation. Default: 1.0.

Inputs:
  • x (Tensor) - The input of CELU. The required dtype is float16 or float32. The shape is (N, *), where * means any number of additional dimensions.

Outputs:

Tensor, with the same dtype and shape as x.

Raises
  • TypeError – If alpha is not a float.

  • ValueError – If alpha is 0.

  • TypeError – If x is not a Tensor.

  • TypeError – If the dtype of x is neither float16 nor float32.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([-2.0, -1.0, 1.0, 2.0]), mindspore.float32)
>>> celu = nn.CELU()
>>> output = celu(x)
>>> print(output)
[-0.86466473 -0.63212055  1.          2.        ]
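
As an illustrative extension (not part of the original example; the output values below are computed from the formula above, and exact float formatting on a given device may differ slightly), a non-default alpha changes how fast the negative branch saturates; CELU(x) approaches −α as x → −∞:

>>> celu2 = nn.CELU(alpha=2.0)
>>> output = celu2(x)
>>> print(output)
[-1.2642411 -0.7869387  1.         2.       ]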