mindspore.nn.ELU

class mindspore.nn.ELU(alpha=1.0)[source]

Applies the exponential linear unit function element-wise.

The activation function is defined as:

$$\text{ELU}(x_i) = \begin{cases} x_i, & \text{if } x_i \geq 0; \\ \alpha(\exp(x_i) - 1), & \text{otherwise.} \end{cases}$$

where x_i represents an element of the input and α represents the alpha parameter.
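The piecewise definition above can be sketched directly in NumPy. This is a minimal reference implementation for illustration only (elu_reference is a hypothetical helper, not part of the MindSpore API):

```python
import numpy as np

def elu_reference(x, alpha=1.0):
    # Piecewise ELU: x_i where x_i >= 0, alpha * (exp(x_i) - 1) otherwise.
    x = np.asarray(x, dtype=np.float32)
    # np.expm1 computes exp(x) - 1 with better precision for small x.
    return np.where(x >= 0, x, alpha * np.expm1(x))

print(elu_reference([-1.0, -2.0, 0.0, 2.0, 1.0]))
```

Note that non-negative inputs pass through unchanged, while negative inputs saturate smoothly toward -alpha as x_i → -∞.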

ELU Activation Function Graph: see ELU.png in the online documentation.
Parameters

alpha (float) – The alpha value of ELU; the data type is float. Default: 1.0. Currently, only alpha equal to 1.0 is supported.

Inputs:
  • input_x (Tensor) - The input of ELU is a Tensor of any dimension with data type of float16 or float32.

Outputs:

Tensor, with the same dtype and shape as input_x.

Raises
  • TypeError – If alpha is not a float.

  • TypeError – If dtype of input_x is neither float16 nor float32.

  • ValueError – If alpha is not equal to 1.0.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float32)
>>> elu = nn.ELU()
>>> result = elu(x)
>>> print(result)
[-0.63212055  -0.86466473  0.  2.  1.]