mindspore.nn.ELU
- class mindspore.nn.ELU(alpha=1.0)
Exponential Linear Unit activation function.
Applies the exponential linear unit function element-wise. The activation function is defined as:
\[E_{i} = \begin{cases} x_i, & \text{if } x_i \geq 0; \\ \alpha * (\exp(x_i) - 1), & \text{otherwise,} \end{cases}\]
where \(x_i\) represents an element of the input and \(\alpha\) represents the alpha parameter.
ELU Activation Function Graph.
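For intuition, the piecewise definition can be mirrored in a few lines of NumPy (a minimal sketch of the math only, not MindSpore's implementation; the helper name elu_reference is made up for illustration):

>>> import numpy as np
>>> def elu_reference(x, alpha=1.0):
...     # Apply the formula above: keep x where x >= 0, else alpha * (exp(x) - 1).
...     x = np.asarray(x, dtype=np.float32)
...     return np.where(x >= 0, x, alpha * (np.exp(x) - 1))
...
>>> print(elu_reference([-1.0, 0.0, 1.0]))
[-0.63212055  0.          1.        ]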
- Parameters
alpha (float) – The alpha value of ELU, the data type is float. Default: 1.0.
- Inputs:
x (Tensor) - The input of ELU is a Tensor of any dimension with data type of float16 or float32.
- Outputs:
Tensor, with the same data type and shape as x.
- Raises
TypeError – If alpha is not a float.
TypeError – If dtype of x is neither float16 nor float32.
ValueError – If alpha is not equal to 1.0.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float32)
>>> elu = nn.ELU()
>>> result = elu(x)
>>> print(result)
[-0.63212055 -0.86466473  0.          2.          1.        ]
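For the negative inputs, these values agree with \(\alpha * (\exp(x_i) - 1)\) at \(\alpha = 1.0\): \(\exp(-1) - 1 \approx -0.63212\) and \(\exp(-2) - 1 \approx -0.86466\).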