mindspore.mint.nn.ELU
- class mindspore.mint.nn.ELU(alpha=1.0, inplace=False)
Exponential Linear Unit activation function.
Applies the exponential linear unit function element-wise. The activation function is defined as:

$$\text{ELU}(x_i) = \begin{cases} x_i, & \text{if } x_i > 0 \\ \alpha \left( e^{x_i} - 1 \right), & \text{if } x_i \leq 0 \end{cases}$$

where $x_i$ represents an element of the input and $\alpha$ represents the alpha parameter; alpha determines the smoothness of the ELU.

ELU Activation Function Graph: (figure not shown)
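As a quick sanity check of the definition above, here is a minimal NumPy sketch. This is an illustrative re-implementation, not the operator MindSpore dispatches to; it computes in float64, so the low-order digits differ slightly from the float32 example further below:

>>> import numpy as np
>>> def elu_reference(x, alpha=1.0):
...     # Positive elements pass through; the rest follow alpha * (exp(x) - 1).
...     return np.where(x > 0, x, alpha * np.expm1(x))
...
>>> print(elu_reference(np.array([-1.0, -2.0, 0.0, 2.0, 1.0])))
[-0.63212056 -0.86466472  0.          2.          1.        ]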
Warning
This is an experimental API that is subject to change or deletion.
- Parameters
alpha (float, optional) - The alpha value of the ELU formulation, i.e. the coefficient applied to the negative branch. Default: 1.0.
inplace (bool, optional) - Whether to perform the operation in-place on the input tensor. Default: False.
- Inputs:
input (Tensor) - The input of ELU is a Tensor of any dimension.
- Outputs:
Tensor, with the same shape and type as the input.
- Raises
RuntimeError – If the dtype of input is not float16, float32 or bfloat16.
TypeError – If the dtype of alpha is not float.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> input = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float32)
>>> elu = mint.nn.ELU()
>>> result = elu(input)
>>> print(result)
[-0.63212055 -0.86466473  0.          2.          1.        ]
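As a sketch of a non-default alpha, continuing the example above with alpha=0.5 (a value chosen purely for illustration): the negative branch of the formula is halved, and since multiplication by 0.5 is exact in binary floating point, the expected output follows directly from the float32 values above (exact low-order digits may still vary by backend):

>>> elu_half = mint.nn.ELU(alpha=0.5)
>>> print(elu_half(input))
[-0.31606027 -0.43233236  0.          2.          1.        ]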