mindspore.nn.LeakyReLU
- class mindspore.nn.LeakyReLU(alpha=0.2)
Leaky ReLU activation function.
LeakyReLU is similar to ReLU, but it keeps a small, non-zero slope for inputs x < 0 instead of outputting 0 there. The activation function is defined as:

\[\text{leaky_relu}(x) = \begin{cases}x, &\text{if } x \geq 0; \cr \text{alpha} * x, &\text{otherwise.}\end{cases}\]

See https://ai.stanford.edu/~amaas/papers/relu_hybrid_icml2013_final.pdf

- Parameters:
alpha (Union[int, float]) - Slope of the activation function for x < 0. Default: 0.2.
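As a quick, unofficial sanity check of the piecewise definition above, the same mapping can be reproduced with np.where and compared against the layer's output. The sample values below and the use of the default alpha=0.2 are illustrative assumptions, not part of the original documentation:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> data = np.array([-2.0, -0.5, 0.0, 3.0], dtype=np.float32)
>>> reference = np.where(data >= 0, data, 0.2 * data)  # x where x >= 0, alpha * x otherwise
>>> layer_out = nn.LeakyReLU()(Tensor(data))
>>> np.allclose(layer_out.asnumpy(), reference)
True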
- Inputs:
x (Tensor) - The input of LeakyReLU. The shape is \((N,*)\), where \(*\) means any number of additional dimensions.
- Outputs:
Tensor, with the same data type and shape as x.
- Raises:
TypeError – If alpha is not a float or an int.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> leaky_relu = nn.LeakyReLU()
>>> output = leaky_relu(x)
>>> print(output)
[[-0.2  4.  -1.6]
 [ 2.  -1.   9. ]]
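A non-default slope can be set through the alpha argument at construction time; the value 0.1 below is only an illustrative choice, not a recommendation from the original documentation:

>>> leaky_relu = nn.LeakyReLU(alpha=0.1)
>>> x = Tensor(np.array([-4.0, 2.0]), mindspore.float32)
>>> output = leaky_relu(x)  # -4.0 * 0.1 -> -0.4, positive inputs pass through unchanged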