mindspore.mint.nn.ReLU
- class mindspore.mint.nn.ReLU
Applies ReLU (Rectified Linear Unit activation function) element-wise.
\[\text{ReLU}(input) = (input)^+ = \max(0, input)\]
It returns \(\max(0, input)\) element-wise.
Note
Neurons with a negative output are suppressed (clamped to 0), while active neurons keep their values unchanged, as the sketch below illustrates.
ReLU Activation Function Graph:
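To make the note concrete, here is a minimal sketch that compares the layer's output with a plain NumPy reference, np.maximum(0, x). The float32 input values are chosen only for illustration and are not part of the original example.

>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> x = np.array([-1.0, 2.0, -3.0, 2.0, -1.0], dtype=np.float32)
>>> relu = mint.nn.ReLU()
>>> output = relu(Tensor(x))
>>> print(output)
[0. 2. 0. 2. 0.]
>>> # NumPy reference: element-wise max(0, x) — negatives suppressed, the rest unchanged
>>> print(np.maximum(0.0, x))
[0. 2. 0. 2. 0.]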
- Inputs:
input (Tensor) - The input of ReLU, a Tensor of any dimension.
- Outputs:
Tensor, with the same type and shape as the input.
- Raises:
TypeError – If dtype of input is not supported.
- Supported Platforms:
Ascend
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> input = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> relu = mint.nn.ReLU()
>>> output = relu(input)
>>> print(output)
[0. 2. 0. 2. 0.]
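As a follow-up on the Outputs description, the sketch below checks that the output keeps the input's dtype and shape; the float32 values and the 2x2 shape are illustrative assumptions, not part of the original example.

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> input2 = Tensor(np.array([[-1.5, 0.0], [3.25, -2.0]]), mindspore.float32)
>>> relu = mint.nn.ReLU()
>>> output2 = relu(input2)
>>> # Output dtype and shape match the input, as stated under Outputs
>>> print(output2.dtype)
Float32
>>> print(output2.shape)
(2, 2)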