mindspore.nn.ReLU
- class mindspore.nn.ReLU[source]
Rectified Linear Unit activation function.
Applies the rectified linear unit function element-wise.
\[\text{ReLU}(x) = (x)^+ = \max(0, x)\]
It returns \(\max(0, x)\) element-wise: neurons with negative outputs are suppressed (set to zero), while active neurons are kept unchanged.
For a plot of this activation function, see ReLU.
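To make the element-wise definition concrete, the following is a minimal NumPy-only sketch (independent of MindSpore) of the same \(\max(0, x)\) mapping; the sample values are arbitrary and chosen only to show that negative entries are zeroed while non-negative entries pass through unchanged.
>>> import numpy as np
>>> x = np.array([-1.0, 2.0, -3.0, 2.0, -1.0])
>>> np.maximum(0, x)  # element-wise max(0, x)
array([0., 2., 0., 2., 0.])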
- Inputs:
x (Tensor) - The input of ReLU. The data type is Number. The shape is \((N, *)\), where \(*\) means any number of additional dimensions.
- Outputs:
Tensor, with the same data type and shape as x.
- Raises:
TypeError – If dtype of x is not a number.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> relu = nn.ReLU()
>>> output = relu(x)
>>> print(output)
[0. 2. 0. 2. 0.]
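In practice the activation is usually composed with other layers rather than called on its own. Below is a minimal sketch assuming mindspore.nn.Dense and mindspore.nn.SequentialCell; the layer sizes and random input are arbitrary and chosen only for illustration.
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # Dense layer followed by ReLU; negative pre-activations are zeroed.
>>> net = nn.SequentialCell([nn.Dense(3, 2), nn.ReLU()])
>>> features = Tensor(np.random.randn(4, 3), mindspore.float32)
>>> out = net(features)
>>> print(out.shape)
(4, 2)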