mindspore.ops.relu
- mindspore.ops.relu(x)[source]
Computes ReLU (Rectified Linear Unit activation function) of the input tensor element-wise.
It returns \(\max(x,\ 0)\) element-wise. Specifically, neurons with negative outputs are suppressed, while active neurons remain unchanged.
\[\text{ReLU}(x) = (x)^+ = \max(0,\ x)\]
Note
In general, this operator is the more commonly used of the two; it differs from ReLuV2 in that ReLuV2 additionally outputs a mask. A minimal NumPy illustration of the formula above is shown below.
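For intuition, the element-wise definition can be reproduced with a plain NumPy maximum (a sketch for illustration only, not part of the MindSpore API):

>>> import numpy as np
>>> x = np.array([-1.0, 4.0, -8.0])
>>> np.maximum(x, 0)  # element-wise max(0, x), i.e. ReLU
array([0., 4., 0.])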
- Parameters
x (Tensor) – Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions. Its data type is number.
- Returns
Tensor of shape \((N, *)\), with the same dtype and shape as x.
- Raises
TypeError – If the dtype of x is not a number type.
TypeError – If x is not a Tensor.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> output = ops.relu(input_x)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 9.]]