mindspore.ops.relu
- mindspore.ops.relu(input)
Computes ReLU (Rectified Linear Unit activation function) of input tensors element-wise.
It returns max(input, 0) element-wise:

$$\text{ReLU}(input) = (input)^{+} = \max(0, input)$$

Specifically, neurons with a negative value are suppressed, while active neurons stay the same.

Note
In general, this operator is used more commonly than ReLuV2; the difference is that ReLuV2 outputs an additional Mask.
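As a quick check of the element-wise definition above, the short sketch below (illustrative only; the helper array x_np is not part of the API) compares ops.relu against the NumPy equivalent of the formula:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x_np = np.array([[-1.5, 0.0, 2.5]], np.float32)
>>> out = ops.relu(Tensor(x_np, mindspore.float32))
>>> print(np.array_equal(out.asnumpy(), np.maximum(x_np, 0)))
True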
- Parameters
input (Tensor) – Tensor of shape $(N, *)$, where $*$ means any number of additional dimensions. The data type is number.
- Returns
Tensor of shape $(N, *)$, with the same dtype and shape as the input.
- Raises
  - TypeError – If the dtype of input is not a number type.
  - TypeError – If input is not a Tensor.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> output = ops.relu(input_x)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 9.]]
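Because the output keeps the input's dtype and shape, integer inputs work the same way; the sketch below assumes int32 is among the supported number types:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([-2, 0, 3]), mindspore.int32)
>>> print(ops.relu(x))
[0 0 3]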