mindspore.ops.relu
- mindspore.ops.relu(input)
Computes ReLU (Rectified Linear Unit activation function) of input tensors element-wise.
It returns \(\max(input, 0)\) element-wise:
\[\text{ReLU}(input) = (input)^{+} = \max(0, input)\]
Specifically, the neurons with a negative output are suppressed and the active neurons stay the same.
Note
In general, this operator is more commonly used than ReLuV2. The difference is that ReLuV2 also outputs an additional Mask.
- Parameters
input (Tensor) – The input Tensor of numeric type.
- Returns
Tensor, with the same dtype and shape as input.
- Raises
TypeError – If the dtype of input is not a numeric type.
TypeError – If input is not a Tensor.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> output = ops.relu(input_x)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 9.]]
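As a quick check of the element-wise definition above, relu can also be reproduced with mindspore.ops.maximum. This is an illustrative sketch continuing the session above, not part of this API's reference; it assumes ops.maximum broadcasts a scalar Tensor as in recent MindSpore releases, and the printed values are the expected outputs.
>>> x = Tensor(np.array([-3.0, -0.5, 0.0, 2.5]), mindspore.float32)
>>> # ReLU(x) = max(0, x): negative entries are clamped to zero.
>>> print(ops.relu(x))
[0.  0.  0.  2.5]
>>> # Element-wise maximum against zero gives the same result.
>>> print(ops.maximum(x, Tensor(0.0, mindspore.float32)))
[0.  0.  0.  2.5]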