mindspore.ops.ReLU
class mindspore.ops.ReLU
Computes ReLU (Rectified Linear Unit activation function) of input tensors element-wise.
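Element-wise, this is the standard rectifier, consistent with the definition used by mindspore.ops.relu():

\text{ReLU}(x) = (x)^{+} = \max(0, x)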
Refer to mindspore.ops.relu() for more details.

Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> relu = ops.ReLU()
>>> output = relu(input_x)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 9.]]
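The functional interface mindspore.ops.relu() referenced above computes the same result without constructing an operator instance; a minimal sketch of the equivalent call:

>>> output = ops.relu(input_x)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 9.]]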