mindspore.ops.ReLU
- class mindspore.ops.ReLU
Computes ReLU (Rectified Linear Unit activation function) of the input tensor element-wise.
It returns max(x, 0) element-wise. Specifically, neurons with a negative output are suppressed (set to 0) and active neurons are left unchanged.
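Written as an element-wise formula, the description above amounts to:

\operatorname{ReLU}(x) = \max(x, 0)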
Note
In general, this operator is more commonly used than ReLuV2. The difference is that ReLuV2 outputs an additional mask.
- Inputs:
input_x (Tensor) - Tensor of shape (N, *), where * means any number of additional dimensions. Its data type is number.
- Outputs:
Tensor of shape (N, *), with the same type and shape as the input_x.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> relu = ops.ReLU()
>>> output = relu(input_x)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 9.]]
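For comparison, the same element-wise max(x, 0) rule can be reproduced with plain NumPy; this is only an illustrative sketch, not part of the MindSpore API:

>>> import numpy as np
>>> # Reference ReLU: keep non-negative values, zero out negatives.
>>> x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
>>> print(np.maximum(x, 0.0))
[[0. 4. 0.]
 [2. 0. 9.]]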