mindspore.ops.csr_relu
- mindspore.ops.csr_relu(x: CSRTensor)
Computes the ReLU (Rectified Linear Unit) activation function of the input CSRTensor element-wise.
It returns max(x, 0) element-wise. Specifically, neurons with negative output are suppressed, while active neurons are left unchanged.
\[\text{ReLU}(x) = (x)^+ = \max(0, x)\]

Note
In general, this operator is more commonly used. The difference from ReLUV2 is that ReLUV2 additionally outputs a mask.
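Since \(\max(0, 0) = 0\), ReLU preserves the sparsity pattern: only the stored values need to be transformed, while indptr and indices stay the same. Below is a minimal NumPy sketch of this idea (illustrative only; it does not reflect how the operator is actually implemented):

>>> import numpy as np
>>> indptr = np.array([0, 1, 2, 2], dtype=np.int32)
>>> indices = np.array([3, 0], dtype=np.int32)
>>> values = np.array([-1.0, 2.0], dtype=np.float32)
>>> # ReLU maps the implicit zeros to zero, so only the stored values
>>> # need transforming; indptr and indices are reused unchanged.
>>> relu_values = np.maximum(values, 0)
>>> print(relu_values)
[0. 2.]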
- Parameters
x (CSRTensor) – Input CSRTensor.
- Returns
CSRTensor, with the same dtype and shape as x.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> from mindspore import dtype as mstype
>>> from mindspore import Tensor, ops, CSRTensor
>>> indptr = Tensor([0, 1, 2, 2], dtype=mstype.int32)
>>> indices = Tensor([3, 0], dtype=mstype.int32)
>>> values = Tensor([-1, 2], dtype=mstype.float32)
>>> shape = (3, 4)
>>> x = CSRTensor(indptr, indices, values, shape)
>>> output = ops.csr_relu(x)
>>> print(output.values)
[0. 2.]
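To inspect the full 3 x 4 result, the sparse output can be densified; a short follow-up sketch, assuming CSRTensor.to_dense() is available in your MindSpore version (exact print formatting may vary across versions):

>>> print(output.to_dense())
[[0. 0. 0. 0.]
 [2. 0. 0. 0.]
 [0. 0. 0. 0.]]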