mindspore.ops.coo_relu
- mindspore.ops.coo_relu(x: COOTensor)
Computes ReLU (Rectified Linear Unit activation function) of the input COOTensor element-wise.
It returns \(\max(x,\ 0)\) element-wise. Specifically, neurons with a negative output are suppressed, and active neurons are left unchanged.
Note
In general, this operator is more commonly used. The difference from ReLuV2 is that ReLuV2 outputs an additional mask.
- Parameters
x (COOTensor) – Input COOTensor with shape \((N, *)\), where \(*\) means any number of additional dimensions. Its dtype is number.
- Returns
COOTensor, with the same shape and dtype as x.
- Raises
TypeError – If the dtype of x is not number.
TypeError – If x is not a COOTensor.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> from mindspore import Tensor, COOTensor, ops
>>> from mindspore import dtype as mstype
>>> indices = Tensor([[0, 1], [1, 2]], dtype=mstype.int64)
>>> values = Tensor([-1, 2], dtype=mstype.float32)
>>> shape = (3, 4)
>>> x = COOTensor(indices, values, shape)
>>> output = ops.coo_relu(x)
>>> print(output.values)
[0. 2.]
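Since coo_relu transforms only the stored values (indices and shape pass through unchanged, and unstored elements are implicitly zero, which ReLU maps back to zero), the result should match a dense ReLU applied after densifying. A minimal sanity-check sketch, continuing the example above and assuming COOTensor.to_dense() and ops.relu behave as in recent MindSpore releases:
>>> # Densify both the sparse result and the raw input, then compare.
>>> sparse_as_dense = ops.coo_relu(x).to_dense()
>>> dense_result = ops.relu(x.to_dense())
>>> print((sparse_as_dense == dense_result).all())
True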