mindspore.nn.ReLU

class mindspore.nn.ReLU

Rectified Linear Unit activation function.

Applies the rectified linear unit function element-wise.

\[\text{ReLU}(x) = (x)^+ = \max(0, x)\]

It returns element-wise \(\max(0, x)\): neurons with negative pre-activations are suppressed to zero, while neurons with positive pre-activations pass through unchanged.
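
For intuition, the same element-wise rule can be checked with plain NumPy (a minimal sketch, not part of the MindSpore API; np.maximum broadcasts the scalar 0 against the array):

>>> import numpy as np
>>> x = np.array([-1.0, 2.0, -3.0, 2.0, -1.0])
>>> np.maximum(0, x)
array([0., 2., 0., 2., 0.])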

Inputs:
  • input_data (Tensor) - The input of ReLU.

Outputs:

Tensor, with the same dtype and shape as input_data.

Raises:

TypeError – If the dtype of input_data is not a numeric type.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
>>> relu = nn.ReLU()
>>> output = relu(input_x)
>>> print(output)
[0. 2. 0. 2. 0.]
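
In practice, ReLU is most often placed between layers of a network. The following is a minimal sketch, assuming the standard nn.SequentialCell and nn.Dense cells; the layer sizes and input values here are illustrative, not part of this API:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # Dense(5, 3) maps 5 input features to 3 outputs; ReLU then zeroes negatives.
>>> net = nn.SequentialCell([nn.Dense(5, 3), nn.ReLU()])
>>> x = Tensor(np.ones((2, 5)), mindspore.float32)
>>> print(net(x).shape)
(2, 3)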