mindspore.nn.ReLU6
- class mindspore.nn.ReLU6[source]
Applies the ReLU6 activation function element-wise.
ReLU6 is similar to ReLU but with an upper limit of 6: inputs greater than 6 are clipped to 6. It computes element-wise as
\[\min(\max(0, x), 6).\]
The input is a Tensor of any valid shape.
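The clipping rule above can be illustrated with a minimal pure-Python sketch (the helper name `relu6` here is illustrative, not part of the MindSpore API):

```python
def relu6(values):
    """Apply min(max(0, x), 6) to each element of a sequence."""
    return [float(min(max(0.0, x), 6.0)) for x in values]

# Negative inputs clamp to 0; inputs above 6 saturate at 6.
print(relu6([-1, -2, 0, 2, 1]))  # [0.0, 0.0, 0.0, 2.0, 1.0]
print(relu6([7.5]))              # [6.0]
```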
- Inputs:
input_data (Tensor) - The input of ReLU6 with data type of float16 or float32.
- Outputs:
Tensor, which has the same type as input_data.
- Raises:
TypeError – If dtype of input_data is neither float16 nor float32.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> relu6 = nn.ReLU6()
>>> output = relu6(input_x)
>>> print(output)
[0. 0. 0. 2. 1.]