mindspore.mint.nn.ReLU6
- class mindspore.mint.nn.ReLU6(inplace=False)
Applies the ReLU6 activation function element-wise: the Rectified Linear Unit with an upper bound of 6, computed as ReLU6(x) = min(max(0, x), 6).
Warning
This is an experimental API that is subject to change or deletion.
Refer to mindspore.mint.nn.functional.relu6() for more details.

ReLU6 Activation Function Graph:
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> input = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> relu6 = mint.nn.ReLU6()
>>> output = relu6(input)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 6.]]
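As a cross-check of the definition min(max(0, x), 6) given above, the same result can be reproduced directly in NumPy; this comparison is an illustrative sketch and is not part of the official example set.

>>> import numpy as np
>>> x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
>>> expected = np.minimum(np.maximum(x, 0.0), 6.0)  # element-wise ReLU6: min(max(0, x), 6)
>>> print(expected)
[[0. 4. 0.]
 [2. 0. 6.]]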