mindspore.ops.hardtanh

mindspore.ops.hardtanh(input, min_val=-1.0, max_val=1.0)[source]

Applies the hardtanh activation function element-wise. The activation function is defined as:

\[\text{hardtanh}(input) = \begin{cases}
max\_val, & \text{ if } input > max\_val \\
min\_val, & \text{ if } input < min\_val \\
input, & \text{ otherwise. }
\end{cases}\]

The linear region \([min\_val, max\_val]\) can be adjusted via min_val and max_val.

Hardtanh Activation Function Graph: (image: Hardtanh.png)
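
For intuition, the formula above is an element-wise clamp of input to \([min\_val, max\_val]\). A minimal sketch of this equivalence, assuming mindspore.ops.clamp is available with the usual (input, min, max) arguments:

>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor([-1, -2, 0, 2, 1], mindspore.float16)
>>> print(ops.clamp(x, -1.0, 1.0))  # same result as ops.hardtanh(x, -1.0, 1.0)
[-1. -1.  0.  1.  1.]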
Parameters
  • input (Tensor) – Input Tensor.

  • min_val (Union[int, float], optional) – Minimum value of the linear region range. Default: -1.0.

  • max_val (Union[int, float], optional) – Maximum value of the linear region range. Default: 1.0.

Returns

Tensor, with the same dtype and shape as input.

Raises
  • TypeError – If input is not a Tensor.

  • TypeError – If dtype of min_val is neither float nor int.

  • TypeError – If dtype of max_val is neither float nor int.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor([-1, -2, 0, 2, 1], mindspore.float16)
>>> output = ops.hardtanh(x, min_val=-1.0, max_val=1.0)
>>> print(output)
[-1. -1.  0.  1.  1.]
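
A further illustrative example with a widened linear region (the input values below are not part of the original example); entries outside \([-2.0, 2.0]\) are clipped to the bounds while the rest pass through unchanged:

>>> y = Tensor([-3, -1, 0, 1, 3], mindspore.float16)
>>> output = ops.hardtanh(y, min_val=-2.0, max_val=2.0)
>>> print(output)
[-2. -1.  0.  1.  2.]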