mindspore.nn.HSigmoid
- class mindspore.nn.HSigmoid[source]
Hard sigmoid activation function.
Applies hard sigmoid activation element-wise. The input is a Tensor with any valid shape.
Hard sigmoid is defined as:
\[\text{hsigmoid}(x_{i}) = \max(0, \min(1, \frac{x_{i} + 3}{6})),\]

where \(x_{i}\) is an element of the input Tensor.
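As a quick illustration of the formula above, a minimal NumPy sketch of the element-wise computation (a hypothetical reference only, not the MindSpore kernel) might look like:

>>> import numpy as np
>>> def hsigmoid_ref(x):
...     # hypothetical reference: max(0, min(1, (x + 3) / 6)), applied element-wise
...     return np.maximum(0.0, np.minimum(1.0, (x + 3.0) / 6.0))
>>> hsigmoid_ref(np.array([-1.0, -2.0, 0.0, 2.0, 1.0]))  # -> approx. [0.3333, 0.1667, 0.5, 0.8333, 0.6667]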
- Inputs:
x (Tensor) - The input of HSigmoid. The data type must be float16 or float32. The shape is \((N, *)\), where \(*\) means any number of additional dimensions.
- Outputs:
Tensor, with the same data type and shape as x.
- Raises:
TypeError – If dtype of x is neither float16 nor float32.
- Supported Platforms:
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> hsigmoid = nn.HSigmoid()
>>> result = hsigmoid(x)
>>> print(result)
[0.3333 0.1666 0.5 0.833 0.6665]