sciai.architecture.SReLU

class sciai.architecture.SReLU

Sin Rectified Linear Unit activation function. Applies the Sin Rectified Linear Unit function element-wise.

Inputs:
  • x (Tensor) - The input of SReLU.

Outputs:

Tensor, activated output with the same type and shape as x.

Supported Platforms:

Ascend

Examples

>>> import numpy as np
>>> from sciai.architecture.activation import SReLU
>>> from mindspore import Tensor
>>> input_x = Tensor(np.array([[1.2, 0.1], [0.2, 3.2]], dtype=np.float32))
>>> srelu = SReLU()
>>> output = srelu(input_x)
>>> print(output)
[[0.         0.05290067]
 [0.15216905 0.        ]]
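
The printed values are consistent with the element-wise computation sin(2*pi*x) * relu(x) * relu(1 - x). The NumPy sketch below reproduces them under that assumption; it is an illustrative reference only, not necessarily the library's internal implementation.

>>> # Hedged reference computation: assumes SReLU(x) = sin(2*pi*x) * relu(x) * relu(1 - x),
>>> # which matches the example output above; the actual internals of SReLU may differ.
>>> import numpy as np
>>> x_np = np.array([[1.2, 0.1], [0.2, 3.2]], dtype=np.float32)
>>> relu_np = lambda v: np.maximum(v, 0)
>>> reference = np.sin(2 * np.pi * x_np) * relu_np(x_np) * relu_np(1 - x_np)
>>> expected = np.array([[0., 0.05290067], [0.15216905, 0.]], dtype=np.float32)
>>> print(np.allclose(reference, expected, atol=1e-6))
True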