sciai.architecture.AdaptActivation

class sciai.architecture.AdaptActivation(activation, a, scale)

Adaptive activation function with a trainable Parameter and a fixed scale.

For details of the adaptive activation function, please refer to Adaptive activation functions accelerate convergence in deep and physics-informed neural networks and Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks.
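In those papers, the adaptive activation multiplies the input by the trainable parameter a and the fixed scale before applying the base activation, i.e. roughly

out = activation(scale * a * x)

(This formulation is inferred from the cited papers and is consistent with the example output at the end of this page, where tanh(10 * 0.1 * 1) = tanh(1) ≈ 0.7615942.)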

Parameters
  • activation (Union[str, Cell, Primitive, function]) – Activation function.

  • a (Union[Number, Tensor, Parameter]) – Trainable parameter a, which scales the input before the activation is applied.

  • scale (Union[Number, Tensor]) – Fixed scale parameter.

Inputs:
  • x (Tensor) - The input of AdaptActivation.

Outputs:

Tensor, activated output with the same type and shape as x.

Raises

TypeError – If activation, a or scale is not one of the types listed above.

Supported Platforms:

GPU CPU Ascend

Examples

>>> import mindspore as ms
>>> from mindspore import ops, nn
>>> from sciai.architecture import AdaptActivation
>>> a = ms.Tensor(0.1, ms.float32)
>>> net = AdaptActivation(nn.Tanh(), a=a, scale=10)
>>> x = ops.ones((2, 3), ms.float32)
>>> y = net(x)
>>> print(y)
[[0.7615942 0.7615942 0.7615942]
 [0.7615942 0.7615942 0.7615942]]
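
To let a be updated during training, it can be passed as a mindspore.Parameter and the AdaptActivation placed between layers like any other Cell. The following is a minimal sketch rather than an official example; the layer sizes, initial slope and network layout are illustrative assumptions.

>>> import mindspore as ms
>>> from mindspore import nn, ops
>>> from sciai.architecture import AdaptActivation
>>> # Hypothetical two-layer network; 3 -> 8 -> 1 sizes are illustrative only.
>>> a = ms.Parameter(ms.Tensor(0.1, ms.float32), name="a")  # trainable slope
>>> net = nn.SequentialCell([nn.Dense(3, 8),
...                          AdaptActivation(nn.Tanh(), a=a, scale=10),
...                          nn.Dense(8, 1)])
>>> x = ops.ones((2, 3), ms.float32)
>>> print(net(x).shape)
(2, 1)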