mindspore.nn.probability.bijector.Softplus

class mindspore.nn.probability.bijector.Softplus(sharpness=1.0, name='Softplus')

Softplus Bijector. This Bijector performs the operation:

\[Y = \frac{\log(1 + e ^ {kX})}{k}\]

where k is the sharpness factor.
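For intuition, the forward map and its inverse can be checked numerically. The following is a minimal sketch using NumPy only (not the MindSpore API), assuming the inverse X = log(e^{kY} - 1) / k implied by the formula above:

>>> import numpy as np
>>> k = 2.0
>>> x = np.array([1.0, 2.0, 3.0], dtype=np.float32)
>>> y = np.log1p(np.exp(k * x)) / k          # forward: Y = log(1 + e^{kX}) / k
>>> x_back = np.log(np.expm1(k * y)) / k     # inverse: X = log(e^{kY} - 1) / k
>>> print(np.allclose(x_back, x))
True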

Parameters

sharpness (float) – The sharpness factor, i.e. k in the formula above. Default: 1.0.

name (str) – The name of the Bijector. Default: 'Softplus'.

Supported Platforms:

Ascend GPU

Note

The dtype of sharpness must be float.

Raises

TypeError – When the dtype of sharpness is not float.

Examples

>>> import mindspore
>>> import mindspore.nn as nn
>>> import mindspore.nn.probability.bijector as msb
>>> from mindspore import Tensor
>>>
>>> # To initialize a Softplus bijector of sharpness 2.0.
>>> softplus = msb.Softplus(2.0)
>>> # To use a Softplus bijector in a network.
>>> value = Tensor([1, 2, 3], dtype=mindspore.float32)
>>> ans1 = softplus.forward(value)
>>> print(ans1.shape)
(3,)
>>> ans2 = softplus.inverse(value)
>>> print(ans2.shape)
(3,)
>>> ans3 = softplus.forward_log_jacobian(value)
>>> print(ans3.shape)
(3,)
>>> ans4 = softplus.inverse_log_jacobian(value)
>>> print(ans4.shape)
(3,)
extend_repr()

Display the instance object as a string.
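
As an illustrative sketch (the exact content of the representation string may vary across MindSpore versions), extend_repr() supplies the parameter details that appear in the instance's representation:

>>> softplus = msb.Softplus(2.0)
>>> repr_str = repr(softplus)  # the string includes the details supplied by extend_repr()
>>> print(isinstance(repr_str, str))
True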