mindspore.nn.DenseBnAct
- class mindspore.nn.DenseBnAct(in_channels, out_channels, weight_init='normal', bias_init='zeros', has_bias=True, has_bn=False, momentum=0.9, eps=1e-05, activation=None, alpha=0.2, after_fake=True)[source]
A combination of the Dense layer, batch normalization, and an activation layer.
It extends the Dense op with an optional batch normalization layer and an optional activation function applied to the output.
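Conceptually, the layer behaves like a hand-written stack of the corresponding cells. The snippet below is only a sketch of that composition under assumed hyperparameters (it omits the fake-quantization handling and is not the actual implementation):

>>> from mindspore import nn
>>> # Rough equivalent of DenseBnAct(3, 4, has_bn=True, activation='relu'),
>>> # built from the individual cells (illustrative sketch only).
>>> stack = nn.SequentialCell([
...     nn.Dense(3, 4),
...     nn.BatchNorm1d(4, eps=1e-5, momentum=0.9),
...     nn.ReLU()
... ])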
- Parameters
in_channels (int) – The number of channels in the input space.
out_channels (int) – The number of channels in the output space.
weight_init (Union[Tensor, str, Initializer, numbers.Number]) – The trainable weight_init parameter. The dtype is the same as x. For str values, refer to the function initializer. Default: ‘normal’.
bias_init (Union[Tensor, str, Initializer, numbers.Number]) – The trainable bias_init parameter. The dtype is the same as x. For str values, refer to the function initializer. Default: ‘zeros’.
has_bias (bool) – Specifies whether the layer uses a bias vector. Default: True.
has_bn (bool) – Specifies whether to use batch normalization. Default: False.
momentum (float) – Momentum for the moving average of the batchnorm layer, must be in range [0, 1]. Default: 0.9.
eps (float) – Term added to the denominator to improve numerical stability for batchnorm, should be greater than 0. Default: 1e-5.
activation (Union[str, Cell, Primitive]) – Specifies the activation type. The optional values are: ‘softmax’, ‘logsoftmax’, ‘relu’, ‘relu6’, ‘tanh’, ‘gelu’, ‘sigmoid’, ‘prelu’, ‘leakyrelu’, ‘hswish’, ‘hsigmoid’. Default: None. A usage sketch combining this with the other parameters follows this list.
alpha (float) – Slope of the activation function at x < 0 for LeakyReLU. Default: 0.2.
after_fake (bool) – Determines whether there must be a fake quantization operation after DenseBnAct. Default: True.
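For illustration only, the following non-normative sketch constructs the layer with batch normalization enabled and a LeakyReLU activation, using the parameters described above (the specific values are assumptions, not defaults):

>>> from mindspore import nn
>>> net = nn.DenseBnAct(16, 8, has_bn=True, momentum=0.99, eps=1e-3,
...                     activation='leakyrelu', alpha=0.1)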
- Inputs:
x (Tensor) - Tensor of shape \((N, in\_channels)\). The data type is float32.
- Outputs:
Tensor of shape \((N, out\_channels)\). The data type is float32.
- Raises
TypeError – If in_channels or out_channels is not an int.
TypeError – If has_bias, has_bn or after_fake is not a bool.
TypeError – If momentum or eps is not a float.
ValueError – If momentum is not in range [0, 1.0].
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> net = nn.DenseBnAct(3, 4)
>>> x = Tensor(np.random.randint(0, 255, [2, 3]), mindspore.float32)
>>> result = net(x)
>>> output = result.shape
>>> print(output)
(2, 4)
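A variation of the example above with batch normalization and an activation enabled (a sketch; the output shape still follows the \((N, out\_channels)\) rule):

>>> net = nn.DenseBnAct(3, 4, has_bn=True, activation='relu')
>>> x = Tensor(np.random.randint(0, 255, [2, 3]), mindspore.float32)
>>> print(net(x).shape)
(2, 4)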