mindflow.cell.FCSequential

class mindflow.cell.FCSequential(in_channels, out_channels, layers, neurons, residual=True, act='sin', weight_init='normal', has_bias=True, bias_init='default', weight_norm=False)

A sequential container of dense layers; the dense layers are added to the container in order, forming a fully connected network.

Parameters
  • in_channels (int) – The number of channels in the input space.

  • out_channels (int) – The number of channels in the output space.

  • layers (int) – The total number of layers, including the input, hidden, and output layers.

  • neurons (int) – The number of neurons in each hidden layer.

  • residual (bool) – Whether to use fully connected residual blocks for the hidden layers; if False, plain fully connected layers are used (see the sketch after this list). Default: True.

  • act (Union[str, Cell, Primitive, None]) – Activation function applied to the output of the fully connected layer, e.g. 'ReLU'. Default: "sin".

  • weight_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the trainable weights. The dtype is the same as the input x. For str values, refer to the initializer functions. Default: 'normal'.

  • has_bias (bool) – Specifies whether the layer uses a bias vector. Default: True.

  • bias_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the trainable bias. The dtype is the same as the input x. For str values, refer to the initializer functions. Default: 'default'.

  • weight_norm (bool) – Whether to compute the sum of squares of the weights. Default: False.
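
The hidden-block internals are not spelled out on this page, so the following is a rough mental model only (an assumption, not the MindFlow source): the container stacks one input layer, layers - 2 hidden layers of width neurons, and one output layer. With residual=False this is roughly the plain stack sketched below; nn.Tanh stands in for the activation, since the default 'sin' has no direct mindspore.nn counterpart.

import mindspore.nn as nn

# Hedged structural sketch, not the actual FCSequential implementation.
def plain_fc_stack(in_channels, out_channels, layers, neurons):
    blocks = [nn.Dense(in_channels, neurons), nn.Tanh()]   # input layer
    for _ in range(layers - 2):                            # hidden layers
        blocks += [nn.Dense(neurons, neurons), nn.Tanh()]
    blocks.append(nn.Dense(neurons, out_channels))         # output layer
    return nn.SequentialCell(blocks)

With residual=True, each hidden dense + activation pair is assumed to sit inside a residual (skip) connection instead.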

Inputs:
  • input (Tensor) - Tensor of shape \((*, in\_channels)\).

Outputs:

Tensor of shape \((*, out\_channels)\).
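
Because both the input and output keep the leading \(*\) dimensions, extra batch dimensions pass straight through unchanged; a hedged illustration (network constructed as in the Examples below):

>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindflow.cell import FCSequential
>>> net = FCSequential(3, 3, 5, 32, weight_init="ones", bias_init="zeros")
>>> x = Tensor(np.ones((2, 16, 3), np.float32))
>>> print(net(x).shape)
(2, 16, 3)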

Supported Platforms:

Ascend GPU

Examples

>>> import numpy as np
>>> from mindflow.cell import FCSequential
>>> from mindspore import Tensor
>>> inputs = np.ones((16, 3))
>>> inputs = Tensor(inputs.astype(np.float32))
>>> net = FCSequential(3, 3, 5, 32, weight_init="ones", bias_init="zeros")
>>> output = net(inputs).asnumpy()
>>> print(output.shape)
(16, 3)
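
A further hedged variation on the example above, exercising the non-default options; 'tanh' is assumed here to be among the supported activation strings:

>>> net2 = FCSequential(3, 2, 4, 64, residual=False, act='tanh')
>>> output2 = net2(inputs).asnumpy()
>>> print(output2.shape)
(16, 2)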