mindelec.architecture.FCSequential
- class mindelec.architecture.FCSequential(in_channel, out_channel, layers, neurons, residual=True, act='sin', weight_init='normal', has_bias=True, bias_init='default')[source]
A sequential container of dense layers; the dense layers are added to the container sequentially.
- Parameters
in_channel (int) – The number of channels in the input space.
out_channel (int) – The number of channels in the output space.
layers (int) – The total number of layers, including the input, hidden, and output layers.
neurons (int) – The number of neurons in each hidden layer.
residual (bool) – Whether the hidden layers are built as fully connected residual blocks. Default: True.
act (Union[str, Cell, Primitive, None]) – Activation function applied to the output of each fully connected layer, e.g. “ReLU”, “Softmax” and “Tanh”. Default: “sin”.
weight_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the trainable weights. The dtype is the same as that of the input x. For the valid str values, refer to mindspore.common.initializer. Default: “normal”.
has_bias (bool) – Specifies whether the layer uses a bias vector. Default: True.
bias_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the trainable bias. The dtype is the same as that of the input x. For the valid str values, refer to mindspore.common.initializer. Default: “default”.
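As a sketch of how these parameters combine, the hidden layers can be switched to plain dense layers and the activation supplied as a Cell instead of a str. The example below assumes standard MindSpore components (nn.Tanh, the “xavier_uniform” initializer string) that are not documented on this page:
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> from mindelec.architecture import FCSequential
>>> # layers=4 means input + 2 hidden (16 neurons each) + output
>>> net = FCSequential(2, 1, 4, 16, residual=False, act=nn.Tanh(),
...                    weight_init="xavier_uniform", bias_init="zeros")
>>> x = Tensor(np.ones((8, 2)).astype(np.float32))
>>> print(net(x).shape)
(8, 1)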
- Inputs:
input (Tensor) - Tensor of shape \((*, in\_channels)\).
- Outputs:
Tensor of shape \((*, out\_channels)\).
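The \(*\) dimensions pass through unchanged; only the last axis is mapped from in_channels to out_channels. A minimal shape-only sketch (output values depend on initialization, so only shapes are shown):
>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindelec.architecture import FCSequential
>>> net = FCSequential(3, 2, 5, 32)
>>> batched = Tensor(np.ones((4, 16, 3)).astype(np.float32))
>>> print(net(batched).shape)
(4, 16, 2)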
- Raises
TypeError – If layers is not an int.
TypeError – If neurons is not an int.
TypeError – If residual is not a bool.
ValueError – If layers is less than 3.
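For instance, fewer than three layers leaves no room for the input, hidden, and output layers, so construction fails; a minimal sketch:
>>> from mindelec.architecture import FCSequential
>>> try:
...     FCSequential(3, 3, 2, 32)
... except ValueError:
...     print("layers must be at least 3")
layers must be at least 3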
- Supported Platforms:
Ascend
Examples
>>> import numpy as np
>>> from mindelec.architecture import FCSequential
>>> from mindspore import Tensor
>>> inputs = np.ones((16, 3))
>>> inputs = Tensor(inputs.astype(np.float32))
>>> net = FCSequential(3, 3, 5, 32, weight_init="ones", bias_init="zeros")
>>> output = net(inputs).asnumpy()
>>> print(output.shape)
(16, 3)
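Since FCSequential is an ordinary Cell, it can also be dropped into a standard MindSpore training step. A sketch continuing from the example above, using stock MindSpore wrappers (nn.WithLossCell, nn.TrainOneStepCell, nn.Adam; these belong to MindSpore itself, not mindelec):
>>> from mindspore import nn
>>> labels = Tensor(np.zeros((16, 3)).astype(np.float32))
>>> loss_net = nn.WithLossCell(net, nn.MSELoss())
>>> optim = nn.Adam(net.trainable_params(), learning_rate=1e-3)
>>> train_net = nn.TrainOneStepCell(loss_net, optim)
>>> loss = train_net(inputs, labels)  # one optimization step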