mindspore.nn.TimeDistributed
- class mindspore.nn.TimeDistributed(layer, time_axis, reshape_with_axis=None)[source]
The time distributed layer.
Time distributed is a wrapper that applies a layer to every temporal slice of an input. The input x must be at least 3-D. There are two cases in the implementation: when reshape_with_axis is provided, the reshape method is chosen, which is more efficient; otherwise, the input is divided along the time axis, which is more general. For example, reshape_with_axis cannot be provided when dealing with Batch Normalization; see the sketch below.
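For instance, a layer that computes batch statistics, such as Batch Normalization, must see each time step as a separate batch, so reshape_with_axis is left at its default of None and the split mode is used. A minimal sketch, assuming the standard mindspore.nn.BatchNorm1d API:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.random.random([32, 10, 3]), mindspore.float32)
>>> bn = nn.BatchNorm1d(3)  # normalizes over the 3 channels of each slice
>>> net = nn.TimeDistributed(bn, time_axis=1)  # no reshape_with_axis: split mode
>>> output = net(x)
>>> print(output.shape)
(32, 10, 3)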
- Parameters
layer (Union[Cell, Primitive]) – The Cell or Primitive to be wrapped.
time_axis (int) – The axis of the time step.
reshape_with_axis (int) – The axis which will be reshaped together with time_axis. Default: None.
- Inputs:
x (Tensor) - Tensor of shape \((N, T, *)\), where \(*\) means any number of additional dimensions.
- Outputs:
Tensor of shape \((N, T, *)\).
- Supported Platforms:
Ascend GPU CPU
- Raises
TypeError – If layer is not a Cell or Primitive.
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.random.random([32, 10, 3]), mindspore.float32)
>>> dense = nn.Dense(3, 6)
>>> net = nn.TimeDistributed(dense, time_axis=1, reshape_with_axis=0)
>>> output = net(x)
>>> print(output.shape)
(32, 10, 6)
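As a further illustration (a hedged sketch, not part of the official examples): for a stateless layer such as nn.Dense, both modes apply the same weights to every time step, so the reshape mode and the more general split mode should agree up to float32 tolerance:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.random.random([32, 10, 3]), mindspore.float32)
>>> dense = nn.Dense(3, 6)  # shared weights for both wrappers
>>> fast = nn.TimeDistributed(dense, time_axis=1, reshape_with_axis=0)
>>> general = nn.TimeDistributed(dense, time_axis=1)
>>> print(np.allclose(fast(x).asnumpy(), general(x).asnumpy()))
True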