mindspore.nn.TimeDistributed
- class mindspore.nn.TimeDistributed(layer, time_axis, reshape_with_axis=None)[source]
The time distributed layer.
Time distributed is a wrapper that applies a layer to every temporal slice of an input. The input x must be at least 3-D. There are two cases in the implementation: when reshape_with_axis is provided, the reshape method is chosen, which is more efficient; otherwise, the input is divided along the time axis, which is more general. For example, reshape_with_axis cannot be provided when dealing with Batch Normalization. A minimal sketch of the two strategies follows below.
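The sketch below illustrates the idea in plain NumPy; it is not MindSpore's actual implementation, and it assumes time_axis=1 and reshape_with_axis=0 for simplicity:

import numpy as np

def time_distributed_reshape(layer_fn, x):
    # Efficient path (assumes time_axis=1, reshape_with_axis=0): merge the
    # batch and time axes, apply the layer once, then split them back out.
    n, t = x.shape[0], x.shape[1]
    merged = x.reshape((n * t,) + x.shape[2:])
    out = layer_fn(merged)
    return out.reshape((n, t) + out.shape[1:])

def time_distributed_split(layer_fn, x, time_axis=1):
    # General path: divide the input along the time axis, apply the layer
    # to each temporal slice independently, and stack the results back.
    slices = np.split(x, x.shape[time_axis], axis=time_axis)
    outs = [layer_fn(np.squeeze(s, axis=time_axis)) for s in slices]
    return np.stack(outs, axis=time_axis)

# Both paths agree for a layer that acts on each slice independently.
x = np.random.random((32, 10, 3)).astype(np.float32)
w = np.random.random((3, 6)).astype(np.float32)
dense = lambda a: a @ w  # stand-in for a Dense(3, 6) layer
assert np.allclose(time_distributed_reshape(dense, x),
                   time_distributed_split(dense, x), atol=1e-5)

The reshape path performs one large layer call instead of T small ones, which is why it is preferred when the wrapped layer permits it.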
- Parameters:
layer (Union[Cell, Primitive]) - The Cell or Primitive to be wrapped.
time_axis (int) - The axis of the time steps.
reshape_with_axis (int) - The axis to be reshaped with time_axis. Default: None.
- Inputs:
x (Tensor) - Tensor of shape \((N, T, *)\), where \(*\) means any number of additional dimensions.
- Outputs:
Tensor of shape \((N, T, *)\).
- Raises:
TypeError – If layer is not a Cell or Primitive.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore as ms
>>> import numpy as np
>>> x = ms.Tensor(np.random.random([32, 10, 3]), ms.float32)
>>> dense = ms.nn.Dense(3, 6)
>>> net = ms.nn.TimeDistributed(dense, time_axis=1, reshape_with_axis=0)
>>> output = net(x)
>>> print(output.shape)
(32, 10, 6)
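As a further illustration of the general path described above, one might omit reshape_with_axis so the input is divided along the time axis instead; the BatchNorm2d layer and shapes here are hypothetical additions for this sketch, not part of the original page:

>>> # General path: reshape_with_axis omitted, e.g. for Batch Normalization.
>>> x = ms.Tensor(np.random.random([4, 10, 3, 8, 8]), ms.float32)
>>> bn = ms.nn.BatchNorm2d(3)
>>> net = ms.nn.TimeDistributed(bn, time_axis=1)
>>> output = net(x)
>>> print(output.shape)
(4, 10, 3, 8, 8)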