mindspore.ops.ReduceScatter
- class mindspore.ops.ReduceScatter(*args, **kwargs)
Reduces and scatters tensors from the specified communication group.
Note
Back propagation of this operator is not supported yet. Stay tuned for more. The tensors must have the same shape and format in all processes of the collection.
- Parameters
op (str) – Specifies the reduce operation applied element-wise, e.g. ReduceOp.SUM.
group (str) – The communication group to work on.
- Raises
TypeError – If either op or group is not a string.
ValueError – If the first dimension of the input cannot be divided by the rank size.
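The divisibility constraint follows from the operator's semantics: every process contributes a tensor of identical shape, the tensors are reduced element-wise, and the result is split along the first dimension so that each of the rank_size processes receives a slice of shape (first_dim / rank_size, ...). The following is a minimal local NumPy sketch of this behavior for two ranks (matching the Examples below); it performs no communication, and the names rank_size, inputs, reduced, and scattered are illustrative only, not part of the MindSpore API.

>>> import numpy as np
>>> # Emulate ReduceScatter(ReduceOp.SUM) with 2 ranks: sum the per-rank tensors
>>> # element-wise, then split the result along dim 0, one slice per rank.
>>> rank_size = 2
>>> inputs = [np.ones((8, 8), dtype=np.float32) for _ in range(rank_size)]
>>> reduced = np.sum(inputs, axis=0)                   # element-wise SUM across ranks -> (8, 8)
>>> scattered = np.split(reduced, rank_size, axis=0)   # split along dim 0 -> two (4, 8) slices
>>> print(scattered[0].shape)
(4, 8)
>>> print(scattered[0][0][0])
2.0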
- Supported Platforms:
Ascend
GPU
Examples
>>> # This example should be run with two devices. Refer to the tutorial > Distributed Training on mindspore.cn
>>> from mindspore import Tensor, context
>>> from mindspore.communication import init
>>> from mindspore.ops.operations.comm_ops import ReduceOp
>>> import mindspore.nn as nn
>>> import mindspore.ops.operations as ops
>>> import numpy as np
>>>
>>> context.set_context(mode=context.GRAPH_MODE)
>>> init()
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.reducescatter = ops.ReduceScatter(ReduceOp.SUM)
...
...     def construct(self, x):
...         return self.reducescatter(x)
...
>>> input_ = Tensor(np.ones([8, 8]).astype(np.float32))
>>> net = Net()
>>> output = net(input_)
>>> print(output)
[[2. 2. 2. 2. 2. 2. 2. 2.]
 [2. 2. 2. 2. 2. 2. 2. 2.]
 [2. 2. 2. 2. 2. 2. 2. 2.]
 [2. 2. 2. 2. 2. 2. 2. 2.]]