mindspore.communication.create_group
- mindspore.communication.create_group(group, rank_ids)[source]
Create a user collective communication group.
Note
This method is not supported in the GPU and CPU versions of MindSpore. The size of rank_ids must be larger than 1, and rank_ids must not contain duplicate values. This method should be used after init(). If the job is not started with mpirun, only the global single communication group is supported in PyNative mode.
- Parameters
group (str) – The name of the communication group to be created.
rank_ids (list) – The device ranks to include in the group. Its size must be larger than 1 and it must not contain duplicates.
- Raises
TypeError – If group is not a string or rank_ids is not a list.
ValueError – If the size of rank_ids is not larger than 1, rank_ids contains duplicate data, or the backend is invalid.
RuntimeError – If HCCL is not available, or MindSpore is a GPU or CPU version.
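Because these exceptions are raised only at call time, it can help to validate the arguments up front. The following is a minimal sketch of such a pre-check; the helper name create_group_checked is hypothetical and not part of the MindSpore API, and it simply mirrors the TypeError and ValueError conditions listed above.

from mindspore.communication import create_group

def create_group_checked(group, rank_ids):
    # Hypothetical helper: validate arguments before calling create_group.
    if not isinstance(group, str):
        raise TypeError("group must be a string, got %s" % type(group).__name__)
    if not isinstance(rank_ids, list):
        raise TypeError("rank_ids must be a list, got %s" % type(rank_ids).__name__)
    if len(rank_ids) <= 1:
        raise ValueError("the size of rank_ids must be larger than 1")
    if len(set(rank_ids)) != len(rank_ids):
        raise ValueError("rank_ids must not contain duplicate data")
    create_group(group, rank_ids)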
- Supported Platforms:
Ascend
Examples
Note
Before running the following examples, you need to configure the communication environment variables.
For Ascend devices, it is recommended to use the msrun startup method, which has no third-party or configuration file dependencies. Please see the msrun startup method for more details.
>>> import mindspore as ms
>>> from mindspore import set_context
>>> from mindspore import ops
>>> from mindspore.communication import init, create_group, get_rank
>>> set_context(mode=ms.GRAPH_MODE)
>>> ms.set_device(device_target="Ascend")
>>> init()
>>> group = "0-7"
>>> rank_ids = [0, 7]
>>> if get_rank() in rank_ids:
...     create_group(group, rank_ids)
...     allreduce = ops.AllReduce(group)
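Once created, the group can be used wherever a group name is accepted. The continuation below is an illustrative sketch, not part of the official example: it queries the group size with get_group_size, runs the AllReduce on a sample tensor, and releases the group with destroy_group when it is no longer needed.

>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindspore.communication import get_group_size, destroy_group
>>> if get_rank() in rank_ids:
...     print(get_group_size(group))  # the new group holds 2 ranks
...     output = allreduce(Tensor(np.ones([2, 2]).astype(np.float32)))
...     destroy_group(group)  # release the group when it is no longer needed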