mindspore.mint.distributed.new_group

mindspore.mint.distributed.new_group(ranks=None, timeout=None, backend=None, pg_options=None, use_local_synchronization=False, group_desc=None)[source]

Create a new distributed group.

Note

This method should be used after init_process_group().

Parameters
  • ranks (list[int], optional) – List of ranks of group members. If None, the world group is created. Default is None. A sketch of creating a subgroup from an explicit rank list follows this parameter list.

  • timeout (int, invalid) – Currently it is a reserved parameter.

  • backend (str, optional) – The backend library to use. Currently "hccl" and "mccl" are supported: "hccl" uses the Huawei Collective Communication Library (HCCL), and "mccl" uses the MindSpore Collective Communication Library (MCCL). If None, "hccl" is used on Ascend. Default is None.

  • pg_options (str, invalid) – Currently it is a reserved parameter.

  • use_local_synchronization (bool, invalid) – Currently it is a reserved parameter.

  • group_desc (str, invalid) – Currently it is a reserved parameter.
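
For example, a subgroup can be built from an explicit rank list. The following is a minimal sketch, assuming a distributed job with at least three ranks has already been launched (e.g. via msrun); the rank list [0, 2] and the explicit backend are illustrative.

>>> from mindspore.mint.distributed import init_process_group, new_group
>>> # Initialize the default (world) communication group first.
>>> init_process_group()
>>> # Create a subgroup containing ranks 0 and 2 only; "hccl" is already
>>> # the default backend on Ascend and is passed here only for illustration.
>>> sub_group = new_group(ranks=[0, 2], backend="hccl")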

Returns

The group name as a string. Returns "" in abnormal scenarios.

Raises

TypeError – If ranks contains duplicate rank ids.

Supported Platforms:

Ascend

Examples

Note

Before running the following examples, you need to configure the communication environment variables. For Ascend devices, it is recommended to use the msrun startup method, which has no third-party or configuration-file dependencies. Please see the msrun startup for more details.

>>> from mindspore import set_context
>>> from mindspore.mint.distributed import init_process_group, new_group
>>> set_context(device_target="Ascend")
>>> init_process_group()
>>> group = new_group()
>>> print("group is:", group)
group is: hccl_world_group
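
The returned group name can be passed as the group argument of other collectives in mindspore.mint.distributed. A hedged follow-up sketch, assuming all_reduce is available in your version and continuing from the example above:

>>> import mindspore as ms
>>> from mindspore.mint.distributed import all_reduce
>>> x = ms.Tensor([1.0])
>>> # Sum x in place across all members of the group created above.
>>> all_reduce(x, group=group)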