mindspore.communication.get_local_rank_size
- mindspore.communication.get_local_rank_size(group=GlobalComm.WORLD_COMM_GROUP)
Gets the local rank size of the specified collective communication group.
Note
This method is not supported in the GPU and CPU versions of MindSpore. It should be called after init().
- Parameters
group (str) – The communication group to work on. The group is created by create_group, or the default world communication group is used. Default: GlobalComm.WORLD_COMM_GROUP.
- Returns
int, the local rank size of the group that the calling process belongs to, i.e., the number of processes in the group that run on the same host as the calling process.
- Raises
TypeError – If group is not a string.
ValueError – If backend is invalid.
RuntimeError – If HCCL is not available, or if MindSpore is the GPU or CPU version.
- Supported Platforms:
Ascend
Examples
Note
Before running the following examples, you need to configure the communication environment variables.
For Ascend/GPU/CPU devices, it is recommended to use the msrun startup method, which has no third-party or configuration-file dependencies. Please see the msrun startup tutorial for more details.
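For reference, a single-node job with 8 devices might be launched with a command like the one below; the script name test.py and the flag values are illustrative assumptions, so consult the msrun documentation for the authoritative options.
msrun --worker_num=8 --local_worker_num=8 test.py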
>>> import mindspore as ms
>>> from mindspore.communication import init, get_local_rank_size
>>> ms.set_device(device_target="Ascend")
>>> init()
>>> local_rank_size = get_local_rank_size()
>>> print("local_rank_size is: ", local_rank_size)
local_rank_size is: 8
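The group parameter can also name a group created with create_group. The following is a minimal sketch, assuming a single-node job with 8 devices; the group name "group0", the rank list, and the printed value are illustrative assumptions.
>>> import mindspore as ms
>>> from mindspore.communication import init, create_group, get_local_rank_size
>>> ms.set_device(device_target="Ascend")
>>> init()
>>> # "group0" contains ranks 0-3; on a single 8-device node all four run on the local host.
>>> create_group("group0", [0, 1, 2, 3])
>>> local_rank_size = get_local_rank_size("group0")
>>> print("local_rank_size is: ", local_rank_size)
local_rank_size is: 4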