mindspore.parallel.rank_list_for_convert
- mindspore.parallel.rank_list_for_convert(rank_id, src_strategy_file=None, dst_strategy_file=None)[source]
Obtain the list of source distributed checkpoint rank indices required to build the target checkpoint of rank_id during distributed checkpoint conversion.
- Parameters
rank_id (int) – The rank whose distributed checkpoint needs to be obtained after conversion.
src_strategy_file (str) – Name of the source sharding strategy file, saved by mindspore.parallel.auto_parallel.AutoParallel(cell).save_param_strategy_file(file_path). When src_strategy_file is None, the source sharding strategy applies no sharding to any parameter. Default: None.
dst_strategy_file (str) – Name of the destination sharding strategy file, saved by mindspore.parallel.auto_parallel.AutoParallel(cell).save_param_strategy_file(file_path). When dst_strategy_file is None, the destination sharding strategy applies no sharding to any parameter. Default: None.
- Returns
List, the rank list required for converting the distributed checkpoint of rank_id.
- Raises
ValueError – src_strategy_file or dst_strategy_file is incorrect.
TypeError – src_strategy_file or dst_strategy_file is not a string.
TypeError – rank_id is not an int.
- Supported Platforms:
Ascend
Examples
>>> from mindspore.parallel import rank_list_for_convert
>>> rank_id = 0
>>> rank_list = rank_list_for_convert(rank_id, "./src_strategy.ckpt", "./dst_strategy.ckpt")
>>> checkpoint_files_map = {}
>>> for rank in rank_list:
...     checkpoint_files_map[rank] = "./pangu{}-100_2.ckpt".format(rank)
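The mapping step above can be sketched in plain Python without a running MindSpore cluster. The rank list below is a hypothetical return value for illustration only; the real list depends on the contents of the two strategy files passed to rank_list_for_convert.

```python
# Hypothetical rank list: in practice this comes from
# rank_list_for_convert(rank_id, src_strategy_file, dst_strategy_file),
# which compares the source and destination sharding strategies.
rank_list = [0, 1]

# Map each required source rank to its checkpoint file path,
# mirroring the naming pattern used in the Examples section.
checkpoint_files_map = {
    rank: "./pangu{}-100_2.ckpt".format(rank) for rank in rank_list
}

print(checkpoint_files_map)
```

The resulting dictionary maps source rank indices to their checkpoint file paths, which is the input typically required by the subsequent per-rank checkpoint conversion step.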