mindspore.rank_list_for_transform
- mindspore.rank_list_for_transform(rank_id, src_strategy_file=None, dst_strategy_file=None)[source]
Returns the list of source distributed checkpoint rank indices required to obtain the target checkpoint for rank_id during distributed checkpoint conversion. For more details about converting distributed checkpoints, please refer to Model Transformation.
- Parameters
rank_id (int) – The rank for which the distributed checkpoint needs to be obtained after conversion.
src_strategy_file (str) – Name of the source sharding strategy file, saved by mindspore.set_auto_parallel_context(strategy_ckpt_save_file). When src_strategy_file is None, the source sharding strategy is assumed to apply no sharding to any parameter. Default: None.
dst_strategy_file (str) – Name of the destination sharding strategy file, saved by mindspore.set_auto_parallel_context(strategy_ckpt_save_file). When dst_strategy_file is None, the destination sharding strategy is assumed to apply no sharding to any parameter. Default: None.
- Returns
List, the list of source ranks required for converting the distributed checkpoint of rank_id.
- Raises
ValueError – src_strategy_file or dst_strategy_file is incorrect.
TypeError – src_strategy_file or dst_strategy_file is not a string.
TypeError – rank_id is not an int.
Examples
>>> import mindspore as ms
>>> rank_id = 0
>>> rank_list = ms.rank_list_for_transform(rank_id, "./src_strategy.ckpt", "./dst_strategy.ckpt")
>>> checkpoint_files_map = {}
>>> for rank in rank_list:
...     checkpoint_files_map[rank] = "./pangu{}-100_2.ckpt".format(rank)
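The map built in the example above is typically passed on to a checkpoint transform call. The sketch below shows the mapping step as standalone Python; the rank values and the "./pangu{}-100_2.ckpt" file-name pattern are illustrative assumptions, not values produced by a real strategy file, and the commented-out follow-up call requires MindSpore plus real strategy and checkpoint files.

```python
# Assumed example output of rank_list_for_transform: target rank 0 needs
# slices from source ranks 0 and 4. These values are hypothetical.
rank_list = [0, 4]

# Map each required source rank to its checkpoint file path.
checkpoint_files_map = {
    rank: "./pangu{}-100_2.ckpt".format(rank) for rank in rank_list
}
print(checkpoint_files_map)

# Typical next step (requires MindSpore and real files on disk):
# import mindspore as ms
# ms.transform_checkpoint_by_rank(
#     rank_id=0,
#     checkpoint_files_map=checkpoint_files_map,
#     save_checkpoint_file_name="./dst_checkpoint_rank0.ckpt",
#     src_strategy_file="./src_strategy.ckpt",
#     dst_strategy_file="./dst_strategy.ckpt",
# )
```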