mindspore.set_offload_context

mindspore.set_offload_context(offload_config)

Configure the detailed parameters of heterogeneous training to adjust the offload strategy.

Note

The offload configuration is only used if the memory offload feature is enabled via mindspore.set_context(memory_offload="ON"), and the memory_optimize_level must be set to O0. On the Ascend hardware platform, the graph compilation level must be O0.
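
For reference, a minimal sketch of the prerequisite settings is shown below. The jit_config call used to set the graph compilation level on Ascend is an assumption and may differ across MindSpore versions.

>>> import mindspore as ms
>>> # Enable memory offload and force memory_optimize_level to O0, as required above.
>>> ms.set_context(memory_offload="ON", memory_optimize_level="O0")
>>> # Assumed way to set the graph compilation level to O0 on Ascend; adapt to your version.
>>> ms.set_context(jit_config={"jit_level": "O0"})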

Parameters

offload_config (dict) –

A dict containing the keys and values used to configure the offload context. It supports the following keys.

  • offload_path (str): The path used for offloading; relative paths are supported. Default: "./offload".

  • offload_cpu_size (str): The CPU memory size available for offloading, in the format "xxGB".

  • offload_disk_size (str): The disk size available for offloading, in the format "xxGB".

  • hbm_ratio (float): The ratio of the maximum device memory that can be used. The range is (0, 1]. Default: 1.0.

  • cpu_ratio (float): The ratio of the maximum host memory that can be used. The range is (0, 1]. Default: 1.0.

  • enable_pinned_mem (bool): Whether to enable pinned memory. Default: True.

  • enable_aio (bool): Whether to enable AIO (asynchronous I/O). Default: True.

  • aio_block_size (str): The AIO block size, in the format "xxGB".

  • aio_queue_depth (int): The depth of the AIO queue.

  • offload_param (str): The offload destination for parameters, "cpu" or "disk". Default: "".

  • offload_checkpoint (str): The offload destination for checkpoints, "cpu" or "disk"; only valid when recompute is enabled. Default: "".

  • auto_offload (bool): Whether to determine the offload strategy automatically. Default: True.

  • host_mem_block_size (str): The memory block size of the host memory pool, in the format "xxGB".

Raises

ValueError – If an input key is not an attribute of the offload context.

Examples

>>> from mindspore import context
>>> context.set_offload_context(offload_config={"offload_param":"cpu"})
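
A fuller configuration sketch combining several of the keys described above is shown below; the sizes and ratios are illustrative values only and should be tuned to the host memory, disk space, and device memory actually available.

>>> import mindspore as ms
>>> offload_config = {
...     "offload_path": "./offload",      # directory used to store offloaded data
...     "offload_param": "cpu",           # offload parameters to host memory
...     "offload_cpu_size": "512GB",      # illustrative host memory budget
...     "offload_disk_size": "1024GB",    # illustrative disk budget
...     "hbm_ratio": 0.9,                 # use at most 90% of the maximum device memory
...     "cpu_ratio": 0.9,                 # use at most 90% of the maximum host memory
...     "enable_pinned_mem": True,
...     "enable_aio": True,
...     "auto_offload": True,
... }
>>> ms.set_offload_context(offload_config=offload_config)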