mindspore

Data Presentation

Tensor

mindspore.Tensor

Tensor is a data structure that stores an n-dimensional array.

mindspore.tensor

Create a new Tensor inside Cell.construct() or in a function decorated with @jit.

mindspore.COOTensor

A sparse representation of a set of nonzero elements from a tensor at given indices.

mindspore.CSRTensor

Constructs a sparse tensor in CSR (Compressed Sparse Row) format, with the nonzero values given by values and the row and column positions given by indptr and indices.

mindspore.RowTensor

A sparse representation of a set of tensor slices at given indices.

mindspore.SparseTensor

A sparse representation of a set of nonzero elements from a tensor at given indices.

mindspore.is_tensor

Check whether the input object is a mindspore.Tensor.

mindspore.from_numpy

Convert a NumPy array to a Tensor.
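
A minimal sketch of the tensor constructors listed above; the shapes, values, and file-free setup here are illustrative only.

```python
import numpy as np
import mindspore as ms
from mindspore import Tensor, COOTensor, CSRTensor

# Dense tensor from a Python list.
x = Tensor([[1, 2], [3, 4]], dtype=ms.float32)
print(ms.is_tensor(x))                         # True

# Tensor sharing data with a NumPy array.
y = ms.from_numpy(np.ones((2, 3), dtype=np.float32))

# COO: nonzero values at explicit (row, col) indices of a (3, 4) tensor.
indices = Tensor([[0, 1], [1, 2]], dtype=ms.int32)
values = Tensor([1.0, 2.0], dtype=ms.float32)
coo = COOTensor(indices, values, (3, 4))

# CSR: the same nonzeros described with compressed row pointers.
indptr = Tensor([0, 1, 2, 2], dtype=ms.int32)
col_indices = Tensor([1, 2], dtype=ms.int32)
csr = CSRTensor(indptr, col_indices, values, (3, 4))
```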

Parameter

mindspore.Parameter

Parameter is a Tensor subclass; when Parameters are assigned as Cell attributes, they are automatically added to the Cell's parameter list and will appear, for example, in the iterator returned by Cell.get_parameters().

mindspore.ParameterTuple

Inherited from tuple, ParameterTuple is used to store multiple Parameters.
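
A small sketch of how a Parameter attached to a Cell is registered automatically; the Scale cell and the parameter name "w" are made up for illustration.

```python
import mindspore as ms
from mindspore import Parameter, ParameterTuple, Tensor, nn

class Scale(nn.Cell):
    def __init__(self):
        super().__init__()
        # Assigning a Parameter as a Cell attribute registers it automatically.
        self.w = Parameter(Tensor([2.0], ms.float32), name="w")

    def construct(self, x):
        return x * self.w

net = Scale()
params = ParameterTuple(net.trainable_params())  # group several parameters
print([p.name for p in params])                  # ['w']
```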

DataType

mindspore.dtype

Data type for MindSpore.

mindspore.dtype_to_nptype

Convert a MindSpore dtype to a NumPy data type.

mindspore.dtype_to_pytype

Convert a MindSpore dtype to a Python data type.

mindspore.pytype_to_dtype

Convert a Python type to a MindSpore type.

mindspore.get_py_obj_dtype

Get the MindSpore data type corresponding to a Python type or variable.
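
A brief sketch of the dtype conversion helpers above; the concrete types used here are just examples.

```python
import mindspore as ms

np_type = ms.dtype_to_nptype(ms.float32)   # <class 'numpy.float32'>
py_type = ms.dtype_to_pytype(ms.int64)     # <class 'int'>
ms_type = ms.pytype_to_dtype(bool)         # mindspore bool dtype
var_type = ms.get_py_obj_dtype(3.14)       # MindSpore dtype inferred from a Python value
```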

mindspore.QuantDtype

An enum of quantization data types, containing INT1 ~ INT16 and UINT1 ~ UINT16.

mindspore.common.np_dtype

NumPy data type for MindSpore.

Context

mindspore.set_context

Set context for the running environment.

mindspore.get_context

Get context attribute value according to the input key.
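
A minimal sketch of setting and reading context values; the mode and device_target chosen here are assumptions and should match your environment.

```python
import mindspore as ms

# Choose the execution mode and backend before building the network.
ms.set_context(mode=ms.PYNATIVE_MODE, device_target="CPU")
print(ms.get_context("mode"))           # 1 for PYNATIVE_MODE, 0 for GRAPH_MODE
print(ms.get_context("device_target"))  # 'CPU'
```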

mindspore.set_auto_parallel_context

Set the auto parallel context; only data parallel is supported on CPU.

mindspore.get_auto_parallel_context

Get auto parallel context attribute value according to the key.

mindspore.reset_auto_parallel_context

Reset auto parallel context attributes to the default values.

mindspore.ParallelMode

Parallel mode options.

mindspore.set_ps_context

Set parameter server training mode context.

mindspore.get_ps_context

Get parameter server training mode context attribute value according to the key.

mindspore.reset_ps_context

Reset parameter server training mode context attributes to the default values.

mindspore.set_algo_parameters

Set parameters in the algorithm for parallel strategy searching.

mindspore.get_algo_parameters

Get the algorithm parameter config attributes.

mindspore.reset_algo_parameters

Reset the algorithm parameter attributes.

mindspore.set_offload_context

Configure detailed parameters of heterogeneous training to adjust the offload strategy.

mindspore.get_offload_context

Get the offload configuration parameters.

Seed

mindspore.set_seed

Set global seed.

mindspore.get_seed

Get global seed.
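
A short sketch of fixing and reading the global seed for reproducibility.

```python
import mindspore as ms

ms.set_seed(42)            # fix global randomness for reproducible runs
assert ms.get_seed() == 42
```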

Random Number Generator

mindspore.get_rng_state

Get the state of the default generator.

mindspore.Generator

A generator that manages the state of random numbers and provides seed and offset for random functions.

mindspore.initial_seed

Return the initial seed of the default generator.

mindspore.manual_seed

Set the default generator seed.

mindspore.seed

Seed the default generator with a random number.

mindspore.set_rng_state

Set the state of the default generator.
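
A hedged sketch of the generator APIs listed above; availability depends on the MindSpore version, and the seeds used are arbitrary.

```python
import mindspore as ms

gen = ms.Generator()
gen.manual_seed(5)                 # a dedicated generator with its own state

ms.manual_seed(1234)               # seed the default generator
state = ms.get_rng_state()         # snapshot the default generator state
# ... run some random ops ...
ms.set_rng_state(state)            # restore it to replay the same sequence
print(ms.initial_seed())           # 1234
```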

Serialization

mindspore.async_ckpt_thread_status

Get the status of the asynchronous checkpoint-saving thread.

mindspore.build_searched_strategy

Build the strategy of every parameter in the network.

mindspore.check_checkpoint

Check whether the checkpoint is valid.

mindspore.ckpt_to_safetensors

Converts MindSpore checkpoint files into safetensors format and saves them to save_path.

mindspore.convert_model

Convert a MindIR model to a model in another format.

mindspore.export

Export the MindSpore network into an offline model in the specified format.

mindspore.get_ckpt_path_with_strategy

Find available checkpoint file path from all backup checkpoint files of current rank.

mindspore.load

Load MindIR.
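
A minimal sketch of an export/load round trip through MindIR; the nn.Dense network and the file name "dense" are illustrative assumptions.

```python
import numpy as np
import mindspore as ms
from mindspore import nn, Tensor

net = nn.Dense(3, 2)
x = Tensor(np.ones((1, 3), dtype=np.float32))

# Export to MindIR, then load the graph back and wrap it for inference.
ms.export(net, x, file_name="dense", file_format="MINDIR")
graph = ms.load("dense.mindir")
infer_net = nn.GraphCell(graph)
print(infer_net(x).shape)          # (1, 2)
```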

mindspore.load_checkpoint

Load checkpoint info from a specified file.

mindspore.load_checkpoint_async

Load checkpoint info from a specified file asynchronously.

mindspore.load_distributed_checkpoint

Load a checkpoint into the network for distributed prediction.

mindspore.load_mindir

Load a protobuf file.

mindspore.load_param_into_net

Load parameters into the network and return the list of parameters that were not loaded into the network.

mindspore.load_segmented_checkpoints

Load checkpoint info from a specified file.

mindspore.merge_pipeline_strategys

Merge the parallel strategies of all pipeline stages in pipeline parallel mode.

mindspore.merge_sliced_parameter

Merge parameter slices into one parameter.

mindspore.obfuscate_model

Obfuscate a model of MindIR format.

mindspore.parse_print

Parse data file generated by mindspore.ops.Print.

mindspore.rank_list_for_transform

List the original distributed checkpoint rank indices needed to obtain the target checkpoint for a given rank_id during distributed checkpoint conversion.

mindspore.restore_group_info_list

Build a rank list; the checkpoints of the ranks in the list have the same contents as that of the local rank which saved group_info_file_name.

mindspore.safetensors_to_ckpt

Converts safetensors files into MindSpore checkpoint format and saves them to save_path.

mindspore.save_checkpoint

Save checkpoint to a specified file.
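
A short sketch of the basic checkpoint workflow with the APIs listed in this section; the network and file path are illustrative.

```python
import mindspore as ms
from mindspore import nn

net = nn.Dense(3, 2)

# Save a checkpoint, then restore it into a (possibly fresh) network.
ms.save_checkpoint(net, "./dense.ckpt")
param_dict = ms.load_checkpoint("./dense.ckpt")
result = ms.load_param_into_net(net, param_dict)
print(result)   # reports any parameters that could not be matched (empty when all loaded)
```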

mindspore.save_mindir

Save a protobuf file.

mindspore.transform_checkpoint_by_rank

Transform distributed checkpoint from source sharding strategy to destination sharding strategy by rank for a network.

mindspore.transform_checkpoints

Transform distributed checkpoint from source sharding strategy to destination sharding strategy for a rank.

mindspore.unified_safetensors

Merge multiple safetensor files into a unified safetensor file.

Automatic Differentiation

mindspore.grad

A wrapper function to generate the gradient function for the input function.

mindspore.value_and_grad

A wrapper function to generate the function to calculate forward output and gradient for the input function.
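
A minimal sketch of grad and value_and_grad on a scalar function; the function f and its inputs are made up for illustration.

```python
import mindspore as ms
from mindspore import Tensor

def f(x, y):
    return x * x + y

x, y = Tensor(2.0, ms.float32), Tensor(3.0, ms.float32)

grad_fn = ms.grad(f, grad_position=0)              # gradient w.r.t. x only
print(grad_fn(x, y))                               # 4.0

fwd_and_grad = ms.value_and_grad(f, grad_position=(0, 1))
value, grads = fwd_and_grad(x, y)                  # 7.0, (4.0, 1.0)
```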

mindspore.get_grad

When return_ids of mindspore.grad() or mindspore.value_and_grad() is set to True, pass the return value of mindspore.grad, or the second return value of mindspore.value_and_grad, to this function to obtain a specific gradient.

mindspore.jacfwd

Compute the Jacobian via forward-mode differentiation.

mindspore.jacrev

Compute the Jacobian via reverse-mode differentiation.

mindspore.jvp

Compute the Jacobian-vector product of the given network.

mindspore.vjp

Compute the vector-Jacobian product of the given network.

Parallel Optimization

Automatic Vectorization

mindspore.vmap

Vectorizing map (vmap) is a higher-order function that maps fn over the specified parameter axes.
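
A small sketch of vmap batching a per-sample function over the leading axis; the dot function and its inputs are illustrative.

```python
import mindspore as ms
from mindspore import Tensor

def dot(a, b):
    # a and b are 1-D slices once vmap strips the batch axis.
    return (a * b).sum()

batched_dot = ms.vmap(dot, in_axes=0, out_axes=0)
a = Tensor([[1.0, 2.0], [3.0, 4.0]], ms.float32)
b = Tensor([[5.0, 6.0], [7.0, 8.0]], ms.float32)
print(batched_dot(a, b))   # [17. 53.]
```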

Parallel

mindspore.Layout

Parallel layout describes the detailed sharding information.

mindspore.parameter_broadcast

Broadcast parameters to other ranks in the data parallel dimension.

mindspore.recompute

This function is used to reduce memory usage: when running the block, rather than storing the intermediate activations computed in the forward pass, they are recomputed in the backward pass.

mindspore.reshard

Specify the distribution of the tensor by the given layout.

mindspore.shard

Define the input and output layouts of this cell; the parallel strategies of the remaining operators will be generated by sharding propagation.

mindspore.sync_pipeline_shared_parameters

Synchronize parameters shared across pipeline parallel stages.

JIT

mindspore.JitConfig

JIT configuration for compilation.

mindspore.jit

Create a callable MindSpore graph from a Python function.
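
A minimal sketch of compiling a Python function into a MindSpore graph with the jit decorator; the function and inputs are illustrative.

```python
import mindspore as ms
from mindspore import Tensor

@ms.jit
def add_mul(x, y):
    # Compiled into a MindSpore graph on first call.
    return (x + y) * y

x = Tensor(1.0, ms.float32)
y = Tensor(2.0, ms.float32)
print(add_mul(x, y))   # 6.0
```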

mindspore.jit_class

Class decorator for user-defined classes.

mindspore.ms_class

Class decorator for user-defined classes.

mindspore.ms_function

Create a callable MindSpore graph from a Python function.

mindspore.ms_memory_recycle

Recycle memory used by MindSpore.

mindspore.mutable

Make a constant value mutable.

mindspore.constexpr

Used to calculate constants during graph compilation and improve compile performance in GRAPH_MODE.

mindspore.lazy_inline

Make the cell reusable.

mindspore.no_inline

Make the function reusable.

Tool

Dataset Helper

mindspore.DatasetHelper

DatasetHelper is a class that processes the MindData dataset and provides information about the dataset.

mindspore.Symbol

Symbol is a data structure indicating symbolic shape information.

mindspore.connect_network_with_dataset

Connect the network with the dataset in dataset_helper.

mindspore.data_sink

A wrapper function that generates a data-sinking version of the input function.

Debugging and Tuning

mindspore.Profiler

This class enables profiling of MindSpore neural networks.

mindspore.profiler.DynamicProfilerMonitor

This class enables dynamic profiling monitoring of MindSpore neural networks.

mindspore.SummaryCollector

SummaryCollector can help you collect common information, such as the loss, learning rate, computational graph, and so on.

mindspore.SummaryLandscape

SummaryLandscape can help you to collect loss landscape information.

mindspore.SummaryRecord

SummaryRecord is used to record the summary data and lineage data.

mindspore.set_dump

Enable or disable dump for the target and its contents.

Log

mindspore.get_level

Get the logger level.

mindspore.get_log_config

Get logger configurations.

Installation Verification

mindspore.run_check

Provide a convenient API to check whether the installation is successful.
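
A one-line sketch of the installation check; run_check prints the installed version and the result of a small test computation.

```python
import mindspore as ms

ms.run_check()   # prints the MindSpore version and whether it runs on this device
```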

Security

mindspore.obfuscate_ckpt

Obfuscate the plaintext checkpoint files according to the obfuscation config.

mindspore.load_obf_params_into_net

Modify the model structure according to the obfuscation config and load the obfuscated checkpoint into the obfuscated network.