mindspore
Data Presentation
Tensor
Tensor is a data structure that stores an n-dimensional array.

Create a new Tensor in Cell.construct() or in a function decorated by @jit.

A sparse representation of a set of nonzero elements from a tensor at given indices.

Constructs a sparse tensor in CSR (Compressed Sparse Row) format, with specified values indicated by values and row and column positions indicated by indptr and indices.

A sparse representation of a set of tensor slices at given indices.

A sparse representation of a set of nonzero elements from a tensor at given indices.

Check whether the input object is a Tensor.

Convert a numpy array to a Tensor.
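
A minimal sketch of creating dense and sparse tensors, assuming a recent MindSpore where COOTensor and Tensor.from_numpy are available at the top level:

    import numpy as np
    import mindspore as ms

    # Dense tensor from a numpy array.
    x = ms.Tensor(np.ones((2, 3)), dtype=ms.float32)
    # Zero-copy conversion from numpy.
    y = ms.Tensor.from_numpy(np.arange(6, dtype=np.float32).reshape(2, 3))
    # COO sparse tensor: nonzero values 1.0 at (0, 1) and 2.0 at (1, 2) in a 3x4 matrix.
    indices = ms.Tensor([[0, 1], [1, 2]], dtype=ms.int32)
    values = ms.Tensor([1.0, 2.0], dtype=ms.float32)
    coo = ms.COOTensor(indices, values, (3, 4))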
Parameter
Parameter is a Tensor subclass; when Parameters are assigned as Cell attributes, they are automatically added to the Cell's list of parameters and will appear, for example, in the get_parameters() iterator.

Inherited from tuple, ParameterTuple is used to save multiple parameters.
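
A minimal sketch of how a Parameter registers itself on a Cell (the Net class and the name "w" here are illustrative):

    import numpy as np
    import mindspore as ms
    from mindspore import nn

    class Net(nn.Cell):
        def __init__(self):
            super().__init__()
            # Assigned as a Cell attribute, so it is registered automatically.
            self.w = ms.Parameter(ms.Tensor(np.ones(3), ms.float32), name="w")

        def construct(self, x):
            return x * self.w

    net = Net()
    print([p.name for p in net.get_parameters()])  # ['w']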
DataType
Data type for MindSpore.

Convert MindSpore dtype to numpy data type.

Convert MindSpore dtype to python data type.

Convert python type to MindSpore type.

Get the MindSpore data type, which corresponds to python type or variable.

An enum for quant data types, containing INT1 ~ INT16 and UINT1 ~ UINT16.

Numpy data type for MindSpore.
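
The dtype conversion helpers round-trip between MindSpore, numpy, and Python types; a small example, assuming the helpers are exported at the top level:

    import mindspore as ms

    print(ms.dtype_to_nptype(ms.float32))  # <class 'numpy.float32'>
    print(ms.dtype_to_pytype(ms.int64))    # <class 'int'>
    print(ms.pytype_to_dtype(bool))        # the MindSpore Bool dtype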
Context
Set device target and device id for running environment.

Enables or disables deterministic computing.

Set context for running environment.

Get context attribute value according to the input key.

Set auto parallel context; only data parallel is supported on CPU.

Get auto parallel context attribute value according to the key.

Reset auto parallel context attributes to the default values.

Parallel mode options.

Set parameter server training mode context.

Get parameter server training mode context attribute value according to the key.

Reset parameter server training mode context attributes to the default values.

Set parameters in the algorithm for parallel strategy searching.

Get the algorithm parameter config attributes.

Reset the algorithm parameter attributes.

Configure heterogeneous training detailed parameters to adjust the offload strategy.

Gets the offload configuration parameters.
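
A minimal context setup, assuming a CPU target (device_target may also be "GPU" or "Ascend" depending on the installation):

    import mindspore as ms

    ms.set_context(mode=ms.GRAPH_MODE, device_target="CPU")
    print(ms.get_context("mode"))           # 0, i.e. GRAPH_MODE
    print(ms.get_context("device_target"))  # CPU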
Seed
Set global seed.

Get global seed.
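
Setting the global seed makes random-dependent ops reproducible across runs:

    import mindspore as ms

    ms.set_seed(42)
    print(ms.get_seed())  # 42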
Random Number Generator
Get the state of the default generator.

A generator that manages the state of random numbers and provides seed and offset for random functions.

Return the initial seed of the default generator.

Set the default generator seed.

Seed the default generator with a random number.

Set the state of the default generator.
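
A sketch of the default-generator helpers, assuming they are exposed at the top level as in recent MindSpore releases (availability varies by version):

    import mindspore as ms

    ms.manual_seed(0)            # seed the default generator
    state = ms.get_rng_state()   # capture the generator state
    ms.set_rng_state(state)      # restore it later for reproducibility
    print(ms.initial_seed())     # 0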
Serialization
Get the status of asynchronous save checkpoint thread.

Build strategy of every parameter in network.

Check whether the checkpoint is valid.

Converts MindSpore checkpoint files into safetensors format and saves them to save_path.

Convert a MindIR model to a model in another format.

Export the MindSpore network into an offline model in the specified format.

Find available checkpoint file path from all backup checkpoint files of current rank.

Load MindIR.

Load checkpoint info from a specified file.

Load checkpoint info from a specified file asynchronously.

Load checkpoint into net for distributed prediction.

Load a protobuf file.

Load parameters into the network and return the list of parameters that were not loaded into the network.

Load checkpoint info from a specified file.

Merge parallel strategy between all pipeline stages in pipeline parallel mode.

Merge parameter slices into one parameter.

Obfuscate a model of MindIR format.

Parse the data file generated by mindspore.ops.Print.

Get the list of original distributed checkpoint rank indices for obtaining the target checkpoint of a rank_id during the distributed checkpoint conversion.

Build a rank list; the checkpoints of the ranks in this list have the same contents as the local rank that saves the group_info_file_name.

Converts safetensors files into MindSpore checkpoint format and saves them to save_path.

Save checkpoint to a specified file.

Save a protobuf file.

Transform distributed checkpoint from source sharding strategy to destination sharding strategy by rank for a network.

Transform distributed checkpoint from source sharding strategy to destination sharding strategy for a rank.

Merge multiple safetensors files into a unified safetensors file.
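
The most common save/load round trip, as a minimal sketch (assuming `net` is a constructed nn.Cell; the exact return shape of load_param_into_net varies across versions):

    import mindspore as ms

    # Assuming `net` is a constructed nn.Cell.
    ms.save_checkpoint(net, "net.ckpt")
    param_dict = ms.load_checkpoint("net.ckpt")
    not_loaded = ms.load_param_into_net(net, param_dict)  # parameters missing from the checkpoint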
Automatic Differentiation
A wrapper function to generate the gradient function for the input function.

A wrapper function to generate the function to calculate forward output and gradient for the input function.

When return_ids of grad() is set to True, use the returned gradients to find the specific gradient according to a position index or parameter.

Compute Jacobian via forward mode, corresponding to forward-mode differentiation.

Compute Jacobian via reverse mode, corresponding to reverse-mode differentiation.

Compute the Jacobian-vector product of the given network.

Compute the vector-Jacobian product of the given network.
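
A minimal differentiation sketch using grad and value_and_grad on a scalar function:

    import mindspore as ms

    def f(x):
        return x ** 3

    x = ms.Tensor(2.0, ms.float32)
    print(ms.grad(f)(x))                # 12.0, i.e. 3 * x**2
    value, g = ms.value_and_grad(f)(x)  # value = 8.0, g = 12.0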
Parallel Optimization
Automatic Vectorization
Vectorizing map (vmap) is a kind of higher-order function that maps fn over the parameter axes.
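
A small vmap sketch, mapping a per-sample dot product over a leading batch axis:

    import mindspore as ms

    def dot(a, b):
        return (a * b).sum()

    # Vectorize `dot` over axis 0 of both inputs.
    batched_dot = ms.vmap(dot, in_axes=0, out_axes=0)
    a = ms.Tensor([[1.0, 2.0], [3.0, 4.0]])
    b = ms.Tensor([[1.0, 1.0], [2.0, 2.0]])
    print(batched_dot(a, b))  # [ 3. 14.]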
Parallel
Parallel layout describes the detailed sharding information.

Broadcast parameter to other rank in data parallel dimension.

This function is used to reduce memory: when running a block, rather than storing the intermediate activations computed in the forward pass, we recompute them in the backward pass.

Specify the tensor by the given layout.

Define the input and output layouts of this cell; the parallel strategies of the remaining ops will be generated by sharding propagation.

Synchronize pipeline parallel stage shared parameters.
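
A sketch of describing sharding with a Layout, assuming a MindSpore version that exposes Layout at the top level (the axis names "dp" and "mp" are illustrative):

    from mindspore import Layout

    # A 2x2 device matrix whose axes are aliased "dp" and "mp".
    layout = Layout((2, 2), ("dp", "mp"))
    # Shard a 2-D tensor: rows across "dp", columns across "mp".
    tensor_layout = layout("dp", "mp")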
JIT
JIT config for compilation.

Create a callable MindSpore graph from a Python function.

Class decorator for user-defined classes.

Class decorator for user-defined classes.

Create a callable MindSpore graph from a Python function.

Recycle memory used by MindSpore.

Make a constant value mutable.

Used to calculate constants in the graph compiling process and improve compile performance in GRAPH_MODE.

Make the cell reusable.

Make the function reusable.

Specify the recursion depth limit of function calls before compiling the graph.
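
A minimal jit sketch, compiling a Python function into a MindSpore graph:

    import mindspore as ms

    @ms.jit
    def mul(x, y):
        return x * y

    out = mul(ms.Tensor(2.0), ms.Tensor(3.0))  # executed as a compiled graph
    print(out)  # 6.0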
Tool
Dataset Helper
DatasetHelper is a class to process the MindData dataset and provide information about the dataset.

Symbol is a data structure to indicate symbolic shape information.

Connect the network with the dataset in dataset_helper.

A wrapper function to generate a data sinking function for the input function.
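
A minimal iteration sketch, assuming `dataset` is a mindspore.dataset pipeline and `net` is a constructed nn.Cell:

    import mindspore as ms

    # Assuming `dataset` and `net` are already built.
    helper = ms.DatasetHelper(dataset, dataset_sink_mode=False)
    for inputs in helper:
        out = net(*inputs)  # each item is a tuple of Tensors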
Debugging and Tuning
This class enables the profiling of MindSpore neural networks.

Mstx class provides profiling tools for marking and tracing on NPU.

This class enables the dynamic profile monitoring of MindSpore neural networks.

This class is used to get the actions of each step.

For each step in dynamic graph mode, call this method for online analysis.

SummaryCollector can help you to collect some common information, such as loss, learning rate, computational graph and so on.

SummaryLandscape can help you to collect loss landscape information.

SummaryRecord is used to record the summary data and lineage data.

Enable or disable dump for the target and its contents.
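
A minimal profiling sketch (the output_path value here is illustrative):

    import mindspore as ms

    profiler = ms.Profiler(output_path="./profiler_data")
    # ... run a few training or inference steps here ...
    profiler.analyse()  # write and parse the collected performance data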
Log
Get the logger level.

Get logger configurations.
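
Both helpers live in the mindspore.log module; a quick check:

    from mindspore import log as logger

    print(logger.get_level())       # e.g. '2' (WARNING)
    print(logger.get_log_config())  # the active logging configuration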
Installation Verification
Provide a convenient API to check whether the installation was successful.
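
The check is a one-liner:

    import mindspore as ms

    ms.run_check()  # prints the version and whether MindSpore works on this platform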
Security
Obfuscate the plaintext checkpoint files according to the obfuscation config.

Modify the model structure according to the obfuscation config and load the obfuscated checkpoint into the obfuscated network.