mindspore.export

mindspore.export(net, *inputs, file_name, file_format, **kwargs)

Export the MindSpore network into an offline model in the specified format.

Note

  1. When exporting to the AIR or ONNX format, the size of a single tensor cannot exceed 2GB.

  2. When file_name does not have a suffix, the system will automatically add one according to the file_format.

  3. Exporting functions decorated with mindspore.jit() to MINDIR format is supported; a minimal sketch follows this note.

  4. When exporting a function decorated with mindspore.jit(), the function should not involve class properties in calculations.

  5. The AIR format is deprecated and will be removed in a future version. Please use another format or use MindSpore Lite for offline inference.
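
A minimal sketch of note 3 (the function name, shapes, and output file name below are illustrative), exporting a function decorated with mindspore.jit() to MINDIR:

>>> import mindspore as ms
>>> import numpy as np
>>> from mindspore import Tensor
>>>
>>> @ms.jit
... def add_fn(x, y):
...     return x + y
>>> x = Tensor(np.ones([2, 2]).astype(np.float32))
>>> y = Tensor(np.ones([2, 2]).astype(np.float32))
>>> ms.export(add_fn, x, y, file_name='add_fn', file_format='MINDIR')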

Parameters
  • net (Union[Cell, function]) – MindSpore network.

  • inputs (Union[Tensor, Dataset, List, Tuple, Number, Bool]) – The inputs of the net; if the network has multiple inputs, pass them all. When its type is Dataset, it represents the preprocessing behavior of the net, and the data preprocessing operations will be serialized. In that case, you should manually adjust the batch size in the dataset script, which affects the batch size of the 'net' input. Currently only the "image" column can be parsed from the dataset (see the Dataset sketch at the end of the Examples).

  • file_name (str) – File name of the model to be exported.

  • file_format (str) –

    MindSpore currently supports the 'AIR', 'ONNX' and 'MINDIR' formats for the exported model.

    • AIR: Ascend Intermediate Representation. An intermediate representation format of Ascend model.

    • ONNX: Open Neural Network eXchange. An open format built to represent machine learning models.

    • MINDIR: MindSpore Native Intermediate Representation for Anf. An intermediate representation format for MindSpore models. MINDIR does not support operators that have a dictionary attribute.

  • kwargs (dict) –

    Configuration options dictionary.

    • enc_key (byte): Byte-type key used for encryption. The valid length is 16, 24, or 32 bytes.

    • enc_mode (Union[str, function]): Specifies the encryption mode; it takes effect only when enc_key is set.

      • For 'AIR' and 'ONNX' models, only customized encryption is supported.

      • For 'MINDIR', all options are supported: 'AES-GCM', 'AES-CBC', 'SM4-CBC', or customized encryption. Default: 'AES-GCM'.

      • For details on using customized encryption, please check the tutorial. A minimal encrypted-export sketch appears after this parameter list.

    • dataset (Dataset): Specifies the preprocessing method of the dataset, which is used to import the preprocessing of the dataset into MindIR.

    • obf_config (dict): Obfuscation configuration (a hedged sketch appears after this parameter list).

      • type (str): The type of obfuscation; currently only 'dynamic' is supported.

      • obf_ratio (float, str): The ratio of nodes in the original model that will be obfuscated. obf_ratio should be in the range (0, 1] or be one of ["small", "medium", "large"]; "small", "medium" and "large" correspond to 0.1, 0.3, and 0.6 respectively.

      • customized_func (function): A Python function used in customized-function mode, which controls the switch branches of the obfuscation structure. The output of customized_func should be boolean and constant (refer to 'my_func()' in the tutorials); that is, the function must return the same result for any input. Users can refer to opaque predicates. If customized_func is set, it should also be passed to the load() interface when loading the obfuscated model.

      • obf_random_seed (int): Obfuscation random seed, which should be in (0, 9223372036854775807]. Obfuscated models built with different random seeds have different structures. If obf_random_seed is set, it should also be passed to the mindspore.nn.GraphCell interface when loading the obfuscated model. Note that at least one of customized_func or obf_random_seed must be set; if both are set, obf_random_seed takes effect.

    • incremental (bool): Whether to export MindIR incrementally.

    • custom_func (function): A function for a user-defined export policy, used to customize the model during network export. Currently it is only supported for files in MindIR format. The function accepts a single input representing the proto object of the MindIR file and has no return value. When modifying a model, you must ensure the correctness of custom_func, otherwise model loading may fail or the model may work incorrectly. Default: None.
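
A minimal sketch of enc_key and enc_mode (the key bytes and file names below are illustrative, not a secure key; net and input_tensor are assumed to be built as in the Examples below):

>>> # Export an encrypted MindIR; enc_key must be 16, 24, or 32 bytes long.
>>> key = b'0123456789ABCDEF'
>>> ms.export(net, input_tensor, file_name='lenet_enc', file_format='MINDIR',
...           enc_key=key, enc_mode='AES-GCM')
>>> # Load it back by passing the same key and mode to mindspore.load().
>>> graph = ms.load('lenet_enc.mindir', dec_key=key, dec_mode='AES-GCM')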
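
A hedged sketch of obf_config using the obf_random_seed mode (the ratio, seed, and file name are illustrative; net and input_tensor as in the Examples below):

>>> # Export a dynamically obfuscated MindIR.
>>> obf_config = {'type': 'dynamic', 'obf_ratio': 0.8, 'obf_random_seed': 3423}
>>> ms.export(net, input_tensor, file_name='obf_lenet', file_format='MINDIR',
...           obf_config=obf_config)
>>> # When loading, pass the same seed to mindspore.nn.GraphCell, e.g.
>>> # nn.GraphCell(ms.load('obf_lenet.mindir'), obf_random_seed=3423).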

Examples

>>> import mindspore as ms
>>> import numpy as np
>>> from mindspore import Tensor
>>>
>>> # Define the network structure of LeNet5. Refer to
>>> # https://gitee.com/mindspore/docs/blob/master/docs/mindspore/code/lenet.py
>>> net = LeNet5()
>>> input_tensor = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32))
>>> ms.export(net, input_tensor, file_name='lenet', file_format='MINDIR')
>>>
>>> # Export the model in MindIR format and modify the model info using custom_func.
>>> # custom_func only supports one input, representing the Proto object of the model,
>>> # and does not support a return value.
>>> def _custom_func(mindir_model):
...     mindir_model.producer_name = "test11111"
...     mindir_model.producer_version = "11.0"
...     mindir_model.user_info["version"] = "11.0"
>>> ms.export(net, input_tensor, file_name="lenet", file_format='MINDIR', custom_func=_custom_func)
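>>>
>>> # A hedged sketch of passing a Dataset as `inputs` so that the preprocessing of its
>>> # "image" column is serialized into the MindIR. The folder path and the transform
>>> # pipeline below are illustrative; adjust them to match the real data.
>>> import mindspore.dataset as ds
>>> import mindspore.dataset.vision as vision
>>> dataset = ds.ImageFolderDataset("/path/to/images")
>>> dataset = dataset.map(operations=[vision.Decode(), vision.Grayscale(1),
...                                   vision.Resize((32, 32)), vision.HWC2CHW()],
...                       input_columns=["image"])
>>> dataset = dataset.batch(1)
>>> ms.export(net, dataset, file_name='lenet_preprocess', file_format='MINDIR')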
Tutorial Examples: