mindspore.export
- mindspore.export(net, *inputs, file_name, file_format, **kwargs)[source]
Export the MindSpore network into an offline model in the specified format.
Note
When exporting to the AIR or ONNX format, the size of a single tensor cannot exceed 2 GB.
When file_name does not have a suffix, the system will automatically add one according to file_format.
Exporting functions decorated with ‘jit’ to MINDIR format is supported.
When exporting a function decorated with ‘jit’, the function should not use class properties in its calculations.
- Parameters
net (Union[Cell, function]) – MindSpore network.
inputs (Union[Tensor, Dataset, List, Tuple, Number, Bool]) – The inputs of the net; if the network has multiple inputs, pass them together. When the type is Dataset, it represents the preprocessing behavior of the net, and the data preprocessing operations will be serialized. In that case, you should manually adjust the batch size in the dataset script, which affects the batch size of the ‘net’ input. Currently, only the “image” column can be parsed from the dataset.
file_name (str) – File name of the model to be exported.
file_format (str) –
MindSpore currently supports ‘AIR’, ‘ONNX’ and ‘MINDIR’ format for exported model.
AIR: Ascend Intermediate Representation. An intermediate representation format of Ascend model.
ONNX: Open Neural Network eXchange. An open format built to represent machine learning models.
MINDIR: MindSpore Native Intermediate Representation for Anf. An intermediate representation format for MindSpore models.
kwargs (dict) –
Configuration options dictionary.
quant_mode (str): If the network is a quantization aware training network, set quant_mode to “QUANT”; otherwise, set it to “NONQUANT”.
mean (float): The mean of input data after preprocessing, used for quantizing the first layer of network. Default: 127.5.
std_dev (float): The standard deviation of input data after preprocessing, used for quantizing the first layer of the network. Default: 127.5.
enc_key (byte): Byte-type key used for encryption. The valid length is 16, 24, or 32.
enc_mode (Union[str, function]): Specifies the encryption mode, to take effect when enc_key is set.
For ‘AIR’ and ‘ONNX’ models, only customized encryption is supported.
For ‘MINDIR’, all options are supported: ‘AES-GCM’, ‘AES-CBC’, ‘SM4-CBC’ or customized encryption. Default: ‘AES-GCM’.
For details on using customized encryption, please check the tutorial.
dataset (Dataset): Specifies the preprocessing method of the dataset, which is used to import the preprocessing of the dataset into MindIR.
obf_config (dict): obfuscation config.
type (str): The type of obfuscation; currently only ‘dynamic’ is supported.
obf_ratio (float, str): The ratio of nodes in the original model that will be obfuscated. obf_ratio should be in the range (0, 1] or one of [“small”, “medium”, “large”].
customized_func (function): A Python function used for customized-function mode, which is used to control the switch branch of the obfuscation structure. The output of customized_func should be boolean, and the function must return a constant result for any input. Users can refer to opaque predicates. If customized_func is set, it should be passed to the load() interface when loading the obfuscated model.
obf_password (int): A password used for password mode, which should be in (0, 9223372036854775807]. If obf_password is set, it should be passed to the nn.GraphCell() interface when loading the obfuscated model. Note that at least one of ‘customized_func’ or ‘obf_password’ must be set; if both are set, ‘obf_password’ mode is applied.
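As a sketch of the customized-function mode described above (the function name, its two-argument signature, and the obf_ratio value are illustrative assumptions, not part of the documented API), a customized_func can be an opaque predicate: a boolean function whose result is constant for every input.

```python
# Illustrative opaque predicate for customized-function mode.
# x * (x + 1) is the product of two consecutive integers, so it is
# always even; the predicate therefore returns True for any input.
def opaque_predicate(x, y):
    return (x * (x + 1)) % 2 == 0

# An obf_config dict assembled from the keys described above
# (obf_ratio of 0.8 is an arbitrary illustrative choice):
obf_config = {
    'type': 'dynamic',
    'obf_ratio': 0.8,
    'customized_func': opaque_predicate,
}
```

If this config were passed to export(), the same opaque_predicate would also need to be supplied to load() when loading the obfuscated model, as noted above.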
Examples
>>> import mindspore as ms
>>> import numpy as np
>>> from mindspore import Tensor
>>>
>>> net = LeNet()
>>> input_tensor = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32))
>>> ms.export(net, input_tensor, file_name='lenet', file_format='MINDIR')
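The export call in the example can be combined with the encryption options described above. A minimal sketch, using only the standard library, of preparing an enc_key of valid length; the export call itself is shown as a comment, since it requires a built network and a MindSpore installation:

```python
import secrets

# enc_key must be a byte string of length 16, 24, or 32
# (the valid lengths listed in the parameter description).
enc_key = secrets.token_bytes(32)
assert len(enc_key) in (16, 24, 32)

# The key would then be passed alongside enc_mode, e.g.:
# ms.export(net, input_tensor, file_name='lenet', file_format='MINDIR',
#           enc_key=enc_key, enc_mode='AES-GCM')
```

The same key must be kept and supplied when decrypting and loading the exported model.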