mindspore.export
- mindspore.export(net, *inputs, file_name, file_format='AIR', **kwargs)[source]
Export the mindspore network into an offline model in the specified format.
Note
When exporting to the AIR or ONNX format, the size of a single tensor cannot exceed 2GB.
When file_name does not have a suffix, the system will automatically add one according to the file_format.
- Parameters
net (Cell) – MindSpore network.
inputs (Tensor) – Inputs of the net. If the network has multiple inputs, pass them as a tuple of Tensor.
file_name (str) – File name of the model to be exported.
file_format (str) –
MindSpore currently supports the ‘AIR’, ‘ONNX’ and ‘MINDIR’ formats for the exported model.
AIR: Ascend Intermediate Representation. An intermediate representation format of Ascend model.
ONNX: Open Neural Network eXchange. An open format built to represent machine learning models.
MINDIR: MindSpore Native Intermediate Representation for Anf. An intermediate representation format for MindSpore models.
kwargs (dict) –
Configuration options dictionary.
quant_mode (str): If the network is a quantization-aware training network, quant_mode should be set to “QUANT”; otherwise, it should be set to “NONQUANT”.
mean (float): The mean of input data after preprocessing, used for quantizing the first layer of network. Default: 127.5.
std_dev (float): The standard deviation of input data after preprocessing, used for quantizing the first layer of the network. Default: 127.5.
enc_key (byte): Byte type key used for encryption. The valid length is 16, 24, or 32.
enc_mode (str): Specifies the encryption mode; takes effect when enc_key is set. Options: ‘AES-GCM’ | ‘AES-CBC’. Default: ‘AES-GCM’.
dataset (Dataset): Specifies the preprocessing methods of the network.
Examples
>>> import numpy as np
>>> from mindspore import export, Tensor
>>>
>>> # LeNet is assumed to be a user-defined network inheriting from Cell.
>>> net = LeNet()
>>> input_tensor = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32))
>>> export(net, input_tensor, file_name='lenet', file_format='MINDIR')
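The kwargs described above are passed alongside the positional arguments. The following is a minimal sketch of exporting an encrypted MINDIR model; it assumes LeNet is the same user-defined network as above and uses a placeholder 16-byte key, which should be replaced with a securely generated key of length 16, 24, or 32.
>>> import numpy as np
>>> from mindspore import export, Tensor
>>>
>>> net = LeNet()
>>> input_tensor = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32))
>>> # Placeholder 16-byte key, for illustration only.
>>> key = b'0123456789ABCDEF'
>>> export(net, input_tensor, file_name='lenet_enc', file_format='MINDIR',
...        enc_key=key, enc_mode='AES-GCM')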