mindspore_lite.Converter

class mindspore_lite.Converter(fmk_type, model_file, output_file, weight_file='', config_file='', weight_fp16=False, input_shape=None, input_format=Format.NHWC, input_data_type=DataType.FLOAT32, output_data_type=DataType.FLOAT32, export_mindir=ModelType.MINDIR_LITE, decrypt_key='', decrypt_mode='AES-GCM', enable_encryption=False, encrypt_key='', infer=False, train_model=False, no_fusion=False)[source]

Converter is used to convert third-party models.

Note

If a parameter's default value is None, it means the parameter is not set.

Parameters
  • fmk_type (FmkType) – Input model framework type. Options: FmkType.TF | FmkType.CAFFE | FmkType.ONNX | FmkType.MINDIR | FmkType.TFLITE | FmkType.PYTORCH.

  • model_file (str) – Path of the input model. e.g. “/home/user/model.prototxt”. Options: TF: “*.pb” | CAFFE: “*.prototxt” | ONNX: “*.onnx” | MINDIR: “*.mindir” | TFLITE: “*.tflite” | PYTORCH: “*.pt or *.pth”.

  • output_file (str) – Path of the output model. The suffix .ms is appended automatically. e.g. “/home/user/model.prototxt” will generate a model named model.prototxt.ms in /home/user/.

  • weight_file (str, optional) – Input model weight file. Required only when fmk_type is FmkType.CAFFE. e.g. “/home/user/model.caffemodel”. Default: “”.

  • config_file (str, optional) – Path of the configuration file, used for post-training quantization, offline splitting of operators for parallel execution, disabling the operator fusion ability and setting the plugin .so path. e.g. “/home/user/model.cfg”. Default: “”.

  • weight_fp16 (bool, optional) – Serialize constant tensors in Float16 data type; only effective for constant tensors whose data type is Float32. Default: False.

  • input_shape (dict{str, list[int]}, optional) – Set the dimension of the model input, the order of input dimensions is consistent with the original model. For some models, the model structure can be further optimized, but the transformed model may lose the characteristics of dynamic shape. e.g. {“inTensor1”: [1, 32, 32, 32], “inTensor2”: [1, 1, 32, 32]}. Default: {}.

  • input_format (Format, optional) – Assign the input format of the exported model. Only valid for 4-dimensional inputs. Options: Format.NHWC | Format.NCHW. Default: Format.NHWC.

  • input_data_type (DataType, optional) – Data type of input tensors. The default type is the same as the type defined in the model. Default: DataType.FLOAT32.

  • output_data_type (DataType, optional) – Data type of output tensors. The default type is the same as the type defined in the model. Default: DataType.FLOAT32.

  • export_mindir (ModelType, optional) – Which model type to export. Default: ModelType.MINDIR_LITE.

  • decrypt_key (str, optional) – The key used to decrypt the file, expressed in hexadecimal characters. Only valid when fmk_type is FmkType.MINDIR. Default: “”.

  • decrypt_mode (str, optional) – Decryption method for the MindIR file. Only valid when decrypt_key is set. Options: “AES-GCM” | “AES-CBC”. Default: “AES-GCM”.

  • enable_encryption (bool, optional) – Whether to export the encryption model. Default: False.

  • encrypt_key (str, optional) – The key used to encrypt the file, expressed in hexadecimal characters. Only supported when decrypt_mode is “AES-GCM” and the key length is 16. Default: “”.

  • infer (bool, optional) – Whether to do pre-inference after conversion. Default: False.

  • train_model (bool, optional) – Whether the model will be trained on device. Default: False.

  • no_fusion (bool, optional) – Whether to avoid fusion optimization; fusion optimization is allowed by default. Default: False.

Raises
  • TypeError – fmk_type is not a FmkType.

  • TypeError – model_file is not a str.

  • TypeError – output_file is not a str.

  • TypeError – weight_file is not a str.

  • TypeError – config_file is not a str.

  • TypeError – weight_fp16 is not a bool.

  • TypeError – input_shape is neither a dict nor None.

  • TypeError – input_shape is a dict, but the keys are not str.

  • TypeError – input_shape is a dict, the keys are str, but the values are not list.

  • TypeError – input_shape is a dict, the keys are str, the values are list, but the value's elements are not int.

  • TypeError – input_format is not a Format.

  • TypeError – input_data_type is not a DataType.

  • TypeError – output_data_type is not a DataType.

  • TypeError – export_mindir is not a ModelType.

  • TypeError – decrypt_key is not a str.

  • TypeError – decrypt_mode is not a str.

  • TypeError – enable_encryption is not a bool.

  • TypeError – encrypt_key is not a str.

  • TypeError – infer is not a bool.

  • TypeError – train_model is not a bool.

  • TypeError – no_fusion is not a bool.

  • ValueError – input_format is neither Format.NCHW nor Format.NHWC when it is a Format.

  • ValueError – decrypt_mode is neither “AES-GCM” nor “AES-CBC” when it is a str.

  • RuntimeError – model_file does not exist.

  • RuntimeError – weight_file is not “”, but weight_file does not exist.

  • RuntimeError – config_file is not “”, but config_file does not exist.

Examples

>>> import mindspore_lite as mslite
>>> converter = mslite.Converter(mslite.FmkType.TFLITE, "./mobilenetv2/mobilenet_v2_1.0_224.tflite",
...                              "mobilenet_v2_1.0_224.tflite")
>>> print(converter)
config_file: ,
config_info: {},
weight_fp16: False,
input_shape: {},
input_format: Format.NHWC,
input_data_type: DataType.FLOAT32,
output_data_type: DataType.FLOAT32,
export_mindir: ModelType.MINDIR_LITE,
decrypt_key: ,
decrypt_mode: AES-GCM,
enable_encryption: False,
encrypt_key: ,
infer: False,
train_model: False,
no_fusion: False.
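
A further, hedged example: a minimal sketch of setting more of the optional parameters, assuming a hypothetical Caffe model at "./lenet/lenet.prototxt" with weights in "./lenet/lenet.caffemodel" and an input tensor named "data"; adjust the paths, tensor name and shape to your own model.

>>> import mindspore_lite as mslite
>>> # Hypothetical Caffe model; CAFFE additionally requires weight_file.
>>> converter = mslite.Converter(mslite.FmkType.CAFFE, "./lenet/lenet.prototxt", "lenet",
...                              weight_file="./lenet/lenet.caffemodel",
...                              weight_fp16=True,
...                              input_format=mslite.Format.NCHW,
...                              input_shape={"data": [1, 3, 32, 32]})
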
converter()[source]

Perform conversion, and convert the third-party model to the MindSpore model.

Raises

RuntimeError – converting the model failed.

Examples

>>> import mindspore_lite as mslite
>>> converter = mslite.Converter(mslite.FmkType.TFLITE, "./mobilenetv2/mobilenet_v2_1.0_224.tflite",
...                              "mobilenet_v2_1.0_224.tflite")
>>> converter.converter()
CONVERT RESULT SUCCESS:0
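
To export a MindIR model instead of the default MindSpore Lite model, export_mindir can be set accordingly. A minimal sketch, reusing the model path from the example above:

>>> import mindspore_lite as mslite
>>> # Export ModelType.MINDIR instead of the default ModelType.MINDIR_LITE.
>>> converter = mslite.Converter(mslite.FmkType.TFLITE, "./mobilenetv2/mobilenet_v2_1.0_224.tflite",
...                              "mobilenet_v2_1.0_224", export_mindir=mslite.ModelType.MINDIR)
>>> converter.converter()
CONVERT RESULT SUCCESS:0
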
get_config_info()[source]

Get config info of the converter. It is used together with the set_config_info method for online conversion. Please use set_config_info before calling get_config_info.

Returns

dict{str, dict{str, str}}, the config info which has been set in converter.

Examples

>>> import mindspore_lite as mslite
>>> converter = mslite.Converter(mslite.FmkType.TFLITE, "./mobilenetv2/mobilenet_v2_1.0_224.tflite",
...                              "mobilenet_v2_1.0_224.tflite")
>>> section = "common_quant_param"
>>> config_info_in = {"quant_type":"WEIGHT_QUANT"}
>>> converter.set_config_info(section, config_info_in)
>>> config_info_out = converter.get_config_info()
>>> print(config_info_out)
{'common_quant_param': {'quant_type': 'WEIGHT_QUANT'}}
set_config_info(section, config_info)[source]

Set config info for the converter. It is used together with the get_config_info method for online conversion.

Parameters
  • section (str) –

    The category of the configuration parameter. Set the individual parameters of the config file together with config_info, e.g. for section = “common_quant_param”, config_info = {“quant_type”: ”WEIGHT_QUANT”}. For the configuration parameters related to post-training quantization, please refer to quantization. For the configuration parameters related to extension, please refer to extension. Supported sections are listed below.

    • “common_quant_param”: Common quantization parameter. One of the configuration parameters for quantization.

    • “mixed_bit_weight_quant_param”: Mixed bit weight quantization parameter. One of the configuration parameters for quantization.

    • “full_quant_param”: Full quantization parameter. One of the configuration parameters for quantization.

    • “data_preprocess_param”: Data preprocess parameter. One of the configuration parameters for quantization.

    • “registry”: Extension configuration parameter. One of the configuration parameters for extension.

  • config_info (dict{str, str}) –

    List of configuration parameters. Set the individual parameters of the config file together with section, e.g. for section = “common_quant_param”, config_info = {“quant_type”: ”WEIGHT_QUANT”}. For the configuration parameters related to post-training quantization, please refer to quantization. For the configuration parameters related to extension, please refer to extension.

Raises
  • TypeError – section is not a str.

  • TypeError – config_info is not a dict.

  • TypeError – config_info is a dict, but the keys are not str.

  • TypeError – config_info is a dict, the keys are str, but the values are not str.

Examples

>>> import mindspore_lite as mslite
>>> converter = mslite.Converter(mslite.FmkType.TFLITE, "./mobilenetv2/mobilenet_v2_1.0_224.tflite",
...                              "mobilenet_v2_1.0_224.tflite")
>>> section = "common_quant_param"
>>> config_info = {"quant_type":"WEIGHT_QUANT"}
>>> converter.set_config_info(section, config_info)
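
The configuration set through set_config_info takes effect when converter() is called. A minimal end-to-end sketch, reusing the documented "common_quant_param" section and the model path from the examples above:

>>> import mindspore_lite as mslite
>>> converter = mslite.Converter(mslite.FmkType.TFLITE, "./mobilenetv2/mobilenet_v2_1.0_224.tflite",
...                              "mobilenet_v2_1.0_224.tflite")
>>> # Enable weight quantization before performing the conversion.
>>> converter.set_config_info("common_quant_param", {"quant_type": "WEIGHT_QUANT"})
>>> converter.converter()
CONVERT RESULT SUCCESS:0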