Class Serialization

Class Documentation

class Serialization

The Serialization class collects the methods for loading and saving model files.

Public Static Functions

static inline Status Load(const void *model_data, size_t data_size, ModelType model_type, Graph *graph, const Key &dec_key = {}, const std::string &dec_mode = kDecModeAesGcm)

Loads a model file from a memory buffer.

Parameters
  • model_data[in] A buffer containing the model file data.

  • data_size[in] The size of the buffer.

  • model_type[in] The type of the model file; options are ModelType::kMindIR and ModelType::kOM.

  • graph[out] Output parameter: the object that holds the graph data.

  • dec_key[in] The decryption key; the key length must be 16, 24, or 32 bytes. Not supported by MindSpore Lite.

  • dec_mode[in] The decryption mode; options are AES-GCM and AES-CBC. Not supported by MindSpore Lite.

Returns

Status.
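
A minimal usage sketch, not taken from the official samples: it reads a MindIR file into a memory buffer and passes that buffer to Load. The header paths and the file name model.mindir are assumptions.

#include <fstream>
#include <iostream>
#include <vector>
#include "include/api/graph.h"
#include "include/api/serialization.h"

int main() {
  // Read the whole model file into a memory buffer (the file name is a placeholder).
  std::ifstream ifs("model.mindir", std::ios::binary | std::ios::ate);
  size_t size = static_cast<size_t>(ifs.tellg());
  std::vector<char> buffer(size);
  ifs.seekg(0);
  ifs.read(buffer.data(), size);

  // Deserialize the buffer into a Graph object.
  mindspore::Graph graph;
  auto status = mindspore::Serialization::Load(buffer.data(), buffer.size(),
                                               mindspore::ModelType::kMindIR, &graph);
  if (status != mindspore::kSuccess) {
    std::cout << "Load model from buffer failed." << std::endl;
    return 1;
  }
  return 0;
}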

static inline Status Load(const std::string &file, ModelType model_type, Graph *graph, const Key &dec_key = {}, const std::string &dec_mode = kDecModeAesGcm)

Loads a model file from a file path.

Parameters
  • file[in] The path of the model file.

  • model_type[in] The type of the model file; options are ModelType::kMindIR and ModelType::kOM.

  • graph[out] Output parameter: the object that holds the graph data.

  • dec_key[in] The decryption key; the key length must be 16, 24, or 32 bytes. Not supported by MindSpore Lite.

  • dec_mode[in] The decryption mode; options are AES-GCM and AES-CBC. Not supported by MindSpore Lite.

Returns

Status.
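
A minimal sketch of the path-based overload, under the same header-path assumptions as above; the file name is a placeholder and the decryption arguments keep their defaults.

#include <iostream>
#include "include/api/graph.h"
#include "include/api/serialization.h"

int main() {
  // Load the model file directly by path.
  mindspore::Graph graph;
  auto status = mindspore::Serialization::Load("model.mindir",
                                               mindspore::ModelType::kMindIR, &graph);
  if (status != mindspore::kSuccess) {
    std::cout << "Load model failed." << std::endl;
    return 1;
  }
  return 0;
}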

static inline Status Load(const std::vector<std::string> &files, ModelType model_type, std::vector<Graph> *graphs, const Key &dec_key = {}, const std::string &dec_mode = kDecModeAesGcm)

Loads multiple models from multiple files. MindSpore Lite does not support this feature.

Parameters
  • files[in] The paths of the model files.

  • model_type[in] The type of the model files; options are ModelType::kMindIR and ModelType::kOM.

  • graphs[out] Output parameter: the vector that holds the loaded graph data.

  • dec_key[in] The decryption key; the key length must be 16, 24, or 32 bytes.

  • dec_mode[in] The decryption mode; options are AES-GCM and AES-CBC.

Returns

Status.
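
A hypothetical sketch of the multi-file overload (not available in MindSpore Lite); the helper name and the file names are placeholders.

#include <string>
#include <vector>
#include "include/api/graph.h"
#include "include/api/serialization.h"

// Hypothetical helper: loads several MindIR files in one call.
mindspore::Status LoadAll(std::vector<mindspore::Graph> *graphs) {
  std::vector<std::string> files = {"model_a.mindir", "model_b.mindir"};
  // On success, graphs holds the graph data loaded from the input files.
  return mindspore::Serialization::Load(files, mindspore::ModelType::kMindIR, graphs);
}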

static inline Status SetParameters(const std::map<std::string, Buffer> &parameters, Model *model)

Configures model parameters. MindSpore Lite does not support this feature.

Parameters
  • parameters[in] The parameter data, keyed by parameter name.

  • model[in] The model to configure.

Returns

Status.
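
A hypothetical sketch of SetParameters (not available in MindSpore Lite). The helper name, the parameter name "conv1.weight", the raw weight pointer, and the assumption that Buffer can be constructed from a pointer and a byte size are illustrative, not taken from the source.

#include <map>
#include <string>
#include "include/api/model.h"
#include "include/api/serialization.h"
#include "include/api/types.h"

// Hypothetical helper: overwrites one named parameter of an already-built model.
mindspore::Status UpdateWeight(mindspore::Model *model,
                               const void *weight_data, size_t weight_size) {
  std::map<std::string, mindspore::Buffer> parameters;
  // "conv1.weight" is a placeholder parameter name; Buffer is assumed to copy the raw bytes.
  parameters.emplace("conv1.weight", mindspore::Buffer(weight_data, weight_size));
  return mindspore::Serialization::SetParameters(parameters, model);
}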

static inline Status ExportModel(const Model &model, ModelType model_type, Buffer *model_data, QuantizationType quantization_type = kNoQuant, bool export_inference_only = true, const std::vector<std::string> &output_tensor_name = {})

Exports a training model to a memory buffer. MindSpore Lite does not support this feature.

Parameters
  • model[in] The model to export.

  • model_type[in] The model file type.

  • model_data[out] Output parameter: the buffer that receives the exported model.

  • quantization_type[in] The quantization type.

  • export_inference_only[in] Whether to export an inference-only model.

  • output_tensor_name[in] The names of the output tensors of the exported inference model; empty by default, in which case the complete inference model is exported.

Returns

Status.
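
A minimal sketch of exporting into a memory buffer (not available in MindSpore Lite); the helper name is a placeholder and the Model instance is assumed to have been built and trained beforehand.

#include "include/api/model.h"
#include "include/api/serialization.h"
#include "include/api/types.h"

// Hypothetical helper: serializes the inference part of a training model into a Buffer.
mindspore::Status ExportToBuffer(const mindspore::Model &model, mindspore::Buffer *model_data) {
  // kNoQuant keeps the weights unquantized; export_inference_only drops the training-only graph.
  return mindspore::Serialization::ExportModel(model, mindspore::ModelType::kMindIR,
                                               model_data, mindspore::kNoQuant,
                                               /*export_inference_only=*/true);
}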

static inline Status ExportModel(const Model &model, ModelType model_type, const std::string &model_file, QuantizationType quantization_type = kNoQuant, bool export_inference_only = true, std::vector<std::string> output_tensor_name = {})

Exports a training model to a file.

Parameters
  • model[in] The model to export.

  • model_type[in] The model file type.

  • model_file[in] The path of the exported model file.

  • quantization_type[in] The quantization type.

  • export_inference_only[in] Whether to export an inference-only model.

  • output_tensor_name[in] The names of the output tensors of the exported inference model; empty by default, in which case the complete inference model is exported.

Returns

Status.
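
A minimal sketch of the file-based export, under the same assumptions as above; the helper name and the output path are placeholders.

#include <string>
#include "include/api/model.h"
#include "include/api/serialization.h"

// Hypothetical helper: writes the inference graph of a training model to disk.
mindspore::Status ExportToFile(const mindspore::Model &model, const std::string &path) {
  // Leaving output_tensor_name empty exports the complete inference model.
  return mindspore::Serialization::ExportModel(model, mindspore::ModelType::kMindIR, path,
                                               mindspore::kNoQuant,
                                               /*export_inference_only=*/true, {});
}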