mindspore_lite.Model

class mindspore_lite.Model[source]

The Model class is used to define a MindSpore model, facilitating computational graph management.

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> print(model)
model_path: .
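
A typical workflow builds the model from a file, fills its input tensors, and runs inference. The sketch below strings together the methods documented on this page; the "mobilenetv2.ms" file name and the all-ones input data are assumptions for illustration, so adapt them to your own model.

>>> import mindspore_lite as mslite
>>> import numpy as np
>>> # build the model for CPU execution ("mobilenetv2.ms" is a placeholder path)
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model = mslite.Model()
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> # fill every input with data matching its shape, then run inference
>>> inputs = model.get_inputs()
>>> outputs = model.get_outputs()
>>> for input in inputs:
...     input.set_data_from_numpy(np.ones(input.get_shape(), dtype=np.float32))
...
>>> model.predict(inputs, outputs)
>>> results = [output.get_data_to_numpy() for output in outputs]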
build_from_file(model_path, model_type, context)[source]

Load and build a model from a file.

Parameters
  • model_path (str) – Define the model path, including the model file name.

  • model_type (ModelType) –

    Define the type of the model file. Options: ModelType.MINDIR | ModelType.MINDIR_LITE.

    • ModelType.MINDIR: An intermediate representation of the MindSpore model. The recommended model file suffix is “.mindir”.

    • ModelType.MINDIR_LITE: An intermediate representation of the MindSpore Lite model. The recommended model file suffix is “.ms”.

  • context (Context) – Define the context used to store options during execution.

Raises

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> print(model)
model_path: mobilenetv2.ms.
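
The same call also covers models exported in MINDIR format; only the model type and the file suffix change. A minimal sketch, assuming a "mobilenetv2.mindir" file is available and that the installed MindSpore Lite package supports building MINDIR models:

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> # ".mindir" files pair with ModelType.MINDIR ("mobilenetv2.mindir" is a placeholder)
>>> model.build_from_file("mobilenetv2.mindir", mslite.ModelType.MINDIR, context)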
get_input_by_tensor_name(tensor_name)[source]

Obtains the input tensor of the model by name.

Parameters

tensor_name (str) – the name of the tensor.

Returns

Tensor, the input tensor matching the given tensor name.

Raises

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> input_tensor = model.get_input_by_tensor_name("graph_input-173")
>>> print(input_tensor)
tensor_name: graph_input-173,
data_type: DataType.FLOAT32,
shape: [1, 224, 224, 3],
format: Format.NHWC,
element_num: 150528,
data_size: 602112.
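
Tensor names such as "graph_input-173" are specific to the converted model. When the names are unknown, one option is to list them from get_inputs() first, as in this sketch that continues the example above:

>>> # print the name of every input tensor of the built model
>>> for input in model.get_inputs():
...     print(input.get_tensor_name())
...
graph_input-173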
get_inputs()[source]

Obtains all input tensors of the model.

Returns

list[Tensor], the input tensor list of the model.

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> inputs = model.get_inputs()
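
Before running predict, each returned input tensor must be filled with data of the matching shape and data type. A minimal sketch continuing the example above, assuming float32 inputs:

>>> import numpy as np
>>> for input in inputs:
...     # zero-filled data whose shape matches the input tensor
...     input.set_data_from_numpy(np.zeros(input.get_shape(), dtype=np.float32))
...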
get_output_by_tensor_name(tensor_name)[source]

Obtains the output tensor of the model by name.

Parameters

tensor_name (str) – the name of the tensor.

Returns

Tensor, the output tensor matching the given tensor name.

Raises

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> output_tensor = model.get_output_by_tensor_name("Softmax-65")
>>> print(output_tensor)
tensor_name: Softmax-65,
data_type: DataType.FLOAT32,
shape: [1, 1001],
format: Format.NHWC,
element_num: 1001,
data_size: 4004.
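
Once predict has been executed, the named output can be converted to a numpy array for post-processing. A sketch continuing the example above (it assumes the model inputs were filled and predict was already called):

>>> # read inference results from the named output tensor
>>> output_tensor = model.get_output_by_tensor_name("Softmax-65")
>>> data = output_tensor.get_data_to_numpy()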
get_outputs()[source]

Obtains all output tensors of the model.

Returns

list[Tensor], the output tensor list of the model.

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> outputs = model.get_outputs()
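
The returned list can be inspected to see what the model produces, for example by printing each output's name and shape. A small sketch continuing the example above:

>>> for output in outputs:
...     print(output.get_tensor_name(), output.get_shape())
...
Softmax-65 [1, 1001]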
predict(inputs, outputs)[source]

Perform inference on the model.

Parameters
  • inputs (list[Tensor]) – A list that includes all input tensors in order.

  • outputs (list[Tensor]) – A list into which the model outputs are filled in order.

Raises
  • TypeError – inputs is not a list.

  • TypeError – inputs is a list, but the elements are not Tensor.

  • TypeError – outputs is not a list.

  • TypeError – outputs is a list, but the elements are not Tensor.

  • RuntimeError – predict model failed.

Examples

>>> # 1. predict with input data read from a file
>>> import mindspore_lite as mslite
>>> import numpy as np
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> inputs = model.get_inputs()
>>> outputs = model.get_outputs()
>>> in_data = np.fromfile("input.bin", dtype=np.float32)
>>> inputs[0].set_data_from_numpy(in_data)
>>> model.predict(inputs, outputs)
>>> for output in outputs:
...     data = output.get_data_to_numpy()
...     print("outputs: ", data)
...
outputs:  [[1.02271215e-05 9.92699006e-06 1.69684317e-05 ... 6.69087376e-06
            2.16263197e-06 1.24009384e-04]]
>>> # 2. predict with input data from a numpy array
>>> import mindspore_lite as mslite
>>> import numpy as np
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> inputs = model.get_inputs()
>>> outputs = model.get_outputs()
>>> for input in inputs:
...     in_data = np.arange(1 * 224 * 224 * 3, dtype=np.float32).reshape((1, 224, 224, 3))
...     input.set_data_from_numpy(in_data)
...
>>> model.predict(inputs, outputs)
>>> for output in outputs:
...     data = output.get_data_to_numpy()
...     print("outputs: ", data)
...
outputs:  [[0.00035889 0.00065501 0.00052925 ... 0.00018388 0.00148316 0.00116824]]
>>> # 3. predict with new mslite tensors filled from a numpy array
>>> import mindspore_lite as mslite
>>> import numpy as np
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> inputs = model.get_inputs()
>>> outputs = model.get_outputs()
>>> input_tensors = []
>>> for input in inputs:
...     input_tensor = mslite.Tensor()
...     input_tensor.set_data_type(input.get_data_type())
...     input_tensor.set_shape(input.get_shape())
...     input_tensor.set_format(input.get_format())
...     input_tensor.set_tensor_name(input.get_tensor_name())
...     in_data = np.arange(1 * 224 * 224 * 3, dtype=np.float32).reshape((1, 224, 224, 3))
...     input_tensor.set_data_from_numpy(in_data)
...     input_tensors.append(input_tensor)
...
>>> model.predict(input_tensors, outputs)
>>> for output in outputs:
...     data = output.get_data_to_numpy()
...     print("outputs: ", data)
...
outputs:  [[0.00035889 0.00065501 0.00052925 ... 0.00018388 0.00148316 0.00116824]]
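
For a classification model such as the mobilenetv2 model used above, the first output is a score vector, so a common post-processing step is to take the index of the largest score. The numpy sketch below shows one way to do that; treating the index as a class label is an assumption about the model, not part of the predict API.

>>> # index of the highest-scoring class in the first output (assumes a classifier)
>>> score = outputs[0].get_data_to_numpy()
>>> print("top-1 class index: ", np.argmax(score))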
resize(inputs, dims)[source]

Resizes the shapes of inputs.

Parameters
  • inputs (list[Tensor]) – A list that includes all input tensors in order.

  • dims (list[list[int]]) – A list that includes the new shapes of the inputs; it should be consistent with inputs.

Raises
  • TypeError – inputs is not a list.

  • TypeError – inputs is a list, but the elements are not Tensor.

  • TypeError – dims is not a list.

  • TypeError – dims is a list, but the elements are not list.

  • TypeError – dims is a list, the elements are list, but the element’s elements are not int.

  • ValueError – The size of inputs is not equal to the size of dims.

  • ValueError – The size of the elements of inputs is not equal to the size of the elements of dims.

Examples

>>> import mindspore_lite as mslite
>>> model = mslite.Model()
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> model.build_from_file("mobilenetv2.ms", mslite.ModelType.MINDIR_LITE, context)
>>> inputs = model.get_inputs()
>>> print("Before resize, the first input shape: ", inputs[0].get_shape())
Before resize, the first input shape: [1, 224, 224, 3]
>>> model.resize(inputs, [[1, 112, 112, 3]])
>>> print("After resize, the first input shape: ", inputs[0].get_shape())
After resize, the first input shape: [1, 112, 112, 3]
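
After a resize, the input tensors expect data of the new shape, so any arrays passed to set_data_from_numpy must be regenerated accordingly. A minimal sketch continuing the example above, with all-ones data as a stand-in input:

>>> import numpy as np
>>> outputs = model.get_outputs()
>>> # the input data must now match the resized shape [1, 112, 112, 3]
>>> inputs[0].set_data_from_numpy(np.ones((1, 112, 112, 3), dtype=np.float32))
>>> model.predict(inputs, outputs)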