mindspore_lite.ModelParallelRunner
- class mindspore_lite.ModelParallelRunner[source]
The ModelParallelRunner class is used to define a MindSpore ModelParallelRunner, which manages a pool of Model workers so that inference requests can be served in parallel.
Examples
>>> # only for serving inference
>>> import mindspore_lite as mslite
>>> model_parallel_runner = mslite.ModelParallelRunner()
>>> print(model_parallel_runner)
model_path: .
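For orientation, the following is a condensed preview of the full workflow. It is only a sketch that combines the init, get_inputs, get_outputs and predict examples documented under the individual methods below; the file names "mobilenetv2.ms" and "input.bin" are placeholders.
>>> import numpy as np
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> runner_config = mslite.RunnerConfig(context=context, workers_num=4)
>>> model_parallel_runner.init(model_path="mobilenetv2.ms", runner_config=runner_config)
>>> inputs = model_parallel_runner.get_inputs()
>>> inputs[0].set_data_from_numpy(np.fromfile("input.bin", dtype=np.float32))
>>> outputs = model_parallel_runner.get_outputs()
>>> model_parallel_runner.predict(inputs, outputs)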
- get_inputs()[source]
Obtains all input tensors of the model.
- Returns
list[Tensor], the input Tensors of the model.
Examples
>>> import mindspore_lite as mslite
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> runner_config = mslite.RunnerConfig(context=context, workers_num=4)
>>> model_parallel_runner = mslite.ModelParallelRunner()
>>> model_parallel_runner.init(model_path="mobilenetv2.ms", runner_config=runner_config)
>>> inputs = model_parallel_runner.get_inputs()
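The returned Tensors are filled in place before prediction. A minimal sketch, assuming a single float32 input whose flattened size matches the placeholder file "input.bin" (set_data_from_numpy is the same call used in the predict example below):
>>> import numpy as np
>>> in_data = np.fromfile("input.bin", dtype=np.float32)
>>> inputs[0].set_data_from_numpy(in_data)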
- get_outputs()[source]
Obtains all output tensors of the model.
- Returns
list[Tensor], the output Tensors of the model.
Examples
>>> import mindspore_lite as mslite
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> runner_config = mslite.RunnerConfig(context=context, workers_num=4)
>>> model_parallel_runner = mslite.ModelParallelRunner()
>>> model_parallel_runner.init(model_path="mobilenetv2.ms", runner_config=runner_config)
>>> outputs = model_parallel_runner.get_outputs()
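The returned Tensors act as output containers: predict fills them, and get_data_to_numpy reads the results back, as shown in the predict example below. A minimal sketch, assuming the inputs were obtained and filled as in the get_inputs example above:
>>> model_parallel_runner.predict(inputs, outputs)
>>> data = outputs[0].get_data_to_numpy()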
- init(model_path, runner_config=None)[source]
Build a model parallel runner from a model path so that it can run on a device.
- Parameters
model_path (str) – Define the model path.
runner_config (RunnerConfig, optional) – Configuration used when initializing the model pool, such as the device context and the number of workers. Default: None.
- Raises
TypeError – model_path is not a str.
TypeError – runner_config is neither a RunnerConfig nor None.
RuntimeError – model_path does not exist.
RuntimeError – ModelParallelRunner’s init failed.
Examples
>>> import mindspore_lite as mslite
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> runner_config = mslite.RunnerConfig(context=context, workers_num=4)
>>> model_parallel_runner = mslite.ModelParallelRunner()
>>> model_parallel_runner.init(model_path="mobilenetv2.ms", runner_config=runner_config)
>>> print(model_parallel_runner)
model_path: mobilenetv2.ms.
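Because runner_config defaults to None, init can also be called with only the model path. A minimal sketch (a path that does not exist would raise the RuntimeError listed above):
>>> runner = mslite.ModelParallelRunner()
>>> runner.init(model_path="mobilenetv2.ms")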
- predict(inputs, outputs)[source]
Run inference with the ModelParallelRunner.
- Parameters
inputs (list[Tensor]) – A list that includes all input Tensors in order.
outputs (list[Tensor]) – A list of Tensors into which the prediction results are filled in order.
- Raises
TypeError – inputs is not a list.
TypeError – inputs is a list, but the elements are not Tensor.
TypeError – outputs is not a list.
TypeError – outputs is a list, but the elements are not Tensor.
RuntimeError – predict model failed.
Examples
>>> import mindspore_lite as mslite
>>> import numpy as np
>>> context = mslite.Context()
>>> context.append_device_info(mslite.CPUDeviceInfo())
>>> runner_config = mslite.RunnerConfig(context=context, workers_num=4)
>>> model_parallel_runner = mslite.ModelParallelRunner()
>>> model_parallel_runner.init(model_path="mobilenetv2.ms", runner_config=runner_config)
>>> inputs = model_parallel_runner.get_inputs()
>>> in_data = np.fromfile("input.bin", dtype=np.float32)
>>> inputs[0].set_data_from_numpy(in_data)
>>> outputs = model_parallel_runner.get_outputs()
>>> model_parallel_runner.predict(inputs, outputs)
>>> for output in outputs:
...     data = output.get_data_to_numpy()
...     print("outputs: ", data)
...
outputs:  [[1.02271215e-05 9.92699006e-06 1.69684317e-05 ... 6.69087376e-06 2.16263197e-06 1.24009384e-04]]
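Because the runner dispatches requests to the worker pool configured by workers_num in RunnerConfig, a serving process typically shares one runner across several request threads. The sketch below is not from the official examples; it reuses only the calls shown above and assumes the runner object may be shared across Python threads, with each thread issuing one predict request.
>>> import threading
>>> results = []
>>> def infer_once(runner):
...     # Each request fetches the input/output Tensors, fills the input, and
...     # submits one predict call; the runner dispatches it to a free worker.
...     req_inputs = runner.get_inputs()
...     req_inputs[0].set_data_from_numpy(np.fromfile("input.bin", dtype=np.float32))
...     req_outputs = runner.get_outputs()
...     runner.predict(req_inputs, req_outputs)
...     results.append(req_outputs[0].get_data_to_numpy())
...
>>> threads = [threading.Thread(target=infer_once, args=(model_parallel_runner,)) for _ in range(4)]
>>> for t in threads:
...     t.start()
...
>>> for t in threads:
...     t.join()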