ModelParallelRunner
import com.mindspore.config.RunnerConfig;
ModelParallelRunner defines MindSpore Lite concurrent inference.
Public Member Functions

| function |
| --- |
| `public long getModelParallelRunnerPtr()` |
| `public boolean init(String modelPath, RunnerConfig runnerConfig)` |
| `public boolean init(String modelPath)` |
| `public boolean predict(List<MSTensor> inputs, List<MSTensor> outputs)` |
| `public List<MSTensor> getInputs()` |
| `public List<MSTensor> getOutputs()` |
| `public void free()` |
getModelParallelRunnerPtr
public long getModelParallelRunnerPtr()
Get the underlying concurrent inference class pointer.
Returns
Low-level concurrent inference class pointer.
init
public boolean init(String modelPath, RunnerConfig runnerConfig)
Load the model from the given path, create one or more model instances, and compile them into a state that can run on the device.
Parameters
modelPath
: Path to the model file.

runnerConfig
: A RunnerConfig structure. Defines configuration parameters for the concurrent inference model.
Returns
Whether the initialization is successful.
public boolean init(String modelPath)
Load the model from the given path, create one or more model instances, and compile them into a state that can run on the device.
Parameters
modelPath
: Path to the model file.
Returns
Whether the initialization is successful.
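A minimal initialization sketch. It assumes the classes live in the `com.mindspore` / `com.mindspore.config` packages as shown in the import above, and that `RunnerConfig` exposes `init()` and `setWorkersNum(int)`; the model path `model.ms` is hypothetical.

```java
import com.mindspore.ModelParallelRunner;
import com.mindspore.config.RunnerConfig;

public class RunnerInitExample {
    public static void main(String[] args) {
        // Assumed RunnerConfig API: init() prepares the config,
        // setWorkersNum() sets the number of parallel workers.
        RunnerConfig config = new RunnerConfig();
        config.init();
        config.setWorkersNum(2);

        ModelParallelRunner runner = new ModelParallelRunner();
        // "model.ms" is a hypothetical model file path.
        if (!runner.init("model.ms", config)) {
            System.err.println("ModelParallelRunner init failed");
            return;
        }
        // ... run inference, then release the native memory.
        runner.free();
    }
}
```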
predict
public boolean predict(List<MSTensor> inputs, List<MSTensor> outputs)
Run inference on the model concurrently.
Parameters
inputs
: Model input tensors.

outputs
: Model output tensors, filled with the inference results.
Returns
Whether the inference is successful.
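A sketch of a single inference call on an already-initialized runner. The `setData(ByteBuffer)` method on `MSTensor` is an assumption here, as is the shape of the input data; `predict` populates the `outputs` list with result tensors.

```java
import com.mindspore.MSTensor;
import com.mindspore.ModelParallelRunner;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.ArrayList;
import java.util.List;

public class PredictExample {
    // `runner` is assumed to have been initialized via init() beforehand.
    static boolean runOnce(ModelParallelRunner runner, float[] data) {
        List<MSTensor> inputs = runner.getInputs();

        // Assumption: MSTensor accepts raw data via setData(ByteBuffer).
        // Pack the float array into a direct, native-order buffer.
        ByteBuffer buf = ByteBuffer.allocateDirect(data.length * Float.BYTES)
                .order(ByteOrder.nativeOrder());
        buf.asFloatBuffer().put(data);
        inputs.get(0).setData(buf);

        // predict() fills `outputs` with the result tensors.
        List<MSTensor> outputs = new ArrayList<>();
        return runner.predict(inputs, outputs);
    }
}
```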
getInputs
public List<MSTensor> getInputs()
Get all input tensors of the model.
Returns
A list of input tensors for the model.
getOutputs
public List<MSTensor> getOutputs()
Get all output tensors of the model.
Returns
A list of output tensors for the model.
free
public void free()
Free concurrent inference class memory.
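Because `free()` releases native (off-heap) memory, it should run even when inference fails. A lifecycle sketch, assuming `MSTensor` exposes a `tensorName()` accessor (an assumption) and using a hypothetical model path:

```java
import com.mindspore.MSTensor;
import com.mindspore.ModelParallelRunner;
import java.util.ArrayList;
import java.util.List;

public class LifecycleExample {
    public static void main(String[] args) {
        ModelParallelRunner runner = new ModelParallelRunner();
        // "model.ms" is a hypothetical model file path.
        if (!runner.init("model.ms")) {
            return;
        }
        try {
            List<MSTensor> outputs = new ArrayList<>();
            if (runner.predict(runner.getInputs(), outputs)) {
                for (MSTensor out : outputs) {
                    // tensorName() is assumed here for illustration.
                    System.out.println("output tensor: " + out.tensorName());
                }
            }
        } finally {
            runner.free(); // always release the underlying native runner
        }
    }
}
```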