MindSpore Lite
Obtain MindSpore Lite
Downloading MindSpore Lite
Building MindSpore Lite
Quick Start
Experience C++ Simple Inference Demo
Experience Java Simple Inference Demo
Android Application Development Based on JNI Interface
Android Application Development Based on Java Interface
Implement Device Training Based on C++ Interface
Implement Device Training Based on Java Interface
Inference on Devices
Converting Models for Inference
Optimizing the Model (Quantization After Training)
Data Preprocessing
Executing Model Inference
Using C++ Interface to Perform Inference
Using Java Interface to Perform Inference
Perform Inference on Mini and Small Systems
Application-Specific Integrated Circuit (ASIC) Integration Instructions
Register Kernel
Using Delegate to Support Third-party AI Framework
Training on Devices
Creating MindSpore Lite Models
Executing Model Training
Other Tools
Benchmark Tool
Static Library Cropper Tool
Visualization Tool
Model Obfuscation Tool
References
Overall Architecture (Lite)
Lite Operator List
Codegen Operator List
Model List
Executing Model Inference
Using C++ Interface to Perform Inference
Using Java Interface to Perform Inference