MindSpore Lite

Obtain MindSpore Lite

  • Downloading MindSpore Lite
  • Building MindSpore Lite

Quick Start

  • Getting Started in One Hour
  • Experience C++ Simple Inference Demo
  • Experience C++ Simple Concurrent Inference Demo
  • Experience Java Simple Inference Demo
  • Experience Java Simple Concurrent Inference Demo
  • Experience C-Language Simple Inference Demo
  • Android Application Development Based on Java Interface
  • Implement Device Training Based on C++ Interface
  • Implement Device Training Based on Java Interface

Inference on Devices

  • Converting Models for Inference
  • Post Training Quantization
  • Data Preprocessing
  • Executing Model Inference
  • Performing Inference on MCU or Small Systems
  • Application Specific Integrated Circuit Integration Instructions

Training on Devices

  • Creating MindSpore Lite Models
  • Executing Model Training

Server Inference

  • Executing Server Inference
    • Using C++ Interface to Perform Parallel Inference
    • Using Java Interface to Perform Parallel Inference

Third-Party Hardware Integration

  • Custom Kernel
  • Using Delegate to Support Third-party AI Framework

Other Tools

  • Benchmark Tool
  • Static Library Cropper Tool
  • Visualization Tool
  • Model Obfuscation Tool

References

  • Overall Architecture (Lite)
  • Lite Operator List
  • Codegen Operator List
  • Model List
  • Troubleshooting
  • Log

Release Notes

  • Release Notes

Executing Server Inference

  • Using C++ Interface to Perform Parallel Inference
  • Using Java Interface to Perform Parallel Inference
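
As a brief orientation before following the two pages above, the sketch below shows the general shape of server-side parallel inference with the C++ ModelParallelRunner interface. It is a minimal sketch only: the header paths, the worker count, and the model path are illustrative assumptions, and the linked pages remain the authoritative reference.

    #include <cstring>
    #include <iostream>
    #include <memory>
    #include <string>
    #include <vector>

    #include "include/api/context.h"
    #include "include/api/model_parallel_runner.h"
    #include "include/api/status.h"
    #include "include/api/types.h"

    int main() {
      // Placeholder model path; replace it with a model produced by the converter tool.
      const std::string model_path = "./model/mobilenetv2.mindir";

      // Configure the execution context with a CPU device.
      auto context = std::make_shared<mindspore::Context>();
      auto &device_list = context->MutableDeviceInfo();
      device_list.push_back(std::make_shared<mindspore::CPUDeviceInfo>());

      // Runner configuration: a worker count of 2 is an arbitrary example value.
      auto runner_config = std::make_shared<mindspore::RunnerConfig>();
      runner_config->SetContext(context);
      runner_config->SetWorkersNum(2);

      // Build the parallel runner from the model file.
      mindspore::ModelParallelRunner runner;
      if (runner.Init(model_path, runner_config) != mindspore::kSuccess) {
        std::cerr << "ModelParallelRunner Init failed." << std::endl;
        return -1;
      }

      // Obtain the model inputs and fill them with placeholder data.
      std::vector<mindspore::MSTensor> inputs = runner.GetInputs();
      for (auto &tensor : inputs) {
        auto data = tensor.MutableData();  // allocates the buffer if needed
        memset(data, 0, tensor.DataSize());
      }

      // Run prediction; Predict can be called concurrently from multiple threads.
      std::vector<mindspore::MSTensor> outputs;
      if (runner.Predict(inputs, &outputs) != mindspore::kSuccess) {
        std::cerr << "Predict failed." << std::endl;
        return -1;
      }

      std::cout << "Output tensor count: " << outputs.size() << std::endl;
      return 0;
    }

The Java interface exposes an equivalent parallel runner; see the corresponding page above for its usage.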