FAQ

Q: How many log levels does MindSpore Lite support? How can I set the log level?

A: MindSpore Lite currently supports four log levels: DEBUG, INFO, WARNING, and ERROR. Users can set the log level through the environment variable GLOG_v, which ranges from 0 to 3 and corresponds to DEBUG, INFO, WARNING, and ERROR respectively. The default log level is WARNING or ERROR. For example, if GLOG_v is set to 1, MindSpore Lite prints logs at INFO level and above.
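
The variable is usually exported in the shell before launching the application, but as a minimal sketch it can also be set at the very start of the process with standard `setenv`. The assumption here is that GLOG_v is read when MindSpore Lite initializes its logger, so it must be set before any MindSpore Lite API is called:

```cpp
#include <cstdlib>  // setenv

int main() {
  // 0 = DEBUG, 1 = INFO, 2 = WARNING, 3 = ERROR.
  // Set GLOG_v before any MindSpore Lite call so the logger picks it up.
  setenv("GLOG_v", "1", /*overwrite=*/1);  // print INFO level and above

  // ... create the MindSpore Lite context/model and run inference here ...
  return 0;
}
```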

Q: What are the limitations of NPU?

A: Currently, NPU is supported only on devices with system ROM version EMUI >= 11. Supported chips include Kirin 9000, Kirin 9000E, Kirin 990, Kirin 985, Kirin 820, Kirin 810, etc. For the specific constraints and supported chips, see: https://developer.huawei.com/consumer/en/doc/development/hiai-Guides/mapping-relationship-0000001052830507#EN-US_TOPIC_0000001052830507__section94427279718
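
On a device that meets these constraints, NPU inference is selected by adding a Kirin NPU device description to the inference context. The following is a minimal sketch based on the MindSpore Lite C++ API; the class names (mindspore::Context, KirinNPUDeviceInfo, CPUDeviceInfo) and the practice of also listing a CPU device are assumptions that should be checked against the API of your MindSpore Lite version:

```cpp
#include <memory>
#include "include/api/context.h"

// Sketch: build a context that prefers the Kirin NPU and keeps CPU in the
// device list (class names assumed from the MindSpore Lite C++ API).
std::shared_ptr<mindspore::Context> BuildNpuContext() {
  auto context = std::make_shared<mindspore::Context>();
  auto &device_list = context->MutableDeviceInfo();

  // NPU device placed first so that supported operators run on the NPU.
  auto npu_info = std::make_shared<mindspore::KirinNPUDeviceInfo>();
  device_list.push_back(npu_info);

  // CPU device listed as well, for operators the NPU cannot handle.
  auto cpu_info = std::make_shared<mindspore::CPUDeviceInfo>();
  device_list.push_back(cpu_info);

  return context;
}
```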


Q: Why does the static library cropped by the cropper tool fail to compile during integration?

A: Currently, the cropper tool supports only the CPU and GPU libraries. For details, refer to the Use clipping tool to reduce library file size document.


Q: Will MindSpore Lite run out of device memory when running a model?

A: Currently, the built-in memory pool of MindSpore Lite has a maximum capacity of 3 GB. If a model requires more than 3 GB, MindSpore Lite will report an error.

Q: How do I visualize the MindSpore Lite offline model (.ms file) to view the network structure?

A: The open-source model visualization tool Netron supports viewing MindSpore Lite models (MindSpore >= r1.2) and can be downloaded from the Netron repository.


Q: Does MindSpore have a quantized inference tool?

A: MindSpore Lite supports inference of models obtained through quantization aware training on the cloud. The MindSpore Lite converter tool also provides post-training quantization and weight quantization functions, which are being continuously improved.


Q: Does MindSpore have a lightweight on-device inference engine?

A: MindSpore's lightweight inference framework, MindSpore Lite, was officially launched in r0.7. You are welcome to try it and share your feedback. For an overview, tutorials, and documentation, see MindSpore Lite.