Release Notes
MindSpore Lite 2.6.0 Release Notes
Major Features and Improvements
[STABLE] MindSpore Lite supports configuring operator parallel inference acceleration during model conversion: you only need to set the stream_label_file option to specify the operators that should run in parallel (see the example command after this list).
[STABLE] MindSpore Lite supports conversion of the ONNX If operator on the Ascend backend.
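For illustration, a minimal sketch of passing such a conversion configuration file to the converter_lite tool. The file names (model.onnx, model, config.ini) are placeholders, and the exact flag set (in particular --optimize=ascend_oriented) may vary with the installed package and target backend.

```bash
# Sketch: convert an ONNX model for the Ascend backend, passing the extra
# conversion options (including stream_label_file) through config.ini.
# All file names here are placeholders.
./converter_lite \
    --fmk=ONNX \
    --modelFile=model.onnx \
    --outputFile=model \
    --optimize=ascend_oriented \
    --configFile=config.ini
```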
API Change
[STABLE] In the ACL model conversion configuration, a new stream_label_file option is added under the ascend_context option to enable multi-stream parallel inference.
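As a reference, a minimal sketch of what the corresponding configuration file might contain, assuming the INI-style format used by the conversion configuration. stream_label.txt is a placeholder; the expected contents of that file (the mapping of operators to parallel streams) should be taken from the official documentation.

```ini
# Sketch of the ACL conversion configuration (e.g. config.ini).
# stream_label.txt is a placeholder file naming the operators to be
# assigned to parallel streams; its exact format is not specified here.
[ascend_context]
stream_label_file=stream_label.txt
```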
Contributors
熊攀, ZhangZGC, yanghaoran, 李林杰, shenwei41, xiaotianci, panzhihui, guozhijian, 胡彬, tangmengcheng, XianglongZeng, cccc1111, stavewu, 刘思铭, r1chardf1d0, jiangshanfeng