Release Notes


MindSpore Lite 2.2.14 Release Notes

Bug Fixes

  • [#I96PJC] Fixed an error reported when a CLIP model in MS format is loaded through the MindSpore Lite Python API.

Contributors

Thanks go to these wonderful people:

wangtongyu6, zhuguodong, 徐永飞, 徐安越, yeyunpeng2020, moran, XinDu, gengdongjie.

Contributions of any kind are welcome!

MindSpore Lite 2.2.11 Release Notes

Bug Fixes

  • [#I8TPLY] Fixed an inference error of the SSD MobileNetV2 FPN network on Atlas inference series products (configured with the Ascend 310P AI processor).

Contributors

Thanks go to these wonderful people:

wangtongyu6, zhuguodong, 徐永飞, 徐安越, yeyunpeng2020, moran, XinDu, gengdongjie.

Contributions of any kind are welcome!

MindSpore Lite 2.2.10 Release Notes

Bug Fixes

  • [#I8K7CC] Optimized the error message reported when non-string segments are passed to get_model_info.
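The fix above is about argument validation: rejecting a non-string argument with a clear message rather than an obscure internal failure. A minimal sketch of that kind of check (a hypothetical stand-in, not the actual MindSpore Lite implementation):

```python
def get_model_info(key):
    """Return model metadata for a string key.

    Hypothetical sketch: the argument type is validated up front so the
    caller gets a descriptive TypeError instead of a confusing downstream
    error, mirroring the improved-error-message fix described above.
    """
    if not isinstance(key, str):
        raise TypeError(
            f"get_model_info expects a string key, but got {type(key).__name__!r}"
        )
    # Placeholder lookup standing in for real model metadata.
    info = {"version": "2.2.10", "input_shape": "1,3,224,224"}
    return info.get(key, "")
```

Calling it with, say, an integer raises a TypeError that names the offending type, while string keys are looked up normally.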

Contributors

Thanks go to these wonderful people:

gengdongjie, zhangyanhui, xiaoxiongzhu, wangshaocong, jianghui58, moran, wangtongyu6, 徐安越, qinzheng, 徐永飞, youshu, XinDu, yeyunpeng2020, yefeng, wangpingan, zjun, 胡安东, 刘力力, 陈宇, chenjianping, kairui_kou, zhangdanyang, hangq, mengyuanli, 刘崇鸣

Contributions of any kind are welcome!

MindSpore Lite 2.2.1 Release Notes

Bug Fixes

  • [#I88055] Fixed a functional issue caused by an incorrect format setting of the gridsample operator in MindSpore Lite inference.

  • [#I8D80Y] Fixed an issue where the MindSpore Lite single-operator inference process did not release resources and exited abnormally.
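The second fix falls into the resource-release category: resources acquired for a single-operator invocation must be freed even when the call fails. In application code this guarantee is typically expressed with a context manager; a generic sketch of the pattern (hypothetical class, not MindSpore Lite source):

```python
class SingleOpSession:
    """Hypothetical wrapper that guarantees resource release.

    Acquires inference resources on entry and releases them on exit,
    even when the body raises, so an error path cannot leak resources.
    """

    def __init__(self):
        self.released = False

    def __enter__(self):
        # Acquire resources here (device memory, handles, ...).
        return self

    def __exit__(self, exc_type, exc, tb):
        # Always release, whether or not an exception occurred.
        self.released = True
        return False  # propagate any exception to the caller
```

Wrapping each invocation in `with SingleOpSession() as s: ...` ensures release happens on both the success and failure paths.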

Contributors

Thanks go to these wonderful people:

zhanghaibo, wangsiyuan, wangshaocong, chenjianping

Contributions of any kind are welcome!

MindSpore Lite 2.2.0 Release Notes

Major Features and Improvements

FlashAttention Operator Fusion

  • [STABLE] The OptiX OSN Ascend 910 series supports FlashAttention large-operator fusion for the LLAMA and stable diffusion models.