Downloading MindSpore Lite
Welcome to MindSpore Lite. It provides capabilities such as model conversion, model inference, and image processing, and supports multiple operating systems and hardware platforms. You can download the package that matches your local environment and use it directly.
The Linux-x86_64 and Linux-aarch64 targets have been tested and verified on the Linux distributions EulerOS 2.0, CentOS 7.8, and Ubuntu 18.04.
1.8.1
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 7cc56ef7c90e41b6df980c3c43cb326a7c3f1bff8b827fbba3bf19ac5c827f6e |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 4dbee13424c347549903426ae794bf4520115625f78871d6c18eca9b62533551 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 8a60dbbfee8a9f25853ebbf297c7c727cab365171347c701c0aa51cd0b1290ec |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 598841c2018c977fbdec5fa2d1b3c5f82fc3ffca41c7726faee8dcb850e6ba03 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 44adced8645f69ea452e19069306d425cd8a78de377db18f801327689e28cf0e |
| iOS inference runtime lib | CPU | iOS-aarch32 | | 22758b8ff36258e95a08dae79e09c70c150389adf7e74f543abb526b734fe64f |
| iOS inference runtime lib | CPU | iOS-aarch64 | | deb75103438d45fb6d7c42fa6feaef2a570d1ecc2a4bc2e1e4379b97b1413ec9 |
| NNIE converter tool | CPU | Linux-x86_64 | | 0ee5427a82c1e876113e305e5613232ee93fc168d6eeb9c127261fb4b0ccefcc |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | f3bb6245fdd0ae51d090d428ce8e65fc8ca8934fc1d1e2cd262f046130db0835 |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 2552e6fac5c36d6c44c35eedba43a2e92aa28ebc0ec1125eb895c1d2ad62318c |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 25a7d30c82c1e90b6e832f2edd23ac45c8f4f75ba1fdcef633b7ee99af6856d0 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | a57de862c2e0809194a9183c09968f0f871f3b3c9be729c482b068ed672e6d32 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | e91c74f75d581c2a88ad42f75dc4007eb254469588dd42cec264477f0eab425b |
| HarmonyOS lite runtime lib, Micro lib | Hi3516D | OpenHarmony-aarch32 | | 67bd55222bf8c67e01ff413ac2226b7f9444f5a9875b82aa6faf3fcae753c1bf |
| Micro lib | Cortex-M7 | None | | 5b1d1de977aac04f29c9e8e84931b0b7ec2a700e103d0940e5afc3ced5f49bd0 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Converter and inference runtime | CPU | Linux-x86_64 | | 60c5a974ad76ee06f63ce784ae536d65c42fde6c41eaea3375ecbef5fc3efef0 |
| Converter and inference runtime | CPU | Linux-aarch64 | | 1ca4cc3ac0df52ca578389502b7ace581d835683b18b1fecafd8bb10b89b3d84 |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | 95e8ff0fa4296428781e70d32f90cb44ddd97c55372cc7d2a4f61f7d9adbdd2c |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 9d361d71b2c8efc809a7fba939b8e2ba1e06368ba7d8a0ff6e192dbc0d777fc9 |
1.8.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 0ff34972be36b9c9ef896094b1b0e5cd39d2253cbed86a41303fd9c21f1d5417 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | f58139a33524191b6a42761948429aed4bc450d1e8c2986f58bfae1f5d81e626 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | 90be22a9f2a34329b46b1f0b08954a4b8cb2bd80bc69f8416f780f5f21f31547 |
| Inference runtime and benchmark/converter tools | CPU | Linux-aarch64 | | e17bfddc47feb17a430f3aea1d2f9b8895505518ec0a0dd109d9c4ed2f8d3b01 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | f18b3e1697b27bcaf67bad2793494bf8babddd8d5b235b861cf693a44d9e9f14 |
| iOS inference runtime | CPU | iOS-aarch32 | | 2270a87ec1888a5f9c62c37177476ee3aece5da8c98d2f379b83889e2f0db8c0 |
| iOS inference runtime | CPU | iOS-aarch64 | | ba368afa146f6e156d840e73145a81c6b2524a51dd0d138bcd2f9c2db5c59771 |
| NNIE converter tool | CPU | Linux-x86_64 | | 9d4fce71a3963a0fa779f7996abb315fd9572fe8aca8b562f65f5793b42812b4 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | b5739551b5217171e522aea27d93f64ce23d4bcd46e6b87bdaec2b09ec4c395b |
| Ascend inference runtime and benchmark/converter tools | Ascend | Linux-x86_64 | | a141a7886bde607fdc80fc7f35b28c14b11d36f28bbc3f917db6ae065cc32954 |
| Ascend inference runtime and benchmark/converter tools | Ascend | Linux-aarch64 | | 0dd382c848bbb6ae2bd45a31098d311ec046bf219787480156cf1e0123e9f602 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | 8c93859a1a7925069906abc59c9d5f5fc53b989b852e80a2ffe3b85e7ff944ba |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Converter and inference runtime | CPU | Linux-x86_64 | | 80c08d67e2885f68e34c3b99356c1d95c72b1ad4cfdf29d08085ff7195753a08 |
| Converter and inference runtime | CPU | Linux-aarch64 | | a6365c96bc6ca04c7482cc2f2d3c1367d56ad7af13ed0e0d56b9349e773bcc30 |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | 6539df5f1d59c679fd5a4e0b372b410ced226c2a3f4e1aec1a004b6bf2b6f1ba |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 8fed6d1ee3ab6673f733bb9071b19e1fc4a34a20813f9d5d4f6468fd6b5a36b6 |
1.7.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 26e31e9dc4a87698f0af18cd51ed219559b9a996b4123502608dd0a130270805 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | e97a19f6d5a1e8cd6ff1b17d8445801eff496a2f02f1a3de8305a1459fe6eaef |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | f162a4b2ccfb5ec11f2d510e4df1d76bd1d9f9d6e321b1e068ac9e37db9dfd0e |
| Inference runtime and benchmark/converter tools | CPU | Linux-aarch64 | | 320e2f076a068151c2796bb4a7ceb2190dc86ec78de0c39b52e6a496511b5e85 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | 41229b8be3997dd6ec648c7884b483ca4ca5926f3a7d9f78314af1cc2d2824ad |
| iOS inference runtime | CPU | iOS-aarch32 | | 769fe933ebc77ebeeac5486ca418f0f17b491a01253cce8d7368742832bc6a99 |
| iOS inference runtime | CPU | iOS-aarch64 | | 1ab0717c8d73ae64570f34ad203c6155f541a4420a24c24f26f0ffd919770832 |
| NNIE converter tool | CPU | Linux-x86_64 | | 3965c150ec0428693217dd88dceca4bd01de08f40f8299410af94d9ba3a170bc |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 16b51013b0b2e70e52edeb909535ad9818f7771e6c108f7ab7e207ffcd8673ba |
| Ascend inference runtime and benchmark/converter tools | Ascend310 | Linux-x86_64 | | 1fe2fdcabb9900c2d7e4ee9e239df157351574ed92b5f568eac23620f91f2ce8 |
| Ascend inference runtime and benchmark/converter tools | Ascend310 | Linux-aarch64 | | 41027f2e070896d8440c89b8b75f66e9c92bbefa72b8b68e959770c2ffea5780 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | 269f0dc976523302d372a0500a3c93eb78ea048c51b9c5ae339174f224e1cbee |
1.6.1
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | afcfab6c13d46d25a19d4caedb66c2a16ccd63d5b5993ed1e6268633d03a7820 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 7bba9b242c1b5bbb21e39313db0d1118aa568813466a82f3ddd689d570910b92 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | 667f8fae4762281858adfb71e62f11a52f641078496901d3f317b0df8c415452 |
| Inference runtime and benchmark/converter tools | CPU | Linux-aarch64 | | 238589a227534f0514d2371fe9a92eb5ac2f181a22917ac2b5ef6969d51c518e |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | da0781020f8680a7ab5c00d9930d06fe0f9c295aa9e13b62b43c2b30e0bcad91 |
| iOS inference runtime | CPU | iOS-aarch32 | | 6a8adfb3ea960af9213ed9474a9b2d9deef57ebe8cb9f9c822a314c7ff8165e7 |
| iOS inference runtime | CPU | iOS-aarch64 | | f7fe5477744bd1d7ee19d3292f150ac9665f60f613a3a8025a870b147885c76a |
| NNIE converter tool | CPU | Linux-x86_64 | | bde6c05aa90193d6fc0431ac3ef7a2c7e36e77db072955db68d16d6a90cfbe64 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | fc4ab99ae0fb4cad4a63dad591da8d20be624e29b563aba39bba8239c0d64bfc |
| Ascend inference runtime and benchmark/converter tools | Ascend310 | Linux-x86_64 | | 7b46fe61974d0295c1f816707e0330c26085ca5532c57f36b149fa851bcdbf7b |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | e70b2d30feafe0bbce3999c0bb551ed70edaf497c31fe84d70e4d5944093b253 |
1.6.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | d043803cffc8a0b75409aab3e4039f1e86756cf618af1538a76865e9fa4fd481 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 25188266621f4cfedb24970a9a98ef6190fe02c9d034b7285f360da425ffe9d6 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | 90472996359f64509f38036ed8100605c76dcdc42453c2fc7156048eb981708c |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | 4460b8f1bf321eca005074dccffb54d6d3164ba3f78ce34530ec20db4dbc9980 |
| iOS inference runtime | CPU | iOS-aarch32 | | 72fe007660abe9c51d0a1852b094fb52d8bbd1610c989e79c9858937102aa59f |
| iOS inference runtime | CPU | iOS-aarch64 | | 51bd5f7c21477d7856bea33d31e059f578b6b964a7c43e440e97c44b186db4a4 |
| NNIE converter tool | CPU | Linux-x86_64 | | 81c2a5dadf51978d1c80f75c63fde4edefa2897792ac571fd33ffd35e338736b |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 8133c2326e2defa3614f86592d5691fdb410a4296898e254a33cd33a7e519b16 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | d5daafac4bdcd0d03158e2a7cd3f881869b49cfb77d9654a24ddd967edbe5e91 |
1.5.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 83101ffc38de6c33c94d09bddd0efed31c23f468f76694ecbec5623db5f04afd |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 2f09c9f018c1141f2c415d48ff8ec01c17a2ad5e0b7d3233b8aa0612a2330a9e |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 2359084653c1ddb55da738d5daf65e2cb0b0426032232ae455a72f8960961823 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x64 | | 2ed5d767be638787755c3855e28312d655e15219ad8bf500b43d6cea3a8d2dc6 |
| iOS inference runtime | CPU | iOS-aarch32 | | b3cd43f694f051e996cb8b39ed30137f92a4c324adaaceac1d75aa995bd7deb1 |
| iOS inference runtime | CPU | iOS-aarch64 | | 9e919cdaf92fbb408ab169113182f7977d51409575e3a864a5a985a3004623d4 |
| NNIE converter tool | CPU | Ubuntu-x64 | | ab424c967b9ead17cebf1ef353b8b8129b7725e57f045be94a235741d12326c8 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 976ce83dc89f3ebeab9b706ce0b1a8f49cab7a2c19cd860446c3464164054ceb |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony | | 49cf161c90c259415718b0c6b4538bbc06d4629bfa604c4cbcbc04cf5ed1e3e8 |
1.5.0-rc1
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 7317c9359bd6da97c389780a9a6f91987c512a794d3a367e42e411eaacb181e0 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | d0453a1e5f5f653cc77f160b3ea9252ee1ab96d0bce3e9128d4a54f47279ac46 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 81451541762699c259d13dd1d5a985fde653aeaf91052471bb918eb4f5045f49 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x64 | | 3ee80e7906f6173b805d985b51551c489bd23614b8747fb6e795a7303729913b |
| iOS inference runtime | CPU | iOS-aarch32 | | 56a508d49605d46b65b84d4160155ebcbb21d1f9295baac1c96148f0a2b06e79 |
| iOS inference runtime | CPU | iOS-aarch64 | | 69af90551a97f6d36379431a63fc31ebff102809fc5ec7e7a48e566f3a578ad8 |
| NNIE converter tool | CPU | Ubuntu-x64 | | ee0ec7fdc203a5106eccc02ca15c7543c4e6c957e605f8a9aaead4d966f10c4b |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 477628889908fb64ae4b7b989c0181593f9c66bebf119f6a07437911879bff8b |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony | | 54654927f6bf0f90d316407c689575c47a263e0f3c608e5b430b9764b4c35f5c |
1.3.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | c1a950feec47a58871956cab74f0b6f76ad2f151dde990228782f76c0d8120df |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 4def68662f5b249db0fd0f372fab1877e530fe32b6b9317869f01bedde892838 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 2d0f77bb3c1a9489bc9511f334fb6cea3266d6bd4d600517b5aa7e58efab1310 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x64 | | 40c1abdeb4c5f5353844e9798c6a0f20565a5a3bc6de7da3cdc4a2df6fa15ef7 |
| iOS inference runtime | CPU | iOS-aarch32 | | b4bb1435887b04ce95be5429875e81c3e40a57b0c6182a35d58ea34b27d5fa5c |
| iOS inference runtime | CPU | iOS-aarch64 | | a28111c2bcc542a70ef98edfe007892855dcfcd9d40586fa98be962caefc26d3 |
| NNIE converter tool | CPU | Ubuntu-x64 | | 73f4dffde69d24a8d0574e771bc6131a45a8fc5ebd18b34fc7afd30a0d149cb1 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 07a0d0a8a8f257c01d06fa33f57969eb68385b430f2f5a3a4b09dba463c361d9 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony | | 1bd481e93b6f3b2467a6d8f0bcc2da0221e4d023d74c584174dedb8854704748 |
1.2.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| Inference runtime (cpp), training runtime (cpp), inference aar package, and benchmark/benchmark_train tools | CPU | Android-aarch32 | | 7d073573385a69bff53542c395d106393da241682cd6053703ce21f1de23bac6 |
| Inference runtime (cpp), training runtime (cpp), inference aar package, and benchmark/benchmark_train tools | CPU/GPU | Android-aarch64 | | 7f8400f0b97fa3e7cbf0d266c73b43a2410905244b04d0202fab39d9267346e0 |
| Inference runtime (cpp), training runtime (cpp), inference jar package, and benchmark/benchmark_train/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 3b609ed8be9e3ae70987d6e00421ad4720776d797133e72f6952ba6b93059062 |
| Inference runtime (cpp) and benchmark/codegen/converter tools | CPU | Windows-x64 | | bf01851d7e2cde416502dce11bd2a86ef63e559f6dabba090405755a87ce14ae |
| Inference runtime (cpp) | CPU | OpenHarmony | | a9987b25815cb69e0f630be1388486e8d727a19815a67851089b7d633bd2f3f2 |
1.1.0
Inference
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| MindSpore Lite Converter | CPU | Ubuntu-x64 | | d449e38a8493c314d1b5b1a127f62269192da785b012ff892eda775dedca3d82 |
| | CPU | Windows-x64 | | 5e50b7701b97ebe784095f2ba954fc6c377eb157fbc9aaeae2497e38cc4ee212 |
| MindSpore Lite Runtime (include image processing) | CPU/GPU/NPU | Android-aarch64/Android-aarch32 | | a19de5706db57e97a5f04ef08e0e383f8ea497c70bb60e60d056b31a603c0243 |
| | CPU | Ubuntu-x64 | | 176256c2fbef775f1a44aaeccae0c4eea6a60f41fc0baece5479dcb378155f36 |
| | CPU | Windows-x64 | | 30b5545245832a73d84732166f360c77cd09a7a4fe1fb922a8f7b80e7df326c1 |
Train
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
| --- | --- | --- | --- | --- |
| MindSpore Lite Converter | CPU | Ubuntu-x64 | | f95a9db98c84ec3d97f88383ecc3832582aa9737ed287c33703deb0b419acf25 |
| MindSpore Lite Runtime (include image processing) | CPU | Android-aarch64/Android-aarch32 | | a6d8152f4e2d674c52af2c379f7d07858d30bc0dceef1dbc366e6fa16a5948b5 |
| | CPU | Ubuntu-x64 | | 1290f0adc790adc9edce654b9a629a9a323cfcb8453eb6bc19b779ef726282bf |
The Ubuntu-x64 packages are compiled in an environment where the GCC version is greater than or equal to 7.3.0, so the deployment environment requires a GLIBC version greater than or equal to 2.27.
Android-aarch32 does not support GPU and NPU.
MindSpore Lite also provides a libmindspore-lite.a static library cropper tool for the runtime, which can crop the static library files and effectively reduce the size of the library files.
After the download of MindSpore Lite is completed, SHA-256 integrity verification is required.
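The SHA-256 check can be scripted in a few lines. The following is a minimal sketch, assuming Python 3 is available on the deployment host; the script name and its command-line interface are illustrative and not part of MindSpore Lite. It computes the SHA-256 digest of the downloaded package, compares it with the value listed in the tables above, and also prints the detected glibc version as a quick check against the GLIBC >= 2.27 requirement for the Linux packages.

```python
import hashlib
import platform
import sys


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file without loading it all into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    # Usage (illustrative): python verify_package.py <downloaded-package> <expected-sha256-from-table>
    package, expected = sys.argv[1], sys.argv[2]

    # Optional sanity check for the note above: the Linux packages expect glibc >= 2.27.
    libc_name, libc_version = platform.libc_ver()
    if libc_name:
        print(f"Detected {libc_name} {libc_version} (>= 2.27 required for the Linux packages)")

    actual = sha256_of(package)
    if actual == expected.lower():
        print(f"SHA-256 OK: {actual}")
    else:
        print(f"SHA-256 mismatch:\n  expected {expected}\n  actual   {actual}")
        sys.exit(1)
```

For example, a downloaded 1.8.1 Linux-x86_64 package could be checked against the value from the table above with `python verify_package.py <downloaded-package> 8a60dbbfee8a9f25853ebbf297c7c727cab365171347c701c0aa51cd0b1290ec`.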