Downloading MindSpore Lite
Welcome to MindSpore Lite. It provides model conversion, model inference, image processing, and related functions across multiple operating systems and hardware platforms. Download the release package that matches your local environment and use it directly.
The Linux-x86_64 and Linux-aarch64 targets have been tested and verified on EulerOS 2.0, CentOS 7.8, and Ubuntu 18.04.
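After downloading a package, it is good practice to compare its SHA-256 digest against the value listed in the tables below before unpacking it. A minimal sketch using the coreutils `sha256sum` tool (the `verify` helper and the demo file are illustrative, not part of the release packages):

```shell
# Sketch: verify a downloaded archive against its published SHA-256.
# verify <file> <expected-digest> succeeds only when the digests match.
verify() {
  actual=$(sha256sum "$1" | awk '{print $1}')
  [ "$actual" = "$2" ]
}

# Demo on a locally created file. For a real download, pass the package
# file and the SHA-256 value from the table row for your platform.
tmp=$(mktemp)
printf 'demo payload' > "$tmp"
digest=$(sha256sum "$tmp" | awk '{print $1}')
if verify "$tmp" "$digest"; then
  echo "checksum OK"
else
  echo "checksum MISMATCH" >&2
fi
rm -f "$tmp"
```

A mismatch usually indicates a corrupted or incomplete download; re-download the package rather than using it.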
2.4.1
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | dd830cae9fe45d9db478445cff3209ed66c43b03679ed2c40218f5cecff845a1
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | d2b6673d145cc2c197c7f26077e89b549ee180b54f2fe9dce232bda5c873cce4
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 56e196a7a2cccb039fd7c74dd7b2b60ac5d386ed99cca393c795eb044257649d
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | af999cc35ba63ac37146d5926ecdc726b44bc9f6043088b3fc6671f33f896b08
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 8c5ffa369f197a432738bc1821f2be95af12cc29007ce839eb44859cd4d84c94
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 1f3d9edc3ea6e2b5ddd04b00e185eba749d812ece166e9b0a09e5f21a7f1878a

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 9205444c04fabd9ff1b42a01a179a8657af827e92188698c53237814623654fc
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 1b6696df8aa41ad0e9c0c9613bba679d92f93c52541731596dd63947febacc76
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.10 | | 023790e8ef6905afe609e9b68a27441fcb753cca55a70fe6da20da0bf7e195e4
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.10 | | 124da9042f315ac9500b9b5b0ae05fcd98531b6f43481362125dcf7498b7e2ce
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.11 | | 18cbb08cb30b6bff8eb7a127716d888b1adf06c065826aa053a99a0acf8ac408
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.11 | | b07ac40256a4f5f47c12d1e3c18d707c572622c264127eeee7e327de33c7f01e
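The wheel you pick must match both your platform and your interpreter's minor version (a wheel built for Python3.10 installs only under Python 3.10). A minimal sketch for checking the local interpreter's tag before choosing a wheel; the wheel filename in the comment is illustrative only, use the file you actually downloaded:

```shell
# Print the local interpreter's CPython ABI tag, e.g. "cp310" for Python 3.10.
pyver=$(python3 -c 'import sys; print("cp%d%d" % sys.version_info[:2])')
echo "local interpreter tag: $pyver"

# Then install the wheel whose tag matches, for example (illustrative name):
# pip install "mindspore_lite-2.4.1-${pyver}-${pyver}-linux_x86_64.whl"
```

If `pip` reports the wheel "is not a supported wheel on this platform", the tag, architecture, or OS does not match your environment.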
2.4.0
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 2d07704f3ea81cd00248647f16ee606867a78ff6d92e47770a08bf835df11322
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 107bf19348b84c8cd50dca47793c0a3a48bec73c74dc47cd510a651676099018
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 1ca6dee07205dc541705370a514b108883245dba43b96d1774a3e3a3e5e2cd38
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 06a33fbf3fc9fd375327009a14902e3bf5ef0eeb366b42a8aa0ad4b1930778ea
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 8b8fb357b83df0cd62290bafc92e3577677825b1efcad97eca657f875419047d
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 169f363ecbbc0703d4a3f9d3f3d0dbf78d1c52a122fa468850d8a4e4f3eed94a

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | a95f4323407be5b2258780559619cea04ed98786990b3059f20bf97efceab0ae
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 49776a4ecea8e1608fc9a501655ca3df9596f8960321b0886dc42b4494eab82f
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 5f61c95fb39ac2e19f1dce09bb86c3ae1b4cf8101a0054477c6855f147ec6b8d
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 8e654d2edb6826eae96bdb8435e239aa5c6fd1ac0502606061b0dd3dbcfdccab
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.10 | | 95395337033c28fe09c7f70d0c8a8365f68099af2d47b8fe1709339c37727eb2
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.10 | | 80c2f3533c823d628c9b3dc834d19c48ee8fb95234e0b14ced644bdb41db933b
2.3.1
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | df702ba03a8a6a94a482273a2be15f1098fcaffb92ba396533b6b0896aab2292
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 9ba19a675d61da90e0e4f28ad54d2750ea7dfbfbc7611f001f64d3bdf952aac5
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | eb0c13158f06b8a17975380ad190c28882e86989895ed4193d7223d3904337b0
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 26792aabebde4f3c9a7b779b9da9b9c7bb9b80d5411efc6571e1ef45f0839f83
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 7eec6a578ac7c96f33e0bb026f1f9f974cafada03d1a060e3fb121dc09ccf243
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 67f5fbba85d5b0682e228928f4cae5f48e6d6f5320a2fb10291a792e52748d03

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 436b7afb77caeee217141cccf3a728f7b25c66f89255fc77b0801c261a4c19b5
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 0f69ec441f13e0fdba6ea71fe975a84b0ceaaef2c6162a9ce9c876ec674bbd5f
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 13775d493c57cf7a804d9fb667c591e550620d6e633885cce61b5905ced0a624
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | b6007fc1df2a47367d3734ad7b4112e8a80d46f31d8f57c90520cecc0848c790
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.10 | | 04021504bf6c73c6284ba0bc75b2f49e04402cfc76c4d402f2d235231b456bf1
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.10 | | 767068f587ac25924b4ba172f47ef9a2f9bab8c6e8cedc1f7115fa1239ddc7bf
2.3.0
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | ff4cbc9800cb54217a8e23130c056e34d32aaf7091a9b114b01b6c82b918b072
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 5dd03838ac9cb76108c5df60042f2544e3106fd7196f4a5a65f2612fac52f2be
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | d1806e9393f353121217a838b7d33b816dbd4df672b05277972f3fe201bf4bce
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 060d698a171b52c38b64c8d65927816daf4b81d8e2b5069718aeb91a9f8a154c
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 7759281df5ffa22bdbd0f5c432893acaf88895cf9f3f40b6527fa756625da326
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 792840866cec4c4dfe7c2bc8f10b01134e635e5486585f29bab7e9b3c15042af

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 5090d1428f25d6465eedf3d02b771dd62d5e8e3e8f1310d7a335ae8c44d7cc80
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 8630c598025b277986597df4cc13795044306a29b917a2b975878eb71a10565a
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | b98cf012ed341f5e55576fe5e3a622bde196854f16a58897b744645aad08840f
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 6a0b9644a4d05defbde20191820ca4ece1722c23c6f0c7e0001507bd0db92eae
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.10 | | d64730c754fd2bfb35d4bc05435a5f34c1464a89fb42a396a93b87506359b86d
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.10 | | 162944fb0f05603f26cdd3a0f9d200dc02ea721af1f4386800aac5fdbd2ddefa
2.3.0rc2
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 0063fe41be5fca555a071cff0cc75609e1557efb4472ff1d8fa313593a1499db
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | acb92c41232dc2d23dc5900563988818716b72cfc9e9a9c19a04a8350ed0b6c1
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 9df1d26c67769feeb47eb9f2bbdc6c8ed705bc783d13357c8cc4158fe3355341
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 8d28b4d63fb21d411d6975518446efd04672e26896b75697cad4de21a9f50095
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 210e9393ffe249468c9304e0c2d060e6284dd5290d4f8d89e88b7c8805c56471
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | ab3ebba62631cd4b938bca4174576783b472fd8549c8a1fa51ef4e5e893a88f9

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | 5929bf900a98f2d12c5b2ceb070dbb11937d18c394627ddf40628b9500a3f7c7
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 051c8b6ad6da57ee01d512ed06a4074bd12cf48156b19a688563990767bc83d9
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 58a0eb8bd54569b0738c61f94b54601ec49480a1e3f819ec7feb55dba3e107db
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | b4dad728a8678b15fadf9a8390086e9007a390855672124ae2dc07194096cc39
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | bc1b054867e5c47a6a1f04b883aa0d1258458a9839308f6a231e246b0abf6489
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | e174028248a0da5b7a940bb2a678db118cecf2178ab447a9fe6c1ca4cf72bdb2
2.3.0rc1
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | da86fe540beb229f046eea3b94bb30f5260e30ebdbdcee89d8bf562fe0c282c9
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | ec7ccdb7ddc82193b7497167f4787424e2d87a95448e68af6bd45d8f7cb62116
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 81ed5ca8fd1bab67f14ed4505c784e8fa9c941ea886aea682bc90ea5fb7de1c8
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 1ba0fe385efa9e4e6233c8396307b8f41a8f3a323f78a06e7ca5338a7dcb1145
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 001a04808a12f5c17e265dcb7f933d6f75c66691001e6b306cdb8999195eb98d
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 0625da982e4c712d46e03059ac018d7e448ce27ca9140d76fa0fff67c374f831
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 737a5b2168a4f9db215bf570677b4f1316d9701ad18458c97a51153fd1fc27dc
iOS inference runtime lib | CPU | iOS-aarch32 | | 40fc5ffdfae385d09d28ff9e1ad4afba9df8b1c73ffa0c7dbedb9540593516e0
iOS inference runtime lib | CPU | iOS-aarch64 | | 3be415466838725b095aa9f9edf4791a6cb4c7c3fcb1c8625b0af0352514a695
NNIE converter tool | CPU | Linux-x86_64 | | e3570b7ff0b6afaa79d8a348bdd2be7ce472a49e2e64b8da9147adbd53acf076
NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | ad75673390afdf92549684a67680075d1fea5b37c9705cb25009d4960087334a
NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | dc145da965136182d5a25d3b0964bce1337c12bc7f8e98a01f74258819bbb194
NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 3135a2ca2a093d754c50c309b650faa45c305ffda86ece5db02bf8144b46bb0d
Micro lib | Cortex-M7 | None | | 55e04aee47b03f9bd7ff2ffd8db6c73f693cc509dec0dc87072f500190180fc3

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | dbeccc7a80389f3607be6b9acd74f7b09e28208b371990767a0e288868bc9fcc
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 078c2982694662bb82dc605d24ec93c73e9c61e6a42cd9a2b0cee940e35f6aa8
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 43cf572ad311182cb1a5001beb9eb901c62af519912bad21e8f318ec47adbad6
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 0aa004913fdb6d91f82018b962853c89ed15fe74264587cec4128b4e19ec5d0f
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 7c36b5dc8d166cf9a4824e63a28d028d04199f46e3e6c8afbd17c5e21ca05bcd
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 2505972d7800475c0f6001f9da1154d643180d71d3930410726cb1c19330b33c
2.2.14
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 43e5ce56552c575e6cfb8693895442d599c75a5bec504544b3c8d67ac1fd57a6
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 8f72baf2563616e48d1b2308378344cabc9d30221967a7da5f78e42d4e961e20
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 8ce36fbaf0944cf65321f21eab63d5a09306656340c10b8005cc840bee6775b1
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | d04331cb2beeadae821245b2133c7d13d052045b1874916665ae575d9fa9c22c
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 71da1bac237d994034b1189bc51724572a2a73bc9db2a66e6676765f30dc0d20
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 328a6c9e375d9e1f8e19c4e5d9f9a91fa52e08c23997d0c3d79440355cde15cc
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 6e3d8a28232830c859a67a0a8aa2711bd952d3e138719fc0ea1f5d8fc547295c
iOS inference runtime lib | CPU | iOS-aarch32 | | 6748217116394da4adcd8231c00f8f6e9147da8a76446500004664a8f5c26a86
iOS inference runtime lib | CPU | iOS-aarch64 | | 9478f5a754eae08e9a1b4d9f2d6ae358bcbd17eacc7c9edfbc3255b14b3eade3
NNIE converter tool | CPU | Linux-x86_64 | | 4bc8c287d26f2777717a98d56101c5bc96ff499ccb491836175e53ded85693f0
NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 95389633021036db47e96699524371ab18de66b18c65f48a3b84e04db41fccfb
NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | fbe3b934557a8732afe95f8dd0d69c59935a7dfd2e24877259d7f8d0816ba411
NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | b358ab197cec8e682605a50af4a4b731c25a5143d8e80533ca4a256c10d5388a
Micro lib | Cortex-M7 | None | | 11e0a5cc0f0d256a18b1773c9f291ce8d22054771e722488c404596d0c8db0a8

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | e119bf249933d1f8319c0a4b379d311e76c000c828dc4d69f1d66c462b5eab2a
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | dcbbd3fd10e7ef5146c02737822233b44092e61efb1fec61ce141887923aef92
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 69bb99faa6917fb4ce78454247341eb3ba2c5f777f86a3f3ccd1da452ab0e87f
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 5a6f6f895dc79c596c58cafea93cd093cd74b11fe2a9d190e86ff54d5eb60142
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | e86cf6f35a9aa64ac91fd8092f39468f6244830338d0491237e78e03fc701e5b
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 3c63d7e16ae0a7c35be6881e17ceb6bf69e3c897fb5e95a934f6f3ec2b71a7d8
2.2.13
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 5fad72019f0f89a77f5136f5cdc5b0f766ae5e22025c98b31a0337cdad6aa517
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 961d659b06c750c3db570727569ea06b0c7b758774e43a9b2f3ba14685866698
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 44e8cc7a706c5bb45005937630cc16a82988e87016d807500a9bb276a87432aa
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | eb970a251e8d2e958e2bb5249449d044ee7cee3a37ffa2778afe6342b8c46644
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 25c4ebc48b88a6d2277b46c8582b13079e00047928b0a8553a4869b58be2f6ec
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 90f913dfe1de16b5c49b344fa8dfc03d5e818fbdbf6f28df9d22d6ffd4db1f0b
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 6438a7ccd22aade36ab9c69ec1f4030a040dab64014cafad8f0d87f9bab44cad
iOS inference runtime lib | CPU | iOS-aarch32 | | c892648a2feefb837a14314545918513637b98d5bebb5863860d85774fefcfd2
iOS inference runtime lib | CPU | iOS-aarch64 | | a0dee2f86e76d0cc0ae96b92660ef1e86796b089fc9265bdf767fbd6f445859c
NNIE converter tool | CPU | Linux-x86_64 | | 12b2623dc787efff5ece07076135fdf5eff64a83f8195a1084b98b16a806c9d4
NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | f58ef9c49a53122b6077fd7c1cbccf3196175f73d2a37524aaab08681e59cab4
NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 5761723caca8958d2eb6a2a179a0096a03c0bc2ec26ccf6a69ed58e4809c5249
NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 199e6d7ce6ca467fc37fb6982b8f76e0fef9a6b481eb8a94d2659dfa8cdc92cd
Micro lib | Cortex-M7 | None | | 0e732da476a38a2cb126041296220060640c1cde5699e8f6dce9ef0554cd8dba

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | 1e4fb83908a9d270fe57f57ad1eeca56f71d6d808942a671027cb4478cf7b683
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 2d333ddac4b2edaa66026f0d7ebccea2351b70cea0ab20abc5e6944742775419
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | eac0c19f189739c08b58ee68632adf0c5a65a17e5b2a83a09bdce91cdd57c107
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 87666c3da798c656e70c4e4d5dc9bf4f96243ebbc1104908e9f34554a3c16c95
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 013e04d2d23baad449c4758f39fd7426da0a4f5c1f2daa350c7eba006ab3f571
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 2090d1dae1f3cabaa1807b67068bf4e54eb44364b7ba84e5035093e2afea94da
2.2.12
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 6f73eb20d68e7a8a820263bb89c342e9e4d04078a1de4d77c5096d513ee68ea4
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 2c56958cd48852179b0cd38ba2d7f189312d63554573b50fb8da8d3b67c65e2e
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | de2849b5ec24aa815b79be415043d21f14f690238aa9c0262b6bf303f80e06bd
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | debb55ae88cec1e9b538b93e272b142a68ea635a5e63fc543c64c84da89459d5
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 9f44255470587aa9430e4df6c15f2d6681da6e0db2e92f94bcad9062222e1d2b
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | a75efcfa03bd1b176cee25ca7e258f6db829eb7594f2efe754216ed847f2485f
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 965366b8f551e7da8236c62eacd943f75b8f4066b4a72e62e4c3328ac675cdba
iOS inference runtime lib | CPU | iOS-aarch32 | | 1ecb30f0ce2fb6df705bc92a8a4d9b23d1e43ea0848f298e339eaa107529491b
iOS inference runtime lib | CPU | iOS-aarch64 | | 0036cdd65359ee4b17c42d90ebd7bffc41f88ca48a72e4b02abf8d6894317d1e
NNIE converter tool | CPU | Linux-x86_64 | | 88b0edfb8cda9fa2208562e626c99a1062825cd14346a09851b6d99782367a29
NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 0ef9faf5387777f0aeba0cd6f2a34058c08c60d49f2cd732c6f48d33cc8be979
NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 04970cee5662c86da0ed6d0245b1fe83ddc0c30146d16cf21477976e08ce0749
NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 47745bb1558332ab679b65b4e8ddb0e4d53bc8694e0aa4d3e5251899efa92c34
Micro lib | Cortex-M7 | None | | de799b477e8c5d8335287f10f769e18e36660776c5caafae90ed8567cda7aebe

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | e4ab238ddbb90d39db5f9d1af01cccaaf1efe543f459c9f7b62fc7abd4bbc041
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | f900e92d7da6af0bc288349e19aea469319ebc465a2efd615823cfd08d5af01d
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | ec6525020c8087711a3afcf6c1d9d3a8c3003b2eab5327e938f9b4bee4385339
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | ebc35588a58022813cb96009768e26f7f9b8fdeb56af75eae8af19079411b06c
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | c0aa361989a601bf3df5b677f25010b0602d8b8b8c0ca8574638b6b6ba2b5cea
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | cafb6434b54e5ecbfc03a658fa11d0a1f315d099d81fd18bdec994dd282cb75a
2.2.11
Inference runtime
Module Name | Hardware Platform | Operating System | Download Links | SHA-256
---|---|---|---|---
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | d12cda92f14956ea27d4f43b40f10645b18750f4b2ed86284398b51ffdf2098b
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 00621162222464b3303d5bf44876a038be115f34cc20b2c64f7ea22c623cc95a
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 25822f6c71ff60ed59ca4ea5ffac4b8ea05c4f38dd3386e8cfe6e8081e5817ca
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 5a3945fcef08240cb5fb4256f8220f3b280a40ddacf27cc560325c4da7c574cf
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 04ec077161aa37a147284070e6d1139d47e2e945e14be02a9179e8f1ec3576e6
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 7cb6f20e48db647bbcd3fec9d0f54b584ba315630a245693059e9faa91941aaf
Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 0ad3a20c421ad8ddc700ab515e05a33c961126eb19f3fc4d4e215fb517067500
iOS inference runtime lib | CPU | iOS-aarch32 | | 7fdf09f9f711684266bdb8e3d1dcf90e54de19e2db67b3ade6267acfb138a350
iOS inference runtime lib | CPU | iOS-aarch64 | | 63f044c0bf36971563ec65a3ae2fc691063ffa0875949849f7f578f98cb9ce51
NNIE converter tool | CPU | Linux-x86_64 | | 7a4b42f57c7669e35644837e561a4997dc855aee5cde493491ea0eb0e5260576
NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 6d84b44dc1e8e176ac2875358d781b7e11c09b751327a8284dcf208c47827402
NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | c7aade141d96138fd1fc0fae80ad12336db46d6a1bfd7f3b41f820c81d2e9e37
NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 31ffc1392879d23337d6160ce54d2373589ef53650b97fcbe21d7c62633d1640
Micro lib | Cortex-M7 | None | | 36e579110b64cba81c4b9af185e7343468b451a0cf93be5f6cd0ecef233dccd9

Python API wheel package

Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256
---|---|---|---|---|---
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | 8cfae0d8559d7ec65f76fe664db6961193de3545871c788b6a9fd13e308d01b7
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 2c55f8fc847fef2b9b3c3e0dfbd3bd8d904b6bdf35474615ffeb59ea3cea5030
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 9f1458fb57ac438798c44d5f8f5fd582ed9641ff36cd9dbf5bad033e452970a8
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 7a8dd62dc6d5c663ab8e7e6075a2cebfe0f502079f907fdbf818bc88c62e87a4
Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | b6d305ca62642951b86ba3f7b2c86d5a78c6a935ba6f2d2084dd46fad4cdcc80
Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 2731c8c7e27fafcc2c5a88443b101276963f8b64feba8e5313c81bf88538514d
2.2.10
Inference runtime
Module Name |
Hardware Platform |
Operating System |
Download Links |
SHA-256 |
---|---|---|---|---|
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool |
CPU/GPU/Ascend |
Linux-x86_64 |
43e216db34ecee055e3aa9b535edba8c9b9056c61eb52ca8c48c79558fa08e55 |
|
Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool |
CPU/Ascend |
Linux-aarch64 |
f59cff5c06eadedf6415d5d45edf7c96538a06ea14b16739c80c1d0ecb4e7670 |
|
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool |
CPU |
Android-aarch32 |
dfd8a56ea9db3b893f7a6d71416ffbde634dccc5d600c2f2395d89829943f937 |
|
Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool |
CPU/GPU |
Android-aarch64 |
79d9aefd3db4568db6d888b4eed6c07feb719e38f43f5a0657328bff9acc22ac |
|
Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool |
CPU |
Linux-x86_64 |
ae5bbade3795d1c2187b69851cfd0b451e38b28f416fe52b9de66c64609a46e8 |
|
Device-side inference runtime lib, Micro lib, benchmark tool, converter tool |
CPU |
Linux-aarch64 |
c4713fe08e89cf078068865e2da7a3b4c9f201a7e3acd8b44e0ca61a586de657 |
|
Inference runtime lib, Micro lib, benchmark tool, converter tool |
CPU |
Windows-x86_64 |
cf3377ebe7cb7ed558fcab24cda23926488530e7dbf655fd3c5ee372d2601930 |
|
iOS inference runtime lib |
CPU |
iOS-aarch32 |
59dd3d3e3ef960eea528c2269f6b75848e67e0c669377adc05869d838a80a4ce |
|
iOS inference runtime lib |
CPU |
iOS-aarch64 |
3937087b76e7ca87f728959b089eefafb5d620f0ae4defb4ca8f4bb8ed425011 |
|
NNIE converter tool |
CPU |
Linux-x86_64 |
83bb1085de74d74b2ccc0fa5f1fc5e1b09c1fc30eb99660e0e02220e0e1dcd15 |
|
NNIE inference runtime lib, benchmark tool |
Hi3516D |
Linux-aarch32 |
e1957c5b4bc8cd73643e23b9e2f1d0f4036c3f9f927f614b5ed70ca6abb5e0fd |
|
NNIE inference runtime lib, benchmark tool |
Hi3519A |
Linux-aarch32 |
7d1ebfd952c22cf74b329aeb4881c51ca9d13447251a55fcf2c8816a1a98b725 |
|
NNIE inference runtime lib, benchmark tool |
Hi3559A |
Linux-aarch64 |
942ea5bab05b06bac8052803694a07171ac3322b173caa4de7b5fd7f0b020e73 |
|
Micro lib |
Cortex-M7 |
None |
c86bc6bc5c8bd8aca4aea33f12fb522955103afeba0b0b091634809f623f3745 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256 |
|---|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | a621b8cd60f9a7ba0cb585b55f21ff0fa5b400478a8d3d5b8ad861444e138c10 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 66868580db4bd9ce13efff696ec395ea24f35453cc2a8430f70c09e4b79d377d |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 2265da10af212801f4afe9c3ed2088d5a1661edee617f8f3490841cf723d934d |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 7e01fedb95dff26d0aa4d3a3663c36ea481be6700fc3efa7418b65c0e3145280 |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | cf27f0fdb61d3bda5ff385b2fd590c3379a726b7fbc99be4432207b31296df5f |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 874d99c6757c010c5afb1443e23cc8bc866df9c6672e8dfca7f81c61aee4f700 |
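Each package row lists a SHA-256 checksum; after downloading, the archive should be verified against that value before use. A minimal sketch with `sha256sum` follows — the file name and computed hash here are placeholders for demonstration, so in practice substitute the downloaded package and the hash from its table row:

```shell
# Demonstration of SHA-256 verification. In real use, EXPECTED is the hash
# copied from the table row, and sample.tar.gz is the downloaded package.
printf 'sample package contents' > sample.tar.gz
EXPECTED=$(sha256sum sample.tar.gz | awk '{print $1}')   # in practice: hash from the table
echo "${EXPECTED}  sample.tar.gz" | sha256sum -c -       # prints "sample.tar.gz: OK" on a match
```

Note that `sha256sum -c` expects two spaces between the hash and the file name, and exits non-zero if the checksum does not match.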
2.2.1
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 0b05fbdb20a6657616093d6313c2645158ce9c07119ba301507d8c25154dda04 |
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 47116f31536516530409fada2f43c0c382abbc032cc5bac57fc03400a2f498c4 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 910f3fc8a8911967a338d7d3e191c21b7c2750d9c44a3a19b3bf05ae7d8a3686 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | f02ab1694e43a99a7a95a1fde333a725e51589464e8f78823174bdf1277f6273 |
| Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | ef9875f7b5e235c5e52afc9e370821c2a6d3cb6132d0674d24b24a24c7c60edd |
| Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 6e4364229e51d72e10a558d415a91fe9c242f0c6e7c4a2e0ca2e383a95052946 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | d4f98c01c54947e632e2d98f271f6a9b554fbfc7e018188bf3bdb6bfa403bed6 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | e956bc780a349557e217184bfdedf4cb8f8b9a6e5d922f3b9eab9980fed362b0 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | ff4c6ece4dc191c4a54ddd72f836d8f5350bf9aa81d2d93c35aa946e1d555c19 |
| NNIE converter tool | CPU | Linux-x86_64 | | 499d1bd70e87b72e0ccc7b104bb5f8c12b6d42683b83157db71d8db8c527e2c8 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 2c1b516d02c45ec5e22a8bb686630f2b9d7f555e2822dcaa4dc37bb4447fc0be |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 1c7e0cb7ef08d6aae2e9f66de67e3f7dce79e299b40c672cce9a9a35ea0ca00f |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 67134e3e5e5602b9c295c90983f2f25cf4e4856571ffc402992296ac4ae5db53 |
| Micro lib | Cortex-M7 | None | | a9525518639b1d70ed32c5e621c75f07f67a7c54dd8e7ce6b33dd0a4d1a072f9 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256 |
|---|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | ecbe098cae5aeb76b14512724a04ec47805b5fd2e337c7daaa8a5b2c653b0768 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 290dd79bba020c0fcffe69890f810bd082790ff306739f53d644ba9c5d7bfbe6 |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 714aca459d1c3a27e62a5ba65be6d0245f58d998fbcd36ac0dd8677c19541721 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | cc226e5f6d85887b4be852cffe1e3a08cb1f46d8793fc86bb1378271e0ce4771 |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 48a87c21be044be68c7de49cb2720e53c1851e37e7d6b3af68ccc735acae34ef |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 3ade20b5bebfdb0b4027086ec62c04e574a25229c5acbaff0b8a80ebff6a3a06 |
2.2.0
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 01829e31f374a459c452d06a47306b9b9f0f6b391f8728a7a788578d8a556882 |
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | ee60a012d465402e848d103a895752f9366ae134e933bdb5a5f437955c3238c2 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 997ebc4b29a182d002682b825c798856ff51d25f99f3f0e17a55674c943079ff |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | b08bebb528f40a77f86936b8019fe7e4ac08337fa049a471d17c8288bc29f7b5 |
| Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | e8f623ea4c44d20bb8ecb366113644f89e826b5a3194824002ca4d19e7627454 |
| Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | b6c0cb71c8371fdadfeec86ced45f4e442202c64762599cd1c29bf70c6654911 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 15b832083d77b4016a0bb10033f94765224184a72f9568935ba7c85caec38189 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | cff500ae05b9eb1c06c5dfc1bf36e516e28b01ec6c6cefb190bf5053f64aeed2 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | 7fb6fb4e179a1083fb266c49011353f2e83ae292147148129e2cc8a35bfe964a |
| NNIE converter tool | CPU | Linux-x86_64 | | 33e4764346878011b18d12ac3d00395254f7e026481974b4a6a2515e0b0c96ef |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 71288d0785413e0831593afde4093b68234016bb61af8ca7296423bf0b6284b5 |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 0b24949b5391fc64af99a10c712fd313b23c41a2d49c11c5e41322558571652b |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 6c86ffc30d5a6e6e60f9a9440fb2544284aa81d8f577f9e029e9999758c20455 |
| Micro lib | Cortex-M7 | None | | 580f0d69a67619841f6f3aae522e2c5a4e91fd3da58b162bbeb7e2e2eeffdaf1 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256 |
|---|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | 7215bfe6703f0260068f851cab33352c39618bbca21647e29a3cc732637de5e8 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 6c6d6e119c585dfe12fda48e59faff83cb0bf5b670643eb65911afdca6546b61 |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 18175dba32845165a6bfe8106d1e23f63f8de4781e65fc433a390ace7bf38798 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 9a01adb35a477f874e43098576115cf399c15855e2526be1da1ff05a833e19cf |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 8da72e0b1801985e2f86ca9439f766419af387f27a204df070fc14b83b432496 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 0be673b05f138cbf4c7854e72710ea2ef0b7e5a9734d65526fba6ff73a3c08c3 |
2.1.1
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 220c3c4eecd0102c195189b28e45ef7ec86b55381eaa5a9652b45bc06155b874 |
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | 173a079817b1c40efbdfb5fb1249c153e7ad8236ba43e4e1b89431f39bd484a5 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 27e857eb812d4190b17012fb5d0fcaf24be25000bcb9fba245bd5a390885c2b8 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | a2f30d23c3e44fbdec57cbcb0bf05d3f45bd3b242143a012e3aaa10a51844725 |
| Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | e06abc4fb445646e452236dd2a7aa069ebf1f251e9a6ae00021d662b09763281 |
| Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | a7e1d8027cdb00eeb7c71ee94de246b3fad3512dddde1120df14caddcd46fb41 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 23eb1d185c873a1e623b259842842d2bc8e88d699cafda530e7428a6e71036bc |
| iOS inference runtime lib | CPU | iOS-aarch32 | | 83fb7649e1847aa4eec808364d3fc5e60c3304508644819c0c4dd3c99441eff0 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | 6583880ad989710e41842dcc171b496e292c49b5275f5391d9e997a8e17db57b |
| NNIE converter tool | CPU | Linux-x86_64 | | e5d9dec0fae3c8ee149f878c42efc985a2ac35bb24ebb9db69fc4955fd4d4a47 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 576a1f5296d7e60a8d4291c9e6d5c84b6cc5f152d8747651974b3694994d998f |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | bdaf262d63328b4d85c87924d3d82d92d048da3ca3d1ba59e34f383b30ac4e7d |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | aa2e4309494eeff38e9a421639acfd0db1ebfc89359f704fbdc3f6fdc14836a6 |
| Micro lib | Cortex-M7 | None | | 552ffecedf62a064fc0e0106cfe9e237e2efa23d01627500271fa8c1ba505867 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256 |
|---|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | 3abc5c801572f977be172c630557788387b0ac57ea4611fdc1555a068541edba |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | e63c7fed6462fa48ad5f147fcddef060d04bfffdd232490e12c22b9f1090c710 |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.8 | | 6336bf6197d12016c5296c58d7c5c6b23b3ad9eeaab4c2ed5e21200ded1d6bd0 |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.8 | | 3066495713c2f65c9b3c16f2a46df3502af89b211cf01df4a57526fd68335ec2 |
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.9 | | 445e41dcf49f99cc1c7a4b2ddaae4b6af5a25e064ee935c2dafa3e01e287d97b |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.9 | | 01999bd2ad6e29aee1cf1123ce601d72c75d06d2644f80b8570ff62c288c8876 |
2.1.0
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | ed8de9c7654791acef1db62c5a0878ecdbcbf01533aa0ea53e499c11a6dc7855 |
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | ef99f635e8ea991211c6dbb393555724a74ad1faf65a3bb0a33e052290ffa1f6 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | fc348e5a186e62c59544a370855b266950f9723593696e79ea4af4afeb9474cc |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 764f76548f18725c20e91aed4f02f293c47bcae44a65d863738c30d94062718e |
| Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | b267e5726720329200389e47a178c4f882bf526833b714ba6e630c8e2920fe89 |
| Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 945d67c985cc7cd0786cc63620d7fe6a361617aac912fe9ce6423069f65d66ab |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 5b32178f2bcb57c1a0d33f3d99a7a966527d41f0745d5879fe3ae011704f93f6 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | 809e8674fb4920b7a9bdabdb4e08c5f606b7b82c54bc6d396c5805e8418ef74c |
| iOS inference runtime lib | CPU | iOS-aarch64 | | 286b6c884777cf186d5f7821d17228b2e17668534d6b9b2bb758c8d4d05404ad |
| NNIE converter tool | CPU | Linux-x86_64 | | 155654f7dbaf363389f633801353c2eb60dd9233fa68cbee3c922faf90d4c368 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | dcd50f6e989d2c49f4bbe7db1d1e7b8cc29ae6b7dd720371e8bcb7d137981e34 |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | b7a6d3330e7178991b7994c3e59194bf82eb2eff0c4b05e07062f76715d6605b |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | e2b5dddeaa04b7e358ecb102e80e69527d33d8119a6aeb79f56dee3f1282ab7d |
| Micro lib | Cortex-M7 | None | | 21f3d10cdfa1ca718870ee22da379544fe4ee86654969b37eeff4635e0c72004 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256 |
|---|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | e206559c4e9b6646b256143912114283b8bd23edd0784049e13ee911a830e71b |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 290e600c498f05946f3ef6e4c40e9ded2838459035a57619dd35425a6fba14ba |
2.0.0
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 556e21b9849ad8efeb5e8a50a26293749395072d8f3ef883aaecfc3b8bd21e7d |
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | e885f65181d61bf11281f786229b24cd7912d98b1a38e5ce11e83e8196b4074b |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 948ce4033a2f0a08e7ee881b22ca35ca5d6fa441dedd77331372016c96ce4be6 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 8f7d1863be93871780b303561a668d68e4debfebe784af98f1f608a18be2ea48 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | b997b45b5eb41f5c46cb80343f5883e3644b576dfb077e3888db6df7e707340b |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 7349bf1a66a217b0846eebf4c9eae82c31d6911efa9cd3b8194727a07736cd2a |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | fb117f9b277f8895fea79ed3b904479be3d23da64500f509b7f21f50381ca747 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | df1b07e76787b242583afc24b5384e8c769122161723741e783029cc0f6fed96 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | 404fc22c728ee2864e0888636e790b9d3df57c26d11c510ea440a29b32785a25 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | aa62e671bdbd5d861ab047765d2497974deedf6297c8d4a676a83664a9257c53 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | 3d40d81bfcbacd23dd6b33bc54e6abec23dd03df38f43890a05dac1ca4744568 |
| Micro lib | Cortex-M7 | None | | 43bbbdb3b2366a582c79617280615ae85e2cd05bc1bf6e7e59c1403b9bf37972 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Python Version | Download Links | SHA-256 |
|---|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | Python3.7 | | 199cd0605cc985b88eddb273db9dfe9000a585ba10ca2292b6b36ebfff07b3ec |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | Python3.7 | | 9006b56a0285833802b4880f34cbce5f94111ef5ca4071d5df5c61e8c2a3a6f7 |
2.0.0-rc1
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/GPU/Ascend | Linux-x86_64 | | 15e4a6d22d820092a0a48a334c16f9398f1196d9a7d64df990a5345c6da7c02c |
| Cloud-side inference runtime lib, inference jar package, benchmark tool, converter tool | CPU/Ascend | Linux-aarch64 | | cd7cba9b8384289b76a6267272bbff57d434e76f4aae6deeed9153128871ba17 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | b977fdf9c0cc8ce6bb3b6c471795681a5ae86ce19be0b730e43faa7bb49a7220 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 833109c5ed1b8ab2f96b6fe42ab4a8b257a3dca072111d19ed14b27ecc2cabb6 |
| Device-side inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | f5be857a0a75997843688f1a1f528df05331d62f0795f5698bae27ae7d26a408 |
| Device-side inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | b0971a652f0426d27750d21c703e37cfa44ae92c0f7e7949c429c91f0c177dd0 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | fb9b4eb1733d127f7bfac5518a249562efcc316988d8d62ce1da8167f977a204 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | bc23d3a511e5f49c49aed64b5e96e80ed2a0c01c16d53537ad7d8430d14f9529 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | ebcf27334e79913fd38866382ca0259e5c67c36c243a6458ef46ab0a205afdf6 |
| NNIE converter tool | CPU | Linux-x86_64 | | 35b3543151100656aa759c88b11f9df9fc62d49f709d62ffc0fdb0c3f35c5569 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 9f66975d804e4797dc099d01b1d10e0c77938a224f6e4d52034f59ef9699cc96 |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | b5836bd7a14fa31b5a05aed0eb80f968c79362d975c8f6b2964a8663386d91fc |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | b6b6ece083fa7864d14ffc629ff4377d0908a235e9a6ce6ef79ac41a2b2db41e |
| Micro lib | Cortex-M7 | None | | 08fe2e0e30cab340916c921a7fc320cf0b32d29c3528a6b5d45e8dbe77c3b82e |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Cloud-side inference runtime and converter | CPU/GPU/Ascend | Linux-x86_64 | | 55b067c52b0e059b54b2b5be27dc3313fca7d24ef98510252ee478aebbd4485f |
| Cloud-side inference runtime and converter | CPU/Ascend | Linux-aarch64 | | 593b5777c799cc4135b4accc1ca83cc77b1c3af80f8dabaf1545034085162bcb |
2.0.0-alpha
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 267a665e80a894c784915901c8f6e8ecd3fead84e1a440e84d6af36359ccf7a6 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | f4a7870646d11e4087980f221aa313dadcb42fdbc6c3a9465fd27cf3d5932aa9 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 9895b7bf32aaa8b233edbdd742c465dfa1a554a0eb663771e5accac16d2e2b72 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | f1aa81dd41f8999ee288af4213364cc3e16486eb7fd224d001dc053b938acace |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | ad834e1ab051605a9d6a844cc8caf45103753b4481451033e7067ed6ec5c2940 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | 466470e3d90c56d2f74b0dd67dac0583c134152da3c69eac32b0ca7e149e8a94 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | f3f43275ee0f8214e37a9b04aec784b704d8f4b584c641572874bc02ba97b976 |
| NNIE converter tool | CPU | Linux-x86_64 | | 1cf6de6435d991b6575ab5a1782cf57632fe876f27f1e2f33fe5e0e6563f9fd4 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 1835b9b1a61aacbb711e76e937fa0173dfcb633a408bf161b0232da6406270fe |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | ba726786d35ce99d118beaec43c1c745567d05a6058994006477a55a4ab196df |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 3049ad94e9fac30a3e6a10b9cfe30f466c7b2bbdd91b287fa385e3a59699544b |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | 8136d12b6637a8585dfa012bcc907d441e49e0f6d0fdb85a9a49663068c00003 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | 1b2881a6cffbd7a83f6485d18fc00dbd058680549b21dba953cab5111f088490 |
| HarmonyOS lite runtime lib, Micro lib | Hi3516D | OpenHarmony-aarch32 | | cf4d439cd61b73607efd6ce46816c21e4b8c1189930f6af56788b8087b164001 |
| Micro lib | Cortex-M7 | None | | a1b4e2d17697d24d172eb87f2ce92f9fbeb1ac79010dcc22af713fb59a84556b |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Converter and inference runtime | CPU | Linux-x86_64 | | 32e5b80d389514e9bee614f8aef94c8c981cace15bdddc14be93297efbc39970 |
| Converter and inference runtime | CPU | Linux-aarch64 | | e1ab7f8ede94e2378490510bfe05b0c861312ee7bd70e939c8c9187b7e0883ba |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | bdc11da01a18f887c69ceb465a66fe9cba02b039f888ebee894e4cce7eedc1d1 |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | d7ed914dbbd91e6ea5eb548c6757d22b0cfcc391aaedaa10d1bd5e4246d2da77 |
1.10.1
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | a6f2eab47f1b1da322670332c2d9c1f24513001531f57481e3e03a4c7c27677e |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 7d41ea8180a73c79ef7268d63b325a14ff5429dfdfccdd3557403404d9ebb9e0 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 093a92715e9972d5c5bde6774c22b94306df80c40b2f9852c13868061c7068ac |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 3669cf2da9afa54d1c254332f7ccfe4aac00cff58b66f3f489075d1e6abaf9e2 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 77a3ba17f52fd15b570eafd99fb2c7334a6662a3025a30ef649924aa80a6e4be |
| iOS inference runtime lib | CPU | iOS-aarch32 | | d619bbe8eca4a32bb2e0f69918c0c7bfb326b2bd1ad9be81f5c53470fedfa081 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | b862222e74a275963f3035ffe5f1b73130bcc0a7c2ca7907f9b06d04e971d2a5 |
| NNIE converter tool | CPU | Linux-x86_64 | | 893b80f768a3eef72fad45edf0c4db53bb9d0e205f42ce439d882a22d08e920f |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | cdd7ce851917b74ea0e5f0cda1d6c66bb263085afabc4e0fdbf410612f8017f3 |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 4840036f7f104a69d80e49a49d71865a639d7d3a2982d3975fbe1f6495810cf8 |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | da92cffc329998cecaae86890d331bb2082ec9b177ebdb7c9fbc0679261300c1 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | 845b221e41688e1d7763f75df18b54b59729c11105fd0d34697ca52ac4ba7d76 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | 919c3f88db13b4648558e995fe04aad0ec1751cb0e9761b9f6b836fe3128fea3 |
| HarmonyOS lite runtime lib, Micro lib | Hi3516D | OpenHarmony-aarch32 | | 3ecbc09ba40f435cb8ecd844a5acafb49ee6cdaadd4c4f9cb96404e8f4bda64f |
| Micro lib | Cortex-M7 | None | | 69f6fb71b1335a6481e7faccb23b6822414aeb2773f9aad9749814a251c5ec6e |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Converter and inference runtime | CPU | Linux-x86_64 | | 86e5f925648396f5a351097d3ade70168b53ade2367590b03106084a6fddd350 |
| Converter and inference runtime | CPU | Linux-aarch64 | | 908edc7e43bce3accd34cbcdb3a6b86ab063078dbf784f04516fd7531f5d25de |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | b50b531225810c85c4c61f868b446912a8c3bde34baff0759738378017eaacc5 |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 20b8ee6c80251fc6a0d7de53b4ea739d29cb4d70633d2af8825eb10dfe243077 |
1.10.0
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | fb752b080ab3c392b53584b97ef795475b0916e141cbfe9eb1aada0624cd60cd |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | a9abbdefcbaa7bef3cf0a2e2eb5b83256f81c391b3f28e2be4a52e1ed759fc08 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 28f0b48c16b0bc522c7d67899a034b80b0f8ea9505e6ffe658a1b1843279ba9f |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 4efe69f474f77433348e37bb8f15ff9107d077f864c3ccac725603a68e00bd11 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 37901f88e09788641f404649133392af8fedeaa853ab3edbaed7bafccd785136 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | a9ce4a24974e20832cd36a5bcd0efcf7bf1464291b9ca75b9f1c4c58749d0bd9 |
| iOS inference runtime lib | CPU | iOS-aarch64 | | 9c66c6f79d3be617ebe1848bf929e33e29200bce7262331cb3c2bcc1a162d60a |
| NNIE converter tool | CPU | Linux-x86_64 | | 8defb0de586967b05717749c32d3221f16a5d9248914b85b3d43b1bb9b2fdb98 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | baf451bee265e57a387e12c69f3d6202efc2ff2dd87635e4cbda1d38f536079c |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | ca1311c951fe42e4a5bcd321f7a2fe729660e5a304470b5f9b71357517ff67b2 |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 292b22de0c3f9b326139f83c645bdccaac7bf1e0bb0ffc577e210ce32deab0e4 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | 000f9e710d5de9445b9f5baf2d910165c4d7d905d87b55fc8b9f597e5ef030b7 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | 08ac0713faae329354d50fd6bcea98113c9d44355dc4bb91b86db6b209028998 |
| HarmonyOS lite runtime lib, Micro lib | Hi3516D | OpenHarmony-aarch32 | | b79ef529126aef24e0e0dcc74d0d8c90714166560f8f4527f04431d503afc6b4 |
| Micro lib | Cortex-M7 | None | | f48018ec794643f4296f187ddb65a7e453e0675dd878794f82ba09aea418b203 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Converter and inference runtime | CPU | Linux-x86_64 | | d0680f1b49c2eaae006dea334c95acf9d13cd984b7f09640d2170e672ebbf6ec |
| Converter and inference runtime | CPU | Linux-aarch64 | | 07c7726537f0c863b50c92be2d0917c683c65f2a740b0365a0763f9eeaccae89 |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | f99a0e7a474b90795487d703db52378674d4c8dba7d759511f01e8a192ce2704 |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 54909fe76d078b585238591a88f709e0b998418ac914cba1e748151c6a77a648 |
1.9.0
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 1662181a50852bd4759a821ca29872897efb835c4c318c738609ac4218b80905 |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 72a4d51dad9e17b46ae90986ff52cf33393cb549186bf9615dcb32a9d5d084a6 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 6175127898766e8a71b4968c8d7588ebe5b192e2c51925814d2b5b3e69c64a7f |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 00e87b88f095b24467cdb3ca752f4e7300b5ee58e28b33a584d9ff81f0647d7b |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | e9a5ac1146860bd576790bb46242a2cb5f8711eb25267cf726a7bf7c510216a2 |
| iOS inference runtime lib | CPU | iOS-aarch32 | | 655d9e71e8812e8afd6ff19ab523b3a942f526687bb9ff5165506ba32a1c510e |
| iOS inference runtime lib | CPU | iOS-aarch64 | | bfb56c4295d006d96b20b52636361519dde8ad2141abb54f5fa6fc3b61311401 |
| NNIE converter tool | CPU | Linux-x86_64 | | 5fccaa2c6f267f6edf52f8cb699697adc8253fbc3855248029a67d5b33cf8766 |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | 926c96e9a9f77952689eb0ef757ea79600f5c5fcfcef2c26e6178751efbd7a4d |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 5a98ffeaea6a0ec712b60cc8017a6623d0d8fa75ddb230c9a9e855fec4390456 |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | be897dd1f481d59a8979d0d028e30dcdb10727e6ae068c779dddeba775ba5a2b |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | 82ba2ac9218f832719030ed93d0fc0116380679f6739e8bc0fc9d978ad644261 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | 017a340d58b2b1a282fbeba2d2039dfe0932909a36f3e66180c57abc0bc49f58 |
| HarmonyOS lite runtime lib, Micro lib | Hi3516D | OpenHarmony-aarch32 | | 71261e4ee7e364fd4be04a9e919e84adb52e1e49d0eefff9725c5aa0cdb62ce9 |
| Micro lib | Cortex-M7 | None | | c101088a199bd4348d75e374a4ab2926db83a3b48bfc68dbcad91a3316c045af |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Converter and inference runtime | CPU | Linux-x86_64 | | b8e2385f385103d5ce1c841ba9d83e9a972bdace715b549aad0b9a8eb2ed15ac |
| Converter and inference runtime | CPU | Linux-aarch64 | | afb027398fc1474b60113480983da4df04caef7ef6ce3a93bd9ce1424468fbf4 |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | d0a9d979e769d5e2eed64a7603c843b98c2b62adaa5b181ae5a5d5d52db62a73 |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 05b9620b70ea5a964c6ab68da659ac838cc155e07bd39d4f54957071fe46551d |
1.8.1
Inference runtime
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU | Android-aarch32 | | 7cc56ef7c90e41b6df980c3c43cb326a7c3f1bff8b827fbba3bf19ac5c827f6e |
| Inference/training runtime lib, inference/training aar package, Micro lib, benchmark tool | CPU/GPU | Android-aarch64 | | 4dbee13424c347549903426ae794bf4520115625f78871d6c18eca9b62533551 |
| Inference/training runtime lib, inference/training jar package, Micro lib, benchmark tool, converter tool, cropper tool | CPU | Linux-x86_64 | | 8a60dbbfee8a9f25853ebbf297c7c727cab365171347c701c0aa51cd0b1290ec |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Linux-aarch64 | | 598841c2018c977fbdec5fa2d1b3c5f82fc3ffca41c7726faee8dcb850e6ba03 |
| Inference runtime lib, Micro lib, benchmark tool, converter tool | CPU | Windows-x86_64 | | 44adced8645f69ea452e19069306d425cd8a78de377db18f801327689e28cf0e |
| iOS inference runtime lib | CPU | iOS-aarch32 | | 22758b8ff36258e95a08dae79e09c70c150389adf7e74f543abb526b734fe64f |
| iOS inference runtime lib | CPU | iOS-aarch64 | | deb75103438d45fb6d7c42fa6feaef2a570d1ecc2a4bc2e1e4379b97b1413ec9 |
| NNIE converter tool | CPU | Linux-x86_64 | | 0ee5427a82c1e876113e305e5613232ee93fc168d6eeb9c127261fb4b0ccefcc |
| NNIE inference runtime lib, benchmark tool | Hi3516D | Linux-aarch32 | | f3bb6245fdd0ae51d090d428ce8e65fc8ca8934fc1d1e2cd262f046130db0835 |
| NNIE inference runtime lib, benchmark tool | Hi3519A | Linux-aarch32 | | 2552e6fac5c36d6c44c35eedba43a2e92aa28ebc0ec1125eb895c1d2ad62318c |
| NNIE inference runtime lib, benchmark tool | Hi3559A | Linux-aarch64 | | 25a7d30c82c1e90b6e832f2edd23ac45c8f4f75ba1fdcef633b7ee99af6856d0 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-x86_64 | | a57de862c2e0809194a9183c09968f0f871f3b3c9be729c482b068ed672e6d32 |
| Ascend inference runtime lib, benchmark tool, converter tool | Ascend | Linux-aarch64 | | e91c74f75d581c2a88ad42f75dc4007eb254469588dd42cec264477f0eab425b |
| HarmonyOS lite runtime lib, Micro lib | Hi3516D | OpenHarmony-aarch32 | | 67bd55222bf8c67e01ff413ac2226b7f9444f5a9875b82aa6faf3fcae753c1bf |
| Micro lib | Cortex-M7 | None | | 5b1d1de977aac04f29c9e8e84931b0b7ec2a700e103d0940e5afc3ced5f49bd0 |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Converter and inference runtime | CPU | Linux-x86_64 | | 60c5a974ad76ee06f63ce784ae536d65c42fde6c41eaea3375ecbef5fc3efef0 |
| Converter and inference runtime | CPU | Linux-aarch64 | | 1ca4cc3ac0df52ca578389502b7ace581d835683b18b1fecafd8bb10b89b3d84 |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | 95e8ff0fa4296428781e70d32f90cb44ddd97c55372cc7d2a4f61f7d9adbdd2c |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 9d361d71b2c8efc809a7fba939b8e2ba1e06368ba7d8a0ff6e192dbc0d777fc9 |
1.8.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 0ff34972be36b9c9ef896094b1b0e5cd39d2253cbed86a41303fd9c21f1d5417 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | f58139a33524191b6a42761948429aed4bc450d1e8c2986f58bfae1f5d81e626 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | 90be22a9f2a34329b46b1f0b08954a4b8cb2bd80bc69f8416f780f5f21f31547 |
| Inference runtime and benchmark/converter tools | CPU | Linux-aarch64 | | e17bfddc47feb17a430f3aea1d2f9b8895505518ec0a0dd109d9c4ed2f8d3b01 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | f18b3e1697b27bcaf67bad2793494bf8babddd8d5b235b861cf693a44d9e9f14 |
| iOS inference runtime | CPU | iOS-aarch32 | | 2270a87ec1888a5f9c62c37177476ee3aece5da8c98d2f379b83889e2f0db8c0 |
| iOS inference runtime | CPU | iOS-aarch64 | | ba368afa146f6e156d840e73145a81c6b2524a51dd0d138bcd2f9c2db5c59771 |
| NNIE converter tool | CPU | Linux-x86_64 | | 9d4fce71a3963a0fa779f7996abb315fd9572fe8aca8b562f65f5793b42812b4 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | b5739551b5217171e522aea27d93f64ce23d4bcd46e6b87bdaec2b09ec4c395b |
| Ascend inference runtime and benchmark/converter tools | Ascend | Linux-x86_64 | | a141a7886bde607fdc80fc7f35b28c14b11d36f28bbc3f917db6ae065cc32954 |
| Ascend inference runtime and benchmark/converter tools | Ascend | Linux-aarch64 | | 0dd382c848bbb6ae2bd45a31098d311ec046bf219787480156cf1e0123e9f602 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | 8c93859a1a7925069906abc59c9d5f5fc53b989b852e80a2ffe3b85e7ff944ba |
Python API wheel package
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Converter and inference runtime | CPU | Linux-x86_64 | | 80c08d67e2885f68e34c3b99356c1d95c72b1ad4cfdf29d08085ff7195753a08 |
| Converter and inference runtime | CPU | Linux-aarch64 | | a6365c96bc6ca04c7482cc2f2d3c1367d56ad7af13ed0e0d56b9349e773bcc30 |
| Ascend converter and inference runtime | Ascend | Linux-x86_64 | | 6539df5f1d59c679fd5a4e0b372b410ced226c2a3f4e1aec1a004b6bf2b6f1ba |
| Ascend converter and inference runtime | Ascend | Linux-aarch64 | | 8fed6d1ee3ab6673f733bb9071b19e1fc4a34a20813f9d5d4f6468fd6b5a36b6 |
1.7.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 26e31e9dc4a87698f0af18cd51ed219559b9a996b4123502608dd0a130270805 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | e97a19f6d5a1e8cd6ff1b17d8445801eff496a2f02f1a3de8305a1459fe6eaef |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | f162a4b2ccfb5ec11f2d510e4df1d76bd1d9f9d6e321b1e068ac9e37db9dfd0e |
| Inference runtime and benchmark/converter tools | CPU | Linux-aarch64 | | 320e2f076a068151c2796bb4a7ceb2190dc86ec78de0c39b52e6a496511b5e85 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | 41229b8be3997dd6ec648c7884b483ca4ca5926f3a7d9f78314af1cc2d2824ad |
| iOS inference runtime | CPU | iOS-aarch32 | | 769fe933ebc77ebeeac5486ca418f0f17b491a01253cce8d7368742832bc6a99 |
| iOS inference runtime | CPU | iOS-aarch64 | | 1ab0717c8d73ae64570f34ad203c6155f541a4420a24c24f26f0ffd919770832 |
| NNIE converter tool | CPU | Linux-x86_64 | | 3965c150ec0428693217dd88dceca4bd01de08f40f8299410af94d9ba3a170bc |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 16b51013b0b2e70e52edeb909535ad9818f7771e6c108f7ab7e207ffcd8673ba |
| Ascend inference runtime and benchmark/converter tools | Atlas 200/300/500 inference product | Linux-x86_64 | | 1fe2fdcabb9900c2d7e4ee9e239df157351574ed92b5f568eac23620f91f2ce8 |
| Ascend inference runtime and benchmark/converter tools | Atlas 200/300/500 inference product | Linux-aarch64 | | 41027f2e070896d8440c89b8b75f66e9c92bbefa72b8b68e959770c2ffea5780 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | 269f0dc976523302d372a0500a3c93eb78ea048c51b9c5ae339174f224e1cbee |
1.6.1
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | afcfab6c13d46d25a19d4caedb66c2a16ccd63d5b5993ed1e6268633d03a7820 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 7bba9b242c1b5bbb21e39313db0d1118aa568813466a82f3ddd689d570910b92 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | 667f8fae4762281858adfb71e62f11a52f641078496901d3f317b0df8c415452 |
| Inference runtime and benchmark/converter tools | CPU | Linux-aarch64 | | 238589a227534f0514d2371fe9a92eb5ac2f181a22917ac2b5ef6969d51c518e |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | da0781020f8680a7ab5c00d9930d06fe0f9c295aa9e13b62b43c2b30e0bcad91 |
| iOS inference runtime | CPU | iOS-aarch32 | | 6a8adfb3ea960af9213ed9474a9b2d9deef57ebe8cb9f9c822a314c7ff8165e7 |
| iOS inference runtime | CPU | iOS-aarch64 | | f7fe5477744bd1d7ee19d3292f150ac9665f60f613a3a8025a870b147885c76a |
| NNIE converter tool | CPU | Linux-x86_64 | | bde6c05aa90193d6fc0431ac3ef7a2c7e36e77db072955db68d16d6a90cfbe64 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | fc4ab99ae0fb4cad4a63dad591da8d20be624e29b563aba39bba8239c0d64bfc |
| Ascend inference runtime and benchmark/converter tools | Atlas 200/300/500 inference product | Linux-x86_64 | | 7b46fe61974d0295c1f816707e0330c26085ca5532c57f36b149fa851bcdbf7b |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | e70b2d30feafe0bbce3999c0bb551ed70edaf497c31fe84d70e4d5944093b253 |
1.6.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | d043803cffc8a0b75409aab3e4039f1e86756cf618af1538a76865e9fa4fd481 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 25188266621f4cfedb24970a9a98ef6190fe02c9d034b7285f360da425ffe9d6 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Linux-x86_64 | | 90472996359f64509f38036ed8100605c76dcdc42453c2fc7156048eb981708c |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x86_64 | | 4460b8f1bf321eca005074dccffb54d6d3164ba3f78ce34530ec20db4dbc9980 |
| iOS inference runtime | CPU | iOS-aarch32 | | 72fe007660abe9c51d0a1852b094fb52d8bbd1610c989e79c9858937102aa59f |
| iOS inference runtime | CPU | iOS-aarch64 | | 51bd5f7c21477d7856bea33d31e059f578b6b964a7c43e440e97c44b186db4a4 |
| NNIE converter tool | CPU | Linux-x86_64 | | 81c2a5dadf51978d1c80f75c63fde4edefa2897792ac571fd33ffd35e338736b |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 8133c2326e2defa3614f86592d5691fdb410a4296898e254a33cd33a7e519b16 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony-aarch32 | | d5daafac4bdcd0d03158e2a7cd3f881869b49cfb77d9654a24ddd967edbe5e91 |
1.5.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 83101ffc38de6c33c94d09bddd0efed31c23f468f76694ecbec5623db5f04afd |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 2f09c9f018c1141f2c415d48ff8ec01c17a2ad5e0b7d3233b8aa0612a2330a9e |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 2359084653c1ddb55da738d5daf65e2cb0b0426032232ae455a72f8960961823 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x64 | | 2ed5d767be638787755c3855e28312d655e15219ad8bf500b43d6cea3a8d2dc6 |
| iOS inference runtime | CPU | iOS-aarch32 | | b3cd43f694f051e996cb8b39ed30137f92a4c324adaaceac1d75aa995bd7deb1 |
| iOS inference runtime | CPU | iOS-aarch64 | | 9e919cdaf92fbb408ab169113182f7977d51409575e3a864a5a985a3004623d4 |
| NNIE converter tool | CPU | Ubuntu-x64 | | ab424c967b9ead17cebf1ef353b8b8129b7725e57f045be94a235741d12326c8 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 976ce83dc89f3ebeab9b706ce0b1a8f49cab7a2c19cd860446c3464164054ceb |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony | | 49cf161c90c259415718b0c6b4538bbc06d4629bfa604c4cbcbc04cf5ed1e3e8 |
1.5.0-rc1
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | 7317c9359bd6da97c389780a9a6f91987c512a794d3a367e42e411eaacb181e0 |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | d0453a1e5f5f653cc77f160b3ea9252ee1ab96d0bce3e9128d4a54f47279ac46 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 81451541762699c259d13dd1d5a985fde653aeaf91052471bb918eb4f5045f49 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x64 | | 3ee80e7906f6173b805d985b51551c489bd23614b8747fb6e795a7303729913b |
| iOS inference runtime | CPU | iOS-aarch32 | | 56a508d49605d46b65b84d4160155ebcbb21d1f9295baac1c96148f0a2b06e79 |
| iOS inference runtime | CPU | iOS-aarch64 | | 69af90551a97f6d36379431a63fc31ebff102809fc5ec7e7a48e566f3a578ad8 |
| NNIE converter tool | CPU | Ubuntu-x64 | | ee0ec7fdc203a5106eccc02ca15c7543c4e6c957e605f8a9aaead4d966f10c4b |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 477628889908fb64ae4b7b989c0181593f9c66bebf119f6a07437911879bff8b |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony | | 54654927f6bf0f90d316407c689575c47a263e0f3c608e5b430b9764b4c35f5c |
1.3.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU | Android-aarch32 | | c1a950feec47a58871956cab74f0b6f76ad2f151dde990228782f76c0d8120df |
| Inference/training runtime, inference/training aar package, and benchmark tools | CPU/GPU | Android-aarch64 | | 4def68662f5b249db0fd0f372fab1877e530fe32b6b9317869f01bedde892838 |
| Inference/training runtime, inference/training jar package, and benchmark/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 2d0f77bb3c1a9489bc9511f334fb6cea3266d6bd4d600517b5aa7e58efab1310 |
| Inference runtime and benchmark/codegen/converter tools | CPU | Windows-x64 | | 40c1abdeb4c5f5353844e9798c6a0f20565a5a3bc6de7da3cdc4a2df6fa15ef7 |
| iOS inference runtime | CPU | iOS-aarch32 | | b4bb1435887b04ce95be5429875e81c3e40a57b0c6182a35d58ea34b27d5fa5c |
| iOS inference runtime | CPU | iOS-aarch64 | | a28111c2bcc542a70ef98edfe007892855dcfcd9d40586fa98be962caefc26d3 |
| NNIE converter tool | CPU | Ubuntu-x64 | | 73f4dffde69d24a8d0574e771bc6131a45a8fc5ebd18b34fc7afd30a0d149cb1 |
| NNIE inference runtime and benchmark tools | Hi3516D | Linux-aarch32 | | 07a0d0a8a8f257c01d06fa33f57969eb68385b430f2f5a3a4b09dba463c361d9 |
| HarmonyOS lite runtime | Hi3516D | OpenHarmony | | 1bd481e93b6f3b2467a6d8f0bcc2da0221e4d023d74c584174dedb8854704748 |
1.2.0
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| Inference runtime (cpp), training runtime (cpp), inference aar package, and benchmark/benchmark_train tools | CPU | Android-aarch32 | | 7d073573385a69bff53542c395d106393da241682cd6053703ce21f1de23bac6 |
| Inference runtime (cpp), training runtime (cpp), inference aar package, and benchmark/benchmark_train tools | CPU/GPU | Android-aarch64 | | 7f8400f0b97fa3e7cbf0d266c73b43a2410905244b04d0202fab39d9267346e0 |
| Inference runtime (cpp), training runtime (cpp), inference jar package, and benchmark/benchmark_train/codegen/converter/cropper tools | CPU | Ubuntu-x64 | | 3b609ed8be9e3ae70987d6e00421ad4720776d797133e72f6952ba6b93059062 |
| Inference runtime (cpp) and benchmark/codegen/converter tools | CPU | Windows-x64 | | bf01851d7e2cde416502dce11bd2a86ef63e559f6dabba090405755a87ce14ae |
| Inference runtime (cpp) | CPU | OpenHarmony | | a9987b25815cb69e0f630be1388486e8d727a19815a67851089b7d633bd2f3f2 |
1.1.0
Inference
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| MindSpore Lite Converter | CPU | Ubuntu-x64 | | d449e38a8493c314d1b5b1a127f62269192da785b012ff892eda775dedca3d82 |
| MindSpore Lite Converter | CPU | Windows-x64 | | 5e50b7701b97ebe784095f2ba954fc6c377eb157fbc9aaeae2497e38cc4ee212 |
| MindSpore Lite Runtime (including image processing) | CPU/GPU/NPU | Android-aarch64/Android-aarch32 | | a19de5706db57e97a5f04ef08e0e383f8ea497c70bb60e60d056b31a603c0243 |
| MindSpore Lite Runtime (including image processing) | CPU | Ubuntu-x64 | | 176256c2fbef775f1a44aaeccae0c4eea6a60f41fc0baece5479dcb378155f36 |
| MindSpore Lite Runtime (including image processing) | CPU | Windows-x64 | | 30b5545245832a73d84732166f360c77cd09a7a4fe1fb922a8f7b80e7df326c1 |
Train
| Module Name | Hardware Platform | Operating System | Download Links | SHA-256 |
|---|---|---|---|---|
| MindSpore Lite Converter | CPU | Ubuntu-x64 | | f95a9db98c84ec3d97f88383ecc3832582aa9737ed287c33703deb0b419acf25 |
| MindSpore Lite Runtime (including image processing) | CPU | Android-aarch64/Android-aarch32 | | a6d8152f4e2d674c52af2c379f7d07858d30bc0dceef1dbc366e6fa16a5948b5 |
| MindSpore Lite Runtime (including image processing) | CPU | Ubuntu-x64 | | 1290f0adc790adc9edce654b9a629a9a323cfcb8453eb6bc19b779ef726282bf |
The Ubuntu-x64 package is compiled in an environment where the GCC version is greater than or equal to 7.3.0, so the deployment environment requires a GLIBC version greater than or equal to 2.27.
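One way to confirm this requirement before deploying is to query the GLIBC version of the target machine; a minimal sketch (the exact output format varies by distribution):

```shell
# Print the GLIBC version of the current environment.
# The Linux packages need GLIBC >= 2.27 because they are built with GCC >= 7.3.0.
ldd --version | head -n 1
```

On glibc-based systems the first line typically reports a version such as "2.31"; compare it against the 2.27 minimum.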
Android-aarch32 does not support GPU or NPU.
MindSpore Lite also provides a cropper tool for the `libmindspore-lite.a` static runtime library, which can crop the static library files and effectively reduce the library size.

After downloading a MindSpore Lite package, verify its integrity by checking its SHA-256 checksum against the value listed in the tables above.
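On Linux, the verification can be done with the `sha256sum` tool. In the sketch below, the file name is a placeholder for whichever package you downloaded, and the expected digest must be replaced with the matching value from the table:

```shell
# Placeholders: substitute the real package name and the digest from the table above.
PKG=mindspore-lite-pkg.tar.gz
EXPECTED=0000000000000000000000000000000000000000000000000000000000000000

# sha256sum -c reads "<digest>  <file>" lines, prints "<file>: OK" on a match,
# and exits with a non-zero status on a mismatch.
echo "${EXPECTED}  ${PKG}" | sha256sum -c -
```

If the check fails, the download is corrupted or incomplete and the package should be downloaded again.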