Lite Operator Support
Linux · Ascend · Device-side · Inference Application · Beginner · Intermediate · Advanced
This document lists the operators supported by MindSpore Lite.
Operation | CPU<br>FP16 | CPU<br>FP32 | CPU<br>Int8 | CPU<br>UInt8 | GPU<br>FP16 | GPU<br>FP32 | NPU | TensorRT | Supported TensorFlow Lite operators | Supported Caffe Lite operators | Supported Onnx Lite operators | Supported TensorFlow operators |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Abs | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Abs | | Abs | Abs |
Add | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Add | | Add, Int8Add | Add, AddV2 |
AddGrad | ✅ |
AddN | ✅ | ✅ | | | | | | | AddN |
Assert | ✅ | | | | | | | | | | | Assert |
Argmax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Argmax | ArgMax | ArgMax | Argmax |
Argmin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Argmin | | ArgMin |
AvgPool | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | MeanPooling | Pooling | AveragePool, | AvgPool |
AvgPoolGrad | ✅ | ✅ |
BatchNorm | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | BatchNorm | BatchNormalization |
BatchNormGrad | ✅ | ✅ |
BatchToSpace | ✅ | ✅ | ✅ | ✅ | ✅ | | | | BatchToSpace, | | | BatchToSpace, |
BiasAdd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | BiasAdd | BiasAdd |
BiasAddGrad | ✅ | ✅ |
Broadcast | ✅ | | | | | | | | BroadcastTo | | Expand |
Cast | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Cast, | | Cast | Cast |
Ceil | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Ceil | | Ceil | Ceil |
Concat | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Concat | Concat | Concat | ConcatV2 |
ConstantOfShape | ✅ | | | | | | | | | | ConstantOfShape |
Conv2d | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Conv2D | Convolution | Conv, Int8Conv, | Conv2D |
Conv2dGrad | ✅ |
Conv2dTranspose | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | DeConv2D | Deconvolution | ConvTranspose | Conv2DBackpropInput |
Conv2dTransposeGrad | ✅ |
Cos | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Cos | | Cos | Cos |
Crop | ✅ | ✅ | ✅ | ✅ | | | | | | Crop |
CropAndResize | ✅ | ✅ | | | | | | | | | | CropAndResize |
CumSum | ✅ | | | | | | | | | | | Cumsum |
CustomExtractFeatures | ✅ | | | | | | | | ExtractFeatures |
CustomNormalize | ✅ | | | | | | | | Normalize |
CustomPredict | ✅ | | | | | | | | Predict |
DeDepthwiseConv2D | ✅ | ✅ | ✅ | | | | | | | Deconvolution |
DepthToSpace | ✅ | ✅ | ✅ | ✅ | ✅ | | | | DepthToSpace | | DepthToSpace |
DepthwiseConv2dNative | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | DepthwiseConv2D | Convolution | | DepthwiseConv2dNative |
DetectionPostProcess | ✅ | ✅ | ✅ | | | | | | Custom |
Div | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Div, RealDiv | | Div | Div, RealDiv |
DivGrad | ✅ |
Eltwise | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Eltwise | Sum, Max[3] |
Elu | ✅ | | | | | | | | Elu | | Elu, | NonMaxSuppressionV3 |
EluGrad | ✅ |
Equal | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Equal | | Equal | Equal |
Exp | ✅ | ✅ | ✅ | | | | | | Exp | Exp | Exp | Exp |
ExpandDims | ✅ | ✅ | ✅ | ✅ | ✅ | | | | ExpandDims | | | ExpandDims |
Fill | ✅ | ✅ | | | | | | | Fill | | | Fill |
Flatten | ✅ | ✅ | | | | | | | | Flatten |
Floor | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Floor | | Floor | Floor |
FloorDiv | ✅ | ✅ | ✅ | ✅ | ✅ | | | | FloorDiv | | | FloorDiv |
FloorMod | ✅ | ✅ | ✅ | ✅ | ✅ | | | | FloorMod | | | FloorMod |
FullConnection | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | FullyConnected | InnerProduct |
FusedBatchNorm | ✅ | ✅ | ✅ | ✅ | ✅ | | | | FusedBatchNorm | | | FusedBatchNorm, |
GatherNd | ✅ | ✅ | ✅ | ✅ | ✅ | | | | GatherND | | | GatherNd |
Gather | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Gather | | Gather | GatherV2 |
Greater | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Greater | | Greater | Greater |
GreaterEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | GreaterEqual | | | GreaterEqual |
GRU | ✅ | ✅ |
HardTanh | ✅ | ✅ |
HashtableLookup | ✅ | | | | | | | | HashtableLookup |
HSigmoid | ✅ | ✅ |
Hswish | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | HardSwish |
HswishGrad | ✅ |
InstanceNorm | ✅ | ✅ | | | | | | | | | InstanceNorm |
InvertPermutation | ✅ | | | | | | | | | | | InvertPermutation |
L2Norm | ✅ | ✅ | | | | | | | L2_NORMALIZATION |
LayerNorm | ✅ | ✅ | ✅ |
LeakyReLU | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | LeakyRelu | | LeakyRelu | LeakyRelu |
LeakyReLUGrad | ✅ |
Less | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Less | | Less | Less |
LessEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | LessEqual | | | LessEqual |
LRN | ✅ | | | | | | | | LocalResponseNorm | | Lrn, LRN |
Log | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Log | | Log | Log |
LogGrad | ✅ | ✅ |
LogicalAnd | ✅ | ✅ | ✅ | ✅ | ✅ | | | | LogicalAnd | | And | LogicalAnd |
LogicalNot | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | LogicalNot | | Not | LogicalNot |
LogicalOr | ✅ | ✅ | ✅ | ✅ | ✅ | | | | LogicalOr | | Or | LogicalOr |
LogSoftmax | ✅ | ✅ | ✅ | ✅ | ✅ | | | | LogSoftmax | | | LogSoftmax |
LshProjection | ✅ | | | | | | | | LshProjection |
LSTM | ✅ | ✅ | | | | | | | | | LSTM |
MatMul | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | BatchMatMul | | MatMul, | MatMul, |
MatMulGrad | ✅ |
Maximum | ✅ | ✅ | ✅ | ✅ | ✅ | | | | Maximum | | | Maximum |
MaximumGrad | ✅ | ✅ |
MaxPool | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | MaxPooling | Pooling | MaxPool, | MaxPool |
MaxPoolGrad | ✅ | ✅ |
Merge | ✅ | ✅ | | | | | | | | | | Merge |
Minimum | ✅ | ✅ | ✅ | ✅ | ✅ | | | | Minimum | | Min | Minimum |
MinimumGrad | ✅ | ✅ |
Mul | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Mul | | Mul | Mul |
MulGrad | ✅ |
Neg | ✅ | ✅ | ✅ | ✅ | ✅ | | | | Neg | | Neg |
NegGrad | ✅ | ✅ |
NotEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | NotEqual | | | NotEqual |
OneHot | ✅ | ✅ | ✅ | | | | | | OneHot | | OneHot | OneHot |
Pad | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Pad, MirrorPad, PadV2 | | Pad | MirrorPad, Pad, PadV2 |
Pow | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Pow | Power | Pow[2] | Pow |
PowGrad | ✅ |
PReLU | ✅ | ✅ | ✅ | | | | | | PRELU | PReLU | PRelu |
RandomStandardNormal | ✅ | | | | | | | | | | | RandomStandardNormal |
RandomUniform | ✅ | | | | | | | | | | | RandomUniform |
Range | ✅ | | | | | | | | Range | | | Range, |
Rank | ✅ | | | | | | | | Rank | | | Rank |
Reciprocal | ✅ | ✅ | ✅ | ✅ |
ReduceAll | ✅ | | | | | | | | | | | All |
ReduceASum | ✅ | ✅ | ✅ | | | | | | | Reduction |
ReduceMax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ReduceMax | | ReduceMax | Max |
ReduceMean | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Mean | Reduction | ReduceMean | Mean |
ReduceMin | ✅ | ✅ | ✅ | ✅ | ✅ | | | | ReduceMin | | ReduceMin | Min |
ReduceProd | ✅ | ✅ | ✅ | ✅ | ✅ | | | | ReduceProd | | ReduceProd | Prod |
ReduceSum | ✅ | ✅ | ✅ | ✅ | ✅ | | | | Sum | Reduction | ReduceSum | Sum |
ReduceSumSquare | ✅ | ✅ | ✅ | | | | | | | Reduction | ReduceSumSquare |
ReLU | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Relu | ReLU | Relu | Relu |
ReLUGrad | ✅ | ✅ |
ReLU6 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Relu6 | ReLU6 | Clip[1] | Relu6 |
ReLU6Grad | ✅ | ✅ |
Reshape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Reshape | Reshape | Reshape, | Reshape |
Resize | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ResizeBilinear, | Interp | | ResizeBilinear, |
ResizeGrad | ✅ | ✅ |
Reverse | ✅ | | | | | | | | reverse | | | ReverseV2 |
ReverseSequence | ✅ | | | | | | | | ReverseSequence | | | ReverseSequence |
Round | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Round | | Round | Round |
Rsqrt | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Rsqrt | | | Rsqrt |
Select | ✅ | | | | | | | | | | | Select |
Selu | | | | | | | | | | | | Selu |
Scale | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Scale |
ScatterNd | ✅ | | | | | | | | ScatterNd |
Shape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Shape | | Shape | Shape |
Sigmoid | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Logistic | Sigmoid | Sigmoid | Sigmoid |
SigmoidGrad | ✅ | ✅ |
Sin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Sin | | Sin | Sin |
Size | ✅ | | | | | | | | | | | Size |
Slice | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Slice | Slice | Slice | Slice |
SkipGram | ✅ | | | | | | | | SkipGram |
Softmax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Softmax | Softmax | Softmax | Softmax |
SoftmaxGrad | ✅ |
Softplus | ✅ | | | | | | | | | | | Softplus |
SpaceToBatch | ✅ | ✅ | ✅ | ✅ | ✅ | | | | SpaceToBatch |
SpaceToBatchND | ✅ | ✅ | ✅ | ✅ | ✅ | | | | SpaceToBatchND | | | SpaceToBatchND |
SpaceToDepth | ✅ | ✅ | ✅ | | | | | | SpaceToDepth | | SpaceToDepth |
SparseToDense | ✅ | ✅ | ✅ | | | | | | SparseToDense |
Split | ✅ | ✅ | ✅ | ✅ | ✅ | | | | Split, SplitV | | Split | Split, SplitV |
Sqrt | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sqrt | | Sqrt | Sqrt |
Square | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Square | | | Square |
SquaredDifference | ✅ | ✅ | ✅ | ✅ | ✅ | | | | SquaredDifference | | | SquaredDifference |
Squeeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Squeeze | | Squeeze | Squeeze |
StridedSlice | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | StridedSlice | | Slice, | StridedSlice |
Stack | ✅ | ✅ | ✅ | ✅ | | | | | Stack | | | Pack |
Sub | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sub | | Sub | Sub |
SubGrad | ✅ |
Swish | ✅ | ✅ |
Switch | ✅ | ✅ | | | | | | | | | | Switch |
Tanh | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Tanh | TanH | Tanh, Sign | Tanh |
TanhGrad | ✅ |
TensorListFromTensor | ✅ | ✅ | | | | | | | | | | TensorListFromTensor |
TensorListGetItem | ✅ | ✅ | | | | | | | | | | TensorListGetItem |
TensorListReserve | ✅ | ✅ | | | | | | | | | | TensorListReserve |
TensorListSetItem | ✅ | ✅ | | | | | | | | | | TensorListSetItem |
TensorListStack | ✅ | ✅ | | | | | | | | | | TensorListStack |
Tile | ✅ | ✅ | ✅ | | | | | | Tile | Tile | Tile |
TopK | ✅ | ✅ | ✅ | | | | | | TopKV2 | | TopK | TopKV2 |
Transpose | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Transpose | Permute | Transpose | Transpose |
UniformReal | ✅ |
Unique | ✅ | | | | | | | | Unique |
Unsqueeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | Unsqueeze |
Unstack | ✅ | | | | | | | | Unstack |
Where | ✅ | | | | | | | | Where | | Where | Where |
ZerosLike | ✅ | | | | | | | | ZerosLike | | | ZerosLike |
Other operators supported by the conversion tool[4] | | | | | | | | | | | Loop, Dropout, If | Dropout, Enter, |
[1] Clip: only conversion of clip(0, 6) to Relu6 is supported.

[2] Pow: supported only when the exponent is a single constant.

[3] Sum and Max: only two inputs are supported.

[4] Operators that the conversion tool supports but that require no concrete implementation. Operators of this kind are generally optimized away during conversion, e.g. fused into other operators or replaced by equivalent ones.
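Footnote [1] rests on the element-wise identity clip(x, 0, 6) = Relu6(x), which is what lets the converter rewrite a Clip node with bounds (0, 6) as Relu6. A minimal NumPy sketch of this equivalence (illustrative only, not MindSpore Lite API; the `relu6` helper is defined here just for the example):

```python
import numpy as np

def relu6(x):
    # ReLU6 as commonly defined: min(max(x, 0), 6)
    return np.minimum(np.maximum(x, 0.0), 6.0)

x = np.array([-3.0, 0.5, 2.0, 6.0, 10.0])

# clip(x, 0, 6) and Relu6(x) agree on every element, so a
# Clip(min=0, max=6) node can be replaced by a Relu6 node.
print(np.array_equal(np.clip(x, 0.0, 6.0), relu6(x)))  # True
```

A Clip node with any other bounds (e.g. min=-1 or max=3) does not match Relu6, which is why the footnote restricts the rewrite to exactly (0, 6).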