# Lite Operator Support

`Linux` `Ascend` `On-Device` `Inference Application` `Beginner` `Intermediate` `Expert`

[![View Source File](./_static/logo_source.png)](https://gitee.com/mindspore/docs/blob/r1.2/tutorials/lite/source_zh_cn/operator_list_lite.md)

This article lists the operators supported by MindSpore Lite.
| Operation | CPU<br>FP16 | CPU<br>FP32 | CPU<br>Int8 | CPU<br>UInt8 | GPU<br>FP16 | GPU<br>FP32 | NPU | Supported TensorFlow Lite Operators | Supported Caffe Lite Operators | Supported Onnx Lite Operators | Supported TensorFlow Operators |
| --------------------- | :------------: | :------------: | :------------: | :-------------: | :------------: | :------------: | :---------: | ------------------------------- | ------------------------ | ----------------------------------------------- | ----------------------------------------------- |
| Abs | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Abs | | Abs | Abs |
| Add | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Add | | Add, Int8Add | Add, AddV2 |
| AddGrad | | ✅ | | | | | | | | | |
| AddN | | ✅ | | | | | | AddN | | | |
| Assert | | ✅ | | | | | | | | | Assert |
| Argmax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Argmax | ArgMax | ArgMax | Argmax |
| Argmin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Argmin | | | ArgMin |
| AvgPool | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | MeanPooling | Pooling | AveragePool, GlobalAveragePool, Int8AveragePool | AvgPool |
| AvgPoolGrad | | ✅ | | | | | | | | | |
| BatchNorm | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | BatchNorm | BatchNormalization | |
| BatchNormGrad | | ✅ | | | | | | | | | |
| BatchToSpace | | ✅ | ✅ | ✅ | ✅ | ✅ | | BatchToSpace, BatchToSpaceND | | | BatchToSpace, BatchToSpaceND |
| BiasAdd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | BiasAdd | BiasAdd |
| BiasAddGrad | | ✅ | | | | | | | | | |
| Broadcast | | ✅ | | | | | | BroadcastTo | | Expand | |
| Cast | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Cast, QUANTIZE, DEQUANTIZE | | Cast | Cast |
| Ceil | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Ceil | | Ceil | Ceil |
| Concat | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Concat | Concat | Concat | ConcatV2 |
| ConstantOfShape | | ✅ | | | | | | | | ConstantOfShape | |
| Conv2d | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Conv2D | Convolution | Conv, Int8Conv, ConvRelu, Int8ConvRelu | Conv2D |
| Conv2dGrad | | ✅ | | | | | | | | | |
| Conv2dTranspose | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | DeConv2D | Deconvolution | ConvTranspose | Conv2DBackpropInput |
| Conv2dTransposeGrad | | ✅ | | | | | | | | | |
| Cos | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Cos | | Cos | Cos |
| Crop | ✅ | ✅ | ✅ | ✅ | | | | | Crop | | |
| CropAndResize | | ✅ | | | | | | | | | CropAndResize |
| CustomExtractFeatures | | ✅ | | | | | | ExtractFeatures | | | |
| CustomNormalize | | ✅ | | | | | | Normalize | | | |
| CustomPredict | | ✅ | | | | | | Predict | | | |
| DeDepthwiseConv2D | | ✅ | ✅ | ✅ | | | | | Deconvolution | | |
| DepthToSpace | | ✅ | ✅ | ✅ | ✅ | ✅ | | DepthToSpace | | DepthToSpace | |
| DepthwiseConv2dNative | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | DepthwiseConv2D | Convolution | | DepthwiseConv2dNative |
| DetectionPostProcess | | ✅ | ✅ | ✅ | | | | Custom | | | |
| Div | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Div, RealDiv | | Div | Div, RealDiv |
| DivGrad | | ✅ | | | | | | | | | |
| Eltwise | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Eltwise | Sum, Max[3] | |
| Elu | | ✅ | | | | | | | Elu | Elu, NonMaxSuppression | NonMaxSuppressionV3 |
| EluGrad | | ✅ | | | | | | | | | |
| Equal | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Equal | | Equal | Equal |
| Exp | | ✅ | | | ✅ | ✅ | | Exp | Exp | Exp | Exp |
| ExpandDims | ✅ | ✅ | ✅ | ✅ | | | | ExpandDims | | | ExpandDims |
| Fill | | ✅ | | | | | | Fill | | | Fill |
| Flatten | ✅ | ✅ | | | | | | | Flatten | | |
| Floor | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Floor | | Floor | Floor |
| FloorDiv | ✅ | ✅ | | | ✅ | ✅ | ✅ | FloorDiv | | | FloorDiv |
| FloorMod | ✅ | ✅ | | | ✅ | ✅ | ✅ | FloorMod | | | FloorMod |
| FullConnection | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | FullyConnected | InnerProduct | | |
| FusedBatchNorm | ✅ | ✅ | ✅ | ✅ | | | ✅ | FusedBatchNorm | | | FusedBatchNorm, FusedBatchNormV3 |
| GatherNd | | ✅ | ✅ | ✅ | ✅ | ✅ | | GatherND | | | GatherNd |
| Gather | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Gather | | Gather | GatherV2 |
| Greater | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Greater | | Greater | Greater |
| GreaterEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | GreaterEqual | | | GreaterEqual |
| GRU | ✅ | ✅ | | | | | | | | | |
| HardTanh | ✅ | ✅ | | | | | | | | | |
| HashtableLookup | | ✅ | | | | | | HashtableLookup | | | |
| HSigmoid | | ✅ | | ✅ | | | | | | | |
| Hswish | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | HardSwish | | | |
| HswishGrad | | ✅ | | | | | | | | | |
| InstanceNorm | ✅ | ✅ | | | | | | InstanceNorm | | | |
| InvertPermutation | | ✅ | | | | | | | | | InvertPermutation |
| L2Norm | | ✅ | ✅ | | | | | L2_NORMALIZATION | | | |
| LayerNorm | | ✅ | ✅ | | | | | | | | |
| LeakyReLU | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | LeakyRelu | | LeakyRelu | LeakyRelu |
| LeakyReLUGrad | | ✅ | | | | | | | | | |
| Less | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Less | | Less | Less |
| LessEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | LessEqual | | | LessEqual |
| LRN | | ✅ | | | | | | LocalResponseNorm | | Lrn, LRN | |
| Log | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Log | | Log | Log |
| LogGrad | ✅ | ✅ | | | | | | | | | |
| LogicalAnd | ✅ | ✅ | | | ✅ | ✅ | ✅ | LogicalAnd | | And | LogicalAnd |
| LogicalNot | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | LogicalNot | | Not | LogicalNot |
| LogicalOr | ✅ | ✅ | | | ✅ | ✅ | ✅ | LogicalOr | | Or | LogicalOr |
| LshProjection | | ✅ | | | | | | LshProjection | | | |
| LSTM | ✅ | ✅ | | | | | | | | LSTM | |
| MatMul | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | MatMul | MatMul, BatchMatMul |
| MatMulGrad | | ✅ | | | | | | | | | |
| Maximum | ✅ | ✅ | | | ✅ | ✅ | ✅ | Maximum | | | Maximum |
| MaximumGrad | | ✅ | | | | | | | | | |
| MaxPool | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | MaxPooling | Pooling | MaxPool, GlobalMaxPool | MaxPool |
| MaxPoolGrad | | ✅ | | | | | | | | | |
| Merge | ✅ | ✅ | | | | | | | | | Merge |
| Minimum | ✅ | ✅ | | | ✅ | ✅ | ✅ | Minimum | | Min | Minimum |
| MinimumGrad | | ✅ | | | | | | | | | |
| Mul | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Mul | | Mul | Mul |
| MulGrad | | ✅ | | | | | | | | | |
| Neg | ✅ | ✅ | | | ✅ | ✅ | ✅ | Neg | | Neg | |
| NegGrad | | ✅ | | | | | | | | | |
| NotEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | NotEqual | | | NotEqual |
| OneHot | | ✅ | | | ✅ | ✅ | | OneHot | | OneHot | OneHot |
| Pad | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Pad, MirrorPad | | Pad | MirrorPad, Pad |
| Pow | | ✅ | ✅ | ✅ | ✅ | ✅ | | Pow | Power | Pow[2] | Pow |
| PowGrad | | ✅ | | | | | | | | | |
| PReLU | | ✅ | | | ✅ | ✅ | | PRELU | PReLU | PRelu | |
| RandomStandardNormal | | ✅ | | | | | | | | | RandomStandardNormal |
| RandomUniform | | ✅ | | | | | | | | | RandomUniform |
| Range | | ✅ | | | | | | Range | | | Range, RaggedRange |
| Rank | | ✅ | | | | | | Rank | | | Rank |
| Reciprocal | ✅ | ✅ | ✅ | | | | ✅ | | | | |
| ReduceAll | | ✅ | | | | | | | | | All |
| ReduceASum | | ✅ | | | ✅ | ✅ | | | Reduction | | |
| ReduceMax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ReduceMax | | ReduceMax | Max |
| ReduceMean | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Mean | Reduction | ReduceMean | Mean |
| ReduceMin | | ✅ | ✅ | ✅ | ✅ | ✅ | | ReduceMin | | ReduceMin | Min |
| ReduceProd | | ✅ | ✅ | ✅ | ✅ | ✅ | | ReduceProd | | ReduceProd | Prod |
| ReduceSum | | ✅ | ✅ | ✅ | ✅ | ✅ | | Sum | Reduction | ReduceSum | Sum |
| ReduceSumSquare | | ✅ | ✅ | ✅ | | | | | Reduction | ReduceSumSquare | |
| ReLU | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Relu | ReLU | Relu | Relu |
| ReLUGrad | ✅ | ✅ | | | | | | | | | |
| ReLU6 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Relu6 | ReLU6 | Clip[1] | Relu6 |
| ReLU6Grad | ✅ | ✅ | | | | | | | | | |
| Reshape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Reshape | Reshape | Reshape,Flatten | Reshape |
| Resize | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ResizeBilinear, NearestNeighbor | Interp | | ResizeBilinear, ResizeBicubic, ResizeNearestNeighbor |
| ResizeGrad | | ✅ | | | | | | | | | |
| Reverse | | ✅ | | | | | | Reverse | | | ReverseV2 |
| ReverseSequence | | ✅ | | | | | | ReverseSequence | | | ReverseSequence |
| Round | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Round | | Round | Round |
| Rsqrt | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Rsqrt | | | Rsqrt |
| Select | | ✅ | | | | | | | | | Select |
| Selu | | | | | | | | | | | Selu |
| Scale | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Scale | | |
| ScatterNd | | ✅ | | | | | | ScatterNd | | | |
| Shape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Shape | | Shape | Shape |
| Sigmoid | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Logistic | Sigmoid | Sigmoid | Sigmoid |
| SigmoidGrad | ✅ | ✅ | | | | | | | | | |
| Sin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sin | | Sin | Sin |
| Size | | ✅ | | | | | | | | | Size |
| Slice | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Slice | Slice | Slice | Slice |
| SkipGram | | ✅ | | | | | | SkipGram | | | |
| Softmax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Softmax | Softmax | Softmax | Softmax |
| SoftmaxGrad | | ✅ | | | | | | | | | |
| SpaceToBatch | | ✅ | ✅ | ✅ | ✅ | ✅ | | SpaceToBatch | | | |
| SpaceToBatchND | | ✅ | ✅ | ✅ | ✅ | ✅ | | SpaceToBatchND | | | SpaceToBatchND |
| SpaceToDepth | | ✅ | | | ✅ | ✅ | | SpaceToDepth | | SpaceToDepth | |
| SparseToDense | | ✅ | | | ✅ | ✅ | | SparseToDense | | | |
| Split | ✅ | ✅ | ✅ | ✅ | | | ✅ | Split, SplitV | | Split | Split, SplitV |
| Sqrt | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sqrt | | Sqrt | Sqrt |
| Square | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Square | | | Square |
| SquaredDifference | ✅ | ✅ | | | ✅ | ✅ | ✅ | SquaredDifference | | | SquaredDifference |
| Squeeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Squeeze | | Squeeze | Squeeze |
| StridedSlice | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | StridedSlice | | | StridedSlice |
| Stack | ✅ | ✅ | | | ✅ | ✅ | | Stack | | | Pack |
| Sub | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sub | | Sub | Sub |
| SubGrad | | ✅ | | | | | | | | | |
| Swish | ✅ | ✅ | | | | | | | | | |
| Switch | ✅ | ✅ | | | | | | | | | Switch |
| Tanh | ✅ | ✅ | | | ✅ | ✅ | ✅ | Tanh | TanH | Tanh, Sign | Tanh |
| TanhGrad | | ✅ | | | | | | | | | |
| TensorListFromTensor | ✅ | ✅ | | | | | | | | | TensorListFromTensor |
| TensorListGetItem | ✅ | ✅ | | | | | | | | | TensorListGetItem |
| TensorListReserve | ✅ | ✅ | | | | | | | | | TensorListReserve |
| TensorListSetItem | ✅ | ✅ | | | | | | | | | TensorListSetItem |
| TensorListStack | ✅ | ✅ | | | | | | | | | TensorListStack |
| Tile | ✅ | ✅ | | | | | | Tile | Tile | Tile | Tile |
| TopK | | ✅ | ✅ | ✅ | | | | TopKV2 | | TopK | TopKV2 |
| Transpose | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | Transpose | Permute | Transpose | Transpose |
| Unique | | ✅ | | | | | | Unique | | | |
| Unsqueeze | ✅ | ✅ | ✅ | ✅ | | | ✅ | | | Unsqueeze | |
| Unstack | | ✅ | | | | | | Unstack | | | |
| Where | | ✅ | | | | | | Where | | | Where |
| ZerosLike | | ✅ | | | | | | ZerosLike | | | ZerosLike |
| Other operators supported by the conversion tool[4] | | | | | | | | | | Loop, Dropout, If | Dropout, Enter, Exit, If, IsFinite, LinSpace, LoopCond, NextIteration, StatelessIf, StatelessWhile, While |
[1] Clip: only the conversion of clip(0, 6) to Relu6 is supported.

[2] Pow: only supported when the exponent is a single constant.

[3] Sum and Max: only two inputs are supported.

[4] Operators that are supported by the [conversion tool](https://www.mindspore.cn/tutorial/lite/zh-CN/r1.2/use/converter_tool.html) but do not require a dedicated implementation. Such operators are generally optimized away during conversion, for example by being fused into other operators or replaced with equivalent ones.
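Footnote [1] relies on the identity that clipping a tensor to the range [0, 6] is exactly the ReLU6 activation, which is why the converter can rewrite that one Clip pattern. A minimal pure-Python sketch of the equivalence (illustrative only; the function names are not part of any MindSpore API):

```python
def clip(values, lo, hi):
    """Element-wise ONNX-style Clip with scalar min/max bounds."""
    return [min(max(v, lo), hi) for v in values]

def relu6(values):
    """ReLU6 activation: min(max(v, 0), 6) element-wise."""
    return [min(max(v, 0.0), 6.0) for v in values]

sample = [-3.0, 0.0, 2.5, 6.0, 9.0]
# clip(x, 0, 6) and Relu6(x) agree on every input, so the
# rewrite is lossless; any other (lo, hi) pair would not be.
assert clip(sample, 0.0, 6.0) == relu6(sample)
```

For any bounds other than (0, 6) the two functions differ, which is why the conversion is restricted to that single pattern.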