# Lite Operator List

[View Source on Gitee](https://gitee.com/mindspore/docs/blob/master/docs/lite/docs/source_en/reference/operator_list_lite.md)

MindSpore Lite supports operator lists for different hardware backends:

| Operation | CPU<br/>FP16 | CPU<br/>FP32 | CPU<br/>Int32 | CPU<br/>Int8 | CPU<br/>UInt8 | CPU<br/>Bool | Mali/Adreno GPU<br/>FP16 | Mali/Adreno GPU<br/>FP32 | Mali/Adreno GPU<br/>Int32 | Mali/Adreno GPU<br/>Int8 | Kirin NPU<br/>FP16 | Nvidia GPU<br/>FP16 | Ascend<br/>FP16 |
| --------- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Abs | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| AbsGrad | | ✅ | | | | | | | | | | | |
| Activation | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| ActivationGrad | ✅ | ✅ | | | | | | | | | | | |
| Adam | | ✅ | | | | | | | | | | | |
| AddFusion | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ |
| AdderFusion | | ✅ | | | | | | | | | | | |
| AddGrad | | ✅ | | | | | | | | | | | |
| AddN | ✅ | ✅ | | | | | | | | | | | |
| Affine | | ✅ | | | | | | | | | | | ✅ |
| All | | ✅ | | | | | | | | | | ✅ | |
| AllGather | | ✅ | | | | | | | | | | ✅ | |
| ApplyMomentum | | ✅ | | | | | | | | | | | ✅ |
| Assert | ✅ | ✅ | | | | ✅ | | | | | | | |
| Assign | | ✅ | | | | | | | | | | | ✅ |
| ArgmaxFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| ArgminFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | ✅ |
| AvgPoolFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| AvgPoolGrad | ✅ | ✅ | | | | | | | | | | | |
| BatchNorm | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | ✅ |
| BatchNormGrad | ✅ | ✅ | | | | | | | | | | | |
| BatchToSpace | | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | |
| BatchToSpaceND | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | |
| BiasAdd | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | ✅ | ✅ |
| BiasAddGrad | ✅ | ✅ | | | | | | | | | | | |
| BinaryCrossEntropy | | ✅ | | | | | | | | | | | ✅ |
| BinaryCrossEntropyGrad | | ✅ | | | | | | | | | | | |
| BroadcastTo | ✅ | ✅ | ✅ | | | ✅ | | | | | | | |
| Call | ✅ | ✅ | ✅ | | | ✅ | | | | | | | ✅ |
| Cast | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| Ceil | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| Clip | | ✅ | ✅ | | | | | | | | | | ✅ |
| Concat | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ |
| ConstantOfShape | ✅ | ✅ | ✅ | | | | | | | | | | |
| Conv2DFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| Conv2DBackpropFilterFusion | ✅ | ✅ | | | | | | | | | | | |
| Conv2DBackpropInputFusion | ✅ | ✅ | | | | | | | | | | | |
| Conv2dTransposeFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| Cos | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| Crop | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | | | | |
| CropAndResize | | ✅ | | | | | | | | | ✅ | | |
| CumSum | | ✅ | ✅ | | | | | | | | | | ✅ |
| CustomExtractFeatures | | ✅ | | | | | | | | | | | |
| CustomNormalize | | ✅ | | | | | | | | | | | |
| CustomPredict | | ✅ | ✅ | | | | | | | | | | |
| DEConv2DGradFilter | | ✅ | | | | | | | | | | | |
| DepthToSpace | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | |
| DetectionPostProcess | | ✅ | | ✅ | ✅ | | | | | | | | |
| DivFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| DivGrad | | ✅ | | | | | | | | | | | |
| Dropout | ✅ | ✅ | | | | | | | | | | | ✅ |
| DropoutGrad | ✅ | ✅ | | | | | | | | | | | |
| DynamicQuant | | ✅ | | | | | | | | | | | |
| Eltwise | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| Elu | ✅ | ✅ | | | | | | | | | | | ✅ |
| Equal | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| EmbeddingLookupFusion | | ✅ | | | | | | | | | | | |
| Erf | ✅ | ✅ | | | | | | | | | | | ✅ |
| ExpFusion | ✅ | ✅ | | | | | ✅ | ✅ | | | | | ✅ |
| ExpandDims | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ |
| Fill | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | | | ✅ |
| Flatten | ✅ | ✅ | ✅ | | | | | | | | | ✅ | ✅ |
| FlattenGrad | ✅ | ✅ | | | | | | | | | | | |
| Floor | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| FloorDiv | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | |
| FloorMod | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | |
| FullConnection | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| FusedBatchNorm | ✅ | ✅ | | ✅ | ✅ | | | | | | ✅ | | ✅ |
| GatherNd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | ✅ |
| Gather | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ |
| GatherD | ✅ | ✅ | ✅ | | | ✅ | | | | | | | ✅ |
| GLU | | ✅ | | | | | | | | | | | |
| Greater | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| GreaterEqual | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| GroupNormFusion | | ✅ | | | | | | | | | | | |
| GRU | ✅ | ✅ | | | | | | | | | | | |
| HashtableLookup | | ✅ | ✅ | | | | | | | | | | |
| InstanceNorm | ✅ | ✅ | | | | | | | | | ✅ | | ✅ |
| InvertPermutation | ✅ | ✅ | ✅ | | | | | | | | | | |
| IsFinite | | ✅ | | | | | | | | | | | ✅ |
| L2NormalizeFusion | | ✅ | | ✅ | ✅ | | | | | | | | |
| LayerNormFusion | ✅ | ✅ | | ✅ | | | ✅ | ✅ | | | | | ✅ |
| LayerNormGrad | ✅ | ✅ | | | | | | | | | | | |
| LeakyReLU | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| Less | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| LessEqual | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| LRN | | ✅ | | | | | | | | | | | ✅ |
| Log | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| Log1p | | ✅ | | | | | | | | | | | ✅ |
| LogGrad | ✅ | ✅ | | | | | | | | | | | |
| LogicalAnd | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | ✅ | | |
| LogicalNot | ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | | |
| LogicalOr | ✅ | ✅ | | | | ✅ | ✅ | ✅ | | | ✅ | | |
| LogSoftmax | ✅ | ✅ | | | | | | | | | | | ✅ |
| LshProjection | | ✅ | | | | | | | | | | | |
| LSTM | ✅ | ✅ | | | | | | | | | | | |
| LSTMGrad | | ✅ | | | | | | | | | | | |
| LSTMGradData | | ✅ | | | | | | | | | | | |
| LSTMGradWeight | | ✅ | | | | | | | | | | | |
| MatMulFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| Maximum | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | ✅ |
| MaximumGrad | ✅ | ✅ | | | | | | | | | | | |
| MaxPoolFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| MaxPoolGrad | ✅ | ✅ | | | | | | | | | | | |
| Merge | ✅ | ✅ | | | | | | | | | | | |
| Minimum | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | ✅ |
| MinimumGrad | ✅ | ✅ | | | | | | | | | | | |
| Mod | | ✅ | ✅ | | | | | | | | | | ✅ |
| MulFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| MulGrad | | ✅ | | | | | | | | | | | |
| Neg | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | ✅ |
| NegGrad | ✅ | ✅ | | | | | | | | | | | |
| NLLLoss | | ✅ | | | | | | | | | | | ✅ |
| NLLLossGrad | | ✅ | | | | | | | | | | | |
| NotEqual | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | |
| NonMaxSuppression | | ✅ | | | | | | | | | | | ✅ |
| NonZero | | | | | | ✅ | | | | | | | ✅ |
| OneHot | ✅ | ✅ | ✅ | | | | ✅ | ✅ | ✅ | | | | |
| OnesLike | ✅ | ✅ | ✅ | | | | | | | | | ✅ | ✅ |
| PadFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| PartialFusion | ✅ | ✅ | ✅ | | | ✅ | | | | | | | |
| PowFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | ✅ | ✅ |
| PowerGrad | | ✅ | | | | | | | | | | | |
| PriorBox | | ✅ | | ✅ | ✅ | | | | | | | | ✅ |
| PReLUFusion | ✅ | ✅ | | | | | ✅ | ✅ | | | | | ✅ |
| QuantDTypeCast | ✅ | ✅ | | ✅ | ✅ | | | | | | | | |
| RaggedRange | ✅ | ✅ | ✅ | | | | | | | | | | |
| RandomNormal | ✅ | ✅ | | | | | | | | | | | |
| RandomStandardNormal | ✅ | ✅ | | | | | | | | | | | |
| Range | ✅ | ✅ | ✅ | | | | | | | | | | ✅ |
| Rank | ✅ | ✅ | | | | | | | | | | | |
| RealDiv | ✅ | ✅ | | | | | | | | | | | ✅ |
| Reciprocal | ✅ | ✅ | | ✅ | | | | | | | ✅ | | ✅ |
| ReduceFusion | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| ReduceScatter | | ✅ | | | | | | | | | | ✅ | |
| Reshape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ |
| Resize | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | |
| ResizeGrad | ✅ | ✅ | | | | | | | | | | | |
| ReverseV2 | | ✅ | ✅ | | | | | | | | | | |
| ReverseSequence | | ✅ | | | | | | | | | | | ✅ |
| ROIPooling | | ✅ | | | | | | | | | | | ✅ |
| Round | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| Rsqrt | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | |
| RsqrtGrad | | ✅ | | | | | | | | | | | ✅ |
| Select | | ✅ | | | | ✅ | | | | | | | |
| Selu | | | | | | | | | | | | | |
| ScaleFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| ScatterNd | ✅ | ✅ | ✅ | | | | | | | | | | ✅ |
| ScatterNdUpdate | ✅ | ✅ | ✅ | | | | | | | | | | |
| SGD | | ✅ | | | | | | | | | | | ✅ |
| Shape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | ✅ |
| SigmoidCrossEntropyWithLogits | | ✅ | | | | | | | | | | | ✅ |
| SigmoidCrossEntropyWithLogitsGrad | | ✅ | | | | | | | | | | | ✅ |
| Sin | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| Size | ✅ | ✅ | ✅ | | | | | | | | | | ✅ |
| SliceFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| SkipGram | | ✅ | | | | | | | | | | | |
| SmoothL1Loss | | ✅ | | | | | | | | | | | ✅ |
| SmoothL1LossGrad | | ✅ | | | | | | | | | | | ✅ |
| Softmax | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| SoftmaxGrad | | ✅ | | | | | | | | | | | |
| Softplus | ✅ | ✅ | | | | | | | | | | | ✅ |
| SpaceToBatch | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | |
| SpaceToBatchND | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | |
| SpaceToDepth | ✅ | ✅ | | | | | ✅ | ✅ | | | | | ✅ |
| SparseToDense | ✅ | ✅ | ✅ | | | | ✅ | ✅ | ✅ | | | | |
| SparseSoftmaxCrossEntropyWithLogits | | ✅ | | | | | | | | | | | ✅ |
| Splice | ✅ | ✅ | | | | | | | | | | | |
| Split | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| SplitWithOverlap | ✅ | ✅ | | | | | | | | | | | |
| Sqrt | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| SqrtGrad | | ✅ | | | | | | | | | | | ✅ |
| Square | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ |
| SquaredDifference | ✅ | ✅ | | | | | ✅ | ✅ | | | | | |
| Squeeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | |
| StridedSlice | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| StridedSliceGrad | ✅ | ✅ | | | | | | | | | | | |
| Stack | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | | | ✅ |
| SubFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| SubGrad | | ✅ | | | | | | | | | | | |
| Switch | ✅ | ✅ | ✅ | | | ✅ | | | | | | | |
| SwitchLayer | ✅ | ✅ | ✅ | | | ✅ | | | | | | | |
| TensorListFromTensor | ✅ | ✅ | ✅ | | | | | | | | | | |
| TensorListGetItem | ✅ | ✅ | ✅ | | | | | | | | | | |
| TensorListReserve | ✅ | ✅ | ✅ | | | | | | | | | | |
| TensorListSetItem | ✅ | ✅ | ✅ | | | | | | | | | | |
| TensorListStack | ✅ | ✅ | ✅ | | | | | | | | | | |
| TensorScatterAdd | | ✅ | ✅ | | | | | | | | | | |
| TileFusion | ✅ | ✅ | ✅ | | | ✅ | | | | | ✅ | | ✅ |
| TopKFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | | | | ✅ |
| Transpose | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ |
| UniformReal | | ✅ | ✅ | | | | | | | | | | |
| Unique | ✅ | ✅ | ✅ | | | | | | | | | | |
| UnsortedSegmentSum | ✅ | ✅ | ✅ | | | | | | | | | | |
| Unsqueeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | |
| Unstack | ✅ | ✅ | ✅ | | | | | | | | | | |
| Where | ✅ | ✅ | ✅ | | | ✅ | | | | | | | |
| ZerosLike | ✅ | ✅ | ✅ | | | | | | | | | | |

The MindSpore Lite conversion tool supports the following operators from third-party frameworks:

| Operation | TensorFlow Lite<br/>operators supported | Caffe<br/>operators supported | Onnx<br/>operators supported | TensorFlow<br/>operators supported |
| --------- | --------- | --------- | --------- | --------- |
| Abs | Abs | | Abs | Abs |
| Activation | Activation, ReLU, ReLU6, PReLU, <br/>LeakyReLU, Tanh, HardSwish, Logistic | ReLU, ReLU6, Sigmoid, TanH, Elu | Relu, LeakyRelu, PRelu, Elu, Tanh, Sigmoid, HardSigmoid, Softplus, Gelu | Activation, Elu, Relu, Relu6, Sigmoid, Tanh, Selu, LeakyRelu, Softplus |
| Adam | Adam | | | Adam |
| AddFusion | Add | | Add, Int8Add | Add, AddV2 |
| AdderFusion | | | adder_f | |
| AddN | AddN | | | |
| All | All | | | All |
| ApplyMomentum | ApplyMomentum | | | ApplyMomentum |
| Assert | | | | Assert |
| Assign | Assign | | | Assign |
| ArgmaxFusion | Argmax | ArgMax | ArgMax | ArgMax |
| ArgminFusion | Argmin | | ArgMin | ArgMin |
| AvgPoolFusion | MeanPooling | Pooling | AveragePool,<br/>GlobalAveragePool,<br/>Int8AveragePool | AvgPool |
| BatchNorm | | BatchNorm | BatchNormalization | |
| BatchToSpace | BatchToSpace | | | BatchToSpace |
| BatchToSpaceND | BatchToSpaceND | | | BatchToSpaceND |
| BiasAdd | | | BiasAdd | BiasAdd |
| BinaryCrossEntropy | BinaryCrossEntropy | | | BinaryCrossEntropy |
| BroadcastTo | BroadcastTo | | Expand | BroadcastTo |
| Cast | Cast,<br/>QUANTIZE,<br/>DEQUANTIZE | | Cast | Cast |
| Ceil | Ceil | | Ceil | Ceil |
| Clip | Clip | | Clip | Clip |
| Concat | Concat | Concat | Concat | ConcatV2 |
| ConstantOfShape | | | ConstantOfShape | |
| Conv2DFusion | Conv2D | Convolution | Conv, Int8Conv,<br/>ConvRelu,<br/>Int8ConvRelu | Conv2D |
| Conv2dTransposeFusion | DeConv2D | Deconvolution | ConvTranspose | Conv2DBackpropInput |
| Cos | Cos | | Cos | Cos |
| Crop | | Crop | | |
| CropAndResize | | | | CropAndResize |
| CumSum | | | Cumsum | Cumsum |
| CustomExtractFeatures | ExtractFeatures | | | |
| CustomNormalize | Normalize | | | |
| CustomPredict | Predict | | | |
| DepthToSpace | DepthToSpace | | DepthToSpace | DepthToSpace |
| DetectionPostProcess | Custom | | | |
| DivFusion | Div, RealDiv | | Div | Div, RealDiv |
| Dropout | Dropout | | Dropout | Dropout |
| DynamicQuant | | | DynamicQuantizeLinear | |
| Eltwise | | Eltwise | Sum, Max<sup>[3]</sup> | |
| Elu | | ELU | Elu,<br/>NonMaxSuppression | NonMaxSuppressionV3 |
| Equal | Equal | | Equal | Equal |
| Erf | Erf | | Erf | Erf |
| ExpFusion | Exp | Exp | Exp | Exp |
| ExpandDims | ExpandDims | | | ExpandDims |
| Fill | Fill | | | Fill |
| Flatten | | Flatten | | |
| Floor | Floor | | Floor | Floor |
| FloorDiv | FloorDiv | | | FloorDiv |
| FloorMod | FloorMod | | | FloorMod |
| FullConnection | FullyConnected | InnerProduct | | |
| FusedBatchNorm | FusedBatchNorm | | | FusedBatchNorm,<br/>FusedBatchNormV3 |
| GatherNd | GatherND | | GatherND | GatherNd |
| Gather | Gather | | Gather | GatherV2 |
| Greater | Greater | | Greater | Greater |
| GreaterEqual | GreaterEqual | | GreaterOrEqual | GreaterEqual |
| HashtableLookup | HashtableLookup | | | |
| InstanceNorm | InstanceNorm | | InstanceNormalization | |
| InvertPermutation | | | | InvertPermutation |
| IsFinite | IsFinite | | | IsFinite |
| LeakyReLU | LeakyRelu | | LeakyRelu | LeakyRelu |
| Less | Less | | Less | Less |
| LessEqual | LessEqual | | | LessEqual |
| LRN | LocalResponseNorm | | Lrn, LRN | |
| Log | Log | | Log | Log |
| Log1p | Log1p | | | Log1p |
| LogicalAnd | LogicalAnd | | And | LogicalAnd |
| LogicalNot | LogicalNot | | Not | LogicalNot |
| LogicalOr | LogicalOr | | Or | LogicalOr |
| LogSoftmax | LogSoftmax | | LogSoftmax | |
| LshProjection | LshProjection | | | |
| LSTM | | | LSTM | |
| MatMulFusion | BatchMatMul | | MatMul,<br/>Gemm | MatMul,<br/>BatchMatMul,<br/>BatchMatMulV2 |
| Maximum | Maximum | | Max | Maximum |
| MaxPoolFusion | MaxPooling | Pooling | MaxPool,<br/>GlobalMaxPool | MaxPool |
| Merge | | | | Merge |
| Minimum | Minimum | | Min | Minimum |
| MinimumGrad | | | | |
| Mod | Mod | | Mod | Mod |
| MulFusion | Mul | | Mul | Mul |
| MulGrad | | | | |
| Neg | Neg | | Neg | Neg |
| NotEqual | NotEqual | | | NotEqual |
| NonMaxSuppression | NonMaxSuppression | | NonMaxSuppression | NonMaxSuppression |
| NonZero | NonZero | | NonZero | NonZero |
| OneHot | OneHot | | OneHot | OneHot |
| OnesLike | OnesLike | | | OnesLike |
| PadFusion | Pad, MirrorPad, PadV2 | | Pad | MirrorPad, Pad, PadV2 |
| PowFusion | Pow | Power | Pow<sup>[2]</sup> | Pow |
| PReLUFusion | PRELU | PReLU | PRelu | |
| RaggedRange | | | | RaggedRange |
| RandomNormal | RandomNormal | | RandomNormal | RandomNormal |
| RandomStandardNormal | | | | RandomStandardNormal |
| Range | Range | | Range | Range |
| Rank | Rank | | | Rank |
| Reciprocal | | | Reciprocal | |
| ReduceFusion | Sum, Mean, ReduceMax, ReduceMin, ReduceProd | Reduction | ReduceMean, ReduceMax, ReduceMin, ReduceProd, ReduceSum, ReduceSumSquare, ReduceL2, ReduceL1, ReduceLogSum | Sum, Max, Min, Mean, Prod, All |
| Reshape | Reshape | Reshape | Reshape,<br/>Flatten | Reshape |
| Resize | ResizeBilinear,<br/>NearestNeighbor | Interp | Resize, Upsample | ResizeBilinear,<br/>ResizeBicubic,<br/>ResizeNearestNeighbor |
| ReverseV2 | reverse | | | ReverseV2 |
| ReverseSequence | ReverseSequence | | ReverseSequence | ReverseSequence |
| Round | Round | | Round | Round |
| Rsqrt | Rsqrt | | | Rsqrt |
| Select | | | | Select |
| Selu | | | | Selu |
| ScaleFusion | | Scale | | |
| ScatterNd | ScatterNd | | ScatterND | |
| ScatterNdUpdate | ScatterNdUpdate | | ScatterNdUpdate | |
| SGD | SGD | SGD | | SGD |
| Shape | Shape | | Shape | Shape |
| Sin | Sin | | Sin | Sin |
| Size | | | | Size |
| SliceFusion | Slice | Slice | Slice | Slice |
| SkipGram | SkipGram | | | |
| Softmax | Softmax | Softmax | Softmax | Softmax |
| Softplus | | | | Softplus |
| SpaceToBatch | SpaceToBatch | | | |
| SpaceToBatchND | SpaceToBatchND | | | SpaceToBatchND |
| SpaceToDepth | SpaceToDepth | | SpaceToDepth | |
| SparseToDense | SparseToDense | | | |
| Splice | | | Splice | |
| Split | Split, SplitV | | Split | Split, SplitV |
| Sqrt | Sqrt | | Sqrt | Sqrt |
| Square | Square | | | Square |
| SquaredDifference | SquaredDifference | | | SquaredDifference |
| Squeeze | Squeeze | | Squeeze | Squeeze |
| StridedSlice | StridedSlice | | Slice,<br/>DynamicSlice | StridedSlice |
| Stack | Stack | | | Pack |
| SubFusion | Sub | | Sub | Sub |
| Switch | | | | Switch |
| TensorListFromTensor | | | | TensorListFromTensor |
| TensorListGetItem | | | | TensorListGetItem |
| TensorListReserve | | | | TensorListReserve |
| TensorListSetItem | | | | TensorListSetItem |
| TensorListStack | | | | TensorListStack |
| TensorScatterAdd | TensorScatterAdd | | | TensorScatterAdd |
| TileFusion | Tile | Tile | Tile | Tile |
| TopKFusion | TopKV2 | | TopK | TopKV2 |
| Transpose | Transpose | Permute | Transpose, Int8Transpose | Transpose |
| Unique | Unique | | | |
| UnsortedSegmentSum | | | | UnsortedSegmentSum |
| Unsqueeze | | | Unsqueeze | |
| Unstack | Unstack | | | |
| Where | Where | | NonZero, Where | Where |
| ZerosLike | ZerosLike | | | ZerosLike |
| Other operators supported by the converter<sup>[4]</sup> | | | Constant,<br/>Atan, Asin, Tan, <br/>Loop, Dropout, If, Identity,<br/>Int8GivenIntTensorFill,<br/>Int8GivenTensorFill,<br/>Int8Quantize,<br/>Int8Dequantize,<br/>LpNormalization | Dropout, Enter,<br/>Exit, If, <br/>LinSpace,<br/>LoopCond,<br/>NextIteration,<br/>StatelessIf,<br/>StatelessWhile,<br/>TensorArrayGatherV3,<br/>TensorArrayReadV3,<br/>TensorArrayScatterV3,<br/>TensorArraySizeV3,<br/>TensorArrayV3,<br/>TensorArrayWriteV3,<br/>While |

[1] Clip: only converting clip(0, 6) to Relu6 is supported.

[2] Pow: only the form in which the exponent is a single constant is supported.

[3] Sum and Max: only 2 inputs are supported.

[4] Operators supported by the [converter](https://www.mindspore.cn/lite/docs/en/master/converter/converter_tool.html) that do not require a specific implementation. Generally, such operators are optimized away by the conversion tool, for example by being merged with or replaced by other operators.

[5] Currently, the environment variable `export KEEP_ORIGIN_DTYPE=1` can be used to keep the int64 data type. This option can be considered when the int32 data type overflows, but it is only an experimental option and will be removed later.

[6] Currently, some operators in the MindIR produced by MindSpore are not supported, namely the interfaces ops.matmul, ops.dense, ops.max, and ops.min. Among them, the Max and Min operators are unsupported only when the axis parameter is None; other scenarios are supported.
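To make note [5] concrete, the sketch below sets `KEEP_ORIGIN_DTYPE=1` before invoking the offline conversion tool from Python. It is a minimal sketch only: the `converter_lite` path, the model file name, and the framework type are hypothetical placeholders, and the `--fmk`, `--modelFile`, and `--outputFile` flags are assumed to match the converter version you have installed.

```python
import os
import subprocess

# Experimental option from note [5]: keep int64 tensors instead of
# narrowing them to int32 when int32 would overflow.
env = dict(os.environ, KEEP_ORIGIN_DTYPE="1")

# Hypothetical paths and flag values -- adjust them to your own setup.
subprocess.run(
    [
        "./converter_lite",        # offline conversion tool
        "--fmk=ONNX",              # source framework of the model
        "--modelFile=model.onnx",  # input model (placeholder name)
        "--outputFile=model",      # output prefix for the converted model
    ],
    env=env,
    check=True,  # raise an error if the conversion fails
)
```

The same effect can be achieved in a shell by running `export KEEP_ORIGIN_DTYPE=1` before calling the converter.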