# Lite Operator List

[View Source on Gitee](https://gitee.com/mindspore/docs/blob/r2.2/docs/lite/docs/source_en/operator_list_lite.md)

This article lists the operators supported by MindSpore Lite.

| Operation | CPU<br/>FP16 | CPU<br/>FP32 | CPU<br/>Int32 | CPU<br/>Int8 | CPU<br/>UInt8 | CPU<br/>Bool | GPU<br/>FP16 | GPU<br/>FP32 | GPU<br/>Int32 | GPU<br/>Int8 | NPU | TensorRT | Ascend<br/>(Ascend310) | TensorFlow Lite<br/>operators supported | Caffe<br/>operators supported | Onnx<br/>operators supported | TensorFlow<br/>operators supported |
| --------- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | --- | --- | --- | --- |
| Abs | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Abs | | Abs | Abs |
| AbsGrad | | ✅ | | | | | | | | | | | | | | | |
| Activation | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Activation, ReLU, ReLU6, PReLU, <br/>LeakyReLU, Tanh, HardSwish, Logistic | ReLU, ReLU6, Sigmoid, TanH, Elu | Relu, LeakyRelu, PRelu, Elu, Tanh, Sigmoid, HardSigmoid, Softplus | Activation, Elu, Relu, Relu6, Sigmoid, Tanh, Selu, LeakyRelu, Softplus |
| ActivationGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| Adam | | ✅ | | | | | | | | | | | | Adam | | | Adam |
| AddFusion | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ | Add | | Add, Int8Add | Add, AddV2 |
| AdderFusion | | ✅ | | | | | | | | | | | | | | adder_f | |
| AddGrad | | ✅ | | | | | | | | | | | | | | | |
| AddN | ✅ | ✅ | | | | | | | | | | | | AddN | | | |
| Affine | | ✅ | | | | | | | | | | | ✅ | | | | |
| All | | ✅ | | | | | | | | | | ✅ | | All | | | All |
| AllGather | | ✅ | | | | | | | | | | ✅ | | | | | |
| ApplyMomentum | | ✅ | | | | | | | | | | | ✅ | ApplyMomentum | | | ApplyMomentum |
| Assert | ✅ | ✅ | | | | ✅ | | | | | | | | | | | Assert |
| Assign | | ✅ | | | | | | | | | | | ✅ | Assign | | | Assign |
| ArgmaxFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Argmax | ArgMax | ArgMax | ArgMax |
| ArgminFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | ✅ | Argmin | | | ArgMin |
| AvgPoolFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | MeanPooling | Pooling | AveragePool,<br/>GlobalAveragePool,<br/>Int8AveragePool | AvgPool |
| AvgPoolGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| BatchNorm | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | ✅ | | BatchNorm | BatchNormalization | |
| BatchNormGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| BatchToSpace | | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | | BatchToSpace | | | BatchToSpace |
| BatchToSpaceND | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | | BatchToSpaceND | | | BatchToSpaceND |
| BiasAdd | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | ✅ | ✅ | | | BiasAdd | BiasAdd |
| BiasAddGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| BinaryCrossEntropy | | ✅ | | | | | | | | | | | ✅ | BinaryCrossEntropy | | | BinaryCrossEntropy |
| BinaryCrossEntropyGrad | | ✅ | | | | | | | | | | | | | | | |
| BroadcastTo | ✅ | ✅ | ✅ | | | ✅ | | | | | | | | BroadcastTo | | Expand | BroadcastTo |
| Call | ✅ | ✅ | ✅ | | | ✅ | | | | | | | ✅ | | | | |
| Cast | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ | Cast,<br/>QUANTIZE,<br/>DEQUANTIZE | | Cast | Cast |
| Ceil | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Ceil | | Ceil | Ceil |
| Clip | | ✅ | ✅ | | | | | | | | | | | Clip | | Clip | Clip |
| Concat | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | Concat | Concat | Concat | ConcatV2 |
| ConstantOfShape | ✅ | ✅ | ✅ | | | | | | | | | | | | | ConstantOfShape | |
| Conv2DFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Conv2D | Convolution | Conv, Int8Conv,<br/>ConvRelu,<br/>Int8ConvRelu | Conv2D |
| Conv2DBackpropFilterFusion | ✅ | ✅ | | | | | | | | | | | | | | | |
| Conv2DBackpropInputFusion | ✅ | ✅ | | | | | | | | | | | | | | | |
| Conv2dTransposeFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | DeConv2D | Deconvolution | ConvTranspose | Conv2DBackpropInput |
| Cos | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Cos | | Cos | Cos |
| Crop | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | | | | | | Crop | | |
| CropAndResize | | ✅ | | | | | | | | | ✅ | | | | | | CropAndResize |
| CumSum | | ✅ | ✅ | | | | | | | | | | | | | | Cumsum |
| CustomExtractFeatures | | ✅ | | | | | | | | | | | | ExtractFeatures | | | |
| CustomNormalize | | ✅ | | | | | | | | | | | | Normalize | | | |
| CustomPredict | | ✅ | ✅ | | | | | | | | | | | Predict | | | |
| DEConv2DGradFilter | | ✅ | | | | | | | | | | | | | | | |
| DepthToSpace | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | | DepthToSpace | | DepthToSpace | DepthToSpace |
| DetectionPostProcess | | ✅ | | ✅ | ✅ | | | | | | | | | Custom | | | |
| DivFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Div, RealDiv | | Div | Div, RealDiv |
| DivGrad | | ✅ | | | | | | | | | | | | | | | |
| Dropout | ✅ | ✅ | | | | | | | | | | | | Dropout | | Dropout | Dropout |
| DropoutGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| DynamicQuant | | ✅ | | | | | | | | | | | | | | DynamicQuantizeLinear | |
| Eltwise | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | | Eltwise | Sum, Max<sup>[3]</sup> | |
| Elu | ✅ | ✅ | | | | | | | | | | | ✅ | | ELU | Elu,<br/>NonMaxSuppression | NonMaxSuppressionV3 |
| Equal | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Equal | | Equal | Equal |
| EmbeddingLookupFusion | | ✅ | | | | | | | | | | | | | | | |
| Erf | ✅ | ✅ | | | | | | | | | | | ✅ | Erf | | Erf | Erf |
| ExpFusion | ✅ | ✅ | | | | | ✅ | ✅ | | | | | | Exp | Exp | Exp | Exp |
| ExpandDims | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | ExpandDims | | | ExpandDims |
| Fill | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | | | ✅ | Fill | | | Fill |
| Flatten | ✅ | ✅ | ✅ | | | | | | | | | ✅ | ✅ | | Flatten | | |
| FlattenGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| Floor | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Floor | | Floor | Floor |
| FloorDiv | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | | FloorDiv | | | FloorDiv |
| FloorMod | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | | FloorMod | | | FloorMod |
| FullConnection | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | FullyConnected | InnerProduct | | |
| FusedBatchNorm | ✅ | ✅ | | ✅ | ✅ | | | | | | ✅ | | ✅ | FusedBatchNorm | | | FusedBatchNorm,<br/>FusedBatchNormV3 |
| GatherNd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | ✅ | GatherND | | | GatherNd |
| Gather | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | Gather | | Gather | GatherV2 |
| GatherD | ✅ | ✅ | ✅ | | | ✅ | | | | | | | ✅ | | | | |
| GLU | | ✅ | | | | | | | | | | | | | | | |
| Greater | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Greater | | Greater | Greater |
| GreaterEqual | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | GreaterEqual | | | GreaterEqual |
| GroupNormFusion | | ✅ | | | | | | | | | | | | | | | |
| GRU | ✅ | ✅ | | | | | | | | | | | | | | | |
| HashtableLookup | | ✅ | ✅ | | | | | | | | | | | HashtableLookup | | | |
| InstanceNorm | ✅ | ✅ | | | | | | | | | ✅ | | | InstanceNorm | | InstanceNormalization | |
| InvertPermutation | ✅ | ✅ | ✅ | | | | | | | | | | | | | | InvertPermutation |
| IsFinite | | ✅ | | | | | | | | | | | ✅ | IsFinite | | | IsFinite |
| L2NormalizeFusion | | ✅ | | ✅ | ✅ | | | | | | | | | | | | |
| LayerNormFusion | ✅ | ✅ | | ✅ | | | ✅ | ✅ | | | | | | | | | |
| LayerNormGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| LeakyReLU | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | LeakyRelu | | LeakyRelu | LeakyRelu |
| Less | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | Less | | Less | Less |
| LessEqual | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | LessEqual | | | LessEqual |
| LRN | | ✅ | | | | | | | | | | | | LocalResponseNorm | | Lrn, LRN | |
| Log | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | Log | | Log | Log |
| Log1p | | ✅ | | | | | | | | | | | ✅ | Log1p | | | Log1p |
| LogGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| LogicalAnd | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | ✅ | | | LogicalAnd | | And | LogicalAnd |
| LogicalNot | ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | | | LogicalNot | | Not | LogicalNot |
| LogicalOr | ✅ | ✅ | | | | ✅ | ✅ | ✅ | | | ✅ | | | LogicalOr | | Or | LogicalOr |
| LogSoftmax | ✅ | ✅ | | | | | | | | | | | | LogSoftmax | | LogSoftmax | |
| LshProjection | | ✅ | | | | | | | | | | | | LshProjection | | | |
| LSTM | ✅ | ✅ | | | | | | | | | | | | | | LSTM | |
| LSTMGrad | | ✅ | | | | | | | | | | | | | | | |
| LSTMGradData | | ✅ | | | | | | | | | | | | | | | |
| LSTMGradWeight | | ✅ | | | | | | | | | | | | | | | |
| MatMulFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | BatchMatMul | | MatMul,<br/>Gemm | MatMul,<br/>BatchMatMul,<br/>BatchMatMulV2 |
| Maximum | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | ✅ | Maximum | | Max | Maximum |
| MaximumGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| MaxPoolFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | MaxPooling | Pooling | MaxPool,<br/>GlobalMaxPool | MaxPool |
| MaxPoolGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| Merge | ✅ | ✅ | | | | | | | | | | | | | | | Merge |
| Minimum | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | ✅ | Minimum | | Min | Minimum |
| MinimumGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| Mod | | ✅ | ✅ | | | | | | | | | | ✅ | Mod | | Mod | Mod |
| MulFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Mul | | Mul | Mul |
| MulGrad | | ✅ | | | | | | | | | | | | | | | |
| Neg | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | ✅ | | | Neg | | Neg | Neg |
| NegGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| NLLLoss | | ✅ | | | | | | | | | | | ✅ | | | | |
| NLLLossGrad | | ✅ | | | | | | | | | | | | | | | |
| NotEqual | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | NotEqual | | | NotEqual |
| NonMaxSuppression | | ✅ | | | | | | | | | | | ✅ | NonMaxSuppression | | NonMaxSuppression | NonMaxSuppression |
| NonZero | | | | | | ✅ | | | | | | | ✅ | NonZero | | NonZero | NonZero |
| OneHot | ✅ | ✅ | ✅ | | | | ✅ | ✅ | ✅ | | | | | OneHot | | OneHot | OneHot |
| OnesLike | ✅ | ✅ | ✅ | | | | | | | | | ✅ | ✅ | OnesLike | | | OnesLike |
| PadFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Pad, MirrorPad, PadV2 | | Pad | MirrorPad, Pad, PadV2 |
| PartialFusion | ✅ | ✅ | ✅ | | | ✅ | | | | | | | | | | | |
| PowFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | ✅ | ✅ | Pow | Power | Pow<sup>[2]</sup> | Pow |
| PowerGrad | | ✅ | | | | | | | | | | | | | | | |
| PriorBox | | ✅ | | ✅ | ✅ | | | | | | | | ✅ | | | | |
| PReLUFusion | ✅ | ✅ | | | | | ✅ | ✅ | | | | | ✅ | PRELU | PReLU | PRelu | |
| QuantDTypeCast | ✅ | ✅ | | ✅ | ✅ | | | | | | | | | | | | |
| RaggedRange | ✅ | ✅ | ✅ | | | | | | | | | | | | | | RaggedRange |
| RandomNormal | ✅ | ✅ | | | | | | | | | | | | RandomNormal | | RandomNormal | RandomNormal |
| RandomStandardNormal | ✅ | ✅ | | | | | | | | | | | | | | | RandomStandardNormal |
| Range | ✅ | ✅ | ✅ | | | | | | | | | | | Range | | Range | Range |
| Rank | ✅ | ✅ | | | | | | | | | | | | Rank | | | Rank |
| RealDiv | ✅ | ✅ | | | | | | | | | | | ✅ | | | | |
| Reciprocal | ✅ | ✅ | | ✅ | | | | | | | ✅ | | | | | Reciprocal | |
| ReduceFusion | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | ✅ | | Sum, Mean, ReduceMax, ReduceMin, ReduceProd | Reduction | ReduceMean, ReduceMax, ReduceMin, ReduceProd, ReduceSum, ReduceSumSquare, ReduceL2 | Sum, Max, Min, Mean, Prod, All |
| ReduceScatter | | ✅ | | | | | | | | | | ✅ | | | | | |
| Reshape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | Reshape | Reshape | Reshape,<br/>Flatten | Reshape |
| Resize | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | | ResizeBilinear,<br/>NearestNeighbor | Interp | Resize, Upsample | ResizeBilinear,<br/>ResizeBicubic,<br/>ResizeNearestNeighbor |
| ResizeGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| ReverseV2 | | ✅ | ✅ | | | | | | | | | | | reverse | | | ReverseV2 |
| ReverseSequence | | ✅ | | | | | | | | | | | | ReverseSequence | | | ReverseSequence |
| ROIPooling | | ✅ | | | | | | | | | | | ✅ | | | | |
| Round | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Round | | Round | Round |
| Rsqrt | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | Rsqrt | | | Rsqrt |
| RsqrtGrad | | ✅ | | | | | | | | | | | ✅ | | | | |
| Select | | ✅ | | | | ✅ | | | | | | | | | | | Select |
| Selu | | | | | | | | | | | | | | | | | Selu |
| ScaleFusion | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | | Scale | | |
| ScatterNd | ✅ | ✅ | ✅ | | | | | | | | | | | ScatterNd | | ScatterND | |
| ScatterNdUpdate | ✅ | ✅ | ✅ | | | | | | | | | | | ScatterNdUpdate | | ScatterNdUpdate | |
| SGD | | ✅ | | | | | | | | | | | ✅ | SGD | SGD | | SGD |
| Shape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | ✅ | Shape | | Shape | Shape |
| SigmoidCrossEntropyWithLogits | | ✅ | | | | | | | | | | | ✅ | | | | |
| SigmoidCrossEntropyWithLogitsGrad | | ✅ | | | | | | | | | | | ✅ | | | | |
| Sin | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Sin | | Sin | Sin |
| Size | ✅ | ✅ | ✅ | | | | | | | | | | | | | | Size |
| SliceFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | Slice | Slice | Slice | Slice |
| SkipGram | | ✅ | | | | | | | | | | | | SkipGram | | | |
| SmoothL1Loss | | ✅ | | | | | | | | | | | ✅ | | | | |
| SmoothL1LossGrad | | ✅ | | | | | | | | | | | ✅ | | | | |
| Softmax | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Softmax | Softmax | Softmax | Softmax |
| SoftmaxGrad | | ✅ | | | | | | | | | | | | | | | |
| Softplus | ✅ | ✅ | | | | | | | | | | | | | | | Softplus |
| SpaceToBatch | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | | SpaceToBatch | | | |
| SpaceToBatchND | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | | | | SpaceToBatchND | | | SpaceToBatchND |
| SpaceToDepth | ✅ | ✅ | | | | | ✅ | ✅ | | | | | | SpaceToDepth | | SpaceToDepth | |
| SparseToDense | ✅ | ✅ | ✅ | | | | ✅ | ✅ | ✅ | | | | | SparseToDense | | | |
| SparseSoftmaxCrossEntropyWithLogits | | ✅ | | | | | | | | | | | ✅ | | | | |
| Splice | ✅ | ✅ | | | | | | | | | | | | | | Splice | |
| Split | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | | Split, SplitV | | Split | Split, SplitV |
| SplitWithOverlap | ✅ | ✅ | | | | | | | | | | | | | | | |
| Sqrt | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Sqrt | | Sqrt | Sqrt |
| SqrtGrad | | ✅ | | | | | | | | | | | ✅ | | | | |
| Square | ✅ | ✅ | | ✅ | ✅ | | ✅ | ✅ | | | ✅ | | ✅ | Square | | | Square |
| SquaredDifference | ✅ | ✅ | | | | | ✅ | ✅ | | | | | | SquaredDifference | | | SquaredDifference |
| Squeeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | | Squeeze | | Squeeze | Squeeze |
| StridedSlice | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | StridedSlice | | Slice,<br/>DynamicSlice | StridedSlice |
| StridedSliceGrad | ✅ | ✅ | | | | | | | | | | | | | | | |
| Stack | ✅ | ✅ | ✅ | | | | ✅ | ✅ | | | | | ✅ | Stack | | | Pack |
| SubFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | ✅ | ✅ | ✅ | Sub | | Sub | Sub |
| SubGrad | | ✅ | | | | | | | | | | | | | | | |
| Switch | ✅ | ✅ | ✅ | | | ✅ | | | | | | | | | | | Switch |
| SwitchLayer | ✅ | ✅ | ✅ | | | ✅ | | | | | | | | | | | |
| TensorListFromTensor | ✅ | ✅ | ✅ | | | | | | | | | | | | | | TensorListFromTensor |
| TensorListGetItem | ✅ | ✅ | ✅ | | | | | | | | | | | | | | TensorListGetItem |
| TensorListReserve | ✅ | ✅ | ✅ | | | | | | | | | | | | | | TensorListReserve |
| TensorListSetItem | ✅ | ✅ | ✅ | | | | | | | | | | | | | | TensorListSetItem |
| TensorListStack | ✅ | ✅ | ✅ | | | | | | | | | | | | | | TensorListStack |
| TensorScatterAdd | | ✅ | ✅ | | | | | | | | | | | TensorScatterAdd | | | TensorScatterAdd |
| TileFusion | ✅ | ✅ | ✅ | | | ✅ | | | | | ✅ | | | Tile | Tile | Tile | Tile |
| TopKFusion | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | | | | | TopKV2 | | TopK | TopKV2 |
| Transpose | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | | | ✅ | ✅ | ✅ | Transpose | Permute | Transpose, Int8Transpose | Transpose |
| UniformReal | | ✅ | ✅ | | | | | | | | | | | | | | |
| Unique | ✅ | ✅ | ✅ | | | | | | | | | | | Unique | | | |
| UnsortedSegmentSum | ✅ | ✅ | ✅ | | | | | | | | | | | | | | UnsortedSegmentSum |
| Unsqueeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | | Unsqueeze | |
| Unstack | ✅ | ✅ | ✅ | | | | | | | | | | | Unstack | | | |
| Where | ✅ | ✅ | ✅ | | | ✅ | | | | | | | | Where | | NonZero, Where | Where |
| ZerosLike | ✅ | ✅ | ✅ | | | | | | | | | | | ZerosLike | | | ZerosLike |
| Other operators supported by the converter.<sup>[4]</sup> | | | | | | | | | | | | | | | | Constant,<br/>Atan, Asin, Tan, <br/>Loop, Dropout, If, Identity,<br/>Int8GivenIntTensorFill,<br/>Int8GivenTensorFill,<br/>Int8Quantize,<br/>Int8Dequantize,<br/>LpNormalization | Dropout, Enter,<br/>Exit, If, <br/>LinSpace,<br/>LoopCond,<br/>NextIteration,<br/>StatelessIf,<br/>StatelessWhile,<br/>TensorArrayGatherV3,<br/>TensorArrayReadV3,<br/>TensorArrayScatterV3,<br/>TensorArraySizeV3,<br/>TensorArrayV3,<br/>TensorArrayWriteV3,<br/>While |

[1] Clip: Only supports converting clip(0, 6) to Relu6.

[2] Pow: Only supports the form in which the exponent is a single constant.

[3] Sum and Max: Only support two inputs.

[4] Operators supported by the [converter](https://www.mindspore.cn/lite/docs/en/r2.2/use/converter_tool.html) that do not require a specific runtime implementation. Such operators are typically optimized away during conversion, for example by being merged into or replaced by other operators.
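As note [4] suggests, the most direct way to check whether a model uses only supported operators is to run it through the conversion tool and then load the result on one of the backends listed in the table. The sketch below is a minimal, illustrative example built on the `mindspore_lite` Python package; the file names are placeholders and the exact converter and runtime calls are assumptions that may differ between MindSpore Lite releases, so treat it as a sketch rather than the documented workflow.

```python
# Minimal sketch, assuming the `mindspore_lite` Python package is installed and
# "model.onnx" is a placeholder ONNX model whose operators appear in the table above.
import numpy as np
import mindspore_lite as mslite

# Convert the third-party model to the MindSpore Lite format. Unsupported operators are
# reported at this stage; operators covered by note [4] are merged or replaced during conversion.
converter = mslite.Converter()
converter.convert(fmk_type=mslite.FmkType.ONNX,
                  model_file="model.onnx",
                  output_file="model")  # expected to write model.ms

# Run the converted model on one of the backends from the table (CPU in this sketch).
context = mslite.Context()
context.target = ["cpu"]
model = mslite.Model()
model.build_from_file("model.ms", mslite.ModelType.MINDIR_LITE, context)

# Feed placeholder input data and run inference.
inputs = model.get_inputs()
inputs[0].set_data_from_numpy(np.ones(inputs[0].shape, dtype=np.float32))
outputs = model.predict(inputs)
print(outputs[0].get_data_to_numpy().shape)
```

Swapping `"cpu"` for `"gpu"` or `"ascend"` in `context.target` would exercise the other backend columns of the table, subject to the per-operator support marked above.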