Operator List
mindspore.nn
Operation | Ascend | GPU | CPU | Operator Type
---|---|---|---|---
|  | Supported | Supported | Supported | layer/activation
|  | Supported | Supported | Doing | layer/activation
|  | Supported | Supported | Supported | layer/activation
|  | Supported | Supported | Doing | layer/activation
|  | Doing | Supported | Doing | layer/activation
|  | Doing | Supported | Doing | layer/activation
|  | Supported | Doing | Doing | layer/activation
|  | Supported | Supported | Doing | layer/activation
|  | Supported | Supported | Doing | layer/activation
|  | Supported | Supported | Doing | layer/activation
|  | Supported | Doing | Doing | layer/activation
|  | Supported | Supported | Doing | layer/basic
|  | Supported | Supported | Doing | layer/basic
|  | Supported | Supported | Doing | layer/basic
|  | Supported | Doing | Supported | layer/basic
|  | Doing | Supported | Doing | layer/basic
|  | Doing | Supported | Doing | layer/basic
|  | Supported | Supported | Doing | layer/basic
|  | Supported | Doing | Doing | layer/basic
|  | Supported | Supported | Doing | layer/container
|  | Supported | Supported | Doing | layer/container
|  | Supported | Supported | Supported | layer/conv
|  | Supported | Supported | Doing | layer/conv
|  | Supported | Supported | Supported | layer/conv
|  | Supported | Doing | Doing | layer/conv
|  | Supported | Doing | Doing | layer/conv
|  | Doing | Supported | Doing | layer/embedding
|  | Doing | Doing | Doing | layer/image
|  | Doing | Doing | Doing | layer/image
|  | Doing | Doing | Doing | layer/image
|  | Supported | Doing | Doing | layer/image
|  | Doing | Supported | Supported | layer/lstm
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Supported | Doing | layer/normalization
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Supported | Doing | layer/normalization
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Doing | Doing | layer/normalization
|  | Supported | Supported | Supported | layer/pooling
|  | Doing | Supported | Doing | layer/pooling
|  | Doing | Doing | Doing | loss/loss
|  | Supported | Doing | Doing | loss/loss
|  | Supported | Doing | Doing | loss/loss
|  | Supported | Supported | Doing | loss/loss
|  | Supported | Doing | Doing | loss/loss
|  | Supported | Doing | Doing | optim/ProximalAdagrad
|  | Supported | Doing | Doing | optim/lazyadam
|  | Supported | Doing | Doing | optim/adam
|  | Supported | Supported | Doing | optim/adam
|  | Supported | Supported | Doing | optim/lamb
|  | Supported | Doing | Doing | optim/lars
|  | Supported | Supported | Doing | optim/momentum
|  | Supported | Supported | Doing | optim/optimizer
|  | Supported | Supported | Doing | optim/optimizer
|  | Supported | Doing | Doing | optim/sgd
|  | Supported | Supported | Doing | wrap/cell_wrapper
|  | Supported | Supported | Doing | wrap/cell_wrapper
|  | Supported | Supported | Doing | wrap/cell_wrapper
|  | Doing | Supported | Doing | wrap/cell_wrapper
|  | Doing | Supported | Doing | wrap/cell_wrapper
|  | Supported | Supported | Doing | wrap/cell_wrapper
|  | Supported | Doing | Doing | wrap/cell_wrapper
|  | Supported | Doing | Doing | wrap/grad_reducer
|  | Doing | Doing | Doing | wrap/loss_scale
|  | Doing | Doing | Doing | wrap/loss_scale
|  | Doing | Doing | Doing | wrap/loss_scale
|  | Supported | Supported | Supported | cell
mindspore.ops.operations
Operation | Ascend | GPU | CPU | Operator Type
---|---|---|---|---
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Supported | nn_ops
|  | Doing | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Doing | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Supported | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Doing | Supported | Doing | nn_ops
|  | Doing | Supported | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Supported | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
mindspore.ops.operations.DepthwiseConv2dNativeBackpropFilter | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Supported | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Supported | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
mindspore.ops.operations.SparseSoftmaxCrossEntropyWithLogits | Doing | Supported | Supported | nn_ops
|  | Supported | Supported | Supported | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Doing | Doing | Supported | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Doing | Doing | Supported | nn_ops
|  | Doing | Doing | Supported | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Doing | Supported | Supported | nn_ops
|  | Doing | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Doing | Doing | nn_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Supported | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Supported | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Doing | Supported | Supported | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Doing | Supported | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Doing | Doing | math_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Doing | Doing | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Doing | Doing | Doing | array_ops
|  | Doing | Doing | Doing | array_ops
|  | Supported | Supported | Supported | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Doing | Doing | Supported | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Supported | Doing | comm_ops
|  | Supported | Supported | Doing | comm_ops
|  | Supported | Supported | Doing | comm_ops
|  | Doing | Supported | Doing | comm_ops
|  | Supported | Doing | Doing | comm_ops
|  | Supported | Supported | Supported | control_ops
|  | Doing | Doing | Doing | control_ops
|  | Doing | Doing | Doing | control_ops
|  | Supported | Supported | Supported | debug_ops
|  | Supported | Supported | Supported | debug_ops
|  | Supported | Supported | Supported | debug_ops
|  | Supported | Supported | Supported | debug_ops
|  | Supported | Supported | Supported | debug_ops
|  | Supported | Doing | Doing | debug_ops
|  | Supported | Supported | Doing | other_ops
|  | Supported | Doing | Doing | other_ops
|  | Supported | Doing | Doing | other_ops
|  | Supported | Doing | Doing | other_ops
|  | Supported | Doing | Doing | other_ops
|  | Supported | Doing | Doing | other_ops
|  | Supported | Supported | Supported | other_ops
|  | Supported | Doing | Doing | other_ops
|  | Supported | Supported | Doing | random_ops
|  | Supported | Doing | Doing | random_ops
|  | Supported | Doing | Doing | random_ops
|  | Supported | Doing | Doing | random_ops
|  | Supported | Doing | Doing | random_ops
|  | Doing | Doing | Doing | random_ops
|  | Supported | Doing | Doing | random_ops
|  | Supported | Supported | Supported | inner_ops
|  | Supported | Doing | Doing | array_ops
|  | Supported | Doing | Doing | image_ops
Implicit Type Conversion
Conversion Rules
Scalar and Tensor operations: during an operation, a scalar is automatically converted to a Tensor whose data type matches that of the Tensor involved in the operation. When the Tensor has the bool data type and the scalar is an int or float, both the scalar and the Tensor are converted to a Tensor of data type int32 or float32, respectively.
Operations between Tensors of different data types: the data type priority is bool < uint8 < int8 < int16 < int32 < int64 < float16 < float32 < float64. During an operation, the relatively highest-priority data type among the participating Tensors is determined first, and Tensors of lower-priority data types are converted to it. As a special case, when int8 and uint8 Tensors are operated on together, both are converted to int16.
Data type conversion of Parameter is not supported: a RuntimeError exception is thrown when the conversion rules would require converting the data type of a Parameter defined in the network.
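The rules above can be traced with a minimal sketch in plain Python. This is illustrative code, not MindSpore itself: the function names `promote_tensors` and `promote_scalar` are hypothetical, and only the promotion logic described above is modeled.

```python
# Priority order of dtypes, exactly as listed in the conversion rules.
PRIORITY = ["bool", "uint8", "int8", "int16", "int32", "int64",
            "float16", "float32", "float64"]

def promote_tensors(a, b):
    """Result dtype when Tensors of dtypes a and b are combined."""
    if {a, b} == {"int8", "uint8"}:
        # Special case: int8 and uint8 Tensors are both widened to int16.
        return "int16"
    # Otherwise the relatively highest-priority dtype wins.
    return max(a, b, key=PRIORITY.index)

def promote_scalar(tensor_dtype, scalar):
    """Result dtype when a Python scalar meets a Tensor."""
    if tensor_dtype == "bool" and isinstance(scalar, (int, float)) \
            and not isinstance(scalar, bool):
        # A bool Tensor with an int/float scalar yields int32/float32.
        return "float32" if isinstance(scalar, float) else "int32"
    # Otherwise the scalar adopts the Tensor's dtype.
    return tensor_dtype

print(promote_tensors("int32", "float16"))  # float16
print(promote_tensors("int8", "uint8"))     # int16
print(promote_scalar("bool", 2.0))          # float32
```

Note that `promote_tensors` says nothing about Parameter: under the third rule, an operation whose result dtype differs from a Parameter's dtype raises RuntimeError instead of converting.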
Data Types Involved in Conversion
bool
int8
uint8
int16
int32
int64
float16
float32
float64
Supported Ops
op name |
---|