# Operator Support

## mindspore.nn
| Operation | Ascend | GPU | CPU | Operator Type |
|---|---|---|---|---|
| | Supported | Supported | Supported | layer/activation |
| | Supported | Supported | Doing | layer/activation |
| | Supported | Supported | Supported | layer/activation |
| | Supported | Supported | Doing | layer/activation |
| | Doing | Supported | Doing | layer/activation |
| | Doing | Supported | Doing | layer/activation |
| | Supported | Doing | Doing | layer/activation |
| | Supported | Supported | Doing | layer/activation |
| | Supported | Supported | Doing | layer/activation |
| | Supported | Supported | Doing | layer/activation |
| | Supported | Doing | Doing | layer/activation |
| | Supported | Supported | Doing | layer/basic |
| | Supported | Supported | Doing | layer/basic |
| | Supported | Supported | Doing | layer/basic |
| | Supported | Doing | Supported | layer/basic |
| | Doing | Supported | Doing | layer/basic |
| | Doing | Supported | Doing | layer/basic |
| | Supported | Supported | Doing | layer/basic |
| | Supported | Doing | Doing | layer/basic |
| | Supported | Supported | Doing | layer/container |
| | Supported | Supported | Doing | layer/container |
| | Supported | Supported | Supported | layer/conv |
| | Supported | Supported | Doing | layer/conv |
| | Supported | Supported | Supported | layer/conv |
| | Supported | Doing | Doing | layer/conv |
| | Supported | Doing | Doing | layer/conv |
| | Doing | Supported | Doing | layer/embedding |
| | Doing | Doing | Doing | layer/image |
| | Doing | Doing | Doing | layer/image |
| | Doing | Doing | Doing | layer/image |
| | Supported | Doing | Doing | layer/image |
| | Doing | Supported | Supported | layer/lstm |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Supported | Doing | layer/normalization |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Supported | Doing | layer/normalization |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Doing | Doing | layer/normalization |
| | Supported | Supported | Supported | layer/pooling |
| | Doing | Supported | Doing | layer/pooling |
| | Doing | Doing | Doing | loss/loss |
| | Supported | Doing | Doing | loss/loss |
| | Supported | Doing | Doing | loss/loss |
| | Supported | Supported | Doing | loss/loss |
| | Supported | Doing | Doing | loss/loss |
| | Supported | Doing | Doing | optim/ProximalAdagrad |
| | Supported | Doing | Doing | optim/lazyadam |
| | Supported | Doing | Doing | optim/adam |
| | Supported | Supported | Doing | optim/adam |
| | Supported | Supported | Doing | optim/lamb |
| | Supported | Doing | Doing | optim/lars |
| | Supported | Supported | Doing | optim/momentum |
| | Supported | Supported | Doing | optim/optimizer |
| | Supported | Supported | Doing | optim/optimizer |
| | Supported | Doing | Doing | optim/sgd |
| | Supported | Supported | Doing | wrap/cell_wrapper |
| | Supported | Supported | Doing | wrap/cell_wrapper |
| | Supported | Supported | Doing | wrap/cell_wrapper |
| | Doing | Supported | Doing | wrap/cell_wrapper |
| | Doing | Supported | Doing | wrap/cell_wrapper |
| | Supported | Supported | Doing | wrap/cell_wrapper |
| | Supported | Doing | Doing | wrap/cell_wrapper |
| | Supported | Doing | Doing | wrap/grad_reducer |
| | Doing | Doing | Doing | wrap/loss_scale |
| | Doing | Doing | Doing | wrap/loss_scale |
| | Doing | Doing | Doing | wrap/loss_scale |
| | Supported | Supported | Supported | cell |
## mindspore.ops.operations
| Operation | Ascend | GPU | CPU | Operator Type |
|---|---|---|---|---|
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Supported | nn_ops |
| | Doing | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Doing | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Supported | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Doing | Supported | Doing | nn_ops |
| | Doing | Supported | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Supported | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| mindspore.ops.operations.DepthwiseConv2dNativeBackpropFilter | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Supported | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Supported | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| mindspore.ops.operations.SparseSoftmaxCrossEntropyWithLogits | Doing | Supported | Supported | nn_ops |
| | Supported | Supported | Supported | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Doing | Doing | Supported | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Doing | Doing | Supported | nn_ops |
| | Doing | Doing | Supported | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Doing | Supported | Supported | nn_ops |
| | Doing | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Doing | Doing | nn_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Supported | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Supported | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Doing | Supported | Supported | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Doing | Supported | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Doing | Doing | math_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Doing | Doing | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Doing | Doing | Doing | array_ops |
| | Doing | Doing | Doing | array_ops |
| | Supported | Supported | Supported | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Doing | Doing | Supported | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Supported | Doing | comm_ops |
| | Supported | Supported | Doing | comm_ops |
| | Supported | Supported | Doing | comm_ops |
| | Doing | Supported | Doing | comm_ops |
| | Supported | Doing | Doing | comm_ops |
| | Supported | Supported | Supported | control_ops |
| | Doing | Doing | Doing | control_ops |
| | Doing | Doing | Doing | control_ops |
| | Supported | Supported | Supported | debug_ops |
| | Supported | Supported | Supported | debug_ops |
| | Supported | Supported | Supported | debug_ops |
| | Supported | Supported | Supported | debug_ops |
| | Supported | Supported | Supported | debug_ops |
| | Supported | Doing | Doing | debug_ops |
| | Supported | Supported | Doing | other_ops |
| | Supported | Doing | Doing | other_ops |
| | Supported | Doing | Doing | other_ops |
| | Supported | Doing | Doing | other_ops |
| | Supported | Doing | Doing | other_ops |
| | Supported | Doing | Doing | other_ops |
| | Supported | Supported | Supported | other_ops |
| | Supported | Doing | Doing | other_ops |
| | Supported | Supported | Doing | random_ops |
| | Supported | Doing | Doing | random_ops |
| | Supported | Doing | Doing | random_ops |
| | Supported | Doing | Doing | random_ops |
| | Supported | Doing | Doing | random_ops |
| | Doing | Doing | Doing | random_ops |
| | Supported | Doing | Doing | random_ops |
| | Supported | Supported | Supported | inner_ops |
| | Supported | Doing | Doing | array_ops |
| | Supported | Doing | Doing | image_ops |
## Implicit Type Conversion

### Conversion Rules

- Scalar and Tensor operations: during the operation, the scalar is automatically converted to a Tensor whose data type matches that of the Tensor operand. However, when the Tensor's data type is bool and the scalar is an int or a float, both the scalar and the Tensor are converted to a Tensor of type int32 or float32, respectively.
- Operations between Tensors of different data types: the data type priority order is bool < uint8 < int8 < int16 < int32 < int64 < float16 < float32 < float64. During the operation, the highest-priority data type among the participating Tensors is determined first, and Tensors of lower-priority types are converted to it. However, when Tensors of types int8 and uint8 are operated on together, both are converted to int16.
- Data type conversion of Parameters is not supported: if applying the conversion rules would require converting the data type of a Parameter defined in the network, a RuntimeError is raised.
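The promotion rules above can be sketched in plain Python. This is only an illustrative model of the rules, not MindSpore's actual implementation; the names `PRIORITY`, `promote`, and `promote_scalar` are hypothetical helpers introduced for the sketch.

```python
# Priority order for implicit type conversion between Tensors.
PRIORITY = ["bool", "uint8", "int8", "int16", "int32", "int64",
            "float16", "float32", "float64"]

def promote(*dtypes):
    """Result dtype for an operation on Tensors of mixed dtypes."""
    # Special case: int8 combined with uint8 promotes to int16.
    if {"int8", "uint8"} <= set(dtypes):
        return "int16"
    # Otherwise the highest-priority dtype among the operands wins.
    return max(dtypes, key=PRIORITY.index)

def promote_scalar(tensor_dtype, scalar):
    """Result dtype when a Python scalar meets a Tensor."""
    if tensor_dtype == "bool":
        # bool Tensor with an int/float scalar: both become int32/float32.
        return "float32" if isinstance(scalar, float) else "int32"
    # Otherwise the scalar simply adopts the Tensor's dtype.
    return tensor_dtype

print(promote("int32", "float16"))     # float16 (floats outrank ints here)
print(promote("int8", "uint8"))        # int16 (special case)
print(promote_scalar("bool", 3))       # int32
print(promote_scalar("float32", 2.0))  # float32
```

Note that in this ordering float16 outranks int64, so mixing an integer Tensor with a float16 Tensor yields float16, unlike NumPy's promotion rules.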
### Data Types Participating in Conversion

- bool
- int8
- uint8
- int16
- int32
- int64
- float16
- float32
- float64
### Supported Operators

| Operation |
|---|