mindsponge.cell.OuterProductMean

class mindsponge.cell.OuterProductMean(num_outer_channel, act_dim, num_output_channel, batch_size=None, slice_num=0)[source]

Computes the correlation of the input tensor along its second dimension; the computed correlation can be used to update correlation features (e.g. the Pair representation).

\[OuterProductMean(\mathbf{act}) = Linear(flatten(mean(\mathbf{act}\otimes\mathbf{act})))\]
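The formula above can be sketched in plain NumPy. This is an illustrative, hypothetical implementation: the real layer uses learned `Linear` projections and mask-based normalization, while here random weights (`w_left`, `w_right`, `w_out`) stand in for them.

```python
import numpy as np

dim_1, dim_2, act_dim = 4, 5, 3
num_outer_channel, num_output_channel = 2, 6
rng = np.random.default_rng(0)

act = rng.normal(size=(dim_1, dim_2, act_dim))

# Two projections of act down to num_outer_channel (assumed weights,
# standing in for the layer's learned Linear projections).
w_left = rng.normal(size=(act_dim, num_outer_channel))
w_right = rng.normal(size=(act_dim, num_outer_channel))
left = act @ w_left    # (dim_1, dim_2, num_outer_channel)
right = act @ w_right  # (dim_1, dim_2, num_outer_channel)

# Outer product over the channel dimensions, averaged along dim_1.
outer = np.einsum('abc,ade->bdce', left, right) / dim_1  # (dim_2, dim_2, C, C)

# Flatten the channel pair and apply the output Linear (assumed weights).
flat = outer.reshape(dim_2, dim_2, num_outer_channel ** 2)
w_out = rng.normal(size=(num_outer_channel ** 2, num_output_channel))
out = flat @ w_out
print(out.shape)  # (5, 5, 6)
```

Note how the first dimension of `act` is reduced by the mean, so the output is pairwise over the second dimension, matching the \((dim_2, dim_2, num\_output\_channel)\) output shape documented below.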
Parameters
  • num_outer_channel (int) – The last dimension size of the intermediate layer in OuterProductMean.

  • act_dim (int) – The last dimension size of the input act.

  • num_output_channel (int) – The last dimension size of output.

  • batch_size (int) – The batch size of parameters in OuterProductMean, used with while control flow. Default: "None".

  • slice_num (int) – The number of slices the OuterProductMean layer splits its computation into when memory would otherwise overflow. Default: 0.

Inputs:
  • act (Tensor) - The input tensor with shape \((dim_1, dim_2, act\_dim)\).

  • mask (Tensor) - The mask for OuterProductMean with shape \((dim_1, dim_2)\).

  • mask_norm (Tensor) - The inner product of mask with itself along its first dimension (i.e. \(mask^T mask\), whose diagonal holds the squared L2-norms), pre-computed to avoid recomputation. Its shape is \((dim_2, dim_2, 1)\).

  • index (Tensor) - The index of the while loop; only required with while control flow. Default: "None".
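The mask_norm input can be pre-computed from mask with a transposed matrix product plus a trailing axis, as the Examples section below does with MindSpore operations. A minimal NumPy sketch of the same computation:

```python
import numpy as np

dim_1, dim_2 = 32, 64
mask = np.ones((dim_1, dim_2), dtype=np.float32)

# mask^T @ mask reduces the first dimension, giving a (dim_2, dim_2)
# overlap matrix; a trailing axis of size 1 is then appended.
mask_norm = np.expand_dims(mask.T @ mask, -1)
print(mask_norm.shape)  # (64, 64, 1)
```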

Outputs:

Tensor, the float tensor of the output of OuterProductMean layer with shape \((dim_2, dim_2, num\_output\_channel)\).

Supported Platforms:

Ascend GPU

Examples

>>> import numpy as np
>>> from mindsponge.cell import OuterProductMean
>>> from mindspore import dtype as mstype
>>> from mindspore import Tensor
>>> from mindspore.ops import operations as P
>>> model = OuterProductMean(num_outer_channel=32, act_dim=128, num_output_channel=256)
>>> act = Tensor(np.ones((32, 64, 128)), mstype.float32)
>>> mask = Tensor(np.ones((32, 64)), mstype.float32)
>>> mask_norm = P.ExpandDims()(P.MatMul(transpose_a=True)(mask, mask), -1)
>>> output = model(act, mask, mask_norm)
>>> print(output.shape)
(64, 64, 256)