mindspore.ops.BatchMatMul

class mindspore.ops.BatchMatMul(transpose_a=False, transpose_b=False)[source]

Computes matrix multiplication between two tensors by batch.

\[\text{output}[..., :, :] = \text{matrix}(x[..., :, :]) * \text{matrix}(y[..., :, :])\]

The two input tensors must have the same rank, and the rank must not be less than 3.
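Conceptually, each \((N, C)\) slice of x is matrix-multiplied with the corresponding \((C, M)\) slice of y. A minimal sketch of this per-batch semantics (assuming a working MindSpore install; the comparison against numpy.matmul, which batches over the leading dimensions in the same way, is an illustration rather than part of the official examples):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x_np = np.arange(24, dtype=np.float32).reshape(2, 2, 2, 3)
>>> y_np = np.arange(24, dtype=np.float32).reshape(2, 2, 3, 2)
>>> ms_out = ops.BatchMatMul()(Tensor(x_np), Tensor(y_np)).asnumpy()
>>> np_out = np.matmul(x_np, y_np)  # numpy applies matmul slice by slice over the batch dims
>>> np.allclose(ms_out, np_out)
True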

Parameters
  • transpose_a (bool) – If True, the last two dimensions of x are transposed before multiplication. Default: False.

  • transpose_b (bool) – If True, the last two dimensions of y are transposed before multiplication. Default: False.

Inputs:
  • x (Tensor) - The first tensor to be multiplied. The shape of the tensor is \((*B, N, C)\), where \(*B\) represents the batch size which can be multidimensional, \(N\) and \(C\) are the size of the last two dimensions. If transpose_a is True, its shape must be \((*B, C, N)\).

  • y (Tensor) - The second tensor to be multiplied. The shape of the tensor is \((*B, C, M)\). If transpose_b is True, its shape must be \((*B, M, C)\).

Outputs:

Tensor, the shape of the output tensor is \((*B, N, M)\).
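As a shape-level illustration of the transpose flags (a sketch, not part of the official examples below): with transpose_a the first input is supplied as \((*B, C, N)\), with transpose_b the second input is supplied as \((*B, M, C)\), and the output is still \((*B, N, M)\).

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> a = Tensor(np.ones(shape=[2, 4, 3, 5]), mindspore.float32)  # (*B, C, N) with C=3, N=5
>>> b = Tensor(np.ones(shape=[2, 4, 6, 3]), mindspore.float32)  # (*B, M, C) with M=6, C=3
>>> out = ops.BatchMatMul(transpose_a=True, transpose_b=True)(a, b)
>>> print(out.shape)
(2, 4, 5, 6)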

Raises
  • TypeError – If transpose_a or transpose_b is not a bool.

  • ValueError – If the rank of x is not equal to the rank of y, or the rank of x is less than 3.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.ones(shape=[2, 4, 1, 3]), mindspore.float32)
>>> y = Tensor(np.ones(shape=[2, 4, 3, 4]), mindspore.float32)
>>> batmatmul = ops.BatchMatMul()
>>> output = batmatmul(x, y)
>>> print(output)
[[[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]
 [[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]]
>>> x = Tensor(np.ones(shape=[2, 4, 3, 1]), mindspore.float32)
>>> y = Tensor(np.ones(shape=[2, 4, 3, 4]), mindspore.float32)
>>> batmatmul = ops.BatchMatMul(transpose_a=True)
>>> output = batmatmul(x, y)
>>> print(output)
[[[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]
 [[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]]