Differences with torch.broadcast_to


The following mapping relationships can be found in this file.

| PyTorch APIs              | MindSpore APIs                |
| ------------------------- | ----------------------------- |
| torch.broadcast_to        | mindspore.ops.broadcast_to    |
| torch.Tensor.broadcast_to | mindspore.Tensor.broadcast_to |

torch.broadcast_to

torch.broadcast_to(input, shape) -> Tensor

For more information, see torch.broadcast_to.

mindspore.ops.broadcast_to

mindspore.ops.broadcast_to(input, shape) -> Tensor

For more information, see mindspore.ops.broadcast_to.

Differences

PyTorch: Broadcasts the input tensor to the target shape.

MindSpore: The MindSpore API basically implements the same function as PyTorch, with additional support for -1 in the target shape. A -1 in the target shape is replaced by the size of the input in that dimension, so -1 can only appear in a dimension that already exists in the input; it cannot be placed in a newly added dimension (see Code Example 2 below).

| Categories | Subcategories | PyTorch | MindSpore | Differences   |
| ---------- | ------------- | ------- | --------- | ------------- |
| Input      | Single input  | input   | input     | Same function |
| Parameter  | Parameter 1   | shape   | shape     | Same function |

Code Example 1

# PyTorch
import torch

shape = (2, 3)
x = torch.tensor([[1], [2]]).float()
torch_output = torch.broadcast_to(x, shape)
print(torch_output.numpy())
# [[1. 1. 1.]
#  [2. 2. 2.]]

# MindSpore
import mindspore
from mindspore import Tensor
import numpy as np

shape = (2, 3)
x = Tensor(np.array([[1], [2]]).astype(np.float32))
output = mindspore.ops.broadcast_to(x, shape)
print(output)
# [[1. 1. 1.]
#  [2. 2. 2.]]
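
Code Example 2

The following sketch illustrates the -1 support described in the Differences section: a -1 in the target shape is replaced by the size of the input in that dimension. It reuses the same input tensor as Code Example 1 and is intended only as an illustration of the MindSpore-side behavior.

# MindSpore
import mindspore
from mindspore import Tensor
import numpy as np

# x has shape (2, 1); -1 in the target shape keeps the size of that
# dimension, so (-1, 3) is resolved to (2, 3) before broadcasting.
shape = (-1, 3)
x = Tensor(np.array([[1], [2]]).astype(np.float32))
output = mindspore.ops.broadcast_to(x, shape)
print(output)
# [[1. 1. 1.]
#  [2. 2. 2.]]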