mindspore.ops.ScatterUpdate

class mindspore.ops.ScatterUpdate(use_locking=True)

Updates tensor values by using input indices and values.

Using the given values and the input indices, this operator updates the value of input_x as follows.

For each i, ..., j in indices.shape:

\[\text{input_x}[\text{indices}[i, ..., j], :] = \text{updates}[i, ..., j, :]\]

The inputs input_x and updates comply with the implicit type conversion rules to make their data types consistent. If they have different data types, the lower-priority data type is converted to the higher-priority data type.
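
For intuition, when indices is 1-D this rule matches plain NumPy row assignment. The snippet below is a minimal illustrative sketch of that equivalence; the values are arbitrary and not part of the API.

>>> import numpy as np
>>> x = np.array([[-0.1, 0.3, 3.6], [0.4, 0.5, -3.2]], dtype=np.float32)
>>> idx = np.array([1, 0], dtype=np.int32)
>>> upd = np.array([[2.0, 1.2, 1.0], [3.0, 1.2, 1.0]], dtype=np.float32)
>>> x[idx] = upd   # x[idx[i], :] = upd[i, :] for each i
>>> print(x)
[[3.  1.2 1. ]
 [2.  1.2 1. ]]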

Parameters

use_locking (bool) – Whether to protect the assignment by a lock. Default: True.

Inputs:
  • input_x (Parameter) - The target tensor, with data type of Parameter. The shape is 0-D or \((N, *)\), where \(*\) means any number of additional dimensions.

  • indices (Tensor) - The index of the input tensor, with int32 data type. If there are duplicates in indices, the order of updating is undefined.

  • updates (Tensor) - The tensor used to update the input tensor, with the same data type as input_x, and updates.shape = indices.shape + input_x.shape[1:] (see the shape sketch after this list).
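
The following sketch is purely illustrative (plain Python shape arithmetic, not the operator itself) and shows how the required updates shape follows from indices and input_x for an example 2-D indices shape; the shapes used here are arbitrary.

>>> input_x_shape = (4, 3)                             # (N, *) shape of the target Parameter
>>> indices_shape = (2, 2)                             # an arbitrary index shape
>>> updates_shape = indices_shape + input_x_shape[1:]  # one row of input_x per index entry
>>> print(updates_shape)
(2, 2, 3)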

Outputs:

Tensor, has the same shape and type as input_x.

Raises
  • TypeError – If use_locking is not a bool.

  • TypeError – If the data type of indices is not int32.

  • ValueError – If the shape of updates is not equal to indices.shape + input_x.shape[1:].

  • RuntimeError – If input_x and updates require a data type conversion of the Parameter, but data type conversion of Parameter is not supported.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> np_x = np.array([[-0.1, 0.3, 3.6], [0.4, 0.5, -3.2]])
>>> input_x = mindspore.Parameter(Tensor(np_x, mindspore.float32), name="x")
>>> indices = Tensor(np.array([0, 1]), mindspore.int32)
>>> np_updates = np.array([[2.0, 1.2, 1.0], [3.0, 1.2, 1.0]])
>>> updates = Tensor(np_updates, mindspore.float32)
>>> op = ops.ScatterUpdate()
>>> output = op(input_x, indices, updates)
>>> print(output)
[[2. 1.2  1.]
 [3. 1.2  1.]]