mindspore.get_grad
- mindspore.get_grad(gradients, identifier)
When return_ids of mindspore.grad() or mindspore.value_and_grad() is set to True, use the return value of mindspore.grad(), or the second return value of mindspore.value_and_grad(), as gradients. Then find the specific gradient from gradients according to identifier. Two typical cases are covered:
- identifier is the position of the specific tensor whose gradient is wanted.
- identifier is a parameter of a network (a sketch of this case follows the example below).
- Parameters
gradients (Union[tuple[int, Tensor], tuple[tuple, tuple]]) – The return value of mindspore.grad() when return_ids is set to True.
identifier (Union[int, Parameter]) – The position number of a tensor, or a parameter that is used in mindspore.grad().
- Returns
The gradient of the tensor at the position, or of the parameter, specified by identifier.
- Raises
RuntimeError – If the gradient is not found.
TypeError – If the type of the arguments does not belong to the required ones.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> from mindspore import grad, get_grad
>>>
>>> # Cell object to be differentiated
>>> class Net(nn.Cell):
...     def construct(self, x, y, z):
...         return x * y * z
>>> x = Tensor([1, 2], mindspore.float32)
>>> y = Tensor([-2, 3], mindspore.float32)
>>> z = Tensor([0, 3], mindspore.float32)
>>> net = Net()
>>> out_grad = grad(net, grad_position=(1, 2), return_ids=True)(x, y, z)
>>> output = get_grad(out_grad, 1)
>>> print(output)
[0. 6.]
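The example above covers the position-based identifier. Below is a minimal sketch of the parameter-based case, assuming a hypothetical ParamNet whose only trainable parameter is w (both names are illustrative, not part of the API). Passing weights to mindspore.grad() with return_ids=True lets get_grad look a gradient up by its Parameter; with the default all-ones sensitivity, the gradient of x * w with respect to w should equal x, i.e. [1. 2.].
>>> import mindspore
>>> from mindspore import Tensor, Parameter, nn
>>> from mindspore import grad, get_grad
>>>
>>> # Hypothetical network with a single trainable weight `w` (illustrative only)
>>> class ParamNet(nn.Cell):
...     def __init__(self):
...         super().__init__()
...         self.w = Parameter(Tensor([2, 2], mindspore.float32), name="w")
...     def construct(self, x):
...         return x * self.w
>>> net = ParamNet()
>>> x = Tensor([1, 2], mindspore.float32)
>>> # grad_position=None: differentiate with respect to the weights only
>>> out_grad = grad(net, grad_position=None, weights=net.trainable_params(), return_ids=True)(x)
>>> # Look up the gradient by the Parameter object used in mindspore.grad()
>>> w_grad = get_grad(out_grad, net.w)
>>> print(w_grad)
[1. 2.]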