Comparing the functional differences with torch.autograd.enable_grad and torch.autograd.no_grad


torch.autograd.enable_grad

torch.autograd.enable_grad()

For more information, see torch.autograd.enable_grad.

torch.autograd.no_grad

torch.autograd.no_grad()

For more information, see torch.autograd.no_grad.

mindspore.ops.stop_gradient

mindspore.ops.stop_gradient(input)

For more information, see mindspore.ops.stop_gradient.

Differences

PyTorch: torch.autograd.enable_grad is a context manager that enables gradient calculation (for example, inside a region where it was disabled), and torch.autograd.no_grad is a context manager that disables gradient calculation for all operations in its scope.
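A minimal sketch of the PyTorch behavior (assuming PyTorch is installed; tensor values are illustrative):

```python
import torch

x = torch.tensor([1.0], requires_grad=True)

# Inside no_grad, operations are not tracked for autograd.
with torch.autograd.no_grad():
    y = x * 2
print(y.requires_grad)  # False

# enable_grad re-enables tracking even inside a no_grad scope.
with torch.autograd.no_grad():
    with torch.autograd.enable_grad():
        z = x * 2
print(z.requires_grad)  # True
```

Both act on a scope: every operation inside the `with` block is affected, rather than a single tensor.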

MindSpore: mindspore.ops.stop_gradient takes a tensor as input and returns it unchanged, but blocks gradients from propagating through it during backpropagation. It therefore disables gradient calculation for a specific operator's output rather than for a whole scope.