Document feedback

Question document fragment

When a quoted question document fragment contains a formula, the formula is displayed as a blank space.

Submission type
issue


Problem type
Specifications and Common Mistakes

- Specifications and Common Mistakes:
  - Misspellings or punctuation mistakes, incorrect formulas, or abnormal display.
  - Incorrect links, empty cells, or wrong formats.
  - Chinese characters in an English context.
  - Minor inconsistencies between the UI and descriptions.
  - Low writing fluency that does not affect understanding.
  - Incorrect version numbers, including software package names and version numbers on the UI.

- Usability:
  - Incorrect or missing key steps.
  - Missing main function descriptions, keyword explanations, necessary prerequisites, or precautions.
  - Ambiguous descriptions, unclear references, or contradictory context.
  - Unclear logic, such as missing classifications, items, or steps.

- Correctness:
  - Technical principles, function descriptions, supported platforms, parameter types, or exceptions inconsistent with the software implementation.
  - Incorrect schematic or architecture diagrams.
  - Incorrect commands or command parameters.
  - Incorrect code.
  - Commands inconsistent with the functions.
  - Wrong screenshots.
  - Sample code that fails to run, or running results inconsistent with expectations.

- Risk Warnings:
  - Lack of risk warnings for operations that may damage the system or important data.

- Content Compliance:
  - Content that may violate applicable laws and regulations, or geo-culturally sensitive words and expressions.
  - Copyright infringement.


Problem description

Describe the bug so that we can quickly locate the problem.

mindspore.ops.AdamWeightDecay

class mindspore.ops.AdamWeightDecay(use_locking=False)[source]

Updates gradients by the Adaptive Moment Estimation algorithm with weight decay (AdamWeightDecay).

The Adam algorithm is proposed in Adam: A Method for Stochastic Optimization. The AdamWeightDecay variant was proposed in Decoupled Weight Decay Regularization.

The updating formulas are as follows,

m ← β₁·m + (1 − β₁)·g
v ← β₂·v + (1 − β₂)·g·g
update ← m / (√v + ϵ)
update ← update + weight_decay·w   (if weight_decay > 0; otherwise unchanged)
w ← w − lr·update

m represents the 1st moment vector, v represents the 2nd moment vector, g represents the gradient, β₁ and β₂ represent beta1 and beta2, lr represents learning_rate, w represents var, weight_decay represents the weight decay, and ϵ represents epsilon.
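The updating formulas above can be sketched in plain NumPy (a minimal reference sketch, not the MindSpore implementation; the function name and argument order are illustrative):

```python
import numpy as np

def adam_weight_decay_step(var, m, v, grad, lr=1e-3, beta1=0.9,
                           beta2=0.999, epsilon=1e-8, weight_decay=0.0):
    """One AdamWeightDecay update, following the formulas above."""
    m = beta1 * m + (1 - beta1) * grad           # 1st moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad    # 2nd moment estimate
    update = m / (np.sqrt(v) + epsilon)
    if weight_decay > 0:
        update = update + weight_decay * var     # decoupled weight decay
    var = var - lr * update
    return var, m, v
```

Note that the weight decay term is added to the update directly (decoupled from the moment estimates), which is what distinguishes AdamWeightDecay from plain Adam with L2 regularization.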

Parameters

use_locking (bool) – Whether to enable a lock to protect variable tensors from being updated. If True, updates of the var, m, and v tensors will be protected by a lock. If False, the result is unpredictable. Default: False.

Inputs:
  • var (Union[Parameter, Tensor]) - Weights to be updated. The shape is (N, *) where * means any number of additional dimensions. The data type can be float16 or float32.

  • m (Union[Parameter, Tensor]) - The 1st moment vector in the updating formula; it should have the same shape as var. The data type can be float16 or float32.

  • v (Union[Parameter, Tensor]) - The 2nd moment vector in the updating formula, it should have the same shape as m.

  • lr (float) - lr in the updating formula. The paper suggested value is 10⁻⁸; the data type should be float32.

  • beta1 (float) - The exponential decay rate for the 1st moment estimations; the data type should be float32. The paper suggested value is 0.9.

  • beta2 (float) - The exponential decay rate for the 2nd moment estimations; the data type should be float32. The paper suggested value is 0.999.

  • epsilon (float) - Term added to the denominator to improve numerical stability, the data type should be float32.

  • decay (float) - The weight decay value, must be a scalar tensor with float32 data type. Default: 0.0 .

  • gradient (Tensor) - Gradient, has the same shape as var.

Outputs:

Tuple of 3 Tensors, the updated parameters.

  • var (Tensor) - The same shape and data type as var.

  • m (Tensor) - The same shape and data type as m.

  • v (Tensor) - The same shape and data type as v.

Raises
  • TypeError – If use_locking is not a bool.

  • TypeError – If lr, beta1, beta2, epsilon or decay is not a float32.

  • TypeError – If var, m or v is neither float16 nor float32.

  • TypeError – If gradient is not a Tensor.

  • ValueError – If epsilon <= 0.

  • ValueError – If beta1 or beta2 is not in the range (0.0, 1.0).

  • ValueError – If decay < 0.
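The conditions in the Raises list can be mirrored in a small standalone validation sketch (a hypothetical helper for illustration, not part of the MindSpore API; MindSpore performs these checks internally):

```python
def validate_adam_weight_decay_args(use_locking, lr, beta1, beta2,
                                    epsilon, decay):
    """Mirror the Raises table above for plain Python scalar arguments."""
    if not isinstance(use_locking, bool):
        raise TypeError("use_locking must be a bool")
    for name, val in (("lr", lr), ("beta1", beta1), ("beta2", beta2),
                      ("epsilon", epsilon), ("decay", decay)):
        if not isinstance(val, float):
            raise TypeError(f"{name} must be a float")
    if epsilon <= 0:
        raise ValueError("epsilon must be > 0")
    if not (0.0 < beta1 < 1.0 and 0.0 < beta2 < 1.0):
        raise ValueError("beta1 and beta2 must be in (0.0, 1.0)")
    if decay < 0:
        raise ValueError("decay must be >= 0")
```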

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore.nn as nn
>>> from mindspore import Tensor, Parameter, ops
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.adam_weight_decay = ops.AdamWeightDecay()
...         self.var = Parameter(Tensor(np.ones([2, 2]).astype(np.float32)), name="var")
...         self.m = Parameter(Tensor(np.ones([2, 2]).astype(np.float32)), name="m")
...         self.v = Parameter(Tensor(np.ones([2, 2]).astype(np.float32)), name="v")
...     def construct(self, lr, beta1, beta2, epsilon, decay, grad):
...         out = self.adam_weight_decay(self.var, self.m, self.v, lr, beta1, beta2,
...                               epsilon, decay, grad)
...         return out
>>> net = Net()
>>> gradient = Tensor(np.ones([2, 2]).astype(np.float32))
>>> output = net(0.001, 0.9, 0.999, 1e-8, 0.0, gradient)
>>> print(net.var.asnumpy())
[[0.999 0.999]
 [0.999 0.999]]
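The printed result can be checked against the updating formulas by hand: with all-ones states and gradients, m and v stay 1, update ≈ 1, and var becomes 1 − 0.001·1 ≈ 0.999. A quick NumPy cross-check of the example's numbers (illustrative only, independent of MindSpore):

```python
import numpy as np

# Reproduce one AdamWeightDecay step with the example's inputs.
var = np.ones((2, 2), dtype=np.float32)
m = np.ones((2, 2), dtype=np.float32)
v = np.ones((2, 2), dtype=np.float32)
grad = np.ones((2, 2), dtype=np.float32)
lr, beta1, beta2, epsilon, decay = 0.001, 0.9, 0.999, 1e-8, 0.0

m_new = beta1 * m + (1 - beta1) * grad         # -> all ~1.0
v_new = beta2 * v + (1 - beta2) * grad * grad  # -> all ~1.0
update = m_new / (np.sqrt(v_new) + epsilon)    # -> all ~1.0
var_new = var - lr * update                    # -> all ~0.999

print(var_new)
```

This matches the var values printed by the MindSpore example above.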