# Function Differences with tf.compat.v1.train.exponential_decay
## tf.compat.v1.train.exponential_decay

```python
tf.compat.v1.train.exponential_decay(
    learning_rate,
    global_step,
    decay_steps,
    decay_rate,
    staircase=False,
    name=None
) -> Tensor
```
## mindspore.nn.exponential_decay_lr

```python
mindspore.nn.exponential_decay_lr(
    learning_rate,
    decay_rate,
    total_step,
    step_per_epoch,
    decay_epoch,
    is_stair=False
) -> list[float]
```
## Differences

TensorFlow: Computes the learning rate based on an exponential decay function.

MindSpore: This MindSpore API is basically the same as TensorFlow in function.
| Categories | Subcategories | TensorFlow | MindSpore | Differences |
| --- | --- | --- | --- | --- |
| Parameters | Parameter 1 | learning_rate | learning_rate | - |
| | Parameter 2 | global_step | total_step | Same function, different parameter names |
| | Parameter 3 | decay_steps | decay_epoch | Same function, different parameter names |
| | Parameter 4 | decay_rate | decay_rate | - |
| | Parameter 5 | staircase | is_stair | Same function, different parameter names |
| | Parameter 6 | name | - | Not involved |
| | Parameter 7 | - | step_per_epoch | Number of steps in each epoch; TensorFlow does not have this parameter |
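The parameter mapping in the table can be checked with a small pure-Python sketch of the staircase schedule that both APIs compute (the helper names below are made up for illustration; they are not part of either library). TensorFlow's `decay_steps` corresponds to `step_per_epoch * decay_epoch` on the MindSpore side:

```python
# Pure-Python sketch (not library code) of staircase exponential decay:
# lr * decay_rate ** floor(progress), as both APIs compute it.
def tf_style_schedule(learning_rate, decay_rate, decay_steps, num_steps):
    # TensorFlow (staircase=True): the exponent is global_step // decay_steps
    return [round(learning_rate * decay_rate ** (step // decay_steps), 2)
            for step in range(num_steps)]

def ms_style_schedule(learning_rate, decay_rate, total_step, step_per_epoch, decay_epoch):
    # MindSpore (is_stair=True): the exponent is (i // step_per_epoch) // decay_epoch
    return [round(learning_rate * decay_rate ** ((i // step_per_epoch) // decay_epoch), 2)
            for i in range(total_step)]

# With decay_steps = step_per_epoch * decay_epoch, the two schedules coincide.
print(tf_style_schedule(1.0, 0.9, decay_steps=2, num_steps=6))
print(ms_style_schedule(1.0, 0.9, total_step=6, step_per_epoch=2, decay_epoch=1))
# both: [1.0, 1.0, 0.9, 0.9, 0.81, 0.81]
```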
## Code Example

The two APIs implement the same function and are used in the same way.
```python
# TensorFlow
import tensorflow as tf

learning_rate = 1.0
decay_rate = 0.9
step_per_epoch = 2
epochs = 6
lr = []
for epoch in range(epochs):
    # In eager mode the v1 schedule returns a no-argument callable; keep it in a
    # separate variable so the base learning rate is not overwritten each iteration.
    decayed_lr = tf.compat.v1.train.exponential_decay(
        learning_rate, epoch, step_per_epoch, decay_rate, staircase=True)
    lr.append(round(float(decayed_lr().numpy()), 2))
print(lr)
# [1.0, 1.0, 0.9, 0.9, 0.81, 0.81]
```
```python
# MindSpore
import mindspore.nn as nn

learning_rate = 1.0
decay_rate = 0.9
total_step = 6
step_per_epoch = 2
decay_epoch = 1
output = nn.exponential_decay_lr(learning_rate, decay_rate, total_step, step_per_epoch, decay_epoch)
print(output)
# [1.0, 1.0, 0.9, 0.9, 0.81, 0.81]
```
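The effect of `is_stair` can also be illustrated without MindSpore installed. The sketch below (a hypothetical re-implementation, for illustration only) follows the documented formula `lr * decay_rate ** (current_epoch / decay_epoch)` with `current_epoch = floor(i / step_per_epoch)`; when `is_stair=True` the exponent is floored, so the rate only drops after a full `decay_epoch` has elapsed:

```python
import math

# Hypothetical sketch of mindspore.nn.exponential_decay_lr's formula
# (not the library implementation).
def exponential_decay_lr_sketch(learning_rate, decay_rate, total_step,
                                step_per_epoch, decay_epoch, is_stair=False):
    lrs = []
    for i in range(total_step):
        exponent = (i // step_per_epoch) / decay_epoch  # current_epoch / decay_epoch
        if is_stair:
            exponent = math.floor(exponent)  # staircase: decay in discrete jumps
        lrs.append(learning_rate * decay_rate ** exponent)
    return lrs

# Smooth decay: the rate shrinks every epoch by decay_rate ** (1 / decay_epoch).
print([round(v, 4) for v in exponential_decay_lr_sketch(1.0, 0.9, 6, 2, 2)])
# Staircase: the rate is held until decay_epoch full epochs have passed.
print([round(v, 4) for v in exponential_decay_lr_sketch(1.0, 0.9, 6, 2, 2, is_stair=True)])
```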