mindspore.LearningRateScheduler
- class mindspore.LearningRateScheduler(learning_rate_function)[source]
Callback to change the learning rate during training.
- Parameters
learning_rate_function (Function) – The function that defines how to change the learning rate during training. It takes the current learning rate and the current step number, and returns the new learning rate.
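The contract of learning_rate_function can be sketched in plain Python. The step-decay schedule below mirrors the one in the Examples section; the simulation loop is only illustrative, not part of the API:

```python
def learning_rate_function(lr, cur_step_num):
    """Example schedule: decay the learning rate by 10x every 1000 steps."""
    if cur_step_num % 1000 == 0:
        lr = lr * 0.1
    return lr

# Simulate what the callback does at the end of each training step:
lr = 0.1
for step in range(1, 3001):
    lr = learning_rate_function(lr, step)
# Decay fires at steps 1000, 2000 and 3000, so lr ends near 0.1 * 0.1**3 = 1e-4.
```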
Examples
>>> import numpy as np
>>> import mindspore as ms
>>> from mindspore import nn
>>> from mindspore import LearningRateScheduler
>>> from mindspore import dataset as ds
...
>>> def learning_rate_function(lr, cur_step_num):
...     if cur_step_num % 1000 == 0:
...         lr = lr * 0.1
...     return lr
...
>>> lr = 0.1
>>> momentum = 0.9
>>> net = nn.Dense(10, 5)
>>> loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')
>>> optim = nn.Momentum(net.trainable_params(), learning_rate=lr, momentum=momentum)
>>> model = ms.Model(net, loss_fn=loss, optimizer=optim)
...
>>> data = {"x": np.float32(np.random.rand(64, 10)), "y": np.random.randint(0, 5, (64,))}
>>> dataset = ds.NumpySlicesDataset(data=data).batch(32)
>>> model.train(1, dataset, callbacks=[LearningRateScheduler(learning_rate_function)],
...             dataset_sink_mode=False)
- step_end(run_context)[source]
Change the learning rate at the end of each step.
- Parameters
run_context (RunContext) – Context information of the running process, including the optimizer and the current step number.
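The effect of step_end can be sketched with simplified stand-ins. MockOptimizer, MockRunContext, and the dict returned by original_args below are illustrative mocks for this sketch, not MindSpore's real RunContext API:

```python
class MockOptimizer:
    """Stand-in for an optimizer holding a scalar learning rate."""
    def __init__(self, lr):
        self.learning_rate = lr

class MockRunContext:
    """Stand-in for RunContext: exposes run information via original_args()."""
    def __init__(self, optimizer, cur_step_num):
        self._args = {"optimizer": optimizer, "cur_step_num": cur_step_num}

    def original_args(self):
        return self._args

def step_end(run_context, learning_rate_function):
    # Read the optimizer and current step from the run context, then
    # overwrite the learning rate with whatever the function returns.
    cb_params = run_context.original_args()
    opt = cb_params["optimizer"]
    opt.learning_rate = learning_rate_function(opt.learning_rate,
                                               cb_params["cur_step_num"])

# Usage sketch: at step 1000, a 10x step-decay schedule fires.
opt = MockOptimizer(0.1)
step_end(MockRunContext(opt, 1000),
         lambda lr, step: lr * 0.1 if step % 1000 == 0 else lr)
```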