mindspore.nn.TrainOneStepCell
- class mindspore.nn.TrainOneStepCell(network, optimizer, sens=1.0)[source]
Network training package class.
Wraps the network with an optimizer. The resulting Cell is trained with input `*inputs`. The backward graph is created in the construct function to update the parameters. Different parallel modes are available for training.
- Parameters
network (Cell) – The training network. The network must return a single output.
optimizer (Union[Cell]) – Optimizer for updating the weights.
sens (numbers.Number) – The scaling number used to seed the backpropagation. Default: 1.0.
- Inputs:
(*inputs) (Tuple(Tensor)) - Tuple of input tensors with shape \((N, \ldots)\).
- Outputs:
Tensor, the loss value, whose shape is usually \(()\).
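The one-step behavior described above can be sketched in plain Python (a toy stand-in, not the MindSpore implementation; `train_one_step`, `grad_fn`, and `sgd` are illustrative names): the forward pass computes the loss, the backward pass is seeded with `sens`, and the optimizer applies the resulting gradients to the parameters.

```python
def train_one_step(params, grad_fn, optimizer, inputs, sens=1.0):
    """One training step: compute loss and gradients, scale by `sens`, update params."""
    loss, grads = grad_fn(params, inputs)
    grads = [g * sens for g in grads]  # `sens` seeds/scales the backward pass
    optimizer(params, grads)           # in-place parameter update
    return loss

# Toy model: fit scalar w so that w * x ≈ y, with loss = (w*x - y)^2.
def grad_fn(params, data):
    (w,), (x, y) = params, data
    err = w * x - y
    return err * err, [2.0 * err * x]  # loss, d(loss)/dw

def sgd(params, grads, lr=0.1):
    for i, g in enumerate(grads):
        params[i] -= lr * g

params = [0.0]
for _ in range(50):
    loss = train_one_step(params, grad_fn, sgd, (1.0, 3.0))
```

After 50 steps the parameter converges toward the target value 3.0, mirroring how repeated calls to a TrainOneStepCell drive the loss down one step at a time.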
- Raises
TypeError – If sens is not a number.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> net = Net()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits()
>>> optim = nn.Momentum(net.trainable_params(), learning_rate=0.1, momentum=0.9)
>>> # 1) Using the existing WithLossCell provided
>>> loss_net = nn.WithLossCell(net, loss_fn)
>>> train_net = nn.TrainOneStepCell(loss_net, optim)
>>>
>>> # 2) Using a user-defined WithLossCell
>>> class MyWithLossCell(Cell):
...     def __init__(self, backbone, loss_fn):
...         super(MyWithLossCell, self).__init__(auto_prefix=False)
...         self._backbone = backbone
...         self._loss_fn = loss_fn
...
...     def construct(self, x, y, label):
...         out = self._backbone(x, y)
...         return self._loss_fn(out, label)
...
...     @property
...     def backbone_network(self):
...         return self._backbone
...
>>> loss_net = MyWithLossCell(net, loss_fn)
>>> train_net = nn.TrainOneStepCell(loss_net, optim)