mindspore.nn.WithLossCell
- class mindspore.nn.WithLossCell(backbone, loss_fn)
Cell with loss function.
Wraps the network with the loss function. This Cell accepts data and label as inputs, and returns the computed loss.
- Parameters
backbone (Cell) - The backbone network to wrap.
loss_fn (Cell) - The loss function used to compute the loss.
- Inputs:
data (Tensor) - Tensor of shape \((N, \ldots)\). The dtype of data must be float16 or float32.
label (Tensor) - Tensor of shape \((N, \ldots)\). The dtype of label must be float16 or float32.
- Outputs:
Tensor, representing the loss value, usually with shape \(()\).
- Raises
TypeError – If dtype of data or label is neither float16 nor float32.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore as ms
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> # Define the network structure of LeNet5. Refer to
>>> # https://gitee.com/mindspore/docs/blob/master/docs/mindspore/code/lenet.py
>>> net = LeNet5()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False)
>>> net_with_criterion = nn.WithLossCell(net, loss_fn)
>>>
>>> batch_size = 2
>>> data = Tensor(np.ones([batch_size, 1, 32, 32]).astype(np.float32) * 0.01)
>>> label = Tensor(np.ones([batch_size, 10]).astype(np.float32))
>>>
>>> output_data = net_with_criterion(data, label)
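Because LeNet5 must be defined separately, a minimal self-contained sketch is shown below. It uses a hypothetical single nn.Dense layer as the backbone (not part of the original example) and reduction='mean' so that the returned loss is a scalar, matching the output shape described above.
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> # Hypothetical backbone: a single fully connected layer.
>>> net = nn.Dense(16, 10)
>>> # reduction='mean' reduces the per-sample losses to a single scalar.
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False, reduction='mean')
>>> net_with_criterion = nn.WithLossCell(net, loss_fn)
>>> data = Tensor(np.ones([2, 16]).astype(np.float32) * 0.01)
>>> label = Tensor(np.ones([2, 10]).astype(np.float32))
>>> loss = net_with_criterion(data, label)
>>> # loss is a Tensor with shape ()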
- property backbone_network
Get the backbone network.
- Returns
Cell, the backbone network.
Examples
>>> from mindspore import nn
>>> # Define the network structure of LeNet5. Refer to
>>> # https://gitee.com/mindspore/docs/blob/master/docs/mindspore/code/lenet.py
>>> net = LeNet5()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False)
>>> net_with_criterion = nn.WithLossCell(net, loss_fn)
>>> backbone = net_with_criterion.backbone_network
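The returned Cell is the wrapped backbone and can be used on its own, for example to run inference without computing the loss. The sketch below assumes a hypothetical nn.Dense backbone rather than LeNet5.
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> # Hypothetical backbone: a single fully connected layer.
>>> net = nn.Dense(16, 10)
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False)
>>> net_with_criterion = nn.WithLossCell(net, loss_fn)
>>> backbone = net_with_criterion.backbone_network
>>> # The backbone alone produces logits, without the loss computation.
>>> logits = backbone(Tensor(np.ones([2, 16]).astype(np.float32)))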