mindspore.nn.probability.bnn_layers.WithBNNLossCell
- class mindspore.nn.probability.bnn_layers.WithBNNLossCell(backbone, loss_fn, dnn_factor=1, bnn_factor=1)[source]
Generates a suitable WithLossCell for a Bayesian neural network (BNN), wrapping the Bayesian network together with its loss function.
- Parameters
backbone (Cell) – The target network.
loss_fn (Cell) – The loss function used to compute loss.
dnn_factor (int, float) – The coefficient of backbone’s loss, which is computed by the loss function. Default: 1.
bnn_factor (int, float) – The coefficient of KL loss, which is the KL divergence of Bayesian layer. Default: 1.
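The two factors weight the two components of the returned loss. As a sketch of how the coefficients are applied (the exact reduction is handled internally), the output is roughly \(\text{loss} = \text{dnn\_factor} \cdot \mathcal{L}_{\text{backbone}} + \text{bnn\_factor} \cdot \mathcal{L}_{\text{KL}}\), where \(\mathcal{L}_{\text{backbone}}\) is the value of loss_fn on the backbone's predictions and \(\mathcal{L}_{\text{KL}}\) is the accumulated KL divergence of the Bayesian layers.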
- Inputs:
data (Tensor) - Tensor of shape \((N, \ldots)\).
label (Tensor) - Tensor of shape \((N, \ldots)\).
- Outputs:
Tensor, a scalar tensor with shape \(()\).
- Supported Platforms:
Ascend
GPU
Examples
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> from mindspore.nn.probability.bnn_layers import WithBNNLossCell
>>>
>>> net = Net()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False)
>>> net_with_criterion = WithBNNLossCell(net, loss_fn)
>>>
>>> batch_size = 2
>>> data = Tensor(np.ones([batch_size, 16]).astype(np.float32) * 0.01)
>>> label = Tensor(np.ones([batch_size, 1]).astype(np.float32))
>>>
>>> net_with_criterion(data, label)
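The example assumes a user-defined Net. A minimal sketch of such a backbone, assuming a single bnn_layers.DenseReparam layer that matches the 16-feature input and 1-dimensional label used above:

>>> from mindspore import nn
>>> from mindspore.nn.probability import bnn_layers
>>>
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         # one Bayesian fully connected layer: 16 input features -> 1 output
...         self.dense = bnn_layers.DenseReparam(16, 1)
...     def construct(self, x):
...         return self.dense(x)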
- property backbone_network
Returns the backbone network.
- Returns
Cell, the backbone network.
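For example, with the net_with_criterion built above, the property returns the wrapped net instance (a minimal usage sketch):

>>> net_with_criterion.backbone_network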