mindspore.nn.probability.bnn_layers.WithBNNLossCell
- class mindspore.nn.probability.bnn_layers.WithBNNLossCell(backbone, loss_fn, dnn_factor=1, bnn_factor=1)
Generate a suitable WithLossCell for a Bayesian neural network (BNN), wrapping the Bayesian network together with its loss function.
- Parameters
backbone (Cell) – The target network.
loss_fn (Cell) – The loss function used to compute loss.
dnn_factor (int, float) – The coefficient of the backbone's loss, which is computed by the loss function. Default: 1.
bnn_factor (int, float) – The coefficient of the KL loss, i.e. the KL divergence of the Bayesian layers; see the sketch after this list. Default: 1.
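Conceptually, the two factors weight a sum of two loss terms. The following is a minimal sketch of that weighting and not the library implementation; backbone_loss and kl_loss are hypothetical stand-ins for the value computed by loss_fn and the KL divergence collected from the Bayesian layers:

def combined_loss(backbone_loss, kl_loss, dnn_factor=1, bnn_factor=1):
    # Assumed weighting: dnn_factor scales the task loss computed by loss_fn,
    # bnn_factor scales the KL divergence contributed by the Bayesian layers.
    return dnn_factor * backbone_loss + bnn_factor * kl_loss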
- Inputs:
data (Tensor) - Tensor of shape \((N, \ldots)\).
label (Tensor) - Tensor of shape \((N, \ldots)\).
- Outputs:
Tensor, the combined loss. When the loss function reduces over the batch, this is a scalar tensor with shape \(()\); otherwise its shape follows the loss function's output (e.g. \((2,)\) in the example below).
- Supported Platforms:
Ascend
GPU
Examples
>>> import numpy as np
>>> import mindspore.nn as nn
>>> from mindspore.nn.probability import bnn_layers
>>> from mindspore import Tensor
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.dense = bnn_layers.DenseReparam(16, 1)
...     def construct(self, x):
...         return self.dense(x)
>>> net = Net()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False)
>>> net_with_criterion = bnn_layers.WithBNNLossCell(net, loss_fn)
>>>
>>> batch_size = 2
>>> data = Tensor(np.ones([batch_size, 16]).astype(np.float32) * 0.01)
>>> label = Tensor(np.ones([batch_size, 1]).astype(np.float32))
>>> output = net_with_criterion(data, label)
>>> print(output.shape)
(2,)
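In practice, the wrapped cell is usually handed to a training wrapper. A minimal sketch continuing the example above, assuming the standard nn.TrainOneStepCell pattern (the optimizer choice and learning rate are illustrative, not prescribed by this page):

>>> optimizer = nn.AdamWeightDecay(params=net.trainable_params(), learning_rate=1e-4)
>>> train_net = nn.TrainOneStepCell(net_with_criterion, optimizer)
>>> train_net.set_train()
>>> loss = train_net(data, label)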
- property backbone_network
Returns the backbone network.
- Returns
Cell, the backbone network.
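For example, assuming the net_with_criterion instance from the example above, the wrapped network can be retrieved for standalone inference:

>>> backbone = net_with_criterion.backbone_network
>>> prediction = backbone(data)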