One-Click Conversion from DNN to BNN¶

1. Define the DNN model;

2. Define the loss function and optimizer;

3. Feature 1: convert the entire model;

4. Feature 2: convert layers of a specified type.

This example is intended for GPU and Ascend environments. You can download the complete sample code here: https://gitee.com/mindspore/mindspore/tree/r1.7/tests/st/probability/transforms

Define the DNN Model¶

[1]:

import mindspore.nn as nn
from mindspore.common.initializer import Normal

class LeNet5(nn.Cell):
    """LeNet network structure."""
    # define the operators required
    def __init__(self, num_class=10, num_channel=1):
        super(LeNet5, self).__init__()
        self.conv1 = nn.Conv2d(num_channel, 6, 5, pad_mode='valid')
        self.conv2 = nn.Conv2d(6, 16, 5, pad_mode='valid')
        self.fc1 = nn.Dense(16 * 5 * 5, 120, weight_init=Normal(0.02))
        self.fc2 = nn.Dense(120, 84, weight_init=Normal(0.02))
        self.fc3 = nn.Dense(84, num_class, weight_init=Normal(0.02))
        self.relu = nn.ReLU()
        self.max_pool2d = nn.MaxPool2d(kernel_size=2, stride=2)
        self.flatten = nn.Flatten()

    # use the preceding operators to construct the network
    def construct(self, x):
        x = self.max_pool2d(self.relu(self.conv1(x)))
        x = self.max_pool2d(self.relu(self.conv2(x)))
        x = self.flatten(x)
        x = self.relu(self.fc1(x))
        x = self.relu(self.fc2(x))
        x = self.fc3(x)
        return x


Define the Loss Function and Optimizer¶

[2]:

import pprint
from mindspore.nn import WithLossCell, TrainOneStepCell
from mindspore.nn.probability import transforms
from mindspore import context

context.set_context(mode=context.GRAPH_MODE, device_target="GPU")

network = LeNet5()
lr = 0.01
momentum = 0.9
criterion = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction="mean")
optimizer = nn.Momentum(network.trainable_params(), lr, momentum)
net_with_loss = WithLossCell(network, criterion)
train_network = TrainOneStepCell(net_with_loss, optimizer)

DNN_layer_name = [i.name for i in network.trainable_params()]
pprint.pprint(DNN_layer_name)

['conv1.weight',
'conv2.weight',
'fc1.weight',
'fc1.bias',
'fc2.weight',
'fc2.bias',
'fc3.weight',
'fc3.bias']


Feature 1: Convert the Entire Model¶

[3]:

bnn_transformer = transforms.TransformToBNN(train_network, 60000, 0.000001)
train_bnn_network = bnn_transformer.transform_to_bnn_model()

BNN_layer_name = [i.name for i in network.trainable_params()]
pprint.pprint(BNN_layer_name)

['conv1.weight_posterior.mean',
'conv1.weight_posterior.untransformed_std',
'conv2.weight_posterior.mean',
'conv2.weight_posterior.untransformed_std',
'fc1.weight_posterior.mean',
'fc1.weight_posterior.untransformed_std',
'fc1.bias_posterior.mean',
'fc1.bias_posterior.untransformed_std',
'fc2.weight_posterior.mean',
'fc2.weight_posterior.untransformed_std',
'fc2.bias_posterior.mean',
'fc2.bias_posterior.untransformed_std',
'fc3.weight_posterior.mean',
'fc3.weight_posterior.untransformed_std',
'fc3.bias_posterior.mean',
'fc3.bias_posterior.untransformed_std']
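As the parameter list shows, each DNN weight tensor is replaced by a pair of posterior parameters: a `mean` and an `untransformed_std`. At forward time a Bayesian layer draws weights with the reparameterization trick, roughly `w = mean + softplus(untransformed_std) * eps` with `eps ~ N(0, 1)`. The following is a minimal pure-Python sketch of that idea only, not MindSpore's actual implementation; the function names are illustrative.

```python
import math
import random

def softplus(x):
    """Map an unconstrained parameter to a positive standard deviation."""
    return math.log1p(math.exp(x))

def sample_weight(mean, untransformed_std, rng=random):
    """Reparameterization trick: w = mean + softplus(rho) * eps, eps ~ N(0, 1)."""
    eps = rng.gauss(0.0, 1.0)
    return mean + softplus(untransformed_std) * eps

# After conversion, each DNN weight becomes a (mean, untransformed_std) pair.
posterior = {"mean": 0.02, "untransformed_std": -3.0}
w = sample_weight(posterior["mean"], posterior["untransformed_std"])
```

Storing the standard deviation in unconstrained form lets the optimizer update it freely while the softplus keeps the effective std strictly positive.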


Feature 2: Convert Layers of a Specified Type¶

The transform_to_bnn_layer method converts layers of a specified type (nn.Dense or nn.Conv2d) in a DNN model into the corresponding Bayesian layers. Its signature is as follows:

transform_to_bnn_layer(dnn_layer, bnn_layer, get_args=None, add_args=None):


• dnn_layer: the type of DNN layer to convert into a BNN layer.

• bnn_layer: the type of BNN layer to convert the DNN layer into.

• get_args: the parameters to fetch from the DNN layer.

• add_args: the parameters of the BNN layer to be reassigned.
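Conceptually, a type-targeted conversion walks the network, and for every layer that is an instance of the target DNN class builds a replacement BNN layer from the parameters it fetches off the old one. The following framework-free sketch illustrates that mechanism under stated assumptions: `Dense`, `DenseReparam`, `Net`, and `transform_layer` are hypothetical stand-ins, not MindSpore classes or its API.

```python
# A minimal, framework-free sketch of type-targeted layer replacement.
# Dense / DenseReparam here are hypothetical stand-ins, not MindSpore classes.

class Dense:
    def __init__(self, in_channels, out_channels):
        self.in_channels, self.out_channels = in_channels, out_channels

class DenseReparam:
    def __init__(self, in_channels, out_channels):
        self.in_channels, self.out_channels = in_channels, out_channels

class Net:
    def __init__(self):
        self.fc1 = Dense(256, 120)
        self.fc2 = Dense(120, 10)

def transform_layer(net, dnn_cls, bnn_cls, get_args=("in_channels", "out_channels")):
    """Replace every attribute of type dnn_cls with a bnn_cls built from
    the parameters named in get_args (the role played by get_args above)."""
    for name, layer in vars(net).items():
        if isinstance(layer, dnn_cls):
            kwargs = {arg: getattr(layer, arg) for arg in get_args}
            setattr(net, name, bnn_cls(**kwargs))
    return net

net = transform_layer(Net(), Dense, DenseReparam)
```

After the call, every `Dense` attribute has been swapped for a `DenseReparam` with the same channel sizes, while layers of other types are left untouched; this mirrors how only the `fc*` layers change in the MindSpore output below.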

[4]:

from mindspore.nn.probability import bnn_layers

network = LeNet5(10)
criterion = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction="mean")
net_with_loss = WithLossCell(network, criterion)
train_network = TrainOneStepCell(net_with_loss, optimizer)
bnn_transformer = transforms.TransformToBNN(train_network, 60000, 0.000001)
train_bnn_network = bnn_transformer.transform_to_bnn_layer(nn.Dense, bnn_layers.DenseReparam)


[5]:

layer_names = [i.name for i in network.trainable_params()]
pprint.pprint(layer_names)

['conv1.weight',
'conv2.weight',
'fc1.weight_posterior.mean',
'fc1.weight_posterior.untransformed_std',
'fc1.bias_posterior.mean',
'fc1.bias_posterior.untransformed_std',
'fc2.weight_posterior.mean',
'fc2.weight_posterior.untransformed_std',
'fc2.bias_posterior.mean',
'fc2.bias_posterior.untransformed_std',
'fc3.weight_posterior.mean',
'fc3.weight_posterior.untransformed_std',
'fc3.bias_posterior.mean',
'fc3.bias_posterior.untransformed_std']