mindspore_xai.tool
CV tools.
- class mindspore_xai.tool.cv.OoDNet(underlying, num_classes)[source]
Out-of-distribution (OoD) network.
OoDNet takes an underlying classifier and outputs the out-of-distribution scores of samples.
Note
OoDNet must be trained with the classifier's training dataset in order to give correct OoD scores.
- Parameters
underlying (Cell) – The underlying classifier. It must have the num_features (int) and output_features (bool) attributes; see the example code for details.
num_classes (int) – The number of classes for the classifier.
- Returns
Tensor, classification logits (if set_train(True) was called) or OoD scores (if set_train(False) was called), in the shape of \((N, L)\), where \(L\) is the number of classes.
- Raises
TypeError – Raised for any argument or input type problem.
ValueError – Raised for any input value problem.
AttributeError – Raised if underlying is missing any required attribute.
- Supported Platforms:
Ascend
GPU
Examples
>>> import numpy as np
>>> import mindspore as ms
>>> from mindspore import nn, set_context, PYNATIVE_MODE
>>> from mindspore_xai.tool.cv import OoDNet
>>> from mindspore.common.initializer import Normal
>>>
>>> class MyLeNet5(nn.Cell):
...     def __init__(self, num_class, num_channel):
...         super(MyLeNet5, self).__init__()
...
...         # must add the following 2 attributes to your model
...         self.num_features = 84  # no. of features, int
...         self.output_features = False  # output features flag, bool
...
...         self.conv1 = nn.Conv2d(num_channel, 6, 5, pad_mode='valid')
...         self.conv2 = nn.Conv2d(6, 16, 5, pad_mode='valid')
...         self.relu = nn.ReLU()
...         self.max_pool2d = nn.MaxPool2d(kernel_size=2, stride=2)
...         self.flatten = nn.Flatten()
...         self.fc1 = nn.Dense(16 * 5 * 5, 120, weight_init=Normal(0.02))
...         self.fc2 = nn.Dense(120, self.num_features, weight_init=Normal(0.02))
...         self.fc3 = nn.Dense(self.num_features, num_class, weight_init=Normal(0.02))
...
...     def construct(self, x):
...         x = self.conv1(x)
...         x = self.relu(x)
...         x = self.max_pool2d(x)
...         x = self.conv2(x)
...         x = self.relu(x)
...         x = self.max_pool2d(x)
...         x = self.flatten(x)
...         x = self.relu(self.fc1(x))
...         x = self.relu(self.fc2(x))
...
...         # return the features tensor if output_features is True
...         if self.output_features:
...             return x
...
...         x = self.fc3(x)
...         return x
>>>
>>> set_context(mode=PYNATIVE_MODE)
>>> # prepare classifier
>>> net = MyLeNet5(10, num_channel=3)
>>> # prepare OoD network
>>> ood_net = OoDNet(net, 10)
>>> inputs = ms.Tensor(np.random.rand(1, 3, 32, 32), ms.float32)
>>> ood_map = ood_net(inputs)
>>> print(ood_map.shape)
(1, 10)
- construct(x)[source]
Runs forward inference and returns the classification logits or OoD scores.
- Parameters
x (Tensor) – Input tensor for the underlying classifier.
- Returns
Tensor, logits of softmax with temperature (if set_train(True) was called) or OoD scores (if set_train(False) was called), in the shape of \((N, L)\), where \(L\) is the number of classes.
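A minimal usage sketch, reusing the ood_net and inputs objects from the class example above; construct() is invoked implicitly when the cell is called:
>>> ood_net.set_train(False)  # switch to OoD scoring mode
>>> scores = ood_net(inputs)  # calls construct() under the hood
>>> print(scores.shape)
(1, 10)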
- get_train_parameters(train_underlying=False)[source]
Get the training parameters.
- Parameters
train_underlying (bool, optional) – Set to True to include the underlying classifier parameters. Default: False.
- Returns
list[Parameter], parameters.
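A short sketch, reusing the ood_net instance from the class example above; including the underlying classifier can only grow the parameter list:
>>> head_params = ood_net.get_train_parameters()  # OoD-specific parameters only
>>> all_params = ood_net.get_train_parameters(train_underlying=True)  # plus classifier
>>> print(len(head_params) <= len(all_params))
True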
- property num_classes
Get the number of classes.
- Returns
int, the number of classes.
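For the ood_net built in the class example above (constructed with num_classes=10):
>>> print(ood_net.num_classes)
10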
- prepare_train(learning_rate=0.1, momentum=0.9, weight_decay=0.0001, lr_base_factor=0.1, lr_epoch_denom=30, train_underlying=False)[source]
Creates the optimizer and learning rate scheduler needed for training.
- Parameters
learning_rate (float, optional) – The optimizer learning rate. Default: 0.1.
momentum (float, optional) – The optimizer momentum. Default: 0.9.
weight_decay (float, optional) – The optimizer weight decay. Default: 0.0001.
lr_base_factor (float, optional) – The base scaling factor of the learning rate scheduler. Default: 0.1.
lr_epoch_denom (int, optional) – The epoch denominator of the learning rate scheduler. Default: 30.
train_underlying (bool, optional) – Set to True to train the underlying classifier as well. Default: False.
- Returns
Optimizer, the optimizer.
LearningRateScheduler, the learning rate scheduler.
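A minimal sketch, assuming the two return values above are delivered as a pair that can be unpacked directly (train_ds and loss_fn refer to the training sketch under train() below):
>>> optimizer, scheduler = ood_net.prepare_train(learning_rate=0.05, lr_epoch_denom=20)
>>> # pass them to train() to override the defaults it would otherwise create:
>>> # ood_net.train(train_ds, loss_fn, optimizer=optimizer, scheduler=scheduler)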
- set_train(mode=True)[source]
Set training mode.
- Parameters
mode (bool, optional) – Whether the network is in training mode. Default: True.
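The mode decides what the forward pass returns (see construct() above):
>>> ood_net.set_train(True)   # forward pass returns classification logits
>>> ood_net.set_train(False)  # forward pass returns OoD scores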
- train(dataset, loss_fn, callbacks=None, epoch=90, optimizer=None, scheduler=None, **kwargs)[source]
Trains this OoD net.
- Parameters
dataset (Dataset) – The training dataset, expecting (data, one-hot label) items.
loss_fn (Cell) – The loss function. If the classifier's activation function is nn.Softmax, use nn.SoftmaxCrossEntropyWithLogits; if it is nn.Sigmoid, use nn.BCEWithLogitsLoss.
callbacks (Callback, optional) – The training callbacks. Default: None.
epoch (int, optional) – The number of epochs to train. Default: 90.
optimizer (Optimizer, optional) – The optimizer. If set to None, the one from prepare_train() is used. Default: None.
scheduler (LearningRateScheduler, optional) – The learning rate scheduler. If set to None, the one from prepare_train() is used. Default: None.
**kwargs (any, optional) – Keyword arguments for prepare_train().
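A minimal training sketch, reusing the ood_net from the class example above; the synthetic dataset, its size, and the batch size are placeholders for illustration only:
>>> import mindspore.dataset as ds
>>> # tiny synthetic dataset of (image, one-hot label) pairs
>>> images = np.random.rand(8, 3, 32, 32).astype(np.float32)
>>> labels = np.eye(10)[np.random.randint(0, 10, size=8)].astype(np.float32)
>>> train_ds = ds.NumpySlicesDataset((images, labels), column_names=['data', 'label']).batch(4)
>>> # MyLeNet5 outputs raw logits for a softmax classifier, so use SoftmaxCrossEntropyWithLogits
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=False, reduction='mean')
>>> ood_net.train(train_ds, loss_fn, epoch=1)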
- property underlying
Get the underlying classifier.
- Returns
nn.Cell, the underlying classifier.
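Presumably this property returns the same classifier instance that was passed to the constructor (an assumption in this sketch):
>>> print(ood_net.underlying is net)  # assumes the wrapped classifier is stored as-is
True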