mindarmour.utils
Utility methods of MindArmour.
- class mindarmour.utils.GradWrap(network)[source]
Construct a network that computes the gradient of the wrapped network's outputs with respect to its inputs, weighted by a sensitivity vector; the result corresponds to a weighted combination of rows of the Jacobian matrix.
- Parameters
network (Cell) – The target network to wrap.
Examples
>>> data = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32)*0.01)
>>> label = Tensor(np.ones([1, 10]).astype(np.float32))
>>> num_classes = 10
>>> sens = np.zeros((data.shape[0], num_classes)).astype(np.float32)
>>> sens[:, 1] = 1.0
>>> net = NET()
>>> wrap_net = GradWrap(net)
>>> wrap_net(data, Tensor(sens))
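The weighted gradient that GradWrap returns can be illustrated with plain NumPy on a toy linear model (a conceptual sketch, not MindSpore code; the model and the names `net` and `grad_wrap` are hypothetical): for outputs y = W·x, the Jacobian with respect to x is W, and weighting by a one-hot sens vector selects the gradient of a single output class.

```python
import numpy as np

# Toy "network": a linear map from 4 inputs to 3 output classes.
W = np.arange(12, dtype=np.float32).reshape(3, 4)

def net(x):
    # Outputs y = W x; the Jacobian dy/dx is simply W.
    return W @ x

def grad_wrap(x, sens):
    # Vector-Jacobian product: gradient of sens . y with respect to x.
    # For this linear map that is sens @ W.
    return sens @ W

x = np.ones(4, dtype=np.float32)
sens = np.zeros(3, dtype=np.float32)
sens[1] = 1.0  # select output class 1, as in the sens[:, 1] = 1.0 line above

g = grad_wrap(x, sens)
print(g)  # equals row 1 of the Jacobian, i.e. W[1]
```

With a one-hot weight the result is one row of the Jacobian; a dense weight vector would return the correspondingly weighted sum of rows.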
- class mindarmour.utils.GradWrapWithLoss(network)[source]
Construct a network that computes the gradient of the loss function with respect to the inputs.
- Parameters
network (Cell) – The target network to wrap.
Examples
>>> data = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32)*0.01)
>>> label = Tensor(np.ones([1, 10]).astype(np.float32))
>>> net = NET()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits()
>>> loss_net = WithLossCell(net, loss_fn)
>>> grad_all = GradWrapWithLoss(loss_net)
>>> out_grad = grad_all(data, label)
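For intuition, the loss gradient in input space can be written out with NumPy for a linear model with softmax cross-entropy (a hypothetical sketch, independent of MindSpore): with logits z = W·x and one-hot label t, the closed form is dL/dx = Wᵀ(softmax(z) − t). The sketch checks it against a finite difference.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))  # toy linear model: 4 inputs, 3 classes

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def loss(x, t):
    # Softmax cross-entropy of the linear model's logits.
    return -np.sum(t * np.log(softmax(W @ x)))

def grad_loss_wrt_input(x, t):
    # Closed form: dL/dx = W^T (softmax(W x) - t)
    return W.T @ (softmax(W @ x) - t)

x = rng.standard_normal(4)
t = np.array([0.0, 1.0, 0.0])  # one-hot label for class 1

g = grad_loss_wrt_input(x, t)

# Verify the closed form against central finite differences.
eps = 1e-6
g_fd = np.array([
    (loss(x + eps * e, t) - loss(x - eps * e, t)) / (2 * eps)
    for e in np.eye(4)
])
print(np.allclose(g, g_fd, atol=1e-5))
```

This input-space gradient is the quantity adversarial-example methods such as FGSM perturb along, which is why MindArmour exposes it as a wrapper cell.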
- class mindarmour.utils.LogUtil[source]
Logging module.
- Raises
SyntaxError – If this class is instantiated directly; use get_instance instead.
- add_handler(handler)[source]
Add a handler supported by the logging module.
- Parameters
handler (logging.Handler) – A handler supported by the logging module.
- Raises
ValueError – If handler is not an instance of logging.Handler.
- static get_instance()[source]
Get instance of class LogUtil.
- Returns
Object, instance of class LogUtil.
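The singleton pattern LogUtil describes (a constructor that refuses direct calls, plus get_instance) can be sketched with the standard logging module; the class below is a simplified stand-in, not MindArmour's implementation, and the name `SingletonLogger` is hypothetical.

```python
import logging

class SingletonLogger:
    """Simplified stand-in for a LogUtil-style singleton logger."""
    _instance = None
    _instantiable = False

    def __init__(self):
        # Mirror the documented behaviour: creating the class directly
        # raises SyntaxError; callers must go through get_instance().
        if not SingletonLogger._instantiable:
            raise SyntaxError('Can not create this class, '
                              'use get_instance() instead.')
        self.logger = logging.getLogger('demo')

    @staticmethod
    def get_instance():
        """Return the single shared instance, creating it on first use."""
        if SingletonLogger._instance is None:
            SingletonLogger._instantiable = True
            SingletonLogger._instance = SingletonLogger()
            SingletonLogger._instantiable = False
        return SingletonLogger._instance

    def add_handler(self, handler):
        """Attach a logging.Handler; anything else raises ValueError."""
        if not isinstance(handler, logging.Handler):
            raise ValueError('handler must be an instance of logging.Handler')
        self.logger.addHandler(handler)

log = SingletonLogger.get_instance()
log.add_handler(logging.StreamHandler())
print(log is SingletonLogger.get_instance())  # same object every call
```

Repeated get_instance calls return the same object, so handlers added anywhere in a program accumulate on one shared logger.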