mindarmour.utils
Utility methods of MindArmour.
- class mindarmour.utils.GradWrap(network)[source]
Construct a network to compute the gradient of the wrapped network's outputs with respect to its inputs, weighted by weight and expressed as a Jacobian matrix.
- Parameters
network (Cell) – The target network to wrap.
Examples
>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindarmour.utils import GradWrap
>>> data = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32)*0.01)
>>> label = Tensor(np.ones([1, 10]).astype(np.float32))
>>> num_classes = 10
>>> sens = np.zeros((data.shape[0], num_classes)).astype(np.float32)
>>> sens[:, 1] = 1.0
>>> net = NET()
>>> wrap_net = GradWrap(net)
>>> wrap_net(data, Tensor(sens))
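In this example, sens is a one-hot weight vector over the network outputs: setting sens[:, 1] = 1.0 makes the returned gradient the row of the Jacobian corresponding to output class 1, i.e. the gradient of the class-1 output with respect to the input.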
- class mindarmour.utils.GradWrapWithLoss(network)[source]
Construct a network to compute the gradient of the loss function with respect to the inputs, weighted by weight.
- Parameters
network (Cell) – The target network to wrap.
Examples
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> from mindspore.nn import WithLossCell
>>> from mindarmour.utils import GradWrapWithLoss
>>> data = Tensor(np.ones([1, 1, 32, 32]).astype(np.float32)*0.01)
>>> labels = Tensor(np.ones([1, 10]).astype(np.float32))
>>> net = NET()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits()
>>> loss_net = WithLossCell(net, loss_fn)
>>> grad_all = GradWrapWithLoss(loss_net)
>>> out_grad = grad_all(data, labels)
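Here out_grad has the same shape as data: it is the gradient of the scalar loss with respect to the input, which is the quantity perturbed by gradient-based adversarial methods.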
- class mindarmour.utils.LogUtil[source]
Logging module.
- Raises
SyntaxError – If this class is instantiated directly; use get_instance() to obtain the instance instead.
- add_handler(handler)[source]
Add a handler supported by the logging module.
- Parameters
handler (logging.Handler) – A handler supported by the logging module.
- Raises
ValueError – If handler is not an instance of logging.Handler.
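A minimal usage sketch (not part of the original documentation), assuming a writable log file path 'mindarmour.log':
>>> import logging
>>> LOGGER = LogUtil.get_instance()
>>> file_handler = logging.FileHandler('mindarmour.log')
>>> file_handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
>>> LOGGER.add_handler(file_handler)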
- static get_instance()[source]
Get instance of class LogUtil.
- Returns
Object, instance of class LogUtil.
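A minimal usage sketch (not part of the original documentation), assuming get_instance() returns a singleton, as the SyntaxError raised on direct construction suggests:
>>> from mindarmour.utils import LogUtil
>>> LOGGER = LogUtil.get_instance()
>>> LOGGER is LogUtil.get_instance()
True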
- set_level(level)[source]
Set the logging level of this logger. level must be an integer or a string. Supported levels are 'NOTSET' (integer: 0), 'ERROR' (integer: 1-40), 'WARNING' ('WARN', integer: 1-30), 'INFO' (integer: 1-20) and 'DEBUG' (integer: 1-10). For example, after logger.set_level('WARNING') or logger.set_level(21), calls to logger.warn() and logger.error() in scripts are printed at runtime, while logger.info() and logger.debug() are not.
- Parameters
level (Union[int, str]) – The logging level to set.
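A minimal usage sketch (not part of the original documentation), using only the methods documented above:
>>> LOGGER = LogUtil.get_instance()
>>> LOGGER.set_level('WARNING')
After this call, logger.warn() and logger.error() messages are emitted, while logger.info() and logger.debug() messages are suppressed.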