mindarmour.utils
Utility methods of MindArmour.
- class mindarmour.utils.GradWrap(network)[source]
Construct a network that computes the gradient of the wrapped network's outputs in the input space, weighted by a given weight vector and expressed as a Jacobian matrix.
- Parameters
network (Cell) – The target network to wrap.
Examples
>>> import numpy as np
>>> import mindspore.nn as nn
>>> import mindspore.ops.operations as P
>>> from mindspore import Tensor
>>> from mindarmour.utils import GradWrap
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self._softmax = P.Softmax()
...         self._dense = nn.Dense(10, 10)
...         self._squeeze = P.Squeeze(1)
...     def construct(self, inputs):
...         out = self._softmax(inputs)
...         out = self._dense(out)
...         return self._squeeze(out)
>>> net = Net()
>>> data = Tensor(np.ones([2, 1, 10]).astype(np.float32) * 0.01)
>>> num_classes = 10
>>> sens = np.zeros((data.shape[0], num_classes)).astype(np.float32)
>>> sens[:, 1] = 1.0
>>> wrap_net = GradWrap(net)
>>> gradient = wrap_net(data, Tensor(sens))
- class mindarmour.utils.GradWrapWithLoss(network)[source]
Construct a network that computes the gradient of the loss function in the input space, weighted by a given weight vector.
- Parameters
network (Cell) – The target network to wrap.
Examples
>>> import numpy as np
>>> import mindspore.nn as nn
>>> import mindspore.ops.operations as P
>>> from mindspore import Tensor
>>> from mindarmour.utils import GradWrapWithLoss
>>> from mindarmour.utils.util import WithLossCell
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self._softmax = P.Softmax()
...         self._dense = nn.Dense(10, 10)
...         self._squeeze = P.Squeeze(1)
...     def construct(self, inputs):
...         out = self._softmax(inputs)
...         out = self._dense(out)
...         return self._squeeze(out)
>>> data = Tensor(np.ones([2, 1, 10]).astype(np.float32) * 0.01)
>>> labels = Tensor(np.ones([2, 10]).astype(np.float32))
>>> net = Net()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits()
>>> loss_net = WithLossCell(net, loss_fn)
>>> grad_all = GradWrapWithLoss(loss_net)
>>> out_grad = grad_all(data, labels)
- class mindarmour.utils.LogUtil[source]
Logging module.
Records logging statistics over time in long-running scripts.
- Raises
SyntaxError – If this class is instantiated directly instead of through get_instance().
- add_handler(handler)[source]
Add another handler supported by the logging module.
- Parameters
handler (logging.Handler) – A handler supported by the logging module.
- Raises
ValueError – If handler is not an instance of logging.Handler.
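A minimal sketch of attaching an extra handler; the log file name and format string below are illustrative, and the logger is obtained via get_instance() as described next:
>>> import logging
>>> from mindarmour.utils import LogUtil
>>> logger = LogUtil.get_instance()
>>> file_handler = logging.FileHandler('mindarmour.log')  # illustrative file name
>>> file_handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
>>> logger.add_handler(file_handler)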
- static get_instance()[source]
Get the singleton instance of class LogUtil.
- Returns
Object, instance of class LogUtil.
- set_level(level)[source]
Set the logging level of this logger. The level must be an integer or a string. Supported levels are 'NOTSET' (integer: 0), 'ERROR' (integer: 1-40), 'WARNING' ('WARN', integer: 1-30), 'INFO' (integer: 1-20) and 'DEBUG' (integer: 1-10). For example, after logger.set_level('WARNING') or logger.set_level(21), calls to logger.warn() and logger.error() in scripts are printed at runtime, while logger.info() and logger.debug() are not.
- Parameters
level (Union[int, str]) – Level of logger.
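A minimal sketch of the level filtering described above; the tag-first call style logger.info(TAG, msg) follows common MindArmour usage and is an assumption here, as is the tag name 'demo':
>>> from mindarmour.utils import LogUtil
>>> logger = LogUtil.get_instance()
>>> logger.set_level('WARNING')
>>> TAG = 'demo'  # hypothetical tag for this script
>>> logger.warn(TAG, 'this message is printed')
>>> logger.info(TAG, 'this message is filtered out by the WARNING level')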