mindarmour.utils

Util methods of MindArmour.

class mindarmour.utils.GradWrap(network)[source]

Construct a network to compute the gradient of network outputs with respect to the inputs, weighted by weight and expressed as a Jacobian matrix.

Parameters

network (Cell) – The target network to wrap.

Examples

>>> import numpy as np
>>> import mindspore.nn as nn
>>> import mindspore.ops.operations as P
>>> from mindspore import Tensor
>>> from mindarmour.utils import GradWrap
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self._softmax = P.Softmax()
...         self._Dense = nn.Dense(10,10)
...         self._squeeze = P.Squeeze(1)
...     def construct(self, inputs):
...         out = self._softmax(inputs)
...         out = self._Dense(out)
...         return self._squeeze(out)
>>> net = Net()
>>> data = Tensor(np.ones([2, 1, 10]).astype(np.float32)*0.01)
>>> labels = Tensor(np.ones([2, 10]).astype(np.float32))
>>> num_classes = 10
>>> sens = np.zeros((data.shape[0], num_classes)).astype(np.float32)
>>> sens[:, 1] = 1.0
>>> wrap_net = GradWrap(net)
>>> wrap_net(data, Tensor(sens))
construct(*data)[source]

Compute the Jacobian matrix.

Parameters

data (Tensor) –

Data consisting of inputs and weight.

  • inputs: Inputs of the network.

  • weight: Weight of each gradient; weight has the same shape as the labels.

Returns

Tensor, Jacobian matrix.
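
A minimal sketch of assembling the full Jacobian with GradWrap, reusing net, data, num_classes and wrap_net from the example above: each call weights the outputs with a one-hot sens vector, so looping over the classes recovers one Jacobian slice per class. The stacking axis below is an illustrative choice, not part of the API.

>>> import numpy as np
>>> rows = []
>>> for cls in range(num_classes):
...     sens = np.zeros((data.shape[0], num_classes)).astype(np.float32)
...     sens[:, cls] = 1.0  # pick out the gradient of output class `cls`
...     rows.append(wrap_net(data, Tensor(sens)).asnumpy())
>>> jacobian = np.stack(rows, axis=1)  # shape: (batch, num_classes) + input feature dims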

class mindarmour.utils.GradWrapWithLoss(network)[source]

Construct a network to compute the gradient of the loss function with respect to inputs, weighted by weight.

Parameters

network (Cell) – The target network to wrap.

Examples

>>> import numpy as np
>>> import mindspore.nn as nn
>>> import mindspore.ops.operations as P
>>> from mindspore import Tensor
>>> from mindarmour.utils import GradWrapWithLoss
>>> from mindarmour.utils.util import WithLossCell
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self._softmax = P.Softmax()
...         self._Dense = nn.Dense(10,10)
...         self._squeeze = P.Squeeze(1)
...     def construct(self, inputs):
...         out = self._softmax(inputs)
...         out = self._Dense(out)
...         return self._squeeze(out)
>>> data = Tensor(np.ones([2, 1, 10]).astype(np.float32)*0.01)
>>> labels = Tensor(np.ones([2, 10]).astype(np.float32))
>>> net = Net()
>>> loss_fn = nn.SoftmaxCrossEntropyWithLogits()
>>> loss_net = WithLossCell(net, loss_fn)
>>> grad_all = GradWrapWithLoss(loss_net)
>>> out_grad = grad_all(data, labels)
construct(inputs, labels)[source]

Compute the gradient of the loss with respect to inputs, given labels.

Parameters
  • inputs (Tensor) – Inputs of network.

  • labels (Tensor) – Labels of inputs.

Returns

Tensor, gradient matrix.
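
As an illustration of how the returned gradient is typically consumed, the sketch below applies an FGSM-style sign step to the inputs, reusing data and out_grad from the example above; the step size eps is an assumed value for demonstration, not part of this API.

>>> import numpy as np
>>> eps = 0.1  # assumed perturbation budget, illustrative only
>>> adv_data = data.asnumpy() + eps * np.sign(out_grad.asnumpy())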

class mindarmour.utils.LogUtil[source]

Logging module.

Records logging statistics over time in long-running scripts.

Raises

SyntaxError – If this class is instantiated directly; use get_instance() to obtain the shared instance.

add_handler(handler)[source]

Add another handler supported by the logging module.

Parameters

handler (logging.Handler) – A handler supported by the logging module.

Raises

ValueError – If handler is not an instance of logging.Handler.
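
A minimal sketch of attaching a standard-library file handler; the log file name below is illustrative only:

>>> import logging
>>> LOGGER = LogUtil.get_instance()
>>> file_handler = logging.FileHandler('mindarmour.log')  # illustrative path
>>> LOGGER.add_handler(file_handler)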

debug(tag, msg, *args)[source]

Log ‘[tag] msg % args’ with severity ‘DEBUG’.

Parameters
  • tag (str) – Logger tag.

  • msg (str) – Logger message.

  • args (Any) – Auxiliary value.
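
A minimal sketch of the ‘[tag] msg % args’ formatting (the tag string below is an assumed example); the same calling convention applies to error(), info() and warn():

>>> LOGGER = LogUtil.get_instance()
>>> LOGGER.set_level('DEBUG')
>>> TAG = 'demo'  # assumed tag, illustrative only
>>> LOGGER.debug(TAG, 'epoch %d finished, loss is %.4f', 3, 0.1234)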

error(tag, msg, *args)[source]

Log ‘[tag] msg % args’ with severity ‘ERROR’.

Parameters
  • tag (str) – Logger tag.

  • msg (str) – Logger message.

  • args (Any) – Auxiliary value.

static get_instance()[source]

Get the singleton instance of class LogUtil.

Returns

Object, instance of class LogUtil.
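
A minimal sketch showing that repeated calls return the same shared instance (instantiating LogUtil directly raises SyntaxError, as noted above):

>>> from mindarmour.utils import LogUtil
>>> logger_a = LogUtil.get_instance()
>>> logger_b = LogUtil.get_instance()
>>> assert logger_a is logger_b  # one shared instance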

info(tag, msg, *args)[source]

Log ‘[tag] msg % args’ with severity ‘INFO’.

Parameters
  • tag (str) – Logger tag.

  • msg (str) – Logger message.

  • args (Any) – Auxiliary value.

set_level(level)[source]

Set the logging level of this logger. level must be an integer or a string. Supported levels are:

  • ‘NOTSET’ (integer: 0)

  • ‘ERROR’ (integer: 1-40)

  • ‘WARNING’ (‘WARN’, integer: 1-30)

  • ‘INFO’ (integer: 1-20)

  • ‘DEBUG’ (integer: 1-10)

For example, after logger.set_level(‘WARNING’) or logger.set_level(21), calls to logger.warn() and logger.error() in scripts are printed, while logger.info() and logger.debug() are not.

Parameters

level (Union[int, str]) – Level of logger.
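
A minimal sketch of the filtering behavior described above, with an assumed tag string:

>>> LOGGER = LogUtil.get_instance()
>>> TAG = 'demo'  # assumed tag, illustrative only
>>> LOGGER.set_level('WARNING')
>>> LOGGER.warn(TAG, 'this message is printed')
>>> LOGGER.info(TAG, 'this message is filtered out')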

warn(tag, msg, *args)[source]

Log ‘[tag] msg % args’ with severity ‘WARNING’.

Parameters
  • tag (str) – Logger tag.

  • msg (str) – Logger message.

  • args (Any) – Auxiliary value.