mindspore.profiler.schedule

mindspore.profiler.schedule(*, wait: int, active: int, warm_up: int = 0, repeat: int = 0, skip_first: int = 0)

This class is used to determine the profiler action to take at each step. The schedule is as follows:

(NONE)        (NONE)          (NONE)       (WARM_UP)       (RECORD)      (RECORD)     (RECORD_AND_SAVE)   (NONE)
START------->skip_first------->wait-------->warm_up-------->active........active.........active----------->stop
                              |                                                             |
                              |                           repeat_1                          |
                              ---------------------------------------------------------------

The profiler first skips the first skip_first steps, then waits for wait steps, then performs the warm-up during the next warm_up steps, then records data during the next active steps, and then repeats the cycle starting again with wait steps. The optional number of cycles is specified with the repeat parameter; a value of zero means the cycles continue until profiling finishes.
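The cycle above can be sketched as a plain-Python function that maps a step index to its phase. This is an illustrative sketch of the semantics described here, not the MindSpore implementation:

```python
def phase_for_step(step, wait, active, warm_up=0, repeat=0, skip_first=0):
    """Illustrative sketch: which profiling phase a given step falls in."""
    if step < skip_first:
        return "NONE"
    step -= skip_first
    cycle_len = wait + warm_up + active
    # repeat == 0 means cycles continue until profiling stops
    if repeat > 0 and step >= repeat * cycle_len:
        return "NONE"
    pos = step % cycle_len
    if pos < wait:
        return "NONE"
    if pos < wait + warm_up:
        return "WARM_UP"
    # the last active step of a cycle records and saves the trace
    return "RECORD_AND_SAVE" if pos == cycle_len - 1 else "RECORD"
```

For the schedule used in the Examples section (wait=1, warm_up=1, active=2, repeat=1, skip_first=2), steps 0-2 are idle, step 3 warms up, step 4 records, and step 5 records and saves.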

Parameters
  • wait (int) – The number of steps to wait before starting the warm-up phase.

  • active (int) – The number of steps to record data during the active phase.

  • warm_up (int, optional) – The number of steps used to perform the warm-up phase. Default: 0.

  • repeat (int, optional) – The number of times to repeat the cycle. Default: 0, which repeats the cycle until profiling stops.

  • skip_first (int, optional) – The number of steps to skip at the beginning. Default: 0.
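When repeat is greater than 0, the minimum number of training steps needed to complete every cycle follows directly from the parameters above. A small arithmetic sketch:

```python
def steps_needed(wait, active, warm_up=0, repeat=1, skip_first=0):
    """Minimum steps to complete all cycles when repeat > 0."""
    return skip_first + repeat * (wait + warm_up + active)

# The schedule used in the Examples section below:
steps_needed(wait=1, active=2, warm_up=1, repeat=1, skip_first=2)  # 6
```

Driving the training loop for at least this many steps (as the STEP_NUM loop in the example does) guarantees that every active phase actually runs.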

Raises

ValueError – When any of the parameters is set to a value less than 0.

Supported Platforms:

Ascend

Examples

>>> import numpy as np
>>> import mindspore as ms
>>> import mindspore.dataset as ds
>>> from mindspore import context, nn, Profiler
>>> from mindspore.profiler import schedule, tensor_board_trace_handler
>>>
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.fc = nn.Dense(2, 2)
...
...     def construct(self, x):
...         return self.fc(x)
>>>
>>> def generator_net():
...     for _ in range(2):
...         yield np.ones([2, 2]).astype(np.float32), np.ones([2]).astype(np.int32)
>>>
>>> def train(test_net):
...     optimizer = nn.Momentum(test_net.trainable_params(), 1, 0.9)
...     loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
...     data = ds.GeneratorDataset(generator_net(), ["data", "label"])
...     model = ms.train.Model(test_net, loss, optimizer)
...     model.train(1, data)
>>>
>>> if __name__ == '__main__':
...     context.set_context(mode=ms.PYNATIVE_MODE, device_target="Ascend")
...
...     net = Net()
...     STEP_NUM = 15
...
...     with Profiler(schedule=schedule(wait=1, warm_up=1, active=2, repeat=1, skip_first=2),
...                   on_trace_ready=tensor_board_trace_handler) as prof:
...         for i in range(STEP_NUM):
...             train(net)
...             prof.step()