mindspore.profiler.DynamicProfilerMonitor
- class mindspore.profiler.DynamicProfilerMonitor(cfg_path, output_path='./dyn_profile_data', poll_interval=2, **kwargs)[source]
This class enables dynamic profile monitoring of MindSpore neural networks.
- Parameters
cfg_path (str) – Directory of the dynamic profile JSON configuration file. It must be a shared path that all nodes can access.
output_path (str, optional) – Output data path. Default: "./dyn_profile_data".
poll_interval (int, optional) – The polling period of the monitoring process, in seconds. Default: 2.
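Before training starts, the directory passed as cfg_path should contain the JSON configuration file that the monitor polls. A minimal sketch of creating such a file with Python's standard library — the file name profiler_config.json and the keys start_step/stop_step are assumptions here; consult the MindSpore dynamic profiling documentation for the exact schema:

```python
import json
import os

cfg_dir = "./dyn_cfg"  # must be a shared path visible to all nodes
os.makedirs(cfg_dir, exist_ok=True)

# Assumed schema: start_step/stop_step bound the step range to profile.
cfg = {
    "start_step": 2,
    "stop_step": 5,
}
with open(os.path.join(cfg_dir, "profiler_config.json"), "w") as f:
    json.dump(cfg, f, indent=4)
```

Because the monitor re-reads this file every poll_interval seconds, the file can be edited while training is running to adjust when profiling occurs.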
- Raises
RuntimeError – If the number of attempts to create shared memory exceeds the maximum number of retries.
- Supported Platforms:
Ascend
GPU
Examples
>>> import numpy as np
>>> import mindspore as ms
>>> from mindspore import nn
>>> import mindspore.dataset as ds
>>> from mindspore.profiler import DynamicProfilerMonitor
>>>
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.fc = nn.Dense(2, 2)
...     def construct(self, x):
...         return self.fc(x)
>>>
>>> def generator():
...     for i in range(2):
...         yield (np.ones([2, 2]).astype(np.float32), np.ones([2]).astype(np.int32))
>>>
>>> def train(net):
...     optimizer = nn.Momentum(net.trainable_params(), 1, 0.9)
...     loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
...     data = ds.GeneratorDataset(generator, ["data", "label"])
...     dynprof_cb = DynamicProfilerMonitor(cfg_path="./dyn_cfg", output_path="./dyn_prof_data")
...     model = ms.train.Model(net, loss, optimizer)
...     # register DynamicProfilerMonitor to model.train()
...     model.train(10, data, callbacks=[dynprof_cb])