mindspore.dataset.BatchInfo

class mindspore.dataset.BatchInfo[source]

Provides the batch and epoch counters to the batch size function and the per_batch_map function of the batch operation, so that either can dynamically adjust its parameters based on the number of batches and epochs processed during training.
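
The examples below only show a dynamic batch size function; the description also mentions per_batch_map. The sketch that follows illustrates how a per_batch_map callable can read the epoch counter. It uses a hypothetical `FakeBatchInfo` stub (not MindSpore's implementation) so it runs without a pipeline, and it assumes per_batch_map receives one list per input column followed by the BatchInfo object:

```python
# Hypothetical stand-in for mindspore.dataset.BatchInfo, used only to
# illustrate the calling convention; not MindSpore's implementation.
class FakeBatchInfo:
    def __init__(self, batch_num, epoch_num):
        self._batch_num = batch_num
        self._epoch_num = epoch_num

    def get_batch_num(self):
        return self._batch_num

    def get_epoch_num(self):
        return self._epoch_num


def scale_by_epoch(col1, batch_info):
    # Scale every element in the batch by (epoch number + 1).
    factor = batch_info.get_epoch_num() + 1
    return ([x * factor for x in col1],)


# Simulate the pipeline invoking per_batch_map for batch 0 of epoch 1.
out, = scale_by_epoch([1, 2, 3], FakeBatchInfo(0, 1))
print(out)  # [2, 4, 6]
```

In a real pipeline the callable would instead be passed via `dataset.batch(..., per_batch_map=scale_by_epoch)`, and MindSpore supplies the BatchInfo argument itself.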

get_batch_num()[source]

Return the batch number of the current batch.

Examples

>>> # Create a dataset whose batch size is dynamic.
>>> # Define a callable batch size function that increases the batch size by 1 for each batch.
>>> import mindspore.dataset as ds
>>> from mindspore.dataset import BatchInfo
>>> dataset = ds.GeneratorDataset([i for i in range(10)], "column1")
>>> def add_one(batch_info):
...     return batch_info.get_batch_num() + 1
>>> dataset = dataset.batch(batch_size=add_one)

get_epoch_num()[source]

Return the epoch number of the current batch.

Examples

>>> # Create a dataset whose batch size is dynamic.
>>> # Define a callable batch size function that increases the batch size by 1 each epoch.
>>> import mindspore.dataset as ds
>>> from mindspore.dataset import BatchInfo
>>> dataset = ds.GeneratorDataset([i for i in range(10)], "column1")
>>> def add_one_by_epoch(batch_info):
...     return batch_info.get_epoch_num() + 1
>>> dataset = dataset.batch(batch_size=add_one_by_epoch)
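
To see how a dynamic batch size function plays out, the following sketch simulates a pipeline calling the batch size function once before each batch. The `FakeBatchInfo` stub and `simulate_batching` loop are hypothetical illustrations, not MindSpore internals, and they assume the batch counter starts at 0 and that the function returns a positive size:

```python
# Hypothetical stand-in for mindspore.dataset.BatchInfo, used only to
# illustrate the calling convention; not MindSpore's implementation.
class FakeBatchInfo:
    def __init__(self, batch_num, epoch_num):
        self._batch_num = batch_num
        self._epoch_num = epoch_num

    def get_batch_num(self):
        return self._batch_num

    def get_epoch_num(self):
        return self._epoch_num


def simulate_batching(num_samples, batch_size_fn, epoch_num=0):
    """Split num_samples into batches, asking batch_size_fn before each
    batch, mirroring how a dynamic batch_size callable would be invoked.
    Assumes batch_size_fn always returns a positive integer."""
    sizes = []
    remaining = num_samples
    batch_num = 0
    while remaining > 0:
        size = batch_size_fn(FakeBatchInfo(batch_num, epoch_num))
        sizes.append(min(size, remaining))  # last batch may be partial
        remaining -= size
        batch_num += 1
    return sizes


def add_one(info):
    return info.get_batch_num() + 1


print(simulate_batching(10, add_one))  # [1, 2, 3, 4]
```

Under these assumptions, the `add_one` function from the first example splits the 10-sample dataset into batches of sizes 1, 2, 3, and 4.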