mindspore.nn.BatchNorm3d

class mindspore.nn.BatchNorm3d(num_features, eps=1e-05, momentum=0.9, affine=True, gamma_init='ones', beta_init='zeros', moving_mean_init='zeros', moving_var_init='ones', use_batch_statistics=None, data_format='NCDHW')[source]

Batch Normalization layer over a 5D input.

Batch Normalization is widely used in convolutional networks. This layer applies Batch Normalization over a 5D input (a mini-batch of 3D inputs with an additional channel dimension) to reduce internal covariate shift.

\[y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta\]
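
For intuition, the per-channel statistics in the formula above can be reproduced with plain NumPy. The following is a minimal sketch, not the layer's actual kernel: it normalizes a 5D array over the \((N, D, H, W)\) axes for each channel, assuming \(\gamma = 1\) and \(\beta = 0\).

>>> import numpy as np
>>> x = np.random.randn(4, 3, 2, 5, 5).astype(np.float32)
>>> mean = x.mean(axis=(0, 2, 3, 4), keepdims=True)   # per-channel E[x]
>>> var = x.var(axis=(0, 2, 3, 4), keepdims=True)     # per-channel Var[x]
>>> y = (x - mean) / np.sqrt(var + 1e-5)              # gamma = 1, beta = 0
>>> print(y.shape)
(4, 3, 2, 5, 5)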

Note

The implementation of BatchNorm differs between graph mode and PyNative mode; therefore, the mode cannot be changed after the network is initialized. Note that the formula for updating running_mean and running_var is \(\hat{x}_\text{new} = (1 - \text{momentum}) \times x_t + \text{momentum} \times \hat{x}\), where \(\hat{x}\) is the estimated statistic and \(x_t\) is the new observed value.
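
As a concrete illustration of the update rule, with momentum=0.9, a running mean \(\hat{x} = 0.0\), and an observed batch mean \(x_t = 1.0\), the new running mean is \((1 - 0.9) \times 1.0 + 0.9 \times 0.0 = 0.1\). A minimal sketch in plain Python (not the MindSpore internals):

>>> momentum = 0.9
>>> running_mean, batch_mean = 0.0, 1.0
>>> new_running_mean = (1 - momentum) * batch_mean + momentum * running_mean
>>> print(round(new_running_mean, 3))
0.1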

Parameters
  • num_features (int) – C from an expected input of size (N, C, D, H, W).

  • eps (float) – A value added to the denominator for numerical stability. Default: 1e-5.

  • momentum (float) – A floating-point hyperparameter, the momentum used to compute running_mean and running_var. Default: 0.9.

  • affine (bool) – When set to True, gamma and beta are learnable parameters. Default: True.

  • gamma_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the gamma weight. The values of str refer to the function initializer including ‘zeros’, ‘ones’, etc. Default: ‘ones’.

  • beta_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the beta weight. The values of str refer to the function initializer including ‘zeros’, ‘ones’, etc. Default: ‘zeros’.

  • moving_mean_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the moving mean. The values of str refer to the function initializer including ‘zeros’, ‘ones’, etc. Default: ‘zeros’.

  • moving_var_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the moving variance. The values of str refer to the function initializer including ‘zeros’, ‘ones’, etc. Default: ‘ones’.

  • use_batch_statistics (bool) – If True, use the mean and variance of the current batch data. If False, use the tracked moving mean and moving variance. If None, the training process uses the mean and variance of the current batch data and tracks the running mean and variance, while the evaluation process uses the running mean and variance; see the sketch after this parameter list. Default: None.

  • data_format (str) – The format of the input data. Only ‘NCDHW’ is supported. Default: ‘NCDHW’.
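
The interaction between use_batch_statistics and the network's training mode can be exercised with Cell.set_train. The following is a minimal sketch assuming the default use_batch_statistics=None: in training mode the layer normalizes with the statistics of the current batch and updates the moving statistics; in evaluation mode it normalizes with the tracked moving statistics.

>>> import numpy as np
>>> import mindspore.nn as nn
>>> from mindspore import Tensor
>>> bn = nn.BatchNorm3d(num_features=3)
>>> x = Tensor(np.random.randn(4, 3, 2, 8, 8).astype(np.float32))
>>> bn.set_train(True)    # training: batch statistics are used and moving stats are updated
>>> output_train = bn(x)
>>> bn.set_train(False)   # evaluation: the tracked moving statistics are used
>>> output_eval = bn(x)
>>> print(output_eval.shape)
(4, 3, 2, 8, 8)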

Inputs:
  • x (Tensor) - Tensor of shape \((N, C_{in}, D_{in}, H_{in}, W_{in})\).

Outputs:

Tensor, the normalized, scaled, offset tensor, of shape \((N, C_{out}, D_{out}, H_{out}, W_{out})\).

Raises
  • TypeError – If num_features is not an int.

  • TypeError – If eps is not a float.

  • ValueError – If num_features is less than 1.

  • ValueError – If momentum is not in range [0, 1].

  • ValueError – If data_format is not ‘NCDHW’.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore.nn as nn
>>> from mindspore import Tensor
>>> net = nn.BatchNorm3d(num_features=3)
>>> x = Tensor(np.ones([16, 3, 10, 32, 32]).astype(np.float32))
>>> output = net(x)
>>> print(output.shape)
(16, 3, 10, 32, 32)